Research on the Impact of an AI Voice Assistant's Gender and Self-Disclosure Strategies on User Self-Disclosure in Chinese Postpartum Follow-Up Phone Calls

AI语音助手性别和自我披露策略对中国产后随访电话中用户自我披露的影响研究


Abstract

This study examines the application of AI voice assistants in Chinese postpartum follow-up phone calls, with particular focus on how interaction design strategies influence users' self-disclosure intention. A 2 (voice gender: female/male) × 3 (self-disclosure strategy: normal conversation without additional disclosure / objective factual disclosure / emotional and opinion-based disclosure) mixed experimental design (n = 395) was used to analyze how a voice assistant's gender and self-disclosure strategy affect users' stereotypes of it (perceived warmth and competence), and how these stereotypes, mediated by the privacy calculus dimensions (perceived risk and perceived benefit), influence self-disclosure intention. All indicators were measured on 7-point Likert scales, and the data were analyzed with analysis of variance (ANOVA) and structural equation modeling (SEM). The results show that female voice assistants significantly enhance users' perceived warmth and competence, while emotional self-disclosure strategies significantly improve perceived warmth. Stereotypes about the voice assistant positively affect users' self-disclosure intention through the mediating effects of perceived risk and perceived benefit, with perceived benefit exerting the stronger effect. These findings provide valuable insights for the design and application of AI voice assistants in healthcare, offering actionable guidance for enhancing user interaction and promoting self-disclosure in medical contexts.
