The conflict between need and fear: how privacy concerns moderate the influence of depression on university students' acceptance of AI music therapy


Abstract

BACKGROUND: AI-driven music therapy offers a promising, accessible digital intervention for the growing mental health crisis in universities. The "Deficiency Compensation Hypothesis" suggests that depression may drive students toward such digital help-seeking. However, the inherent data sensitivity of AI tools triggers the "Privacy Calculus," potentially inhibiting adoption. This study investigates the interplay between depression severity, privacy concerns, and the intention to use AI music therapy among university students.

METHODS: A cross-sectional survey was conducted with 612 university students in China. The study measured depression levels (PHQ-8), AI-specific privacy concerns, perceived usefulness, and intention to use. A hierarchical regression model with moderation analysis was employed to examine whether privacy concerns weaken the association between distress and help-seeking motivation.

RESULTS: Participants exhibited mild depression on average (PHQ-8 Mean = 6.07). Regression analysis revealed that depression positively predicted the intention to use AI music therapy (β = 0.128, p < 0.001), supporting the distress-driven help-seeking hypothesis. Crucially, privacy concerns acted as a significant negative moderator (β = -0.086, p = 0.015). Simple slope analysis indicated that the motivating effect of depression on usage intention was significant only for students with low privacy concerns and was nullified for those with high privacy concerns.

CONCLUSION: The findings highlight a critical paradox in digital mental health: while depressive symptoms are positively associated with students' intention to seek AI-based help, privacy fears can significantly attenuate this association. For highly privacy-sensitive individuals, the need for therapeutic relief is overridden by the fear of surveillance. Consequently, developers and universities must prioritize "privacy by design" and transparent trust mechanisms, rather than relying solely on algorithmic precision, to ensure these tools can serve as effective emotional support for vulnerable students.
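The moderation and simple-slope analysis described in the METHODS can be sketched as follows. This is a minimal illustration using simulated data, not the study's dataset: the variable names, the generated outcome, and all numbers in the simulation are assumptions made only to demonstrate the technique (mean-centering predictors, fitting an interaction term, and probing simple slopes at ±1 SD of the moderator).

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

# Simulated illustration only -- these data are NOT from the study.
rng = np.random.default_rng(42)
n = 612  # same sample size as the survey
phq8 = rng.integers(0, 25, n).astype(float)   # depression score (PHQ-8 range)
privacy = rng.normal(0.0, 1.0, n)             # privacy concerns (standardized)
# Construct an outcome with a negative depression x privacy interaction,
# mirroring the reported pattern (coefficients are illustrative).
intent = 0.128 * phq8 - 0.086 * phq8 * privacy + rng.normal(0.0, 1.0, n)

df = pd.DataFrame({"phq8": phq8, "privacy": privacy, "intent": intent})
# Mean-center predictors before forming the interaction term.
df["phq8_c"] = df["phq8"] - df["phq8"].mean()
df["priv_c"] = df["privacy"] - df["privacy"].mean()

# "a * b" in a patsy formula expands to a + b + a:b (main effects + interaction).
model = smf.ols("intent ~ phq8_c * priv_c", data=df).fit()

# Simple slopes: effect of depression at low (-1 SD) vs high (+1 SD)
# levels of privacy concern.
b_dep = model.params["phq8_c"]
b_int = model.params["phq8_c:priv_c"]
sd = df["priv_c"].std()
slope_low = b_dep + b_int * (-sd)
slope_high = b_dep + b_int * (+sd)
print(f"interaction coefficient:       {b_int:.3f}")
print(f"slope at low privacy concern:  {slope_low:.3f}")
print(f"slope at high privacy concern: {slope_high:.3f}")
```

With a negative interaction, the slope of depression on usage intention is steeper at low privacy concern and flatter (or nil) at high privacy concern, which is the pattern the abstract reports.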
