The way you assess matters: User interaction design of survey chatbots for mental health

Publication: Journal Article
Jin, Y; Chen, L; Zhao, X; Cai, W
Published in: International Journal of Human Computer Studies
September 1, 2024

The global pandemic has pushed human society into a mental health crisis, prompting the development of various chatbots to supplement the limited mental health workforce. Several organizations have deployed mental health survey chatbots for public mental status assessments. These survey chatbots typically ask closed-ended questions (Closed-EQs) to assess specific psychological issues such as anxiety, depression, and loneliness, followed by open-ended questions (Open-EQs) for deeper insights. While Open-EQs are naturally presented conversationally in a survey chatbot, Closed-EQs can be delivered either as embedded forms or within the conversation, and the length of the questionnaire varies with the psychological assessment. This study investigates how the interaction style of Closed-EQs and the questionnaire length affect user perceptions of survey credibility, enjoyment, and self-awareness, as well as users' responses to Open-EQs in terms of quality and self-disclosure. We conducted a 2 (interaction style: form-based vs. conversation-based) × 3 (questionnaire length: short vs. medium vs. long) between-subjects study (N=213) with a loneliness survey chatbot. The results indicate that form-based interaction significantly enhances the perceived credibility of the assessment, thereby improving response quality and self-disclosure in subsequent Open-EQs and fostering self-awareness. We discuss the implications of our findings for the interaction design of psychological assessments in survey chatbots for mental health.
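
To make the experimental setup concrete, the following is a minimal, hypothetical Python sketch of how participants could be randomly assigned across the six cells of the 2 × 3 between-subjects design described in the abstract. The condition labels mirror the two factors reported above, but the function names and assignment logic are illustrative assumptions, not code from the study.

    import random

    # Illustrative only: the two factors from the abstract
    # (interaction style x questionnaire length), not the authors' code.
    INTERACTION_STYLES = ["form-based", "conversation-based"]
    QUESTIONNAIRE_LENGTHS = ["short", "medium", "long"]

    # The Cartesian product yields the six between-subjects cells.
    CONDITIONS = [(style, length)
                  for style in INTERACTION_STYLES
                  for length in QUESTIONNAIRE_LENGTHS]

    def assign_participants(n, seed=42):
        """Randomly assign each of n participants to one of the six cells."""
        rng = random.Random(seed)
        return {pid: rng.choice(CONDITIONS) for pid in range(1, n + 1)}

    assignments = assign_participants(213)  # N = 213, as reported above
    print(assignments[1])                   # e.g. ('form-based', 'short')

Because each participant is assigned to exactly one cell, every user experiences a single combination of interaction style and questionnaire length, which is what distinguishes a between-subjects design from a within-subjects one.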

Published In

International Journal of Human Computer Studies

DOI

10.1016/j.ijhcs.2024.103290

EISSN

1095-9300

ISSN

1071-5819

Publication Date

September 1, 2024

Volume

189

Related Subject Headings

  • Human Factors
  • 46 Information and computing sciences
  • 1702 Cognitive Sciences
  • 0806 Information Systems
 

Citation

APA
Jin, Y., Chen, L., Zhao, X., & Cai, W. (2024). The way you assess matters: User interaction design of survey chatbots for mental health. International Journal of Human Computer Studies, 189. https://doi.org/10.1016/j.ijhcs.2024.103290

Chicago
Jin, Y., L. Chen, X. Zhao, and W. Cai. “The way you assess matters: User interaction design of survey chatbots for mental health.” International Journal of Human Computer Studies 189 (September 1, 2024). https://doi.org/10.1016/j.ijhcs.2024.103290.

ICMJE
Jin Y, Chen L, Zhao X, Cai W. The way you assess matters: User interaction design of survey chatbots for mental health. International Journal of Human Computer Studies. 2024 Sep 1;189. doi:10.1016/j.ijhcs.2024.103290.

MLA
Jin, Y., et al. “The way you assess matters: User interaction design of survey chatbots for mental health.” International Journal of Human Computer Studies, vol. 189, Sept. 2024. Scopus, doi:10.1016/j.ijhcs.2024.103290.

NLM
Jin Y, Chen L, Zhao X, Cai W. The way you assess matters: User interaction design of survey chatbots for mental health. International Journal of Human Computer Studies. 2024 Sep 1;189. doi:10.1016/j.ijhcs.2024.103290.