Consumer perspectives on the use of artificial intelligence technology and automation in crisis support services: mixed methods study

Jennifer S. Ma, Megan O’Riordan, Kelly Mazzer, Philip J. Batterham, Sally Bradford, Kairi Kõlves, Nickolai Titov, Britt Klein, Debra J. Rickwood*

*Corresponding author for this work

Research output: Contribution to journal › Article › peer-review

Abstract

Background: Emerging technologies, such as artificial intelligence (AI), have the potential to enhance service responsiveness and quality, improve reach to underserved groups, and help address the lack of workforce capacity in health and mental health care. However, little research has been conducted on the acceptability of AI, particularly in mental health and crisis support, or on how acceptability may inform the development of responsible and responsive innovation in the area.

Objective: This study aims to explore the level of support for the use of technology and automation, such as AI, in Lifeline's crisis support services in Australia; the likelihood of service use if technology and automation were implemented; the impact of demographic characteristics on the level of support and likelihood of service use; and reasons for not using Lifeline's crisis support services if technology and automation were implemented in the future.

Methods: A mixed methods study involving a computer-assisted telephone interview and a web-based survey was undertaken from 2019 to 2020 to explore expectations and anticipated outcomes of Lifeline's crisis support services in a nationally representative community sample (n=1300) and a Lifeline help-seeker sample (n=553). Participants were aged 18 to 93 years. Quantitative descriptive analysis, binary logistic regression models, and qualitative thematic analysis were conducted to address the research objectives.

Results: One-third of the community and help-seeker participants did not support the collection of information about service users through technology and automation (ie, via AI), and approximately half reported that they would be less likely to use the service if automation were introduced. Significant demographic differences were observed between the community and help-seeker samples. Of the demographic characteristics, only older age predicted being less likely both to endorse the use of technology and automation to tailor Lifeline's crisis support service and to use such a service (odds ratio 1.48-1.66, 99% CI 1.03-2.38; P<.001 to P=.005). The most common reason for reluctance, reported by both samples, was that respondents wanted to speak to a real person, assuming that human counselors would be replaced by automated robots or machine services.

Conclusions: Although Lifeline plans to always have a real person providing crisis support, help-seekers may fear that this will not be the case if new technology and automation such as AI are introduced. Consequently, incorporating innovative uses of technology to improve help-seeker outcomes in such services will require careful messaging and assurance that the human connection will continue.
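For context on the statistical approach named in the Methods, the following is a minimal sketch, in Python with simulated data, of how a binary logistic regression yields odds ratios and 99% CIs of the kind reported in the Results. The variable names, sample size, and effect size are illustrative assumptions, not the study's data or analysis code.

```python
# Illustrative sketch only (not the authors' analysis): fit a binary logistic
# regression and exponentiate its coefficients to obtain odds ratios (ORs)
# with 99% confidence intervals, as reported in the abstract.
import numpy as np
import pandas as pd
import statsmodels.api as sm

rng = np.random.default_rng(0)
n = 500                                      # hypothetical sample size
older_age = rng.integers(0, 2, size=n)       # 1 = older age group (assumed)

# Simulate the outcome "does not endorse AI-driven tailoring" so that older
# age raises its probability (an assumed effect, for demonstration only).
logit_p = -0.8 + 0.45 * older_age
p = 1 / (1 + np.exp(-logit_p))
does_not_endorse = rng.binomial(1, p)

# Fit the logistic regression with an intercept term.
X = sm.add_constant(pd.DataFrame({"older_age": older_age}))
fit = sm.Logit(does_not_endorse, X).fit(disp=0)

# Exponentiating coefficients and CI bounds converts log-odds to odds ratios.
odds_ratios = np.exp(fit.params).rename("OR")
ci99 = np.exp(fit.conf_int(alpha=0.01)).rename(columns={0: "lower", 1: "upper"})
print(pd.concat([odds_ratios, ci99], axis=1))
```

In this framing, an OR above 1 for older_age corresponds to the abstract's finding that older age predicted being less likely to endorse or use an automated service.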

Original language: English
Article number: e34514
Pages (from-to): 1-17
Number of pages: 17
Journal: JMIR Human Factors
Volume: 9
Issue number: 3
Publication status: Published - 1 Jul 2022

Bibliographical note

Copyright the Author(s) 2022. Version archived for private and non-commercial use with the permission of the author/s and according to publisher conditions. For further rights please contact the publisher.

Keywords

  • acceptability
  • artificial intelligence
  • community
  • consumer
  • crisis
  • help-seeker
  • perspective
  • support
  • technology
