Pragmatic AI-augmentation in mental healthcare: Key technologies, potential benefits, and real-world challenges and solutions for frontline clinicians

Katherine C Kellogg, Shiri Sadeh-Sharvit

Abstract

The integration of artificial intelligence (AI) technologies into mental healthcare holds the promise of increasing patient access, engagement, and quality of care, and of improving clinicians' quality of work life. To date, however, studies of AI technologies in mental health have focused primarily on challenges facing policymakers, clinical leaders, and data and computer scientists, rather than on the challenges frontline mental health clinicians are likely to face as they attempt to integrate AI-based technologies into everyday clinical practice. In this Perspective, we describe a framework for "pragmatic AI-augmentation" that addresses these issues by identifying three categories of emerging AI-based mental health technologies that frontline clinicians can leverage in their clinical practice: automation, engagement, and clinical decision support technologies. We elaborate on the potential benefits offered by these technologies, the day-to-day challenges they may raise for mental health clinicians, and solutions that clinical leaders and technology developers can use to address these challenges, drawing on emerging experience with the integration of AI technologies into clinicians' daily practice in other healthcare disciplines.

Keywords: AI-augmentation; artificial intelligence; automation technologies; clinical practice; decision support technologies; engagement technologies; mental healthcare.

Conflict of interest statement

Author SS-S is the Chief Clinical Officer of Eleos Health. The remaining author declares that the research was conducted in the absence of any commercial or financial relationships that could be construed as a potential conflict of interest.

Copyright © 2022 Kellogg and Sadeh-Sharvit.


Source: PubMed
