On Mental Health AIs...

Updated: May 22, 2023

A paper published in The Lancet in November 2021 estimated that the pandemic triggered an additional 53 million cases of depression and 76 million cases of anxiety disorders across the globe. Mental health resources are scarce, and the development and use of AIs is on the rise.

Mental health AI apps offer a chatbot service, typically for free, as well as human teletherapy services for a fee. They ask questions like, "How are you feeling?" or "What's bothering you?" Generally, machine learning is used to track, analyze, and respond to human emotions, monitor mood, and mimic a therapist's interaction with a client. The responses for most of these chatbots are pre-determined and triggered by if-then rules, written from a psychologist's CBT (cognitive behavioural therapy) perspective. The therapeutic effects of mental health AIs are being studied around the world. One app, Wysa, has received FDA designation to treat depression, anxiety, and chronic musculoskeletal pain, showing clinical efficacy in reducing symptoms of depression and anxiety. Other studies show conflicting results, stating that mental health AIs "[create] an illusion of help."[1]
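To picture what "pre-determined responses triggered by if-then situations" means in practice, here is a minimal, purely illustrative sketch. The rules and wording below are hypothetical and far simpler than a real app like Wysa; the point is only the mechanism: keyword triggers mapped to pre-written, CBT-flavoured replies.

```python
# A toy rule-based chatbot: each rule pairs trigger keywords with a
# pre-written response. These rules and phrases are illustrative only.
RULES = [
    ({"anxious", "anxiety", "worried"},
     "It sounds like you're feeling anxious. Let's try a slow breathing exercise together."),
    ({"sad", "down", "depressed"},
     "I'm sorry you're feeling low. Can you tell me one thought that's been on your mind?"),
]
DEFAULT = "How are you feeling right now?"

def respond(message: str) -> str:
    words = set(message.lower().split())
    for keywords, reply in RULES:
        if keywords & words:      # the "if-then" trigger: any keyword matches
            return reply
    return DEFAULT                # otherwise fall back to an open-ended prompt
```

A message containing "anxious" triggers the breathing-exercise reply, while anything unrecognized falls through to the open-ended prompt — which is why critics say such systems can feel scripted rather than personalized.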

A case FOR mental health AIs:

Mental health care needs are at an all-time high, while the system is strained by a shortage of mental health professionals and by financial and logistical barriers to seeking help. Therapy-goers may spend upwards of $100–200+ per session, and insurance benefits may or may not provide coverage. The quest to find a therapist who is a good fit can be daunting and time-consuming: factors to consider include convenience of location (in person or virtual), scheduling, and transportation. It may even be triggering to repeat your story to multiple therapists.

Mental health AIs can be accessed at your convenience and from your electronic device, any time and anywhere. There may be a greater sense of anonymity because you are not interacting with a human, so there is less fear of being judged. AIs can present techniques and strategies in a concise way, gathering information from various sources on the internet; suggested ways to cope include deep breathing, listening to calming music, and trying simple exercises.

A case AGAINST mental health AIs:

On the other hand, mental health issues and their supports are often individualized and, most of the time, require personalized treatment or management plans. There are many therapeutic approaches, and not everyone will respond to CBT or to any one sole approach. When interacting through an electronic device, AIs are unable to interpret body language or tone of voice, both of which are important in developing meaningful therapeutic relationships and determining the therapeutic direction.

Mental health clinicians are bound by Standards of Practice and a Code of Ethics. Mental health AIs are unable to obtain ongoing informed consent, nor are they able to ensure complete confidentiality, both of which are required in the profession. They also have limited scope for complex situations, crisis intervention, and situations where there is a duty to report.

It is also in our shared humanity to seek out connection with others, and turning to mental health AIs may result in an overreliance on them, or a substitution for human interaction and support. There are currently no studies on whether their therapeutic effects last in the long term.

In conclusion…

All in all, no human, or AI bot, is perfect. AI and machine learning focused on improving mental health may fill some gaps in the short term; however, having a human mental health professional address your specific needs might be more beneficial in the long term.

What are your thoughts? Where do you see mental health AI going in the near future?

Sources:

  1. Abd-Alrazaq, A., et al. (2020). Effectiveness and safety of using chatbots to improve mental health: Systematic review and meta-analysis. J Med Internet Res, 22(7). https://www.ncbi.nlm.nih.gov/pmc/articles/PMC7385637/

  2. Botello-Harbaum, M., Wetterneck, C. T., & Zayfert, C. (2019). Artificial intelligence and mental health: Current applications and future directions. Cognitive and Behavioral Practice, 26(4), 851-863.

  3. Browne, G. (2022). The problem with mental health bots. Wired UK. Retrieved from: https://www.wired.co.uk/article/mental-health-chatbots

  4. Fulmer, R., et al. (2018). Using psychological artificial intelligence (Tess) to relieve symptoms of depression and anxiety: Randomized controlled trial. JMIR Ment Health, 5(4). https://pubmed.ncbi.nlm.nih.gov/30545815/

  5. Inkster, B., Sarda, S., & Subramanian, V. (2018). An empathy-driven, conversational artificial intelligence agent (Wysa) for digital mental well-being: Real-world data evaluation mixed-methods study. JMIR Mhealth Uhealth, 6(11). https://pubmed.ncbi.nlm.nih.gov/30470676/

  6. Noguchi, Y. (2023). Therapy by chatbot? The promise and challenges in using AI for mental health. NPR. Retrieved from: https://www.npr.org/sections/health-shots/2023/01/19/1147081115/therapy-by-chatbot-the-promise-and-challenges-in-using-ai-for-mental-health

  7. O'Keeffe, S. (2019). Pros and cons of using AI in psychotherapy. Psychology Today. Retrieved from https://www.psychologytoday.com/us/blog/reading-between-the-headlines/201908/pros-and-cons-using-ai-in-psychotherapy

  8. Oprescu, F., Campo, A., & Acioly-Régnier, N. (2021). Artificial intelligence and psychotherapy: a systematic review. Journal of Technology in Behavioral Science, 6(1), 4-17.

  9. Sloat, S. (2022). The strange, nervous rise of the therapist chatbot. The Daily Beast. Retrieved from: https://www.thedailybeast.com/chatbots-are-taking-over-the-world-of-therapy

  10. Santomauro, D. F., et al. (2021). Global prevalence and burden of depression and anxiety disorders in 204 countries and territories in 2020 due to the COVID-19 pandemic. The Lancet, 398(10312). https://www.thelancet.com/journals/lancet/article/PIIS0140-6736(21)02143-7/fulltext