The Future of AI Therapy
I often joke with my clients that my goal as a therapist is to no longer be needed. By that, of course, I mean supporting my clients until they achieve their treatment goals.
This idea now has new meaning as AI therapy has begun to hit the mainstream with new technology available to the public. Services such as freeaitherapist.com, ChatGPT, and even the new Google smartphones with the “Gemini” AI voice assistant allow you to connect with mental health support programs 24/7.
The evolution of AI therapy certainly has benefits, chief among them access for populations that may not otherwise be able to reach traditional therapy services.
One recent study1 found that the application of AI in psychological interventions and diagnosis has the potential to revolutionize mental health by providing early detection and accurate diagnosis.
Great! What could go wrong?
There are many potential benefits to how AI could advance psychological interventions, research, and diagnostic accuracy. However, the risks may well outweigh those benefits.
On the tech side, the algorithms used to generate responses are shaped by “algorithmic bias2.” In short, humans teach machines what to learn, and humans are highly biased. AI systems use algorithms to discover patterns and insights in data, or to predict outcomes based on available data; when that data reflects discrimination and inequality, the machine’s responses may perpetuate or promote them. At present, these algorithms are not designed and trained with diversity and inclusivity in mind. Instead, they reflect society’s values and may reinforce existing socioeconomic, racial, and gender biases.
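To make the mechanism concrete, here is a minimal, purely illustrative sketch in Python. The data and the “referral” scenario are invented for this post and are not drawn from any real system; the point is only that a model which learns the most common past outcome for each group will reproduce whatever inequality was already baked into its training records.

```python
from collections import Counter

# Hypothetical historical records: (demographic group, past outcome).
# The data itself is skewed: group "B" was historically under-referred.
historical_records = [
    ("A", "referred"), ("A", "referred"), ("A", "referred"), ("A", "not referred"),
    ("B", "not referred"), ("B", "not referred"), ("B", "not referred"), ("B", "referred"),
]

def train(records):
    # "Training" here is just learning the most common past outcome per group,
    # a stand-in for what a pattern-finding algorithm does with far more data.
    by_group = {}
    for group, outcome in records:
        by_group.setdefault(group, Counter())[outcome] += 1
    return {group: counts.most_common(1)[0][0] for group, counts in by_group.items()}

model = train(historical_records)

# Two identical new cases, differing only in group membership,
# receive different predictions because the history was unequal.
print(model["A"])  # -> "referred"
print(model["B"])  # -> "not referred"
```

Real systems are vastly more complex, but the underlying dynamic is the same: the pattern-finder faithfully learns the patterns it is given, fair or not.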
AI algorithms are also biased by the client’s own input. Most people are unaware of the biases they bring when engaging in AI therapy. An example of this came up during a recent session with a client who told me they had used AI therapy between sessions for guidance with a relationship conflict. The client asked the AI chatbot whether their relationship was healthy or unhealthy (something we were working on in therapy) and gave the chatbot a list of pros and cons. The chatbot produced a data-driven answer based on that input, an answer the client did not like. The client then debated with the chatbot until it produced the result the client wanted to hear. This example makes clear that our own biases can shape the outcome of these automated services.
Artificial intelligence cannot experience empathy, the ability to understand and share the feelings of another person. Empathy is the heart and soul of therapy. A therapy chatbot cannot see what may be happening beneath the surface, a skill that therapists spend many years cultivating. So much of what therapists observe in session is not just what is verbally expressed but what a client’s nonverbal cues are communicating (posture, tone of voice, eye contact, facial expression), which may not align with what the client states. Think of how often we answer the question “how are you?” with “I’m fine,” and of how often that answer is actually true. This is where human therapy is unparalleled by artificial algorithms and metric-driven data.
A meta-analysis released in 2019 by the American Psychological Association found that the client-therapist relationship (the therapeutic alliance) is the most important predictor of positive treatment outcomes3.
As we move into an era where AI therapy is becoming increasingly accessible, it is essential to acknowledge both its potential and its limitations. While AI can provide convenient, around-the-clock mental health support and enhance accessibility for underserved populations, it lacks the depth and nuance of human connection that is foundational to effective therapy. The risks of algorithmic bias and the absence of empathy highlight the limitations of AI in addressing the complexities of human experience. Ultimately, while AI therapy may serve as a valuable supplement to traditional therapy, it cannot replace the profound impact of a genuine therapeutic alliance. The future of mental health care must find a balance between leveraging technological advancements and preserving the irreplaceable human touch that lies at the heart of healing.
References:
- Zhou, S., Zhao, J., & Zhang, L. (2022). Application of Artificial Intelligence on Psychological Interventions and Diagnosis: An Overview. Frontiers in Psychiatry, 13, 811665. https://doi.org/10.3389/fpsyt.2022.811665
- Panch, T., Mattie, H., & Atun, R. (2019). Artificial intelligence and algorithmic bias: Implications for health systems. Journal of Global Health, 9(2), 010318. https://doi.org/10.7189/jogh.09.020318
- DeAngelis, T. (2019, November 1). Better relationships with patients lead to better outcomes. Monitor on Psychology. https://www.apa.org/monitor/2019/11/ce-corner-relationships