Statement on AI & Therapy
“I unequivocally discourage the use of AI for personal guidance due to risks that are obvious, and risks that are unforeseen.
I reject the idea that AI can be utilized for psychotherapy, which is and has always been a relationship between two human beings.”
With the widespread use of LLMs and AI chatbots, and the creation of therapy-specific bots, many have turned to AI for emotional support.
In this short segment I will express my professional thoughts on this trend. In short: the idea that a chatbot can give therapy demonstrates a misunderstanding about what therapy is and what makes it effective.
While some people will inevitably feel supported by these bots, the bots themselves can never practice psychotherapy, which can only happen between human beings.
Let’s talk about some reasons why people will use AI in this way…
Therapy is expensive. Not everyone has insurance that covers it, and not everyone can afford it. Chatbots may fill a niche in that way.
Many individuals have had poor experiences in therapy or with therapists. The overall quality of training for therapists is quite variable, and undoubtedly has been impacted by the movement of the profession online. Even with the best training, not every therapist is right for every client. No matter how talented a therapist may be, the client may just not connect with them.
Therapy is challenging. It is not fun, and often it is not easy. It takes time and work, and it often asks us to feel things that we have spent most of our lives trying NOT to feel. As I will get into later, chatbots cannot, at least currently, notice these patterns of avoidance and confront them.
Therapy takes time. I can’t tell you how many people come to therapy because they think the therapist can give them a quick solution. There are no shortcuts. There is no solution that will allow you to not feel, unless you want to numb yourself. Understanding and becoming conscious of our emotional patterns means that we have to understand the logic of the emotions, not the logic of the rational mind. As long as we try to make our emotions rational, we will be caught in cycles of avoidance, shame, and inner conflict.
Talking with an actual person is vulnerable. What will they think? What if I don’t like what they say? What if they don’t like what I say? There is more relational risk when we open up to a human being than to an inert language processing program.
Now for some of the risks:
I am not going to give all the citations on this page, because they are not hard to find. Furthermore, I don’t think I can capture all the risks, because I believe there are hidden risks we are not yet aware of. So I will just give some general risks, some of which are obvious and well-documented, and some of which may be more subtle.
The most obvious dangers are the ways that AI can amplify delusions, reinforce harmful patterns, fail to assess risk, and even encourage suicidal or homicidal thinking and behavior. AI panders to whatever it is presented with. Some people may feel they have created a relationship with AI, but no actual relationship is possible. Whatever relationship one has with AI is based solely on fantasy.
There are also significant concerns about privacy: who has access to our data? What are they using it for?
What about ethics? For example, ChatGPT has been implicated in several suicides, for fostering suicidal ideation and arguably even encouraging the act. There are or have been lawsuits against OpenAI pertaining to this. Does that mean OpenAI no longer allows users to use ChatGPT as a therapist? No, not at all. They state they are creating more ‘guardrails,’ but where is the line? If a human therapist engaged in the conduct that ChatGPT has, that therapist would lose their license and likely never be able to practice therapy again. Think about that for a second. Would you trust that therapist if they were a human? How is it different with a chatbot? Because they can be updated? Chatbots will never be held responsible for their behavior, because they are not individuals. We will see whether the organizations that create them will be held responsible for their behavior, but I for one don’t expect that.
If you look at my blog, you will find another risk: AI reinforces our narcissism.
The misunderstanding about what psychotherapy is and why AI cannot practice it
If we think that therapy is about the therapist just giving us information, then of course we might think that AI can replace therapy. If we think that therapy is just about the therapist parroting what we say, then yes, AI can do that too. But neither of those things is actually therapy.
Therapy isn’t just the accumulation of some strategies to address our issues and make ourselves feel better so we can go back to being a good little worker bee. Therapy is about the raw truth of our reality. It is about facing what we don’t want to face emotionally, sometimes when we are ready to face it and sometimes when we are not. It is about seeing the ways we disconnect from ourselves and others. Seeing how we protect ourselves by suppression, denial, disavowal, minimization, rationalization, and projection. Therapy is about recognizing our responsibility as a human being in relationship to ourselves and to other human beings. Therapy is about showing up in relationship. Therapy helps us identify patterns of shame in real time that otherwise would run rampant.
Furthermore, utilizing AI for therapy demonstrates a profound misunderstanding about what actually makes therapy work. Meta-analyses demonstrate that it is the therapeutic relationship that is the greatest predictor of outcomes, not the therapy modality, and not the information that the therapist gives the client. Therapy is a relationship between two human beings who feel.
In Summary…
The truth is, I could write so, so much more. Because there is so much more to say. However, for now I will end with this statement:
I unequivocally discourage the use of AI for personal guidance due to risks that are obvious, and risks that are more difficult to calculate. I reject the idea that AI can be utilized for psychotherapy, which is and has always been a relationship between two (or more) human beings. AI can give someone information, but psychotherapy involves far more complexity than the acquisition of conceptual information.