Psychotherapy and Artificial Intelligence: Cute Couple with Major Red Flags
Katherine [Katu] Medina-Pineda, MHC-LP
Artificial intelligence has been used for psychotherapy since as early as the 1960s, beginning with ELIZA, a program written by German-American computer scientist Joseph Weizenbaum to act as a conversation simulator. Its most famous script, modeled on Rogerian person-centered therapy, made the computer appear to demonstrate active listening and empathy. Now, some sixty years later, artificial intelligence is being used to put people out of creative jobs, essentially replacing any humanity with a tireless machine that learned everything from the very people who are now left in the lurch.
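It is worth pausing on just how mechanical that simulation was. Below is a minimal sketch of ELIZA-style keyword matching in Python; the rules, phrasings, and pronoun table here are invented for illustration and are not Weizenbaum's actual script, but the reflect-the-user's-own-words-back trick is the same.

```python
import random
import re

# Pronoun swaps so the reply mirrors the speaker's words back at them.
REFLECTIONS = {"i": "you", "me": "you", "my": "your", "am": "are",
               "you": "I", "your": "my"}

# Illustrative keyword rules: first match wins, with a catch-all at the end.
RULES = [
    (r"i feel (.*)", ["Why do you feel {0}?",
                      "How long have you felt {0}?"]),
    (r"i am (.*)", ["What makes you say you are {0}?"]),
    (r"(.*) mother(.*)", ["Tell me more about your mother."]),
    (r"(.*)", ["Please, go on.", "How does that make you feel?"]),
]

def reflect(fragment: str) -> str:
    # Swap pronouns word by word ("my job" becomes "your job").
    return " ".join(REFLECTIONS.get(w, w) for w in fragment.lower().split())

def respond(utterance: str) -> str:
    text = utterance.lower().strip(".!? ")
    for pattern, templates in RULES:
        match = re.match(pattern, text)
        if match:
            reflected = [reflect(g) for g in match.groups()]
            return random.choice(templates).format(*reflected)
    return "Please, go on."

print(respond("I feel exhausted by my job"))
# -> e.g. "Why do you feel exhausted by your job?"
```

Even a toy like this can produce the feeling of being heard, which is precisely what made ELIZA famous and what worried Weizenbaum himself: the appearance of empathy requires no understanding at all.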
Within the mental health field, it is not much better. AI has been on the rise, promising the highly coveted efficient, streamlined, evidence-based treatment that insurance companies hold over real human therapists' heads to justify their reimbursement rates (or lack thereof). Given the state of the world we live in, AI offers a low-cost band-aid for people who may not be able to afford their preferred therapist, or who may not even have options in the area where they live. Proponents of AI therapists point to efficiency, low cost, and increased client self-disclosure of sensitive information as good reasons to normalize and magnify AI therapy as the way of the future. Let us unpack this within the context of the systems we live in.
AI Therapy Outcomes in Randomized Trials
A 2025 study by Heinz et al. found that the program Therabot markedly reduced symptoms over a period of eight weeks for participants previously diagnosed with Major Depressive Disorder (MDD) or Generalized Anxiety Disorder (GAD). Of the 210 participants, 106 were given free unlimited access to the Therabot app, while the remaining 104 were placed on a waitlist and given access to the app halfway through the study. This is one of many small, short-term studies being used to endorse the idea that AI could completely replace human therapists; however, there is a dearth of longitudinal studies observing the long-term impact of AI therapy, not just on mental health but also on a person’s social-emotional intelligence over time.
Additionally, when discussing symptom reduction, it is important to contextualize this insight within capitalism. The mental health industrial complex pathologizes appropriate responses to our environment in great part because those responses hinder our ability to be productive. When we talk about a mental health crisis, the emphasis is placed on how much it costs for people to have depression or anxiety: missed work, leaves of absence. These apps are helpful only in the sense that they offer a quick fix, ensuring the person is able to compartmentalize whatever they need to in order to remain a productive, functioning member of society. While overwork is normalized within capitalism, most adults cannot successfully work 40+ hours a week, maintain a great social life and an immaculate living space, and cook themselves a delicious and nutritionally varied meal every night. The truth is most of us have some level of “executive dysfunction,” because the expectations placed on us through white supremacy are truly unachievable. AI therapy can give you a temporary sense of accomplishment, almost in a Pavlovian fashion, teaching techniques that ultimately only subdue one’s exhaustion in favor of positive thinking and tangible productivity.
The vast majority of these apps implement primarily cognitive-behavioral techniques, a framework that can be genuinely helpful for markedly reducing symptoms; however, a machine trained only to run techniques lacks the ability to apply systemic nuance to its work. At the end of the day, AI is simply a computer running through scripts. CBT skills are effective at symptom reduction, but not at teaching people to get to know themselves with curiosity, with compassion, and above all as separate from white supremacy. Awareness of that separateness, which helps people unlearn internalized -isms, is necessary for our collective liberation.
Affordability of AI Therapy
AI therapists are framed as a solution to the high cost of healthcare, a barrier that keeps many poor people from accessing high-quality care. However, we cannot talk about the affordability of any healthcare need without acknowledging that healthcare now runs like a shoe or chair business; that is to say, it is devoid of any humanity and exists only to make money.
A great example of this is the community mental health agencies that operate in New York City. Did you know that Medicaid only requires 30-minute sessions to bill for psychotherapy? That means places like NYPCC will force therapists to book sixteen clients in one 8-hour workday (sixteen 30-minute sessions back to back fill the entire day, leaving no time for notes or breaks), and patients are not informed that they actually have the right to request 45-minute sessions. IDCC makes millions of dollars every year by overworking therapists who are typically expected to maintain caseloads upwards of 50 patients. How much do those therapists get paid on average? About $50,000 a year. Do we really think a therapist forced to see sixteen patients in one 8-hour day is able to do the true job of therapy?
The way AI therapy is marketed completely ignores that the reason therapy is so expensive is that those in power are making money off therapists’ backs and patients’ needs. Why should poor people be gaslit into productivity yet again because they cannot afford a person who is able to slow down with them? AI therapy is only a solution if we believe that capitalism and making money are the things worth protecting, instead of protecting people’s access to the basic human rights they need to live with dignity.
AI Therapy and Stigma
AI is being offered as the solution to stigma around seeking mental health services. Makebot reports a 30% increase in self-disclosure among AI therapy users, and the argument is that people may deliberately withhold disclosure, even from their therapist, out of fear of being judged. I am not going to pretend that there aren’t things that are hard to say out loud, perhaps too hard to ever name at all; however, part of the task of therapy is for people to reconnect with and embrace their own humanity so that they can embrace collective humanity. So many of the inner things we feel we cannot say out loud are heavily influenced by white supremacy: whether we have hurt someone or someone has hurt us, whether we did something “bad” or were transgressed against ourselves. I deeply believe that at our core, what makes us human is our desire to belong and connect.
A bot may feel like a safe place to disclose something for the first time, but it cannot replace the experience of saying the thing that held so much power over you that you avoided being witnessed by another human being, and feeling that thing finally lose its power because it is out. Stigma is also a deeply nuanced subject that cannot be understood outside the context of the trauma of colonialism and assimilation, and the ways that trauma is intimately interconnected with minoritized groups and our collective attitudes and beliefs toward mental health.
Conclusion
Uses of artificial intelligence in mental health are growing in popularity, with a small number of short-term studies remarking on its apparent success and its impact on millions of users around the world. While AI can offer temporary relief and support, there is no replacement for human contact and human vulnerability. Much of the way AI therapy is being framed essentially manufactures consent for human therapists to be used for AI script-feeding and then replaced by machines with endless capacity who never tire. It is important to be curious about the things marketed to us as better, and to contextualize those things within the systems we live in.
Sources
Heinz, M. V., Mackin, D. M., Trudeau, B. M., Bhattacharya, S., Wang, Y., Banta, H. A., Jewett, A. D., Salzhauer, A. J., Griffin, T. Z., & Jacobson, N. C. (2025). Randomized Trial of a Generative AI Chatbot for Mental Health Treatment. NEJM AI, 2(4). https://doi.org/10.1056/aioa2400802
What Makes AI Chatbots a Game-Changer in Mental Healthcare? (2024). Makebot.ai. https://www.makebot.ai/blog-en/what-makes-ai-chatbots-a-game-changer-in-mental-healthcare
Zhang, Z., & Wang, J. (2024). Can AI replace psychotherapists? Exploring the future of mental health care. Frontiers in Psychiatry, 15, 1444382. https://doi.org/10.3389/fpsyt.2024.1444382