AI4MH
GenAI will transform mental healthcare: Are we ready?

A recent survey published in the Harvard Business Review reported that the number one use of Generative AI (Gen-AI) is now “therapy.” While human therapists might be surprised to hear that bots are now performing therapy, the popularity of Gen-AI for mental health should not be surprising. After all, much of mental health care is about communication, the area where Gen-AI excels. Indeed, of all areas of health care, mental health may be the one most disrupted by AI. No surprise that roughly 50% of 2024 venture capital investment in mental health start-ups went to companies that focus on AI. And given the growing public health trifecta of the youth mental health crisis, the dire outcomes (and costs) of serious mental illness, and the high mortality from substance abuse, this disruption cannot come soon enough. From recent discussions at Stanford’s AI4MH group, I see that disruption proceeding on three fronts.
Most immediately, Gen-AI is building efficiency into mental health practice, following the same playbook evident in primary care. Documentation of therapy hours can be offloaded to digital scribes, reports of care management interviews can be generated in real time, and referral letters can be personalized by Gen-AI tools already in wide use elsewhere in medicine. Adoption will be slower here because mental healthcare is highly fragmented, often delivered in a cash-pay, solo-practice environment, and electronic health records are not yet universal in the field. But the part of the market that is accountable to commercial or public payors will be held to higher standards of efficiency. In a specialty so focused on its workforce crisis, the mandate to maximize time with patients and minimize time with screens and paperwork should accelerate the adoption of Gen-AI tools for administrative efficiency.
An even more transformative opportunity for AI (natural language processing as well as Gen-AI) is the disruption of psychiatric diagnosis. For decades, researchers have searched for genetic and imaging biomarkers to bring precision to the diagnosis of mental disorders. The question now is whether natural language processing and other AI tools can yield objective measures of emotion, behavior, and cognition, providing for psychiatry the kind of precision that DNA has provided for oncology. As just one example, voice and speech are a rich source of data for assessing mood (think of the difference in speech velocity between depressed and manic states). As mental health has moved substantially to telehealth, AI can create a telehealth 2.0 that captures not only speech and voice but facial emotion, blink rate, and motor activity. Multimodal approaches that combine these features with data from wearables or smartphones offer deep insights for diagnosis and outcome measurement. AI can thus deliver objective measures to a field that has been limited to subjective assessments.
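To make “speech velocity” concrete, here is a minimal sketch in Python of how one such objective measure might be computed from a timestamped transcript of a telehealth session, the kind of output many speech-to-text services produce. The `Segment` structure, the `speaking_rate_wpm` helper, and the example numbers are illustrative assumptions, not a validated clinical instrument.

```python
from dataclasses import dataclass

@dataclass
class Segment:
    text: str        # transcribed words in this segment
    start_s: float   # segment start time, in seconds
    end_s: float     # segment end time, in seconds

def speaking_rate_wpm(segments: list[Segment]) -> float:
    """Words per minute across the spoken (non-silent) portions of a session."""
    words = sum(len(seg.text.split()) for seg in segments)
    spoken_seconds = sum(seg.end_s - seg.start_s for seg in segments)
    if spoken_seconds <= 0:
        return 0.0
    return 60.0 * words / spoken_seconds

# Hypothetical contrast between slowed and pressured speech:
slow = [Segment("I just feel tired all the time", 0.0, 6.0)]
fast = [Segment("and then I started three projects and called everyone I know", 0.0, 3.5)]
print(speaking_rate_wpm(slow))  # ~70 wpm
print(speaking_rate_wpm(fast))  # ~189 wpm
```

In practice such a feature would be one column in a multimodal dataset, tracked within a patient over time rather than compared against a population cutoff, and combined with the facial, motor, and wearable signals described above.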
But what about therapy? If AI is already being used widely for therapy, is this good news or a worrisome trend? The answer depends on how you define therapy. The current literature already demonstrates how Gen-AI can deliver therapy and augment human-delivered therapy, especially for skill-building interventions like cognitive behavioral therapy or dialectical behavior therapy. These treatments are highly structured, dependent on homework, and have been delivered by digital means for decades. Gen-AI may make them more engaging and personalized, thus improving their efficacy.
What about psychotherapy for personal growth and change, not simply skill building? Can Gen-AI listen for what we are not saying, push back on our assumptions, and challenge us to focus on what we tend to avoid? The current foundational models are built to give us answers, especially the answers we want. The stories of bots helping patients with anorexia to lose weight or suicidal patients to self-harm show why the wide use of Gen-AI for “therapy” should give us pause. Indeed, the current models, based on affirmation and reinforcement, are precisely the opposite of what we need for psychotherapy, which should challenge a person to discover what they have not known about themselves and others. Theoretically, a new foundational model could be built to deliver psychotherapy, but this is no small task. The evolution from paper maps to GPS to autonomous vehicles may be a cautionary tale: tools that help a therapist navigate may prove far easier to build than replacements for the therapist altogether. (Disclosure: I am an advisor to Slingshot Health, which recently released Ash, a free therapy bot built from a bespoke LLM trained on high-quality therapy interviews.)
No less an expert than ChatGPT says, “AI cannot yet fully replace human therapists.” But clearly, judging from the Harvard Business Review survey, this horse (or self-driving car) has already left the barn. The question, then, is not whether we are ready for the autonomous vehicle but how to manage the widespread use of Gen-AI-based therapy in the absence of a regulatory framework or even basic standards of evidence. As a first step, we need a deeper understanding of what constitutes Gen-AI-based “therapy.” Is it a co-pilot for loneliness, a mindfulness exercise for stress, or a treatment for a mental illness? To the extent that Gen-AI is being used to treat mental illness, the problem could be analogous to the premature release of autonomous vehicles: accidents will happen and people will get hurt. To minimize the damage, autonomous vehicle companies have been diligent about restricting where and how their cars operate. We need the same sense of responsibility and accountability for therapy. That means guardrails, training, and oversight by clinical experts. Banning bots for diagnosis and treatment, as proposed in Illinois HB1806, is not the right answer.
The opportunity is inescapable. Mental healthcare needs disruption. For administrative tasks and diagnostic precision, AI can be the step function this field needs. For skill-building therapies, Gen-AI is already proving its value. But for deep psychotherapy, although Gen-AI could improve quality, reduce cost, and democratize care, the current foundational models carry real risks with unproven benefits. Will bespoke approaches, like Ash, be the first Waymo for mental health? Will the next generation of foundational models (GPT-5 was released earlier today) prove up to the tough task of deep psychotherapy? We should know the answers to both questions in the next few months.