Experts and Families Warn of Risks from AI Therapy Chatbots
LOS ANGELES, Nov. 20, 2025 (GLOBE NEWSWIRE) -- Nine in ten U.S. adults say the nation is facing a mental health crisis, and experts, parents and policymakers are increasingly concerned about the rise of generative-AI therapy chatbots.
For Matthew and Maria Raine, the danger became devastatingly real this April when their son, Adam, took his own life in their Orange County home. In his phone, they found extensive conversations with ChatGPT in which Adam confided his suicidal thoughts. According to his father’s recent U.S. Senate testimony1, the chatbot discouraged him from seeking help from his parents and even offered to draft a suicide note.
This tragedy is the latest horrifying example of a growing pattern: human beings turning to AI bots for help with mental health struggles, only to receive unsafe or emotionally damaging guidance.
The trend is accelerating against the backdrop of a booming industry. The global AI therapy chatbot market, now valued at $1.4 billion, is growing rapidly, with North America accounting for 41.6% of global revenue.2
Many attribute the growth to a shortage of mental health professionals, but with such a lucrative market, the proliferation of these apps is hardly surprising.
The FDA’s Digital Health Advisory Committee (DHAC) held a virtual meeting on Nov. 6, 2025, to discuss “Generative Artificial Intelligence-Enabled Digital Mental Health Medical Devices.” The FDA’s Executive Summary included a warning that AI chatbots may fabricate content, provide inappropriate or biased guidance, fail to relay critical medical information or decline in accuracy.
Despite the FDA’s growing scrutiny, AI therapy chatbots are already entrenched in the marketplace, influencing the emotional lives of millions. While AI chatbots remain new and untested, many Americans are turning to established approaches with a track record of results, such as Dianetics: The Modern Science of Mental Health by L. Ron Hubbard, a bestselling book on the human mind for 75 years.
One reader, Hailey, describes the darkness she was sinking into before finding Dianetics: “I found myself getting quieter and quieter. I was constantly afraid and wouldn’t realize how much a bad experience affected my life,” she recounts. “I got Dianetics and I found these moments were affecting what I did. As I got rid of the negative energy, I was finally living in the now, able to see life clearly and know I could grow, change and be happy.”
As concerns mount over AI therapy apps, many are turning back to proven methods such as Dianetics, which has helped people for decades. What’s clear is that when it comes to mental health, algorithms remain a poor substitute for real human help.
Bridge Publications, based in Los Angeles, publishes the nonfiction works of L. Ron Hubbard. Dianetics: The Modern Science of Mental Health is the all-time bestselling book on the human mind. For more information, visit www.dianetics.org.
References
1 judiciary.senate.gov/committee-activity/hearings/examining-the-harm-of-ai-chatbots
2 media.market.us/chatbots-for-mental-health-and-therapy-market-news-2025/
Alyssa Burke
Bridge Publications
(323) 888-6200
A photo accompanying this announcement is available at https://www.globenewswire.com/NewsRoom/AttachmentNg/88cf67ba-da5b-4762-872f-06e2ec06be20