Have you ever used an AI for
mental health support?
by Diane Hu
Communication Studio, Fall 2025
Carnegie Mellon University
More and more people are sharing their most private fears, insecurities, and anxieties, not with another person, but with an AI chatbot.
This has become something deeply personal:
People turning to algorithms for comfort, connection, and advice, even though these systems were never designed - or certified - for mental health care.
A recent survey by the nonprofit Sentio Marriage and Family Therapy Program and Sentio Counseling Center reveals a potentially paradigm-shifting trend:
49% of LLM users who self-report an ongoing mental health condition use LLMs for therapeutic support.
The goals and purposes for using an LLM for mental health support vary.
Respondents who use LLM tools for mental health support reported doing so to manage anxiety, cope with depression, improve mood, gain insight into their emotions, practice communication skills, receive advice on personal issues, feel less lonely, or for other reasons.
[Chart: “What were your main goals or purposes for using an LLM for mental health support?” Share of LLM users citing each goal: managing anxiety, receiving advice on personal issues, coping with depression, gaining insight into emotions, improving mood, practicing communication skills, and feeling less lonely.]
Looking at the mental health symptoms or issues people
have sought help with paints a clearer picture.
Common mental health issues addressed included anxiety, depression, stress, relationship issues, low self-esteem, trauma, grief, addiction, eating disorders, and thoughts of self-harm.
[Chart: “Which mental health symptoms or issues did you use an LLM to help with?” Share of LLM users citing each issue: anxiety, depression, relationship issues, low self-esteem, trauma, grief, and thoughts of hurting themselves.]
More alarming still, people have used LLM tools
during a mental health crisis.
In the past 12 months, respondents reported having used an LLM during a mental health crisis, including suicidal thoughts or ideation, a severe depressive episode, intense anxiety or a panic attack, acute emotional distress following a traumatic event, self-harm urges or behaviors, a relationship or interpersonal crisis, a substance use or withdrawal crisis, threats of violence, or someone trying to coerce or control them.
[Chart: “Have you used an LLM during a mental health crisis?” Share of LLM users reporting each type of crisis: relationship or interpersonal crisis, intense anxiety or panic attack, acute emotional distress following a traumatic event, suicidal thoughts or ideation, self-harm urges or behaviors, and substance use or withdrawal crisis.]
These are not casual check-ins; they are moments of crisis, vulnerability, and deep emotional need.
Why would people turn to an LLM rather than a human therapist?
And does it actually help?
Among the 243 respondents who had used an LLM for mental health support or therapy-related goals in the past year, the top motivations were strikingly simple.
90.1% said, “It’s accessible anytime I need it.”
70.4% said, “It was free or low-cost compared with therapy.”
58.9% said, “I felt it could provide quick answers or relief.”
50.6% said, “I was curious about how it could help.”
46.5% said, “I wanted to remain anonymous.”
29.2% said, “I did not have access to a human therapist.”
21% said, “I prefer interacting with an AI over a human.”
As for how helpful LLM products are for supporting mental health, views were mixed but generally positive.
Overall, 82.3% of respondents reported that LLM use had improved their mental health or well-being. For example, one participant reported,
“Last week I needed to understand why I randomly cry sometimes during episodes and LLM gave me clarity and a better understanding that crying is accepted and okay. It’s how we vent to feel through and heal through life altering events.”
The perceived benefit was further reflected in participants’ ratings of specific aspects of LLM helpfulness, with particularly strong positive evaluations for emotional support (74.9% rating as helpful or very helpful), practical advice (86.8% rating as helpful or very helpful) and crisis management (56% rating as helpful or very helpful). For example, one participant reported,
“I found it extremely helpful when my son was hospitalized last year with severe OCD. I felt overwhelmed by the whole situation and asked for advice on how to manage my stress and what I could do to help him deal with his issues as well. I found it helpful to have a neutral third party of sorts to bounce ideas off of and it was helpful just as an overall distraction from worry getting out of control.”
For overall satisfaction, 15.6% of participants found the effect neutral or complicated, with one writing,
“There have been a couple instances of the LLM providing responses regarding mental health that are too generic and aren’t particularly helpful to my situation.”
Still, 2% of the users who had turned to LLM tools for mental health support found them unhelpful or very unhelpful overall (7.8% rated them unhelpful for emotional support, 3.3% for practical advice, and 7% for crisis management). Among them, one participant reported,
“One time when I was in a depressive episode, I asked for coping strategies and not only got the usual ‘go outside, eat healthy, workout’ etc. advice that I have obviously tried, but it overwhelmed me with information and I didn’t want to read any of it.”
LLMs are not built for therapy.
Yet people continue to use them — out of curiosity, accessibility, or the simple wish to feel heard.
There are both opportunities and dangers in this.
On one hand, these tools can make emotional support more approachable, help people articulate their feelings, and even motivate them to seek real help.
On the other, they can give a false sense of safety, offer misleading advice, or deepen isolation when users need human care the most.
If we cannot stop people from using AI for mental support, how might we guide this use instead?
How can designers, researchers, and policymakers define the boundaries of AI’s role in emotional wellbeing, not as a replacement for therapy, but as a companion that supports responsible healing?
Perhaps the question is not whether we should or shouldn’t talk to machines,
but how we can talk to them, and design them, responsibly.
Reference:
- Rousmaniere, T., Zhang, Y., Li, X., & Shah, S. (2025). Large language models as mental health resources: Patterns of use in the United States. Practice Innovations. Advance online publication. https://doi.org/10.1037/pri0000292
- ChatGPT may be the largest provider of mental health support in the United States. (2025, March 18). Sentio University. https://sentio.org/ai-research/ai-survey
- Nadeem, R. (2025, September 17). How Americans View AI and Its Impact on People and Society. Pew Research Center. https://www.pewresearch.org/science/2025/09/17/how-americans-view-ai-and-its-impact-on-people-and-society/
- Luo, X., Ghosh, S., Tilley, J. L., Besada, P., Wang, J., & Xiang, Y. (2025). "Shaping ChatGPT into my Digital Therapist": A thematic analysis of social media discourse on using generative artificial intelligence for mental health. Digital health, 11, 20552076251351088. https://doi.org/10.1177/20552076251351088
- Quiroz-Gutierrez, M. (2025, June). Gen Z is increasingly turning to ChatGPT for affordable on-demand therapy, but licensed therapists say there are dangers many aren’t considering. Fortune. https://fortune.com/2025/06/01/ai-therapy-chatgpt-characterai-psychology-psychiatry/