The Mangroves Seeds of Change, LLC

Dangers of Using ChatGPT Instead of a Human Therapist

6/10/2025


At first glance, AI looks like a great way to improve mental health.  It’s free or very low cost, it’s available all day every day, and you don’t have to leave your bed to use it.  It doesn’t care if you’re awkward or if you had a shower that day.  You don’t even have to “people” to start helping yourself.  

These advantages make ChatGPT and its AI cousins very attractive, and it's true that they can be helpful tools for improving your mental health.  However, they should never replace real counseling with a human counselor or therapist (and if you ask ChatGPT, it will tell you the same).  Here are four of the reasons why:

Giving incorrect information

A 2024 study published in American Psychologist found that both AI voice assistants (e.g., Amazon Alexa, Apple Siri, Google Assistant, and Microsoft Cortana) and text-based chatbots (e.g., ChatGPT and the AI psychotherapist Woebot) “may unintentionally retrieve misinformation from human knowledge databases, confabulate responses on their own, or purposefully spread disinformation for political purposes” (Huang, Cheng, & Rajaram, 2024).  Users also report a mix of positive and negative experiences with AI as a mental health tool, citing accuracy and reliability errors, an inability to perform assessments, and a lack of understanding of moral and linguistic differences (Alanezi, 2024).

Creating false memories

The Huang study further showed that even when the AI bot warned participants ahead of time that its information might not be accurate, “77% of the socially misinformed words became the participants’ false memory” (Huang et al., 2024).  The results may differ across cultures: the study also found that “individuals from Asian cultures and the United States placed greater trust in robots than individuals from European cultures, and such trust is positively correlated with robot reliability or consistent ability” (Huang et al., 2024).  Either way, the human brain is more malleable than we realize, and it’s up to each of us to protect our own.

No privacy

Mental health professionals in the United States, like medical professionals, are required by law to keep your information confidential.  With very few exceptions, anything considered “individually identifiable health information” must be kept confidential and protected from data breaches (Summary of the HIPAA Privacy Rule, 2025).  On ChatGPT (and other AI platforms), by contrast, anything you say or type can be, and often is, used by others (OpenAI Privacy Policy, 2024).

No feelings, body language, etc.

Although AI can say things that sound empathetic, its responses are automatically generated and don’t reflect actual empathy (Rozental et al., 2018, as cited by Carlbring et al., 2023).  It also can’t tell when there is a disconnect between what a client says and what their body language shows they are actually feeling - something mental health professionals are trained to recognize and address.  This removes the most effective part of therapy, the therapeutic alliance, which is one of the best predictors of successful treatment (Wampold and Flückiger, 2023, as cited by Carlbring et al., 2023).

Despite the disadvantages, AI can be an effective tool to use in addition to your human counselor.  Some of our clients have used it to help them prepare their thoughts before a session.  Others have used it to practice challenging negative thoughts.  It can also help with task management, decision fatigue, or even figuring out what to make for dinner (one of our favorite recommendations for clients is Goblin.tools).

Even so, no bot can provide the expert, personalized treatment that a human professional counselor or therapist can - or the heart that cares about your well-being.  


Sources: 

Alanezi, F. (2024). Assessing the effectiveness of ChatGPT in delivering mental health support: A qualitative study. Journal of Multidisciplinary Healthcare, 17, 461–471. https://doi.org/10.2147/JMDH.S447368

Carlbring, P., Hadjistavropoulos, H., Kleiboer, A., & Andersson, G. (2023, April 11). A new era in Internet interventions: The advent of Chat-GPT and AI-assisted therapist guidance. PubMed Central, National Library of Medicine. Retrieved June 10, 2025, from https://pmc.ncbi.nlm.nih.gov/articles/PMC10235420/#bb0090

Huang, T.-R., Cheng, Y.-L., & Rajaram, S. (2024). Unavoidable social contagion of false memory from robots to humans. American Psychologist, 79(2), 285–298. https://doi.org/10.1037/amp0001230

OpenAI privacy policy. (2024, November 4). OpenAI. Retrieved June 10, 2025, from https://openai.com/policies/privacy-policy/

Rozental, A., Castonguay, L., Dimidjian, S., Lambert, M., Shafran, R., Andersson, G., & Carlbring, P. (2018). Negative effects in psychotherapy: Commentary and recommendations for future research and clinical practice. BJPsych Open, 4(4), 307–312. https://doi.org/10.1192/bjo.2018.42

Summary of the HIPAA privacy rule. (2025, March 14). U.S. Department of Health & Human Services, Health Information Privacy. Retrieved June 10, 2025, from https://www.hhs.gov/hipaa/for-professionals/privacy/laws-regulations/index.html#what

Heidi Jameson, LMHC, is the founder of The Mangroves Seeds of Change, LLC.  You can read more about her here.

Follow us on Facebook, X, BlueSky, and LinkedIn. 



