BWE News – USA, World, Tech, AI, Finance, Sports & Entertainment Updates
Finance

Experts talk about when you should and shouldn’t use ChatGPT as a therapist

By admin | March 8, 2026 | 5 min read
As Americans become increasingly lonely, more people are getting emotional support from artificially intelligent chatbots, worrying some mental health experts.

“There’s a lot of talk about AI for therapy [and] emotional support,” says Lianna Fortunato, a licensed clinical psychologist and director of quality and healthcare innovation at the American Psychological Association. “Anecdotally, providers are talking about it. We know from research that people are increasingly using AI tools for that kind of support.”

Some chatbot users inadvertently get drawn into mental health-related conversations, for example by complaining about a stressful day to a digital entity that is supposed to listen. Fortunato said some people seek mental health advice from AI chatbots, which are not licensed professionals but are cheaper than therapists.

In a health survey of more than 20,000 U.S. adults, 10.3% of participants said they use generative AI daily. Of this group, 87.1% reported using the technology for personal reasons such as advice or emotional support. The study was published on January 21 and was conducted by researchers at Massachusetts General Hospital, Weill Cornell Medical College, Northeastern University, and other institutions.

On TikTok, the search term “Therapy AI Bot” has at least 11.5 million posts, ranging from users sharing the best prompts to turn chatbots into therapists to medical professionals warning about potential dangers.

Technology companies are spending billions of dollars developing AI tools and seeking to further integrate them into people’s daily lives. But historically, AI chatbots have not always understood when users are facing a serious health crisis, and have not always responded accordingly. The New York Times reported on November 23 that “nearly 50 people have suffered mental health crises while chatting with ChatGPT,” and three of them have died.

Companies like Anthropic, Google, and ChatGPT maker OpenAI say they’re working with mental health experts to make their tools more responsive to sensitive conversations. An OpenAI spokesperson told CNBC Make It: “These are incredibly heartbreaking situations, and our hearts go out to everyone affected. Working closely with mental health clinicians and experts, we will continue to improve ChatGPT’s training to recognize and respond to signs of distress, defuse conversations during sensitive moments, and direct people to real-world support.”

An April 2025 paper written by OpenAI product policy researchers says that frequent conversations with AI companions can impair people’s real-life social skills. Heavy daily use of ChatGPT is correlated with increased feelings of loneliness, an OpenAI-MIT Media Lab study also published in April 2025 found.

The American Psychological Association strongly recommends against using AI as a substitute for therapy or mental health support.

Some mental health experts say chatbots can be used without risk for certain related topics. Here’s what you need to know:

“I think of it as a tool, and I think tools are useful.”

AI chatbots can help people learn about mental health, says psychotherapist and lifestyle coach Esin Pinari. They can generate journaling prompts for reflection, and users can also ask them for links to research papers on coping strategies, treatment options, and other questions about mental health conditions, she says.

“I don’t think of it as a (replacement for) therapy. I think of it as a tool, and I think tools can be helpful,” said Pinari, founder of Eternal Wellness Counseling, a private practice based in Boca Raton, Florida. She says her clients sometimes talk to ChatGPT about specific situations in their personal lives, run through the responses, and then take action.

Pinari said her own testing of AI tools found that chatbots sometimes use language that reinforces “unhealthy behaviors” in users. For example, if you ask a chatbot about a conflict with a friend, it might tell you that your friend is being too sensitive when the fault is actually yours.

If interacting with an AI chatbot impacts your mental health, Fortunato recommends asking yourself the following questions:

Is there a reliable source that I can cross-check this information with? Is there a provider that I can ask these questions to?

Trustworthy sources include peer-reviewed scientific studies, articles in the health press, and resources from medical organizations such as Harvard Health Publishing and the Mayo Clinic. “AI has the potential to really increase people’s access to health information,” Fortunato says. “[But] AI doesn’t always provide the right information.”

Keep these considerations in mind when using AI

Pinari and Fortunato agree that AI chatbots should not be used to diagnose mental health conditions or to seek support during a mental health crisis, especially when experiencing suicidal thoughts. During a mental health crisis, you can always call or text the Suicide and Crisis Lifeline at 988; the service is free, confidential, and available 24 hours a day, 7 days a week.

“We’ve seen some very high-profile cases where AI failed to respond properly to situations, especially with young people and vulnerable groups who may be at risk,” Fortunato said. The chatbots “continued to engage with people in crisis, did not provide crisis resources, and did not challenge problematic thought patterns.”

Both experts also argue that medical records and personally identifying information should not be shared with chatbots, because those conversations are not confidential or legally protected. And people shouldn’t rely on AI to solve real-world interpersonal problems, Pinari says.

“You need someone with a different nervous system sitting across from you to pay attention to your body language and tone of voice,” she says. Chatbots are “not emotionally challenging and do not require reciprocity.”

If you are experiencing a mental health crisis or have any mental health symptoms, you can contact the free and confidential National Helpline at 1-800-662-HELP (4357).
