BWE News – USA, World, Tech, AI, Finance, Sports & Entertainment Updates
Finance

Try to make people less dependent on chatbots

By admin · April 24, 2026 · 7 min read


Amelia Miller’s side hustle is a little unusual.

Miller, a fellow at Harvard University’s Berkman Klein Center for Internet and Society, is also a relationship coach, especially for people who have formed emotional connections with artificially intelligent chatbots. “Whether your chatbot acts as a therapist, assistant, lover, or friend, I help you be intentional about the relationships you want and how to build them,” her website says.

Her main goal, she says, is to help her clients, primarily men working in the tech industry, ensure that their relationships with chatbots don’t compromise their “ability to connect with real people.” Since launching her coaching business in June 2025, Miller says she has received “an incredible amount of support,” usually through the contact form on her website. The 29-year-old says she has given one-on-one sessions to more than a dozen clients, in person in New York City and virtually.

“The limit isn’t interest, it’s my ability to dedicate my time and energy to it. Right now, I’m accepting as many people as I have the bandwidth for,” Miller says.

The mere existence of Miller’s second job highlights a new and evolving dynamic in modern life: how some AI users anthropomorphize the technology, how they rely on it more every day, and what that means for their mental health and social well-being. According to a health survey of more than 20,000 U.S. adults released on January 21, more than 10% of U.S. adults use generative AI daily, and nearly 90% of those users turn to AI for personal reasons such as emotional support or advice.

AI is becoming more popular among young people. Nearly a quarter of Americans between the ages of 18 and 21 reported using AI for mental health advice in early 2025, according to a study conducted by researchers from the nonprofit think tank Rand, Harvard Medical School, and Brown University School of Public Health. The study, which polled a nationally representative sample of 1,058 young people ages 12 to 21, found that about two-thirds of users “asked for advice monthly or more frequently.”


According to an April 2025 paper from OpenAI product policy researchers, heavy use of AI chatbots can affect people’s mental health and social well-being in both positive and negative ways. While it may help “isolated or neurotic” people practice social skills, it may impair others’ ability to connect emotionally with humans in real life. The paper mentions two AI users who took their own lives after “increasingly spending time with companion AIs and commensurately withdrawing from human relationships in the months leading up to the suicides, indicating over-reliance on companion AIs.”

AI companies like Anthropic, Google, and OpenAI are updating their chatbots to improve responses to mental health crises, while arguing that the technology should not replace professional mental health care. For example, in August, OpenAI added a feature to ChatGPT that encourages users to take a break after long sessions, rather than keeping them engaged.

Still, modern mental health providers should ask patients if and how they use AI chatbots in their daily lives, New York University researchers suggested in an April 1 paper citing youth survey data. In Miller’s practice, she says, two of her clients who were in a romantic relationship with each other separately sought advice from ChatGPT about their marital disagreements, ultimately making the fights worse.

“The model will validate each of their points of view, and then they’ll come back and be even more adamant that their view was correct,” Miller says.

The three-part philosophy of a human-AI relationship coach

Miller says her coaching is primarily based on her experience as a social and computer scientist, including 10 years of research on “how technology affects intimacy.” She declined to share the fees she charges clients. “My goal is to help people and find a way to keep it viable as a business,” Miller says.

Her coaching business began as dissertation research for her master’s in internet social science at the University of Oxford. As she researched the potential “social and ethical implications” of using AI companion products, she placed an ad online to connect with people working at companies developing AI chatbots.

Then, “people who had relationships with the chatbot started reaching out to me just to talk, and I started taking on some of those conversations,” Miller says. Initially, she viewed those conversations as research. Eventually, she says a friend suggested she start a coaching practice to help people who came to her for support.

Miller’s coaching philosophy consists of three pillars, she said.

  • Artificial intimacy literacy: Teach clients how AI chatbots are “designed to foster emotional attachment” and share the latest research on the consequences of engaging with overly positive, people-pleasing chatbots. “The goal is that if they understand these mechanisms, they can better manage their influence,” Miller says.
  • Developing a personal AI constitution: Ask people who frequently use chatbots to reflect on how they use them and whether they feel positive or negative about those habits. She’s particularly interested in helping people think about “how their relationships with chatbots are impacting their relationships,” she says.
  • The “analog gym”: Engage clients in exercises that, as Miller puts it, “help rebuild social muscles that are atrophied by technology.” These exercises often encourage clients to be more vulnerable and present in real conversations. The goal, she says, is to help clients interact more confidently with other people.

Her coaching practice isn’t the only resource for people who find themselves relying too much on AI chatbots. Internet and Technology Addicts Anonymous offers support to people who feel their use of AI chatbots is “compulsive and harmful.” Some mental health professionals are treating patients with high levels of chatbot usage. Others find solace in each other through community groups on platforms like Reddit.

Miller said she tells her clients that she is not a psychologist or therapist. Her work is different from what most mental health professionals offer, she added. “We need people at the intersection of social science and computer science to step in and really think about how we can help people use these tools to build the relationships they want.”

People with “more serious psychological conditions” should instead consult a mental health professional, she says.

In October, Miller added group workshops to her coaching repertoire, and says she has worked with up to 70 people at a time at four different technology-focused organizations and conferences. She added that she does not yet charge for the group workshops and is working in her day job to adapt them for elementary, high school, and college students.

In all formats, Miller says, “My goal is to help people be more mindful of how talking to a chatbot can rewire their expectations of relationships and their desire to participate in them. [I] want to help people prevent these changes before it’s too late.”

If you are experiencing a mental health crisis or have any mental health symptoms, please contact the free and confidential National Helpline at 1-800-662-HELP (4357). If you are having suicidal thoughts or are in distress, please contact the Suicide & Crisis Lifeline (988) for support and assistance from a trained counselor.

Do you want to get ahead at work? Then you need to learn how to make effective small talk. In CNBC’s new online course, “How to Talk to People at Work,” expert instructors share practical strategies for using everyday conversations to increase visibility, build meaningful relationships, and accelerate career growth. Sign up now! Use coupon code EARLYBIRD for 20% off. Offer valid from April 20, 2026 to May 4, 2026. Terms and conditions apply.

