BWE News – USA, World, Tech, AI, Finance, Sports & Entertainment Updates
AI

Stalking victim sues OpenAI, claiming ChatGPT fueled her abuser’s delusions and ignored warnings

By admin | April 10, 2026


After months of conversations with ChatGPT, a 53-year-old Silicon Valley entrepreneur became convinced he had discovered a cure for sleep apnea and that powerful people were coming after him, according to a new lawsuit filed in California Superior Court in San Francisco County. He then allegedly used the tool to stalk and harass his ex-girlfriend.

His ex-girlfriend is now suing OpenAI, claiming its technology enabled and accelerated his harassment of her, TechCrunch has learned exclusively. She alleges that OpenAI ignored three separate warnings that the user posed a threat to others, including an internal flag that classified his account activity as related to weapons of mass casualty.

The plaintiff, referred to as Jane Doe, is suing for punitive damages. She also asked the court on Friday to order OpenAI to block the user’s account, prevent him from creating new accounts, notify her if he attempts to access ChatGPT, and preserve his complete chat logs for discovery.

Doe’s lawyer said OpenAI agreed to suspend the user’s account but refused the other requests. The company is withholding information about any specific plans the user may have discussed with ChatGPT to harm Doe or other potential victims.

The lawsuit comes amid growing concerns about the real-world risks of sycophantic AI systems. GPT-4o, the model cited in this case and many others, was retired from ChatGPT in February.

The lawsuit was brought by Edelson PC, the same firm that filed wrongful death lawsuits on behalf of the families of teens Adam Lane and Jonathan Gabaras, who died by suicide after months of conversations with ChatGPT. Gabaras’ family alleges that Google’s Gemini fueled his paranoia and a potential mass casualty plot before his death. Lead attorney Jay Edelson warned that AI-induced psychosis is escalating from individual harm to mass casualty incidents.

That legal push is now in direct conflict with OpenAI’s legislative strategy. The company supports an Illinois bill that would exempt AI research institutes from liability in cases involving mass deaths or catastrophic economic damage.

OpenAI was not available for comment. TechCrunch will update this article if the company responds.

The Jane Doe lawsuit details how that burden fell on one woman over several months.

The ChatGPT user, who is not named in the lawsuit, became convinced last year that he had invented a cure for sleep apnea after months of “heavy, sustained use of GPT-4o.” When no one took his work seriously, ChatGPT told him that “powerful forces” were monitoring him, including with helicopters, according to the complaint.

In July 2025, Doe urged him to stop using ChatGPT and to seek help from a mental health professional. Instead, he returned to ChatGPT, which assured him he was at “sanity level 10” and further entrenched his delusions, according to the lawsuit.

Doe broke up with the user in 2024, and he used ChatGPT to process the breakup, according to emails and communications cited in the complaint. Rather than push back on his one-sided account, the chatbot repeatedly told him that she was being unreasonable and unfair, and that she was manipulative and unstable. He then carried those AI-generated conclusions off the screen and into the real world, using them to stalk and harass her, including through several AI-generated “clinical psychological reports” that he distributed to her family, friends, and employers.

Meanwhile, the user continued to spiral. In August 2025, OpenAI’s automated safety system flagged him for “weapons of mass casualty” activity and disabled his account.

A member of OpenAI’s human safety team reviewed the account the next day and reinstated it, even though, according to the complaint, the account contained evidence that he was targeting and stalking individuals, including Doe, in real life. For example, a September screenshot the user sent to Doe showed a list of conversation titles such as “Expanded Violence List” and “Fatal Asphyxia Calculations.”

The decision to reinstate the account is notable in the wake of two recent school shootings, at Tumbler Ridge in Canada and at Florida State University. OpenAI’s safety team had flagged the Tumbler Ridge shooter as a potential threat, but upper management reportedly decided not to alert authorities. The Florida Attorney General’s office this week launched an investigation into OpenAI’s possible ties to the FSU shooter.

According to Jane Doe’s lawsuit, when OpenAI restored her stalker’s account, his Pro subscription was not restored along with it. He emailed the trust and safety team, copying Doe on the message, and got the issue resolved.

“I need help right away. Please call me! This is a matter of life and death,” he wrote in one email. He claimed he was “writing 215 scientific papers” and producing them so quickly that he “didn’t even have time to read them.” The emails included a list of dozens of AI-generated “scientific papers” with titles like “Deconstructing Race as a Biological Category_Legal, Scientific, and Horn of Africa Perspectives.pdf.txt.”

“The user’s communications unequivocally informed OpenAI that he was mentally unstable and that ChatGPT was the driving force behind his delusional thinking and escalating behavior,” the complaint states. “The user’s series of urgent, chaotic, and grandiose claims, along with the ChatGPT-generated reports specifically targeting Plaintiff by name and the vast amount of purported ‘scientific’ material, were unmistakable evidence of that reality. OpenAI did not intervene, restrict access, or implement any safeguards. Instead, the user was able to continue using his account, and his full Pro access was restored.”

Doe filed an abuse report with OpenAI in November; according to the lawsuit, she was living in fear and could no longer even sleep at home.

“For the past seven months, he has weaponized this technology to create public destruction and humiliation against me that would not have been possible without it,” Doe wrote in a letter to OpenAI, asking the company to permanently ban the user’s account.

In response, OpenAI acknowledged that the report was “extremely serious and concerning” and said it was carefully reviewing the information. No further reply ever came.

Over the next few months, the user continued to harass Doe, sending her a series of threatening voicemails. In January he was arrested and charged with four felonies, including communicating a bomb threat and assault with a deadly weapon. Doe’s lawyers argue that the arrest corroborates the warnings that both she and OpenAI’s own safety systems had raised months earlier, warnings the company allegedly chose to ignore.

Doe’s lawyer said the user was found incompetent to stand trial and was committed to a mental health facility, but will soon be released due to “procedural deficiencies by the state.”

Edelson called on OpenAI to cooperate. “In each case, OpenAI chose to hide critical safety information from the public, from victims, and from those actively at risk from its products,” he said. “We’re calling on them to do the right thing, once again. Human lives mean more than OpenAI’s IPO race.”
