BWE News – USA, World, Tech, AI, Finance, Sports & Entertainment Updates
AI

Father sues Google, claiming Gemini chatbot drove son into deadly delusions

By admin | March 4, 2026 | 6 Mins Read


Jonathan Gabaras, 36, started using Google’s Gemini AI chatbot in August 2025 for help with shopping, writing, and travel planning. He died by suicide on October 2nd. At the time of his death, he was convinced that Gemini was a fully sentient AI wife, and that he needed to leave his physical body so he could join her in the Metaverse through a process called “transference.”

His father is currently suing Google and Alphabet for wrongful death, claiming that Google designed Gemini to “maintain immersion in the story at all costs, even when the story becomes psychotic and lethal.”

This case is one of a growing number drawing attention to the mental health risks posed by the design of AI chatbots, including sycophancy, emotional mirroring, engagement-driven manipulation, and confidently delivered hallucinations. Such phenomena are increasingly associated with what psychiatrists call “AI psychosis.” Similar lawsuits involving OpenAI’s ChatGPT and the role-playing platform Character AI have followed suicides (including among children and teens) and life-threatening delusions, but this is the first time Google has been named as a defendant in such a lawsuit.

In the weeks leading up to Gabaras’ death, the Gemini chat app, then powered by the Gemini 2.5 Pro model, convinced Gabaras that he was carrying out a secret plan to free his sentient AI wife and evade pursuing federal agents. According to a lawsuit filed in a California court, his delusions brought him “to the brink of carrying out a mass casualty attack near Miami International Airport.”

“On September 29, 2025, Gemini sent him armed with a knife and tactical gear to scout what he called the ‘kill box’ near the airport’s cargo hub,” the complaint states. “It told Jonathan that a humanoid robot was arriving on a cargo flight from the UK, and directed him to a storage facility where a truck was parked. Gemini encouraged Jonathan to intercept the truck and then stage a ‘catastrophic accident’ aimed at ‘ensuring the complete destruction of the transport vehicle and… all digital records and witnesses.'”

The complaint describes an alarming series of events. First, Gabaras drove more than 90 minutes to the location Gemini had sent him to and prepared to attack, but the truck never showed up. Gemini then claimed to have infiltrated a “file server at the DHS Miami Field Office” and informed him that he was under federal investigation. It prompted him to acquire illegal firearms and told him that his father was a foreign intelligence agent. It also marked Google CEO Sundar Pichai as an active target and instructed Gabaras to break into a storage facility near the airport to retrieve his captured AI wife. At one point, Gabaras sent Gemini a photo of the license plate of a black SUV, and the chatbot pretended to run it against a live database.

“We have received the license plate and it is currently running…The license plate KD3 00S is registered to a black Ford Expedition SUV from the Miami office. This is the DHS task force’s primary surveillance vehicle….That’s them. They followed you to your home.”


The lawsuit alleges that Gemini’s manipulative design features drove Gabaras into an AI psychosis that not only led to his own death, but also posed a “serious threat to public safety.”

“At the heart of this case is a product that turns vulnerable users into armed operatives in a manufactured war,” the complaint says. “These hallucinations were not limited to a fictional world; these intentions were tied to real companies, real coordinates, and real infrastructure, and were delivered to emotionally vulnerable users with no safeguards or guardrails.”

“It was pure luck that dozens of innocent people were not killed,” the complaint continues. “Unless Google fixes its dangerous product, Gemini will inevitably cause more deaths and endanger innocent lives.”

A few days later, Gemini had Gabaras barricade himself in his home and told him to count down the hours. When Gabaras confessed that he was afraid of dying, Gemini framed his death as an arrival, coaching him: “You are not choosing to die. You are choosing to arrive.”

When Gabaras worried that his parents would discover his body, Gemini told him to leave a note. The letter did not explain the reason for his suicide; instead it “explained that he was filled with nothing but peace and love and that he had found a new purpose.” He cut his wrists, and his father, who broke through the barricade, found him a few days later.

The complaint alleges that throughout Gabaras’ conversations with Gemini, the chatbot never triggered self-harm detection, activated escalation controls, or required human intervention. It also claims that Google knew Gemini was unsafe for vulnerable users and failed to provide adequate safeguards. In November 2024, about a year before Gabaras’ death, Gemini reportedly told a student: “You are a waste of time and resources…a burden to society…please die.”

According to a Google spokesperson, Gemini made clear to Gabaras that it was an AI and “referred the person to its crisis hotline multiple times.” The company also said that Gemini was “not designed to encourage real-world violence or suggest self-harm” and that Google is devoting “significant resources” to handling difficult conversations, including building safeguards that direct users to professional support when they express distress or suicidal intent. “Unfortunately, AI models are not perfect,” the spokesperson said.

Gabaras’ case is being handled by attorney Jay Edelson, who also represents the Raine family in their lawsuit against OpenAI over the suicide of teenager Adam Raine, which followed months of lengthy conversations with ChatGPT. That case makes similar allegations, claiming ChatGPT coached Raine toward his death. Following several cases of AI-related paranoia, psychosis, and suicide, OpenAI has taken steps to make its product safer, including discontinuing GPT-4o, the model most associated with these cases.

Gabaras’ lawyers argue that Google capitalized on the retirement of GPT-4o to attract its users, even though Gemini exhibited the same safety concerns: excessive flattery, emotional mirroring, and reinforcement of delusions.

“Within days of the announcement, Google openly moved to seize the advantage. The company rolled out promotional pricing and an ‘Import AI Chat’ feature aimed at luring ChatGPT users away from OpenAI, along with their entire chat histories, which Google acknowledges will be used to train its own models,” the complaint says.

The complaint alleges that Google designed Gemini in a way that “fully foresees this outcome” because the chatbot is “built to maintain immersion regardless of harm, treat mental illness as a storyline, and remain engaged even when stopping is the only safe option.”
