BWE News – USA, World, Tech, AI, Finance, Sports & Entertainment Updates
AI

Humanity users face new choices – opt out or share your chat for AI training

By admin | September 2, 2025


Anthropic is making some big changes to how it handles user data, requiring all Claude users to decide by September 28 whether they want their conversations used to train its AI models. When we asked the company what prompted the move, it pointed us to its blog post on the policy changes, but we have some theories of our own.

But first, what’s changing: previously, Anthropic didn’t use consumer chat data for model training. Now, the company wants to train its AI systems on user conversations and coding sessions, and it says it’s extending data retention to five years for those who don’t opt out.

That is a massive update. Previously, users of Anthropic’s consumer products were told that their prompts and conversation outputs would be automatically deleted from Anthropic’s back end within 30 days “unless legally or policy-required to keep them longer,” or that inputs flagged as violating its policies might be retained for up to two years.

By “consumer,” we mean that the new policies apply to Claude Free, Pro, and Max users, including those using Claude Code. Business customers using Claude Gov, Claude for Work, Claude for Education, or API access will be unaffected.

So why is this happening? In its post about the update, Anthropic frames the changes around user choice, saying that by not opting out, users will “help us improve model safety, making our systems for detecting harmful content more accurate and less likely to flag harmless conversations.” Users will also “help future Claude models improve at skills like coding, analysis, and reasoning, ultimately leading to better models for all users.”

In short, help us help you. But the full truth is probably a little less selfless.

Like every other large language model company, Anthropic needs data more than it needs people to have fuzzy feelings about its brand. Training AI models requires vast amounts of high-quality conversational data, and tapping into millions of Claude interactions should provide exactly the kind of real-world content that can improve Anthropic’s competitive positioning against rivals like OpenAI and Google.


Beyond the competitive pressures of AI development, the changes also appear to reflect broader industry shifts in data policies, as companies like Anthropic and OpenAI face increasing scrutiny over their data retention practices. OpenAI, for example, is currently fighting a court order that forces it to retain all consumer ChatGPT conversations indefinitely, including deleted chats, because of a lawsuit filed by The New York Times and other publishers.

In June, OpenAI COO Brad Lightcap called this “a sweeping and unnecessary demand” that “fundamentally conflicts with the privacy commitments we have made to our users.” The court order affects ChatGPT Free, Plus, Pro, and Team users, though customers with zero data retention agreements are still protected.

What’s alarming is how much confusion all of these changing usage policies are creating for users, many of whom remain oblivious to them.

In fairness, everything is moving quickly, so as the technology changes, privacy policies are bound to change. But many of these changes are fairly sweeping and mentioned only fleetingly amid the companies’ other news. (You wouldn’t think Tuesday’s policy changes for Anthropic users were very big news based on where the company placed the update on its press page.)

Image Credits: Anthropic

But many users don’t realize that the guidelines they agreed to have changed, because the design practically guarantees it. Most ChatGPT users keep clicking “delete” toggles that aren’t technically deleting anything. Meanwhile, Anthropic’s implementation of its new policy follows a familiar pattern.

How so? New users will choose their preference during sign-up, but existing users face a pop-up with “Updates to Consumer Terms and Policies” in large text and a prominent black “Accept” button.

As The Verge observed earlier today, the design raises concerns that users might quickly click “Accept” without realizing they are agreeing to data sharing.

Meanwhile, the stakes for user awareness couldn’t be higher. Privacy experts have long warned that the complexity surrounding AI makes meaningful user consent nearly unattainable. Under the Biden administration, the Federal Trade Commission even stepped in, warning that AI companies risk enforcement action if they surreptitiously change their terms of service or privacy policies, or bury disclosures in fine print.

Whether the commission, now operating with just three of its five commissioners, still has an eye on these practices today is an open question, and one we have put directly to the FTC.



