
The West Virginia Attorney General has filed a consumer protection lawsuit against Apple, alleging that the company failed to prevent child sexual abuse material from being stored and shared via iOS devices and iCloud services.
Republican Attorney General John “JB” McCuskey criticized Apple for putting its privacy branding and business interests above the safety of children, saying other big tech companies such as Google, Microsoft, and Dropbox have been more proactive in combating such material using systems like PhotoDNA.
Developed in 2009 by Microsoft and Dartmouth College, PhotoDNA uses “hashing and matching” to automatically identify and block child sexual abuse material (CSAM) images that have already been identified and reported to authorities.
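For illustration only, a minimal sketch of the general hash-and-match idea is below. It is not PhotoDNA itself, which uses a proprietary perceptual hash that tolerates resizing and re-encoding; here a cryptographic hash and a placeholder hash database stand in to keep the example self-contained.

```python
import hashlib

# Placeholder database of fingerprints for images that have already been
# identified and reported to authorities (hypothetical values).
KNOWN_CSAM_HASHES = {
    "3a7bd3e2360a3d29eea436fcfb7e44c735d117c42d1c1835420b6b9942dd4f1b",
}

def image_hash(image_bytes: bytes) -> str:
    """Compute a fingerprint of the image content.

    Real systems use a perceptual hash so near-duplicates still match;
    SHA-256 is used here only to keep the sketch runnable.
    """
    return hashlib.sha256(image_bytes).hexdigest()

def should_block(image_bytes: bytes) -> bool:
    """Return True if the image matches a known, previously reported hash."""
    return image_hash(image_bytes) in KNOWN_CSAM_HASHES

if __name__ == "__main__":
    # An upload pipeline would run this check before storing each file.
    sample = b"example image bytes"
    print("block upload:", should_block(sample))
```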
In 2021, Apple was testing its own CSAM detection feature, which could automatically detect known images of child exploitation uploaded to iCloud in the U.S. and report them to the National Center for Missing & Exploited Children.
But the company withdrew its plans for the feature after criticism from privacy advocates, who worried that the technology could be tweaked and abused to create a backdoor for government surveillance or to censor other types of content on iOS devices.
The company’s efforts since then have not satisfied a wide range of critics.
In 2024, the United Kingdom-based watchdog National Society for the Prevention of Cruelty to Children said Apple was not properly monitoring, aggregating, and reporting CSAM in its products to authorities.
And in a 2024 lawsuit filed in the Northern District of California, thousands of child sexual abuse survivors sued Apple, claiming that the company should never have abandoned its early plans for CSAM detection and that by allowing such material to spread online, it caused victims to relive their trauma.
Apple has positioned itself as the most privacy-conscious Big Tech company ever since CEO Tim Cook wrote an open letter on the subject in 2014.
If the West Virginia lawsuit is successful, the company could be forced to change its product design and data security practices. The state is seeking statutory and punitive damages, as well as injunctive relief to force Apple to implement effective CSAM detection.
“Protecting the safety and privacy of our users, especially children, is at the heart of what we do,” an Apple spokesperson told CNBC in an emailed statement.
The company cited features like Parental Controls and Communication Safety, which “automatically intervenes on your child’s device when nudity is detected in messages, shared photos, AirDrops, and even live FaceTime calls” as evidence of its commitment to providing “safety, security, and privacy” to its users.
“We continue to innovate every day to combat evolving threats and maintain the safest and most trusted platform for children,” the spokesperson added.
–CNBC’s Kif Leswing contributed reporting

