West Virginia’s Attorney General filed a consumer protection lawsuit against Apple, alleging the company failed to prevent child sexual abuse material (CSAM) from being stored and shared via iOS devices and iCloud.
Republican AG John “JB” McCuskey accuses Apple of putting privacy branding and its own business interests ahead of child safety.
The big picture: The lawsuit claims that other tech companies, including Google, Microsoft, and Dropbox, use tools such as PhotoDNA to identify and block CSAM far more proactively.
- PhotoDNA, developed by Microsoft in 2009, uses “hashing and matching” to automatically detect known CSAM images and prevent their distribution (a simplified sketch of the hash-and-match idea follows this list).
- In 2021, Apple tested its own CSAM-detection features, designed to automatically find and remove child exploitation images on iCloud and report them to the National Center for Missing & Exploited Children.
- Apple withdrew those plans after privacy advocates raised concerns that such technology could enable government surveillance or censorship.
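For context on what “hashing and matching” means in practice, here is a minimal, hypothetical sketch of the general flow: compute a fingerprint of an image file and check it against a database of fingerprints of previously identified material. This is not PhotoDNA itself, whose perceptual hash is proprietary and designed to survive resizing and minor edits; the sketch substitutes an exact SHA-256 digest, and the hash set and function names are placeholders.

```python
# Illustrative sketch only: a generic hash-and-match lookup, NOT PhotoDNA.
# PhotoDNA uses a proprietary perceptual hash that tolerates resizing and
# minor edits; this example uses an exact SHA-256 digest purely to show
# the flow. The hash set below is a made-up placeholder.
import hashlib

KNOWN_HASHES = {
    "0" * 64,  # placeholder; real systems load hashes curated by groups such as NCMEC
}

def hash_image(path: str) -> str:
    """Return the SHA-256 hex digest of the file at `path`."""
    digest = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(8192), b""):
            digest.update(chunk)
    return digest.hexdigest()

def is_known_image(path: str) -> bool:
    """True if the file's fingerprint matches an entry in the known-hash set."""
    return hash_image(path) in KNOWN_HASHES
```

In hash-matching systems of this kind, detection is limited to images that have already been identified and fingerprinted; novel content is not flagged.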
Zoom out: A separate lawsuit filed in California in 2024 by thousands of child sexual abuse survivors alleges that Apple’s abandonment of CSAM detection features allowed material to proliferate, causing ongoing trauma for survivors.
- Apple has positioned itself as the most privacy-focused of the Big Tech companies, with CEO Tim Cook publishing an open letter on privacy in 2014.
What we’re watching: West Virginia seeks statutory and punitive damages, and injunctive relief requiring Apple to implement effective CSAM detection in its products.
What they’re saying: Apple told CNBC that “protecting the safety and privacy of our users, especially children, is central to what we do,” citing parental controls and Communication Safety features, which intervene when nudity is detected.
- The company emphasized its ongoing efforts to combat evolving threats and maintain a trusted platform for children.