West Virginia Attorney General JB McCuskey has filed a lawsuit against Apple, alleging that the company knowingly allowed its iCloud platform to store and distribute child sexual abuse material for years without taking action. The suit claims Apple's emphasis on privacy over safety allowed the problem to persist. Apple maintains that it prioritizes both safety and privacy in its innovations.
On February 19, Attorney General JB McCuskey filed a complaint in the Circuit Court of Mason County, West Virginia, accusing Apple of negligence in handling child sexual abuse material (CSAM) on iCloud. The lawsuit alleges that Apple executives were aware of the problem as early as February 2020, citing iMessage screenshots of an exchange between executives Eric Friedman and Herve Sibert. In it, Friedman reportedly described iCloud as "the greatest platform for distributing child porn" and noted that Apple had "chosen to not know in enough places where we really cannot say." He also suspected the company was underreporting the CSAM problem, referencing a New York Times article on detection efforts.
The complaint highlights Apple's low reporting numbers to the National Center for Missing and Exploited Children: just 267 detections in 2023, compared to Google's 1.47 million and Meta's 30.6 million. It criticizes Apple for abandoning a 2021 initiative to scan iCloud photos for CSAM due to privacy concerns, and for introducing Advanced Data Protection in December 2022, which enables end-to-end encryption for iCloud photos and videos. McCuskey argues this encryption hinders law enforcement in identifying and prosecuting CSAM offenders.
"Preserving the privacy of child predators is absolutely inexcusable," McCuskey stated. He demands that Apple implement CSAM detection tools, report images, and cease allowing their storage and sharing.
Apple responded by emphasizing its commitment to safety and privacy, particularly for children. "We are innovating every day to combat ever-evolving threats and maintain the safest, most trusted platform for kids," the company said. It pointed to features like Communication Safety, which is enabled by default for users under 18 and detects nudity in Messages, Photos, AirDrop, and FaceTime, though it does not address the distribution of CSAM by adults.
Privacy advocates, including the Electronic Frontier Foundation, support encryption, arguing it protects against data breaches and government overreach. "Encryption is the best method we have to protect privacy online, which is especially important for young people," said EFF's Thorin Klosowski.
This suit follows similar actions, including a 2024 class action in Northern California filed by more than 2,500 CSAM victims and an August 2024 case in North Carolina brought on behalf of a 9-year-old survivor. It is the first such suit filed by a government body seeking injunctive relief and damages to compel detection measures.