West Virginia sues Apple over iCloud CSAM allegations

West Virginia Attorney General JB McCuskey has filed a lawsuit against Apple, alleging that the company knowingly allowed its iCloud platform to store and distribute child sexual abuse material for years without taking action. The suit claims Apple's emphasis on privacy over safety enabled the problem. Apple maintains that it prioritizes both safety and privacy in its products.

On February 19, Attorney General JB McCuskey filed the complaint in the Circuit Court of Mason County, West Virginia, accusing Apple of negligence in handling child sexual abuse material (CSAM) on iCloud. The lawsuit alleges that Apple executives were aware of the problem as early as February 2020, citing iMessage screenshots of an exchange between Eric Friedman and Herve Sibert. In it, Friedman reportedly described iCloud as "the greatest platform for distributing child porn" and noted that Apple had "chosen to not know in enough places where we really cannot say." He also suspected the company was underreporting the issue, referencing a New York Times article on CSAM detection efforts.

The complaint highlights Apple's low reporting numbers to the National Center for Missing and Exploited Children: just 267 reports in 2023, compared with Google's 1.47 million and Meta's 30.6 million. It criticizes Apple for abandoning a 2021 initiative to scan iCloud photos for CSAM, which the company dropped over privacy concerns, and for introducing Advanced Data Protection in December 2022, which extends end-to-end encryption to iCloud photos and videos. McCuskey argues that this encryption hinders law enforcement's ability to identify and prosecute CSAM offenders.

"Preserving the privacy of child predators is absolutely inexcusable," McCuskey stated. He is demanding that Apple implement CSAM detection tools, report detected images, and stop allowing their storage and sharing on iCloud.

Apple responded by emphasizing its commitment to safety and privacy, particularly for children. "We are innovating every day to combat ever-evolving threats and maintain the safest, most trusted platform for kids," the company said. It pointed to features such as Communication Safety, which is enabled by default for users under 18 and detects nudity in Messages, Photos, AirDrop, and FaceTime, though that feature does not address the distribution of CSAM by adults.

Privacy advocates, including the Electronic Frontier Foundation, support encryption, arguing it protects against data breaches and government overreach. "Encryption is the best method we have to protect privacy online, which is especially important for young people," said EFF's Thorin Klosowski.

This suit follows similar actions, including a 2024 class action in Northern California brought on behalf of more than 2,500 CSAM victims and an August 2024 case in North Carolina filed on behalf of a nine-year-old survivor. The West Virginia complaint is the first brought by a government body, and it seeks injunctive relief and damages to compel Apple to adopt detection measures.

Related articles

California enacts Digital Age Assurance Act requiring OS age verification

Following initial reports of an impending law, California Governor Gavin Newsom has signed AB 1043, the Digital Age Assurance Act, which requires operating system providers to collect users' ages during account setup and share them with app developers via an API. Effective January 1, 2027, it applies to major platforms such as Windows, iOS, Android, macOS, SteamOS, and Linux distributions, with the goal of enabling age-appropriate content without biometric checks.

Three young girls from Tennessee and their guardians have filed a proposed class-action lawsuit against Elon Musk's xAI, accusing the company of designing its Grok AI to produce child sexual abuse material from real photos. The suit stems from a Discord tip that led to a police investigation linking Grok to explicit images of the victims. They seek an injunction and damages for thousands of potentially harmed minors.

Australian regulators are poised to require app stores to block AI services lacking age verification to protect younger users from mature content. This move comes ahead of a March 9 deadline, with potential fines for non-compliant AI companies. Only a fraction of leading AI chat services in the region have implemented such measures.

As covered earlier, California's Digital Age Assurance Act (AB 1043), signed by Governor Gavin Newsom in October 2025 and effective January 1, 2027, requires operating systems to collect age data and share it via an API. Those requirements pose steep compliance hurdles for volunteer-driven, open-source operating systems such as Ubuntu, Debian, Arch Linux, and SteamOS.

James Strahler II, a 37-year-old from Ohio, pleaded guilty on Tuesday to federal charges including cyberstalking and producing AI-generated child sexual abuse material, marking the first conviction under the 2025 Take It Down Act. The law, signed by President Donald Trump, targets nonconsensual intimate images created with AI. Strahler used dozens of AI tools to harass women and create explicit images involving minors.

The Department of Information and Communications Technology (DICT) is studying a potential ban on the encrypted messaging app Telegram due to concerns over illegal activities like pornography and gambling. This follows a similar proposal for Signal earlier in the month. Rights groups worry about impacts on private communication rights.

A new law in California mandates that all operating systems, including Linux, implement some form of age verification during account setup. The legislation aims to address online safety concerns. Details on enforcement remain unclear.
