Apple is facing a lawsuit for allegedly failing to limit child sexual abuse material stored on its iCloud platform. According to The New York Times, the plaintiffs, who say they are victims of abuse, are seeking more than $1.2 billion in damages, claiming that Apple abandoned a detection system, announced in 2021, that was intended to identify such material.
Filed on Saturday, the lawsuit was brought by a 27-year-old woman who argues that Apple’s inaction allowed images of her abuse to be widely distributed. The New York Times reported that the plaintiff alleges her abuser uploaded the images to Apple’s iCloud service. She also said that she and her mother regularly received notifications from law enforcement whenever someone was charged with possessing the images.
She said: “It was hard to believe there were so many out there. They were not stopping.”
Victims demand over $1.2 billion in damages from Apple over iCloud

The lawsuit seeks to hold Apple accountable for that decision and claims damages on behalf of a potential group of 2,680 victims. Under U.S. law, survivors of child sexual abuse are entitled to minimum damages of $150,000 each, roughly $402 million across the group; with the tripling of damages the law allows, Apple’s total liability could exceed $1.2 billion if the company is found responsible.
The case stems from Apple’s 2022 decision to abandon a planned system for scanning iCloud images for child sexual abuse material (CSAM). Announced in 2021, the system would have used on-device hash matching to identify such content.
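As a rough illustration only (Apple’s actual design, known as NeuralHash, was never shipped and differs in its details), hash-based detection computes a fingerprint of each image on the device and compares it against a database of fingerprints of known abusive material. A minimal Python sketch, with the hash database and a simple cryptographic hash standing in as placeholders:

```python
import hashlib
from pathlib import Path

# Placeholder for a database of fingerprints of known illegal images,
# such as the one maintained by NCMEC. Entries here are hypothetical.
KNOWN_HASHES: set[str] = set()

def fingerprint(image_path: Path) -> str:
    # Stand-in fingerprint: SHA-256 of the raw bytes. Real systems use
    # perceptual hashes so that resized or re-encoded copies still match.
    return hashlib.sha256(image_path.read_bytes()).hexdigest()

def flag_before_upload(image_path: Path) -> bool:
    # On-device check run before an image is uploaded: the image is
    # flagged if its fingerprint appears in the known-hash database.
    return fingerprint(image_path) in KNOWN_HASHES
```

Apple’s proposed system was more elaborate than this sketch: it used a perceptual rather than a cryptographic hash, and an account was reported only after crossing a threshold number of matches, a design choice intended to reduce false positives.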
However, Apple shelved the initiative amid widespread criticism over privacy implications and the potential for misuse. In September 2022, the company clarified its reasoning, citing the risk of unintended consequences.
The legal action also renews scrutiny of Apple’s longstanding reputation for strong privacy protections. It follows another recent lawsuit, involving a nine-year-old victim in North Carolina, which alleges that Apple enabled strangers to distribute CSAM through iCloud.
For decades, Section 230 of the Communications Decency Act has insulated companies from liability for user-generated content. Recent rulings by the U.S. Court of Appeals for the Ninth Circuit, however, have held that those protections apply only to content moderation decisions and are not a blanket shield.
In the North Carolina case, Apple has moved to dismiss, arguing that Section 230 shields it from liability for material that third parties upload to iCloud. The company also contends that iCloud cannot be treated as a product in the way a defective tire can, and is therefore not subject to product liability claims.
In a statement cited by the Times, Apple spokesman Fred Sainz said: “Child sexual abuse material is abhorrent and we are committed to fighting the ways predators put children at risk. We are urgently and actively innovating to combat these crimes without compromising the security and privacy of all our users.”
Big Tech accountability under the spotlight

Apple isn’t the only major tech company facing scrutiny over how its platforms affect children. Earlier this year, ReadWrite reported that the CEOs of Meta, TikTok, Snap, Discord, and X appeared before a U.S. Senate hearing focused on addressing online child exploitation.
In July, the Senate passed the Stopping Harmful Image Exploitation and Limiting Distribution (SHIELD) Act, which criminalizes the nonconsensual distribution of private sexually explicit or nude images. Michael Salter, Professor of Criminology at the University of New South Wales, Australia, wrote on X: “In the absence of law reform in major jurisdictions like the USA or the EU, civil suits are one of the few ways to hold tech to account.”
ReadWrite has reached out to Apple for comment.
Featured image: Ideogram