Did Apple’s CSAM system allow child abuse to go unchecked?

DATE POSTED: December 9, 2024

Apple is facing a lawsuit from victims of child sexual abuse for failing to implement a system to detect Child Sexual Abuse Material (CSAM) on iCloud, according to The New York Times. The suit seeks more than $1.2 billion in damages on behalf of approximately 2,680 individuals, alleging that Apple was negligent in protecting victims from harm.

Apple sued for $1.2 billion over child sexual abuse material detection failure

In 2021, Apple announced its intention to introduce a CSAM detection tool that would scan iCloud photos for known abusive images and alert the National Center for Missing and Exploited Children (NCMEC). The initiative was aimed at combating child exploitation, but after substantial backlash over privacy concerns, Apple abandoned the project, which the plaintiffs argue left victims unprotected and allowed abusive material to continue circulating.
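
As described in Apple's 2021 announcement, the system would have matched on-device image fingerprints against a database of known CSAM hashes and triggered a report to NCMEC only once a match threshold was crossed. The Python sketch below is a loose, hypothetical illustration of that threshold-matching idea only; the image_fingerprint function, the hash set, and the threshold value are illustrative assumptions, not Apple's actual NeuralHash or private set intersection design.

```python
# Highly simplified, hypothetical sketch of threshold-based hash matching.
# The hash function, database contents, and threshold below are placeholders,
# not Apple's actual NeuralHash pipeline or reporting flow.
import hashlib

KNOWN_CSAM_HASHES: set[str] = set()  # in the real proposal, hashes were supplied by NCMEC
MATCH_THRESHOLD = 30                 # Apple's proposal required roughly 30 matches before review


def image_fingerprint(image_bytes: bytes) -> str:
    """Stand-in for a perceptual hash; a real system would not use SHA-256,
    since it cannot match near-duplicate or re-encoded images."""
    return hashlib.sha256(image_bytes).hexdigest()


def should_report(photo_library: list[bytes]) -> bool:
    """Flag an account only once known-hash matches cross the threshold."""
    matches = sum(
        1 for img in photo_library if image_fingerprint(img) in KNOWN_CSAM_HASHES
    )
    return matches >= MATCH_THRESHOLD


if __name__ == "__main__":
    library: list[bytes] = []  # user photos would go here
    if should_report(library):
        print("Threshold crossed: escalate for human review and NCMEC report")
    else:
        print("Below threshold: no report generated")
```

The threshold is the key design choice the 2021 proposal emphasized: no single match was meant to expose a user, and reports were only supposed to be possible once enough matches accumulated.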

The lawsuit, filed in Northern California, alleges that Apple “failed to implement those designs or take any measures to detect and limit” CSAM on its devices. The complaint includes the account of a 27-year-old woman who said she still receives law enforcement notices about the online sharing of images taken of her as a child. The suit emphasizes the emotional turmoil victims endure as these materials circulate unchecked.

In response, Apple affirmed its commitment to combating child sexual abuse while prioritizing user privacy. “Child sexual abuse material is abhorrent and we are committed to fighting the ways predators put children at risk. We are urgently and actively innovating to combat these crimes without compromising the security and privacy of all our users. Features like Communication Safety, for example, warn children when they receive or attempt to send content that contains nudity to help break the chain of coercion that leads to child sexual abuse. We remain deeply focused on building protections that help prevent the spread of CSAM before it starts,” Apple spokesperson Fred Sainz told Engadget.

Despite these efforts, Apple recently faced additional scrutiny when the UK’s National Society for the Prevention of Cruelty to Children accused the company of underreporting CSAM found on its platforms. The accusation adds to mounting pressure on Apple over how effectively it handles the issue.

Featured image credit: Niels Kehl/Unsplash