Apple sued for not implementing tools to detect child-abuse material
What's the story?
Apple is being sued for its decision to not implement a system to scan iCloud photos for child sexual abuse material (CSAM).
According to The New York Times, the suit claims Apple's inaction forces victims to relive their trauma.
The tech giant had previously announced an improved design aimed at protecting children, but failed to implement it or take any other steps to detect and limit CSAM.
Abandoned initiative
Scrapped plans for CSAM detection
Back in 2021, Apple had announced a system that would use digital signatures from the National Center for Missing and Exploited Children and other groups to identify known CSAM content in users' iCloud libraries.
However, the plan was later abandoned after security and privacy advocates raised concerns that it could potentially create a backdoor for government surveillance.
The lawsuit was brought by a 27-year-old woman who says a relative molested her as an infant and shared images of the abuse online.
Compensation claim
The lawsuit seeks $1.2 billion in damages
The lawsuit, filed on Saturday in Northern California, seeks over $1.2 billion in damages for a potential group of 2,680 victims who could be entitled to compensation.
Attorney James Marsh, who is involved with the lawsuit, revealed these figures.
The legal action follows a similar suit filed in August by a nine-year-old girl and her guardian, accusing Apple of failing to address CSAM on iCloud.
Company statement
Apple's response to the lawsuit
In response to the lawsuit, an Apple spokesperson told The Times that the company is "urgently and actively innovating to combat these crimes without compromising the security and privacy of users."
Another Apple rep, Fred Sainz, pointed to features like Communication Safety that warn kids about explicit content, stressing Apple's commitment to building protections against CSAM.
This legal challenge comes after the UK's National Society for the Prevention of Cruelty to Children (NSPCC) accused Apple of underreporting CSAM on its platforms.