Apple sued for failing to implement tools that would detect CSAM in iCloud


Dec 9, 2024 - 02:30

Apple is being sued by victims of child sexual abuse over its failure to follow through with plans to scan iCloud for child sexual abuse materials (CSAM), The New York Times reports. In 2021, Apple announced it was working on a tool to detect CSAM that would flag images showing such abuse and notify the National Center for Missing and Exploited Children. But the company was hit with immediate backlash over the privacy implications of the technology, and ultimately abandoned the plan.

The lawsuit, which was filed on Saturday in Northern California, seeks more than $1.2 billion in damages for a potential group of 2,680 victims, according to NYT. It claims that, after Apple showed off its planned child safety tools, the company “failed to implement those designs or take any measures to detect and limit” CSAM on its devices, leading to the victims’ harm as the images continued to circulate. Engadget has reached out to Apple for comment.

In a statement to The New York Times about the lawsuit, Apple spokesperson Fred Sainz said, “Child sexual abuse material is abhorrent and we are committed to fighting the ways predators put children at risk. We are urgently and actively innovating to combat these crimes without compromising the security and privacy of all our users.” The lawsuit comes just a few months after Apple was accused of underreporting CSAM by the UK’s National Society for the Prevention of Cruelty to Children (NSPCC).

This article originally appeared on Engadget at https://www.engadget.com/big-tech/apple-sued-for-failing-to-implement-tools-that-would-detect-csam-in-icloud-202940984.html?src=rss

