Apple sued for not implementing ‘NeuralHash’ CSAM detection in iCloud.
It’s been two years since Apple dropped its plan to detect child abuse imagery using client-side iCloud scanning.
Now, the New York Times reports on a class-action lawsuit filed in California alleging that Apple harmed a group of 2,680 victims by failing to “implement those designs or take any measures to detect and limit” CSAM, such as using Microsoft’s PhotoDNA.
As the Times notes, under the law, victims of child sexual abuse are entitled to a minimum of $150,000 in damages, which means the total award...could exceed $1.2 billion.
Apple Faces Lawsuit Over Child Sexual Abuse Material on iCloud
[The New York Times]