Apple delays plans to roll out CSAM detection in iOS 15


Apple has delayed plans to roll out its child sexual abuse material (CSAM) detection technology that it chaotically announced last month, citing feedback from customers and policy groups.

That feedback, if you recall, has been largely negative. The Electronic Frontier Foundation said this week it had amassed more than 25,000 signatures from consumers. On top of that, close to 100 policy and rights groups, including the American Civil Liberties Union, also called on Apple to abandon plans to roll out the technology.

In a statement on Friday morning, Apple told TechCrunch:

“Last month we announced plans for features intended to help protect children from predators who use communication tools to recruit and exploit them, and limit the spread of Child Sexual Abuse Material. Based on feedback from customers, advocacy groups, researchers and others, we have decided to take additional time over the coming months to collect input and make improvements before releasing these critically important child safety features.”

Apple’s so-called NeuralHash technology is designed to identify known CSAM on a user’s device without Apple having to possess the image or know its contents. Rather than scanning photos after they reach Apple’s servers, NeuralHash checks for known CSAM on the user’s device before a photo is uploaded to iCloud, which Apple claims is more privacy-friendly than the blanket server-side scanning that other cloud providers use.
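NeuralHash itself is proprietary, but the general idea of perceptual hash matching can be illustrated with a toy “average hash”: reduce an image to a small grayscale grid, set one bit per pixel depending on whether it is brighter than the grid’s mean, then look that fingerprint up in a database of known hashes. The code below is a simplified sketch for illustration only, not Apple’s algorithm; the function names, the tiny 2×2 images, and the `known_hashes` database are all invented here.

```python
# Toy perceptual "average hash" -- NOT Apple's NeuralHash, just an
# illustration of hash-based matching against a database of known images.

def average_hash(pixels):
    """Hash a small grayscale image (a list of rows of 0-255 ints):
    each bit is 1 if that pixel is brighter than the image's mean."""
    flat = [p for row in pixels for p in row]
    mean = sum(flat) / len(flat)
    bits = 0
    for p in flat:
        bits = (bits << 1) | (1 if p > mean else 0)
    return bits

# Hypothetical database of fingerprints of known images.
known_hashes = {average_hash([[0, 255], [255, 0]])}

def matches_known(pixels):
    """Report whether an image's fingerprint appears in the database."""
    return average_hash(pixels) in known_hashes

print(matches_known([[0, 255], [255, 0]]))   # same image -> True
print(matches_known([[255, 255], [0, 0]]))   # different image -> False
```

Because only the fingerprint is compared, the matching side never needs the image itself — which is the property Apple’s privacy claim rests on.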

But security experts and privacy advocates have expressed concern that the system could be abused by highly resourced actors, such as governments, to implicate innocent victims, or could be repurposed to detect other material that authoritarian nation-states find objectionable.

Within a few weeks of announcing the technology, researchers said they were able to create “hash collisions” using NeuralHash, effectively tricking the system into thinking two entirely different images were the same.
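A collision is easy to demonstrate on a toy “average hash,” where each bit records only whether a pixel is brighter than the image’s mean. This simplified scheme is invented here for illustration and is far cruder than NeuralHash, but it shows the underlying problem: because a perceptual hash deliberately discards detail, clearly different images can share a fingerprint.

```python
# Demonstrating a hash collision on a toy perceptual "average hash".
# This scheme is invented for illustration; it is not NeuralHash.

def average_hash(pixels):
    """Each bit is 1 if the pixel is brighter than the image's mean."""
    flat = [p for row in pixels for p in row]
    mean = sum(flat) / len(flat)
    bits = 0
    for p in flat:
        bits = (bits << 1) | (1 if p > mean else 0)
    return bits

# Two visibly different images...
img_a = [[0, 200], [200, 0]]      # mid-gray checkerboard
img_b = [[10, 255], [255, 10]]    # higher-contrast checkerboard

# ...that collide: only the brighter-than-mean *pattern* survives
# hashing, so the absolute pixel values are thrown away.
print(average_hash(img_a) == average_hash(img_b))  # True
print(img_a == img_b)                              # False
```

An attacker who can deliberately engineer such collisions could, in principle, craft an innocuous-looking image that matches a flagged fingerprint — the scenario the researchers’ findings raised.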

iOS 15 is expected out in the next few weeks.

Read more:

  • Apple confirms it will begin scanning iCloud Photos for child abuse images

  • Apple’s CSAM detection tech is under fire — again

  • Apple details child abuse detection and Messages safety features

  • iOS 15 will warn parents and children about sexually explicit photos in Messages