Apple formally abandons its plans to check iCloud photos for evidence of child abuse.

The tech giant’s idea, which would have given iOS sweeping new monitoring capabilities, has been abandoned.

Apple has formally dropped one of its most contentious ideas ever: checking iCloud photos for evidence of child sexual abuse material (CSAM).

Apple announced last year that it would introduce on-device scanning, a new iOS feature that would silently comb through each user’s photos for signs of questionable content. The feature was designed to alert human reviewers, who would then presumably notify the authorities, if the scanner found any indications of CSAM.

Privacy and security experts reacted angrily to the proposal right away, warning that the scanning tool could eventually be repurposed to hunt for other kinds of content. Critics argued that merely building such scanning capabilities into iOS was a slippery slope toward broader surveillance abuses, and many believed the technology could quickly become a backdoor for law enforcement.

Apple pushed back against these criticisms at the time, but ultimately relented, announcing that it would “postpone” the feature’s rollout until a later date.

It now appears that that date will never arrive. On Wednesday, alongside a slew of new iCloud security measures, the company announced that it will not be moving forward with its plans for on-device scanning. In a statement shared with Wired, Apple made clear that it has chosen a different path:

Following extensive consultation with experts to gather feedback on the child protection approaches we proposed last year, we are deepening our investment in the Communication Safety feature, which we first made available in December 2021. We have also decided not to move forward with our previously proposed CSAM detection tool for iCloud Photos. We will continue working with governments, child advocates, and other companies to help protect children, preserve their right to privacy, and make the internet a safer place for them and for all of us. Children can be protected without companies combing through personal data.

Apple’s plans appeared well intentioned. The digital proliferation of CSAM is a serious problem, and experts say it has only grown worse in recent years. Trying to solve that problem was clearly a worthy goal. But the underlying technology Apple proposed carried privacy risks that made it the wrong tool for the job.
