Apple will soon introduce a new service that can match known images of child sexual abuse with photos stored on iPhones and in iCloud accounts.
Apple announced the move on Thursday, saying the aim is to bring child abusers to justice by alerting the authorities.
The tech giant is following the same path as Google and Facebook, but the move is contentious because the new service may appear to undercut Apple's promise to put privacy and safety first for its users.
The new service works by converting photos on a device into an unreadable set of hashes (complex numbers stored on the device), which are then matched against a database of hashes provided by the National Center for Missing and Exploited Children.
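The idea can be sketched in a few lines. This is a toy illustration only: it uses a cryptographic SHA-256 digest and a made-up hash database, whereas Apple's system reportedly relies on a perceptual hash and cryptographic matching protocols that are not reproduced here.

```python
import hashlib

# Hypothetical stand-in for the NCMEC-provided database of known-image hashes.
KNOWN_HASHES = {
    hashlib.sha256(b"known-image-bytes").hexdigest(),
}

def hash_photo(photo_bytes: bytes) -> str:
    """Turn a photo into an unreadable, fixed-size digest."""
    return hashlib.sha256(photo_bytes).hexdigest()

def matches_database(photo_bytes: bytes) -> bool:
    """On-device check: compare the photo's digest against the database.
    Only photos whose digest appears in the database are treated as matches;
    everything else is ignored."""
    return hash_photo(photo_bytes) in KNOWN_HASHES

print(matches_database(b"known-image-bytes"))  # True: digest is in the database
print(matches_database(b"ordinary-photo"))     # False: no match, nothing flagged
```

The key point the sketch captures is that the comparison happens between digests, not between the photos themselves.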
A post on Apple’s website explaining the new service said that “Apple’s method is designed with user privacy in mind. The tool does not “scan” user photos and only images from the database will be included as matches.” The system will not raise alarms over ordinary photos of users’ own children.
For each matched photo, the device creates a doubly encrypted safety voucher, and only once a threshold number of vouchers has been flagged is Apple's review team alerted. The vouchers are then decrypted, the user's account is disabled, and a report is sent to the National Center for Missing and Exploited Children. Users whose accounts are mistakenly flagged can file an appeal to have them restored.
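The threshold step above can be sketched as follows. The threshold value and the `Account` class are hypothetical, assumed for illustration; Apple has not published the exact number of matches required, and the real vouchers are cryptographically constructed so they cannot be decrypted below the threshold.

```python
THRESHOLD = 5  # hypothetical value; the real threshold is not public

class Account:
    """Toy model of an account accumulating safety vouchers."""

    def __init__(self) -> None:
        self.vouchers: list[bytes] = []  # one encrypted voucher per matched photo
        self.disabled = False

    def add_voucher(self, voucher: bytes) -> None:
        self.vouchers.append(voucher)

    def review(self) -> bool:
        """Escalate only after the match count crosses the threshold:
        the account is disabled and a report would be forwarded to NCMEC."""
        if len(self.vouchers) >= THRESHOLD:
            self.disabled = True
            return True
        return False

acct = Account()
for _ in range(4):
    acct.add_voucher(b"encrypted-voucher")
print(acct.review())   # False: below the threshold, nothing happens
acct.add_voucher(b"encrypted-voucher")
print(acct.review())   # True: threshold reached, account disabled
```

The design choice the sketch mirrors is that a single accidental match does nothing; action is taken only on an accumulation of matches.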
Privacy advocates are less enthusiastic about the new service. Greg Nojeim, co-director of the Security & Surveillance Project at the Center for Democracy & Technology, said that “Apple is replacing its industry-standard end-to-end encrypted messaging system with an infrastructure for surveillance and censorship, which will be vulnerable to abuse and scope-creep not only in the US but around the world. Apple should abandon these changes and restore its users’ faith in the security and integrity of their data on Apple devices and services.”
Apple said the new service will ensure that similar images generate the same hash even if they have been cropped, filtered, or resized.
John Clark, president and CEO of the National Center for Missing & Exploited Children, applauded the move by Apple. In a statement, he said, “The reality is that privacy and child protection can co-exist, we applaud Apple and look forward to working together to make this world a safer place for children.”
No specific date was announced for the rollout of the new service, but Apple said it will arrive in a future software update.