Users are not pleased; some security experts say they will switch to Android phones.

Apple's upcoming feature can scan all of a user's photos. The news has only just been announced, and it has users up in arms!

Recently, Apple announced that, in order to help children use the internet more safely, it would add a photo-scanning feature to iOS 15, iPadOS 15, and macOS Monterey.

Through an automated system, Apple can scan a user's photos to check whether they contain child sexual abuse material (CSAM).

Once matching photos are detected, they will be reported to the US National Center for Missing and Exploited Children (NCMEC), and law enforcement may even be contacted. At the same time, iMessage on minors' accounts will also be monitored.

If sexually explicit photos appear in their iMessage conversations, Apple will issue a warning and notify their parents. The plan caused great concern among users as soon as it was announced: what happened to the backdoor-free iPhone that Cook promised?

Apple: Rest assured, it's absolutely safe

To dispel users' doubts, Apple stated in an official announcement that it never directly scans users' images; instead, it uses a hashing technology called NeuralHash, which means user photos are never viewed directly.

First, Apple takes known CSAM images provided by the National Center for Missing and Exploited Children (NCMEC) and other child-safety organizations and derives a set of hash values from them.

This hash database is stored on Apple devices in encrypted form, preventing users from obtaining the hash values and using them to bypass detection.

Then a hash value is computed for each of the user's images and matched against the on-device database before the photo is uploaded to iCloud. Nothing needs to be decrypted: whether the hash matches is enough to tell whether the image is a known offending one.
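To make the flow concrete, here is a minimal sketch in Python of "hash, then match before upload". The average-hash function is an illustrative stand-in for NeuralHash, whose actual neural-network model Apple has not published, and the hash database is a hypothetical placeholder; in Apple's real design the on-device database is blinded so the client itself cannot read it or learn the match result.

```python
# Minimal sketch of "hash, then match against an on-device database
# before upload". average_hash() is an illustrative stand-in for
# NeuralHash; KNOWN_HASHES is a hypothetical placeholder database.
from PIL import Image

def average_hash(path: str, size: int = 8) -> int:
    """Shrink to a size x size grayscale thumbnail, then emit one bit
    per pixel: 1 if the pixel is brighter than the mean, else 0."""
    img = Image.open(path).convert("L").resize((size, size))
    pixels = list(img.getdata())
    mean = sum(pixels) / len(pixels)
    bits = 0
    for p in pixels:
        bits = (bits << 1) | (1 if p > mean else 0)
    return bits

# Hypothetical database of hashes derived from known images.
# (In Apple's design this set is encrypted/blinded on the device.)
KNOWN_HASHES = {0x8F3C2A91D4E60B75}  # placeholder value

def check_before_upload(path: str) -> bool:
    """Return True if the photo matches a known hash; the match result,
    not the photo itself, is what drives the reporting pipeline."""
    return average_hash(path) in KNOWN_HASHES
```

A real deployment would also hide the match outcome from the device itself, using private set intersection, which this sketch omits entirely.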

NeuralHash is designed so that identical and visually similar images produce the same hash value, even if an image has been cropped or recolored; a black-and-white copy of a picture hashes to the same value as the original.
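Under the illustrative average hash above, robustness to black-and-white conversion falls out for free, since the hash is computed on a grayscale thumbnail anyway. A quick check (reusing average_hash() from the previous sketch; the filenames are placeholders):

```python
# Continuing the sketch: a black-and-white copy hashes identically,
# because average_hash() converts to grayscale ("L") before hashing.
from PIL import Image

Image.open("original.jpg").convert("L").save("bw_copy.png")
assert average_hash("original.jpg") == average_hash("bw_copy.png")
```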


Matching photos are uploaded to Apple along with an encrypted "safety voucher". Apple uses another cryptographic technique called threshold secret sharing: as long as the number of matching photos stays below a certain threshold, none of the vouchers can be decrypted.

Only when a user's matches exceed the threshold can the content be decrypted and passed to manual review; Apple says the chance of the system incorrectly flagging an account is less than one in a trillion.
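Threshold secret sharing is a standard cryptographic primitive. The sketch below uses Shamir's scheme over a prime field to show the property the paragraph describes: any set of shares below the threshold reveals nothing about the secret, while any threshold-sized set reconstructs it exactly. The parameters and field size are illustrative, not Apple's actual construction.

```python
# Minimal sketch of Shamir's (t, n) threshold secret sharing over a
# prime field. Illustrative parameters; not Apple's exact construction.
import secrets

PRIME = 2**127 - 1  # a Mersenne prime, large enough for a demo secret

def split_secret(secret: int, threshold: int, num_shares: int):
    """Hide `secret` in a random degree-(threshold-1) polynomial and
    hand out points on it; t points determine it, t-1 points do not."""
    coeffs = [secret] + [secrets.randbelow(PRIME) for _ in range(threshold - 1)]
    def poly(x: int) -> int:
        return sum(c * pow(x, i, PRIME) for i, c in enumerate(coeffs)) % PRIME
    return [(x, poly(x)) for x in range(1, num_shares + 1)]

def reconstruct(points) -> int:
    """Lagrange-interpolate the polynomial at x = 0 to recover the secret."""
    secret = 0
    for i, (xi, yi) in enumerate(points):
        num, den = 1, 1
        for j, (xj, _) in enumerate(points):
            if i != j:
                num = num * -xj % PRIME
                den = den * (xi - xj) % PRIME
        secret = (secret + yi * num * pow(den, PRIME - 2, PRIME)) % PRIME
    return secret

# Any 3 of 5 shares recover the secret; any 2 reveal nothing.
shares = split_secret(secret=123456789, threshold=3, num_shares=5)
assert reconstruct(shares[:3]) == 123456789
assert reconstruct(shares[2:]) == 123456789
```

Roughly speaking, in Apple's published design each matching photo's safety voucher carries a share of a decryption key, so the server only gains the ability to decrypt the vouchers once enough matches have accumulated.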

Apple believes these safeguards prevent the CSAM-detection mechanism from being abused, while ensuring the company never sees any of a user's other images.

Apple also gives users a way to appeal: anyone who believes their account has been flagged in error can appeal to have it restored.

In 2017, serious data breaches at cloud-storage services leaked everything from users' personal photos and ID information to confidential business data, causing public panic. Apple's current plan works much like those services' content scanning, which puts a big question mark over whether it can really protect user privacy.
