Apple iOS 15 “Scans” iPhone Photo Albums for Child Safety

By Ben Brake | August 13, 2021

Users are not happy about it, and some security experts say they are switching to Android phones.

Apple's upcoming feature can scan users' photos, and the announcement immediately set off an uproar among users.

Apple recently announced that, to help keep children safer online, it will add a photo-scanning feature to iOS 15, iPadOS 15, and macOS Monterey.

Through an automated system, Apple can scan a user's photos to check whether they contain child sexual abuse material (CSAM).

Once such photos are detected, Apple will report them to the US National Center for Missing and Exploited Children and may contact law enforcement. At the same time, iMessage on minors' accounts will also be monitored.

If explicit photos appear in their iMessage conversations, Apple will show a warning and can notify their parents. The plan caused great concern among users as soon as it was announced: what happened to the backdoor-free iPhone that Tim Cook promised?

Apple announces new features that will scan iPhone and iPad users’ photos to detect and report large collections of child sexual abuse images that are stored on their cloud servers. https://t.co/makNhBDtd6

— NBC News (@NBCNews) August 6, 2021

Apple: Rest Assured, It's Completely Safe

To dispel users' doubts, Apple stated in an official announcement that it never directly scans users' images. Instead, it uses a hashing technology called NeuralHash, which means the photos themselves are never directly viewed.

First, Apple takes known CSAM images provided by the National Center for Missing and Exploited Children (NCMEC) and other child-protection organizations and derives a set of "hash values" from them.

This hash database is encrypted and stored on Apple devices, so users cannot obtain the hash values and use them to bypass detection.

Then, a hash value is also computed for each of the user's images and matched against the on-device database before the photo is uploaded to iCloud. This way, nothing needs to be decrypted: whether an image is in violation is determined simply by whether its hash matches.

NeuralHash is designed so that identical and visually similar images produce the same hash value, even if an image is cropped or recolored: a black-and-white copy of a picture hashes to the same value as the original.
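
As a rough, self-contained illustration of that flow, the sketch below uses a simple "difference hash" (dHash) as a stand-in for Apple's proprietary NeuralHash, and a made-up set of known hashes in place of the encrypted NCMEC database. It only shows the general idea: a recolored copy of a photo hashes to the same value as the original and can be matched on the device before upload.

```python
# Toy sketch of perceptual hashing and on-device matching.
# dHash stands in for Apple's proprietary NeuralHash; the "known_hashes" set
# below is a placeholder, not real NCMEC-derived data.
from PIL import Image


def dhash(image: Image.Image, hash_size: int = 8) -> int:
    """Downscale to grayscale and encode whether each pixel is brighter than its right neighbour."""
    gray = image.convert("L").resize((hash_size + 1, hash_size), Image.LANCZOS)
    pixels = list(gray.getdata())
    bits = 0
    for row in range(hash_size):
        for col in range(hash_size):
            left = pixels[row * (hash_size + 1) + col]
            right = pixels[row * (hash_size + 1) + col + 1]
            bits = (bits << 1) | (left > right)
    return bits


# Build a synthetic "photo" so the example is self-contained.
original = Image.new("RGB", (256, 256))
original.putdata([((x * 3) % 256, (y * 5) % 256, (x + y) % 256)
                  for y in range(256) for x in range(256)])
grayscale_copy = original.convert("L")  # black-and-white version of the same photo

assert dhash(original) == dhash(grayscale_copy)  # visually identical -> identical hash

# On-device matching: the real database is encrypted and derived from NCMEC images;
# here it is just a set of integers containing the hash of our synthetic photo.
known_hashes = {dhash(original)}
print("photo matches known-hash database:", dhash(grayscale_copy) in known_hashes)
```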


Matching pictures are uploaded to Apple, which uses another cryptographic technique called threshold secret sharing: as long as the number of matches stays below a certain threshold, the photo contents cannot be recovered.

Only when the number of matching photos in a user's account exceeds that threshold can the content be decrypted and passed to manual review, and Apple puts the system's misidentification rate at less than one in a trillion.
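
For intuition about the threshold step, here is a toy sketch of Shamir's secret sharing, the standard construction behind threshold secret sharing. The threshold of 3, the share count of 5, and the stand-in key are all invented for illustration; this shows the general technique, not Apple's exact construction or parameters.

```python
# Toy illustration of threshold secret sharing (Shamir's scheme over a prime field).
# Below the threshold number of shares, nothing about the secret can be recovered;
# at or above it, the secret reconstructs exactly. Parameters here are invented.
import random

PRIME = 2**127 - 1  # a Mersenne prime, used as the field modulus


def make_shares(secret: int, threshold: int, count: int):
    """Split `secret` into `count` shares; any `threshold` of them reconstruct it."""
    coeffs = [secret] + [random.randrange(PRIME) for _ in range(threshold - 1)]
    return [(x, sum(c * pow(x, i, PRIME) for i, c in enumerate(coeffs)) % PRIME)
            for x in range(1, count + 1)]


def reconstruct(shares):
    """Lagrange interpolation at x = 0 recovers the polynomial's constant term."""
    secret = 0
    for xj, yj in shares:
        num, den = 1, 1
        for xm, _ in shares:
            if xm != xj:
                num = num * (-xm) % PRIME
                den = den * (xj - xm) % PRIME
        secret = (secret + yj * num * pow(den, -1, PRIME)) % PRIME
    return secret


key = 123456789  # stand-in for the key that protects a flagged photo's contents
shares = make_shares(key, threshold=3, count=5)

print(reconstruct(shares[:3]) == key)  # True: threshold reached, key recovered
print(reconstruct(shares[:2]) == key)  # False: below threshold, key stays hidden
```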

Apple believes these safeguards keep its child-safety detection mechanism from being abused while ensuring that it never sees any of the user's other images.

Apple also gives users a chance to appeal: if they believe their account has been flagged in error, they can file an appeal to have it restored.

The serious data breaches of 2017, which exposed everything from personal photos and ID information to confidential business data, left the public on edge. Apple's current plan looks a lot like the scanning already performed by cloud-storage services, which puts a big question mark over whether it can really protect user privacy.

Ben Brake

Digital marketing consultant and blogger. Ben has more than five years of experience in blogging and internet marketing, has been a technology and lifestyle writer for years, and has launched many successful projects.
