TechnoSports Media Group

Apple to check iCloud, iPhone photo uploads for child abuse images

by Astitva Patle
August 6, 2021
in Technology, Apple, News

Apple will scan photo libraries stored on iCloud and iPhones in the US for known images of child sexual abuse, the company says, drawing praise from child protection groups but crossing a line that privacy campaigners warn could have dangerous ramifications. The company will also examine the contents of end-to-end encrypted messages for the first time. The system, as per a Financial Times report, is called NeuralMatch.

The system alerts a team of human reviewers, who contact law enforcement authorities when it finds Child Sexual Abuse Material (CSAM). It was reportedly trained using 200,000 images from the National Center for Missing and Exploited Children. In operation, it scans, hashes, and compares the photos of Apple users against a database of known images of child sexual abuse.


“According to people briefed on the plans, every photo uploaded to iCloud in the US will be given a ‘safety voucher,’ saying whether it is suspect or not. Once a certain number of photos are marked as suspect, Apple will enable all the suspect photos to be decrypted and, if apparently illegal, passed on to the relevant authorities,” said the Financial Times report.
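The "safety voucher" flow described above can be sketched in a few lines. This is an illustrative model only: the names, the threshold value, and the plain dictionary vouchers are assumptions, and Apple's actual design uses cryptographic threshold secret sharing so that vouchers cannot be read at all until the match count is exceeded.

```python
import hashlib

# Illustrative threshold; Apple has not published the real value.
MATCH_THRESHOLD = 3

def make_voucher(image_bytes: bytes, known_hashes: set) -> dict:
    """Attach a voucher to an upload saying whether it matched the database.

    SHA-256 is a stand-in here; the real system uses a perceptual hash.
    """
    digest = hashlib.sha256(image_bytes).hexdigest()
    return {"hash": digest, "suspect": digest in known_hashes}

def account_flagged(vouchers: list) -> bool:
    """Only once enough vouchers are suspect can the photos be reviewed."""
    return sum(v["suspect"] for v in vouchers) >= MATCH_THRESHOLD
```

The key property the report describes is that a single match does nothing; only crossing the threshold makes the suspect photos available for decryption and review.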

Now, following the report, Apple has published an official post on its Newsroom to further explain how the new tools work. The tools were developed in collaboration with child safety experts and will use on-device machine learning to warn children and their parents about sensitive and sexually explicit content in iMessage.

Furthermore, the Cupertino giant added that it will integrate “new technology” in iOS 15 and iPadOS 15 to detect CSAM images stored in iCloud Photos. If the system detects images or content relating to CSAM, Apple will disable the user account and send a report to the National Center for Missing and Exploited Children (NCMEC). However, if a user is mistakenly flagged by the system, they can file an appeal to recover the account.

How Apple’s new system will work

Here is how Apple’s system works: law enforcement officials maintain a database of known child sexual abuse images and translate those images into “hashes” – numerical codes that positively identify an image but cannot be used to reconstruct it.

When a user uploads an image to Apple’s iCloud storage service, the iPhone will create a hash of the image to be uploaded and compare it against the database. Photos stored only on the phone are not checked, Apple said, and a human review step is meant to ensure any matches are genuine before an account is suspended and reported to law enforcement.
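A toy example can show how a hash identifies an image without allowing reconstruction. The function below is a deliberately simplified perceptual hash (an "average hash" over raw pixel values), not Apple's NeuralHash algorithm: it maps each pixel to a single bit, so similar images produce the same code, yet the original pixel values cannot be recovered from the bits.

```python
def average_hash(pixels: list) -> str:
    """Toy perceptual hash: one bit per pixel, set when the pixel is
    brighter than the image's mean. Similar images hash alike, but the
    original brightness values cannot be reconstructed from the bits."""
    avg = sum(pixels) / len(pixels)
    return "".join("1" if p > avg else "0" for p in pixels)

# A slightly brightened copy of an image still produces the same hash,
# which is why matching is done on hashes of *known* images.
original = [10, 200, 30, 220]
brighter = [p + 5 for p in original]
assert average_hash(original) == average_hash(brighter)
```

A cryptographic digest like SHA-256 would change completely under that one-pixel brightening, which is why systems like this rely on perceptual hashes instead.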

Possible implications of the system

Matthew Green, a cryptography researcher at Johns Hopkins University, warned that the system could theoretically be used to frame innocent people by sending them seemingly innocuous images designed to trigger matches for child abuse images. “Researchers have been able to do this pretty easily,” he said of the ability to trick such systems.

Other abuses could include government surveillance of dissidents or protesters. “What happens when the Chinese government says: ‘Here is a list of files that we want you to scan for,’” Green asked. “Does Apple say no? I hope they say no, but their technology won’t say no.”

Other privacy researchers, such as India McKinney and Erica Portnoy of the Electronic Frontier Foundation, wrote in a blog post that it may be impossible for outside researchers to double-check whether Apple keeps its promise to check only a small set of on-device content.

The move is “a shocking about-face for users who have relied on the company’s leadership in privacy and security,” the pair wrote.

A big question is why now and not sooner. Apple said its privacy-preserving CSAM detection did not exist until now. But companies like Apple have also faced considerable pressure from the U.S. government and its allies to weaken or backdoor the encryption used to protect their users’ data to allow law enforcement to investigate serious crime.


© 2025 TechnoSports Media Group - The Ultimate News Destination