Apple to check iCloud, iPhone photo uploads for child abuse images

By Astitva Patle
August 6, 2021

Apple will scan photo libraries stored on iCloud and iPhones in the US for known images of child sexual abuse, the company says, drawing praise from child protection groups but crossing a line that privacy campaigners warn could have dangerous ramifications. The company will also examine the contents of end-to-end encrypted messages for the first time. The system, as per a Financial Times report, is called NeuralMatch.

It aims to leverage a team of human reviewers, who will contact law enforcement authorities when the system finds images or content relating to Child Sexual Abuse Material (CSAM). The system was reportedly trained on 200,000 images from the National Center for Missing and Exploited Children. In practice, it will scan, hash, and compare the photos of Apple users against a database of known images of child sexual abuse.
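The Financial Times report does not describe how NeuralMatch computes its hashes. As a rough illustration of the general hash-and-compare idea, here is a minimal Python sketch built around a toy "average hash"; this stand-in is an assumption for illustration only, since Apple's actual fingerprint is reportedly produced by a proprietary neural network:

```python
# A toy "average hash" illustrating the hash-and-compare idea.
# NOTE: an illustrative stand-in only; Apple's NeuralMatch reportedly
# uses a proprietary neural-network fingerprint, not a pixel average.
from PIL import Image, ImageFilter

def average_hash(img: Image.Image, size: int = 8) -> int:
    """Shrink to size x size grayscale, then emit one bit per pixel:
    1 if the pixel is brighter than the mean, else 0."""
    small = img.convert("L").resize((size, size), Image.LANCZOS)
    pixels = list(small.getdata())
    mean = sum(pixels) / len(pixels)
    h = 0
    for p in pixels:
        h = (h << 1) | (p > mean)
    return h

def hamming(a: int, b: int) -> int:
    """Count differing bits between two hashes; near-duplicate images
    score low even after small edits."""
    return bin(a ^ b).count("1")

if __name__ == "__main__":
    original = Image.linear_gradient("L").rotate(45)
    tweaked = original.filter(ImageFilter.GaussianBlur(2))  # a small edit
    # A low distance (out of 64 bits) shows the edited copy still matches.
    print(hamming(average_hash(original), average_hash(tweaked)))
```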


“According to people briefed on the plans, every photo uploaded to iCloud in the US will be given a ‘safety voucher,’ saying whether it is suspect or not. Once a certain number of photos are marked as suspect, Apple will enable all the suspect photos to be decrypted and, if apparently illegal, passed on to the relevant authorities,” said the Financial Times report.
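Apple has not disclosed the voucher format or the matching threshold, so the following is only a sketch of the logic that description implies; the SafetyVoucher structure and the MATCH_THRESHOLD value are hypothetical placeholders:

```python
# Hedged sketch of the "safety voucher" threshold logic described in
# the FT report. The voucher structure and the threshold value are
# hypothetical; Apple has not published either.
from dataclasses import dataclass

MATCH_THRESHOLD = 30  # assumed value; the real threshold is undisclosed

@dataclass
class SafetyVoucher:
    photo_id: str
    suspect: bool  # True if the photo's hash matched the CSAM database

def should_escalate(vouchers: list[SafetyVoucher]) -> bool:
    """Per the report, only once enough photos are marked suspect can
    the suspect photos be decrypted and passed to human reviewers."""
    return sum(v.suspect for v in vouchers) >= MATCH_THRESHOLD
```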

Following the report, Apple published an official post on its Newsroom to further explain how the new tools work. These tools were developed in collaboration with child safety experts and will use on-device machine learning to warn children, as well as their parents, about sensitive and sexually explicit content in iMessage.

Furthermore, the Cupertino giant added that it will integrate “new technology” in iOS 15 and iPadOS 15 to detect CSAM images stored in iCloud Photos. If the system detects images or content relating to CSAM, Apple will disable the user account and send a report to the National Center for Missing and Exploited Children (NCMEC). A user who is mistakenly flagged by the system, however, can file an appeal to recover their account.
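Putting those steps together, the enforcement path might be sketched as follows; every function here is a hypothetical placeholder rather than any real Apple API:

```python
# Hypothetical sketch of the enforcement flow described above: a match
# confirmed by human review disables the account and files an NCMEC
# report, while a rejected match leaves the account untouched and a
# wrongly flagged user can appeal. All names are illustrative stubs.
def disable_account(account_id: str) -> None:
    print(f"[stub] account {account_id} disabled")

def file_ncmec_report(account_id: str) -> None:
    print(f"[stub] report on {account_id} sent to NCMEC")

def handle_flag(account_id: str, reviewer_confirms: bool) -> str:
    if not reviewer_confirms:
        return "no action: human review rejected the match"
    disable_account(account_id)
    file_ncmec_report(account_id)
    return "account disabled and reported; the user may appeal"
```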

How Apple’s new system will work

Here is how Apple’s system works: law enforcement officials maintain a database of known child sexual abuse images and translate those images into “hashes” – numerical codes that positively identify an image but cannot be used to reconstruct it.

When a user uploads an image to Apple’s iCloud storage service, the iPhone will create a hash of the image and compare it against the database. Photos stored only on the phone are not checked, Apple said, and human review before an account is suspended and reported to law enforcement is meant to ensure that any matches are genuine.
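Under those constraints, the upload-time check could look roughly like the sketch below. It uses exact hash membership for simplicity, whereas the real system reportedly matches perceptual fingerprints that tolerate small edits; the hash database and helper names are placeholders:

```python
# Sketch of the upload-time check described above: the device hashes a
# photo only when it is about to be uploaded to iCloud and compares the
# hash against the known-CSAM database. Exact-match lookup is a
# simplification; the database contents and helpers are placeholders.
KNOWN_HASHES: set[int] = set()  # hashes distributed by authorities (placeholder)

def upload_to_icloud(photo_bytes: bytes) -> None:
    print(f"[stub] uploaded {len(photo_bytes)} bytes")

def check_and_upload(photo_bytes: bytes, photo_hash: int) -> bool:
    """Return True if the photo's hash matched the database. Photos that
    never leave the phone are never checked, per Apple's statement."""
    matched = photo_hash in KNOWN_HASHES
    upload_to_icloud(photo_bytes)  # upload proceeds; a match would only
                                   # attach a "suspect" safety voucher
    return matched
```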

Possible implications of this system

Matthew Green, a cryptography researcher at Johns Hopkins University, warned that the system could theoretically be used to frame innocent people by sending them seemingly innocuous images designed to trigger matches for child abuse images. “Researchers have been able to do this pretty easily,” he said of the ability to trick such systems.

Other abuses could include government surveillance of dissidents or protesters. “What happens when the Chinese government says: ‘Here is a list of files that we want you to scan for,’” Green asked. “Does Apple say no? I hope they say no, but their technology won’t say no.”

Other privacy researchers, such as India McKinney and Erica Portnoy of the Electronic Frontier Foundation, wrote in a blog post that it may be impossible for outside researchers to verify whether Apple keeps its promise to check only a small set of on-device content.

The move is “a shocking about-face for users who have relied on the company’s leadership in privacy and security,” the pair wrote.

A big question is why now and not sooner. Apple said the technology for its privacy-preserving CSAM detection did not exist until now. But companies like Apple have also faced considerable pressure from the U.S. government and its allies to weaken or backdoor the encryption that protects their users’ data, so that law enforcement can investigate serious crime.

Tags: Apple, Apple scan iPhone images, iCloud images, NeuralMatch
© 2025 TechnoSports Media Group - The Ultimate News Destination