
Apple to check iCloud, iPhone photo uploads for child abuse images

Apple will scan photo libraries stored on iCloud and iPhones in the US for known images of child sexual abuse, the company says, drawing praise from child protection groups but crossing a line that privacy campaigners warn could have dangerous ramifications. The company will also examine the contents of end-to-end encrypted messages for the first time. The system, as per a Financial Times report, is called NeuralMatch.

The system relies on a team of human reviewers who will contact law enforcement authorities when it finds images or content relating to Child Sexual Abuse Material (CSAM). NeuralMatch was reportedly trained on 200,000 images from the National Center for Missing and Exploited Children, and it will scan, hash, and compare the photos of Apple users against a database of known images of child sexual abuse.

“According to people briefed on the plans, every photo uploaded to iCloud in the US will be given a ‘safety voucher,’ saying whether it is suspect or not. Once a certain number of photos are marked as suspect, Apple will enable all the suspect photos to be decrypted and, if apparently illegal, passed on to the relevant authorities,” said the Financial Times report.
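The threshold scheme the report describes can be sketched in a few lines. This is a purely illustrative model, not Apple's implementation: the voucher format, field names, and threshold value are all assumptions, and Apple has not published the actual number of matches required.

```python
# Illustrative sketch of threshold-gated flagging: every upload carries a
# "safety voucher", and nothing is surfaced for review until the number of
# suspect vouchers reaches a threshold. All names and values are hypothetical.

THRESHOLD = 3  # assumed value; Apple has not disclosed the real one

def review_triggered(vouchers):
    """Return True only once enough vouchers are marked suspect,
    mimicking the all-or-nothing gate before decryption and review."""
    suspect_count = sum(1 for v in vouchers if v["suspect"])
    return suspect_count >= THRESHOLD

vouchers = [{"suspect": False}, {"suspect": True}, {"suspect": True}]
print(review_triggered(vouchers))   # below threshold: nothing revealed
vouchers.append({"suspect": True})
print(review_triggered(vouchers))   # threshold reached: human review begins
```

The point of the gate is that a single match reveals nothing; only an accumulation of matches unlocks any content for review.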

Now, following the report, Apple published an official post on its Newsroom to further explain how the new tools work. These tools are developed in collaboration with child safety experts and will use on-device machine learning to warn children as well as parents about sensitive and sexually explicit content on iMessage.


Furthermore, the Cupertino giant added that it will integrate “new technology” in iOS 15 and iPadOS 15 to detect CSAM images stored in iCloud Photos. If the system detects images or content relating to CSAM, Apple will disable the user account and send a report to the National Center for Missing and Exploited Children (NCMEC). However, if a user is mistakenly flagged by the system, they can file an appeal to recover the account.

How will Apple’s new system work?

Here is how Apple’s system works: law enforcement officials maintain a database of known child sexual abuse images and translate those images into “hashes” – numerical codes that positively identify an image but cannot be used to reconstruct it.

When a user uploads an image to Apple’s iCloud storage service, the iPhone will create a hash of the image to be uploaded and compare it against the database. Photos stored only on the phone are not checked, Apple said, and human review before reporting an account to law enforcement is meant to ensure any matches are genuine before suspending an account.
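The hash-and-compare flow described above can be sketched as follows. One important caveat: the real system uses a perceptual hash (“NeuralHash”), which is designed to survive resizing and re-encoding; the cryptographic SHA-256 hash used here is a stand-in to show the flow only, and all data in the example is made up.

```python
import hashlib

# Stand-in database of hashes of known images. In the real system these
# come from NCMEC; here they are fabricated for illustration.
known_hashes = {
    hashlib.sha256(b"known-flagged-image-bytes").hexdigest(),
}

def hash_image(image_bytes: bytes) -> str:
    # One-way: the hash identifies the image but cannot reconstruct it.
    return hashlib.sha256(image_bytes).hexdigest()

def check_before_upload(image_bytes: bytes) -> bool:
    """Return True if the image about to be uploaded matches the database."""
    return hash_image(image_bytes) in known_hashes

print(check_before_upload(b"known-flagged-image-bytes"))  # match
print(check_before_upload(b"ordinary-holiday-photo"))     # no match
```

Note that a cryptographic hash like the one above only matches byte-identical files, which is exactly why Apple needs a perceptual hash in practice, and also why researchers worry about adversarially crafted collisions.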


Possible implications of the system

Matthew Green, a cryptography researcher at Johns Hopkins University, warned that the system could theoretically be used to frame innocent people by sending them seemingly innocuous images designed to trigger matches for child abuse images. “Researchers have been able to do this pretty easily,” he said of the ability to trick such systems.

Other abuses could include government surveillance of dissidents or protesters. “What happens when the Chinese government says: ‘Here is a list of files that we want you to scan for,’” Green asked. “Does Apple say no? I hope they say no, but their technology won’t say no.”

Other privacy researchers, such as India McKinney and Erica Portnoy of the Electronic Frontier Foundation, wrote in a blog post that it may be impossible for outside researchers to verify whether Apple keeps its promise to check only a small set of on-device content.

The move is “a shocking about-face for users who have relied on the company’s leadership in privacy and security,” the pair wrote.

A big question is why now and not sooner. Apple said its privacy-preserving CSAM detection did not exist until now. But companies like Apple have also faced considerable pressure from the U.S. government and its allies to weaken or backdoor the encryption used to protect their users’ data to allow law enforcement to investigate serious crime.
