
Apple: CSAM Image-Detection Backdoor ‘Narrow’ in Scope



 

Computing giant tries to reassure users that the tool won’t be used for mass surveillance.

 

Apple provided additional design and security details this week about the planned rollout of a feature aimed at detecting child sexual abuse material (CSAM) images stored in iCloud Photos.

 

Privacy groups such as the Electronic Frontier Foundation (EFF) have warned that the process of flagging CSAM images narrows the definition of end-to-end encryption to allow client-side access, which in the EFF's view means Apple is building a backdoor into its data storage.

 


“Apple can explain at length how its technical implementation will preserve privacy and security in its proposed backdoor, but at the end of the day, even a thoroughly documented, carefully thought-out, and narrowly scoped backdoor is still a backdoor,” the EFF said in reaction to the Apple announcement.

 

Apple’s new document explained that the tool is available only for child accounts set up in Family Sharing, and that a parent or guardian must opt in. A machine-learning classifier deployed on the device in the Messages app then triggers a warning if it detects explicit images being sent to or from the account. If the account is for a child under 13 years old, the parent or guardian will also receive a notification, according to Apple. Only the notification is shared with the parent, never the image itself, Apple added.
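
The flow Apple describes is simple in outline: classify the image on the device, warn the child, and notify the parent or guardian only for opted-in child accounts under 13, without ever sending the image anywhere. Below is a minimal Python sketch of that decision flow under those assumptions; the names (ChildAccount, is_explicit, handle_message_image) and the classifier stub are hypothetical, not Apple's implementation.

# Minimal sketch of the communication-safety decision flow described above.
# All names are hypothetical; is_explicit() is a stub standing in for the
# on-device machine-learning classifier, which runs entirely on the device.

from dataclasses import dataclass

@dataclass
class ChildAccount:
    age: int
    in_family_sharing: bool
    parent_opted_in: bool

def is_explicit(image_bytes: bytes) -> bool:
    return False  # placeholder for the on-device ML classifier

def handle_message_image(account: ChildAccount, image_bytes: bytes) -> None:
    # The feature is active only for child accounts whose parent/guardian opted in.
    if not (account.in_family_sharing and account.parent_opted_in):
        return
    if is_explicit(image_bytes):
        print("Warning shown to the child on the device")
        if account.age < 13:
            # Only a notification reaches the parent; the image itself never does.
            print("Parent/guardian notified")

handle_message_image(ChildAccount(age=12, in_family_sharing=True, parent_opted_in=True), b"...")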

 

Apple Explains How It Protects Privacy While Monitoring CSAM Content  

 


“This feature does not reveal information to Apple,” the company said. “Specifically, it does not disclose the communications of the users, the actions of the child or the notifications to the parents. It does not compare images to any database, such as a database of CSAM material. It never generates any reports for Apple or law enforcement.”

 

A separate feature detects collections of known CSAM images uploaded to iCloud Photos, Apple said. First, it runs code on the device that compares any photo being uploaded against a database of known CSAM images.
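
In Apple's published design that comparison uses a perceptual hash (NeuralHash) checked against a blinded database via private set intersection, so the device never learns whether a photo matched. The Python sketch below strips that machinery away and shows only the basic idea of hashing a photo at upload time and checking it against a set of known hashes before attaching a safety voucher; all names are illustrative.

# Simplified illustration of the on-device matching step: derive a perceptual
# hash for each photo at upload time and check it against known CSAM hashes.
# Real perceptual hashes (e.g. NeuralHash) are robust to resizing/re-encoding;
# the cryptographic hash below is only a stand-in, and in Apple's actual
# blinded-database design the "matched" result is concealed from the device.

import hashlib

def perceptual_hash(image_bytes: bytes) -> str:
    return hashlib.sha256(image_bytes).hexdigest()  # stand-in, not a real perceptual hash

def make_safety_voucher(image_bytes: bytes, known_hashes: set[str]) -> dict:
    h = perceptual_hash(image_bytes)
    return {
        "matched": h in known_hashes,  # hidden from the device in the real system
        "payload": "encrypted match data and visual derivative",
    }

voucher = make_safety_voucher(b"photo bytes", known_hashes={"example-hash"})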

 


“The iCloud photo servers can decrypt the safety vouchers corresponding to positive matches — if and only if that user’s iCloud Photos account exceeds a certain number of matches, called the match threshold,” Apple added.
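
The property the quote describes is typically achieved with threshold secret sharing: each positive match contributes one share of a per-account secret, and the server can reconstruct the key that opens the vouchers only once it holds at least the threshold number of shares. The toy Shamir-style sketch below illustrates that threshold property; the prime field, the threshold value, and all names are illustrative, not Apple's actual construction.

# Toy illustration of the threshold behaviour behind safety vouchers:
# each matching image contributes one share of a per-account secret, and the
# secret (standing in for the voucher decryption key) can only be reconstructed
# once at least `threshold` shares exist. Parameters are illustrative.

import random

PRIME = 2**127 - 1  # field modulus for the toy scheme

def make_shares(secret: int, threshold: int, count: int) -> list[tuple[int, int]]:
    # Shamir secret sharing: the secret is the constant term of a random
    # polynomial of degree threshold-1; each share is a point on that polynomial.
    coeffs = [secret] + [random.randrange(PRIME) for _ in range(threshold - 1)]
    def f(x: int) -> int:
        return sum(c * pow(x, i, PRIME) for i, c in enumerate(coeffs)) % PRIME
    return [(x, f(x)) for x in range(1, count + 1)]

def reconstruct(shares: list[tuple[int, int]]) -> int:
    # Lagrange interpolation at x=0; only correct with >= threshold shares.
    secret = 0
    for i, (xi, yi) in enumerate(shares):
        num, den = 1, 1
        for j, (xj, _) in enumerate(shares):
            if i != j:
                num = num * (-xj) % PRIME
                den = den * (xi - xj) % PRIME
        secret = (secret + yi * num * pow(den, -1, PRIME)) % PRIME
    return secret

key = random.randrange(PRIME)
shares = make_shares(key, threshold=30, count=35)
assert reconstruct(shares[:30]) == key   # at or above threshold: key recovered
assert reconstruct(shares[:29]) != key   # below threshold: key stays hidden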

 

Once that match threshold is exceeded, the flagged images are sent to a human reviewer; if the reviewer confirms a violation, the information is turned over to the National Center for Missing and Exploited Children (NCMEC), which notifies law enforcement as necessary.

 


“The system is designed so that a user need not trust Apple, any other single entity, or even any set of possibly colluding entities from the same sovereign jurisdiction (that is, under the control of the same government) to be confident that the system is functioning as advertised,” Apple said.

 

First, Apple said it generates the on-device CSAM database by combining hash databases provided by at least two separate child-safety organizations.

 


“Any perceptual hashes appearing in only one participating child-safety organization’s database, or only in databases from multiple agencies in a single sovereign jurisdiction, are discarded by this process, and not included in the encrypted CSAM database that Apple includes in the operating system,” Apple’s document explained. “This mechanism meets our source-image correctness requirement.”
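
Read as a rule, the quoted requirement is an intersection across organizations and jurisdictions: a hash ships only if it was submitted by at least two organizations based in at least two different sovereign jurisdictions. The sketch below applies that rule to illustrative data; the organization names and data structures are hypothetical.

# Sketch of the source-correctness rule quoted above: keep a hash only if it
# was provided by at least two child-safety organizations based in at least
# two different sovereign jurisdictions. Names and data are illustrative.

def build_shipped_database(submissions: dict[str, tuple[str, set[str]]]) -> set[str]:
    """submissions maps organization name -> (jurisdiction, set of perceptual hashes)."""
    providers: dict[str, set[tuple[str, str]]] = {}  # hash -> {(org, jurisdiction), ...}
    for org, (jurisdiction, hashes) in submissions.items():
        for h in hashes:
            providers.setdefault(h, set()).add((org, jurisdiction))
    kept = set()
    for h, sources in providers.items():
        orgs = {org for org, _ in sources}
        jurisdictions = {j for _, j in sources}
        if len(orgs) >= 2 and len(jurisdictions) >= 2:
            kept.add(h)  # independently corroborated across jurisdictions
    return kept

db = build_shipped_database({
    "NCMEC": ("US", {"h1", "h2", "h3"}),
    "OrgB":  ("UK", {"h2", "h3"}),
    "OrgC":  ("US", {"h1"}),
})
assert db == {"h2", "h3"}  # h1 is corroborated only within a single jurisdiction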

 

The company added that the database cannot be updated remotely or shared over the internet; it ships only as part of the signed operating system image distributed to all users.

 


“Since no remote updates of the database are possible, and since Apple distributes the same signed operating system image to all users worldwide, it is not possible – inadvertently or through coercion – for Apple to provide targeted users with a different CSAM database,” the company explained. “This meets our database update transparency and database universality requirements.”

 

Apple added that it will publish a Knowledge Base article containing a root hash of the encrypted database for each iOS update, to allow for independent third-party technical audits.
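
Such a published root hash lets anyone repeat the check independently: extract the encrypted database from the installed OS image, hash it, and compare the result against the value in the Knowledge Base article. The sketch below assumes a plain SHA-256 over the database blob purely for illustration; Apple's document does not specify the hash construction here, and the file path and function names are hypothetical.

# Sketch of the audit check a published root hash enables: hash the encrypted
# CSAM database extracted from the installed OS image and compare it against
# the value Apple publishes. The hash construction and path are assumptions.

import hashlib
from pathlib import Path

def database_root_hash(db_path: Path) -> str:
    digest = hashlib.sha256()
    with db_path.open("rb") as f:
        for chunk in iter(lambda: f.read(1 << 20), b""):
            digest.update(chunk)
    return digest.hexdigest()

def verify(db_path: Path, published_root_hash: str) -> bool:
    # True only if the on-device database matches the publicly documented one.
    return database_root_hash(db_path) == published_root_hash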

 

It’s unclear how any of these details will reassure critics of the move.

 


“While Apple aims at the scourge of child exploitation and abuse, the company has created an infrastructure that is all too easy to redirect to greater surveillance and censorship,” the EFF said in response. “The program will undermine Apple’s defense that it can’t comply with the broader demands.”

 

Via threatpost.com
