
Apple warns staff to be ready for questions on CSAM scanning



Image: Zhiyue Xu/Unsplash.com

Apple has warned retail and online sales staff to be ready to field questions from consumers about the company’s upcoming features for limiting the spread of child abuse images.

In a memo to employees this week, the company asked staff to review a frequently asked questions document about the new safeguards, which are meant to detect sexually explicit images of children. The tech giant also said it will address privacy concerns by having an independent auditor review the system.

Earlier this month, the company announced a trio of new features meant to fight child pornography: support in Siri for reporting child abuse and accessing resources related to fighting CSAM, or child sexual abuse material; a feature in Messages that will scan devices operated by children for incoming or outgoing explicit images; and a new feature for iCloud Photos that will analyse a user’s library for explicit images of children.

This week’s memo read:

You may be getting questions about privacy from customers who saw our recent announcement about Expanded Protections for Children. There is a full FAQ here to address these questions and provide more clarity and transparency in the process. We’ve also added the main questions about CSAM detection below. Please take time to review the below and the full FAQ so you can help our customers.

The iCloud feature has been the most controversial among privacy advocates, some consumers and even Apple employees. It assigns what is known as a hash to each of the user’s images and compares those hashes to ones derived from a database of known explicit material. If a user is found to have such images in their library, Apple will be alerted, conduct a human review to verify the contents, and then report the user to law enforcement.


Apple initially declined to share how many potentially illicit images need to be in a user’s library before the company is alerted, sparking concern among some observers. On Friday, the company said that the initial threshold is about 30 pictures, but that the number could change over time.
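Conceptually, this is threshold-based hash matching: each image is reduced to a hash, the hashes are compared against a database of known material, and only a tally above the threshold triggers review. The sketch below illustrates that idea only. It is not Apple’s implementation: it uses an ordinary SHA-256 digest as a stand-in for Apple’s perceptual NeuralHash, compares hashes in the clear rather than through Apple’s on-device cryptographic protocol, and all function names are hypothetical; only the roughly 30-image threshold figure comes from the article.

```python
import hashlib
from typing import Iterable

# Reported initial threshold; Apple says this number could change over time.
MATCH_THRESHOLD = 30


def image_hash(image_bytes: bytes) -> str:
    """Hypothetical stand-in for a perceptual image hash (Apple uses NeuralHash)."""
    return hashlib.sha256(image_bytes).hexdigest()


def count_matches(library: Iterable[bytes], known_hashes: set[str]) -> int:
    """Count how many library images hash to a value in the known-image database."""
    return sum(1 for img in library if image_hash(img) in known_hashes)


def exceeds_threshold(library: Iterable[bytes], known_hashes: set[str]) -> bool:
    """Only once the match count crosses the threshold would a human review be triggered."""
    return count_matches(library, known_hashes) >= MATCH_THRESHOLD
```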

The Cupertino, California-based company also addressed concerns that governments could have images unrelated to child abuse inserted into the database in order to spy on users. In response, the company said its database would be made up of images sourced from multiple child-safety organisations, not just NCMEC, the US National Centre for Missing & Exploited Children, as was initially announced. It also said it will use data from groups in different regions and that the independent auditor will verify the contents of its database.

Apple previously said it would refuse any requests from governments to utilise its technology as a means to spy on users.  — (c) 2021 Bloomberg LP
