Apple defends new photo-scanning child protection tech


Apple has defended its new system that scans users’ phones for child sexual abuse material (CSAM), after a backlash from customers and privacy advocates.

The technology compares photos against a database of known abuse imagery before each image is uploaded to Apple's iCloud storage.
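In outline, the check is a membership test: the device computes a fingerprint of the photo and looks it up in a table of fingerprints derived from known material. The sketch below illustrates that idea only; Apple's actual system uses a perceptual hash (NeuralHash), private set intersection, and a match threshold, not the exact SHA-256 digest and hypothetical knownHashes set used here.

```swift
import Foundation
import CryptoKit

// Hypothetical database of fingerprints of known material.
// In the real system these would be perceptual hashes supplied
// in blinded form, not plain hex strings.
let knownHashes: Set<String> = [
    "3a1f..."  // placeholder entry, for illustration only
]

// Stand-in fingerprint: an exact SHA-256 digest of the image bytes.
// Apple's system instead uses NeuralHash, which tolerates resizing
// and re-encoding of the same image.
func fingerprint(of imageData: Data) -> String {
    SHA256.hash(data: imageData)
        .map { String(format: "%02x", $0) }
        .joined()
}

// The check runs only on the iCloud upload path: a photo that is
// never uploaded is never compared.
func shouldFlagBeforeUpload(_ imageData: Data) -> Bool {
    knownHashes.contains(fingerprint(of: imageData))
}
```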

Critics warned it could be a “backdoor” to spy on people, and more than 5,000 people and organisations have signed an open letter against the technology.

As a result, Apple has pledged not to “expand” the system for any reason.

Digital privacy campaigners warned last week that authoritarian governments could use the technology to bolster anti-LGBT regimes, or crack down on political dissidents in countries where protests are deemed illegal.

But Apple said it "will not accede to any government's request to expand" the system.

It published a question-and-answer document, saying it had numerous safeguards in place to stop its systems from being used for anything other than the detection of child abuse imagery.

“We have faced demands to build and deploy government-mandated changes that degrade the privacy of users before, and have steadfastly refused those demands. We will continue to refuse them in the future,” it said.

However, Apple has made some concessions in the past in order to keep operating in countries around the world.
