
Child Safety on iOS – Apple walks back photo-scanning plans

In 2021, Apple introduced a suite of automated features designed to protect children from potential abuse. Some of those tools remain, but after a huge backlash against its controversial photo-scanning plans, the company has ditched that part of the project.

Let’s first look at what Apple is not doing.

The original plan involved automatically scanning any photos uploaded to iCloud, checking each image's fingerprint against a database of known child sexual abuse material (CSAM). If the number of matches crossed a threshold, Apple would be alerted and the user's account flagged for human review.
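To make the mechanics concrete, here is a minimal sketch of that threshold-matching idea in Swift. All of the names are hypothetical, and it deliberately omits the cryptography: Apple's actual proposal paired a perceptual hash (NeuralHash) with private set intersection, so non-matching photos would have revealed nothing to the server.

```swift
import Foundation

// Hypothetical sketch of threshold-based hash matching. This is NOT
// Apple's implementation, just an illustration of the basic idea.

typealias PhotoHash = String  // stand-in for a perceptual hash digest

struct CSAMDatabase {
    let knownHashes: Set<PhotoHash>  // hashes of known abuse imagery

    func matches(_ hash: PhotoHash) -> Bool {
        knownHashes.contains(hash)
    }
}

struct UploadScanner {
    let database: CSAMDatabase
    let reportThreshold: Int

    /// Returns true only once enough uploaded photos match known
    /// material to warrant flagging the account for human review.
    func shouldFlagAccount(uploadedHashes: [PhotoHash]) -> Bool {
        let matchCount = uploadedHashes.filter(database.matches).count
        return matchCount >= reportThreshold
    }
}

let scanner = UploadScanner(
    database: CSAMDatabase(knownHashes: ["hash-of-known-image"]),
    reportThreshold: 30  // Apple reportedly planned roughly 30 matches
)
```

The threshold is the key design choice: a single false positive from a perceptual hash should never be enough to trigger a report on its own.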

Despite its child-safety intentions, a great many critics worried the system would compromise Apple's famous dedication to user privacy. Unsolicited photo scanning understandably makes customers uneasy, and it could become a slippery slope toward more invasive checks for other purposes. A global scanning system like that also creates opportunities for exploitation by government agencies or hackers.

After several delays, and a year spent taking feedback on board, Apple has now confirmed it will not be launching this CSAM-detection tool.

So what is Apple doing to keep kids safe, then?

The company told WIRED that it will instead focus on cutting off child abuse at the source with its Communication Safety features – some of which are already live.

Those features are opt-in systems that parents can enable for family iCloud accounts. Once active, automated checks can detect when a user is viewing or searching for problematic content, and surface resources to report it or seek help.

There’s also a similar function in the Messages app that can flag when a child attempts to send or receive images containing nudity. The detection is performed entirely on-device, only when parents have opted in, and Apple is never notified of the outcome. Instead, the child sees alerts warning them of the potential dangers, along with links to resources if they need help or support.
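Apple has since exposed the same style of on-device nudity detection to third-party apps through its SensitiveContentAnalysis framework (iOS 17 and later). As a rough illustration of how such an on-device check works, an app might gate an incoming image like this; the helper function is our own invention, though SCSensitivityAnalyzer itself is a real API:

```swift
import Foundation
import SensitiveContentAnalysis  // requires iOS 17+ / macOS 14+

/// Hypothetical helper: decide whether to blur an incoming image.
/// The analysis runs entirely on-device; nothing is sent to Apple.
func shouldBlur(imageAt url: URL) async throws -> Bool {
    let analyzer = SCSensitivityAnalyzer()

    // Detection stays off unless the user (or a parent, via Screen
    // Time's Communication Safety) has enabled it in Settings.
    guard analyzer.analysisPolicy != .disabled else { return false }

    let analysis = try await analyzer.analyzeImage(at: url)
    return analysis.isSensitive
}
```

Note the privacy-preserving shape of the design: the classifier returns a simple yes/no to the app on the device, and no report ever leaves the phone.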

These Communication Safety features aim to curb abuse before it happens, and they have generally been far better received than the now-canceled photo-scanning plans. It's a heavy topic, and a difficult balancing act for companies like Apple to get right without compromising user privacy, especially as Apple continues to strengthen security elsewhere, such as expanding end-to-end encryption to more iCloud data.

For more information, we recommend reading Apple’s child safety page.