Apple is making headlines this week with upcoming changes to its operating systems. Its Child Safety web page says:

"We want to help protect children from predators who use communication tools to recruit and exploit them, and limit the spread of Child Sexual Abuse Material (CSAM)."

Uncontroversial so far: we all want to stop child abuse. Reading on, the page describes how new tools inside Apple's messaging app, iMessage, will automatically warn young people (and, for specified age ranges, their parents) if they send or receive sexually explicit images.

Let's hope their AI works better than Tumblr's clumsy attempt to remove adult material from its website, or Facebook's equally poor systems that famously flagged The Little Mermaid as unsuitable.

 

Automated iCloud scanning

The controversial part comes further down the page:

"Apple’s method of detecting known CSAM is designed with user privacy in mind. Instead of scanning images in the cloud, the system performs on-device matching using a database of known CSAM image hashes provided by NCMEC and other child safety organizations. Apple further transforms this database into an unreadable set of hashes that is securely stored on users’ devices."

To translate: Apple's systems will look at every photo in your iCloud library and run each one through a mathematical process that produces a very long number, called a hash, intended to be unique to that image. The process is designed so that if you crop a photo, or even convert it to black and white, the hash stays the same. Your device will compare this hash against a list of hashes of known CSAM images, and if it finds enough matches in your iCloud library it will notify Apple, who will then verify the matches and report the account to the National Center for Missing & Exploited Children (NCMEC), which works with law enforcement.
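
To make the mechanics concrete, here is a minimal sketch of the general idea in Python. It uses a toy "average hash" rather than Apple's actual NeuralHash algorithm, and the blocklist values, file names, and match threshold are made up purely for illustration; a real perceptual-hash system would also typically tolerate a few differing bits rather than demanding an exact match.

```python
# A minimal sketch of hash-based photo matching (toy "average hash", not
# Apple's NeuralHash). Blocklist values and file names below are made up.
# Requires Pillow: pip install Pillow
from PIL import Image

def average_hash(path, size=8):
    # Shrink the image to an 8x8 greyscale grid and record, for each pixel,
    # whether it is brighter than the average. Converting the photo to black
    # and white, or lightly editing it, changes very few of these 64 bits.
    img = Image.open(path).convert("L").resize((size, size))
    pixels = list(img.getdata())
    avg = sum(pixels) / len(pixels)
    bits = "".join("1" if p > avg else "0" for p in pixels)
    return int(bits, 2)  # the "very long number"

# Hypothetical blocklist of known hashes (placeholders, not real data)
KNOWN_HASHES = {0x8F3C00001234ABCD, 0x00FF00FF00FF00FF}
MATCH_THRESHOLD = 3  # only act once several photos match, as Apple describes

library = ["holiday.jpg", "cat.jpg", "screenshot.png"]  # stand-in photo files
matches = sum(1 for photo in library if average_hash(photo) in KNOWN_HASHES)
if matches >= MATCH_THRESHOLD:
    print("Threshold reached: flag the account for human review")
```

Apple's design goes further than this sketch: as the quote above says, the on-device list of known hashes is itself transformed into an unreadable form, so your phone stores the blocklist without being able to read it.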

Again, this sounds like a great idea. It's important to note that Apple isn't downloading CSAM onto your phone, just the long numbers that identify each known CSAM item, and it's impossible to recreate an image from its number. It's equally important to note that taking photos of your kids in the bath isn't going to trigger this system: it is looking only for very specific, known images of CSAM.

The tools will only match photos that are already known to be CSAM and identified as such by the US-based National Center for Missing & Exploited Children. NCMEC has taken on the unenviable task of assembling a database of all known CSAM and computing hashes of that material for exactly this purpose.

This is very clever indeed, and I applaud their efforts. But can you spot the problem here?

 

Apple's focus is not your privacy

The problem has nothing to do with CSAM and everything to do with how much you trust technology companies and governments. Apple has been criticised for many years because it holds the keys to decrypt your device backups, photos, and anything you put in iCloud Drive, and will hand these over to law enforcement with the proper legal authority. Apple also shelved plans to encrypt device backups "after the FBI complained that the move would harm investigations".

This new technology will demonstrate to oppressive governments new options for scanning their citizens' devices. While Apple's privacy chief Erik Neuenschwander has explicitly said that "safeguards are in place to prevent abuse of the system and that Apple would reject any such demands from a government", the New York Times reported in May that Apple has already caved to Chinese government demands for access to Apple users' data, which is stored in data centres nominally controlled by Apple but in reality run by the Chinese state:

"Chinese state employees physically manage the computers. Apple abandoned the encryption technology it used elsewhere after China would not allow it. And the digital keys that unlock information on those computers are stored in the data centers they’re meant to secure."

 

Future applications of this technology

It's worth noting that this technology is currently only being rolled out in the US, but it's not so hard to see where this could go next.

Maybe the Chinese government pressures Apple into adding more items to its database of banned media, starting with photos of the 1989 Tiananmen Square massacre. Maybe the Russian government adds the gay pride flag to its database to help with prosecutions under its gay propaganda laws.

But that's okay, because you don't live in China or Russia, and surely the UK government would never spy on its citizens. Right?

Sadly, we live in an age in which our privacy is under active attack by bad actors. The recent Pegasus spyware revelations mean we all have to seriously consider what rights we're willing to concede for a safer society.

The Cato Institute sums it up really well:

"Described more abstractly and content neutrally, here’s what Apple is implementing: A surveillance program running on the user’s personal device, outside the user’s control, will scan the user’s data for files on a list of prohibited content, and then report to the authorities when it finds a certain amount of content on the list. Once the architecture is in place, it is utterly inevitable that governments around the world will demand its use to search for other kinds of content—and to exert pressure on other device manufacturers to install similar surveillance systems."

 

 
