US iPhones to Scan for Images of Child Sexual Abuse, Apple Announces

Apple announced it would implement a new system in the United States to check photos on iPhones for known images of child sex abuse before they are uploaded to the company’s iCloud storage service.

If a user is detected uploading child sex abuse images, Apple can initiate a human review and report the user to law enforcement, the company said. The system is designed to minimize false positives, which the company says should occur for fewer than one in one trillion accounts per year. Other large technology companies, including Google, Facebook and Microsoft, already have systems in place to check images against a database of known child sex abuse imagery.

With the new system, Apple is trying to balance competing imperatives: the company faces requests from law enforcement officials to help stem the tide of child sex abuse, while privacy and security are core tenets of the Apple brand. The new system, Apple believes, can satisfy both. Dubbed “NeuralHash,” it is also designed to catch images of child sex abuse that have been edited or that are similar to ones known to law enforcement.

In the US, law enforcement maintains a database of known child sex abuse imagery that has been translated into “hashes,” or codes, that positively identify an image of child sex abuse but cannot be used to reconstruct it.

iPhones will create a hash of each image being uploaded to the company’s iCloud storage service and compare it against the existing database. The company has promised that a human review will occur before any information is passed to law enforcement.
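As a rough sketch of that flow, assuming nothing about NeuralHash’s internals: SHA-256 stands in here for the perceptual hash, and the database contents and function names are hypothetical.

```swift
import CryptoKit
import Foundation

// Hypothetical database of hashes of known images, stored as hex strings.
// As the article notes, such hashes identify an image but cannot be used
// to reconstruct it.
let knownImageHashes: Set<String> = [
    "2c26b46b68ffc68ff99b453c1d30413413422d706483bfa0f98a5e886266e7ae"
]

// Hash an image's raw bytes. SHA-256 is only a stand-in for the
// perceptual NeuralHash, which also tolerates edits to the image.
func hexDigest(of imageData: Data) -> String {
    SHA256.hash(data: imageData)
        .map { String(format: "%02x", $0) }
        .joined()
}

// On-device check: a match does not report anyone directly; it flags
// the account for the human review described above.
func needsHumanReview(_ imageData: Data) -> Bool {
    knownImageHashes.contains(hexDigest(of: imageData))
}
```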

A key aspect of the design is that images are checked before they reach company servers: photos stored only on the iPhone will not be scanned, just those uploaded to iCloud. Users who feel their accounts were suspended improperly will have the right to appeal, the company said.
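A minimal sketch of that gating rule, again with hypothetical types and names: only photos queued for iCloud upload are checked, and a match flags the account for review rather than reporting it outright.

```swift
import Foundation

// Hypothetical model of a photo on the device.
struct Photo {
    let data: Data
    let isQueuedForICloudUpload: Bool
}

// Apply the hash check only to photos headed for iCloud; photos kept
// solely on the device are never examined.
func photosToFlagForReview(
    _ photos: [Photo],
    matchesKnownHash: (Data) -> Bool
) -> [Photo] {
    photos.filter { $0.isQueuedForICloudUpload && matchesKnownHash($0.data) }
}
```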

John Clark, the chief executive of the National Center for Missing & Exploited Children, said in a statement, “These new safety measures have lifesaving potential for children who are being enticed online and whose horrific images are being circulated in child sexual abuse material.”

While many cryptographers and security researchers see the need for, and the utility of, such technology from Apple, there are concerns that it is a gateway to greater demands by authoritarian states.

Matthew Green, a top cryptography researcher at Johns Hopkins University, raised two significant concerns.

The first is the gateway problem noted above: once the scanning infrastructure exists, governments could press Apple to search for material beyond child sex abuse imagery.

The second is that vulnerable individuals, such as dissidents or even business competitors, could be targeted by being sent harmless but maliciously crafted images designed to register as child sex abuse material. Such images could fool Apple’s algorithm and alert law enforcement, effectively framing an innocent person.
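To see why such collisions are plausible, consider a toy “average hash” over an 8×8 grayscale grid. This is not NeuralHash, but it shows how perceptual hashes trade exactness for edit tolerance: any change that keeps each pixel on the same side of the image’s average brightness leaves the hash untouched, giving an attacker room to craft a benign-looking match.

```swift
// Toy "average hash" over 64 grayscale pixels (an 8x8 grid). This is
// NOT NeuralHash; it only illustrates why perceptual hashes can collide.
func averageHash(_ pixels: [UInt8]) -> UInt64 {
    precondition(pixels.count == 64, "expects an 8x8 grayscale grid")
    let avg = pixels.map(Int.init).reduce(0, +) / pixels.count
    var bits: UInt64 = 0
    for (i, p) in pixels.enumerated() where Int(p) > avg {
        bits |= 1 << UInt64(i)  // set bit i if pixel is brighter than average
    }
    return bits
}

// Two clearly different images can share a hash.
let imageA: [UInt8] = Array(repeating: 10, count: 32) + Array(repeating: 200, count: 32)
let imageB: [UInt8] = Array(repeating: 40, count: 32) + Array(repeating: 160, count: 32)
print(averageHash(imageA) == averageHash(imageB))  // true
```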
