News

Apple will begin to scan iPhones for child sexual abuse imagery

On Thursday Apple announced new features that will automatically scan iPhone and iPad users’ photos to detect large collections of child sexual abuse images stored on iCloud servers, match photos against a database of known sexual abuse imagery and then alert the authorities if necessary.

In a statement Apple said that “we want to help protect children from predators who use communication tools to recruit and exploit them and limit the spread of Child Sexual Abuse Material (CSAM),” acknowledging that the “program is ambitious, and protecting children is an important responsibility.”

To identify those who hold such images, the program will turn photos on devices into an unreadable series of hashes (complex numbers) stored on user devices. These numbers will then be matched against a database of hashes provided by the National Center for Missing & Exploited Children.

The company said that the detection element of the program happens on the device, and insisted that only those who had a large cache of images uploaded to iCloud that matched the database of known sexual abuse material would cross the threshold, enabling the decryption of their photos so that Apple could analyse them.
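In rough terms, the hash-and-threshold idea works like the sketch below. This is only an illustration, not Apple’s implementation: the SHA-256 digest stands in for Apple’s own perceptual “NeuralHash”, and the hash list and threshold value are placeholders rather than real figures.

import Foundation
import CryptoKit

// Illustrative sketch of on-device hash matching against a known database.
// Apple's real system uses a perceptual hash and cryptographic matching;
// SHA-256 is used here only as a stand-in to show the flow.

// Placeholder for the hash database supplied by NCMEC in the real system.
let knownHashes: Set<String> = [
    "placeholder-hash-1",
    "placeholder-hash-2",
]

// Illustrative threshold a single account must exceed before any review.
let matchThreshold = 30

// Turn one photo's raw bytes into an unreadable hex digest.
func hashPhoto(_ photoData: Data) -> String {
    let digest = SHA256.hash(data: photoData)
    return digest.map { String(format: "%02x", $0) }.joined()
}

// Count matches against the known database and report whether the
// threshold is crossed for this set of uploaded photos.
func exceedsThreshold(photos: [Data]) -> Bool {
    let matches = photos.filter { knownHashes.contains(hashPhoto($0)) }.count
    return matches > matchThreshold
}

The threshold is the key detail of Apple’s description: a single chance match does not expose anything, and only an account whose match count crosses the limit is flagged for further analysis.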

Understandably, the move was welcomed by John Clark, president and CEO of the National Center for Missing & Exploited Children, who said that “Apple’s expanded protection for children is a game changer. With so many people using Apple products, these new safety measures have lifesaving potential for children who are being enticed online and whose horrific images are being circulated in child sexual abuse material.”

However, some experts have expressed concern about how the program could infringe on users’ privacy. One such expert, Greg Nojeim, co-director of the Security & Surveillance Project at the Center for Democracy & Technology, explained that “Apple is replacing its industry-standard end-to-end encrypted messaging system with an infrastructure for surveillance and censorship, which will be vulnerable to abuse and scope-creep not only in the U.S. but around the world.”

Nojeim finished by saying that “Apple should abandon these changes and restore its users’ faith in the security and integrity of their data on Apple devices and services.” In a post on its website outlining how the update will work, Apple said that it is “designed with user privacy in mind.”

As part of the new program, Apple’s Messages app will use “on-device machine learning to analyse image attachments and determine if a photo is sexually explicit.” The Messages app will also intervene if the system believes a minor is sending or receiving an image or video showing “the private body parts that you cover with bathing suits,” according to information provided by Apple.

This means that if a minor receives such an image, then “the photo will be blurred and the child will be warned, presented with helpful resources, and reassured it is okay if they do not want to view this photo.” Furthermore, “as an additional precaution the child can also be told that, to make sure they are safe, their parents will get a message if they do view it. Similar protections are available if a child attempts to send sexually explicit photos. The child will be warned before the photo is sent and the parents can receive a message if the child chooses to send it.”
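The Messages flow described above can be pictured with the following sketch. The classifier call and type names are hypothetical stand-ins; Apple’s on-device model and the APIs around it are not public.

import Foundation

// Conceptual sketch of the Messages intervention flow. None of these
// names are real Apple APIs.

struct IncomingImage {
    let data: Data
    let recipientIsMinor: Bool
    let parentalNotificationsEnabled: Bool
}

enum MessageAction {
    case showNormally
    case blurAndWarn(notifyParentsIfViewed: Bool)
}

// Placeholder for the on-device machine-learning classifier; a real
// implementation would run a local model over the image attachment.
func isSexuallyExplicit(_ image: Data) -> Bool {
    return false
}

func handle(_ image: IncomingImage) -> MessageAction {
    guard image.recipientIsMinor, isSexuallyExplicit(image.data) else {
        return .showNormally
    }
    // Blur the photo, warn the child and offer resources; parents are only
    // messaged if notifications are enabled and the child chooses to view it.
    return .blurAndWarn(notifyParentsIfViewed: image.parentalNotificationsEnabled)
}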

ARTICLE: NATHAN REID

MANAGING EDITOR: CARSON CHOATE
PHOTO CREDITS: THE GUARDIAN