
Apple announces plans to scan every iPhone for child abuse images


WHY THIS MATTERS IN BRIEF

This strikes at the heart of the moral conundrum people face when it comes to the ethics of privacy – on the one hand the idea is to be praised, and on the other demonised …

 

Love the Exponential Future? Join our XPotential Community, future-proof yourself with courses from XPotential University, connect, watch a keynote, or browse my blog.

The battle for privacy, or for anonymity depending on how you frame it, is in full swing. And, frankly, consumers aren’t winning, like at all – unless you count Europe’s new GDPR regulation, which, to be fair, did move the dial.

And now Apple, in a move with both great and awful implications, has announced that it will scan the photo libraries stored on iPhones in the US for known images of child sexual abuse, drawing praise from child protection groups but crossing a line that privacy campaigners warn could have dangerous ramifications. The company will also examine the contents of end-to-end encrypted messages for the first time.

 


 

Apple’s Artificial Intelligence (AI) tool, called neuralMatch, will scan images before they are uploaded to the company’s iCloud Photos online storage, comparing them against a database of known child abuse imagery. If a strong enough match is flagged, then Apple staff will be able to manually review the reported images, and, if child abuse is confirmed, the user’s account will be disabled and the National Center for Missing and Exploited Children (NCMEC) notified.

 

Learn more about the Future of Privacy, by keynote speaker Matthew Griffin

 

Since the tool only looks for images that are already in NCMEC’s database, parents taking photos of a child in the bath, for example, apparently need not worry. But researchers caution that the matching tool – which does not “see” images, just the mathematical fingerprints that represent them – could be put to other purposes.
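To make the “fingerprint” idea concrete, here is a minimal, purely illustrative sketch of how matching a photo’s fingerprint against a database of known-image fingerprints might work. Apple has not published neuralMatch’s internals, so the hash size, threshold and function names below are assumptions for explanation, not the company’s actual implementation.

```python
# Illustrative sketch only: assumes a perceptual-hash style approach, where each
# image is reduced to a compact fingerprint and compared against a database of
# fingerprints of known abuse imagery. All names and thresholds are hypothetical.

from dataclasses import dataclass

FINGERPRINT_BITS = 64   # assumed fingerprint size
MATCH_THRESHOLD = 4     # assumed maximum Hamming distance to count as a match


@dataclass(frozen=True)
class Fingerprint:
    bits: int           # fingerprint packed into an integer


def hamming_distance(a: Fingerprint, b: Fingerprint) -> int:
    """Number of differing bits between two fingerprints."""
    return bin(a.bits ^ b.bits).count("1")


def matches_known_database(candidate: Fingerprint,
                           known: list[Fingerprint]) -> bool:
    """True if the candidate is 'close enough' to any known fingerprint.

    The key property: the check never needs the original image, only its
    fingerprint, which is why such a system can run before upload without
    'seeing' the photo content itself.
    """
    return any(hamming_distance(candidate, k) <= MATCH_THRESHOLD for k in known)


# Example: a candidate that differs from a known fingerprint by two bits is flagged.
known_db = [Fingerprint(0b1011_0110_0101_0011)]
candidate = Fingerprint(0b1011_0110_0101_0000)       # 2 bits differ
print(matches_known_database(candidate, known_db))   # True
```

In a real deployment the flagged result would only trigger the human-review step described above once a strong enough match is found; the sketch shows the matching step, not the review pipeline.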

Matthew Green, a cryptography researcher at Johns Hopkins University, warned that the system could theoretically be used to frame innocent people by sending them seemingly innocuous images designed to trigger matches for child abuse images.

 


 

“Researchers have been able to do this pretty easily,” he said of the ability to trick such systems. Other abuses could include government surveillance of dissidents or protesters.

“What happens when the Chinese government says: ‘Here is a list of files that we want you to scan for’?” Green asked. “Does Apple say no? I hope they say no, but their technology won’t say no.”

Tech companies including Microsoft, Google and Facebook have for years been sharing digital fingerprints of known child sexual abuse images. Apple has used those to scan user files stored in its iCloud service for child abuse images. But the decision to move such scanning on-device is unprecedented among major technology companies.

 


 

Alongside the neuralMatch technology, Apple plans to scan users’ encrypted messages as they are sent and received using iMessage. An AI-based tool will attempt to automatically identify sexually explicit images, enabling parents to turn on automatic filters for their children’s inboxes. That system, which is purely aimed at providing tools to “warn children and their parents when receiving or sending sexually explicit photos”, will not result in sexually explicit images being sent to Apple or reported to the authorities. But parents can choose to be notified if their child decides to send or receive sexually explicit photos.
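By way of illustration only, the sketch below shows the general shape of such an on-device check: a local classifier scores the image, and family settings control blurring, the warning shown to the child, and whether parents are notified. The classifier stub, threshold and setting names are assumptions, not Apple’s published design.

```python
# Purely illustrative sketch of an on-device explicit-image check for a child's
# account. Everything happens locally and nothing is reported to Apple in this
# flow; all names and thresholds are hypothetical.

EXPLICIT_THRESHOLD = 0.9  # assumed confidence above which a photo is treated as explicit


def classify_explicit(image_bytes: bytes) -> float:
    """Stand-in for an on-device ML model returning a 0..1 'explicitness' score."""
    # A real system would run a local neural network; this stub just lets the
    # example execute end to end.
    return 0.95 if image_bytes.startswith(b"EXPLICIT") else 0.05


def handle_incoming_photo(image_bytes: bytes,
                          filter_enabled: bool,
                          notify_parents: bool) -> dict:
    """Decide what the Messages app does with an incoming photo on a child's device."""
    score = classify_explicit(image_bytes)
    if not filter_enabled or score < EXPLICIT_THRESHOLD:
        return {"blurred": False, "child_warned": False, "parents_notified": False}
    return {
        "blurred": True,                     # photo hidden behind a warning screen
        "child_warned": True,                # child must tap through a warning to view it
        "parents_notified": notify_parents,  # only if the family has opted in
    }


# Example: an explicit photo arrives with the filter and parental notification on.
print(handle_incoming_photo(b"EXPLICIT...", filter_enabled=True, notify_parents=True))
# {'blurred': True, 'child_warned': True, 'parents_notified': True}
```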

Apple has been under government pressure for years to allow for increased surveillance of encrypted data. Coming up with the new security measures required Apple to perform a delicate balancing act between cracking down on the exploitation of children and keeping its high-profile commitment to protecting the privacy of its users.

But the Electronic Frontier Foundation, an online civil liberties pioneer, called Apple’s compromise on privacy protections “a shocking about-face for users who have relied on the company’s leadership in privacy and security.”

 


 

The computer scientist who more than a decade ago invented PhotoDNA, the technology used by law enforcement to identify child abuse images online, acknowledged the potential for abuse of Apple’s system but said it was far outweighed by the imperative of tackling child sexual abuse.

“Is it possible? Of course. But is it something that I’m concerned about? No,” said Hany Farid, a researcher at the University of California, Berkeley, who argued that plenty of other programs designed to secure devices from various threats had not been affected by “this type of mission creep”. For example, WhatsApp provides users with end-to-end encryption to protect their privacy but also employs a system for detecting malware and warning users not to click on harmful links.

Apple was one of the first major companies to embrace end-to-end encryption, in which messages are scrambled so that only their senders and recipients can read them. Law enforcement has long pressed the company for access to that information. Apple said the latest changes would roll out this year as part of updates to its operating software for iPhones, Macs and Apple Watches.

 


 

“Apple’s expanded protection for children is a gamechanger,” said John Clark, the president and chief executive of the NCMEC. “With so many people using Apple products, these new safety measures have lifesaving potential for children.”

Apple denied that the changes amounted to a backdoor that degraded its encryption. It said they were carefully considered innovations that did not disturb user privacy but rather protected it.

“At Apple, our goal is to create technology that empowers people and enriches their lives – while helping them stay safe,” the company said in a post announcing the new features. “We want to help protect children from predators who use communication tools to recruit and exploit them, and limit the spread of child sexual abuse material (CSAM).

“This program is ambitious, and protecting children is an important responsibility. These efforts will evolve and expand over time.”
