Apple has announced that it will temporarily pause the rollout of its NeuralMatch technology, which is designed to scan images on users’ devices for illegal content and send reports directly to law enforcement.
In early August, Apple announced its plans to develop NeuralMatch. Using an artificial intelligence algorithm, the technology is intended to monitor users’ devices and flag potentially illegal images to a team of content reviewers. If reviewers confirmed that the detected content might be illegal, the information would be forwarded to the appropriate law enforcement agencies.
Apple needs more time
Apple confirmed yesterday that the project has been temporarily put on hold. The delay is meant to allow for necessary improvements to the system and to give the company time to gather more feedback about it.
“Last month we announced plans for features intended to help protect children from predators who use communication tools to recruit and exploit them, and limit the spread of [Child Sexual Abuse Material],” reads Apple’s announcement. “Based on feedback from customers, advocacy groups, researchers and others, we have decided to take additional time over the coming months to collect input and make improvements before releasing these critically important child safety features.”
Concerns about NeuralMatch
Security researchers warned that Apple risked opening the door for governments around the world to access their citizens’ private data, and that over time such access could expand to cover other kinds of content.
Ross Anderson, professor of security engineering at the University of Cambridge, called NeuralMatch “an absolutely appalling idea, because it is going to lead to distributed bulk surveillance of […] our phones and laptops.”
In response to the plan to create software that monitors images on all devices, the Electronic Frontier Foundation launched a petition urging Apple to abandon its NeuralMatch technology.
The Electronic Frontier Foundation noted: “Child abuse is a scourge, but it can be investigated and prosecuted without breaking encryption, or scanning our private personal photos. Crimes against children cannot be an excuse for Apple to install surveillance software that will scan millions of iPhones.”
The US-based Woodhull Freedom Foundation, which works to protect the fundamental human right to sexual freedom, also took to Twitter to encourage people to join a campaign organized by the Electronic Frontier Foundation.
It seems that Apple has heard the concerns and objections raised by the news about NeuralMatch. Although the plan to introduce the technology has been suspended rather than abandoned, Apple’s statement suggests that the company intends to work with experts and organizations to find a solution that protects both children and the privacy of iPhone users.