Apple has announced that it will temporarily stop testing its NeuralMatch technology, which is meant to scan images on its users’ devices for illegal content and send reports directly to law enforcement.
In early August, Apple announced plans to develop its NeuralMatch software. Using an artificial intelligence algorithm, the technology is intended to monitor users’ devices and alert a team of content reviewers to potentially illegal images. If reviewers confirmed that the detected content might be illegal, the information would be forwarded to the appropriate law enforcement agencies.
Apple needs more time
Apple confirmed yesterday that the project has been temporarily put on hold. The delay in the implementation of NeuralMatch is meant to allow for necessary improvements to the system and to give the company time to gather more feedback on it.
“Last month we announced plans for features intended to help protect children from predators who use communication tools to recruit and exploit them, and limit the spread of [Child Sexual Abuse Material],” reads Apple’s announcement. “Based on feedback from customers, advocacy groups, researchers and others, we have decided to take additional time over the coming months to collect input and make improvements before releasing these critically important child safety features.”
Concerns about NeuralMatch
Security researchers warned that Apple risked opening the door for governments around the world to access their citizens’ private data, and that over time the scanning could expand to cover other kinds of content.
Ross Anderson, professor of security engineering at the University of Cambridge, called NeuralMatch “an absolutely appalling idea, because it is going to lead to distributed bulk surveillance of […] our phones and laptops.”
In response to the plans to create software that monitors images on all devices, the Electronic Frontier Foundation organized a petition to convince Apple to abandon its NeuralMatch technology.
The Electronic Frontier Foundation noted: “Child abuse is a scourge, but it can be investigated and prosecuted without breaking encryption, or scanning our private personal photos. Crimes against children cannot be an excuse for Apple to install surveillance software that will scan millions of iPhones.”
The US-based Woodhull Freedom Foundation, which works to protect the fundamental human right to sexual freedom, also took to Twitter to encourage people to join the campaign organized by the Electronic Frontier Foundation.
It seems that Apple has heard the concerns and objections raised over NeuralMatch. The plan to introduce the technology has been suspended rather than abandoned, and Apple’s statement suggests the company intends to work with experts and organizations to find a solution that protects both children and the privacy of iPhone users.