Apple wants to scan all images on iPhones

Apple announced this week its plans for neuralMatch, software for iPhones that will scan all user images for illegal content involving children, initially only in the United States. The software is to be installed on all devices, regardless of the device owner’s consent.

The U.S. company’s plans for the new feature were reported by The Financial Times, which noted that Apple’s intentions raise many doubts among security and privacy researchers. This kind of intrusion into the privacy of millions of people could, in time, lead to surveillance of all devices for content well beyond the system’s stated target.

The neuralMatch system is designed to use an artificial intelligence algorithm to monitor images on iPhones and alert a team of content reviewers to potentially illegal material. If the team confirms that the detected content may be illegal, it will contact the appropriate law enforcement agencies.

Security experts are concerned

Security researchers told The Financial Times that while they support efforts to combat child abuse, they are concerned that Apple risks opening the door for governments around the world to access their citizens’ private data. Over time, that access could expand to content beyond what the neuralMatch system was originally designed to detect.

Ross Anderson, professor of security engineering at the University of Cambridge, called neuralMatch “an absolutely appalling idea, because it is going to lead to distributed bulk surveillance of […] our phones and laptops.”

The researchers also note that although the system is currently trained to spot child sexual abuse, it could be adapted to scan for any other targeted imagery and text, for instance terror beheadings or anti-government signs at protests. Apple’s precedent could also increase pressure on other tech companies to use similar techniques.

Matthew Green, a security professor at Johns Hopkins University, warned of where such technology could lead: “This will break the dam — governments will demand it from everyone.”

The system Apple wants to introduce may pressure other technology companies to build similar technology. The ability to continuously scan photos stored on iPhones, initially only those of U.S. users, will make governments more eager to work with Apple, and other companies will be forced to offer similar solutions to compete in this arena.

Algorithm that evaluates all images

Whether an image on an iPhone user’s device is legitimate will first be judged by an artificial intelligence algorithm. According to Apple, the algorithm was trained on 200,000 sexual abuse images collected by the U.S. nonprofit National Center for Missing and Exploited Children.

Every photo on the phone or uploaded to iCloud will receive a tag indicating whether it is suspicious. If a certain number of a user’s photos are flagged as suspicious, Apple will decrypt all of that person’s suspicious content; if the human review team determines the content may be illegal, it will be forwarded to the appropriate authorities.
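To make that flow concrete, here is a minimal sketch of the tagging-and-threshold logic in Python. Apple has not published its model, its scoring scale, or its thresholds, so everything below (the per-photo score, MATCH_THRESHOLD, REPORT_THRESHOLD) is a hypothetical stand-in for illustration, not Apple’s actual implementation.

```python
from dataclasses import dataclass, field

# Hypothetical values: Apple has not published its scoring scale
# or thresholds, so these numbers are stand-ins for illustration.
MATCH_THRESHOLD = 0.9    # per-photo score above which a photo is tagged
REPORT_THRESHOLD = 10    # tagged photos needed before escalation

@dataclass
class Photo:
    photo_id: str
    score: float         # hypothetical on-device classifier output, 0..1

@dataclass
class Account:
    flagged: list[str] = field(default_factory=list)

    def tag(self, photo: Photo) -> None:
        # Step 1: every photo gets a suspicious / not-suspicious tag.
        if photo.score >= MATCH_THRESHOLD:
            self.flagged.append(photo.photo_id)

    def should_escalate(self) -> bool:
        # Step 2: only when enough photos are tagged does the account's
        # flagged content get decrypted for human review.
        return len(self.flagged) >= REPORT_THRESHOLD
```

The key design point is the two-stage gate: no single photo triggers review on its own; only an accumulation of flags does.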

What happens when the system gets it wrong?

It is not yet known how the system will be protected against errors, so-called ‘false positives’, in which legal content is wrongly classified as illegal. It is also unclear who will be liable for damages users may suffer from false accusations.
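Some back-of-the-envelope arithmetic shows what is at stake. Since Apple has not published any error rates, the numbers below are purely illustrative: suppose each legal photo has a one-in-ten-thousand chance of being wrongly flagged, and that ten flags trigger escalation, as in the sketch above.

```python
from math import comb

def prob_false_escalation(n_photos: int, fp_rate: float, threshold: int) -> float:
    # Probability that `threshold` or more of `n_photos` perfectly legal
    # photos are falsely flagged, assuming each photo is misclassified
    # independently with probability `fp_rate` (itself a big assumption).
    below = sum(
        comb(n_photos, k) * fp_rate**k * (1 - fp_rate)**(n_photos - k)
        for k in range(threshold)
    )
    return 1.0 - below

# Illustrative numbers only: 10,000 legal photos, a 0.01% per-photo
# false-positive rate, and an escalation threshold of 10 flagged photos.
print(prob_false_escalation(10_000, 1e-4, 10))  # roughly 1e-7 per user
```

Under those invented numbers, a user with 10,000 legal photos faces only about a one-in-ten-million chance of false escalation, yet across hundreds of millions of iPhones that still implies dozens of innocent accounts being opened for human review.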

Anyone who has seen the 2012 movie The Hunt, starring Mads Mikkelsen, knows how a mere accusation of pedophilia, whether it is right or wrong, can completely destroy a person’s reputation and life. The situation presented in the movie is not very different from reality.

But the most difficult question is whether the system will actually serve its purpose. Advanced iOS users, people who place a high value on their privacy, and above all people who knowingly store illegal content on their devices will look for ways to circumvent neuralMatch. Judging by analogous situations, it should not take them long to find a workaround.
