Apple to bring tool to scan iPhone for child abuse images

The company said that it would also protect the privacy of its users

In a bid to create a safer space for children, Apple is reportedly developing a system to scan photos on iPhones for child sexual abuse material.


The company assured users that it will not pass reports from its photo-scanning system to law enforcement if a review finds no evidence of child abuse.


Apple said it will detect abusive images by comparing photos against a database of known Child Sexual Abuse Material, or CSAM, provided by the National Center for Missing and Exploited Children (NCMEC).


The company is using a technology called NeuralHash that analyzes an image and converts it into a hash, a unique string of numbers. That hash is then compared against the database using cryptographic techniques.
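To illustrate the general idea of hash-based matching, here is a minimal sketch in Python. NeuralHash itself is Apple's proprietary perceptual-hashing model, and the real comparison is performed blindly with cryptographic protocols; this stand-in uses a simple average hash and a plain in-memory lookup, with a hypothetical hash database, purely to show how an image can be reduced to a compact fingerprint and checked against a list of known hashes.

```python
# Illustrative sketch only: a simple average hash standing in for Apple's
# proprietary NeuralHash, and a plain set lookup standing in for the
# cryptographic comparison against the NCMEC-provided database.

from PIL import Image

def average_hash(path: str, size: int = 8) -> int:
    """Reduce an image to a 64-bit fingerprint: shrink it, convert to
    grayscale, then mark each pixel as above or below the mean brightness."""
    img = Image.open(path).convert("L").resize((size, size))
    pixels = list(img.getdata())
    mean = sum(pixels) / len(pixels)
    bits = 0
    for p in pixels:
        bits = (bits << 1) | (1 if p >= mean else 0)
    return bits

# Hypothetical database of known hashes; in the described system this would
# be the CSAM hash list supplied by NCMEC.
known_hashes = {0x1234567890ABCDEF}

def matches_known_material(path: str) -> bool:
    """Return True if the image's fingerprint appears in the known-hash set."""
    return average_hash(path) in known_hashes
```

A perceptual hash like this changes little when an image is resized or recompressed, which is why such fingerprints can match known material even after minor edits, whereas an ordinary cryptographic hash would not.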


There is growing pressure on tech companies to do more to take down illegal content and keep online communities safe.


However, WhatsApp chief Will Cathcart criticized Apple for its idea. He tweeted, "I read the information Apple put out yesterday and I'm concerned. I think this is the wrong approach and a setback for people's privacy all over the world."


He also made clear that WhatsApp will not change its policy or adopt the system. "People have asked if we'll adopt this system for WhatsApp. The answer is no," he said.


WhatsApp is also under pressure from governments that want to monitor their citizens, and the company fears that this pressure will only increase over time.

India Scanner News Network
