Policy groups ask Apple to drop plans to scan iMessages for child abuse

Scanning of iPhones without prior user consent threatens the privacy and security of iPhone users.

Apple's plan to counter child abuse has backfired, as it could threaten the privacy and security of iPhone users worldwide.

According to reports, 90 political, human rights, and advocacy groups from around the world have sent letters to Apple asking it to abandon its plan to scan iPhones for child sexual abuse material (CSAM). The company has faced pushback over privacy and security concerns.

Legal groups are expected to raise the matter with the company. They have stated that while the plan might help protect children and curb the spread of child sexual abuse imagery, scanning iPhones without prior user consent threatens the privacy and security of users, and that the company is taking the wrong approach to meeting those goals.

The Center for Democracy and Technology (CDT) said that this process of identifying and flagging problematic content is a blunt tool.

The organization is also concerned that governments will pressure Apple to expand the use of its hash database beyond scanning content uploaded to iCloud: once the capability for client-side CSAM scanning is introduced, it opens the door to demands that such scanning be applied to all images stored on phones. CDT called the plan 'misguided' and urged Apple to halt such surveillance.

India Scanner News Network
