The head of WhatsApp has criticised Apple’s decision to scan iPhones for images of child sexual abuse, saying the approach introduces “something very concerning into the world.”

On Thursday, the tech giant revealed its plan to scan the iPhones of US users for known images of child sexual abuse stored on their devices. While Apple insisted the process is secure and designed to preserve user privacy, not everyone is convinced.

WhatsApp head Will Cathcart tweeted, “I read the information Apple put out yesterday and I’m concerned. I think this is the wrong approach and a setback for people’s privacy all over the world.” He added that the system can “scan all of a user’s private photos on your phone, even photos you have not shared with anyone.”

An Apple spokesperson said the company will use a hashing tool, called NeuralHash, that analyses an image and maps it to a unique number. The system then matches these numbers against a database of known child sexual abuse image hashes provided by the National Center for Missing and Exploited Children (NCMEC) and other child-safety organisations.

If a match is found, the image will be reviewed manually; upon confirmation, the user’s account will be disabled and NCMEC will be notified.
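In broad strokes, the matching step described above amounts to computing a hash for each image and checking it against a set of known hashes. The sketch below illustrates the idea only: it substitutes a cryptographic SHA-256 digest for Apple’s proprietary NeuralHash (a real perceptual hash maps visually similar images to the same value, which SHA-256 deliberately does not), and the database contents are placeholders.

```python
import hashlib

def image_hash(image_bytes: bytes) -> str:
    """Hypothetical stand-in for NeuralHash: a SHA-256 digest of the raw
    image bytes, used here purely to illustrate hash-based matching."""
    return hashlib.sha256(image_bytes).hexdigest()

# Illustrative database of known-image hashes, standing in for the list
# that child-safety organisations such as NCMEC would supply.
known_hashes = {image_hash(b"placeholder-known-image")}

def matches_known_database(image_bytes: bytes) -> bool:
    """Return True if the image's hash appears in the known-hash database."""
    return image_hash(image_bytes) in known_hashes
```

In the real system a match would not immediately trigger action; per Apple’s description, flagged images go to human review before any account is disabled.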

The WhatsApp chief blasted Apple for deploying software that can scan all the private photos on a phone, instead of focusing on making it easy for users to report such content when it is shared with them.

Cathcart added, “This is an Apple built and operated surveillance system that could very easily be used to scan private content for anything they or a government decides it wants to control. Countries where iPhones are sold will have different definitions on what is acceptable.” Apple has long made privacy a selling point for its products and services.
