In the latest update, WhatsApp Head Will Cathcart has slammed Apple over its plan to launch photo identification measures designed to detect child abuse images in iOS photo libraries. WhatsApp claims the move would be a breach of privacy, since Apple would be able to scan through users' private photos.
WhatsApp Head Expresses Concerns Over Apple’s Approach Towards Fighting Child Sexual Abuse
As per the latest reports, Cathcart stressed that WhatsApp will not allow any of Apple's tools to run on its platform. He said that Apple's fight against Child Sexual Abuse Material (CSAM) is long overdue, but argued that the approach the company is reportedly taking should concern the whole world.
In a Twitter thread late on Friday, Cathcart said that he had read the information Apple had put out and that he was concerned. He called it the wrong approach and a setback for people's privacy all across the world.
He further clarified that people have asked whether WhatsApp will adopt the same system, and the answer is "no". Cathcart accused Apple of building software that can scan all the personal pictures on a phone, including pictures that have never been shared with anyone, instead of focusing on making it easy for people to report content that gets shared with them.
“That’s not privacy,” said Cathcart.
Apple To Roll Out New Applications To Limit The Spread Of CSAM
On Thursday, Apple announced plans to deploy new technology within iOS, macOS, watchOS and iMessage that will be able to detect potential child abuse imagery.
As per the reports, the new features will roll out to devices in the US with new versions of Apple's operating systems this fall. However, Cathcart warned that an Apple-built surveillance system could very easily be used to scan private content.