On Thursday, Apple announced that it would scan iPhones in the US to detect child abuse imagery, raising alarm among security experts who warned that the plan could open the door to surveillance of tens of millions of personal devices.
The company told reporters that the new scanning technology is part of a suite of child protection programs that will evolve and expand over time. It will ship with iOS 15, which is scheduled for release in August.
Apple has long positioned itself as a company that safeguards users' right to privacy, and it appears to be trying to preempt privacy concerns by arguing that the software will enhance protections by avoiding the need for widespread image scanning on its cloud servers.
“This innovative new technology allows Apple to provide valuable and actionable information to [the National Center for Missing and Exploited Children] and law enforcement regarding the proliferation of known CSAM,” stated the company. “And it does so while providing significant privacy benefits over existing techniques since Apple only learns about users’ photos if they have a collection of known CSAM in their iCloud Photos account. Even in these cases, Apple only learns about images that match known CSAM.”
The Cupertino-based tech giant said the system will use new cryptography technology and artificial intelligence to detect abusive material stored in iCloud Photos. Flagged images will be compared against a database of known illegal images, and matches will be reported to the National Center for Missing and Exploited Children. Bear in mind that the software won't be used for videos.
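To make the idea of database matching concrete, here is a heavily simplified sketch in Python. It checks exact SHA-256 content hashes against a hypothetical known-hash set; Apple's actual system reportedly uses perceptual hashes and cryptographic matching techniques rather than plain digest lookups, so the names and hashes below are purely illustrative.

```python
import hashlib

# Hypothetical database of hex digests of known flagged images.
# (Illustrative only: the entry below is simply sha256(b"foo"),
# standing in for a real hash database that a provider would hold.)
KNOWN_HASHES = {
    "2c26b46b68ffc68ff99b453c1d30413413422d706483bfa0f98a5e886266e7ae",
}

def file_hash(data: bytes) -> str:
    """Return the SHA-256 hex digest of raw image bytes."""
    return hashlib.sha256(data).hexdigest()

def matches_known_database(data: bytes) -> bool:
    """True only if this exact content appears in the known-hash set."""
    return file_hash(data) in KNOWN_HASHES

print(matches_known_database(b"foo"))  # True: digest is in the set
print(matches_known_database(b"bar"))  # False: digest is not in the set
```

Note the key limitation this sketch exposes: an exact cryptographic hash changes completely if a single byte of the image changes, which is why real systems in this space rely on perceptual hashing that tolerates resizing and re-encoding.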
“Apple’s expanded protection for children is a game-changer,” John Clark, CEO of the National Center for Missing and Exploited Children, said. “The reality is that privacy and child protection can coexist.”
Some experts and researchers, however, said the program raises significant privacy concerns.
Ross Anderson, a professor of security engineering at the University of Cambridge, described the system as “an absolutely appalling idea.” “It is going to lead to distributed bulk surveillance of … our phones and laptops,” he said.
When the news broke on Wednesday, Johns Hopkins University professor and cryptographer Matthew Green echoed those concerns.
“This sort of tool can be a boon for finding child pornography in people’s phones,” Green wrote on Twitter. “But imagine what it could do in the hands of an authoritarian government?”
Green stated: “if you believe Apple won’t allow these tools to be misused [crossed fingers emoji] there’s still a lot to be concerned about,” noting that such “systems rely on a database of ‘problematic media hashes’ that you, as a consumer, can’t review.”
Speaking to the Associated Press, Green added that Apple could come under pressure from more authoritarian governments to use the scanning technology for other purposes.
Microsoft's PhotoDNA already helps companies identify child sexual abuse material on the internet, and Facebook and Google use similar systems to flag and review potentially illegal content.