Apple changed its mind about scanning users’ photos
- December 9, 2022
The American company Apple has abandoned its program to scan user photos for child sexual abuse material. This was stated by Apple Senior Vice President Craig Federighi in an interview with The Wall Street Journal.
“Child sexual abuse can be prevented before it happens. From this perspective, we are now directing our efforts to protect children,” said the executive.
He stressed that keeping user data safe is crucial, and that Apple will place special emphasis on protecting children by giving parents tools to safeguard their underage children in iMessage.
In a conversation with WSJ reporter Joanna Stern, Federighi also noted that the decision to drop the project was indirectly related to the introduction of end-to-end encryption for data in iCloud.
Apple announced back in 2021 that it would scan users’ personal photos in iCloud Photos galleries using the Child Sexual Abuse Material (CSAM) Detection system. The system was expected to start working on all of the company’s devices with the release of a new version of the operating system for each of them.
However, the initiative drew harsh criticism from rights advocates and Apple device users, who saw it as a threat to their privacy. Apple subsequently postponed the system’s rollout indefinitely.
Source: Port Altele