
Apple changed its mind and won’t scan users’ private photos

December 9, 2022

Apple has stopped developing technology designed to detect child sexual abuse material (CSAM) on users’ devices. The plan was made public last fall, when the company announced it would be implemented in iOS 15; we covered it in detail in a separate article.

What is known

The backlash from encryption and privacy experts didn’t take long. They accused the company of building a surveillance system that could become a spying tool if it fell into the wrong hands (for example, at the behest of undemocratic governments): such a system could be retargeted at virtually anything, including protest photos, cartoons, or images deemed illegal under local regulations.

The company later said it was pausing its rollout plans while it looked for ways to improve the technology and consulted with experts. Development has now been halted entirely, according to a new statement from Craig Federighi, Apple’s senior vice president of software engineering.

Child sexual abuse can be prevented before it happens. We are focusing our energy on this first,
Federighi said.

Instead, the company is expanding its use of end-to-end encryption, applying it to phone backups and adding other new features aimed at protecting the privacy and security of iMessage and of data stored in iCloud.

It is worth noting that Apple is still shipping part of this technology: a feature called Communication Safety in Messages. It is tied to Family Sharing settings and scans incoming and outgoing images in children’s accounts for sexually explicit material. When on-device detection flags such a photo sent to a child, it automatically blurs the image and offers help with blocking the sender. The original plan also called for automatically notifying parents whenever such images were detected, but after the backlash this became an opt-in option that is off by default.
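The flow described above can be pictured as a simple on-device pipeline. The sketch below is purely illustrative and uses hypothetical type and function names (it is not Apple’s actual Communication Safety API): an image arriving in a child’s account is classified locally, blurred if flagged, and parental notification remains an opt-in step.

    // Purely illustrative sketch of the flow described above; all names are
    // hypothetical placeholders, not Apple's real API.

    struct IncomingImage {
        let pixels: [UInt8]     // raw image data
        let senderID: String    // who sent the message
    }

    enum ScanVerdict {
        case safe
        case sensitive          // flagged as sexually explicit by the on-device model
    }

    // Stand-in for the on-device classifier; in the real feature the analysis
    // runs locally and the photo never leaves the device.
    func classify(_ image: IncomingImage) -> ScanVerdict {
        // ... model inference would happen here ...
        return .safe
    }

    // notifyParents is off by default and can only be enabled on demand,
    // matching the opt-in behavior described in the article.
    func handleIncomingImage(_ image: IncomingImage, notifyParents: Bool = false) {
        switch classify(image) {
        case .safe:
            show(image, blurred: false)
        case .sensitive:
            show(image, blurred: true)           // blur the flagged photo
            offerToBlock(sender: image.senderID) // offer help blocking the sender
            if notifyParents {
                sendParentAlert()
            }
        }
    }

    // UI stubs so the sketch stands alone.
    func show(_ image: IncomingImage, blurred: Bool) {}
    func offerToBlock(sender: String) {}
    func sendParentAlert() {}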

Source: 24 TV
