There are few debates more complex, and few positions more irreconcilable, than the confrontation between security and privacy, and it’s something Apple experienced firsthand after announcing NeuralHash in August of last year. The response was so forceful that less than a month later the company was forced to postpone the project, saying it would hold meetings with experts from all the fields involved in search of an effective solution, one that would ensure the required balance between security (in this case, that of minors) and privacy.
In case you don’t remember, or didn’t read about it at the time, NeuralHash was a proposal to detect CSAM (child sexual abuse material, that is, pedophile content) by checking the images users upload from their devices to iCloud. And how was it going to do that? Its name already points us in the right direction: by generating a hash for each file and then cross-referencing it with a database of hashes of images already identified as CSAM. In addition, as Apple explained, the system would be able to tolerate certain changes in the images (and, of course, in their corresponding hashes), so that a small modification would not allow those images to go unnoticed.
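Apple never published NeuralHash’s code, and its real design reportedly combined a neural-network-based perceptual hash with on-device cryptographic matching, so what follows is only a rough illustration of the general idea described above: derive a compact hash from an image and compare it, with some tolerance, against a database of known hashes. It is a minimal sketch using a classic “average hash”, assuming Pillow is installed; the file name, threshold and empty database are placeholders, not anything Apple ever shipped.

```python
# Illustrative sketch only: a simple "average hash" (aHash), NOT Apple's
# NeuralHash. It shows the general idea of hashing an image and matching it
# against a database of known hashes, tolerating small edits via Hamming distance.
from PIL import Image  # pip install Pillow


def average_hash(path: str, size: int = 8) -> int:
    """Shrink the image to a size x size grayscale thumbnail and build a
    bitmask: 1 where a pixel is brighter than the average, 0 otherwise."""
    img = Image.open(path).convert("L").resize((size, size))
    pixels = list(img.getdata())
    avg = sum(pixels) / len(pixels)
    bits = 0
    for p in pixels:
        bits = (bits << 1) | (1 if p > avg else 0)
    return bits


def hamming(a: int, b: int) -> int:
    """Count the bits in which two hashes differ."""
    return bin(a ^ b).count("1")


def matches_database(path: str, known_hashes: set, threshold: int = 5) -> bool:
    """Flag an image whose hash is 'close enough' to any hash in the database."""
    h = average_hash(path)
    return any(hamming(h, k) <= threshold for k in known_hashes)


if __name__ == "__main__":
    import os

    known_hashes = set()   # placeholder: a real system would load known CSAM hashes
    sample = "photo.jpg"   # hypothetical file name
    if os.path.exists(sample):
        print(matches_database(sample, known_hashes))
```

The Hamming-distance threshold is what lets a check like this survive small edits such as re-compression or resizing, which is the kind of robustness Apple was alluding to when it said minor modifications would not let an image slip through.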
In the event of a positive identification, the system would automatically block the iCloud account in question and would also inform the relevant authorities so that they could take appropriate action. And for the unlikely case of false positives, which Apple downplayed, the company claimed that users would have a way to report the error, which would include a manual review to confirm whether or not it was in fact a false positive.

The backlash was, as we told you back then, more than vehement, and for Apple, which has for many years emphasized the protection of its users’ privacy and security, it was a huge setback: despite the announced plans to rework NeuralHash, the project actually ended up at the bottom of a drawer. A drawer that, we can assume, is very, very deep.
We hadn’t heard anything about it since then, and now, as we can read in Wired, Apple has finally decided to throw in the towel and cancel NeuralHash. Here’s what the company says about it:
“Following extensive consultation with experts to gather feedback on the child protection initiatives we proposed last year, we are deepening our investment in the Communication Safety feature, which we first made available in December 2021,” the company told WIRED in a statement. “We have also decided not to move forward with our previously proposed child sexual abuse detection tool for iCloud Photos. Children can be protected without companies combing through personal data, and we will continue to work with governments, child advocates and other companies to help protect young people, preserve their right to privacy and make the internet a safer place for children and for all of us.”
Apple will therefore focus its efforts in the fight against CSAM on the tools it provides to parents and guardians to adequately monitor the content their children receive and use, features that began to be deployed at the end of last year and will continue to evolve.
However, it remains to be seen how this fits in with the plans of the European Union, which in the middle of this year began raising the possibility of setting up NeuralHash-like systems, in cooperation with technology companies, to deal with CSAM on the web.