Apple gives up scanning iCloud photos for child pornography

This news caused a major scandal a few months ago. Apple had announced its intention to analyze photos via artificial intelligence in order to find child pornography and report the users concerned. This serious invasion of privacy (however well intentioned) created a great controversy; today, Apple says it is abandoning the strategy.

The much-criticized project is canceled

Apple revealed tonight that it has abandoned its CSAM project, which consisted of analyzing photos hosted on iCloud to detect content depicting abuse of children. The Cupertino firm had been applauded by child protection associations, who hoped that Apple, Google or another company would take this kind of initiative. However, organizations advocating for the protection of private life saw it as a serious breach of the personal data of millions of users around the world. What was frightening was the door Apple was about to open: the company would start with this type of content and could then have gone further in analyzing your photos to find other crimes.

With this new system, Apple wanted to rely on an artificial intelligence capable of analyzing thousands of photos per hour; it would have been trained to spot and isolate photos of naked children so that an Apple team could then verify them and alert the authorities.
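For context, Apple's publicly documented CSAM proposal was described as matching on-device perceptual hashes (NeuralHash) of photos against a database of known abuse-image hashes, with a match threshold that had to be reached before any human review, rather than classifying image content directly. The following is a purely illustrative sketch of that kind of hash-matching-plus-threshold flow, not Apple's actual implementation: an ordinary SHA-256 stands in for the perceptual hash, and every type, name and value is hypothetical.

```swift
import Foundation
import CryptoKit

// Illustrative sketch only -- not Apple's implementation.
// A real perceptual hash (like NeuralHash) tolerates small image changes;
// SHA-256 is used here only to show the matching-and-threshold flow.
struct PhotoScanner {
    let knownHashes: Set<String>   // hypothetical database of known-image hashes
    let reviewThreshold: Int       // matches required before flagging for human review

    // Hex-encoded hash of the photo's raw bytes (stand-in for a perceptual hash).
    func hash(of photoData: Data) -> String {
        SHA256.hash(data: photoData)
            .map { String(format: "%02x", $0) }
            .joined()
    }

    // Counts how many photos match the known-hash database and compares
    // the count against the threshold.
    func shouldFlagForReview(photos: [Data]) -> Bool {
        let matches = photos.filter { knownHashes.contains(hash(of: $0)) }.count
        return matches >= reviewThreshold
    }
}

// Hypothetical usage: no photos, so nothing is flagged.
let scanner = PhotoScanner(knownHashes: ["deadbeef"], reviewThreshold: 30)
print(scanner.shouldFlagForReview(photos: []) ? "Flag for human review" : "No action")
```

The threshold in the publicly described design (reportedly on the order of 30 matches) was meant to keep the risk of falsely flagging an account extremely low before any person looked at the material.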


Apple commented to Wired on the abandonment of the project:

After extensive consultations with experts to gather feedback on the child protection initiatives we proposed last year, we are deepening our investment in the Communication Safety feature that we first made available in December 2021.

We have further decided not to move forward with our previously proposed CSAM detection tool for iCloud Photos. Children can be protected without companies combing through personal data, and we will continue to work with governments, child advocates and other companies to help protect young people, preserve their right to privacy, and make the internet a safer place for children and for all of us.

In September 2021, Apple announced that it wanted to postpone the analysis of photos on iCloud; the company wanted to calm things down and prevent customers from leaving for the competition. It took a long time to acknowledge that scanning iCloud photos remains a serious violation of a user's privacy!

For the moment, Apple has not announced new measures for the protection of children, but the company remains convinced that it can act and do better in this area. It cannot be ruled out that Apple will propose other measures in the future that are less intrusive to users' privacy.
