Apple Delays Plan to Scan iPhones for Child Abuse Images Over Criticism

Apple has decided to postpone the launch of a new feature that scans iPhone photos for child sexual abuse images before they are uploaded to iCloud, the company said in a statement.

The company said it will spend the next few months improving the feature, a decision it attributed to criticism from users and advocacy groups.

Apple has not said how long the improvements will take, but the tech giant has made clear it does not intend to abandon the feature altogether.

Supporters say the new system could help law enforcement agencies investigate criminal cases, but critics counter that it could also open the door to broader legal and government demands for user data.

Last month, Apple announced that its upcoming operating system would include a feature for scanning iCloud-bound images for child sexual abuse material (CSAM).

iPhone Will Warn Parents and Kids About Child Abuse Images

The new feature could prove useful to parents and law enforcement, but many data privacy experts have already raised concerns about the announcement, including Matthew D. Green, an associate professor of cryptography at Johns Hopkins University.

According to Apple, the feature runs directly on the user's device. If it finds matches for more than 30 known illegal images, the account is flagged, and human reviewers check the matches to rule out errors before anything is reported to law enforcement.
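For readers curious about the mechanics, the sketch below shows the general shape of such a threshold-plus-review flow. It is a simplified, hypothetical illustration, not Apple's implementation: the PhotoHash type, the loadKnownHashes function, and the direct set comparison are placeholders, whereas Apple's published design relies on NeuralHash together with cryptographic techniques such as private set intersection and threshold secret sharing.

```swift
import Foundation

// Simplified, hypothetical sketch of a threshold-plus-review flow.
// Not Apple's implementation: the hash type, the hash database, and the
// direct set comparison below are placeholders for illustration only.

struct PhotoHash: Hashable {
    let value: String
}

// Placeholder for the on-device database of known CSAM image hashes.
func loadKnownHashes() -> Set<PhotoHash> {
    // In a real system this would be a vetted, encrypted database.
    return []
}

// Threshold of matches Apple said must be exceeded before an account is flagged.
let reviewThreshold = 30

// Count matches against the known-hash set, entirely on the device.
func countMatches(photos: [PhotoHash], known: Set<PhotoHash>) -> Int {
    photos.filter { known.contains($0) }.count
}

// Only accounts that exceed the threshold are surfaced for human review;
// reviewers verify the matches before any report is made.
func shouldEscalateForHumanReview(photos: [PhotoHash]) -> Bool {
    countMatches(photos: photos, known: loadKnownHashes()) > reviewThreshold
}

// Example with placeholder data: the empty database yields no matches.
let examplePhotos = [PhotoHash(value: "abc123"), PhotoHash(value: "def456")]
print(shouldEscalateForHumanReview(photos: examplePhotos))  // prints "false"
```

The point of the threshold in this kind of design is that no single match triggers anything; only an accumulation of matches brings the account to human attention.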

At the time, many iPhone users and a number of advocacy organizations expressed concern that repressive governments could exploit the feature as a tool for expanded censorship, while others warned of damage to the company's reputation as a data protection advocate.

Some Apple employees also raised concerns about the company's plans to scan photos.

The Cupertino company later published an FAQ answering common questions about the iOS 15 update, which will include the new feature.