Apple has announced a new feature for the Messages app on iOS, macOS, and watchOS that will let parents know when their children send or receive sexually explicit images or texts. The feature is likely to be useful for parents and law enforcement, but many data privacy experts are already concerned about the announcement.

The technology that will detect CSAM (Child Sexual Abuse Material) uploaded to iCloud will become available later this year. According to Apple, the feature is designed to protect users' privacy: the company says it never sees or learns about any explicit photos that children might exchange in Messages.
The scans are performed on the child's device, and notifications are sent only to the parents' devices. Apple also shared statements from cybersecurity and child protection experts who praised the company's approach.

However, the new feature also has opponents. For instance, Matthew D. Green, an associate professor of cryptography at Johns Hopkins University, expressed his concerns on Twitter. He says the feature sets a dangerous precedent, as law enforcement agencies or governments around the world could press Apple to use the same technology for other purposes.
The ability to add scanning systems like this to E2E messaging systems has been a major “ask” by law enforcement the world over. Here’s an open letter signed by former AG William Barr and other western governments. https://t.co/mKdAlaDSts
— Matthew Green (@matthew_d_green) August 5, 2021
The head of WhatsApp, Will Cathcart, spoke out against Apple's child protection feature, too. He calls Apple's approach troubling, saying it would allow governments with varying notions of what is acceptable to require the tech giant to add non-CSAM images to the databases it compares against.
We've worked hard to ban and report people who traffic in it based on appropriate measures, like making it easy for people to report when it's shared. We reported more than 400,000 cases to NCMEC last year from @WhatsApp, all without breaking encryption. https://t.co/2KrtIvD2yn
— Will Cathcart (@wcathcart) August 6, 2021
The differing opinions on Apple's new technology show how thin the line is between maintaining public safety and protecting customer privacy.
The head of WhatsApp isn't the only one who has spoken out against Apple's new technology. Edward Snowden believes that if a company can scan for child sexual abuse material today, it can scan for anything tomorrow.