Apple has announced a new feature for iOS, macOS, and watchOS that will let parents know when their children send or receive sexually explicit images in the Messages app. The feature is likely to be useful for parents and law enforcement, but many data-privacy experts are already concerned about the announcement.
A related technology that detects CSAM (Child Sexual Abuse Material) uploaded to iCloud Photos will also arrive later this year. According to Apple, both features are designed to protect users' privacy: the company says it never sees or learns about any explicit photos that children might exchange in Messages.
The scans will be performed on the child's device, and notifications will be sent only to the parents' devices. Apple also shared statements from cybersecurity and child-protection experts who praised the company's approach.
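Apple has not published source code, but conceptually the iCloud matching step compares a fingerprint of each photo against an on-device database of fingerprints of known abuse material. The Python sketch below is a deliberately simplified illustration, not Apple's implementation: the database contents, function names, and the use of SHA-256 are stand-ins, whereas Apple's real system uses a perceptual hash called NeuralHash together with cryptographic private set intersection.

```python
import hashlib
from pathlib import Path

# Hypothetical on-device database of known-image fingerprints. In Apple's
# actual system these are perceptual "NeuralHash" values supplied by
# child-safety organizations, not plain cryptographic hashes.
KNOWN_FINGERPRINTS = {
    "3a7bd3e2360a3d29eea436fcfb7e44c735d117c42d1c1835420b6b9942dd4f1b",
}

def fingerprint(image_path: Path) -> str:
    """Return a SHA-256 digest of the image file's raw bytes."""
    return hashlib.sha256(image_path.read_bytes()).hexdigest()

def should_flag(image_path: Path) -> bool:
    """Check a photo against the local database before it is uploaded."""
    return fingerprint(image_path) in KNOWN_FINGERPRINTS
```

One reason the sketch is only a toy: a cryptographic hash like SHA-256 changes completely if an image is resized or recompressed, so Apple relies on a perceptual hash that still matches altered copies, and it reports an account only after a threshold number of matches is reached.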
However, the new feature also has opponents. For instance, Matthew D. Green, a cryptographer and associate professor at Johns Hopkins University, expressed his concerns on Twitter. He argues that the feature sets a dangerous precedent: once the scanning technology exists, law enforcement agencies or governments in other countries could press Apple to use it for content beyond CSAM.
The head of WhatsApp, Will Cathcart, spoke out against Apple's child-protection feature as well. He called Apple's approach troubling, warning that governments with differing views of what is acceptable could require the tech giant to add non-CSAM images to the databases it matches against.
The divided reactions to Apple's new technology show how thin the line is between maintaining public safety and protecting customer privacy.
The head of WhatsApp isn't the only prominent voice to speak out against Apple's new technology. Edward Snowden believes that if a company can scan for child sexual abuse material today, it can scan for anything tomorrow.