Is Apple’s controversial CSAM photo detection feature dead in the water?
Many users and advocacy groups worry the feature could pose major privacy risks.
A few months ago, Apple announced plans for a new feature that would scan users’ iPhones and iCloud Photo libraries for known Child Sexual Abuse Material (CSAM). Now, the company’s website is devoid of any information on the feature, which makes us wonder if the idea has been scrapped altogether.
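For context on what “scanning” meant here: Apple’s technical summary described matching image hashes on-device against a database of known CSAM hashes, with an account flagged for human review only after a match threshold was crossed. The toy Swift sketch below illustrates that match-and-threshold idea only; the real design used a perceptual hash (“NeuralHash”) and encrypted safety vouchers, not a plain SHA-256 lookup, and every name and value here is hypothetical.

```swift
// Toy illustration of hash matching with a reporting threshold.
// NOT Apple's implementation: the real design used a perceptual
// "NeuralHash" and encrypted safety vouchers; SHA-256 is a stand-in.
import CryptoKit
import Foundation

// Hypothetical database of known-image digests (a stand-in for the
// hash list Apple said would come from child safety organizations).
let knownHashes: Set<String> = [
    // SHA-256 of the string "test", used as a fake "known image" below
    "9f86d081884c7d659a2feaa0c55ad015a3bf4f1b2b0b822cd15d6c15b0f00a08"
]

// Hypothetical threshold: Apple said nothing would be surfaced for
// human review until an account crossed a minimum number of matches.
let reportThreshold = 30

// Hex-encode the SHA-256 digest of an image's raw bytes.
func digest(of imageData: Data) -> String {
    SHA256.hash(data: imageData)
        .map { String(format: "%02x", $0) }
        .joined()
}

// Count how many items in a photo library match the known-hash list.
func matchCount(in library: [Data]) -> Int {
    library.filter { knownHashes.contains(digest(of: $0)) }.count
}

// Fake "photo library" with one matching item.
let library: [Data] = [Data("test".utf8), Data("vacation".utf8)]
let matches = matchCount(in: library)
print(matches >= reportThreshold
    ? "Threshold crossed: account would be flagged for review"
    : "Below threshold: \(matches) match(es), nothing reported")
```

The threshold is what Apple pointed to as a safeguard: a single stray match does nothing, and only repeated hits against the known-hash list would trigger review.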
As first spotted by MacRumors, Apple has removed all material about the planned CSAM scanning feature from the Child Safety page on its website.
The company had previously delayed the feature after privacy groups spoke out against the potential risks. Advocacy groups and users alike argued that a feature like this could open a “backdoor” for bad actors to exploit, putting Apple’s customers at risk.
Despite the near-universal pushback, it looks like Apple hasn’t quite given up on the feature. After news broke that the CSAM information had been removed from the website, Apple spokesperson Shane Bauer told The Verge that the company’s stance has not changed since its September update:
“Based on feedback from customers, advocacy groups, researchers, and others, we have decided to take additional time over the coming months to collect input and make improvements before releasing these critically important child safety features.” – Apple’s September statement on its new CSAM feature
So, it looks like Apple is still moving forward with this feature. And despite the setbacks here, the company has already shipped a couple of other features aimed at protecting children on Apple devices.
Do you think the upcoming CSAM detection feature poses a major privacy risk? Let us know down below in the comments, or carry the discussion over to our Twitter or Facebook.