Is Apple’s controversial CSAM photo detection feature dead in the water?
Many users and advocacy groups feel the feature could present major privacy issues.
A few months ago, Apple announced plans to add a new feature to iPhones that would scan users' devices and iCloud Photo libraries for potential Child Sexual Abuse Material (CSAM). Now, the company's website is devoid of any information about the feature, raising the question of whether the idea has been scrapped altogether.
The company had previously delayed the feature after privacy groups spoke out against the potential risk. Advocacy groups and users alike argued that such a feature could open a "backdoor" for bad actors to exploit, putting Apple's customers at risk.
Despite this widespread pushback, it appears Apple hasn't given up on the feature. A spokesperson told The Verge about the company's plans after news broke that the CSAM information had been removed from its website.
Apple spokesperson Shane Bauer said the company’s stance on the feature has not changed since its update in September:
“Based on feedback from customers, advocacy groups, researchers, and others, we have decided to take additional time over the coming months to collect input and make improvements before releasing these critically important child safety features.” – Apple’s September statement on its new CSAM feature
So, it looks like Apple is still moving forward with this feature. Despite the setbacks to this particular child safety initiative, the company has added a couple of other features aimed at protecting children on Apple devices.
Will the upcoming CSAM detection feature be a major privacy risk? Let us know what you think.
- iOS 15.2 blurs nude images for kids using the Messages app
- Your WhatsApp messages aren’t actually private – moderators can read them
- Apple will let you pick who gets your iCloud data if (when) you die. Here’s how to set it up
- Apple’s new Android app will sniff out hidden AirTags