
Is Apple’s controversial CSAM photo detection feature dead in the water?

Many users and advocacy groups feel the feature could present major privacy issues.

[Image: an Apple Store selling iPhones in New York — Unsplash]


A few months ago, Apple announced plans to add a new feature to iPhones that would scan users’ devices and iCloud Photo libraries for potential Child Sexual Abuse Material (CSAM). Now, the company’s website is devoid of any information on the feature, which makes us wonder whether the idea has been scrapped altogether.

As MacRumors first spotted, Apple has completely removed any material about its planned CSAM scanning from the Child Protection page on its website.

The company had previously delayed the feature after privacy groups spoke out against it. Advocacy groups and users alike argued that a feature like this could open a “backdoor” for bad actors to exploit, putting Apple’s customers at risk.

Despite this pushback from just about everyone, it looks like Apple hasn’t quite given up on the feature. A company spokesperson talked with The Verge about it after news broke that the CSAM information had been removed from the website.

Apple spokesperson Shane Bauer said the company’s stance on the feature has not changed since its update in September:

“Based on feedback from customers, advocacy groups, researchers, and others, we have decided to take additional time over the coming months to collect input and make improvements before releasing these critically important child safety features.” – Apple’s September statement on its new CSAM feature

So, it looks like Apple is still moving forward with this new feature. And despite the setbacks to this particular child safety effort, the company has already added a couple of other features aimed at protecting children on Apple devices.

Will the upcoming CSAM detection feature be a major privacy risk? Let us know what you think.



Staff writer at KnowTechie. Alex has two years of experience covering all things technology, from video games to electric cars. He's a gamer at heart, with a passion for first-person shooters and expansive RPGs. Shoot him an email at alex@knowtechie.com
