
Are Apple’s new controversial child protection features actually secure?

Or do they further threaten user privacy?



A couple of weeks ago, Apple announced plans to begin scanning users’ phones in an effort to combat child sexual abuse material (CSAM). While many people have praised the move, some believe the new policies could open up a “backdoor” that threatens user privacy.

Apple has long held its users’ privacy in high regard. Its recent privacy updates have been praised by privacy advocates everywhere (while simultaneously ruffling Mark Zuckerberg’s feathers over at Facebook).

But this new policy has some people questioning whether Apple’s efforts to combat child sexual exploitation are safe. To unpack this conversation, let’s look at exactly what the new policies will do.

Apple’s new child protection policies

The new policies announced earlier this month boil down to three basic categories. The first is relatively straightforward and uncontroversial: Apple will now direct CSAM-related searches to resources for reporting it to law enforcement or getting other kinds of help.

The controversy lies in the next two features. First, Apple will add a parental control to Messages that scans images sent to or from children for sexually explicit content. It can blur and block this content for users under the age of 18, and can even notify the parents of users under the age of 13.
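
To make those age tiers concrete, here’s a minimal sketch of how such a two-tier rule could look in code. The names and structure are invented for illustration only; this is not Apple’s actual API or implementation.

    // Hypothetical sketch of the two-tier Messages rule described above.
    // The enum and function are invented names, not Apple API.
    enum FlaggedImagePolicy {
        case show                      // 18 and over: no intervention
        case warnAndBlur               // under 18: image blurred behind a warning
        case warnBlurAndNotifyParents  // under 13: parents can also be notified
    }

    func policy(forAge age: Int) -> FlaggedImagePolicy {
        switch age {
        case ..<13: return .warnBlurAndNotifyParents
        case ..<18: return .warnAndBlur
        default:    return .show
        }
    }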

The last update involves iCloud Photos. Apple says it will also begin scanning photos that users sync with iCloud for known CSAM. If enough matches are found for a given account, those photos are sent to an Apple moderator, who can then notify the National Center for Missing and Exploited Children (NCMEC).

Apple insists that it kept user privacy a high priority when designing these features, but not everyone agrees. So why are people concerned about them?

Why are some people concerned with Apple’s new features?


The feature that seems to be the most controversial so far is the scanning of users’ iCloud Photos, but we’ll get to that in just a minute. First, let’s look at the new Messages feature. With it, Apple gives parents more control by notifying them if a user under 13 has received (and viewed) a sexually explicit image.

The main concern raised about this feature is that it gives parents more ability to snoop on their kids. Some people believe this could hurt children by outing queer and transgender kids to their nosy parents.

Most of the concern regarding these new policies involves Apple’s new photo scanning feature. For US-based iOS and iPadOS users, Apple will soon start scanning photos whenever they are synced with iCloud. Apple is not the only company that scans user photos for CSAM; many platforms do. The difference is where the scanning takes place.

How does the company’s new CSAM scanning work?

Most platforms scan content after it is uploaded to their servers. In Apple’s case, the scanning will actually happen locally, on your device. I know, it is a bit confusing, so let me clarify. Whenever you sync your photos with iCloud Photos, your device checks them against a database of known CSAM image hashes. Apple moderators are alerted once several matches accumulate for an account, and they have the ability to alert the proper authorities at the NCMEC.
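
To make that flow concrete, here’s a minimal sketch of threshold-based hash matching. The type name, the plain SHA-256 comparison, and the threshold value are stand-ins I’ve invented for illustration; Apple’s actual system uses a perceptual hash (NeuralHash) plus cryptographic techniques so that matches below the threshold aren’t revealed to anyone.

    import Foundation
    import CryptoKit

    // Illustrative sketch only: exact SHA-256 matching stands in for
    // Apple's perceptual NeuralHash and blinded-matching design.
    struct OnDeviceMatcher {
        let knownHashes: Set<Data>  // hashes derived from the NCMEC database
        let reviewThreshold: Int    // matches required before human review
        private var matchCount = 0

        init(knownHashes: Set<Data>, reviewThreshold: Int) {
            self.knownHashes = knownHashes
            self.reviewThreshold = reviewThreshold
        }

        // Called for each photo as it is queued for iCloud Photos upload.
        // Returns true once the account crosses the review threshold.
        mutating func scan(_ photoData: Data) -> Bool {
            let digest = Data(SHA256.hash(data: photoData))
            if knownHashes.contains(digest) {
                matchCount += 1
            }
            return matchCount >= reviewThreshold
        }
    }

    // Hypothetical usage: an empty database and an arbitrary threshold.
    var matcher = OnDeviceMatcher(knownHashes: [], reviewThreshold: 30)
    let needsReview = matcher.scan(Data("example photo bytes".utf8))

One difference worth noting: in Apple’s published design the matching is blinded, so the device can’t tell which photos matched, and Apple learns nothing until the threshold is crossed.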

The concerns with these new policies mainly lie in the fact that Apple will now scan your phone locally, rather than its own cloud servers, to find CSAM. One critic of the new policy, Ben Thompson, wrote this at Stratechery:

“Instead of adding CSAM scanning to iCloud Photos in the cloud that they own and operate, Apple is compromising the phone that you and I own and operate, without any of us having a say in the matter. Yes, you can turn off iCloud Photos to disable Apple’s scanning, but that is a policy decision; the capability to reach into a user’s phone now exists, and there is nothing an iPhone user can do to get rid of it.”

The problem is that Apple has now set a precedent that lets it look at user data on people’s phones instead of on the company’s servers. Even though Apple says it is only scanning photos that users sync with iCloud, the fact remains that the company will soon be scanning users’ devices whenever they use the iCloud service.

Apple is fighting back against these concerns


Despite the growing concerns over its new policies, Apple is sticking to its guns and trying to assure users that their privacy will remain intact after the policies are implemented. The company released a frequently asked questions document addressing the concerns people have over the changes.

In the document, Apple assures users that Messages’ end-to-end encryption will not be affected by these changes, and that it “never gains access to communications as a result of this feature in Messages.” The FAQ also expands on the iCloud Photos scanning, reiterating that, though the scans happen locally, only images being shared with iCloud are scanned. Why this scanning happens on the device rather than after upload remains an open question.

The company also addressed concerns that countries could use these new policies to infringe on Apple users’ rights by digging deeper into device content. Apple confirmed that it would refuse any request of this nature, saying:

“We have faced demands to build and deploy government-mandated changes that degrade the privacy of users before, and have steadfastly refused those demands. We will continue to refuse them in the future. Let us be clear, this technology is limited to detecting CSAM stored in iCloud and we will not accede to any government’s request to expand it.”

Regardless of the concerns raised so far, Apple has not pulled back on these new features. All three child protection updates will arrive later this year in updates to iOS 15, iPadOS 15, watchOS 8, and macOS Monterey.

From what I’ve seen so far, these changes seem pretty good. CSAM is absolutely disgusting, and it’s good to see that Apple is being proactive in trying to eliminate it from its platforms as much as possible, without harming user security. It seems that the steps the company has taken to protect privacy are sufficient, but we will have to wait and see if that remains the case after they are implemented.

Have any thoughts on this? Let us know down below in the comments or carry the discussion over to our Twitter or Facebook.


