

Your iPhone could soon have a feature that automatically scans your photos for child abuse

Apple has yet to confirm anything regarding this report from a cryptographer at Johns Hopkins.

Image: MagSafe on an iPhone (Daniel Romero / Unsplash)


Your iPhone is reportedly about to get a new tool in the fight against CSAM (Child Sexual Abuse Material): a scanner that looks for known images of child abuse and reports you if it finds a match. It could be a powerful tool to fight abuse, but security experts worry it could easily be co-opted by authoritarian governments.

It would be the first time Apple has introduced a system-wide scanning tool into iOS. It’s not the first time the company has taken a stand against child sexual abuse material, though; individual apps have already been yanked from the App Store over similar concerns.

The tool is meant to be client-side at launch; that is, the detection happens on the iPhone or iPad itself. It’s not a stretch to imagine data flowing off the device to a remote server as the tool expands, which could lead to some thorny privacy implications. The most egregious? The possibility of scanning messages sent through end-to-end encrypted messaging clients like iMessage or Telegram.
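To make “client-side” concrete, here’s a minimal sketch in Python of how on-device matching against a list of known image hashes could work. This is purely illustrative: Apple hasn’t confirmed any implementation details, and every function name and hash value here is hypothetical. The point is simply that the hash list lives on the device and each photo is hashed and compared locally.

```python
# Purely illustrative sketch of client-side hash matching.
# Apple has not confirmed any implementation details; all names and
# hash values here are hypothetical.
import hashlib

# Hashes of known CSAM, shipped to the device (made-up values).
KNOWN_HASHES = {
    "a1b2c3d4e5f60718",
    "0f1e2d3c4b5a6978",
}

def hash_photo(photo_bytes: bytes) -> str:
    """Stand-in for the on-device hash function.

    A real system would use a perceptual hash that survives resizing and
    re-encoding, not a cryptographic hash like SHA-256; this is only a
    placeholder to show the matching flow.
    """
    return hashlib.sha256(photo_bytes).hexdigest()[:16]

def scan_library(photos: list[bytes]) -> list[int]:
    """Hash every photo locally and return the indices of any matches."""
    matches = []
    for index, photo in enumerate(photos):
        if hash_photo(photo) in KNOWN_HASHES:
            matches.append(index)  # only these would ever be flagged
    return matches
```

Because the hashing and the comparison both happen on the device, nothing needs to leave the phone unless a photo matches; the privacy worry is what happens once that boundary starts moving server-side.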

“This sort of tool can be a boon for finding child pornography in people’s phones,” says Matthew Green, associate professor at the Johns Hopkins Information Security Institute. “But imagine what it could do in the hands of an authoritarian government?” Green linked to a New York Times story about how operating in China has forced Apple to compromise its own security protocols.


The other problem is that the image hashes used by the tool aren’t completely unique. To catch copies that have been resized or re-encoded, hashes like these are designed to match similar-looking images rather than exact files. Using AI, it’s possible to generate other images that will trigger the tool, so benign imagery could be mistaken for CSAM.
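Here’s a toy example (again in Python, and again not Apple’s actual algorithm) of a simple perceptual “average hash”: each pixel contributes one bit depending on whether it’s brighter than the image’s mean. Because the hash only captures coarse structure, two visually different images can land on the same value, which is exactly the property an adversarially crafted false positive would exploit.

```python
# Toy "average hash" to show why such hashes aren't unique.
# Not Apple's algorithm; a minimal illustration of the general idea.

def average_hash(pixels: list[list[int]]) -> int:
    """Hash a small grayscale image: one bit per pixel, set if above the mean."""
    flat = [p for row in pixels for p in row]
    mean = sum(flat) / len(flat)
    bits = 0
    for p in flat:
        bits = (bits << 1) | (1 if p > mean else 0)
    return bits

def hamming_distance(a: int, b: int) -> int:
    """Number of differing bits; a small distance means 'looks similar'."""
    return bin(a ^ b).count("1")

# A 4x4 "image" with bright columns on the left, dark on the right...
image_a = [[200, 210, 30, 20],
           [190, 220, 25, 15],
           [180, 205, 35, 10],
           [195, 215, 40, 18]]

# ...and a different image crafted to share the same coarse structure.
image_b = [[150, 160, 90, 80],
           [155, 170, 85, 70],
           [145, 165, 95, 75],
           [158, 172, 92, 78]]

h_a, h_b = average_hash(image_a), average_hash(image_b)
print(hex(h_a), hex(h_b), "distance:", hamming_distance(h_a, h_b))
# Prints identical hashes (distance 0) even though the pixels differ:
# the matcher can't tell these apart, so benign content can trigger a hit.
```

Real perceptual hashes are far more sophisticated, but the trade-off is the same: the more robust a hash is to small changes in an image, the easier it becomes to manufacture a collision.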

AppleInsider notes that Apple has not yet confirmed the new tool; the only confirmation so far comes from Matthew Green’s sources.


