Apple’s Decision to Scan Your Photo Library for Exploitative Content Isn’t a Privacy Problem. It’s Much Worse


On Thursday, Apple announced a series of changes that it says are designed to better protect children. In a sense, the changes represent a noble effort on Apple's part to address a very real problem: the sexual exploitation of minors. I think that's a fight we can all get behind. 

At the same time, the changes represent the most significant shift yet in the promise Apple makes to its users about how it treats their data. The biggest change is that when you upload images to iCloud Photos, Apple will now analyze them to determine whether they match known child sexual abuse material (CSAM). 

According to Apple’s documentation, “Before an image is stored in iCloud Photos, an on-device matching process is performed for that image against the database of known CSAM hashes.” Basically, when you upload images to iCloud, your iPhone converts each one into a mathematical hash and compares it against a database of hashes of known exploitative content. No one can actually see the image, and its content remains private; only the hashes are compared.
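
To make that flow concrete, here's a minimal sketch of a hash-based lookup of that kind. It's an illustration, not Apple's implementation: Apple's system uses a perceptual hash it calls NeuralHash, and the comparison happens through a cryptographic matching protocol rather than a plain set lookup. The knownHashes set and matchesKnownContent function are hypothetical names, and SHA-256 simply stands in for the hashing step.

```swift
import Foundation
import CryptoKit

// Hypothetical stand-in for the database of known CSAM hashes.
// In Apple's system these are perceptual (NeuralHash) values supplied by
// child-safety organizations, not SHA-256 digests.
let knownHashes: Set<String> = [
    "0000000000000000000000000000000000000000000000000000000000000000"
]

// Convert the image bytes to a hash and check it against the database.
// SHA-256 is used here only to illustrate the "image -> hash -> compare"
// flow the documentation describes.
func matchesKnownContent(_ imageData: Data) -> Bool {
    let digest = SHA256.hash(data: imageData)
    let hex = digest.map { String(format: "%02x", $0) }.joined()
    return knownHashes.contains(hex)
}
```

The important property is that the check operates on hashes rather than on viewable images, which is why Apple can say no one sees the photos themselves.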

If there’s a match, the image is flagged and reviewed manually. If the number of flagged images in an iCloud account crosses a certain threshold, the account is reported to the National Center for Missing and Exploited Children. For security reasons, Apple doesn’t say what that threshold is. 
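
A plain counter makes the threshold idea easier to see. Again, this is a simplification: Apple's design uses cryptographic threshold techniques so that nothing about individual matches is revealed until the threshold is crossed, and the threshold value itself isn't public. The AccountScanState type below is invented purely for illustration.

```swift
// Simplified illustration of the reporting threshold.
struct AccountScanState {
    var matchCount = 0
    let reportingThreshold: Int  // the real value is undisclosed

    // Record the result of one image check; return true once the account
    // has accumulated enough matches to be escalated for manual review.
    mutating func record(match: Bool) -> Bool {
        if match { matchCount += 1 }
        return matchCount >= reportingThreshold
    }
}

// Usage (the threshold here is an arbitrary placeholder, not Apple's number):
var state = AccountScanState(reportingThreshold: 10)
let shouldEscalate = state.record(match: true)
```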

Apple says its technology is accurate enough that there is less than a one-in-a-trillion chance per year of incorrectly flagging a given account, which is impressive from a technical standpoint. That means there is almost no chance images will be flagged unless they are actually known CSAM. Philosophically, however, that’s irrelevant.

Apple has apparently been working on this technology for a while. Jane Horvath, who heads up Apple’s privacy efforts, said on a panel at CES in 2020 that the company was working on the ability to scan iCloud Photos libraries for this type of material. It’s notable that the executive in charge of privacy was describing an effort that arguably infringes on it.

There’s also a certain irony in the fact that, when she spoke publicly on the topic, Horvath made clear that Apple believes “building backdoors into encryption is not the way we are going to solve those issues.”

Except, that’s sort of what happened. Or, more accurately, that’s what it looks like. To be fair, there’s quite a difference between the two, but when it comes to earning the trust of your users, there’s little distinction.

It’s no surprise, then, that people started to complain that Apple was violating user privacy by scanning their photo libraries and looking at images to make sure none of them are illegal. Of course, that’s not what’s happening at all. In fact, the noise serves mostly to distract from what I think is actually a much larger problem. 

No one at Apple is going to be able to view photos of your cat, or even of your children. Your iPhone isn’t going to suddenly gather information about all of your photos and report back to the mother ship. 

That said, it’s worth mentioning that if you use iCloud Photos, your photos are encrypted, but Apple holds a key, meaning it can decrypt them and turn them over if subpoenaed by law enforcement. 

The thing is, privacy isn’t the problem. At least, not in terms of Apple looking at your photo library. The problem is that Apple, more than any other company, has made a promise about protecting user data. Tim Cook, the company’s CEO, regularly reminds us that Apple believes “privacy is a fundamental human right.”

For example, the data on your iPhone is encrypted, and not even Apple can access it without your passcode. The messages you send via iMessage are end-to-end encrypted, meaning that only the person you send them to can view them.

Apple has even famously refused to cooperate with federal law enforcement on several occasions when the FBI sought its assistance to unlock devices associated with known criminals and terrorists. Its reasoning is that it is technically impossible for it to unlock an iPhone, and it has been a fierce opponent of any effort to force it to create that capability. 

Sure, we can all agree that CSAM is repulsive and should be erased from the internet. I haven’t heard anyone argue differently.  

But, once you make an exception, it’s hard to justify not making another, and another, and another. If your argument is “we don’t have the technology to do that,” it’s pretty easy to resist the pressure to hand over user data. It’s a lot harder to make the argument “well, we could do that, but we don’t think it rises to our standard.” The problem is, at some point, someone will come along and force a change to that standard with a law or a court order. 

Maybe there are technical reasons that won’t ever happen. Maybe Apple has enough force of will that it really is only ever going to make an exception for CSAM. Still, a backdoor is a backdoor. And you can’t have real privacy-protective encryption as long as a backdoor–any backdoor–exists.

On the other hand, maybe there should be a backdoor, and I think Apple would argue that this isn’t actually one. Still, whether it technically is a backdoor matters little if that’s what it looks like to the people you promised you’d keep their data private. Once you break that trust, you’ve lost your most valuable asset.  

I reached out to Apple but did not immediately receive a response to my questions. 

The opinions expressed here by Inc.com columnists are their own, not those of Inc.com.
