
Kyle Rankin


Imagine an Internet of Snitches, each scanning whatever data they have access to for evidence of crime. Beyond the OS itself, individual phone apps could start looking for contraband. Personal computers would follow their lead. Home network file servers could pore through photos, videos and file backups for CSAM and maybe even evidence of copyright infringement. Home routers could scan any unencrypted network traffic. Your voice assistant could use machine learning to decide when yelling in a household crosses the line into abuse. Your printer could analyze the documents and photos you send it.

It’s not much of a surprise to most people that their devices, especially their phones, are snitching on them to the hardware vendor (or app developer). Some people are surprised to discover just how much. I already wrote a post, Snitching on Phones That Snitch on You, that focused on how much data idle Android and iOS devices send to Google and Apple respectively, described how we avoid those problems on the Librem 5, and even explained how to use OpenSnitch to track any attempts by a malicious app to snitch on you.

So we know most devices and proprietary apps track people to some degree (even for paying customers), and that the problem has extended to cars. While many people don’t like the idea of this, they also shrug it off, not just because they don’t feel empowered to do much about it, but also because their data is “only” being used for marketing purposes. Someone is profiting off of the data, sure, but their data isn’t being used against them.

Yet we are starting to see how your data can be used against you. Police routinely get location data from data brokers to track suspects without having to get a warrant. Even private groups have paid data brokers to dig up dirt on people, leading to a Catholic priest’s resignation after location data revealed he used the Grindr app and frequented gay bars.

Crossing the Rubicon

So companies capture and sell our data, and the police and private groups sometimes buy that data to look for crimes. But up to this point, the “snitching” your devices did was indirect: they sent data to vendors or app developers, who in turn sold it to brokers. The only time a vendor might search your data and alert the authorities was when scanning files you had shared on the vendor’s own servers. Actually scanning for potential contraband on a person’s own device was a line companies wouldn’t cross.

This past week, however, Apple crossed that line. Apple announced in their new child safety initiative that they will scan all customers’ iPhone photos for CSAM (Child Sexual Abuse Material) before they are backed up to iCloud. Plenty of other groups have already weighed in on the risks and privacy implications of this particular move for iPhones and the EFF in particular has explained the issues well, so I won’t cover any of that here. What I will discuss, instead, are the broader implications of crossing the Rubicon into client-side scanning of devices for potential evidence of crimes.

Apple has focused a lot of its marketing lately around privacy because it’s an area of weakness for its major competitors like Google and Facebook. As part of the child safety announcement, Apple described complicated cryptographic measures they will put in place for this client-side scanner so that human beings at Apple will only look at your photos if they have a strong belief they are CSAM. In response to some of the backlash, and concerns over this technology expanding under government pressure, they have also promised that they would not add extra hashes to their CSAM database even if the government demanded it.

While there are reasons to be skeptical of Apple’s claims, for the purposes of this article let’s assume that Apple’s client-side scanner does only what it says it does, how it says it does it, and that it will never expand into other areas. Even if this is the case, Apple’s move now legitimizes client-side scanning. After all, if Apple, a company that markets itself as caring about privacy, can do it, then it must be OK from a privacy perspective for other companies to follow suit with their own client-side scanners.

The Internet of Snitches

We are surrounded by computers. Today most people have devices in their homes, cars, and pockets full of sensors that can track their location, listen to their voice, store their personal files, and have an always-on connection to the Internet either via WiFi or cellular networks. This “Internet of Things” has promised people convenience, yet we know that convenience has already come at the cost of personal privacy.

Up to this point, the Internet of Things has “only” captured our data and shipped it off to the vendor for profit. It’s likely that many hardware and software vendors will follow Apple’s lead and implement client-side scanning of their own. As with Apple, it will probably start with scanning for evidence of child sexual abuse, but it might expand into terrorism, sedition, or similar categories of crime. Most vendors also have less incentive than Apple to protect their customers’ privacy, and often have a perverse incentive to violate it.

As the scope of scan-worthy crimes expands, so do the capabilities of the Internet of Snitches. Streaming media services would be motivated to search for DMCA violations on your devices. Your car could report when you parked illegally or disobeyed the speed limit (there are already proposals to mandate that new cars scan you for impairment or for children left in the car). Your smart irrigation system could detect whenever you watered your lawn in violation of water conservation orders. Your home security system or smart doorbell could expand the data sharing it already does with law enforcement.

Some people are OK with Apple’s child safety scanning, and some are willing to give up most of their freedoms if it might stop a criminal. We already see that privacy trade-off taking place with Ring doorbells in many neighborhoods. Some people will believe the assurances they get that these technologies are accurate, well designed, and only trigger on real evidence of horrendous crime.

Now imagine an Internet of Snitches doing it badly. It’s easy if you try. Some vendors already do a bad job of keeping customer data private and will continue that track record, so you can expect public leaks of databases that have flagged suspected criminals. Other vendors will write bad machine learning algorithms (or biased ones) leading to false positives and ruined lives.

Still other vendors won’t be able to resist government demands to expand the scope of their scans, especially if they are threatened with no longer doing business in that country. The hard part is getting customers to accept client-side scanning in the first place. With that out of the way and the technology in place, expanding what it scans for bit by bit will be much easier.

Take Back Control

Some propose government regulation to limit the ability of companies to do client-side scanning, and others signed a petition to ask Apple to stop. The fundamental problem, though, is that people no longer have control (and arguably no longer have true ownership) over devices they buy. Vendors can implement whatever policies they want, even client-side scanning against a customer’s will, and the customer can’t stop it.

You should have control over your own computers. Your phone should be your castle. True control means controlling your hardware and software. It means picking hardware that doesn’t depend on absolute trust in a vendor for its security, but gives you control over your own security so you don’t have to ask the vendor’s permission to use the computer how you wish. It means using a free operating system that lets you install whatever software you want and remove any software you don’t. Finally, it means running free software that you or anyone in the community can modify (or change back) if a developer ever makes it work against your interests.

Computers will only become more important and integral in our society, and collectively we get to choose the rules that govern them. It’s not too late to reject technology that’s not on your side. Invest in technology that gives you back the control and ownership you should have always had.
