There is a saying in modern privacy circles that "Privacy is about Consent." In other words, the single biggest factor in whether your privacy has been violated is whether you consented to share the information. For instance, say Alice tells Bob a secret: if Bob then tells that secret to someone else, he violates Alice's privacy, unless he asked Alice for permission first. If you think about it, you can come up with many examples where the same action, leading to the same result, takes on a completely different tone depending on whether or not the actor got consent.
We have a major privacy problem in society today, largely because tech companies collect customer information and share it with others without getting real consent from their customers. Real consent means customers understand all of the ways their information will be used and shared, and all the implications of that sharing, now and in the future. Instead, customers get a lengthy, click-through privacy policy that no one is really expected to read or understand. Even someone who does read and understand that agreement still isn't told all of the implications of sharing their location and contact list with a messaging app, or of using voice commands on their phone.
Big Tech has been funded, over the past two decades, by exploiting a huge influx of young adults who connected to the Internet and shared their data without restriction. It's a generalization to say that young adults make decisions based on short-term needs without considering the long-term impacts, but there's some truth to it, whether we are talking about getting a tattoo that seemed like a good idea at the time, posting pictures or statements on social media that come back to bite you, or giving an app full access to your phone. Individuals didn't understand the value of this data or the risks in sharing it, but tech companies knew all along and were more than happy to collect, store, share, and profit from it. Big Tech is now a multi-billion-dollar industry as a result.
Tech companies (and, until very recently, much of society) have dismissed privacy concerns by concluding that "people don't care about privacy," when the truth is that most people were simply unaware of the data they were sharing, the implications of sharing it, and the risks that came with it. Any consent they gave, therefore, wasn't informed consent, and companies had no motivation to educate customers about those risks, because doing so might mean losing that consent.
The main reason everyone is starting to talk about privacy now is that long-term effects take time to be felt. As those young adults entered the workforce, their youthful indiscretions began to hurt their job prospects. Then, with controversies like the Cambridge Analytica scandal, everyone got a clear-cut example of how the data that ad tech collects could be used against them, to do far more than show them ads. Privacy has become the tattoo removal of the information age: everyone is looking for a way to cover up mistakes from the past. And now that "privacy" has become marketing gold, these same companies have rallied around redefining the word so they can apply it to their products without actually protecting their customers.
The reality is that people do care about privacy; they just don't feel empowered to do anything about it. Between Big Tech, advertisers, and governments all wanting to collect and analyze your data, what are you to do? The solution is simple: consent. Society is teaching college students that affirmative consent matters in sexual encounters and that the default position is no consent. It's not enough that a person didn't say 'no' (opt-out) to escalating sexual contact; they need to say 'yes' (opt-in). Affirmative consent grants each individual power over their own body in a way that opting out doesn't. If large tech organizations, which began by collecting and sharing data without ever having consent, were now required to get explicit, informed consent (opt-in) from customers before capturing and sharing their data, the people using their products would finally be in control.
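To make the opt-in versus opt-out distinction concrete in software terms, here is a minimal sketch in Python. The names (ConsentLedger, "share_location", and so on) are hypothetical and don't describe any real product; the point is only how the default changes when silence means 'no' instead of 'yes':

```python
# Minimal sketch of opt-in (affirmative consent) vs. opt-out data sharing.
# All names here (ConsentLedger, "share_location", etc.) are hypothetical.

class ConsentLedger:
    """Records only the explicit answers a user has actually given."""

    def __init__(self):
        self._answers = {}  # (user_id, purpose) -> True or False

    def record(self, user_id, purpose, allowed):
        self._answers[(user_id, purpose)] = allowed

    def opt_in_allows(self, user_id, purpose):
        # Opt-in: silence means "no". Share only on a recorded "yes".
        return self._answers.get((user_id, purpose), False)

    def opt_out_allows(self, user_id, purpose):
        # Opt-out: silence means "yes". Share unless the user said "no".
        return self._answers.get((user_id, purpose), True)


ledger = ConsentLedger()

# A user who has never answered either way:
print(ledger.opt_in_allows("alice", "share_location"))   # False: data stays put
print(ledger.opt_out_allows("alice", "share_location"))  # True: data gets shared
```

The only difference between the two checks is the default, and that default is the whole argument: under opt-in, doing nothing protects you; under opt-out, doing nothing is treated as permission.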
But that is unfortunately not what's happening. Instead, each time a privacy proposal comes before the government, the same companies that tout privacy in their marketing campaigns fight to remove any requirement that they get your consent before collecting and sharing your data. They realize that most people wouldn't consent if asked, so they'd rather you be the one to ask them to stop (opt-out), and hope most people won't bother, or won't even understand that they can. When you later discover how they've used and abused your data, they can claim you never opted out. They'd much rather ask for forgiveness than for permission.
This privacy problem is why Purism was founded. Solving it is cemented into our corporate charter, it defines how we build all of our products, it is why we created Librem One services, and it is why we are asking the California legislature to require tech companies to get consent before using your data. You should be the one in control of your technology and your data, and the key to that control is consent.