
You’ll often hear hackers say that they “owned” (or sometimes “pwned”) a computer. They don’t mean that they have the computer in their physical possession; they mean that they have compromised it and have such deep remote control that they can do whatever they want with it. When hackers own a computer, they can prevent software from running, install whatever software they choose, and remotely control the hardware–even against the actual owner’s wishes and usually without their knowledge.

Hackers intuitively understand something many computer users don’t: ownership is not about possession, it’s about control. If your company gives you a computer, or even if you bring your own, and the company remotely controls how you use it and can override your wishes, it’s their computer, not yours. By this definition, most phones today are owned by the vendor, not the user, and as I said in The General Purpose Computer in Your Pocket:

One of the neatest tricks Big Tech ever pulled was convincing people that phones weren’t general-purpose computers and should have different rules than laptops or desktops. These rules conveniently give the vendor more control so that you don’t own a smartphone so much as you rent it. Now that the public has accepted these new rules for phones, vendors are starting to apply the same rules to laptops and desktops.

The Illusion of Control

The illusion that Apple users have control over their computers was briefly disturbed this week when Apple released its new macOS version, “Big Sur,” to the world. Users noticed that around the time the update was released, they had problems launching local applications, applications stuttered, and macOS itself was unresponsive at times–even if they hadn’t updated to Big Sur. It seemed like a pretty odd coincidence that a new OS release would somehow cause local applications–even non-Apple applications–to stall.

As this Ars Technica article describes, users were able to troubleshoot the issue pretty quickly:

It didn’t take long for some Mac users to note that trustd–a macOS process responsible for checking with Apple’s servers to confirm that an app is notarized–was attempting to contact a host named ocsp.apple.com but failing repeatedly. This resulted in systemwide slowdowns as apps attempted to launch, among other things.

To summarize the issue: every time you launch a signed application on macOS, a notary service sends information about the application to Apple’s servers to make sure the signatures match. If they match, the OS allows the application to launch. When the computer is offline, the check fails open and the app is still allowed to run. When the computer is online, however, the check is enforced, and because Apple’s service was up but responding slowly, applications stalled while the OS waited for a reply.
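
To make that failure mode concrete, here is a rough sketch of the logic in Python. This is purely illustrative (the function name, timeout, and use of a plain HTTP fetch are my assumptions, not Apple’s actual trustd implementation), but it models the behavior users saw: offline machines fail open, while online machines wait on a reachable-but-slow server.

    import socket
    import urllib.request
    import urllib.error

    OCSP_HOST = "ocsp.apple.com"  # the host trustd contacts for the check

    def notary_check_allows_launch(request_url, timeout=5.0):
        """Model the soft-fail behavior: allow the launch when offline, but
        when online wait for the server's answer (where the stalls came from)."""
        try:
            # Can we even resolve the host? If not, treat the machine as
            # offline: the check fails soft and the app launches anyway.
            socket.getaddrinfo(OCSP_HOST, 443)
        except socket.gaierror:
            return True  # offline: fail open
        try:
            # Online: block until the server replies or the timeout expires.
            # A reachable-but-slow service makes every app launch wait here.
            with urllib.request.urlopen(request_url, timeout=timeout) as resp:
                return resp.status == 200  # simplified: any OK reply approves
        except urllib.error.URLError:
            return True  # network error mid-check: again fail soft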

Remote Control Through Code Signing

Applications often use code signing as a way for the user to detect tampering. The developer signs the software with their private key, and the user can verify that signature against the developer’s public key; only software that hasn’t been modified will match. In the free software world, distributions such as PureOS ship public keys on the local computer, and software updates are automatically checked against those signatures before they are applied. Used this way, code signing lets you test an application for tampering before you install it, and you retain full control over the process.
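
As a rough illustration of that offline model, here is a minimal sketch in Python using the third-party cryptography library. The Ed25519 key and the in-memory “artifact” are stand-ins for illustration; real distributions use tooling like GnuPG and ship their public keys in packaging, but the principle is the same: verification happens locally, against a key you already have.

    from cryptography.exceptions import InvalidSignature
    from cryptography.hazmat.primitives.asymmetric import ed25519

    # Developer side: sign the release artifact with a private key.
    artifact = b"contents of the release tarball (stand-in bytes)"
    private_key = ed25519.Ed25519PrivateKey.generate()
    signature = private_key.sign(artifact)

    # User side: the matching public key already lives on the local machine,
    # so verification needs no remote service and reveals nothing to anyone.
    public_key = private_key.public_key()
    try:
        public_key.verify(signature, artifact)
        print("Signature valid: the software has not been tampered with.")
    except InvalidSignature:
        print("Signature mismatch: refuse to install or run.")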

Apple has taken code signing a step further with the inclusion of this notary service. Any signed application–even one not from Apple–must get permission from the remote notary service to run. This means that Apple not only knows which applications you have installed, it also knows each time you run them. While in the past this was an optional service, now it’s mandatory, and starting with Big Sur you can no longer use a tool like Little Snitch to block the service or route it through Tor for some privacy. Apple (and anyone who can sniff this plaintext communication) can know when you launched Tor Browser or other privacy tools, or how often you use competitors’ applications.

[Update: It looks like Apple’s notary service doesn’t send information about the app itself, but instead sends information about the developer certificate used to sign it (which makes more sense given how OCSP works). This means that Apple can know, for instance, that you ran an application from Mozilla, but it can’t necessarily tell whether you ran Firefox or Thunderbird. If a developer only signs a single application, of course, the certificate could be correlated with the app. The service also seems to cache an approval for a period of time, so whether it sends Apple information each time you run an app depends on how frequently you launch it.]
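
For readers who want to check this themselves, an OCSP request is straightforward to parse. Below is a small Python sketch using the cryptography library; the describe_ocsp_request helper and the idea of feeding it bytes captured off the wire are my assumptions for illustration. What it demonstrates is that the request identifies a certificate (serial number and issuer hashes), not the application being launched.

    from cryptography.x509 import ocsp

    def describe_ocsp_request(raw_der: bytes) -> None:
        """Print the fields of a DER-encoded OCSP request, for example one
        captured from the plaintext traffic to ocsp.apple.com."""
        request = ocsp.load_der_ocsp_request(raw_der)
        # The request names a certificate, not an app:
        print("Serial number of signing certificate:", request.serial_number)
        print("Issuer name hash:", request.issuer_name_hash.hex())
        print("Issuer key hash:", request.issuer_key_hash.hex())
        print("Hash algorithm:", request.hash_algorithm.name)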

While I imagine most people were surprised to discover this feature, I also suspect many accept it in the name of security. Yet as with so many Apple features, security is the marketing term when the real motivation is control. While code signing already gave Apple control over whether you could install or upgrade software, this feature grants Apple control over whether you can run applications at all. Apple has already used code signing on iOS to remove competitors’ applications from the App Store and to remotely disable apps in the name of security or privacy. There’s no reason to think they won’t use the same power on macOS now that the check can no longer be bypassed. Apple’s ultimate goal with code signing, their Secure Enclave, and their proprietary silicon is to ensure full control–full ownership–of the hardware they sell.

Take Back Ownership

You should own the computers you buy. Neither hackers nor vendors should be allowed to own your computer remotely. We build secure, privacy- and freedom-respecting laptops, desktops, servers, and phones that put you back in control and ensure that when you buy a Purism computer, you own it.
