
Governments can’t seem to stop asking for secret backdoors

Opinion With Apple pulling the plug on end-to-end encrypted (E2EE) cloud storage for UK users, and Signal threatening to pull out of Sweden if that government demands E2EE backdoors, it’s looking bleak for strong encryption.

There’s no answer to the objection that deliberate flaws in encryption cannot be protected from abuse. The strength of E2EE is the underlying mathematics, which can no more be overturned by law than pi can be made exactly 3. To break it, the implementation has to be deliberately crippled.
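The mathematics in question is key exchange and public-key cryptography. A toy Diffie-Hellman exchange, with deliberately tiny numbers (real systems use 2048-bit-plus moduli or elliptic curves), shows the principle: both parties derive the same shared secret from values exchanged entirely in the clear, and there is no lawful-access lever in the arithmetic itself.

```python
# Toy Diffie-Hellman key exchange - illustrative numbers only.
p, g = 23, 5          # public modulus and generator: no secrets here

alice_secret = 6      # known only to Alice
bob_secret = 15       # known only to Bob

A = pow(g, alice_secret, p)   # Alice sends g^a mod p in the clear
B = pow(g, bob_secret, p)     # Bob sends g^b mod p in the clear

# Each side combines the other's public value with its own secret,
# so both arrive at g^(a*b) mod p without ever transmitting it.
shared_alice = pow(B, alice_secret, p)
shared_bob = pow(A, bob_secret, p)

assert shared_alice == shared_bob
print(shared_alice)   # prints 2 for these toy values
```

An eavesdropper sees only p, g, A, and B; recovering the secret requires solving the discrete logarithm problem, which no statute can repeal.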

This has its own practical problems. Most obviously, if you are a criminal relying on encryption to hide your misdeeds, you will choose a non-crippled option. The UK government thought about this and made the whole process of demanding access “secret,” which would work only if everyone involved, including those outside UK jurisdiction, felt honor-bound not to leak it. Even then, secrecy wouldn’t last if features mysteriously changed in the software. And even if nobody noticed, the first court case that relied on evidence from a supposedly secure source would fall apart on examination. It turns out that if you can’t back secrecy with solid math, it won’t stay secret.

Conversely, if you know what you’re doing, you can evade snoopery. You can use software that doesn’t rely on the compromised services, run encryption software locally before uploading to the cloud, or arrange your own private services with no corporate entity attached that can be forced to capitulate. If you control the software that implements the math and the data flow on your system, you’re golden. Criminals know this, and tech types know this; it’s the vast majority of innocent users who don’t. They’re the most at risk of abuse from snoops, and a fix that works for them will do the most good for the most people.
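“Encrypt locally before uploading” needs nothing exotic. As a sketch, here is a one-time pad built from the operating system’s random source: a toy, not a product (real tools use AES or age/GnuPG), but it shows the property that matters, which is that the key is generated and kept on your machine and only ciphertext ever leaves it.

```python
import os

def encrypt(plaintext: bytes) -> tuple[bytes, bytes]:
    # One-time pad: a random key as long as the message, used once.
    key = os.urandom(len(plaintext))                     # never uploaded
    ciphertext = bytes(p ^ k for p, k in zip(plaintext, key))
    return key, ciphertext                               # upload ciphertext only

def decrypt(key: bytes, ciphertext: bytes) -> bytes:
    # XOR with the same key reverses the encryption.
    return bytes(c ^ k for c, k in zip(ciphertext, key))

key, blob = encrypt(b"meet at noon")      # blob is safe to hand to any cloud
assert decrypt(key, blob) == b"meet at noon"
```

A cloud provider holding only `blob` has nothing to hand over, however forcefully it is asked.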

Of course, the best way to control software you trust is to use open source. You can’t easily hide a back door in open source because it would be visible and removable, as well as being forensic evidence for who’d been doing the tampering. That’s the theory, anyway, although in practice there are ways a back door could still slip through unnoticed for a time.

Proton Mail is a good proof of concept. Email sent between two Proton Mail users is always end-to-end encrypted, with only encrypted versions stored by Proton’s servers. Message composition and consumption happen within the users’ browsers, using code sent by Proton Mail for client-side execution. This code is completely open and examinable by users. More to the point, it is amenable to automated non-Proton change logging. Once a known-good version is established in the community, any changes can be instantly flagged for attention. Crypto backdoors aren’t subtle when introduced to well-documented, known-good code. The entire point of modern encryption is that, except for the keys, there are no secrets and nowhere to hide – provided you can look.
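That kind of community change-logging is mechanically simple. A minimal sketch, with an illustrative script and hash rather than Proton’s real code: once a known-good release hash is agreed, anyone can re-fetch the served client script and flag any deviation for review.

```python
import hashlib

# Illustrative "known-good" client script agreed by the community.
KNOWN_GOOD_SCRIPT = b"function encrypt(msg, key) { /* ... */ }"
KNOWN_GOOD_HASH = hashlib.sha256(KNOWN_GOOD_SCRIPT).hexdigest()

def audit(served_script: bytes) -> str:
    # Compare what the server actually delivered against the agreed hash.
    digest = hashlib.sha256(served_script).hexdigest()
    return "ok" if digest == KNOWN_GOOD_HASH else "CHANGED - needs review"

print(audit(b"function encrypt(msg, key) { /* ... */ }"))   # ok
print(audit(b"function encrypt(msg, key) { leak(key); }"))  # flagged
```

Run continuously by independent parties, a single-byte change to the delivered code lights up everywhere at once – exactly the property secret orders cannot survive.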

What Proton Mail does in-browser – client-side encryption code distributed as source – can be extended to whatever functions are needed to provide E2EE flows within any OS. It’s been decades since we’ve had enough local resources to make on-the-fly compilation with caching work effectively on even the meanest devices. Open source has proved that trusted distribution systems can be built from multiple independent entities, leaving no single point of attack where criminals or law enforcement can operate against users’ interests – at least, not undetected. As for seamless integration of FOSS components on proprietary systems, everything except Windows is built on a mix of FOSS and closed commercial components. Windows’ core may be proprietary, but ask nicely and there’s Linux.
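The “no single point of attack” idea can be sketched as a consensus check across independent mirrors, simulated here as byte strings: accept a component only when every copy hashes identically, so one compromised distributor is detected rather than believed.

```python
import hashlib
from collections import Counter

def consensus(copies: list[bytes]) -> tuple[str, bool]:
    # Hash every mirror's copy; trust requires unanimous agreement.
    digests = [hashlib.sha256(c).hexdigest() for c in copies]
    digest, votes = Counter(digests).most_common(1)[0]
    return digest, votes == len(copies)

good = b"libe2ee-1.4.2"                       # hypothetical component
mirrors = [good, good, b"libe2ee-1.4.2-tampered"]  # one mirror compromised
digest, unanimous = consensus(mirrors)
print(unanimous)   # False: the mismatch is flagged for investigation
```

With honest mirrors the check passes silently; a single tampered distributor turns a quiet compromise into a public alarm.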

FOSS is the ideal model for an open, attack-resilient, self-monitoring E2EE framework that frees commercial entities from being targets that malicious governments can pry open. A completely open system cannot be secretly compromised, and if any component is compromised, it can be isolated and replaced fast. Unless all countries outlaw developing and publishing source code implementations of standard mathematical algorithms, the idea is unassailable – if the industry decides it wants it. It’s not as if it would displace any significant proprietary secrets: all this stuff is known, and there’s plenty of room elsewhere to create differential advantages for any particular platform.

States and their security services have spent decades trying to secure encryption for “good” people while breaking it for “bad” ones. You cannot do that, no matter how devoutly you say Something Must Be Done. If the industry, especially the open source community, can remove the temptation once and for all, then the stuff that does work – intelligence, policing, infiltration – will get the attention it needs. ® 
