What Regulators Get Wrong About Encryption Backdoors

Lena Kowalski

March 15, 2026

Politicians and law enforcement have argued for decades that encryption should have a “backdoor”—a way for authorities to access private communications with a warrant. The idea sounds simple: good guys get in, bad guys don’t. In practice, what regulators get wrong about encryption backdoors is more fundamental: you can’t build a backdoor that only the good guys use. The math and the threat model don’t allow it.

The “Exceptional Access” Myth

Proposals for “exceptional access” or “lawful access” usually assume a technical design where only authorized parties (e.g. government with a warrant) can decrypt. The problem: any mechanism that allows a third party to decrypt can be discovered, stolen, or abused. There’s no way to build a backdoor that only works for one set of users. If a key exists, it can leak—through compromise, insider threat, or legal overreach. History is full of “secure” government and vendor systems that were breached. Adding a backdoor doesn’t add good-guy-only access; it adds another attack surface.
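To make the "another attack surface" point concrete, here is a minimal toy sketch of key escrow in Python. The cipher is a deliberately insecure stand-in (SHA-256 in counter mode), and all names (`ESCROW_MASTER_KEY`, `send_message`, etc.) are invented for illustration; the point is structural: every message key is also wrapped under one escrow key, so whoever obtains that single key can read all traffic.

```python
import hashlib
import secrets

def keystream(key: bytes, nonce: bytes, length: int) -> bytes:
    # Toy stream cipher: SHA-256 in counter mode. Illustration only, NOT secure.
    out = b""
    counter = 0
    while len(out) < length:
        out += hashlib.sha256(key + nonce + counter.to_bytes(8, "big")).digest()
        counter += 1
    return out[:length]

def toy_encrypt(key: bytes, plaintext: bytes):
    nonce = secrets.token_bytes(16)
    ct = bytes(a ^ b for a, b in zip(plaintext, keystream(key, nonce, len(plaintext))))
    return nonce, ct

def toy_decrypt(key: bytes, nonce: bytes, ct: bytes) -> bytes:
    return bytes(a ^ b for a, b in zip(ct, keystream(key, nonce, len(ct))))

# The mandated "lawful access" key. One copy, guarding everything.
ESCROW_MASTER_KEY = secrets.token_bytes(32)

def send_message(recipient_key: bytes, plaintext: bytes) -> dict:
    msg_key = secrets.token_bytes(32)
    nonce, ct = toy_encrypt(msg_key, plaintext)
    # The per-message key is wrapped twice: once for the intended
    # recipient, and once for the escrow authority.
    return {
        "nonce": nonce,
        "ct": ct,
        "wrapped_for_recipient": toy_encrypt(recipient_key, msg_key),
        "wrapped_for_escrow": toy_encrypt(ESCROW_MASTER_KEY, msg_key),
    }

# Anyone who obtains the escrow key can read *every* message ever sent:
alice_key = secrets.token_bytes(32)
msg = send_message(alice_key, b"meet at noon")
stolen = ESCROW_MASTER_KEY  # one breach, insider, or legal order away
msg_key = toy_decrypt(stolen, *msg["wrapped_for_escrow"])
print(toy_decrypt(msg_key, msg["nonce"], msg["ct"]))  # b'meet at noon'
```

Note that nothing in the design distinguishes a warranted decryption from a stolen one: the escrow key works identically for both, which is the whole problem.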

Cryptographers have explained this repeatedly. The 2015 “Keys Under Doormats” report, written by a group of leading cryptographers, and many analyses since have concluded that mandated backdoors would weaken security for everyone. You can’t have “encryption that works for citizens but not for criminals”—criminals would use unbackdoored tools (open source, foreign software, or custom crypto) while ordinary users would be left with weakened systems. The result: worse security overall, with no guarantee that law enforcement gets the access they want.

What Regulators Assume (And Get Wrong)

Regulators often assume that tech companies can “just” add a backdoor and keep it secure. They underestimate how many parties would need access (every country that wants “lawful access”), how key escrow or split-key schemes would work at global scale, and how quickly any backdoor would be targeted. They also tend to conflate “we want to solve crime” with “encryption is the main barrier.” In many cases, law enforcement already has other paths: metadata, device access when seized, cloud backups, or cooperation from providers. The encryption debate often ignores that and focuses on the hardest case—unbreakable end-to-end encryption—as if breaking it were the only option.
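Split-key escrow is the usual answer to "what if the escrow key leaks?": divide the key between two authorities so neither can decrypt alone. The sketch below, a toy 2-of-2 XOR split with invented names, shows why this helps less than it sounds like it should: the shares are individually useless, but the full key must be reassembled in cleartext at decryption time, and every jurisdiction demanding its own access multiplies the number of share-holders and reconstruction points to protect.

```python
import secrets

def split_key(key: bytes) -> tuple[bytes, bytes]:
    # 2-of-2 XOR secret split: each share alone is indistinguishable
    # from random bytes; both together reconstruct the key exactly.
    share_a = secrets.token_bytes(len(key))
    share_b = bytes(k ^ a for k, a in zip(key, share_a))
    return share_a, share_b

def recombine(share_a: bytes, share_b: bytes) -> bytes:
    # The key exists in cleartext again the moment the shares meet --
    # this reconstruction step is itself an attack surface.
    return bytes(a ^ b for a, b in zip(share_a, share_b))

escrow_key = secrets.token_bytes(32)
court_share, agency_share = split_key(escrow_key)
assert recombine(court_share, agency_share) == escrow_key
```

The math of the split is fine; the operational problem is everything around it: who holds shares, how reconstruction is audited, and what happens when dozens of countries each demand the same arrangement.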

Another mistake: treating encryption as a policy choice rather than a technical reality. Strong encryption is widely available in open source. You can’t un-invent it. Legislation that requires backdoors in commercial products would push motivated actors to use tools that don’t comply, while average users would remain on weakened, compliant products. The gap between “what the law requires” and “what exists in the wild” is what regulators keep getting wrong.

There’s also the international angle. If the U.S. or E.U. mandates a backdoor, every other country can demand the same key or equivalent access. Authoritarian regimes would use the same legal and technical arguments to spy on dissidents and journalists. You can’t build a backdoor that only democracies use—once it exists, the precedent and the mechanism spread.

What Actually Helps

If the goal is to balance security and lawful access, the answers lie outside backdoors: better resourcing for law enforcement to use existing legal tools, international cooperation, and targeting the endpoints (devices, accounts) rather than trying to break encryption itself. Regulators could also focus on transparency and oversight when access is granted, rather than on mandating weak crypto. None of that is as politically simple as “require a backdoor,” but it’s what the technical and policy reality supports.

What regulators get wrong about encryption backdoors is the belief that you can have both strong encryption and guaranteed lawful access in the same system. You can’t. The choice is better security for everyone with other investigative tools, or weaker security for everyone with a backdoor that won’t stay in the right hands. So far, the evidence keeps pointing toward the first option.
