The debate over whether the federal government should be able to access coded communications, which stretches from Silicon Valley to Washington, D.C., has a ground zero: your phone.
When you furiously type out a text message on your iPhone or Android device, you may think that only the person to whom you sent it can read it. But unless you’re using one of a handful of apps that incorporate technology designed to prevent eavesdropping, anybody with access to the underlying infrastructure of the Internet—most often law enforcement workers—can spy on your texts, instant messages, phone calls, and in some cases your stored pictures.
This past year, Apple and Google began encrypting the data stored on their smartphones. Some of their apps and services encrypt data in transit, too, while third-party services such as Tor, Signal, and Silent Circle’s suite of communications apps fill some security gaps.
In the aftermath of the Paris terrorist attacks, U.S. government officials oppose tech companies’ encryption efforts, arguing that law enforcement agencies such as the FBI and state police departments need the ability to eavesdrop on encrypted communications to stop criminals. They demand private “backdoors” that would allow them to decipher encrypted data.
Security experts and privacy advocates, meanwhile, say creating encryption backdoors would set digital-privacy rights back 20 years. It would also all but guarantee the introduction of vulnerabilities, they say, that bad actors such as terrorists could exploit.
“All of society benefits from strong encryption,” says Phil Zimmermann, who created the e-mail encryption protocol Pretty Good Privacy in 1991 and co-founded secure-communications provider Silent Circle. He points to the decade-long debate in the 1990s involving the FBI, the NSA, Congress, courts, media, civil-liberties groups, and human rights groups, commonly referred to as the “crypto wars,” which resulted in the deregulation of the mathematical algorithms that underpin computer encryption.
“Plenty of companies today provide secure services to their customers and still comply with court orders.” — James Comey, director, FBI
“After 9/11, we did not reverse that decision,” Zimmermann says. “Now we have a deep entrenchment of that technology: Browsers have it, routers, TLS [encryption protocol Transport Layer Security] that everybody uses for online banking.”
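Zimmermann’s point about TLS being everywhere is easy to verify: modern language runtimes ship with it, and certificate verification is on by default. A minimal sketch using Python’s standard `ssl` module (an illustration of the defaults, not a full handshake) shows what a client context demands before a single byte is sent:

```python
import ssl

# Build a client-side TLS context with the library's secure defaults:
# certificate verification on, hostname checking on, and legacy
# SSL/early-TLS protocol versions disabled.
context = ssl.create_default_context()

print(context.verify_mode == ssl.CERT_REQUIRED)  # True: peer must present a valid cert
print(context.check_hostname)                    # True: cert must match the hostname
```

An actual connection would follow by wrapping a TCP socket with `context.wrap_socket(sock, server_hostname=...)`; everything after that handshake, including your online banking session, travels encrypted.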
Why the government wants access to your phone
The debate over government access to encrypted data resurfaced in 2013, in the wake of Edward Snowden leaking National Security Agency documents. Recent moves by tech companies such as Apple and Google to increase communications security, coupled with the high-profile terrorist attacks in Paris and San Bernardino, Calif., have bolstered its intensity.
CIA Director John Brennan said in November that encryption backdoors are needed to track terrorists.
FBI Director James Comey told Congress in December that the government isn’t seeking a backdoor, per se. But tech companies, he said, “should figure out” how to get requested information to the judge. “Encryption is getting in the way of our ability to have court orders effective to gather information we need in our most important work,” he said.
“It’s not a technical issue,” Comey said, adding that “plenty of companies today provide secure services to their customers and still comply with court orders.”
To Comey, “it is a business model question. Lots of good people have designed their systems and devices so that judges’ orders cannot be complied with.” For example, he said, the government can’t access “109 messages” that “an overseas terrorist” exchanged with someone allegedly planning a terrorist attack in Garland, Texas, last year because they “were encrypted.”
Sen. Tom Cotton (R-Ark.) released a statement echoing Comey’s comments, and he criticized Apple CEO Tim Cook’s stance on encryption in Apple products. Cotton said Apple doesn’t have access to encrypted data stored on its devices “only because it designed its messaging service that way.” Companies like Apple, Google, and Facebook run the risk of becoming “the preferred messaging services of child pornographers, drug traffickers, and terrorists,” he said.
“The FBI has been complaining about ‘going dark’ since they lost the crypto wars 20 years ago.” — Cindy Cohn, executive director, Electronic Frontier Foundation
Sen. Cotton, like the FBI, did not respond to requests for comment. Although the issue has become important enough to register a response in favor of government access from the Manhattan District Attorney’s office (PDF), and to show up in both Republican and Democratic presidential primary debates, U.S. government representatives have not been speaking with a single voice on the matter.
President Obama said in early October that his administration would not push for access to encrypted user data. His administration has not made a single legislative proposal available for public review, though some expect a proposal this spring.
Risky business for security when opening encryption backdoors
The resurgence in government calls for access to encrypted communications and stored data doesn’t surprise Cindy Cohn, who successfully litigated Bernstein v. United States, the seminal case decided in 1997, which led the U.S. government to lift export restrictions on computer cryptography, the mathematical basis for encryption.
“The FBI has been complaining about ‘going dark,’” or losing access to encrypted communications, “since they lost the crypto wars 20 years ago,” says Cohn, executive director of the Electronic Frontier Foundation, which was founded in large part to help businesses and individuals fight for the right to encrypt electronic communications and stored data.
A government win would mean “your communications would be more vulnerable to hackers, to malicious governments, to anybody who wants access to your data,” Cohn says.
Creating a backdoor into encrypted systems that only the U.S. government can use is no simple technological feat, says Justin Troutman, a cryptography researcher and author.
“They’re asking us to build a framework that can break safely, and there’s no way to break these systems safely,” he says. “The downside to master keys is that even if we could trust the government completely to use it only when authorized, we can’t trust that they won’t get hacked.” Troutman cites the June revelation that 18 million records from the U.S. Office of Personnel Management had been breached in a hack as but one recent example.
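The master-key problem Troutman describes can be sketched in a few lines. In the toy example below (standard library only; the XOR “cipher” is purely illustrative, not real cryptography, and the helper names are hypothetical), every message gets its own random key, but a key-escrow scheme also wraps a copy of each message key under one government-held master key. Whoever steals that single key reads everything:

```python
import hashlib
import secrets

def keystream_xor(key: bytes, data: bytes) -> bytes:
    """Toy stream cipher: XOR with a SHA-256-derived keystream.
    Illustrative only -- NOT a real cipher."""
    stream = b""
    counter = 0
    while len(stream) < len(data):
        stream += hashlib.sha256(key + counter.to_bytes(8, "big")).digest()
        counter += 1
    return bytes(a ^ b for a, b in zip(data, stream))

# Normal end-to-end encryption: one random key per message.
msg_key = secrets.token_bytes(32)
ciphertext = keystream_xor(msg_key, b"meet at noon")

# Key escrow: a copy of the message key is wrapped under a single
# government-held master key.
master_key = secrets.token_bytes(32)
escrowed_copy = keystream_xor(master_key, msg_key)

# Anyone who steals the one master key recovers every message key...
recovered_key = keystream_xor(master_key, escrowed_copy)

# ...and with it, every message ever escrowed.
print(keystream_xor(recovered_key, ciphertext))  # b'meet at noon'
```

The design flaw is structural, not mathematical: the escrow database turns many independent secrets into one catastrophic single point of failure, exactly the kind of target the OPM breach shows attackers can reach.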
“Encryption affects every company you can imagine, from the biggest consumer companies to B2B.” — Tyler Shields, former analyst, Forrester Research
Encryption code, with its complicated mathematics, differs greatly from other computer code. Normal programming returns an error of some kind when it fails, says PGP creator Zimmermann, but encryption tends to fail silently: broken encryption generally doesn’t affect the functionality of the software it’s protecting, so nothing looks wrong.
“It’s hard to make this stuff work right” normally, he says, “never mind when you add in the complexity of a backdoor.”
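Zimmermann’s point about silent failure can be illustrated with a toy example (standard library only; the XOR cipher and helper names below are illustrative assumptions, not real cryptography). Decrypting with the wrong key in an unauthenticated scheme raises no error at all, so the program keeps running on garbage; adding an integrity check (here, an HMAC tag) is what turns the failure into something the software can actually see:

```python
import hashlib
import hmac
import secrets

def xor_cipher(key: bytes, data: bytes) -> bytes:
    """Toy unauthenticated stream cipher -- illustrative only."""
    stream = b""
    counter = 0
    while len(stream) < len(data):
        stream += hashlib.sha256(key + counter.to_bytes(8, "big")).digest()
        counter += 1
    return bytes(a ^ b for a, b in zip(data, stream))

key = secrets.token_bytes(32)
wrong_key = secrets.token_bytes(32)
ciphertext = xor_cipher(key, b"transfer $100")

# Unauthenticated decryption with the wrong key: no exception, no error
# code -- just silent garbage the rest of the program happily consumes.
garbage = xor_cipher(wrong_key, ciphertext)

# Authenticated decryption: an HMAC tag over the ciphertext lets the
# failure surface as an explicit error instead.
tag = hmac.new(key, ciphertext, hashlib.sha256).digest()

def decrypt_checked(k: bytes, ct: bytes, t: bytes) -> bytes:
    expected = hmac.new(k, ct, hashlib.sha256).digest()
    if not hmac.compare_digest(expected, t):
        raise ValueError("authentication failed")
    return xor_cipher(k, ct)

decrypt_checked(key, ciphertext, tag)          # returns the plaintext
# decrypt_checked(wrong_key, ciphertext, tag)  # raises ValueError
```

Getting even this much right is subtle (constant-time comparison, tag over the ciphertext); a mandated backdoor would add a second, deliberately weakened path through the same fragile machinery.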
You need look no further than last month to see what can happen when an encryption backdoor is discovered. Network security company Juniper Networks revealed in December that it had discovered “unauthorized” code in its software that would allow anybody who knew about it to monitor Internet traffic. The security hole, said Bob Worrall, Juniper’s chief information officer, dates back to at least August 2012.
Juniper quickly released a patch for the vulnerable software, which includes ScreenOS, the operating system on its NetScreen firewalls. But because the patch requires manual installation, not all systems will be updated at the same time. And the patch itself reveals the nature of the vulnerability, security experts told Wired: anybody savvy enough to reverse-engineer it can exploit the hole, opening the so-called “backdoor” to a lot more people.
It’s not known how many companies and users have been affected by the unauthorized backdoor. But think of it like a seatbelt that appears to click into place but is actually broken: you don’t know you’re unprotected until it’s too late.
The high cost of not using encryption
Meanwhile, more companies are exploring how to use encryption to their benefit, says Tyler Shields, a former Forrester Research analyst.
“Encryption affects every company you can imagine, from the biggest consumer companies to B2B,” he says. “If you don’t do encryption, and you get hacked, you’re putting your customers at risk.”
A 2011 white paper by Osterman Research found that the mere risk of running afoul of data-breach laws should motivate corporations to encrypt consumer data at rest and in transit. Protecting a corporate reputation, the paper concluded, was worth the cost of securing customer data.
Although multiple legal and privacy experts said government agencies aren’t likely to succeed in their attempts to cut a law-enforcement-sized hole in encryption, they wouldn’t rule out the possibility, thanks to election-year politics and an erratic Congress.
“There’s no way to implement this without making everybody who relies on encryption significantly less secure,” says the EFF’s Cohn. “We won’t be any safer because the bad guys will still have crypto.”