
It might seem like a no-brainer that would-be terrorists use the best available technology to hide their communications from outsiders. But there is evidence that those behind the November 13 attacks in Paris, which killed about 130 people, may not have been using privacy-enabling technology known as encryption.

Despite the uncertainty, U.S. government agencies appear more determined than ever to extract concessions from Silicon Valley that they have been demanding for two decades. Namely, they say tech companies should create ways for law enforcement agencies to unscramble encrypted information, colloquially known as backdoors.

Creating encryption backdoors is “a zombie idea. It’s a failed idea,” said Bill Blunden, a security researcher, historian, and author at San Francisco State University. The practicality of backdoors, he said, has “been debunked once before, during the Clinton era, with the Clipper chip.”

The Clipper chip was developed by the National Security Agency in the 1990s to encrypt voice communications while enabling law enforcement agencies to eavesdrop on the conversation. The plan for the chip failed after massive opposition from the technology community, bolstered by research showing that the government introduced security flaws in the chip (PDF).

Blunden and others, including Apple CEO Tim Cook, fear that once a flaw has been introduced into an encryption standard, anybody with enough technical knowledge—not just authorized law enforcement officials—will be able to exploit it.

“No one should have to decide between privacy [and] security. We should be smart enough to do both,” Cook said at the WSJ.D Live conference in October. “Both of these things are essentially part of the Constitution.”

Government officials from the top of the Central Intelligence Agency on down disagree with Cook. To effectively investigate and prevent terrorist attacks like the one in Paris, they argue, they need better access to communications data.

“I think this is going to open an entire new debate about security versus privacy,” Michael Morell, former deputy director of the CIA, told the television newsmagazine 60 Minutes.

CIA director John Brennan echoed that sentiment, saying “policy and legal” actions against government encryption backdoors “make our ability collectively, internationally, to find these terrorists much more challenging.”

Meanwhile, the Manhattan District Attorney’s office published a white paper (PDF) asking Congress to pass a law requiring commercial encryption technology to be “accessible” to law enforcement with a warrant, thus enabling them to decode any data secured with that technology. It also argues that encrypting data on smartphones—as Apple’s iOS and Google’s Android do—causes “harm” to “crime victims” because “it severely hampers law enforcement’s ability to aid victims.”

Technology and privacy advocates are taking issue with these positions. They are also expressing caution about trusting tech titans to keep the data their products encrypt private.

SFSU’s Blunden argued that the current encryption debate makes tech firms “look really good—like they’re defending our privacy.” But “historically, Silicon Valley is an offshoot of the NSA. These companies have cooperated [with the government] in the past on several occasions, and they’ll do so again.” The defense companies that formed the foundation of modern Silicon Valley got their start thanks to military researchers opening labs at Stanford University in the 1950s.

A high-level executive at a company that specializes in secure communications, who requested not to be named because of current sentiment in the wake of the Paris attacks, said he isn’t worried about government regulation of encryption. He expects the current debate over encryption backdoors to go nowhere.

“At the end of the day, we’re going to have more discussion on this, and we’re going to end up back where we were. We need to have more encryption,” he said, citing its effectiveness in not only ensuring a higher level of consumer privacy, but also in preventing crimes like corporate espionage and smartphone theft. “Encryption is just part of the change of having computers being used in more and more places.”

Eva Galperin, global policy analyst for the technology rights group Electronic Frontier Foundation, echoed Blunden’s concerns about the aftermath of the Paris attacks.

“If companies cave [to government data access requests] in private, we have no way of knowing,” she said. It would take a whistleblower or document leak to learn about the collusion. “That’s a doomsday scenario.”