To understand why Apple believes it will beat the FBI in the legal tussle over a locked iPhone, start with an exchange that took place on May 4, 2012.

A team of FBI agents and lawyers had completed draft legislation that would force Apple and other technology companies to insert backdoors in their products for government agencies to use. For several years, the task force had been asking field offices whether “investigations have been negatively impacted” by device encryption. They had commissioned an evaluation from think tank RAND and briefed the transition team of the president-elect, Barack Obama.

Senators now grilled FBI director Robert Mueller about when they’d finally see the legislation. “We have been waiting patiently for the administration to put forth a proposal with necessary fixes,” said Chuck Grassley, an Iowa Republican.

The FBI director couldn’t provide a date: The White House, concerned that mandatory backdoors would inflame relations with Silicon Valley, had quietly blocked it.

No bill ever found its way to the Senate.

Apple today is pointing to these deliberations, along with a pair of First Amendment-based defenses, when arguing that it shouldn’t be forced to develop a security-bypassing version of iOS for the FBI. “The government abandoned efforts to obtain legal authority for mandated backdoors,” Apple’s lawyers say in a 65-page legal brief, filed in federal court. “Congress never granted the authority.”

In a brief filed on March 10, the Department of Justice characterized the 2012 episode as “merely vague discussions about potential legislation.”

The litigation currently under way arose out of an FBI probe into the husband and wife who shot and killed 14 people in San Bernardino, Calif., last year. The bureau says it obtained an iPhone 5C used by one of the shooters but has been unable to unlock it.

Last month, a federal judge ordered Apple to engineer a custom version of iOS—call it fbiOS—designed to bypass the phone’s built-in security. The Justice Department insists that developing fbiOS won’t impose an “undue burden” on Apple’s engineers.

A hearing in federal court is scheduled for March 22 in Riverside, Calif.

In its brief, Apple revisits the FBI’s failed 2012 legislative push — and goes further than many companies would, arguing that the court order itself is unconstitutional.

Can a corporation like Apple hold political views?

Apple’s logic goes something like this: The First Amendment protects speech. Engineers who write code engage in speech. Therefore, a government demand to force engineers to write code conflicts with the First Amendment.

Google, Microsoft, Facebook, and other companies have set aside their usual rivalries to publicly agree with Apple. “The government seeks to force Apple and its engineers to write software—that is, to engage in protected speech—against their will,” the companies said in an amicus brief.

There’s legal precedent to support Apple’s position. Courts concluded more than a decade ago that at least some computer code can qualify as speech. In a 1999 ruling that has never been overruled, the 9th Circuit Court of Appeals said “encryption software, in its source code form…must be viewed as expressive for First Amendment purposes.” Another federal appeals court reached the same conclusion a year later.

What’s novel, though, is Apple’s additional First Amendment argument. Apple says it holds certain pro-privacy, anti-backdoor political views—and that forcing engineers to comply with the court order runs afoul of their, and their employer’s, First Amendment rights.

“When Apple designed iOS 8, it wrote code that announced the value it placed on data security and the privacy of citizens by omitting a backdoor that bad actors might exploit,” its lawyers argue. “The government disagrees with this position and asks this court to compel Apple to write new software that advances its contrary views. This is, in every sense of the term, viewpoint discrimination that violates the First Amendment.”

It’s far from clear how the courts—including the U.S. Supreme Court, if the case goes that far—will respond. Disputes over forced speech tend to deal with whether the devoutly religious may cover up license plate mottos, or whether students can be required to recite the Pledge of Allegiance, not whether the world’s wealthiest computer company can be compelled to write code that it believes should never exist.

What Apple hopes to do is draw a new legal line that the FBI, and other agencies that will follow, will never be allowed to cross. Police aren’t allowed to force confessions, Apple argues, even if that rule makes it harder to enforce the law. Police can’t search homes without a warrant. Spouses can’t be forced to testify against one another. Some investigatory techniques simply remain off-limits.

Apple wants government-mandated backdoors in software to be viewed as off-limits as well. It’s an untested argument: no company appears to have previously asked a judge to draw this line.

In an open letter to customers, Apple CEO Tim Cook put it succinctly: Some technologies are simply “too dangerous to create.”