How an FBI win against Apple could hurt my company

Apple is vigorously challenging the FBI over unlocking an iPhone for one simple reason: If Cupertino loses, there’s no obvious limit to the code that software companies, including mine, could be forced to create.

The FBI claimed, in its initial court filing, that its request was “tailored for and limited to this particular phone.” That statement was misleading. A federal judge in New York revealed that the bureau has “a dozen more such applications pending, and it clearly intends to continue seeking assistance that is similarly burdensome.”

If a favorable precedent is set, local, state, and other federal police agencies would pounce. Manhattan’s district attorney said he’d “absolutely” invoke such a precedent, and FBI Director James Comey acknowledged in a recent congressional appearance that the Apple case would invite copycat demands.

A hearing was scheduled in federal court today, but FBI officials requested a last-minute delay, saying they were investigating ways to access the data without Apple’s assistance.

This case arose out of an FBI probe into the husband and wife who last year shot and killed 14 people in San Bernardino, Calif. The bureau says it obtained a work phone, an iPhone 5C, used by one of the shooters, and it wants to compel Apple to develop a version of iOS with a backdoor to unlock it. Apple is fighting the resulting court order, which is temporarily suspended while the FBI investigates other unlocking techniques.

At my company, Recent Media, we write code for iOS and Android news recommendation apps. We have no legal department and certainly no process in place to comply with a judge’s order to undermine our software’s security. In fact, by limiting the log data we store and allowing Recent News to be used anonymously, we hope to avoid ever receiving such an order.
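
To make that concrete, here is a minimal Swift sketch of the data-minimization idea; the type and function names are illustrative placeholders, not Recent Media’s actual code. The point is to record only the coarse signals a recommendation engine needs, with nothing that ties an entry back to a person or device.

    import Foundation

    // Illustrative sketch only; not Recent Media's actual logging code.
    // We keep coarse engagement signals and nothing that identifies a
    // person or device.
    struct AnonymousLogEntry: Codable {
        let articleCategory: String   // e.g. "technology"
        let readDurationSeconds: Int  // rounded engagement signal
        let dayBucket: String         // "2016-03-21"; no precise timestamp
    }

    func logRead(category: String, duration: TimeInterval) {
        // Deliberately absent: device ID, IP address, location, account name.
        // Data we never store is data we can never be ordered to produce.
        let day = String(ISO8601DateFormatter().string(from: Date()).prefix(10))
        let entry = AnonymousLogEntry(
            articleCategory: category,
            readDurationSeconds: Int(duration),
            dayBucket: day
        )
        if let data = try? JSONEncoder().encode(entry),
           let json = String(data: data, encoding: .utf8) {
            print(json) // in practice, appended to an aggregate-only store
        }
    }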

If Apple loses its fight with the FBI, and one of our users comes under criminal investigation, that precedent would let the government turn us into unwilling surveillance assistants. Under threat of being held in contempt, we could be forced to write thousands of lines of code based on specifications drafted by prosecutors who know little to nothing about how our technology works.

Such code, we imagine, could be designed to alert authorities when a user performs certain activities, or when a phone enters or leaves a certain geographical area. It could also be designed to integrate government malware that would redirect network traffic, or remotely activate a device’s microphone or camera.
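
To show how little effort the simplest such demand would require of us, here is a hypothetical Swift sketch built on Apple’s public CoreLocation geofencing API; the class, the coordinates, and the report() stub are placeholders we invented for illustration, not anything we have written or plan to write.

    import CoreLocation

    // Hypothetical sketch of a compelled geofence "alert" feature.
    final class CompelledGeofenceMonitor: NSObject, CLLocationManagerDelegate {
        private let manager = CLLocationManager()

        func start() {
            manager.delegate = self
            manager.requestAlwaysAuthorization()
            // Placeholder coordinates and radius; a real order would specify the area.
            let region = CLCircularRegion(
                center: CLLocationCoordinate2D(latitude: 34.05, longitude: -118.24),
                radius: 500, // meters
                identifier: "watched-area"
            )
            region.notifyOnEntry = true
            region.notifyOnExit = true
            manager.startMonitoring(for: region)
        }

        // iOS calls these when the device crosses the region boundary.
        func locationManager(_ manager: CLLocationManager, didEnterRegion region: CLRegion) {
            report(event: "entered", regionID: region.identifier)
        }

        func locationManager(_ manager: CLLocationManager, didExitRegion region: CLRegion) {
            report(event: "exited", regionID: region.identifier)
        }

        private func report(event: String, regionID: String) {
            // In a compelled build this would transmit the event to a
            // government-specified endpoint; here it goes nowhere.
            print("ALERT: device \(event) \(regionID)")
        }
    }

Even this simplest version turns a news app into a tracking device; the more invasive demands, like redirecting traffic or switching on a microphone, would mean far more code written to prosecutors’ specifications.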

Threat of FBI win makes for strange bedfellows

To preemptively fight the threat of innumerable such police wishlists, Facebook, Microsoft, Google, Yahoo, and others—usually bitter rivals—have set aside their differences to warn of the real dangers in allowing the government to conscript engineers.

“Make no mistake: If the government prevails in this case, it will seek many such decisions,” the companies wrote in a joint legal filing supporting Apple. “Investigative tools meant for extraordinary cases may become standard in ordinary ones.”

This is why you’re starting to hear mild-mannered engineers discuss civil disobedience. A New York Times report said Apple’s iOS engineers might quit rather than create a government backdoor to the operating system (they’d have little difficulty finding jobs elsewhere). The ethics code of the professional association for computer scientists calls on members to protect “the privacy and integrity of data,” not undermine it. Computer security professionals, long aware of the ethics of backdoors, are considering their own social responsibilities.

Opposition to surveillance orders is hardly hypothetical. Two years ago, after the FBI demanded that Ladar Levison hand over the private encryption key to his Lavabit email service, he pulled the plug on Lavabit rather than comply. With that key, the bureau could have decrypted the connections of all of Lavabit’s users, including those of Edward Snowden. (Levison filed a brief in the Apple case, arguing that the FBI is demanding involuntary servitude, abolished in 1865.)

Apple says the FBI’s arguments would catapult the industry into unexplored terrain. The lack of any limiting principle, the company stated in a brief, would allow “compelling a pharmaceutical company against its will to produce drugs needed to carry out a lethal injection in furtherance of a lawfully issued death warrant, or requiring a journalist to plant a false story in order to help lure out a fugitive.”

Technology companies are hardly opposed to helping police fight crime: Every major company has a department that responds to legally authorized requests for stored data. Yahoo alone deals with requests for approximately 50,000 user accounts each year. For Microsoft, the global total is closer to 110,000 accounts.

In this case, the FBI is not requesting stored data from Apple. The bureau is demanding that the company craft code to bypass its own security protections. Nobody should be surprised that Silicon Valley is supporting Apple’s fight.