Silicon Valley investors and developers may consider digital privacy an oxymoron, but that doesn’t make it so. Taking a fresh look at what privacy means in the era of cheap drones, expanding state surveillance, and voice-controlled home assistants like the Amazon Echo, a group of legal scholars has authored a series of papers on how privacy rights should be revisited in the 21st century.
Thomas Donnelly, senior fellow for constitutional studies at the National Constitution Center in Philadelphia, has spearheaded a white-paper series and panel discussion on what modern privacy rights should look like in the United States. Dubbed “A 21st century framework for digital privacy,” the panel will be live-streamed tonight.
“The purpose behind this initiative was to try to bring together a combination of scholars and people who’ve had experience in national security, and ask,” he says, “How might you translate those laws, in light of the digital age, or how might you update them?”
Jennifer Daskal, an American University associate law professor, is expected to testify in front of the Senate Judiciary Subcommittee for Crime and Terrorism on her paper, “Whose law governs in a borderless world? Law enforcement access to data across borders.”
“There’s a general consensus among experts from across the ideological spectrum,” Donnelly told The Parallax in a recent interview, “that both [the] doctrinal framework covering digital privacy and the statutory framework [are] outdated.”
We spoke with Donnelly on subjects ripped from recent headlines examining how the law can be updated to better reflect the privacy-protective intentions of the Founding Fathers when privacy-compromising technology is so pervasive and ostensibly helpful.
Much of our conversation focused on the Fourth Amendment to the U.S. Constitution, which protects citizens against “unreasonable searches and seizures,” and two relatively recent Supreme Court cases that could provide opportunities for restoring stronger individual privacy rights under the law: United States v. Jones, which held that the government using a GPS to track a suspect constitutes a search under the Fourth Amendment, and Riley v. California, a unanimous decision that declared unconstitutional a search of a suspect’s cell phone during an arrest without a warrant.
What follows is an edited transcript of our conversation.
Q: What’s the current state of digital privacy and the law? How did we get here?
Our main focus at the Constitution Center is certainly on the Fourth Amendment and on Constitutional doctrines, but obviously, a lot of these questions are also guided by federal laws dealing with anything from how the government gets access to digital information, to how it gets access to data across international borders.
“[W]e need to take a close look at those old rules pertaining to phone communications, now that all of our data—all of our most personal information—is ending up in the cloud.”
There have been a couple of privacy cases before the court in the last couple of years: the Jones case [a Supreme Court ruling which held that the government using a GPS to track a suspect constitutes a search under the Fourth Amendment] and the Riley case [in which the Supreme Court unanimously decided that searching a suspect’s cell phone during an arrest without a warrant was unconstitutional]. But there hasn’t been a singular blockbuster on privacy dealing with the question, “How do you translate the Fourth Amendment in the age of cloud computing?”
Instead, we’ve inherited this old doctrinal framework with things like the third-party doctrine, this idea that if you forfeit your data to third-party providers, it loses Fourth Amendment protection. We don’t know exactly what the rules need to be, but we need to take a close look at those old rules pertaining to phone communications, now that all of our data—all of our most personal information—is ending up in the cloud.
Can you point to a couple of key rulings that are holding back progress?
There were rulings decades ago, including the 1976 Miller decision on bank records and the 1979 Smith v. Maryland decision on when the government can use pen registers to get access to telephone numbers that people have dialed. The doctrinal framework suggests that the Fourth Amendment does not protect, in most circumstances, the data you turn over to cloud service providers.
So we have these old cases dealing with old technologies, but we also have two rulings in recent years that at least suggest that the Supreme Court is thinking through how to apply some of those old ideas to newer technology.
In the Riley case, Chief Justice John Roberts wrote the opinion for the court and spoke powerfully about the need to translate what the framers were trying to do with the Fourth Amendment: to eliminate general warrants and the ability of the king’s agents to rummage around our houses on fishing expeditions, trying to find evidence of a crime. He said that once you have these technologies holding all this information, we have to think hard about how to ensure that we are protecting as much privacy through the Fourth Amendment today as they were in the 18th century.
Many of our favorite online services, such as Google and Facebook, store our data in countries with different laws than our own. Can you describe what kinds of legal tensions are inherent in third-party hosting and cloud computing, vs. keeping your data locally on your phone or laptop?
Daskal’s paper thinks through the ways in which data, as intangible property, differs from tangible property. When deciding whether a government entity may exercise jurisdiction over certain types of data, we need to think about the values we care about and identify factors worth considering.
Data location is certainly one factor. We’re also looking to rethink others, including the nature of the crime, the location of the service provider, and the location and nationality of the suspect.
Her paper provides some concrete examples and hypotheticals to show how the rules would apply in ways we may not want. In the end, it’s still going to require a certain amount of judgment by Congress constructing laws, or by courts constructing or reconstructing doctrines. She advocates thinking clearly about a range of interests and factors.
Given how little progress laws pertaining to digital rights and privacy have made (the 31-year-old Computer Fraud and Abuse Act, for example, is still used to prosecute security researchers), are you hopeful that we’ll see changes in the next 5 to 10 years? It seems like a big stretch to bet on that.
Cases like Riley and Jones suggest that the justices are at least considering how to deal with things like Fourth Amendment protections for data held by third parties. They don’t necessarily have the answers, but I think they understand that it’s a different question and certainly signal in Jones that they are thinking deeply about it.
In a case like Riley, where you have a cross-ideological coalition of justices coming together to refine best practices and provide more privacy protection for smartphones, you can see that this is at least a set of issues the justices are thinking about. I don’t know when the right case might come up, or whether it’ll be in 5, 10, or 15 years, but those two cases suggest the justices are thinking seriously about these questions. There’s at least a rich conversation going on there, not to mention among elected lawmakers.
“[B]alancing privacy and security is not a zero-sum game. We have to realize the ways technology is threatening both privacy and security.”
In almost any policymaking area, there’s a certain amount of pessimism about action in Congress and complaints about gridlock. But certain values cross ideological lines. Progressives, conservatives, and moderates understand at least some privacy issues, and they want to keep the American people safe. They also understand that coming up with proper revisions to existing laws is difficult. The details are hard. But there might be enough shared values that you can see constructive action.
What do the white-paper authors say about where digital privacy should be heading? What does the future actually look like?
We’re hoping that this series of papers can at least get enough in the bloodstream of people who are interested in these issues that it can spark a few lines of discussion. There are a couple of papers, one by Christopher Slobogin of Vanderbilt and one by Jim Harper, formerly of the Cato Institute and now at the Competitive Enterprise Institute, that really focus on what sort of legal rules should apply in the age of cloud computing.
In Jim Harper’s case, it’s some thoughts about how courts specifically can revise Fourth Amendment doctrines. In Chris Slobogin’s case, he calls for a proportional approach to balancing privacy and security in different contexts, whether it’s a large-scale program or an individualized search.
Another paper, by David Kris, the former assistant attorney general for national security, focuses on identifying multiple developments in technology and explaining their implications. His big argument is that we may all be familiar with the narrative that our privacy is declining in this age, but balancing privacy and security is not a zero-sum game. We have to realize the ways technology is threatening both privacy and security. He focuses especially on threats to security in his paper. The whole theory is that, by identifying those trends, we can get people thinking.
Jennifer Daskal’s paper tries to think constructively about how to craft jurisdictional rules that identify the proper times when governments should be able to get access to cross-border data. The hope is that all of these papers, taken together, can spark discussions along each of these lines.
What can the United States learn from studying how countries with more privacy-protective laws, including many in Europe, have handled digital-privacy issues?
The lessons to be learned from abroad, in the context of privacy and the digital age, are difficult to figure out. Europe’s right to be forgotten, for example, is in tension with strong First Amendment values here in the United States.
Our CEO, Jeffrey Rosen, wrote a book on Supreme Court Justice Louis D. Brandeis. If you try to channel the Brandeisian spirit in thinking through these problems, you would probably first look at how different balances are being struck at the state level. And you would see at least some evidence of alternatives to the third-party doctrine there.
I think that Brandeis would try to see if anything could be learned from lower courts. He would also, I would imagine, in that spirit of looking for laboratories of democracy, look at how other Western democracies might be striking a balance.
I think that people see a free-speech exceptionalism in the United States that is in tension with some of the privacy laws we see in Europe. But it’s not inconceivable to me that as technology continues to advance, we’re going to try to learn what we can from activities at the state level and abroad, even if, in the end, we reject what’s happening there and reaffirm some sort of differentiated American value.
What do you think the chances are of applying stronger legal protections to some of the more common privacy-violating practices in tech, such as overly broad end-user license agreements?
One of the white papers I haven’t mentioned to this point is by Neil Richards, a scholar at Washington University in St. Louis. He writes specifically about government surveillance, and one of the conclusions he draws is that to balance privacy and security properly, you have to balance two main sources of power: governments and companies. A proper balancing act requires policies, consumer activism, and consumer-business coalitions that figure out best practices and advocate public policy along those lines, mediating power in each of those directions.
Are you hopeful that there will be changes that favor consumers and private citizens?
We already have a deep desire to make sure that our government can keep us safe from serious threats. The pervasive use of digital, cloud-based technologies may also make us sensitive to the rising privacy concerns that come with them. We may not come up with precisely the same solutions, but we may at least have some common values and common angles in mind, in a way that isn’t quite the same in a lot of other policy areas I can think of.
If you’re telling a darker story, the widespread use of this technology could just cause us all to value our privacy less, as everything is just made public, and our expectations of privacy change. You can see some evidence of that.
Either one of those stories is possible, and the optimist in me sees people from across the ideological spectrum uniting behind reforms that can promote both privacy and security.
One of the better observations in the paper by Kris is that our increased use of these technologies—of cloud storage and also of encryption technology—creates an interesting divergence between sophisticated users who can shield their data, and the rest of us who are just sort of hoping that the digital services we’re using are protecting our stuff, and that everything will be OK.
It’s been only 10 years since the iPhone changed the way we interact with computers. Do you think it changes the calculus that judges and lawmakers, by and large, all now have smartphones too?
It naturally does. If nothing else, it gives them a commonsense appreciation of the strength of the interests on both sides of the equation. Judges, lawmakers, and people in the executive branch have an intuitive sense of the importance of law enforcement and national security as they try to come up with the proper court doctrine or the proper laws. And through their own experience with technology and brushes with cybersecurity breaches, they can appreciate the privacy interests as deeply as any of us would.
That doesn’t mean that it makes it easy to figure out how you strike the right balance between privacy and security. That’s why we did these papers—to present a variety of perspectives and interests. Based on our individual experiences, we get a better sense of the dangers to our data, how much of it is stored all around the world, and what that means.