To combat Covid-19, contact-tracing apps must include verifiable privacy

The lack of trust in tech solutions is causing well-meaning contact-tracing efforts to fail before they even see the light of day. And without effective contact tracing, the United States will continue to lag behind the progress other countries have made in stopping the spread of Covid-19.

A 45-member task force formed in mid-March, composed of the White House’s chief technology officer, Silicon Valley tech leaders, epidemiologists, and public health experts, among others, fizzled in just two months. Before disbanding, its members found that discussions about preventing the spread of the virus were quickly drowned out by concerns over privacy and regulatory issues.

Whereas taking preventive measures, supporting the community, and staying healthy once dominated news headlines, the media landscape surrounding contact tracing is now filled with stories about privacy concerns and preserving individual rights (PDF), along with a public assumption that these efforts will fail.

Contact-tracing initiatives that have emerged in the wake of this pandemic bring with them the promise of identifying new infections, tracking the spread of the disease, and ultimately aiding in the reopening of our economy. Experts say a successful contact-tracing program is critical and must be used by at least 60 percent of the population to be most effective. Yet individuals are hesitant to sign up because they have valid concerns about their privacy.




Tech companies—most notably Apple and Google—and public-health authorities across the world have joined together to build contact-tracing platforms capable of scaling to the necessary level. These technologies range from coarse Bluetooth-based proximity tracking that alerts you if you’ve been near potentially infected people, to more granular tracing that ties directly to individual medical records and information. However, of the 82 percent of American adults with smartphones, half said they probably or definitely would not use a contact-tracing app.

Why are only an estimated 41 percent of potential participants willing to participate? What’s missing? These apps lack privacy, in the form of verifiable trust.

Every voluntary contact-tracing effort will need to involve trust: trust that the data will be used only for its intended purpose and not, for example, to let the government spy on its citizens, and trust that the data is in safe hands, a challenge “big tech” faces given its record on everything from targeted advertising to “AI training.” This trust must also be verifiable on demand.

It is not enough to say that an app is safely securing individuals’ sensitive data; the app must show how the data is being used. Individuals must have the ability to see who has accessed their personal data and for how long. If verifiable trust is not provided within contact-tracing apps, we will fail.
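To make the idea of verifiable trust concrete, here is a minimal sketch, in Python, of the kind of per-user access log a contact-tracing app could expose. The record fields, class names, and example accessors are hypothetical illustrations, not drawn from any existing contact-tracing platform or API.

```python
from dataclasses import dataclass, field
from datetime import datetime, timedelta
from typing import List


@dataclass
class AccessRecord:
    """One entry in a user's personal audit trail (hypothetical schema)."""
    accessor: str          # who touched the data, e.g. a health department
    purpose: str           # declared reason for the access
    accessed_at: datetime  # when the access occurred
    duration: timedelta    # how long the data was held open


@dataclass
class AccessLog:
    """Per-user log that lets an individual see every access on demand."""
    user_id: str
    records: List[AccessRecord] = field(default_factory=list)

    def record_access(self, accessor: str, purpose: str,
                      accessed_at: datetime, duration: timedelta) -> None:
        self.records.append(AccessRecord(accessor, purpose, accessed_at, duration))

    def report(self) -> List[str]:
        """Human-readable answer to 'who accessed my data, and for how long?'"""
        return [
            f"{r.accessed_at:%Y-%m-%d %H:%M}: {r.accessor} accessed data for "
            f"'{r.purpose}' ({r.duration.total_seconds() / 60:.0f} min)"
            for r in self.records
        ]


# Example: the app shows the user their own audit trail.
log = AccessLog(user_id="user-123")
log.record_access("County health department", "exposure notification",
                  datetime(2020, 7, 1, 14, 30), timedelta(minutes=5))
for line in log.report():
    print(line)
```

The point of the sketch is simply that visibility is a feature the app itself must provide, not a policy promise buried in a terms-of-service document.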

I’ve seen firsthand the effects of not taking privacy into consideration when introducing policies and programs designed to protect Americans. As lead White House technology adviser during and after the September 11, 2001, attacks, I remember all too well the number of early decisions that had significant, unintended negative consequences on privacy—and that are still felt today in the form of surveillance and data access extending far beyond tracking terrorism.

Nineteen years later, lawmakers considered an amendment to the USA Patriot Act designed to give privacy back to Americans, but it ultimately could not get through the House of Representatives. That is 19 years too late; we cannot afford to wait to embed privacy into the government’s response to another national crisis.

Privacy is a human right, grounded in trust, and there is no sustainable solution to Covid-19 without it. As contact-tracing solutions become more widely available, maintaining the privacy of all participating citizens must be given careful consideration. In January, Secretary of Health and Human Services Alex Azar called for a “completely different” health care system, with the patient “at the center and in control, with seamless access to the data you need to make decisions.” Recent technological advances make this possible, but verifiable trust is the key to success.

I am encouraged by the Exposure Notification Privacy Act, introduced in June, but I urge lawmakers not to take their foot off the gas. A significant step in the right direction, the bipartisan bill would ensure that tracking is neither forced on any individual nor sold for commercial use, and that “strong” security safeguards are in place.

To effectively curb the spread of Covid-19, contact-tracing apps must give individuals the power to approve or revoke access to their sensitive information at any time, as the sketch below illustrates. We must also not ignore the need to sunset data generated during the pandemic after it has ended. An exit strategy is required if privacy is to be preserved and trust is to be restored.
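As a rough illustration of what revocation and a data sunset could look like in code, the sketch below pairs user-controlled consent with an automatic cutoff date. The class, field names, and dates are placeholders I am assuming for illustration; they are not requirements from the bill or from any actual platform.

```python
from dataclasses import dataclass, field
from datetime import date
from typing import Dict


@dataclass
class ConsentStore:
    """Tracks which parties a user currently allows to access their data,
    and cuts everything off after a pandemic-end 'sunset' date (hypothetical)."""
    sunset_date: date                      # e.g. the declared end of the emergency
    grants: Dict[str, bool] = field(default_factory=dict)

    def grant(self, accessor: str) -> None:
        self.grants[accessor] = True       # user opts a party in

    def revoke(self, accessor: str) -> None:
        self.grants[accessor] = False      # user withdraws access at any time

    def is_allowed(self, accessor: str, today: date) -> bool:
        # After the sunset date nothing is accessible, regardless of consent.
        if today > self.sunset_date:
            return False
        return self.grants.get(accessor, False)

    def purge_if_sunset(self, today: date) -> None:
        # Exit strategy: delete all consent state once the sunset date passes.
        if today > self.sunset_date:
            self.grants.clear()


# Example: access ends automatically once the sunset date passes.
store = ConsentStore(sunset_date=date(2021, 12, 31))
store.grant("County health department")
print(store.is_allowed("County health department", date(2021, 6, 1)))   # True
print(store.is_allowed("County health department", date(2022, 1, 15)))  # False
store.purge_if_sunset(date(2022, 1, 15))
```

The design choice worth noting is that the sunset is enforced in the access check itself, so expired data cannot be read even if a deletion job has not yet run.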

We are in this together, but 4 in 10 U.S. adults willing to use contact-tracing apps is not enough. We can do better. Americans deserve to have access to technology that ultimately seeks to protect them and their health. They deserve to feel confident in the privacy of their most sensitive personal data. Most importantly, Americans deserve to feel that their civil liberties are protected, even in times of crisis.