How our outdated privacy laws doomed contact-tracing apps

Contact-tracing app TraceTogether, released by the Singapore government to curb the spread of the coronavirus disease (COVID-19), is seen on a mobile phone in Singapore, March 25, 2020. REUTERS/Edgar Su
Editor's note:

Jessica Rich is the former director of consumer protection at the Federal Trade Commission and a longtime manager of the FTC’s privacy program.

The pandemic has taught us many things: How vulnerable we still are to uncontrolled disease. How divided we are politically, even when it comes to protecting our health. How much we have taken the eradication of earlier diseases, such as polio and smallpox, for granted. How much we enjoy the simple freedom of eating in a restaurant or browsing in a store. How much we rely on interaction with friends and family for our daily happiness.

I’m a privacy lawyer, so one of the lessons that I have learned from the pandemic involves privacy and the failure of contact-tracing apps. Last spring, when the disease first started its rapid spread, these apps were heralded as a promising way to control it by tracking diagnoses and exposure through self-reporting and location tracking. At that time, Apple and Google announced a joint effort to develop technology that government health departments could use to build apps for their communities, “with user privacy and security central to the design.” While these apps have had mixed success worldwide, they’ve been a huge failure in the United States. Indeed, despite early hopes and multiple efforts to implement these apps in various states and localities, Americans have largely rejected them, and they’ve played a minimal role in controlling the disease.

A key reason for this failure is that people don’t trust the tech companies or the government to collect, use, and store their personal data, especially when that data involves their health and precise whereabouts. Although Apple and Google pledged to build privacy measures into the apps’ design—including opt-in choice, anonymity, use limitations, and storage of data only on a user’s device—Americans just weren’t persuaded. For example, a Washington Post/University of Maryland survey conducted soon after the app announcement found that 50% of smartphone users wouldn’t use a contact-tracing app even if it promised to rely on anonymous tracking and reporting; 56% wouldn’t trust the big tech companies to keep the data anonymous; and 43% wouldn’t even trust public health agencies and universities to do so. By June, Americans’ mistrust had deepened: a new survey showed that 71% of respondents wouldn’t use contact-tracing apps, with privacy cited as the leading reason.
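To make the “on-device” design concrete, here is a minimal, purely illustrative sketch (in Python) of how decentralized exposure notification works in broad strokes. It is not the actual Apple/Google implementation—the real protocol uses different key-derivation and broadcast mechanics—and the function names (daily_key, rolling_ids, exposure_check) are hypothetical. The point is simply that each phone derives short-lived, anonymous identifiers from a secret that never leaves the device, and any matching against a diagnosed user’s published keys happens locally rather than on a central server.

```python
# Illustrative sketch only -- NOT the actual Apple/Google Exposure Notification
# protocol. Key derivation is simplified (HMAC-SHA256 in place of the real
# scheme), and all names here are hypothetical.
import os
import hmac
import hashlib

def daily_key() -> bytes:
    """A random per-day secret that, in this design, never leaves the phone."""
    return os.urandom(16)

def rolling_ids(key: bytes, intervals: int = 144) -> list[bytes]:
    """Derive short-lived, unlinkable identifiers from the daily key
    (roughly one per 10-minute interval) for broadcast over Bluetooth."""
    return [
        hmac.new(key, i.to_bytes(4, "big"), hashlib.sha256).digest()[:16]
        for i in range(intervals)
    ]

def exposure_check(heard: set[bytes], diagnosed_keys: list[bytes]) -> bool:
    """On-device matching: re-derive identifiers from keys published by
    diagnosed users and compare them with identifiers this phone overheard."""
    return any(heard & set(rolling_ids(key)) for key in diagnosed_keys)

# Example: Alice's phone overhears a few of Bob's rolling identifiers. If Bob
# later reports a diagnosis and his daily key is published, Alice's phone can
# detect the exposure locally -- no identities or locations are uploaded.
bob_key = daily_key()
alice_heard = set(rolling_ids(bob_key)[:5])
print(exposure_check(alice_heard, [bob_key]))  # True
```

In a design along these lines, a central server never learns who was near whom or where; it only distributes the keys that diagnosed users voluntarily share. That is the kind of assurance the companies were offering—and, as the survey numbers above show, it was not enough to win the public’s trust.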

Privacy concerns weren’t the only reason these apps failed. As experts had predicted, they failed for other reasons too, including the insufficiency of testing, the unreliability of self-reporting, and the wide and fast spread of the disease. However, Americans’ response to these apps shows that privacy now plays a critical role in their decision-making. Contrary to the longstanding argument that people say they care about privacy but act like they don’t (sometimes called the “privacy paradox”), Americans refused to use these apps in large part due to privacy concerns. Privacy really mattered.

Americans had good reason to be wary of data collection by these apps. In recent years, they’ve been victimized again and again by data breaches and other privacy abuses (including by the big tech companies) too numerous to mention. In many instances, the privacy laws in this country have failed to protect them from these abuses, whether because the abuses fell outside the limited scope of these laws, or because the laws imposed insufficient penalties or other remedies. The same dangers loomed with respect to contact-tracing apps. Indeed, as readers of this blog are likely aware, the U.S. has no baseline data protection law that would protect the sensitive data obtained through these apps.

While the U.S. has laws that protect certain data in certain market sectors, those laws have limited application here. In fact, no U.S. law of which I am aware would clearly require that all data collected through COVID-tracing apps must be stored and transmitted securely, used only for the purpose of tracking COVID, and disposed of securely when no longer needed for this purpose. Without such protections, there’s no assurance that this sensitive data won’t end up in the hands of insurance companies, employers, creditors, identity thieves, or stalkers, to be used in ways that could harm or discriminate against individuals.

For example, the Health Insurance Portability and Accountability Act (HIPAA) provides certain protections for our medical information, but only if the data is collected and used by a “covered entity”—that is, a medical provider like a doctor or hospital or a “business associate” helping to carry out medical activities. That wasn’t the case here, since state and local health departments were the ones collecting and using the data. In any event, by April, HHS had already announced that it was suspending HIPAA enforcement and penalties for many covered entities engaged in “good faith” measures to fight COVID, rendering the question largely moot and suggesting that HHS viewed its own health privacy rules as ill-equipped to deal with a public health emergency.

Other U.S. laws don’t fare much better. The FTC Act allows the FTC to challenge, usually after the fact, “unfair or deceptive acts or practices” in commerce—including material misrepresentations about data privacy or security, or data practices that cause significant consumer injury without offsetting benefits. Although this law arguably has the broadest application of any U.S. law applicable to privacy, it doesn’t come close to providing the specific protections needed here—clear limits on how (and how long) data collected through COVID-tracing apps can be used, stored, and shared. Instead, in most cases, it allows companies to decide for themselves what privacy protections to provide (or not), so long as they avoid deception and obvious forms of injury. Adding to the problem, the FTC Act doesn’t authorize civil penalties (necessary to deter wrongdoing) except in limited instances.

If the apps are used by residents of particular states or localities, then state or local laws may apply. However, a quick look at the leading state law (the California Consumer Privacy Act, or CCPA) is not promising, since it does not apply to the government agencies that build and use these apps. Even if the CCPA did apply, a law that protects residents of only one state hardly provides the privacy assurances needed for broad nationwide uptake of contact-tracing apps.

Months into the pandemic, members of Congress attempted to fill this legal void with yet another narrow, situation-specific law. In May and June, after contact-tracing apps had already been developed and deployed in certain localities, a number of senators rushed to circulate draft bills to regulate the apps and the sensitive data they collect. Some of these bills had serious flaws and gaps, and none of them made any headway in Congress.

So what lessons can we take away from this experience? First is the obvious lesson, highlighted above, that concerns about privacy fueled public mistrust of these apps and helped to ensure their failure. For years, proponents of strong privacy laws have argued—often to a skeptical industry audience—that robust protections are needed to maintain consumer trust in the marketplace. The failure of contact-tracing apps is a concrete example.

What happened here also reminds us that clear standards governing data use should not just be viewed as a restraint, but also as a way to enable responsible uses of data, including data use for emergencies. Properly crafted, a privacy law should govern and guide our everyday data practices, as well as how we use personal information to fight a pandemic.

In addition, our experience here aptly illustrates the chaos and confusion we routinely face as we try to manage privacy in the U.S.—a patchwork of laws that leaves much of our data unprotected, uncertainty about what laws apply, hasty efforts to fill the gaps in the heat of the moment, and eroding public trust.

Taken together, all of these lessons lead us back to the same conclusion that was the topic of my earlier blog post—that we need a baseline federal privacy law to establish clear and enforceable privacy rules across the entire marketplace, one that protects our personal information in good times and in times of crisis.    


Apple and Google are general, unrestricted donors to the Brookings Institution. The findings, interpretations and conclusions in this piece are solely those of the author and not influenced by any donation.
