Companies, not people, should bear the burden of protecting data

Photo caption: Traders work at Bloomberg terminals on the floor of the New York Stock Exchange, May 13, 2013. Bloomberg LP customers, including the U.S. Federal Reserve and the U.S. Treasury, examined whether there could have been leaks of confidential information, even as the media company restricted its reporters' access to client data and created a position to oversee compliance in a bid to assuage privacy concerns. Bloomberg has more than 315,000 terminal subscribers globally, with each terminal costing more than $20,000 a year. REUTERS/Brendan McDermid

Privacy isn’t dead, as some would suggest, but consent is. When was the last time you read a privacy policy for the apps on your mobile phone? Did you know that apps have privacy policies? How about reading the cookie notice on the web pages you visit? Or the privacy notice on Internet-of-Things devices like your baby monitor?

Let’s face it: almost nobody spends time reading privacy notices, and if you did, it would take an estimated 76 work days per year to get through them. Despite this, privacy policy in the European Union and the United States is largely based on the myth that people read these notices and make informed decisions about how their data will be used, disclosed to third parties, and retained. Privacy notices are filled with legalese that often grants the company free rein to use your data. It’s time to move beyond consent.

When we go into a restaurant, we do not inspect the kitchen to make sure hygiene standards are being met; we expect the government to impose reliable food safety requirements. Likewise, we do not consent to credit bureaus’ collection and use of our data. Instead, laws require bureaus to limit access to our data and correct any disputed errors. Why shouldn’t the data we provide to companies, websites, and apps be automatically protected by law?

Data protection beyond consent

We need a new 21st century paradigm for protecting our data that doesn’t turn on consent. Our work at the Consultative Group to Assist the Poor (CGAP) is focused on expanding access to digital financial services for the poor in developing countries who face literacy challenges, a multitude of native tongues, and technological limitations. Policing how their data is used by multi-billion-dollar corporations would unfairly add to these burdens. Developing countries have the unique opportunity to leapfrog the Western world and adopt modern data protection legislation.

New legislation should put the burden on companies to only use, disclose, and retain consumer data for legitimate purposes. This would mean that consumers’ personal data could only be processed in ways that are consistent with reasonable expectations formed in their relationships with companies. This can be accomplished by limiting providers to the collection, creation, use, and sharing of the data necessary for or compatible with the services being provided. When no longer necessary for those legitimate uses, data should not be retained in identifiable form.

Government-issued implementing regulations, agency enforcement actions, and business and consumer education could clarify a new law’s requirements. Furthermore, obtaining individual consent could not override this “legitimate purposes” approach. People would benefit from legitimate purposes protections, regardless of which boxes they were required to check before accessing a website, downloading an app, or using a digital service.

Legitimate purposes for collecting and using data would include servicing accounts, fulfilling orders, processing payments, making sure a site or service is working properly, quality control, security measures, auditing, and other activities driven by evolving business models. An advantage of this approach is that it allows providers to establish the scope of permissible data uses depending on the products or services they choose to offer. More information can be used to provide financial planning advice than to fulfill a request for shipment of a single product. A social networking site could post data so that a customer’s connections could see her postings. The site could also use the data for targeted advertising that allows the site to be offered without charge.

Recently in India, researchers discovered that an app that streamed stories and music was also collecting sensitive data, including location information, for use by lending companies in evaluating credit applications. A legitimate purposes approach would not permit such unrelated and undisclosed secondary use of personal data. Of course, the Cambridge Analytica scandal is another glaring example of data being collected for a legitimate purpose (social networking) but being used for an illegitimate purpose (influencing elections). A legitimate purposes approach would retain other legal disclosure and retention obligations while permitting disclosure of information necessary to protect someone whose health or safety was threatened.

Allowing for innovation

A legitimate purposes test would not stifle innovation. Data could still be used for wide-ranging purposes if it was robustly de-identified, aggregated, or otherwise stripped of personally identifiable information to reduce the risk of its being used in harmful ways. Thus, the data could be used to improve the provider’s business operations, develop new products and services, and improve risk assessment without impinging on customer privacy. While much has been written about the ease of re-identifying information, in this context efforts to re-identify could be prohibited by law, both for the company and for any other entities that might receive the information. Privacy-enhancing tools have also been developed to make data useful while still protecting individual privacy.

A legitimate purposes approach would be more protective than the EU’s General Data Protection Regulation (GDPR). The GDPR imposes data use limitations: it provides that information be collected for specified, explicit, and legitimate purposes and prohibits processing in a way incompatible with those purposes. However, there is a significant difference between requiring that uses be “compatible” under a legitimate purposes approach and the GDPR’s “not incompatible” standard. Compatibility puts the burden on providers to establish a direct link between a use and the purpose of collection; a “not incompatible” standard would seem to permit uses more broadly, so long as providers could show they did not contradict the purpose of collection. Moreover, according to the UK Information Commissioner, the GDPR’s purpose limitations can be overridden by obtaining individual consent. Consent isn’t likely to work any better later in the process than it does up front.

While the shortcomings of consent are often acknowledged, the response is usually a push for more and better consent: layered consent, simplified consent, just-in-time consent. These approaches cut against people’s desire to get information or entertainment, conduct transactions, or play games with as little interference as possible. Time and again, consumers make clear that what they want is more protection, not more consent.