
The partial data barter trades of the digital economy


In her exploration of “surveillance capitalism,” Shoshana Zuboff (2019) sheds light on the pervasive use of data by companies, vividly characterizing it as “an economic system built on the secret extraction and manipulation of human data.” Her sentiment is echoed in a December 2021 survey of Americans’ perceptions of the fairness of firms’ use of their personal data. The findings reveal a striking divide: six out of 10 consumers asserted that companies benefit more than individuals in the exchange of personal data, while a mere 15% of respondents believed that customers themselves profit from this transaction. This stark contrast underscores a growing sense, echoed by lawmakers, that companies should pay consumers to use their personal information.

But what if consumers are already, unwittingly, being paid for their data? In some instances, there is clear data barter. Data barter refers to the exchange of goods or services for data, without the use of money. Free digital apps are an obvious example.

While data barter is substantial, an even larger value of data transactions has gone unrecognized: partial data barter. Partial data barter arises when firms provide consumers with valuable goods or services in return for some monetary payment and for harvesting their usage data. In this scenario, data is part, but not all, of the payment the firm receives. In the data economy, goods and services are increasingly partially bartered for data. We explain why this phenomenon arises and what its consequences are for measuring the size of an economy, for assessing firm competition, and for privacy regulations. In an economy with partial data barter, consumers are paid for their personal data. The payment is the price discount—the difference between the actual price and what the price would have been if no data were exchanged.
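To make the accounting concrete, the sketch below (in Python) works through a single hypothetical transaction; the prices and data value are illustrative assumptions, not estimates from this article.

    # Hypothetical transaction illustrating partial data barter.
    no_data_price = 10.00   # price the firm would charge if no data changed hands (counterfactual)
    actual_price = 9.25     # discounted price the consumer actually pays when sharing data

    # The consumer's implicit payment for sharing data is the price discount.
    data_payment = no_data_price - actual_price    # 0.75
    # The firm's total compensation per transaction is money plus the value of the data.
    total_received = actual_price + data_payment   # 10.00

    print(f"money: ${actual_price:.2f}, data: ${data_payment:.2f}, total: ${total_received:.2f}")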

This article explores the logic, evidence, and consequences of partial data barter. We argue that most firms are not stealing data; they are paying for it with price discounts. Policies that prohibit uses of consumer data may therefore impose costs on consumers who prioritize low prices over privacy. The policy debate should shift from “data theft” and prohibitions to differential pricing for privacy and the terms of data trade.

Why many goods are sold at a discount

In the case of free digital goods, there is an obvious barter trade. Tett (2021) describes how consumers give up their data in return for free maps, free search services, or free use of a social network site. None of these services are genuinely free. They are provided in return for consumers’ data and attention.

Many of these free digital goods and services are offered by platforms operating in two-sided markets. As is typical in these markets (Rochet and Tirole 2006), there are incentives to subsidize one side to draw more customers onto it (in this case, the users of free digital services), which in turn attracts the other side (in this case, firms that wish to access those customers) and raises its willingness to pay. Thus, firms’ motives for and benefits from barter trade may well extend beyond the collection and use of consumer data.

Two recent studies measure the value of digital goods to consumers. Brynjolfsson, Collis, et al. (2023) estimate the value of free digital goods to consumers: the ten digital goods selected in their study were valued at $2.52 trillion in welfare across 13 countries—the value per country ranged from $13 billion in Romania to $1.29 trillion in the U.S. For the United States, the value derived from free digital goods would comprise 5.5% of total U.S. economic output if it were counted. Such goods are not counted in the Gross Domestic Product (GDP) measure of the economic output of a country. The reason they are not counted is that they have a price of zero, and GDP weights each product by its price. In another study, Brynjolfsson, Kim, et al. (2023) measure the consumer surplus of free goods on the internet and find them to be worth about $38 billion per year in the United States, equivalent to approximately 0.29% of the annual GDP. Although estimates vary widely, free digital goods are not trivial, nor are they an overwhelming fraction of U.S. economic activity.

While the barter of free digital goods has been widely discussed, zero-price goods and services may be only the tip of the iceberg when it comes to data barter. Far more goods and services may be partially bartered for data. For example, some customers at Whole Foods supermarkets are offered the option to use their phone to scan a QR code that links their grocery purchases with their online profile. In return, they receive a discount on their groceries. When a customer allows their data to be used, it lowers their price but does not lower the value of the groceries. Instead, the customer pays the difference in price with their data. Frequent flier programs and loyalty cards are data collection devices that link purchases to a customer; they offer similar reductions in price in return for the data collected, among other reasons. Credit card companies earn fees from merchants but also obtain data from consumer transactions, and the cash back and points some consumers receive are, in part, refunds for that data. A small industry is arising to manage these exchanges. Grand View Research estimates that the 2023 global market for loyalty management services was $10.67 billion. Given that data uses extend far beyond loyalty management (Baley and Veldkamp 2024), the value of the data being managed must be far larger still.

Firms do not need to care about charity, fairness, or customer happiness to give price discounts in return for customers’ data. These discounts, or partial data barters, result from firm profit maximization. A profit-maximizing firm that values customer data, either because it can use the data or sell it, will try to acquire more of that data. Acquiring more data typically involves more customer transactions. Firms of all kinds would like to sell more and have more customer transactions. How can a firm sell more? Have a better product or a lower price. Either is equivalent to a lower price per unit of product quality. In other words, firms that want to collect customer data will give discounts to collect more data. These discounts are the partial barter of data.
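A minimal sketch of this logic, assuming a simple linear-demand monopoly model that is not taken from the article: if each transaction’s data is worth some amount to the firm, the profit-maximizing price falls below the no-data price, and the gap is the data discount.

    # Simple monopoly pricing with linear demand q = a - b*p (illustrative parameters).
    # The firm's per-sale payoff is (p - cost + data_value), so valuing data acts like a lower cost.

    def optimal_price(a, b, cost, data_value):
        # Maximizes (p - cost + data_value) * (a - b*p); the first-order condition gives:
        return a / (2 * b) + (cost - data_value) / 2

    a, b, cost = 100.0, 2.0, 10.0
    p_no_data = optimal_price(a, b, cost, data_value=0.0)    # price if the data were worthless
    p_with_data = optimal_price(a, b, cost, data_value=4.0)  # price if each sale's data is worth 4
    print(p_no_data, p_with_data, p_no_data - p_with_data)   # the discount here equals data_value / 2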

However, barter may still not be fair to consumers. Even if consumers are paid for their data, the exchange may still be a bad deal. Firms could be using market power to extract surplus from the data barter exchange. To measure firms’ market power, economists typically examine the firm’s markup: the price of the good it sells divided by the cost of producing that good. Firms with high markups are often presumed to be using their market power to extract consumer surplus, though some argue that markups compensate firms for risk or provide incentives to innovate (e.g., Romer 1990; Eeckhout and Veldkamp 2023; Tonetti, Maglieri, and Di Tella 2024). But if the price consumers pay is not the full value of the transaction, then markups are underestimated. Firms may have more market power than we believe: not only are some firms’ prices high relative to their costs, but those firms also receive value from the data they collect. A firm that provides free or inexpensive goods to consumers may exert market power if the value of the data it collects is high relative to the value of the product it delivers.
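A hypothetical illustration (the numbers are assumptions, not estimates): once the value of the data the firm receives is counted, the effective markup is higher than the measured one.

    # Measured vs. data-inclusive markup (hypothetical numbers).
    price, cost, data_value = 9.25, 7.00, 0.75

    measured_markup = price / cost                    # what standard price data would show (~1.32)
    effective_markup = (price + data_value) / cost    # counting the data the firm also receives (~1.43)
    print(f"measured: {measured_markup:.2f}, data-inclusive: {effective_markup:.2f}")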

GDP mismeasurement

Partial data barter changes how we should think about GDP measurement, firms’ market power, and privacy regulations. When computing aggregate economic measures like GDP, economists value a good or service according to the price paid for it. If most goods and services are partially paid for with data transfers, i.e., if firms are lowering their prices a little to encourage more transactions and more data accumulation, then the actual value being paid for the item sold is the monetary price plus the value of the data transferred.
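The stylized sketch below (hypothetical goods and data values, not measurements) shows how price-based accounting drops pure barter entirely and undercounts partial barter.

    # Price-based GDP accounting vs. a data-barter-adjusted tally (hypothetical numbers).
    transactions = [
        {"good": "search query", "price": 0.00,  "data_value": 1.50},  # pure barter: invisible at a zero price
        {"good": "groceries",    "price": 9.25,  "data_value": 0.75},  # partial barter: undercounted
        {"good": "haircut",      "price": 30.00, "data_value": 0.00},  # ordinary cash sale: fully counted
    ]

    measured_output = sum(t["price"] for t in transactions)                    # 39.25
    barter_adjusted = sum(t["price"] + t["data_value"] for t in transactions)  # 41.50
    print(measured_output, barter_adjusted)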

The astronomical market valuations of data-intensive companies suggest that the value of data transferred is enormous. Firms collect a vast amount of data about their consumers. American companies spent over $19 billion in 2018 acquiring and analyzing consumer data. According to Tucker and Neuman (2020), collecting and selling customer data generates as much as $200 billion in value. While this hints at the outsized role that data plays in the modern economy, neither data sales nor the cost of data workers fully captures the value of a firm’s data. However, these partial estimates do suggest that incorporating partial data barter into economic measurement could significantly alter our measurement of economic activity.

Consumer costs and benefits of privacy policy

In the data privacy debate, partial data barter offers a new lens through which to view the trade-offs. Recent data regulations restrict firms’ ability to collect or monetize data. In the EU, the General Data Protection Regulation (GDPR) provides individuals with certain rights over their personal data, including the right to know how their data is being used, the right to object to the processing of their data, and the right to have their data deleted. The California Consumer Privacy Act (CCPA) is a state law that gives California consumers the right to know what personal information is collected about them, the right to delete personal information collected from them, the right to opt out of the sale or sharing of their personal information, and the right to non-discrimination for exercising their CCPA rights. China’s Personal Information Protection Law (PIPL) requires consent as its principal basis for data collection and handling, introduces provisions with extraterritorial effect, and restricts cross-border data transfers.

These laws make sense if firms are taking customers’ data without providing anything in return. However, if we view customer data as a form of barter payment to firms, then these laws, in part, prevent firms from accessing or monetizing customer payments. Ensuring customer consent makes sense. But limiting firms’ ability to offer different terms to customers who do not pay with their data than to those who do is like requiring them to provide the same service to customers who pay with money and to those who do not pay at all. Such restrictions are likely to reduce the quality of the goods and services offered in return because they reduce the benefit to firms from offering quality products. Quan (2023) finds that GDPR led to a 6% drop in user ratings of digital products in the EU and that this decline in product quality reduced overall welfare, even after accounting for the utility consumers gain from privacy, given estimates of their privacy preferences. We recommend that privacy policy explicitly allow for differential pricing of transactions, depending on whether they are privacy protected or not. Of course, when one person shares data, they may reveal traits or preferences of others who are similar to them. Acemoglu et al. (2022) show that such external effects of data sharing make data pricing imperfect. Yet, even with such data leakage, differential pricing would still give consumers more control over their data than the current U.S. regime outside of California affords.

The harms from the loss of data privacy typically fall into four categories. One is the fear that governments will use data to control or police their citizens. However, government use of data could be regulated separately without impeding private data commerce. A second fear is identity theft, stalking, or other criminal activity. Yet, enforcement against digital crimes could be stepped up without prohibiting the use or trade of data. A third category of harm is embarrassment. But just as our homes are special locations for private behavior, we could strengthen rules about incognito browsers and special settings where privacy is protected. Finally, data is used to target ads. This targeting can take the form of psychological manipulation, where data is used to predict what triggers will induce people to act in a particular way, whether that be voting or purchasing a product. This harm is the most difficult to regulate because it is not obviously different from using data to efficiently match people with products they would enjoy.

What typically distinguishes manipulative targeted ads from helpful information is the way in which manipulative content holds our attention. In a 2021 Brookings article, Aileen Nielsen describes attention harvesting, which fuels digital addiction, and the “dark patterns” used for customer manipulation. She proposes several metrics of nefarious digital activity that could form the basis for customer warnings or digital product safety guidelines. None of these proposed measures involves prohibiting the use of data.

Large data sets and the more accurate predictions they enable help firms make fewer mistakes. When firms’ predictions are more accurate, fewer resources are wasted, people are matched with more useful products, and costs are lower, resulting in lower prices (Baley and Veldkamp 2024). Good prediction enables efficient outcomes. Of course, data also makes bad behavior more efficient. However, prohibiting the trade or collection of data because it enables harmful activity is like prohibiting trade in books because books can teach people to build bombs. Instead of prohibiting the use of an entire category of information, we should identify specific harms and regulate those. We would never prohibit the use or trade of something that offered so many benefits in other domains. Why would we do so for data?

Conclusion

Firms that value data maximize their profit by lowering prices to lure more data-generating transactions. The unobserved nature of these price discounts causes governments to underestimate economic activity and leads some consumers and policymakers to believe their data is being stolen. Instead of prohibiting the harvesting, use, or sale of consumer data, regulations should focus on making the data discount explicit. By encouraging firms to set differential prices for transactions that are privacy protected and those that are not, policy could foster price competition over the benefits consumers receive for sharing their data. This may not solve all harms. But it would protect the efficiency benefits of data, allow the privacy-conscious to maintain anonymity, and allow others to choose whether to benefit from the use of their data.
