Commentary

How to measure and regulate the attention costs of consumer technology

Image: The Roblox app displayed in the App Store on a smartphone, with the Roblox logo in the background.

Notifications incessantly ping our mobile and desktop screens. Algorithmic social media feeds consume vast quantities of our time. Simple online tasks require users to traverse minefields of unfavorable default options, all of which need to be laboriously unclicked. To address these daily annoyances of digital life, some might suggest updating smartphone notification settings, practicing better personal discipline, and doing less business online—in short, emphasizing personal responsibility and digital hygiene. But digital hygiene falls far short of systematically addressing the way in which technology is capturing an increasingly large share of our limited stock of attention.

Software does not get bored, tired, or overwhelmed, but we do—and when we do, software is often designed to prey on us. Unless we recognize, and potentially regulate, engagement maximization in technology, we may increasingly lose de facto ownership of our own attention through seemingly minute but pervasive digital incursions. In a white paper recently published by UC Berkeley’s Center for Long-Term Cybersecurity, I propose a two-part solution to tech’s attention problem. First, we need to measure the attention costs imposed by digital products, so as to better understand just how much tech’s engagement maximization practices are costing us as we navigate ubiquitous digital infrastructures. Second, we need to develop interventions to reduce attention costs when they are unacceptably high.

Maximizing for engagement, maximizing for attention

Digital products consume vast quantities of our attention, often for profit, as part of a larger practice the scholar Tim Wu fittingly christened “attention harvesting.” Digital distractions may summon us by our names or private interests, as with personalized advertising and behavioral targeting. Digital interfaces may trick us into taking actions we don’t intend to, through the use of so-called “dark patterns.” Many, perhaps most, consumer-facing digital technologies include design elements intended to maximally engross or distract us—a practice known as “engagement maximization.” Such engagement is defined with respect to observable metadata, such as how many times we click on content, how much time we spend interacting with it, and how often we come back for more. From such measures has grown a well-developed science of making digital products addictive.

But maximizing our engagement with digital products doesn’t necessarily increase consumer welfare and may even harm us. Researchers have shown correlations between smartphone addiction and low workplace productivity and a link between depression and use of social media networks such as Facebook, a master of engagement maximization. Digital products provide highly engineered distractions that may even undermine the physical safety of children due to digitally distracted caregivers. These harms may disproportionately impact lower-income communities, since affluent people have already begun taking steps to protect themselves from digital engagement. Of course, examples of harms do not prove that the net result of using a technology is necessarily negative, but evidence of such harms calls for more careful study. Internal company documents made public by Facebook whistleblower Frances Haugen illustrate that potential harms of large social-media platforms remain understudied, in large part due to researchers’ lack of access to quality data.

Of course, digital products can and do enrich our lives. They have been essential to maintaining our social lives, work, and education during the pandemic. But the increased use of digital products during this time has also proved a draining experience for many. Digital design paradigms simply do not do enough to reduce attention burdens on users.

Addressing attention costs

Tech has a two-fold attention problem. Sometimes technology consumes our attention excessively by design, in pursuit of profit (engagement maximization, attention harvesting, dark patterns). And sometimes digital products impose excessive attention costs through carelessness (poor product design).

The first step in addressing the attention costs of popular consumer technology is to properly measure these costs. There are several candidate metrics worth considering. One basic attention metric is the time a digital product consumes for the average user. Such information already drives how technologists design their products, so in the interest of transparency it would make sense to tell users about these costs. Smartphones have begun to offer this information in recent years through screen-time reports, but not when it is most useful: consumers need it ex ante, when they are deciding whether to use a technology. Many people might weigh the decision to download a highly engaging app differently if they were informed of the time costs it would impose. Such a metric could readily be provided to users in app stores, with reporting such as “typical daily minutes of use.”
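As a rough sketch of how such a figure could be computed, the Python snippet below aggregates per-session usage logs into an average number of minutes per user per day. The log format, the field names, and the typical_daily_minutes helper are hypothetical illustrations, not an existing app-store or operating-system API; real measurement would rely on OS-level screen-time reporting or an app’s own telemetry.

```python
from collections import defaultdict
from datetime import datetime

# Hypothetical per-session usage records: (user_id, session_start, minutes_of_use).
# Placeholder values for illustration only.
sessions = [
    ("u1", datetime(2021, 11, 1, 9, 0), 12.0),
    ("u1", datetime(2021, 11, 1, 21, 30), 25.0),
    ("u1", datetime(2021, 11, 2, 8, 15), 18.0),
    ("u2", datetime(2021, 11, 1, 13, 5), 45.0),
]

def typical_daily_minutes(sessions):
    """Average minutes of use per user-day: one candidate attention-cost metric."""
    per_user_day = defaultdict(float)
    for user_id, start, minutes in sessions:
        per_user_day[(user_id, start.date())] += minutes
    return sum(per_user_day.values()) / len(per_user_day)

print(f"Typical daily minutes of use: {typical_daily_minutes(sessions):.1f}")
```

Summing sessions into daily totals before averaging keeps a burst of short sessions from being counted differently than one long session of the same length; other aggregation choices are of course possible.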

And time isn’t the only way to measure attention costs. Four proposed attention metrics are listed below, alongside potential informational labels that could accompany them, perhaps in a smartphone app’s product description in an app store.

Advertising vulnerability: “After using this app, people are 10% more susceptible to advertising than are people who do not use this app, and as a result advertisers pay a premium to identify and advertise to people who use this app.”

Impacts on cognitive performance: “After using this app, people score 10% lower on a test of reasoning than do people who do not use this app.”

Distraction from external stimuli: “While using this app, people are less likely to notice physical threats to the safety of themselves or those around them, resulting in 10% more accidents than would otherwise occur.”

Time consumed by use of a technology: “After downloading this app, people spend 10% more time on their smartphones than people who don’t download this app.”

These examples of potential metrics were inspired by a variety of sources. Time spent in an app is already a typical key performance indicator for many mobile apps. The distraction metric was inspired by real-world scenarios, such as the accidents that led most states to ban texting while driving and concerns that the once-viral Pokémon Go app was causing accidents.
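To make concrete how a measured effect size might be turned into one of the consumer-facing labels above, here is a minimal Python sketch. The metric keys, the render_label helper, and the 10% effect size passed in are illustrative placeholders rather than validated measurements; the template strings simply echo the proposed labels.

```python
# Illustrative warning-label templates keyed by the proposed attention metrics above.
# The relative effect passed in (e.g., 0.10 for 10%) is a placeholder, not an empirical estimate.
LABEL_TEMPLATES = {
    "advertising_vulnerability": (
        "After using this app, people are {pct:.0%} more susceptible to "
        "advertising than are people who do not use this app."
    ),
    "cognitive_performance": (
        "After using this app, people score {pct:.0%} lower on a test of "
        "reasoning than do people who do not use this app."
    ),
    "distraction": (
        "While using this app, people are involved in {pct:.0%} more "
        "accidents than would otherwise occur."
    ),
    "time_consumed": (
        "After downloading this app, people spend {pct:.0%} more time on "
        "their smartphones than people who don't download this app."
    ),
}

def render_label(metric: str, relative_effect: float) -> str:
    """Fill in a warning-label template with a measured relative effect."""
    return LABEL_TEMPLATES[metric].format(pct=relative_effect)

print(render_label("time_consumed", 0.10))
```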

Using these proposed attention metrics, I conducted an online study of how information about attention costs might influence consumers’ willingness to download a smartphone app. I found that warning labels based on the proposed attention metrics influence consumers’ interest in downloading an app, all else equal. When given the most effective warnings, research participants’ stated intention to download the app decreased by more than half a point on a 5-point Likert scale. That change moved most participants from a point of neutrality (unsure whether they would download the app) to a point of avoidance (indicating they would probably not download it). If similar measures were implemented in the real world, consumers would likely change their digital consumption patterns once they knew the attention costs of their choices.
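For readers curious what such a comparison looks like mechanically, the sketch below contrasts mean stated download intent between a control group and a group shown an attention warning, on a 5-point scale. The response values are invented placeholders for illustration only; they are not the study’s data.

```python
from statistics import mean, stdev

# Hypothetical 5-point Likert responses (1 = would definitely not download, 5 = definitely would).
# Placeholder values for illustration only; not data from the study described above.
control = [3, 4, 3, 3, 4, 2, 3, 4, 3, 3]   # no attention warning shown
warned = [2, 3, 2, 3, 2, 2, 3, 2, 3, 2]    # attention warning label shown

drop = mean(control) - mean(warned)
print(f"Control mean: {mean(control):.2f} (sd {stdev(control):.2f})")
print(f"Warned mean:  {mean(warned):.2f} (sd {stdev(warned):.2f})")
print(f"Drop in stated download intent: {drop:.2f} points on a 5-point scale")
```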

The attention metrics I discuss here are just an initial proposal. Additional work in psychology, human-computer interaction, and economics is required to establish robust and externally valid attention measures. Once such metrics exist, consumers, technology designers, and regulators could respond to the attention costs they reveal.

There are a variety of ways this could happen, ranging from self-regulation to strong government mandates. As an example of self-regulation, technologists could incorporate “attention by design” into their product design practices, to ensure attention costs are imposed only where they will likely benefit users. As another example of self-regulation, technology providers (such as app stores) might consider attention labeling requirements. Such labels could be similar to privacy labels that Apple (and later Google) introduced this past year, which have served as a mechanism to make the privacy costs associated with app use more transparent and salient.

Attention metrics could also drive direct regulatory measures by the government. Legislative proposals introducing auditing requirements or impact assessments for algorithms, for example, already exist, and attention costs could factor into such audits. Alternatively, in an economy increasingly driven by competition for the scarce resource of human attention, attention costs could be taxed, both to reflect where value is created and to account for the externalities imposed on consumers by attention harvesting practices. These are viable ways that law and regulation could better reflect the technical and economic realities of attention harvesting as we experience the digital world of 2021.

In the past, concerns about technology overuse were often dismissed as a fringe topic or as evidence of moral panic. More recently, academics have begun waking up to the problem of attention as a scarce economic resource to which technology companies devote substantial engineering effort. For examples of this emerging scholarly interest, consider recent papers highlighting the attention costs imposed by consumer contracts and the antitrust implications of addictive social media.

No doubt there will be naysayers to the idea of taking affirmative steps to protect human attention. The idea that choices about technology use are a matter of personal discipline will be a popular narrative, especially from the companies that stand to profit most by continuing to digitally harvest our attention. But similar arguments about individual discipline were made in the past about seat belts and smoking, giving the lie to the myth of personal responsibility as a solution to all product design problems.

We need technologies that are designed and regulated to respect and protect our human limitations, especially our scarce and valuable attention. It’s time for attention to come into focus.

Aileen Nielsen is a Ph.D. student at ETH Zurich’s Center for Law & Economics and a fellow in Law & Tech.

Apple, Facebook, and Google provide financial support to the Brookings Institution, a nonprofit organization devoted to rigorous, independent, in-depth public policy research.