What the European DSA and DMA proposals mean for online platforms


After a long period of consultation, in mid-December the European Commission finally presented a pair of draft laws that aim to rewrite the rules of the internet. While it will be years before the Digital Markets Act (DMA) and the Digital Services Act (DSA) come into force, the two proposals represent a major step forward in updating regulations for online intermediaries, or companies that host third-party content or sell third-party products. Although sticking points and open questions remain, the drafts show how the Commission plans to make digital platforms compete with one another and to ameliorate their potential negative impacts on consumers and society. In many respects, the DMA and DSA offer a well-balanced approach that imposes additional rules where the potential for harm to competition and consumers is highest. Nonetheless, there remains room to strengthen the provisions to ensure they actually make platform structures more transparent and competitive.

The two proposed laws address issues that interact with one another: The DMA targets the lack of competition in digital markets; the DSA is primarily concerned with transparency and consumer protection. The DMA only affects so-called “gatekeepers,” which are platforms with at least 45 million monthly active users that also meet a number of other conditions detailed in the table below. The DSA, meanwhile, applies to all intermediaries and imposes additional requirements on those used by more than 10% of EU consumers. The European Commission will be responsible for enforcing the DMA, while national regulators will be responsible for applying the rules of the DSA. Both proposals threaten violators with substantial fines: 10% and 6% of global turnover for the DMA and DSA, respectively.

|  | Digital Markets Act | Digital Services Act |
| --- | --- | --- |
| Overarching objective | To enable competition by making it easier for new platforms to enter the market | To enable transparency, user safety, and platform accountability |
| Addressees | “Gatekeeper” platforms with turnover of at least €6.5bn; activities in at least 3 EU countries; at least 45 million monthly active end users and 10,000 yearly active business users (both in the EU); having met these thresholds in the last three years. Alternatively, an investigation can determine applicability. | Intermediaries (conduit, caching, and hosting providers) and online platforms; special rules for “very large” online platforms with more than 45 million monthly active users |
| Types of provisions | 7 prohibited practices that harm competition; 11 practices that are problematic for competition and require further examination when gatekeepers engage in them | Liability rules; transparency reporting obligations; due diligence obligations |
| Enforcement | At the EU level, through the Directorate-General for Communications Networks, Content and Technology | Primarily through national regulators, aided by the newly proposed European Board for Digital Services (EBDS) as an independent advisory group |
| Sanctions | Fines of up to 10% of global turnover; structural separation in case of systematic non-compliance | Fines of up to 6% of global turnover; in extreme cases, restriction of access to platforms |

The DMA: Balance between platforms, consumers, and business users

The DMA aims to reduce the harm of concentrated digital markets by making them easier to enter and by limiting the ability of players in these concentrated markets to influence other markets. To do so, the DMA needs to target the right markets and platforms, as well as impose the right rules for them. The current proposal provides good initial answers to both challenges.

In determining the targets of the regulation, the Commission only applies the DMA to platforms that have significant numbers of users across the EU and that generate substantial revenues. If a platform does not meet the quantitative thresholds, the European Commission can initiate a market investigation to designate a platform as a gatekeeper based on additional considerations, such as user lock-in or network effects. The market investigation complements the DMA by filling potential gaps and does not enable stand-alone investigations.

In writing the DMA’s rules, the Commission distinguishes between practices that are prohibited outright and those that require closer examination. Many of the proposed prohibited behaviors have been investigated in the past. For example, competition authorities have already restricted the use of price-parity clauses (which prevent business users from offering better terms through other sales channels) by large platforms. An ongoing investigation examines Apple’s practice of tying the use of its app store to its payment and other services, and the European Commission already scrutinized certain advertising practices by Google. The major change in the DMA is that the rules render the practices illegal without requiring a lengthy investigation.

The practices that require scrutiny but are not outright prohibited are more ambiguous, as even large platforms might have good reasons to engage in them in specific instances. For example, forcing platforms to refrain from ranking their own products more favorably can have ambiguous effects and may need further specification to target only harmful behaviors. Some of the rules will require more guidance for firms to be able to apply them. For instance, access to app stores and search engine data is to be granted on fair, reasonable, and non-discriminatory terms, a concept borrowed from the licensing of standard-essential patents that has triggered an enormous volume of litigation in the past and remains far from clear.

Accounting for the competitive effects of data

The EU General Data Protection Regulation (GDPR) failed to address the importance of data for competition, and the DMA attempts to fill this gap. Several rules in the proposal attempt to level the playing field by preventing personal data from being shared inappropriately and by establishing access to data relevant for competition. Gatekeepers may not combine personal data from different sources. The GDPR generally privileges firm-internal data exchange over data exchange across companies, giving platforms with activities across many areas an edge. If enacted properly, the DMA may limit the granularity and comprehensiveness of datasets compiled by companies like Facebook and Google. Among the softer rules, the DMA requires gatekeepers to offer real-time data portability to both business and end users, real-time data access to business users, and de-personalized search engine data to any competitor.

By addressing the privacy risks of personal data and the competitive aspects of large data sets, the DMA reinforces the GDPR. This is one issue where the interests of major technology companies are not aligned with one another, as evidenced by the recent conflict between Apple and Facebook over Apple’s new privacy features that would hamper cross-app tracking on its devices. These tensions may increase the chances of the provisions being implemented and enforced as currently drafted.

A more systematic, flexible approach to opening up digital markets

As governments around the world consider measures to curb the growing power of Big Tech, breaking up these companies has emerged as a popular remedy. The recent state and Federal Trade Commission lawsuits against Facebook and Google, for instance, raised the prospect of forcing the companies to spin off certain business units or subsidiaries like Instagram (in the case of Facebook). Though this sounds like an effective option, there is serious doubt about whether splitting the companies apart would increase competition in any of the key markets. While it would certainly reduce the power of each platform in the short to medium term, the underlying market dynamics, such as network effects and economies of scale, would continue to favor concentration.

The DMA represents a major step toward counterbalancing these network effects and economies of scale in digital markets and opening them up for competition and innovation. This will allow European competition enforcers to cut short lengthy proceedings against the largest tech companies and instead ensure the minimum conditions for competition in important concentrated markets. It is challenging to define what exactly those minimum conditions need to entail, but the current list of practices seems a sensible starting point.

There are many areas that the DMA does not touch, despite being included in the Commission’s consultations on the draft. One example is merger policy. Although gatekeepers will have to provide notice of all acquisitions, there are no additional powers to investigate transactions that do not meet the current threshold for intervention. Hence, Big Tech will continue to be able to buy valuable start-ups, as both Google and Facebook have done recently.

In recognition of the limitations of hard-and-fast rules, the DMA also contains several provisions that allow for revisions and updates. For example, market investigations can be used to address services or practices not covered by the lists and a three-year periodic review of the DMA is intended to ensure it reflects the latest market developments and evidence. This is particularly important given the fast pace of digital markets where app stores and search engines may not have the same central role in five or ten years.

The DSA could be a new kind of tech regulation

The second piece of crucial tech regulation out of Brussels is the DSA. The European Commission bills it as a rebalancing “of the rights and responsibilities of users, intermediary platforms, and public authorities,” all based on “European values.” The implicit promise: In contrast to national regulations that often focus only on defining rules for removing illegal content, the DSA aims to establish a new, comprehensive transparency and accountability regime for online platforms. Within this regime, one important tenet of tech regulation remains the same, though: The liability exemptions for tech companies would still be in place under the DSA, and they would not be subject to a general monitoring obligation regarding user content. Only in certain circumstances would platforms be held liable for third-party content, for instance, when failing to act after being alerted of illegal content (i.e., the “notice-and-action” approach). A “Good Samaritan” clause similar to that in Section 230 of the Communications Decency Act is also included in the DSA, allowing platforms to conduct their own content investigations, while retaining their liability exemption.

Nonetheless, the DSA represents a strong shift from previous regulatory approaches established in the United States and the EU. Just three or four years ago, the Commission was wary of imposing any binding rules on platforms. For example, it developed a voluntary Code of Conduct against hate speech and a Code of Practice on Disinformation (which is to be strengthened within the framework of another Commission initiative, the European Democracy Action Plan). Now, with the DSA, the Commission wants to define binding rules and to sanction breaches.

Due diligence and transparency obligations—especially for large platforms

The proposed rules and sanctions are mostly aimed at establishing a minimum level of transparency, accountability, and compliance mechanisms for tech platforms, mechanisms that have been commonplace in various forms in other industries for decades. For example, platforms need to designate a point of contact within the EU. They need to publish transparency reports on content moderation, specifically regarding the number of actions taken against illegal content. The proposal also introduces EU-wide mandates on real-time disclosures for online advertising: Users must be informed that they are seeing an ad, who is behind the ad, and why it was targeted at them. Furthermore, users can lodge complaints against companies for potential violations.

Very large platforms with more than 45 million monthly active users have additional obligations to fulfill. These platforms are required to conduct annual risk assessments regarding illegal content, negative effects on fundamental rights, and intentional manipulation of their services. Crucially, they need to subject themselves to independent audits concerning their transparency and due diligence efforts. This is a major and welcome departure from previous regulatory approaches, where little, if any, independent oversight was possible. The development of this auditing proposal in upcoming negotiations will be critical to watch: Can the EU build a working, independent auditing regime for tech platforms? And can it avoid mistakes and weaknesses of existing auditing mechanisms such as the Federal Trade Commission’s “privacy audits”?

Another important proposal requires very large platforms to give regulators and scientists access to platform data via databases or application programming interfaces (APIs). For years, academic and civil society experts have been demanding such access to enable independent research into issues such as the spread of disinformation, the potential effects of blocking or deleting content, and the verification of platforms’ claims regarding content moderation.

There is still room to strengthen the DSA. The ad transparency provisions, for example, merely codify the status quo. Many large platforms already allow users to see some basic information about ads and ad targeting and have created ad databases. However, giving users greater insight and experts better options to study targeted online advertising would require improved explanations on ad disclaimers and more expansive ad databases than are currently available. Another example is the question of data access: While it is hugely important that the Commission establish a data access framework, the proposal may be too restrictive by focusing only on academic researchers and thus failing to allow third-party, independent oversight. These examples show that the DSA could be a game-changer for European tech regulation if the Commission is willing to go beyond the status quo of currently existing voluntary corporate efforts.

Beyond such specific policy questions, the overarching issue of implementation and the DSA’s relation to national laws could become a bone of contention. Some European countries, among them Austria and Germany, have introduced their own national rules on content moderation and transparency regimes for platforms. In Germany, there are now even specific rules for algorithm transparency for search engines and social networks. It is not clear yet how all these rules will relate to, or be subsumed by, the DSA. Considering the Commission’s proposal explicitly aims to establish EU-wide rules for a common digital single market, aligning national (or, in the German case, even regional) regulators with the envisioned European oversight structure will require some effort.

The uncertain road ahead

The DMA and DSA have been generally welcomed by EU member states, but numerous vague and unclear passages leave the door open for corporate stakeholders to water down the proposals. Now, the Commission, the European Parliament, and national governments will start negotiating the draft, so changes are likely. Heavy lobbying will continue to accompany the process. Civil society organizations hope to make their voices heard despite being grossly outspent by the tech industry. Many of the proposals in the DSA and DMA go in the right direction: They significantly enhance transparency and accountability and lower barriers to competition. While it makes sense to adjust some of the details and even strengthen some rules, it will be vital for European parliamentarians as well as civil society and academic actors to ensure that commercial interests do not manage to slow down the process altogether.

Aline Blankertz and Julian Jaursch are project directors at Stiftung Neue Verantwortung (SNV), a not-for-profit, non-partisan tech policy think tank in Berlin. Aline focuses on data and the platform economy. Julian works on disinformation topics and platform regulation.

Apple, Facebook, and Google provide financial support to the Brookings Institution, a nonprofit organization devoted to rigorous, independent, in-depth public policy research.