Four lessons for U.S. legislators from the EU Digital Services Act

Photo: European Commissioner for a Europe Fit for the Digital Age Margrethe Vestager and European Internal Market Commissioner Thierry Breton attend the presentation of the European Commission’s data/digital strategy in Brussels, Belgium, February 19, 2020.

Last month, the draft Digital Services Act (DSA), the long-awaited reform package for the EU E-Commerce Directive, was released to the public. While several explainers are already available, we highlight four of the most important lessons legislators elsewhere can learn from the DSA.

Choose better analogies (or avoid them altogether)

European Commissioners Margrethe Vestager and Thierry Breton introduced the two new draft regulations (yet to be debated by the European Parliament and the Council) dedicated to regulating online content and services with an interesting comparison between the internet and vehicle traffic. While a metaphor is always welcome as a way to simplify complex topics, this particular analogy suggests that the perspective used to build the DSA may be behind some of its more obvious shortcomings.

While part of the internet does rely on information transmission and content traffic, perceiving the DSA as a mere “traffic light” falls short of the intricacies internet intermediaries present, as well as the actual implications internet regulation often has for the provision of services. The main concern is that, without a clear vision of what the DSA is actually trying to accomplish on a grand scale, the patchwork approach to building the legislation shows in its confused view of the role of online intermediaries, which puts non-discrimination, free speech, and democratic participation rights in peril.

A unifying vision is important in trying to “sell” a landmark piece of policy, and while the need to update the European Union’s two-decade-old E-Commerce Directive is a good catalyst, it should not be the driving force, or even the goal, of the drafting of the DSA. The lack of an analogy doesn’t necessarily mean better policy. Ironically, trying to tie legislative reform to some immediate validation of the concern du jour overlooks the importance and staying power of such legislation and the imperative of getting it right. An example closer to home is the rather intense movement to replace or otherwise cripple liability protections for internet service providers without any overarching vision.

Asymmetric regulation, while tricky, is an important step

Talking about “the internet” encompasses so many different things that regulating it as a monolith is bound to produce negative effects. The previous approach of giving the most leeway to all the different actors within the internet ecosystem seems to be slowly on the way out, partly in recognition that it failed to offer more than baseline protection for marginalized groups, and partly because the scope of the internet and how it is used have changed dramatically. However, a blanket, one-size-fits-all approach to actively regulating internet intermediaries won’t be any more successful. Controlling Facebook’s, YouTube’s, or TikTok’s influence on democracy and society in a top-down manner that applies to all intermediaries, without acknowledging the effect of regulation on smaller companies, is destined to fail: it kills competition and will, one day, prevent new intermediaries from emerging online.

By contrast, the asymmetric regulatory perspective and the distinction between providers of intermediary services and “very large platforms,” as the DSA calls them, seem an interesting first take on the diversity of intermediaries. However, the fast evolution of the online environment means that rules separating the truly powerful and consequential from the rest may soon be outdated. The DSA’s drafters were cognizant of the different scales at which internet intermediaries operate, both in terms of content and revenue, but any legislation, particularly in the U.S., that seeks to distinguish between different types of actors has to avoid being tailored to specific services or platforms.

Limiting liability AND increasing obligations?

Provisions that enshrined immunity from liability, or even a safe harbor, were the best way to promote moderation by intermediaries: they removed the fear that good-faith efforts to monitor, flag, and delete content could be used against the company. Reform proposals should balance the need for a sense of responsibility on the part of platforms with this equally important protection against liability for attempting to maintain a better platform. The somewhat clumsy and potentially menacing way the DSA fits this square peg into a round hole is by making mandatory the transparency of content moderation policies and practices, online advertising, and algorithmic curation, as well as notice-and-action (or takedown) processes.

The DSA also argues for a duty of care whose obligations seem to be built on imprecise and overbroad language, ripe for harmful interpretations that can ultimately hurt users, or that simply set very high thresholds for what counts as an intermediary acting within this duty of care. Crucially, if a duty of care is deemed necessary for any reform legislation, it should be drafted in clear and precise language, define its relationship to due-diligence requirements (if any) and to conditions of liability, and specify how remedies can counteract the over-moderation that fear of litigation produces.

Tie up loose ends

The DSA proposal has a mixed record on making sure its concepts and perspectives are properly realized in the legislation. For instance, it very deftly aims to create a network of regulators, since coordination between competent authorities is crucial to avoid discrepancies and confusion in compliance. The unique makeup of a single market comprising more than two dozen autonomous member states makes standardization important, as the framework encourages each of them to set its own rules for processes related to illegal content.

However, the DSA’s implementation of risk assessments leaves room for improvement, as they are conducted entirely by the companies themselves. Without strong and potent oversight, perhaps in the form of a regulatory network, there is no mechanism to check the validity of the platforms’ claims. And while the DSA keeps the prohibition on national legislation requiring proactive monitoring of content, the concept of a duty of care may end up indirectly establishing such monitoring. Legislators should always look for ways in which an otherwise compelling idea can be defeated, and consider how much of the remedy should be built into the proposed legislation itself.


Facebook is a general, unrestricted donor to the Brookings Institution. The findings, interpretations and conclusions in this piece are solely those of the author and not influenced by any donation.
