Commentary

Trade secrets shouldn’t shield tech companies’ algorithms from oversight

May 4, 2020

Technology companies increasingly hide the world’s most powerful algorithms and business models behind the shield of trade secret protection. The legitimacy of these protections needs to be revisited when they obscure companies’ impact on the public interest or the rule of law. In 2016 and 2018, the United States and the European Union each adopted trade secret protection laws that fail to clarify exemptions for algorithms and other digitized processes used by governments but built by private companies.
It is abundantly clear that digitization often means privatization. Algorithms are not merely the secret sauce that allows technology companies to make profits. They form the very products and architecture of our entire information ecosystem, affecting economic and democratic processes, fundamental rights, safety, and security. While governments depend on commercial companies for ever more services and products, oversight has not kept pace. The pressure on fundamental rights is significant. To rebalance, laws may need to be modernized, and for that, citizens, politicians, and regulators also need meaningful access to information.
There is heated debate over whether new regulations are needed to ensure that new technologies and innovation can thrive while respecting democratic rights. Yet this debate has overlooked the application and enforcement of existing regulations. While trade secrets significantly benefit technology companies, there has been little public and political debate about the impact on oversight when governments outsource tasks to these companies. If we allow trade secret protections to circumvent public scrutiny, more and more digitized and automated processes will take place in black boxes, eroding the rule of law.
Conventional wisdom suggests that laws should apply online as they do offline, but there are also situations where digitization creates new and specific contexts. The massive impact of algorithmic processing will likely increase exponentially with the rise of artificial intelligence.
In order to assess whether principles such as fair competition, non-discrimination, free speech, and access to information are being upheld, the proper authorities need to be allowed to look under the algorithmic hood. The level of scrutiny an algorithm faces should be determined by the scale of the data it processes and the company’s impact on the public interest.
This is not revolutionary: The recipe of Coca-Cola does not need to be published on the front page of the New York Times for regulators to assess its health impacts.
A middle way can and should be found between publishing the details of a business model for everyone to see and shielding it from scrutiny altogether: applying oversight to algorithms when the processes involved may have a significant impact. Moreover, given the impact on democratic values and principles, a model for the democratic world that ensures trade secrets do not trump the public interest in oversight and accountability is urgently needed. Ideally, the United States and the European Union would take the lead in setting a new standard together; at the very least, they should clarify and update trade secret exemptions when democratic oversight is at stake.
Marietje Schaake is the international policy director of the Cyber Policy Center at Stanford University and a former member of the European Parliament.