Commentary

Senator Amy Klobuchar seeks to quell health misinformation on social media


Like Howard Beale, the angry news anchor in the film Network, Senator Amy Klobuchar is mad and she’s not going to take it anymore. She is fed up with Facebook’s failure to control the spread of health misinformation on its social network. Last week, she introduced a bill to do something about it. Under her proposal, Facebook would lose its immunity from lawsuits under Section 230 of the Communications Decency Act if it algorithmically promotes health misinformation, as defined by the Department of Health and Human Services (HHS), during a health crisis.

It is vital to understand that Klobuchar’s proposal does not make it illegal for Facebook or any other tech company to distribute health misinformation on its systems. Under the bill, HHS would be authorized to define what counts as health misinformation, but the bill would not make it a violation of law to distribute material meeting that definition. Instead, she proposes that if an underlying law makes it illegal to distribute health misinformation, then a lawsuit against Facebook or any other company could proceed by alleging a violation of that law. Under current law, Facebook and other tech companies would have immunity from any such lawsuit under Section 230 of the Communications Decency Act.

Of course, there is no such underlying law that bans the distribution of health misinformation. It is perfectly legal to distribute health misinformation today, and it would remain legal if the bill became law tomorrow. If there were an underlying law making the distribution of health misinformation illegal, a lawsuit against Fox News alleging a violation of it would have been filed long ago. But no such lawsuit against Fox has materialized because there is no cause of action against a media company for distributing health misinformation. If Senator Klobuchar’s bill became law, neither Facebook nor any other tech company would have anything to fear. Legally, the proposed law is an empty gesture.

So, what’s going on? In policy terms, this is an effort to shame Facebook into doing more to suppress dangerous misinformation. Senator Klobuchar is also making a political point. Like President Biden, she is seeking to convince the progressive elements of the Democratic base that this Administration and Congress are on board with their tech regulatory agenda. It is part of the same signaling that made Lina Khan head of the Federal Trade Commission and Jonathan Kanter the nominee to lead the Antitrust Division.

The Klobuchar bill could have gone further. It could have created a new cause of action making it illegal for any person to distribute, facilitate, or promote the distribution of health misinformation, as defined by HHS, during a medical crisis, as also defined by HHS. Creating a new cause of action is what Congress did when it created an exception from Section 230 immunity for facilitating or promoting sex trafficking.

This new cause of action would have given teeth to the bill’s removal of Section 230 immunity. It also would have had the advantage of applying to traditional media as well as online platforms.

But that approach would have created First Amendment issues. The distinction between publicly airing a legitimate scientific disagreement about the effects of vaccines and conducting a willful disinformation campaign to undermine public health is razor thin. Do we really want a government agency to define that line between scientific truth and falsity in a way that has legal consequences?

If a bill containing such a new cause of action ever passed into law, it would face a constitutional challenge and be subject to what is called “strict scrutiny”: to survive, the government would have to show that such a content-based restriction on speech was narrowly tailored to achieve a compelling government interest. As the well-known legal adage has it, this kind of First Amendment scrutiny is strict in theory but fatal in fact. Under existing First Amendment jurisprudence, the courts would almost certainly strike down such a measure as unconstitutional.


What can be done about health misinformation then? Facebook’s reaction to the Klobuchar bill recognizes that clarification on questions about health misinformation would be helpful. But crafting a good health misinformation policy is difficult. If the Biden Administration or members of Congress had any good regulatory ideas to reduce the spread of misinformation, we would have heard about them by now. Outsiders suggest various techniques such as slowing the velocity of new or suspect information in the hopes of reducing the reach of misinformation. But no one really knows what works, largely because the social media companies have all the information and refuse to share it with outsiders. Perhaps a way forward would be legislating transparency requirements to let regulators and outside researchers study how well the companies are doing in combating misinformation through content moderation and other initiatives.

Such a slow accumulation of public knowledge through transparency might point the way to an effective strategy. It is not a cure for today’s misinformation about COVID-19. But perhaps policymakers should be aiming for a longer-term fix that puts in place institutional mechanisms to regulate the conduct of social media companies in the public interest. Substantially more transparency about what they are doing to protect the public from health misinformation would be one element of these public interest duties.

Facebook is a general, unrestricted donor to the Brookings Institution. The findings, interpretations, and conclusions posted in this piece are solely those of the author and not influenced by any donation.