This is a Brookings Center on Regulation and Markets policy brief.
Political awareness of the national security issues posed by TikTok, the popular social media app owned by Chinese company ByteDance, has been increasing. Lawmakers and the public are right to be concerned: the risks range from ByteDance’s connections to the Chinese Communist Party, to the platform’s potential for social manipulation, to its gathering of Americans’ personal data. Some efforts have been made to limit these risks, such as storing TikTok data on U.S. (rather than Chinese) servers. However, more could be done. Lawmakers should consider additional measures, including the forced sale of ByteDance’s U.S. operations or even a complete nationwide ban on TikTok.
TikTok’s national security threat is only one of the myriad challenges facing regulators looking to protect citizens who use social media platforms. For that reason, TikTok is a useful case study for examining broader issues with social media at large. TikTok and other social media platforms, including Instagram and Facebook, often have adverse effects on the mental health of minors. Unrealistic beauty standards, bullying, and sexual harassment are all very real problems on these platforms, and policymakers should do more to encourage the companies to seek solutions. Additionally, mis- and disinformation often run rampant due to social media companies’ varying and often lax content moderation rules, which pose a particular challenge to American democracy when a social media company is under the influence of a foreign adversary.
Holding social media companies accountable is difficult under present law, most notably due to Section 230 of the Communications Decency Act, which provides companies broad legal liability protections for content their users post. Additionally, ensuring that U.S. user data is secure and cannot be exploited by foreign adversaries may require the Committee on Foreign Investment in the United States (CFIUS) to make a recommendation leading either to the forced sale of TikTok or to banning the platform altogether. We explore policy solutions to these issues, including passing an updated data protection law, incentivizing social media companies to develop better age verification practices, and establishing a “duty of care” standard requiring companies to take reasonable steps to prevent harm to users.
Download the full policy brief here.
The Brookings Institution is financed through the support of a diverse array of foundations, corporations, governments, and individuals, as well as an endowment. A list of donors can be found in our annual reports published online here. The findings, interpretations, and conclusions in this report are solely those of its author(s) and are not influenced by any donation.
Acknowledgements and disclosures
The authors would like to thank James Kunhardt and Rayan Sud for their excellent contributions to this piece.