Commentary

6 principles for independent research in a digital world

Filippo Lancieri, Associate Professor, Georgetown Law
Christian Peukert, Professor, University of Lausanne
Joshua A. Tucker, Professor of Politics and Co-Director, Center for Social Media and Politics, New York University

February 19, 2026


  • The increasing centralization of data, legal constraints, and risk-averse academic practices have hindered independent researchers’ ability to investigate how digital platforms affect our societies, thereby challenging public accountability.
  • In response, we propose six principles to guide research on digital markets: (1) public information should be available for research, (2) independent research is needed to monitor and evaluate firm operations, (3) firms must facilitate the sharing of research-relevant data with external researchers, (4) firms should not block research findings, (5) institutions, in particular universities and journals, should protect researchers from legal risks, and (6) the academic community must play a role in safeguarding research ethics and integrity.
  • In combination, these principles aim to foster data transparency, enabling rigorous, reproducible research to inform policy and public debate on the digital economy’s broader societal consequences.

Introduction

Digital platforms have tremendous economic and societal power. Initially, digitization promised to make information freely available to all, create transparent markets, connect people, and democratize communication. The last decade or so saw rising awareness of privacy concerns, the spread of information of varying quality, increased polarization amidst a public health crisis, and worldwide abuses of dominant market positions, which impacted our public ethos and democracy in ways that we are yet to fully comprehend (Jiravuttipong 2026; Goldfarb and Tucker 2012; Garcia 2017; Tucker et al. 2018; Coyle 2019; Finkel et al. 2020; Parker et al. 2020; Persily and Tucker 2020; Lancieri and Sakowski 2021; Farronato et al. 2023; Budak et al. 2024; Ruggeri et al. 2024; Pape and Rossi forthcoming; Van Angeren et al. forthcoming).

Growing concerns about the potential harms associated with an increasingly digital economy create a need for external access to internal firm data so that policymakers and the broader public can understand and regulate these markets effectively. However, data access is often restricted by strategic interests and legal barriers. This represents a fundamental challenge to empirical research that studies digital markets in general—and digital platforms in particular—to map their behaviors and the corresponding societal effects.

While the interests of firms and researchers in promoting research sometimes align (e.g., when firms stand to benefit from learning the findings of research that they might not have the resources or ability to carry out themselves), and firms now share more data than a few decades ago, data is often controlled by those with the least interest in disclosure: large digital platforms. This can lead to selective data sharing that allows these parties to influence the direction of research. As a result, those outside these organizations may be oblivious to important problems associated with increased digitization, or may receive a skewed understanding of the issues being studied, undermining the ability of academic research to inform the public and public policy (Barrios et al. 2025). It is not surprising that platforms interested in shielding potentially harmful practices from scrutiny might not voluntarily provide their data to external researchers, but such data can sometimes be accessed via other means, such as scraping publicly available webpages. Research using such data can be highly socially valuable, and the data itself entirely legal to access. Yet researchers can be chilled from pursuing such research by the cost of threatened litigation.

Legal barriers, including the fear of lawsuits, significantly hinder researchers. Various laws govern, enable, or restrict data access and sharing. For example, contract law (e.g., platforms’ terms of service), intellectual property rights (e.g., copyright and neighboring rights), and even criminal law can be put forward as justifications for prohibiting automated data collection. The same obstacles stand in the way of effectively sharing data among academic researchers, which is essential for reproducible science. Further, privacy laws can restrict data accessibility. In all of these areas, international differences in the relevant law further complicate the legal landscape for researchers.

Journals and editors are often risk-averse, which impacts the acceptance of papers involving contentious data sources. This caution stems from concerns about legal liability, ethical considerations, and potential backlash from publishing research based on disputed or sensitive data. As a result, groundbreaking studies that rely on unconventional or controversial data sources may struggle to find an outlet for dissemination, hindering the advancement of knowledge in certain areas. Journal policies that permit research only when the firms being studied have authorized the use of certain types of firm-related data make the prevention of research by intimidation much more potent. This creates a perverse shift in research direction: away from socially useful work that interrogates the harmful effects of, say, dominant platforms, and toward consulting reports that platform sponsors approve.

Certain data deserves to be protected by privacy law, contract law, and intellectual property law. However, as discussed in detail in Persily and Tucker (2020), there are significant differences between private entities that analyze data to support for-profit businesses, often with no obligation to share their findings publicly and sometimes even bound by duties to shareholders to keep this information private, and other societal actors whose aim is to analyze data to advance scientific knowledge and share their discoveries publicly or develop tools for nonprofit social good. Additionally, there are substantial economic, political, and social benefits that can result from the public sharing of insights gained from analyzing data from online platforms. Conversely, there are risks associated with formulating public policy without the insights derived from rigorous and independent data analysis.

This creates an urgent need to promote access to the data needed for independent academic research on digital markets. At the conference “Mapping and Governing the Online World,” held in Ascona, Switzerland, an international and interdisciplinary group of researchers from economics, law, computer and political sciences discussed the trade-offs and challenges associated with data access for empirical research on the digital economy and developed six core principles to guide data access in an increasingly digitalized world.

In the rest of this article, we seek to accomplish three goals. First, we lay out these six principles clearly and concisely, in the hope that this will facilitate recognition of their importance by multiple stakeholders. Second, we suggest concrete legal reforms that would help implement these principles. Finally, we provide a roadmap for the steps the external researcher and academic community should take to ensure that research using public digital data is carried out in a responsible manner. To be very clear, we are not suggesting that these reforms and recommendations are novel, or that they are not being pursued elsewhere (see especially the herculean work of the European Digital Media Observatory, as well as centers such as the Knight-Georgetown Institute, among many others); our aim is rather to centralize the different components of this discussion in one place, in a concise and summarized form.

Six principles for empirical research of digital markets and online platforms

  1. Public information should be available for research: Researchers rely on access to public information to conduct studies that inform policy and enhance societal understanding. The definition of what qualifies as public information will vary across contexts and time, but societies should protect those working with such data.
  2. Independent research is needed to monitor and evaluate firm operations: Independent monitoring is crucial to detecting and addressing potential abuses, biases, or harmful practices within digital platforms. Evaluating firm operations transparently helps maintain accountability and trust among users and stakeholders.
  3. Firms must facilitate the sharing of research-relevant data with external researchers: Data sharing between firms and researchers increases transparency and accountability, enables a deeper understanding of digital ecosystems, and can foster innovation. Data sharing and collaboration on equal footing can lead to the development of tools and insights that benefit society, such as improved algorithms for detecting illegal content.
  4. Firms must not be able to block or vet research findings: Corporate blocking or vetting of academic research undermines the scientific process and prevents the dissemination of critical findings that could inform public debate and policy. Ensuring that findings are published, regardless of corporate interests, supports the integrity and progress of scientific inquiry.
  5. Institutions, in particular universities and journals, need to protect good-faith research and researchers from legal risks associated with their work: Universities and journals must establish robust legal support systems to defend researchers against potential lawsuits and other legal challenges. By doing so, they create a safe environment for academic freedom and the pursuit of knowledge without fear of retribution.
  6. The academic community must safeguard research ethics and integrity: The community must develop clear, enforceable guidelines covering many areas: from the protection of the data and subjects involved in the research to ensuring the information collected is archivable and reproducible, allowing longitudinal studies that can track changes and trends over time (among others). To preserve integrity, journals/editors should not accept articles when firms and other third parties have the power to engage in pre-publication review or to exercise editorial control unless such arrangements are explicitly disclosed and explained. Researchers and journals must also properly and holistically disclose potential conflicts of interest.

These principles, individually or in combination, apply broadly to a variety of data sources, including web-scraped data, data gathered through collaborations with companies, data obtained from the government through freedom of information laws, data obtained from companies through mandatory data sharing laws, or primary data collected through other means (e.g., sensors or apps). With these principles in mind, we can now turn to the matter of what practical steps can be taken to ensure their realization.

How governments and regulators can help better promote and safeguard independent research in digital markets

Clearer guidelines and legal frameworks are necessary to support research activities. Access to data is a key issue for both researchers and firms in the digital economy. The redistribution of third-party content has sparked debates and legal battles since the early days of the internet. With generative AI, data access concerns have resurfaced, as commercial AI developers often use scraped data to train models, operating in the same legal gray areas as independent researchers. This has led to stricter terms of use and more walled gardens, lawsuits, and calls to modernize copyright and neighboring rights to support innovation (Peukert and Windisch 2025).

The regulatory debate around commercial data access (including the modernization of copyright and other laws) is important and necessary, but it needs to be broadened to consider society’s overall interests in promoting independent academic investigations. In particular, when we change data access rules for, say, AI developers to enable AI innovation, we also need to consider changing and/or clarifying data access rules for researchers to increase transparency and accountability of technology and the firms that develop and commercialize it.

However, piecemeal changes such as modernizing copyright alone cannot solve the issue of data access for the public good. New legislation around the world should require large platforms to share data with vetted researchers under strict privacy protections, enhancing transparency and improving accountability. Further, determining what constitutes public and private information is vital. Information on public websites should be treated as public, aligning with the public interest in archiving and analyzing the information. Of course, the extent to which this data should be accessible for research needs careful consideration. While private information should remain confidential by default, companies have abused general “privacy-protection” claims to prevent data sharing and avoid public scrutiny, even when the social benefits of analyzing such data are clear (Van Loo, 2022; Richards, 2022; Lancieri, 2019).

There are ways to develop different levels of controlled access to sensitive data to balance out public interests in data sharing with private interests in privacy protection. For example, the European Digital Media Observatory (2022) published an extensive report on how researchers can engage with personal data in a way that complies with European data privacy laws, which are the strictest in the world (see also Vogus 2023). Indeed, regulatory frameworks in other societally important domains demonstrate the possibility of achieving a balance between the necessity for data transparency in achieving greater societal goals and the imperative to protect individual privacy and sensitive business information.

For example, in the U.S. and elsewhere, researchers hoping to engage with non-anonymized Census or equivalent data (e.g., French CASD data) must sometimes undergo background checks, receive specific training, gain approval from universities’ Institutional Review Boards, be approved by the Census Bureau itself, and sign a data handling agreement. Some types of data can only be accessed in secure rooms, and the Census conducts reviews to ensure that confidential data is not inadvertently disclosed. That said, there is only limited vetting of the research content—it must have statistical merit and not pose a threat to the mission and reputation of the Census. The focus is more on methods than on outcomes—as it should be. Ausloos et al. (2020) discuss the example of access to Finnish health data, where national law provides specific safeguards for researchers as long as they meet similar requirements and can demonstrate that a similar scientific research question could not be adequately answered through the exclusive use of aggregate data.

Similar approaches should be implemented for research that requires access to online platform data, where data sharing for scientific research purposes must also navigate the delicate balance between public interest and privacy or business protection. For the digital economy, the European Union, with its Digital Markets Act (DMA) and Digital Services Act (DSA), is breaking new ground in this direction (see Edelson et al. 2023).  For example, policymakers can compel data sharing for questions with high societal relevance, as exemplified in Article 40 of the DSA. However, at a global scale, we need to go further. EU regulation applies only to one jurisdiction and only to some firms (designated as “gatekeepers” or “very large online platforms”). Broader access is needed.

More specifically, governments around the world should consider making at least three important changes to laws and regulations:

  1. Pass laws that guarantee independent researcher access to internal digital platform data: The European DSA offers a promising starting point since it establishes a formal framework for researcher access. Yet its scope remains limited: It applies only to the very largest platforms and restricts access to investigations tied to systemic risks within the EU. The older, bipartisan Platform Accountability and Transparency Act (PATA), which was reintroduced in the U.S. Senate on January 7, 2026, proposed researcher access contingent on National Science Foundation approval and empowered the Federal Trade Commission to require public disclosure of selected datasets. Taken together, these efforts show that both Europe and the United States have already developed workable models. Policymakers should use them as building blocks for more comprehensive legislation with stronger mandates.
  2. Protect researchers who independently collect public platform data: Laws should explicitly safeguard external researchers who obtain public data through methods such as web scraping or automated collection tools. Existing frameworks point in the right direction but remain fragmented. The European DSA provides some protection for certain forms of research access, yet it does not fully cover independent data collection, nor does it resolve how researchers should navigate overlapping legal regimes. Although exceptions like fair use in the United States and research exceptions in European copyright and neighboring rights law exist, their scope is often uncertain, especially when research activities intersect with contract terms, privacy rules, or cybersecurity restrictions. Clear statutory guidance is needed to ensure that good-faith research does not expose scholars to legal risk and that independent data collection can complement formal access channels.
  3. Protect academics against strategic lawsuits: Researchers who study digital markets can face litigation tactics aimed at discouraging scrutiny. One important tool to address this problem is anti-SLAPP legislation (SLAPP stands for strategic lawsuits against public participation). Most U.S. states have adopted such rules, and Congress has considered federal versions, but the protection these rules provide in general, and their coverage of academics in particular, remains uneven. The European Union does not yet offer a direct analogue, although some member states have begun debating protections for journalists, civil society groups, and potentially academics. Comparable safeguards should extend to academic researchers, since the risk of costly or prolonged legal action can chill independent inquiry. Legislation that enables courts to dismiss weak or strategic claims at an early stage would help ensure that good-faith, well-documented research can proceed without undue pressure.

Having identified concrete steps that can be taken by governments and regulators globally to ensure access to platform data for external academic researchers, we turn next to what the academic community can do to ensure that this data is used in a responsible manner, balancing the potential benefits of better informed public policy against the potential threats to the integrity of that research and the privacy of platform users.

How the academic community can better promote and safeguard independent research in digital markets

Data sharing between firms and researchers has great potential to lead to insights that are valuable for both society and firms. However, societal goals are often not aligned with firms’ goals (Barrios et al. 2025). While changes to the law are crucial to enable data access for research and protect academics from direct and indirect pressure, the academic community can and must also do its part to ensure the independence and integrity of research. In particular, to address the power imbalance between small academic teams and the large corporations behind online platforms, the academic community must adopt collaborative solutions and establish best practices for rigorous empirical research on these platforms. Areas where we see potential include:

Developing ethical standards for work with public data: The academic community should set ethical standards for navigating legal uncertainties surrounding public data (e.g., obtained through web scraping) and private/personal data (for example, codes of ethics and codes of necessary practices for research involving digital data). The academic community must seize the opportunity to generate knowledge that policymakers and other stakeholders need to hold private actors accountable, mitigate societal harm, and achieve greater societal goals.

Updating journals’ policies to facilitate the publication of studies that used independently collected public data: Academic journals should immediately update their policies to permit the publication of studies using web-scraped public data, despite legal uncertainties. Beyond correcting methodological errors, peer review inherently involves making qualitative judgments about contributions to scientific advancement. Why not also take responsibility for assessing whether the societal benefits of revealing a new fact outweigh the strategic interests of those who prefer that fact to remain hidden?

Updating journals’ policies to require the publication of the data agreement between researchers and data providers: While access to data is an increasingly essential input and currency for academic papers, journals’ policies with regard to data access remain outdated. For example, while many journals require the disclosure of whether a firm had the right to review the paper before publication, they do not require the authors to disclose exactly under what conditions academics obtained access to the data. Journals should require that academics share the underlying terms of access to databases, with the potential exception of the price paid for access when the database was acquired from a third party. This policy will also have the upside of increasing academics’ bargaining power with data providers, as they can point out that the terms of the agreement will ultimately be disclosed to the public.

Better recognizing the value of shared data as a key contribution to academia: Having access to certain data is increasingly pivotal for an academic career, so researchers are sometimes skeptical about sharing their own data under a general fear that others can then engage in similar research. To align incentives, the academic community should treat data sharing as a substantive contribution and reflect this in hiring, promotion, and funding decisions. One concrete step is for journals to create or expand dedicated article formats that focus on describing and documenting datasets. Some fields already publish such data papers in leading outlets, providing clear credit to the scholars who generate and curate the underlying resources (e.g. Strategic Management Journal, Marketing Science). Wider adoption of this model would help make data sharing incentive-compatible from an individual career perspective and support a more open and cumulative research ecosystem.

Updating journals’ policies to require better disclosure of potential conflicts of interest: Journals should ensure that their disclosure policies are comprehensive and up to date, requiring disclosure of all relationships that could lead to third-party influence on research (Barrios et al. 2025). This includes, as mentioned above, the conditions under which academics obtained access to private datasets.

Creating legal defense funds to protect exposed researchers: Finally, universities, foundations, and professional associations should consider creating legal defense funds and leveraging support from law school clinics to assist researchers facing legal challenges. Consortia of journals should retain lawyers, and perhaps a funder could sponsor a substantial retainer for legal disputes. Such initiatives can counterbalance the chilling effects of current legal uncertainties regarding the use of independently collected public data or other forms of data access.

Conclusion

Data access for external researchers is crucial for ensuring that society understands the impact of the rapidly evolving digital landscape and its new tech titans. It is also crucial for informing a wide range of public policy issues, from children’s health to market competitiveness to safeguarding the information environment. In this essay, we have laid out a set of six core principles that we believe can help ensure that the enormous power of modern technology can be harnessed for the good of society in addition to the good of the firms that are profiting from developing that technology. We have also identified concrete steps that can be taken by policymakers and the academic research community to help bring about this more optimistic vision of a future where the impact of digital platforms on society is more transparent than it is today. With the fast growth of new AI platforms such as OpenAI and Anthropic, as well as the moves of existing tech giants such as Google, Meta, and Amazon into the world of AI, the need for such action could not be more urgent.

References

    Ausloos, J., Leerssen, P. and ten Thije, P., 2020. Operationalizing Research Access in Platform Governance: What to Learn from Other Industries? Available at: https://algorithmwatch.org/de/wp-content/uploads/2020/06/GoverningPlatforms_IViR_study_June2020-AlgorithmWatch-2020-06-24.pdf

    Barrios, J.M., Lancieri, F., Levy, J., Singh, S., Valletti, T. and Zingales, L., 2025. The conflict-of-interest discount in the marketplace of ideas. NBER Working Paper No. w33645. National Bureau of Economic Research.

    Budak, C., Nyhan, B., Rothschild, D. M., Thorson, E., & Watts, D. J. (2024). Misunderstanding the harms of online misinformation. Nature, 630(8015), 45-53.

    Coyle, D., 2019. Practical competition policy implications of digital platforms. Antitrust Law Journal, 82(3), pp.835-860.

    Edelson, L., Graef, I. and Lancieri, F., 2023. Access to Data and Algorithms: For an Effective DMA and DSA Implementation. CERRE. Available at: https://cerre.eu/publications/access-to-data-and-algorithms-for-an-effective-dma-and-dsa-implementation

    European Digital Media Observatory, 2022. Report of the European Digital Media Observatory’s Working Group on Platform-to-Researcher Data Access. Available at: https://edmo.eu/wp-content/uploads/2022/02/Report-of-the-European-Digital-Media-Observatorys-Working-Group-on-Platform-to-Researcher-Data-Access-2022.pdf

    Farronato, C., Fradkin, A. and MacKay, A., 2023, May. Self-preferencing at Amazon: evidence from search rankings. In AEA Papers and Proceedings (Vol. 113, pp. 239-243)

    Finkel, E. J., Bail, C. A., Cikara, M., Ditto, P. H., Iyengar, S., Klar, S., … & Druckman, J. N. (2020). Political sectarianism in America. Science, 370(6516), 533-536.

    Garcia, D., 2017. Leaking privacy and shadow profiles in online social networks. Science advances, 3(8), p.e1701172.

    Goldfarb, A. and Tucker, C., 2012. Shifts in privacy concerns. American Economic Review, 102(3), pp.349-353.

    Jiravuttipong, G., 2026 (forthcoming). The Global Race to Rein In Big Tech. University of Pennsylvania Journal of International Law.

    Knight-Georgetown Institute, 2025. Better Access: Data for the Common Good. Available at https://kgi.georgetown.edu/wp-content/uploads/2025/11/Better-Access_Data-for-the-Common-Good_Knight-Georgetown-Institute_November2025.pdf

    Lancieri, F. and Sakowski, P.M., 2021. Competition in digital markets: a review of expert reports. Stan. J.L. Bus. & Fin., 26, p.65.

    Lancieri, F., 2019. Digital Protectionism: Antitrust, Data Protection, and the EU/US transatlantic drift. Journal of Antitrust Enforcement 7(1), pp. 27-53.

    Pape, L.D. and Rossi, M., forthcoming. Is Competition Only One Click Away? The Digital Markets Act’s Impact on Google Maps. Marketing Science.

    Parker, G., Petropoulos, G. and Van Alstyne, M.W., 2020. Digital platforms and antitrust. In The Oxford Handbook of Institutions of International Economic Governance and Market Regulation.

    Persily, N. and Tucker, J.A., 2020. Conclusion: The Challenges and Opportunities for Social Media Research. In: N. Persily and J.A. Tucker, eds. Social Media and Democracy. Cambridge: Cambridge University Press, pp. 313-331. Available at: https://doi.org/10.1017/9781108890960.

    Peukert, C. and Windisch, M., 2025. The economics of copyright in the digital age. Journal of Economic Surveys, 39(3), pp. 877-903.

    Richards, N., 2022. The GDPR as Privacy Pretext and the Problem of Co-Opting Privacy. Hastings Law Journal, 73, p. 1511.

    Ruggeri, K., Stock, F., Haslam, S. A., Capraro, V., Boggio, P., Ellemers, N., … & Willer, R. (2024). A synthesis of evidence for policy from behavioural science during COVID-19. Nature, 625(7993), 134-147.

    Tucker, J. A., Guess, A., Barberá, P., Vaccari, C., Siegel, A., Sanovich, S., Stukal, D., & Nyhan, B. (2018). Social media, political polarization, and political disinformation: A review of the scientific literature. Available at: https://papers.ssrn.com/sol3/papers.cfm?abstract_id=3144139

    van Angeren, J., Miric, M. and Ozalp, H., forthcoming. Platform Ecosystems, Bottlenecks, and M&A Activity: Implications for Platform Regulation. Academy of Management Perspectives.

    Van Loo, R. (2022). Privacy Pretexts. Cornell L. Rev., 108, 1.

    Vogus, C., 2023. Defending Data: Privacy Protection, Independent Researchers, and Access to Social Media Data in the US and EU. Available at: https://cdt.org/wp-content/uploads/2023/01/2023-01-23-CDT-Defending-Data-Independent-Researcher-Access-to-Data-report-final.pdf

Acknowledgements and disclosures

    Acknowledgments:

    This paper results from a workshop at the conference “Mapping and Governing the Online World” in Ascona, Switzerland. An international and interdisciplinary group of researchers from economics, law, computer and political sciences discussed the trade-offs and challenges associated with data access for empirical research on the digital economy and developed the six core principles that we outline here. The following individuals helped develop and signed on to these principles, as initially developed at the ASCONA meeting, although this should in no way imply endorsement of the rest of the text in this article: Isin Acun, WU Vienna; Sverrir Arnorsson, ETH Zürich; Mireia Artigot Golobardes, Pompeu Fabra University; Luca Baltensberger, ETH Zürich; Ruy Camilo, University of São Paulo; Avinash Collis, Carnegie Mellon University; Abhisek Dash, Max Planck Institute for Software Systems; Bipasa Datta, University of York; Gaétan de Rassenfosse, EPFL; Nestor Duch-Brown, Joint Research Center, European Commission; Laura Edelson, Northeastern University; Chiara Farronato, Harvard University; Jens Frankenreiter, Washington University in St. Louis; Co-Pierre Georg, Frankfurt School of Finance & Management; Hiroki Habuka, Kyoto University; Aniket Kesari, Fordham University; Pankhudi Khandelwal, European University Institute; Kholofelo Kugler, University of Lucerne; Filippo Lancieri, Georgetown University; Katja Langenbucher, Goethe University Frankfurt; Yi-Shan Lee, Chinese University of Hong Kong; Amruta Mahuli, Max Planck Institute for Software Systems; Daniel Markovits, Yale University; Florencia Marotta-Wurgler, New York University School of Law; Jakob Merane, ETH Zürich; Sameer Metha, Erasmus University Rotterdam; Hans-Wolfgang Micklitz, European University Institute; Christian Peukert, University of Lausanne; Verina F. Que, University of Toronto; Martin Quinn, Erasmus University Rotterdam; Meike Ramon, University of Lausanne; Kumar Rishabh, University of Lausanne; Tim Samples, University of Georgia; Ohad Somech, Netanya Academic College; Lior Strahilevitz, University of Chicago; Katherine J. Strandburg, New York University; Alexander Stremitzer, ETH Zurich; Joshua A. Tucker, New York University; Jennifer M. Urban, University of California Berkeley; Maria Vásquez Callo-Müller, University of Lucerne; Kai Zhu, Bocconi University.

    Néstor Duch-Brown notes that his signing of the Ascona principles does not necessarily reflect the position or opinion of the European Commission. Neither the European Commission nor any person acting on its behalf is responsible for any use that might be made of the publication of those principles. Jennifer M. Urban signed the original Ascona principles in her individual academic capacity; the views expressed there should not be attributed to the California Privacy Protection Agency or the California Privacy Protection Agency Board.

    Author contributions:

    CP, FL, and JAT developed the framework for this article on the basis of the six principles listed above. CP and FL administered the project and wrote the manuscript based on the authors’ and signatories’ contributions to the workshop at the conference “Mapping and Governing the Online World” and subsequent discussions. JAT provided feedback on and edited the manuscript.

    Disclosure statements:

    Lancieri: I received a research grant from CERRE, a Brussels-based think tank, for the research that led to the report “Access to Data and Algorithms: For an Effective DMA and DSA Implementation.” The report was jointly supported by the British OFCOM, the French ARCOM, Google, Booking.com, and TikTok. I had no engagement with these parties other than through CERRE. We retained complete discretion in writing the report, and we conditioned its writing on receiving the support of at least two regulators to ensure our independence.

    Peukert: I received research funding from Google to perform research projects related to copyright.

    Tucker: J.A.T. received a small fee from Facebook to compensate him for administrative time spent organizing a one-day conference, held at NYU in the summer of 2017, for approximately 30 academic researchers and a dozen Facebook product managers and data scientists to discuss research related to civic engagement. J.A.T. is also one of the co-leads of the external academic team for the 2020 U.S. Facebook & Instagram Election Study, a project that began in early 2020 and is still ongoing at the time of writing. He was not compensated financially by Meta for his participation in this project, but the project involves working collaboratively with Meta researchers. J.A.T. received a 2024 Google Research Grant to support a research project on “From Search Engines to Answer Engines: Testing the Effects of Traditional and LLM-Based Search on Belief in the Veracity of News.”

  • Footnotes
    1. See for example https://researchaccelerator.org/about#mission-values.
    2. See European Digital Media Observatory, 2022, Knight-Georgetown Institute, 2026.
    3. This should not be construed as prohibiting reasonable requests for review, such as making sure information contained in the article does not violate legal obligations around preserving the privacy of users of the firm’s products.
    4. E.g. Viacom International, Inc. v. YouTube, Inc., 676 F.3d 19 (2d Cir. 2012), among many others.
    5. On the use of SLAPP lawsuits against academics, see https://anti-slapp.org/slapps-targeting-academia and https://www.coe.int/en/web/education/-/slapps-and-other-legal-threats-against-academics
    6. See https://www.ifs.org/anti-slapp-report/, which tracks the status of anti-SLAPP legislation in all U.S. states. Twenty-five states receive an A+, A, or A- grade from the Institute for Free Speech (IFS).
    7. See https://www.coe.int/en/web/education/-/slapps-and-other-legal-threats-against-academics
    8. See again, for example, European Digital Media Observatory, 2022.
    9. One very positive example of this has been the work of the Knight First Amendment Institute at Columbia University: https://knightcolumbia.org/

The Brookings Institution is committed to quality, independence, and impact.
We are supported by a diverse array of funders. In line with our values and policies, each Brookings publication represents the sole views of its author(s).