Around the halls: The cost of compromising federal data


On Friday, August 1, President Donald Trump fired Erika McEntarfer, head of the Bureau of Labor Statistics (BLS), following the release of the July jobs report and its downward revisions to prior months’ figures. The firing is the latest in a series of moves by the administration that will affect federal data collection and distribution.

Below, Brookings experts weigh in on the consequences of these actions and implications for the integrity of data used by researchers and policymakers to inform critical policy decisions.

Lauren Bauer

Accurate inflation data are essential to ensuring SNAP benefits reflect the real cost of food

Supplemental Nutrition Assistance Program (SNAP; formerly the Food Stamp Program) benefits are calculated using data the government collects and produces.

Periodically, the U.S. Department of Agriculture (USDA) reevaluates the Thrifty Food Plan (the Thrifty), a model food plan that establishes, through extensive data work, the cost of purchasing a modest, nutritious diet in the United States; the Thrifty for a four-person family works out—to the cent—the maximum SNAP benefit. Each month, USDA applies the Consumer Price Index to the Thrifty to monitor how the changing cost of groceries affects the adequacy of SNAP benefits. Then, once a year, this calculation determines the inflation-adjusted maximum SNAP benefit for the following fiscal year.

While the One Big Beautiful Bill prevents future Thrifty reevaluations from increasing the plan’s value, it affirms that “[the Secretary shall] on October 1, 2025, and on each October 1 thereafter, adjust the cost of the thrifty food plan to reflect changes in the Consumer Price Index for All Urban Consumers, published by the Bureau of Labor Statistics of the Department of Labor, for the most recent 12-month period ending in June.” This annual adjustment to SNAP benefits relies solely on accurate inflation data produced by BLS.
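The mechanics of that adjustment are simple enough to sketch. The function below is a minimal illustration of the statutory rule, assuming hypothetical CPI-U index values and a hypothetical current maximum benefit; the real inputs come from BLS and USDA.

```python
# Minimal sketch of the annual SNAP maximum-benefit adjustment described above.
# All numbers are hypothetical placeholders, not actual BLS or USDA figures.
def adjust_max_benefit(current_max: float,
                       cpi_u_prior_june: float,
                       cpi_u_latest_june: float) -> float:
    """Scale the Thrifty-based maximum benefit by the CPI-U change
    over the most recent 12-month period ending in June."""
    inflation_factor = cpi_u_latest_june / cpi_u_prior_june
    return round(current_max * inflation_factor, 2)  # benefits are set to the cent

# Hypothetical example: a $975 monthly maximum and 2.5% measured inflation.
print(adjust_max_benefit(975.00, cpi_u_prior_june=310.00, cpi_u_latest_june=317.75))
# -> 999.38
```

Because the CPI-U is the only input that changes from year to year, any distortion of that index flows straight through to benefit levels.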

Integrity matters. Data integrity matters, too. If the Bureau of Labor Statistics is forced to cook the books on the Consumer Price Index to make inflation look lower than it truly is, then the annual inflation adjustments to SNAP (and to Social Security, among other programs) will produce inadequate benefits.

Manann Donoghoe

The erasure of climate and weather data is making Americans less safe this hurricane and wildfire season

The Trump administration’s recent firing of the Bureau of Labor Statistics (BLS) commissioner is only the latest attack on the integrity of federal data. In fact, these attacks started with the elimination of climate and weather data, a purge that amounts to nothing short of a national campaign of climate erasure.

Following President Trump’s executive order on January 20, 2025, which began the process of revoking many of the Biden administration’s climate policies, the administration has removed troves of public data. This includes removing data tools that forecast the impacts of climate change and analyze the social and economic vulnerability of communities to those impacts; eliminating data projects, including at NASA and NOAA, that monitor weather and track deadly disasters; removing thousands of webpages of scientific literature on climate change and disasters from the Environmental Protection Agency and the Centers for Disease Control and Prevention; and ending the National Climate Assessment, the congressionally mandated stock-take of climate change’s impacts on communities across the nation.

These actions undermine decades of largely bipartisan efforts to understand and minimize the impacts of climate change and disasters on Americans. Data on climate change are crucial for federal, state, and local planning. They are an input into urban and rural communities’ plans for adapting to new threats such as wildfires, floods, sea level rise, and desertification. Caught in the crosshairs are farmers who rely on climate data to plan future crops, insurers who rely on risk information to price premiums, local planners who rely on climate data to help decide where to build housing, and emergency managers who rely on climate models to forecast future disaster risks.

But Americans ought to be outraged today. The erasure of climate data and the devaluation of expert knowledge have undermined the accuracy of weather forecasts, placing everyday Americans at greater risk. With inaccurate or patchy data during an emergency, there is a higher likelihood that otherwise manageable disasters will cause greater loss of life or property damage. We are in the middle of hurricane and wildfire season, when local weather data are crucial to forecasting incoming threats and providing timely warnings for residents to evacuate. Yet, with the whiplash of firings and rehirings affecting National Weather Service offices across the U.S., forecasters this year are likely to be in short supply.

It’s difficult to manage what we don’t measure, but it is impossible to combat a threat that has been made invisible. By eliminating troves of data on climate change and weather, the administration seems to be wishing climate science would disappear. In the process, it is endangering American lives and livelihoods.

William G. Gale and Elena Patel

Diminishing the IRS will have ripple effects across government

The recent firing of the commissioner of the Bureau of Labor Statistics has sparked concerns about political interference in the data produced by U.S. statistical agencies. Their independence and the high quality of the data they produce are a hallmark of the U.S. economy. These statistics offer timely insights that guide consumer spending, shape monetary policy, and reveal how government programs influence behavior. When trust in the accuracy and objectivity of these data falters, the foundation of evidence-based governance begins to crumble.

Beyond employment and other labor market figures, the federal government generates a wide array of statistical data to inform decisionmaking. The IRS, through its Statistics of Income (SOI) program, produces equally critical data that shapes tax policy, federal budgeting, and economic research. Since 1916, the SOI has compiled detailed annual statistics describing how individuals, businesses, and nonprofit organizations earn income. These statistics, based on multi-year analyses of federal tax records, form the analytic backbone for agencies and economists across the country.

Among the most frequent and consequential users of the SOI data are the Office of Tax Analysis at the U.S. Treasury Department, the Joint Committee on Taxation, and the Congressional Budget Office. The staff of these agencies rely heavily on SOI datasets to assess the revenue effects of proposed legislation, such as the budgetary effects of the recently passed One Big Beautiful Bill Act; to analyze the distributional effects of the tax system, or who bears the burden of various taxes; to forecast federal tax receipts, allowing lawmakers to plan federal spending and policy priorities; and to more broadly model taxpayer behavior. Nearly every major tax policy analysis begins with SOI data at its foundation.

Beyond tax-focused agencies, SOI data also play a critical role across a broad range of federal institutions. The Bureau of Economic Analysis, for example, incorporates these datasets into its construction of the National Income and Product Accounts, the foundational accounts used to measure the size and scope of the U.S. economy. Similarly, the Federal Reserve Board collaborates with the SOI to implement its Survey of Consumer Finances—a triennial study that comprehensively details household wealth, income, and financial behaviors. Finally, watchdogs like the Government Accountability Office depend on these rich data to conduct audits, assess the efficiency of federal programs and functions, and inform recommendations to Congress. These broader applications underscore SOI’s foundational value not just in tax policy but in shaping evidence-based analyses across the federal government.

Ongoing staffing reductions at the IRS, particularly those prescribed by the Department of Government Efficiency, alongside shrinking budgets, pose a serious threat to the agency’s capacity to produce timely, reliable, and high-quality statistical data. The Research, Applied Analytics, and Statistics Division, which oversees the SOI program, reported a 29% reduction in staff as of June 4, 2025. Combined with broader personnel losses totaling more than 26,000 employees, enforcement capacity has weakened, and core statistical functions are increasingly at risk. Notably, the SOI has suspended its Joint Statistical Research Program, curbing independent analyses that support not only evidence-based tax policy but also core statistical functions across the federal government. At a time when trusted data are integral to sound decisionmaking, safeguarding the integrity of the federal statistical infrastructure, including the IRS, should remain a top priority.

Carol Graham

What gets measured is what matters

There is an old saw that has proven true time after time: What we measure is what matters. Many of the myriad discussions about what is wrong with or missing from GDP, for example, identify flaws in what the metric does and does not capture. It counts oil spilled into the ocean as a positive entry, since the oil was produced before it was spilled, yet it places no value on the often thankless work that home caregivers do to care for small children or for family members with disabilities or diseases that prevent them from caring for themselves. These flaws result in the overvaluation of material goods production and the undervaluation of non-material activities that are essential to investing in or maintaining the health and welfare of our population. Those valuations, in turn, influence how those respective aspects of our welfare are weighted in policy and budgetary decisions. In other words, what gets measured is what matters.

Yet GDP was not originally designed to be an all-encompassing aggregate measure of progress. Simon Kuznets, the creator of the GDP metric, warned early on that “the welfare of a nation can scarcely be inferred from a measure of national income.” Beyond GDP, our national statistics system produces a range of invaluable measures that are critical to economic and other policy decisions. Over time, our federal statistical system has grown more sophisticated and made progress toward measuring what matters beyond the production of material goods, such as many aspects of our physical and mental health; how citizens allocate their time between work, caregiving, leisure, and learning; and novel kinds of health and other data that are essential to the medical and scientific innovations that are foundational to our country’s leadership in science and technology (see, for example, the CDC’s Healthy People survey, the National Survey on Drug Use and Health, and the Bureau of Labor Statistics’ American Time Use Survey).

The current climate, in which our national data collection and reporting are being politicized, threatens not only our long-held national reputation for scientific excellence but also our nation’s health in an immediate and dangerous way. Key data sets tracking the mental health of our population; health disparities across socioeconomic, racial, and regional cohorts; and recently introduced metrics to assess population well-being, among others, are being eliminated or tampered with by the current administration. These tracking mechanisms are critical to identifying emerging vulnerabilities, such as the recent downward spiral of youth mental health, the spread of our epidemic of deaths of despair to a broader range of ages and racial groups, and the resurgence of deadly diseases such as measles.

Precisely because what gets measured is what matters, eliminating those data and metrics makes the risks from these trends much greater, as our ability to identify them in a timely manner is being eroded. The latest and most blatant example was the firing of the talented, non-partisan, and well-respected commissioner of the Bureau of Labor Statistics for no reason other than completely unsupported claims of rigging the jobs report. Without robust, non-partisan economic data, policymakers cannot make good decisions about how to manage the economy, mitigate shocks, and plan for the future.

Economic and health systems, science, and democracy more broadly cannot survive in a context where the basic tenets of their operations—credible data and the institutions that collect and report them—are being eliminated to serve the political whims of particular leaders or parties. We are entering a dangerous stage that threatens the foundations upon which our democracy, institutions, and scientific leadership were built. If we are unable to reverse course soon, we will share the fate of many empires that collapsed precisely because greed and ambition replaced truth and trust in institutions.

Joseph Kane

Politicizing the BLS undermines the integrity of labor market data and the professionals who produce it

Erika McEntarfer’s dismissal as commissioner of the Bureau of Labor Statistics (BLS) is jarring on several fronts. It not only raises concerns about the potential politicization of one of the country’s oldest and most important federal statistical agencies; it also raises questions about the ongoing operation and oversight of BLS, including impacts on future data releases. Economists agree: Reliable and unbiased labor market data are essential to policymakers, businesses, and many other users.

As a former BLS economist, I felt the news hit close to home. While it’s easy to point the finger at a single agency head, the reality is that hundreds of economists and other civil servants fill the rank and file at BLS—and they are collectively responsible for supporting the agency’s mission. For them, collecting, cleaning, and releasing quality, timely data is of utmost importance, and doing that work represents a herculean lift, whether administering annual surveys or preparing a monthly statistical release. Multiple rounds of fact-checking and review were a constant during my time at BLS, requiring a keen eye for detail, frequent collaboration, and extreme care before every public release; entry to certain offices was, in many cases, quite literally restricted.

That’s why pointing the finger at the commissioner is the same as pointing the finger at all the other dedicated staffers. It’s not just about those in the agency’s headquarters in Washington, either; many staff are scattered across the country. Time will tell how this dismissal—alongside a wave of other federal firings and resignations—will affect agencies such as BLS, but the data and policy ramifications could extend for years to come.

Jonathan Katz and Renee Rippberger

Political interference threatens data integrity and democracy

The firing of Bureau of Labor Statistics (BLS) Commissioner Erika McEntarfer can be interpreted as an attempt by the Trump administration to bury transparent and accountable data and to undercut independent expertise and statistical agencies. President Trump’s former BLS commissioner, William Beach, said there was no reason to fire McEntarfer and that the data were not rigged, despite the administration’s claims to the contrary.

Accurate and unmanipulated data are not only vital in managing the U.S. economy but also essential to government transparency and accountability. State and local governments across the country rely significantly on federal data for everything from budgeting to public infrastructure. U.S. businesses, civil society, and academic and research institutions also rely on U.S. government data to conduct their activities.

Trump’s firing of McEntarfer is part of a larger picture, in which the BLS has faced debilitating budget cuts, hiring freezes, and the disbanding of expert advisory committees. All of this raises fresh concerns that the Trump administration’s economic policies may be performing poorly and that data and information will be manipulated to downplay a potentially faltering U.S. economy. With public trust in government institutions and expertise steadily declining, doctored data and information, if widespread across the administration, could erode that trust further, with serious ramifications for our democracy and for the economic well-being and security of Americans.

The economy consistently ranks as a top issue for many Americans. Without trusted and accurate data, American voters will have less access to unbiased resources to help them make informed decisions and hold officials accountable.

Officials at the national, state, and local levels must prioritize protecting the public interest and guard against misusing or undercutting transparent and accountable data. To ensure this happens, Americans across the country must demand data transparency and accountability from government officials, regardless of party.

Aaron Klein

Truth in numbers: Why accurate data are the backbone of bank regulation

High-quality data about the state of the economy are essential for bank regulation. Many banks fail because regulators do not get the most accurate data quickly enough. Cooking the books may help a bank avoid problems in the short term, but in the long run those problems catch up with it, growing worse over time.

America’s financial regulatory system relies on high quality data both within banks and throughout the broader economy. A reduction in the quality or integrity of key data sets, like the jobs report and GDP, will result in regulatory problems down the road. Bank regulators are used to dealing with facts that cause problems. The solution is never to ignore the data or turn uncomfortable facts into rosier lies. 

If American savers and businesses lose confidence in the accuracy of data around banks and the financial system, it could result in a financial crisis. Finance, at the end of the day, is about trust. Trust involves a shared commitment to truth, however uncomfortable that truth may be. George Orwell may not have been thinking about bank regulation when he wrote in “1984,” “The Party told you to reject the evidence of your eyes and ears. It was their final, most essential command.” Bank regulators who reject truth and embrace lies as facts will eventually find themselves in a much worse situation and risk taking much of the American economy with them.

 

Tracy Hadden Loh

Why federal data integrity matters for your neighborhood, city, and region

The integrity of federal data about our job market and population is about more than just understanding the health of the country’s economy—trillions of dollars in federal funding are at stake. Congress and federal agencies use public data every day to direct federal dollars to places with specific characteristics, and undermining confidence in these numbers by politicizing their credibility is also an attack on the legitimacy and fairness of the programs managed with these metrics.

For example, the U.S. Economic Development Administration’s Distressed Area Recompete Pilot Program targets up to $1 billion in funding over time to areas where prime-age employment falls below the national rate—a mapping exercise that relies in part on data from the Bureau of Labor Statistics (BLS), whose commissioner was fired by President Trump on August 1, 2025, because the monthly employment report was disappointing.

Trump’s rejection of a report from the BLS is alarming because it is both highly personalized (in the form of firing Commissioner Erika McEntarfer) and political (he alleged she faked the numbers because she was a Biden appointee). However, the accuracy of federal data collection across multiple statistical agencies is actually a much larger and longstanding problem, related to funding, that spans administrations led by both parties. Census Bureau data products, in particular, guide trillions in federal spending across transportation, community development, health, and other sectors. Trump’s most recent attempt to disregard the 14th Amendment to the Constitution and change the Census Bureau’s methodology in order to influence the apportionment process for the House of Representatives is an enormous distraction from our need to modernize our data collection mechanisms, for not just the decennial census but the annual American Community Survey, to maintain accuracy as these trillions flow.

Federal funding isn’t limitless. If we want to manage our resources effectively and get them where they are needed, we must invest in and protect both the human and fiscal capital that can generate a shared set of publicly accessible facts. The political debate about what criteria to use and when, and the spin about why, can come after, but not during, that process.

Robert Maxim, Manann Donoghoe, and Hannah Stephens

Government data are already inadequate for Native Americans. Politicizing statistical agencies will only make it worse.

President Trump’s decision to fire the commissioner of the Bureau of Labor Statistics (BLS) risks exacerbating already severe labor market data issues for Native Americans. Native Americans have always been excluded from the monthly jobs report, and only recently was any monthly labor market data published about them at all. Politicizing government data could undermine Tribal governance and harm the well-being of Native American people.

High-quality data matter for Tribes as governing entities. While other levels of government can rely on tax revenue to fund their operations, taxes are frequently not an option for Tribes, many of which have small populations and significant numbers of low-income citizens, and which must maintain competitiveness with off-reservation communities. Because of this, federal funding accounts for a significant portion of Tribal government budgets. Without accurate data, federal funding can’t meet the real needs of Native communities.

However, as we write in a forthcoming report assessing data challenges for Native Americans in Southern California, Tribes must govern their citizens and territories while relying on data that would be considered inadequate for nearly any other group in the United States. Data about Tribes and Native American people suffer from a variety of shortcomings, including insufficient sample sizes, multiyear data lags, misalignment with Tribal boundaries, misclassification of Native people into other racial or ethnic groups, and data sets and statistical publications designed without Tribal input.

As one example, the total U.S. population varies by about 1% across three common federal data sets: the 2020 decennial census, the 2023 ACS one-year estimates, and the 2023 ACS five-year estimates. In comparison, the total Native American population varies by over 24% across those same three data sets, a difference of up to 2.3 million people depending on which data set is used.
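To make the comparison concrete, here is a small sketch of one way such a spread can be computed. The counts are hypothetical placeholders chosen only to mirror the roughly 24% and 2.3 million figures cited above, not the actual published estimates.

```python
# Hypothetical population estimates illustrating the spread described above.
estimates = {
    "2020 decennial census": 9_700_000,  # placeholder, not an official count
    "2023 ACS 1-year":       7_400_000,  # placeholder
    "2023 ACS 5-year":       8_100_000,  # placeholder
}

lo, hi = min(estimates.values()), max(estimates.values())
print(f"absolute spread: {hi - lo:,} people")    # -> 2,300,000 people
print(f"relative spread: {(hi - lo) / hi:.0%}")  # -> 24% of the largest estimate
```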

Recent data challenges for Tribes extend beyond just the BLS. Since the start of 2025, the federal government has removed a growing number of federal data sets and research reports from the public domain, including those assessing demographic information on minority groups and public health statistics, in the name of eliminating “DEI” from the federal government.

Moving forward, it will be essential that the federal government not only restore and depoliticize data collection but also take steps to improve the quality of data for Tribes and Native American people. In the absence of federal action, now is the moment for states, regions, and municipalities to fill the gap by improving their data relationships with Tribes and Native American communities. This can include developing state and regional Indigenous data strategies, investing in Tribal data capacity to help Tribes scale up their own data operations, expanding Tribal access to state and regional government-held data, and making state and local data about Native Americans more accurate by increasing sample sizes and updating race and ethnicity data collection practices to meet emerging best practices. By taking these steps, policymakers can help build a data ecosystem that better supports not only Native Americans but Americans of all backgrounds.

John Sabelhaus

Behind the headlines: The 2 threats to federal economic statistics

Federal economic statistics are an essential part of decisionmaking for policymakers, businesses, and households. Economic statistics covering topics like inflation, unemployment, income growth, and household wealth dominate news cycles with every data release.

There are two types of threats to economic statistics. The first is political manipulation, meaning deliberate attempts to alter or prevent the release of the key economic facts that decisionmakers need. The second is a deterioration in the quality of economic statistics because of a failure to make the necessary investments in data infrastructure.

Political threats to economic statistics have never really been an issue in the United States. No one seriously questions that the key statistical agencies report honestly. Statistics evolve after an initial release because they begin as estimates based on samples of the currently available data. They are subject to revision as more and higher-quality data become available, and they are locked down only when all the relevant information is in hand.

One key protection against political interference is integrity within the statistical agencies (and as a former producer of federal economic statistics, I can attest to the strength of that conviction). However, there are other safeguards against political manipulation as well because of the elaborate and evolving interconnections between various headline statistics and the availability of multiple measures for the same outcomes.

If someone attempted to manipulate, for example, the headline employment numbers from the Bureau of Labor Statistics (BLS) Current Employment Statistics, that would create glaring distortions in productivity and other downstream statistics that rely on job counts. It would also introduce inexplicable gaps between the BLS-reported job counts and those from other sources, including state-level Unemployment Insurance records and private sector payroll statistics.
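A minimal sketch of that cross-checking logic, with invented figures: a doctored headline series stands out against independent sources that should move roughly together. Real comparisons (for example, CES payrolls against state UI records) require definitional and timing adjustments that are omitted here.

```python
# Invented monthly job-change figures; only the flagging logic is the point.
monthly_job_change = {
    "CES payroll survey":   [180_000, 150_000,  90_000, 250_000],
    "State UI records":     [175_000, 148_000,  95_000,  92_000],
    "Private payroll data": [190_000, 140_000,  85_000,  80_000],
}

THRESHOLD = 50_000  # a divergence large enough to demand an explanation
baseline = monthly_job_change["CES payroll survey"]

for source, series in monthly_job_change.items():
    for month, (ces, other) in enumerate(zip(baseline, series), start=1):
        if abs(ces - other) > THRESHOLD:
            print(f"Month {month}: {source} diverges from CES by {abs(ces - other):,}")
# -> Month 4: State UI records diverges from CES by 158,000
# -> Month 4: Private payroll data diverges from CES by 170,000
```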

However, the same interconnections and alternative data sources that limit the risk of political manipulation draw our attention to the second risk. The reason we have interconnected statistics and alternative data sources is that researchers at the agencies responsible for economic statistics, along with their academic partners, have done the necessary work.

We measure economic outcomes using different data sources and methods because we want to know whether the alternatives give different answers and, if so, why. We worry that a method of collecting information—knocking on doors or making phone calls—that worked in the past may not work well today. We have embraced administrative data sources, such as the state Unemployment Insurance records and private payroll data mentioned above, to test for better (or cheaper) answers to key questions about economic performance. And we need to think about how to measure economic success in a world where intangible capital (ideas) has come to dominate an economy once defined by machines and physical goods.

If we stop investing in economic measurement, the usefulness of economic statistics becomes questionable, and decisionmakers will no longer have the tools they need. The environment also becomes ripe for political manipulation.

Although political manipulation is a potential threat, the second threat to economic statistics is already becoming a reality. Budgetary pressures and antagonism toward federal employees are weighing heavily on statistical agencies, which were already struggling with the challenge of measuring outcomes in a rapidly evolving economy.

There are no simple answers to producing high quality economic statistics. There are, thankfully, many government researchers and their academic partners who want to keep doing the work that needs to be done. If we are worried about political manipulation of economic statistics, we need to make sure we will know it when we see it.

Darrell M. West

Federal data are the backbone of fair funding—and without it, communities risk losing vital support

Federal data matter for a myriad of government activities. Many people don’t realize the essential role agency-produced information plays in grant distributions, school aid, and poverty alleviation, among other things. Without accurate information and public trust in public sector data, it will be impossible to have fair and unbiased distributions of federal money to state and local areas. Some places will get more than they deserve while others receive less, and that will compromise public sector performance.

One of the reasons we need trustworthy government data is that grant distributions are based, in part, on demographic and other kinds of data at the state and local levels. Around $2.8 trillion in federal money is distributed using U.S. Census Bureau numbers. These data enable hundreds of agencies to determine need through established aid formulas and to allocate government grants based on population, income levels, and the like. Census “undercounts” may deprive cities of millions of dollars and even lead to the loss of a congressional seat.
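A stylized sketch can make the stakes concrete. The allocation function and all of the numbers below are invented for illustration, and real aid formulas weigh many more factors than population; the point is only that an undercount in one place mechanically shifts formula dollars away from it.

```python
# Population-proportional grant allocation with invented numbers, showing
# how a census undercount in one city shifts formula dollars away from it.
def allocate(total_funds: float, populations: dict[str, int]) -> dict[str, float]:
    total_pop = sum(populations.values())
    return {area: total_funds * pop / total_pop for area, pop in populations.items()}

accurate_count = {"City A": 1_000_000, "City B": 500_000}
undercounted   = {"City A":   950_000, "City B": 500_000}  # City A missed by 5%

for label, counts in [("accurate", accurate_count), ("undercounted", undercounted)]:
    shares = allocate(300_000_000, counts)
    print(label, {area: f"${amt:,.0f}" for area, amt in shares.items()})
# With the undercount, City A's share falls by roughly $3.4 million.
```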

School aid also is dependent on accurate, trustworthy numbers. Many states have grant formulas for school assistance that depend on factors such as community need, property valuations, and local tax levels. If the underlying numbers are not accurate or people do not have confidence in the integrity of the information, then this lack of confidence compromises the distribution of school funding and imperils education-related opportunities across the country.

Finally, having high quality data is important for poverty alleviation. Programs such as food stamps, public aid, and housing assistance are partially based on income levels, nutrition needs, and community costs. These programs require reliable data sets to ensure fairness in the money distributions. Small differences in the quality of this underlying information could have a substantial impact on how much support people and communities receive.

Vanessa Williamson and Ellis Chen

Global autocrats manipulate data to protect their power—America must not follow that path

The Trump administration’s firing of the Bureau of Labor Statistics (BLS) commissioner does not just hurt the American economy—it further erodes U.S. democracy. Without reliable data, policymakers, journalists, and citizens cannot evaluate the consequences of public policy. By extension, Americans will find it more difficult to attribute credit and blame come Election Day.

The firing is the latest in a long line of attacks on the country’s independent and nonpartisan civil service, another signal to civil servants that their job security is contingent not on the quality of their work but on matching the whims of politicians. The administration has repeatedly undermined government data collection on a whole host of topics, from climate change to student performance. The BLS commissioner’s firing is part of this systematic dismantling of the institutions that promote accountability in our democracy.

Around the globe, antidemocratic leaders try to solidify power by controlling information about their governments’ performance. The Chinese government addressed unfavorable levels of youth unemployment by suspending its reporting of youth unemployment data. In Turkey, President Recep Tayyip Erdogan fired the head of the government’s statistics agency after the agency published unflattering inflation data. India has seen a series of controversies, including state institutions pressuring data producers to change or attempting to delay the release of data reflecting poorly on the governing party. These are not one-off examples; the data show that autocracies systematically exaggerate their GDP growth. When unencumbered by an independent civil service, authoritarian leaders have the means, motive, and opportunity to undermine the publication of accurate data.

Firing officials for reporting accurate data that an administration finds unflattering is straight out of the authoritarian playbook. It is an attempt to mislead people, to avoid being held to account for policy failures, and to rewrite history. Unfortunately, we can expect much more of this in the months and years to come.
