The end of the endless frontier? A warning on research cuts

Commentary | July 24, 2025

Across the partisan chasms of American politics, there has been for eight decades a broad bipartisan consensus on a policy that truly made the country great. Following our victory in World War II, President Truman and then President Eisenhower inaugurated an era of scientific and technological leadership by applying the recommendations of Vannevar Bush, President Roosevelt’s science adviser. His seminal report, “Science, the Endless Frontier,” proposed a unique tripartite coalition: Because fundamental research is a public good, the federal government would fund most basic science; that science would be conducted primarily by independent scholars at research universities; and the private sector would convert the results into commercially viable and socially useful technologies.
If you are reading this on your mobile phone or tablet, you are enjoying one of the innumerable fruits of that partnership: The underlying physics and computer science in ubiquitous cellular technology were developed in federally funded university-based labs and classrooms. Similarly, the lithium-ion battery, which powers Tesla automobiles (to choose a random example), was the result of federally funded research. The National Science Foundation and other agencies supported the work of John B. Goodenough of the University of Texas at Austin and M. Stanley Whittingham of the State University of New York at Binghamton for more than 30 years. In 2019, Goodenough and Whittingham shared the Nobel Prize in chemistry with Akira Yoshino of Asahi Kasei Corp. of Japan, for work that made possible what owners of electric cars and many other devices today take for granted.
Those examples of long-term benefits of federal support for university-based basic scientific research are by no means outliers. The Association of American Universities reports that “new scientific and technological advancements emerging from America’s leading research universities led to the creation of 622 new startup businesses just in 2022 alone. The discoveries also led to 5,724 transfers of new innovations to private sector businesses so they could further develop and bring them to market.” Patricia Pelfrey and Richard Atkinson estimate that 80% of leading new industries derive from research in universities, which, according to Stanford professor Steven Blank, “spin off more than 1,100 science-based start-up companies each year, leading to countless products that have saved and improved millions of lives, including heart and cancer drugs, and the mRNA-based vaccines that helped to bring the world out of the COVID-19 pandemic.”
Data from the National Science Board affirm that the U.S. is still the largest performer of R&D, ahead of China, Japan, Germany, and South Korea. Our commitment to science and education was a linchpin of what Harvard economists Larry Katz and Claudia Goldin so aptly called the American “human capital century.” But this success story is now at risk, with evidence mounting that our position is slipping. As reported by the Information Technology and Innovation Foundation, “from 2019 to 2023, China’s R&D investment grew at 8.9 percent annually, compared to just 4.7 percent in the United States.”
If that trend continues, fueled by current efforts to radically reduce federal investments in university-based research, American science will suffer a potentially irreversible setback. Future historians (along with our children and grandchildren) will be puzzled, if not appalled, when they learn that the decline was largely self-inflicted.
What’s going on? In his first term, President Trump enthusiastically revved the public-private engine that enabled the development of COVID-19 vaccines at “warp speed.” Over a million Americans died of the virus, but millions more were saved. It was hailed at the time and is still regarded as a “stunning, some would say miraculous…” scientific achievement, and it stands as one of the great validations of Vannevar Bush’s foresight. As celebrated by scientists grateful for the National Institutes of Health (NIH), “the origins of this historic accomplishment can be traced back directly to publicly funded innovations in basic science research and biotechnology” (italics added). Key to this history is its longevity: Research in the 1950s and 1960s established the basis for work decades later by Katalin Karikó, an immigrant from Hungary, and her University of Pennsylvania colleague Drew Weissman, who showed how the basic science might eventually lead to practical breakthroughs. The citation for their shared Nobel Prize in 2023 is as clear a statement as is possible of the long-term value of basic science, which, in its first instances, often does not point to obvious applications: “In 2005 … Karikó and Drew Weissman discovered that certain modifications of the building blocks of RNA prevented unwanted inflammatory reactions and increased the production of desired proteins…[which] laid the foundation for effective mRNA vaccines against COVID-19 …”
Why, then, in his second term, is President Trump reversing course? His proposed fiscal 2026 budget includes, inter alia, an $18 billion reduction in funding for the NIH. Rather than celebrate our success during the pandemic (not to mention the prevention of a host of other diseases), which, as one journalist put it, is “a big part of the reason people might be able to throw away their masks,” the budget request includes this unfortunate distortion: “NIH has broken the trust of the American people with wasteful spending, misleading information, risky research, and the promotion of dangerous ideologies that undermine public health.”
We have weathered this kind of rhetorical storm before, although never one with such ferocity. Senator William Proxmire (D-Wis.) famously gave “Golden Fleece Awards” for funded projects with frivolous-sounding titles. As one of his biographers noted, “[the award] became a Washington standard as Proxmire critiqued everything from an $84,000 study on why people fall in love to the U.S. Army spending $6,000 for a set of instructions on purchasing Worcestershire sauce.” (In one case the Senator learned the hard way that he had overstepped and, following a complicated legal battle, had to settle with a scientist who had sued for defamation.) Proxmire’s purpose was to use humor to raise public consciousness about congressional responsibility in spending taxpayers’ money—a theme that has remained popular, especially among critics of government spending. Against these clever jabs at seemingly “silly science,” though, there is abundant evidence that even some of the funniest-sounding or allegedly “useless” research can yield significant long-term public benefit.
In any case, humor is clearly not on the agenda of the current administration. If our labs and teaching programs are decimated as threatened, it will take years to recover, and society will be at risk from delayed weather forecasting, slashed capacity for tracking educational progress (a 30% reduction in the assessment budget of the Department of Education), a roughly 50% cut to the budget for the National Science Foundation, and a pause in research on substance abuse disorders (about $1 billion). Whether we should shift from fossil fuels to clean energy will continue to be hotly debated, but why threaten to cut research that might save the lives of coal miners?
Proponents of federal cuts to university-based research rely on a blend of dubious arguments: universities spend too much on administration, costs of education are skyrocketing out of control, we don’t need the federal government to pay for basic science, and the private non-academic sector does research cheaper and better. The facts are more subtle.
First, while it is fashionable to admonish universities for bureaucratic obesity, the ratio of instructional to non-instructional personnel hasn’t changed much, even if growth in professional administrative staff has outpaced growth in support staff. As William Bowen and Mike McPherson showed 10 years ago, what we are seeing is not “administrative bloat” but the professionalization of non-faculty staff, a trend that has continued and that “echoes developments in the economy as a whole.” That headline writers insist on blaming colleges for increased spending, without even noting the rise in student enrollments and other salient factors, is unfortunate, especially in an era of declining public trust in the academy writ large.
Second, prices at public and private colleges have actually been trending downward—not exactly the popular perception. As reported by the College Board, “after adjusting for inflation, the average net tuition and fees paid by first-time full-time in-state students enrolled in public four-year institutions peaked in 2012-13 at $4,340 and declined to an estimated $2,480 in 2024-25.” A recent analysis by researchers at the University of Pennsylvania found that elite and expensive private institutions “offer similar levels of support to low- and middle-income families. For middle-income families — those earning between $75,000 and $200,000 per year, typically with additional consideration for those with multiple children in college at the same time — not only is college tuition frequently fully paid for, but students often receive additional aid.” Pushing back against popular and somewhat hysterical rhetoric requires a full-time truth squad. Of course, college leaders could do better in managing resources, setting priorities, and making postsecondary education more affordable; but overall, as Mark Twain might have put it, rumors of the managerial failure of American universities are greatly exaggerated.
Third, it is true that the business sector plays an increasingly prominent role in basic science. But context matters. Business investment in university-based research was roughly $6 billion in 2022, an 11% increase over 2021; and though this is a relatively small share of the total spent by universities (close to $100 billion in 2022), it is important to keep in mind that university research thrives on a blend of public and private sources. (A growth area for advanced professional staff is the management of complex research budgets that include money from multiple sources—not a spreadsheet project for the faint of heart.) Further depletion of the public share would have a deleterious effect on total research output and, therefore, on the sustainability of its economic and social benefits. In this regard, it is worth noting that research carried out in private (non-university) labs is mostly conducted by scientists and engineers who learned their craft while pursuing advanced degrees at institutions receiving federal research dollars, which qualified them to work in those places upon graduation. One widely discussed but controversial idea, promoted by prominent tech figures like Peter Thiel, is to subsidize young people who choose to leave college early, following in the footsteps of Bill Gates, Mark Zuckerberg, or Larry Ellison. The fact that the overwhelming majority of people who work for those billionaires do have college degrees is an inconvenient and omitted datum. As Vannevar Bush predicted, universities not only produce scientific breakthroughs; they also cultivate the human capital required for basic and applied research wherever it is conducted.
Measuring the returns to scientific investment is complicated, but the consensus of leading scholars is overwhelmingly supportive of the wisdom in “Endless Frontier” and corroborated by mainstream economic theory: Investments deemed unprofitable for rational entrepreneurs or corporations, such as basic research, are the province of government because they are necessary and beneficial to society at large. Again, focusing on medical research, “A new report … shows that every dollar of research funded by the [NIH] delivers $2.56 in economic activity, a multiplier effect that extends the agency’s impact.” Beyond health-related outcomes, Forbes magazine cites research showing that “every dollar invested in non-defense public R&D yields $1.40–$2.10 in economic output.” A recent paper published by the National Bureau of Economic Research goes further: “returns to federally funded R&D appear to be substantially higher than the returns to other forms of federal investment … the literature does not support the standard assumption that public investments are less productive than private investments … federally funded R&D appears to generate returns on a shorter time horizon … and federally funded R&D appears to be a complement – rather than a substitute – with private investment.”
Those arguments speak for themselves. But on that last point, about complementarity, two additional nuances are worth considering. First, as Donald Stokes argued in his landmark book, the distinction between basic and applied research may be too stark, given that some of the most interesting work is best characterized as “use-oriented basic research” (italics added). Second, increasing the relative share of privately sponsored vs. federally funded research may compromise the credibility of results. Even ardent conservatives lured by the fundamentalism of free-market economic theory have understood why it is crucial to shield science from the real or perceived influence of commercial and political interests. Indeed, one need only consider the original charter of the National Academy of Sciences, in 1863, which specified its “obligation … to provide scientific and technical advice to any department of the Government … with the Academy receiving no compensation for its services” (italics added). As I noted in a 2010 paper, this prescient understanding of the need to ensure the purity and credibility of scientific inquiry is usually credited to President Lincoln.
Finally, what about the oft-repeated claim that the private sector does R&D more efficiently and with better results? This is another favorite theme among the market faithful who lobby for shrinking government in favor of unregulated capitalism. With all due respect, a good place to explore that question might be the next Starship launchpad.
Speaking of space travel, we are approaching the 70th anniversary of Sputnik, a good occasion to think historically, be grateful for the incredible advances we have achieved thanks to investments in basic research, and unite in support of the government-university-commercial partnership that made American science great. We should exploit a rarity these days, namely a strong and enduring bipartisan consensus: As James Pethokoukis of the conservative-leaning American Enterprise Institute argued recently, “It’s a no-brainer that American public policy should aim to significantly increase both government and private-sector R&D investment to boost innovation-driven productivity and economic growth.”
I’m for “yes-brainers.” Let’s follow the evidence and make sure we don’t let “the endless frontier” end on our watch.
Acknowledgements and disclosures
The author thanks Sandy Baum, George Bohrnstedt, Daryl Chubin, Sarah Feuer, Elaine Kamarck, Mike McPherson, and Holden Thorp for helpful comments.