A decade ago, scientists expected the quantum technology revolution to lie in the distant future—say, 30 years out. But recent developments have shortened that timeline dramatically, generating greater confidence than ever that the quantum revolution really is around the corner, although major technical challenges remain to be overcome. Just as U.S. government funding and policy allowed the internet to flourish, the United States now has the opportunity to help shepherd the next revolution in technology.
Today, the field of quantum information science and technology (QIST) stands at the cusp of a series of breakthroughs that could finally bring quantum technology—and the great benefits it will likely bring with it—into the mainstream. But progress in QIST is fragile, and sustaining this progress requires investment and coordination by the U.S. government and a continued policy of openness toward the scientists who will deliver these breakthroughs.
The promise of quantum technology
Understanding the state of the art and the potential payoffs of QIST requires delving into the technical details. While quantum computing is the best known technology in the field, quantum-enhanced sensing and communications are two other technologies of equal importance for delivering on the promise of quantum breakthroughs.
Quantum computers endeavor to use the unique quantum-mechanical behaviors of elementary objects—atoms, electrons, or photons—to process information in ways that vastly outperform conventional digital computers in some tasks. By using the states of quantum objects, called “qubits,” to represent and process information, quantum computers promise to solve certain types of problems far more rapidly, such as developing materials and drugs; designing certain artificial intelligence applications; searching for patterns in large graphically represented data sets; solving certain logistics problems (such as scheduling and planning); and breaking common forms of message encryption.
Quantum computers are still in their infancy. What gives us hope that useful quantum computing will happen in the foreseeable future is a concept called fault-tolerant computing, underpinned by a mathematical result—the threshold theorem—that puts a sharp limit on the amount of error permissible in each computational operation on qubits. If errors are kept below this limit, then fault-tolerant quantum computation is possible with a manageable amount of overhead in the number of additional error-prone qubits, blazing a path toward larger-scale quantum computing. A few leading research groups have already met and surpassed the fault-tolerance threshold, although with small numbers of qubits. For example, a consortium of U.S.-government-funded researchers recently demonstrated a system of rudimentary quantum error correction, while similar advances have been reported by a team of scientists in Europe.
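The intuition behind the threshold idea can be illustrated with a classical analogue (a deliberately simplified sketch—real quantum codes, such as the surface code, are far more involved): a three-bit repetition code with majority-vote decoding suppresses errors whenever the per-bit error probability is below one half.

```python
import random

def logical_error_rate(p):
    """Three-bit repetition code with majority-vote decoding:
    decoding fails only when two or three of the bits flip."""
    return 3 * p**2 * (1 - p) + p**3

def simulate(p, trials=100_000, seed=1):
    """Monte Carlo check of the formula above."""
    rng = random.Random(seed)
    failures = 0
    for _ in range(trials):
        flips = sum(rng.random() < p for _ in range(3))
        failures += flips >= 2  # majority vote decodes incorrectly
    return failures / trials

if __name__ == "__main__":
    for p in (0.2, 0.1, 0.01):
        print(f"physical error {p:.2f} -> logical error {logical_error_rate(p):.5f}")
```

Below this toy code's threshold (p = 1/2), encoding helps, and repeating the trick at larger scales suppresses errors further; above it, encoding makes things worse. The quantum threshold theorem establishes an analogous—much lower—threshold for noisy quantum gates.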
Quantum sensors respond to extremely subtle changes in their surroundings in a way that provides useful information. They can outperform conventional sensors in some applications, although their cost, performance, and portability tradeoffs have not yet made them viable for wide commercial use. An example is using enhanced gravity-strength sensors to “image” dense structures underground, such as sub-city infrastructure, mineral or oil deposits, and groundwater flows. Another is using ultrasensitive quantum-based accelerometers for navigation in places where the Global Positioning System is not available, such as underwater, underground, or in the event of a catastrophic GPS failure.
Quantum communications systems will transport quantum information from one place to another by encoding it into a stream of photonic qubits and sending these photons through optical fibers, the air or near-Earth space. This process is analogous to how classical information—such as voice signals, emails, and video—is transported. Just as today’s internet enables what we do with conventional information technology, or IT, a quantum internet will be needed for interoperation of quantum information technologies, or QIT. Quantum communications systems span the entire range of distance scales: They are needed at small scales inside a quantum computer to connect quantum processing modules with each other and with computer memories that store quantum information; they are needed to interconnect separate quantum computers within a facility such as a research lab or data center; and they are needed to interconnect different facilities across interstate or even intercontinental spans.
Identifying the cutting edge
Some of the most exciting work in QIST lies at the boundary of quantum science and industry innovation. Scientists are making astounding breakthroughs in atomic clocks, quantum sensors, quantum communication networks, and quantum computers through a combination of experimental and theoretical research. New startups seem to pop up monthly, if not weekly, and at least one, IonQ, is publicly traded on the U.S. stock market. Nonetheless, a gap remains between the promise of quantum technology and what is commercially available, though efforts are being made to close it. Companies sell systems for all-but-unhackable encryption for internet applications, quantum-guaranteed random numbers for encryption and mathematics research, web access to small but powerful quantum computers, quantum computing software, and the many hardware bells and whistles needed to make such technologies work.
Many in the quantum research community believe that quantum sensors might prove to be the earliest quantum technology to impact society at large. Auto companies are already investing in research in quantum techniques for sensing the motion of vehicles. AOSense, a company based in Fremont, California, manufactures atom-optics-based sensors for precision navigation, time-and-frequency standards, as well as precision gravity measurements. ColdQuanta in Boulder, Colorado, is pursuing both quantum computing and quantum sensing using cold-atom qubits, potentially enabling powerful networks of entangled sensors. Quantum sensors could provide major advances in biomedical imaging and chemical sensing. They might also be used in telescopes that could resolve distant objects far better than conventional telescopes, and in radio-frequency antennas with quantum-enhanced sensitivity. The measurement precision achievable by the Laser Interferometer Gravitational-Wave Observatory (LIGO)—a large-scale physics experiment and observatory designed to detect cosmic gravitational waves—has recently been improved substantially by using quantum-squeezed light.
Finally, many companies have emerged in recent years that are focused on the electronics and software side of things. For example, Riverlane, a company based in Cambridge, United Kingdom, is building operating systems for fault-tolerant quantum computing. Headquartered in Boston, Zapata Computing is developing open-source quantum algorithm libraries and software for chemistry, logistics, finance, oil and gas, aviation, pharmaceuticals, and materials, aimed at delivering real-world advances in computational power for applications. A huge number of companies worldwide are engaged in various aspects of electronics, control, software, and protocols for the various forms of quantum computers under development.
The challenge of quantum communication
Even as QIST rapidly advances, key engineering obstacles remain, particularly around quantum communication. To contemplate the challenge of quantum communication, consider the following question: What is information? Information serves as an intangible “resource” that allows you to do something you couldn’t do without it. To a data scientist, information is data, measured in megabytes. Quantum information is a form of information that cannot be written onto paper or encoded in a stream of bits (or even onto the electrons in an ordinary computer) but must be “written” into the very nature of exquisitely controlled and sequestered groups of atoms, electrons, or photons. This “very nature” of a quantum object is described by its “quantum state”—a mathematical tool that has been around since the 1920s and has more recently been recognized as a central concept in information theory. Just as the “classical state” of a physical entity (e.g., the trajectory of a projectile, the arrangement of paint in a painting, or the words in written prose) can be faithfully encoded into a stream of bits and transmitted over the internet, the “quantum state” of a physical entity can be faithfully encoded into a stream of qubits. The quantum state of a qubit—the quantum version of a binary digit—can be indeterminate between the two classical states 0 and 1 and is fragile in the presence of disturbances. Therefore, the challenging but primary task for a quantum communication system is to transfer qubits faithfully from one place to another.
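What “indeterminate between 0 and 1” means can be sketched with nothing but complex numbers (a pedagogical toy, not a description of any real hardware): a qubit's state is a pair of complex amplitudes whose squared magnitudes give the probabilities of reading out 0 or 1.

```python
import math
import random

# A qubit state is a pair of complex amplitudes (a, b) with |a|^2 + |b|^2 = 1.
# Here: an equal superposition, with a relative phase of i on the |1> amplitude.
a = 1 / math.sqrt(2)
b = complex(0, 1) / math.sqrt(2)

p0 = abs(a) ** 2  # probability of measuring 0
p1 = abs(b) ** 2  # probability of measuring 1
assert abs(p0 + p1 - 1) < 1e-12  # amplitudes are normalized

def measure(a, b, rng):
    """Born rule: the readout is 0 or 1 with probabilities |a|^2 and |b|^2."""
    return 0 if rng.random() < abs(a) ** 2 else 1

rng = random.Random(42)
counts = [0, 0]
for _ in range(10_000):
    counts[measure(a, b, rng)] += 1
print(counts)  # roughly half 0s and half 1s
```

Note that measurement yields only a single classical bit and destroys the superposition; the amplitudes themselves (including the phase of b) are never directly readable, which is part of why quantum information cannot simply be copied into a stream of classical bits.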
This act of quantum communication could be done by physically moving qubits that are stored in long-lived quantum memories based on trapped ions or superconductors. A far more promising approach is to encode the qubits into a long, redundant stream of photonic qubits and send them over light-transmitting optical fiber. The photonic encoding of qubits is the only method of communication that allows quantum information to travel long distances over optical fiber or through space quickly and without significant degradation. This is why photonic qubits are the only means to connect far-flung quantum computers with links that can carry qubits from one place to the other. A third method, more exotic but still practical, is so-called quantum teleportation, which transfers a quantum state from one quantum object to a distant quantum object. Teleportation of a quantum state requires the sender and receiver to use pre-shared entanglement, and to be able to communicate on the side using an authenticated classical communication channel, e.g., over the conventional internet.
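For a single qubit, the teleportation protocol is concrete enough to simulate on a laptop. The sketch below (plain Python, tracking the three-qubit state vector directly; qubit labels and gate helpers are our own illustrative choices) teleports a state from qubit 0 to qubit 2 via a shared Bell pair and two classical bits.

```python
import random

def apply_1q(state, g, q, n=3):
    """Apply a 2x2 gate g to qubit q of an n-qubit state vector
    (qubit 0 is the most significant bit of the basis index)."""
    out = state[:]
    s = n - 1 - q
    for i in range(len(state)):
        if not (i >> s) & 1:
            j = i | (1 << s)
            out[i] = g[0][0] * state[i] + g[0][1] * state[j]
            out[j] = g[1][0] * state[i] + g[1][1] * state[j]
    return out

def apply_cnot(state, ctrl, tgt, n=3):
    """Flip the target qubit wherever the control qubit is 1."""
    out = state[:]
    cs, ts = n - 1 - ctrl, n - 1 - tgt
    for i in range(len(state)):
        if (i >> cs) & 1:
            out[i] = state[i ^ (1 << ts)]
    return out

H = [[2 ** -0.5, 2 ** -0.5], [2 ** -0.5, -(2 ** -0.5)]]  # Hadamard gate

def teleport(a, b, rng=random):
    """Teleport the state a|0> + b|1> from qubit 0 to qubit 2."""
    state = [0] * 8
    state[0b000], state[0b100] = a, b         # message on qubit 0
    state = apply_1q(state, H, 1)             # Bell pair on qubits 1 and 2
    state = apply_cnot(state, 1, 2)
    state = apply_cnot(state, 0, 1)           # sender's Bell measurement...
    state = apply_1q(state, H, 0)
    outcomes = [(m0, m1) for m0 in (0, 1) for m1 in (0, 1)]
    weights = [sum(abs(state[m0*4 + m1*2 + k])**2 for k in (0, 1))
               for m0, m1 in outcomes]
    m0, m1 = rng.choices(outcomes, weights)[0]  # ...yields two classical bits
    bob = [state[m0*4 + m1*2 + k] for k in (0, 1)]  # collapsed qubit 2
    norm = (abs(bob[0])**2 + abs(bob[1])**2) ** 0.5
    bob = [x / norm for x in bob]
    if m1: bob = [bob[1], bob[0]]             # X correction, from bit m1
    if m0: bob = [bob[0], -bob[1]]            # Z correction, from bit m0
    return bob

out = teleport(0.6, 0.8)
assert abs(out[0] - 0.6) < 1e-9 and abs(out[1] - 0.8) < 1e-9
```

Whatever the random measurement outcome, the corrections restore the original amplitudes on the receiver's qubit. The two classical bits must travel over an ordinary channel, which is why teleportation cannot transmit information faster than light.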
The concept of quantum-state entanglement is fundamental to quantum systems, which use entanglement to gain their advantage over conventional systems. This slippery concept refers to the ability of physically separated objects to maintain strong, highly structured, and synchronized correlations of their behaviors. While the deep origin of this capability, which does not exist in classical physics, is somewhat beyond a common-sense, intuitive understanding, physicists have come to regard it as routine, and thus are able to design QIT systems that exploit such correlations for processing and transporting quantum information.
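These “strong, highly structured” correlations can be quantified. The short calculation below (a sketch; the measurement angles are the standard choice for the CHSH test) computes the correlations of a two-qubit Bell state and shows they exceed the bound of 2 that any classical, locally correlated system must obey.

```python
import math

def kron(A, B):
    """Kronecker product of two 2x2 matrices, giving a 4x4 matrix."""
    return [[A[i // 2][j // 2] * B[i % 2][j % 2] for j in range(4)]
            for i in range(4)]

def sigma(t):
    """Spin measurement along angle t in the X-Z plane: cos(t)Z + sin(t)X."""
    return [[math.cos(t), math.sin(t)], [math.sin(t), -math.cos(t)]]

# The Bell state (|00> + |11>) / sqrt(2)
psi = [2 ** -0.5, 0, 0, 2 ** -0.5]

def E(ta, tb):
    """Correlation <psi| sigma(ta) (x) sigma(tb) |psi> of the two outcomes."""
    op = kron(sigma(ta), sigma(tb))
    return sum(psi[i] * op[i][j] * psi[j]
               for i in range(4) for j in range(4))

# CHSH combination: any locally correlated classical system gives |S| <= 2.
a, ap, b, bp = 0, math.pi / 2, math.pi / 4, -math.pi / 4
S = E(a, b) + E(a, bp) + E(ap, b) - E(ap, bp)
print(S)  # 2*sqrt(2), about 2.83 -- beyond the classical limit of 2
```

The quantum value 2√2 is exactly the kind of stronger-than-classical correlation that networked quantum sensors and distributed quantum computers exploit.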
The great challenge is generating and keeping quantum entanglement “alive” over large distances and long times. Over distances of up to about a hundred kilometers, photons traveling through optical fibers (the same fibers that presently transmit most internet data) can do the job. Beyond a few hundred kilometers, photons are lost or disrupted. The ultimate scheme for long-range communication is likely to be transmission through fiber aided by yet-to-be-built devices called quantum repeaters. A quantum repeater can, in theory, receive and resend quantum states, enabling quantum entanglement to be created across very large distances at very high rates and at very high fidelities. Creation of a quantum repeater is a sought-after breakthrough technology and is being pursued, for example, at the new National-Science-Foundation-funded and University-of-Arizona-led Center for Quantum Networks, of which both authors are members.
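The scale of the distance problem follows from ordinary fiber loss. Assuming the standard figure of roughly 0.2 dB per kilometer for telecom fiber, the probability that a single photon survives falls off exponentially with distance, which is why direct transmission stalls beyond a few hundred kilometers and repeaters that divide the path into shorter segments are needed. A back-of-the-envelope sketch:

```python
def transmissivity(km, loss_db_per_km=0.2):
    """Probability that a photon survives `km` of fiber at the given loss rate."""
    return 10 ** (-loss_db_per_km * km / 10)

for km in (50, 100, 500, 1000):
    print(f"{km:5d} km: survival probability {transmissivity(km):.3e}")
# At 1000 km the survival probability is 1e-20: even a perfect 10 GHz
# single-photon source would deliver roughly one photon every few centuries.
```

A chain of repeaters spaced every 50 km, by contrast, needs each short segment to succeed only with probability 0.1, and entanglement swapping at the repeater nodes can in principle stitch those segments together without ever sending a photon end to end.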
The payoffs of a communication system capable of creating entanglement across long distances will be diverse and, as with the existing internet, hard to predict. New capabilities to be provided may include, for example: 1) secure quantum computing in which a remote user sends a posed problem encoded in a stream of qubits to a quantum computer, which sends back a stream of qubits containing a proposed solution, leaving no trace in the quantum computer of either the problem posed or the solution; 2) distributed quantum computing in which two or more quantum computers can team up to solve a hard problem in a fundamentally more powerful fashion than they could if they were collaborating just using classical information exchange over the internet; 3) arrays of quantum-enabled sensors linked by entanglement, which would outperform conventional sensor arrays that communicate classically.
A policy agenda for quantum computing
Just as federal investment in the 1960s to the 1980s incubated the breakthrough technologies that made today’s internet possible, U.S. policymakers now have an opportunity to facilitate major advances in quantum computing—and make them as widely available as possible. As with the internet, the development of a quantum internet and associated systems like quantum computers and quantum sensors should be aimed initially at providing new capabilities to scientists and other researchers to make new discoveries. To this end, it is important to provide open access to those who wish to use federally supported infrastructure for research. Private companies have a significant role to play by making available open platforms for quantum computing, like IBM’s Quantum Experience.
Ensuring progress in the development of QIST requires sustained funding. The passage in 2018 of the National Quantum Initiative Act funneled $1 billion toward quantum research between 2019 and 2021 and laid the foundation for the U.S. National Quantum Initiative (NQI), an all-of-government effort to nurture and invest in the development of QIST and the quantum workforce. By supporting a “science-first” approach that aims to solve grand challenges in the field that have a potential for transformative impact, the NQI is creating the building blocks for scientific breakthroughs. In the past two years alone, the Department of Energy (DoE) and National Science Foundation (NSF) have funded 12 U.S.-led centers with budgets ranging from $50 to $100 million per center to study sub-disciplines and technical approaches in QIST. While some duplication of efforts is likely healthy and necessary to bring about creative scientific outcomes, more coordination at the leadership level of these NSF and DoE centers to enable sharing expertise and coordinating technology transition would be beneficial to the NQI’s mission.
But in order for university researchers to translate theoretical advances into practical breakthroughs, they need better engineering support infrastructure. To address this and related challenges, the U.S. Congress recently funded the creation of the National Science Foundation’s first new directorate in 30 years—the Directorate for Technology, Innovation, and Partnership (TIP)—whose purpose is to “advance science and engineering research and innovation leading to breakthrough technologies…and accelerate the translation of fundamental discoveries from lab to market.” Based on community-defined metrics, the NSF, the DoE, and the National Institute of Standards and Technology (NIST) should continue to strengthen their programs fostering use-inspired, real-world QIST. In particular, the NSF’s new TIP directorate should encourage and support goal-oriented and translational research and development projects that will likely lead to a convincing quantum advantage and/or foster technology innovation that will support such projects.
The Biden-Harris administration appears to recognize the promise of and risks posed by quantum technology. In an executive order signed last month, President Joe Biden elevated the National Quantum Initiative Advisory Committee to report directly to the White House, a move aimed at providing “the most current, accurate, and relevant information on quantum information science and technology to drive forward U.S. policymaking and advance our technological edge.” An accompanying national security memorandum laid out a strategy for addressing the cybersecurity risks posed by quantum computers capable of breaking many forms of encryption. Taken together, these orders position the United States to remain a global leader in quantum science and technology development, while coordinating with the private sector in a way that protects national security interests.
Another key challenge in delivering on the promise of quantum technology is the shortage of quantum-ready workers to help this sector grow, and leaders in academia, government, and industry are working to devise education and training opportunities to help close this gap. Because QIST overlaps basic science and industry-relevant engineering, these players are working to design new university programs and curricula for a new subject area, quantum engineering.
While protection of national competitiveness is important in the commercial and government sectors, in science overprotection is detrimental. No country is a closed ecosystem in research innovations and capabilities. Each country should be able to recruit the best-suited talent for its needs. The evidence supporting this view is historical and vast. The list of U.S. science leaders—Nobel Prize winners and tech founders—who are immigrants “recruited” by the country’s openness is long and legendary. More so than money, creative individuals are the currency on which top-level research trades. Rather than closing borders to “protect” U.S. science and technology, policymakers should work hard to attract and retain the very best talent the world has to offer. If the U.S. doesn’t, other countries will.
Of course, in certain areas scientific espionage—whether by outright spies or by other less devious individuals—needs to be kept to a minimum. Some within the U.S. government are concerned about the potential for ambitious graduate students from other countries to be sent to the U.S. and funded by their home government with expectations of reporting back home what they have seen or learned. Rather than banning students from certain countries from studying in the U.S., instituting a system of scientific professionalism and accountability would allow foreign students to study in the United States while addressing security concerns. Under such a scheme, foreign students funded by their home country would be asked to sign a statement declaring whether they are required to report formally to their home government as a condition of their being supported financially. Each university would have its own way of dealing with any discovered misrepresentation in such statements—from a warning to disenrollment. Such a system would be consistent with maintaining the openness of science along with scientific professionalism, while ensuring the United States is able to attract the talent necessary to deliver the scientific breakthroughs to realize the promise of quantum science.
Overall, QIST has made extraordinary advances in recent years. The potential for further breakthroughs and their ensuing benefits is enormous, yet major challenges remain. American policymakers should do all they can to help realize that promise.
Michael G. Raymer is a professor of physics, the Philip H. Knight Professor of Liberal Arts and Sciences Emeritus at the University of Oregon, and the author of a book for the general reader, Quantum Physics: What Everyone Needs to Know.
Saikat Guha is a professor of optical sciences, electrical engineering and applied mathematics and the Peyghambarian Endowed Chair of Optical Sciences at the University of Arizona, Tucson. He is the director of the National Science Foundation-funded Engineering Research Center, the Center for Quantum Networks.
IBM provides financial support to the Brookings Institution, a nonprofit organization devoted to rigorous, independent, in-depth public policy research.