Quantum computing is one of the biggest over-the-horizon issues facing Washington: a technology promising exponentially faster computation, with both huge potential and major risks, implicating everything from national security to materials science to agriculture. But there’s a big question mark looming over all of it: When, exactly, is any of this going to happen?

Quantum computing has generated scientific excitement for years without ever moving past the prototype phase. Ask most researchers about the main barriers to effective quantum computers, and they’ll list a series of technical hurdles that need surmounting before the devices can consistently outpace classical computers in solving real-world problems.

Jay Gambetta, the head of IBM Quantum, doesn’t dispute the existence of those hurdles. But as his team knocks down one theoretical domino after another, he worries about another hindrance to the quantum revolution: persistent skepticism of the technology from scientists themselves.

“If you want to go faster, you have to take more risk,” Gambetta told me during a recent visit to IBM’s quantum research center in Yorktown Heights, New York. “And it’s hard when you’ve got a new technology that you haven’t had experience with. There’s a lag in the system.”

The skepticism stems in part from the fact that quantum computers aren’t just faster: they’re different. Unlike classical computers, which operate on sequential pulses representing “bits” of either 0 or 1, quantum computers are built on “qubits,” particles kept in a fuzzy quantum state that allows them to act simultaneously as combinations of 0 and 1. In theory, that structure could allow a computer to solve hugely complex problems very fast. In practice, there are serious questions about how soon scientists will be able to harness the weird properties of quantum mechanics reliably.

Gambetta doesn’t share that skepticism. Within just a couple of years, he says, “we’ll be at a point where you will have a tool that is more powerful than any classical computer.” And he worries researchers across a variety of fields — from chemistry and medicine to machine learning and industrial optimization — are leaving valuable knowledge on the table by refusing to think big.

Researchers, he suggests, aren’t pushing the limits of the technology as hard as they should. “When your metrics are related to publishing papers, if you want to publish something, you want to do something that you know is going to work,” Gambetta said. Many of the scientists now tinkering with existing quantum computers, he explained, “limit the size of the problems they want to explore based on knowing what the results will be. They don’t push how to get the most out of the hardware.”

Gambetta admits technical problems like decoherence — where quantum states naturally break down, driving up error rates in quantum computations — remain a major challenge. But he points to IBM’s consistent progress across its seven-year timeline for quantum breakthroughs, and says IBM will unveil Osprey, its latest 433-qubit quantum chip, in a matter of weeks.

Given that rapid progress, Gambetta said his primary concern has shifted from whether quantum computers will work to whether scientists will have the confidence and know-how to produce new algorithms that can exploit their tremendous power in specific fields.

That’s where the U.S. government comes in.
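To make the bit-versus-qubit distinction above concrete, here is a minimal sketch — our illustration, not code from Gambetta or IBM — using Qiskit, IBM’s open-source quantum programming toolkit. It puts a single qubit into an equal superposition of 0 and 1, then simulates repeated measurements, which collapse it to one value or the other:

```python
# A minimal illustrative sketch (not IBM's code) of the qubit behavior
# described above, written with Qiskit, IBM's open-source quantum SDK.
from qiskit import QuantumCircuit
from qiskit.quantum_info import Statevector

qc = QuantumCircuit(1)  # one qubit, which starts in the definite state |0>
qc.h(0)                 # a Hadamard gate puts it in an equal superposition of 0 and 1

state = Statevector.from_instruction(qc)
print(state)  # both |0> and |1> carry amplitude ~0.707 (probability ~0.5 each)

# Measurement collapses the superposition: over many simulated runs,
# the qubit reads out 0 about half the time and 1 the other half.
print(state.sample_counts(shots=1000))  # e.g. {'0': 493, '1': 507}
```

A classical bit, by contrast, is always exactly 0 or 1; superposition, and the way many qubits interfere with one another, is what gives quantum algorithms their theoretical speedups.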
Gambetta wants Washington to leverage its Quantum User Expansion for Science and Technology (QUEST) program — a new project being stood up as part of the CHIPS and Science Act — to get advanced quantum computers in front of top scientists as soon as possible.

But the QUEST program remains largely unfunded. And without new appropriations from Congress, Gambetta fears the technical capabilities of quantum computers will quickly outpace the ability of researchers to use them effectively.

“If you want to get the best science to happen, you need to provide the scientists with the best tools,” he said. “That’s what the QUEST is set out to do. But is it enough? Maybe not.”