IBM wants to build a 100,000-qubit quantum computer

Late last year, IBM took the record for the largest quantum computing system with a processor that contained 433 quantum bits, or qubits, the fundamental building blocks of quantum information processing. Now, the company has set its sights on a much bigger target: a 100,000-qubit machine that it aims to build within 10 years.

IBM made the announcement on May 22 at the G7 summit in Hiroshima, Japan. The company will partner with the University of Tokyo and the University of Chicago in a $100 million initiative to push quantum computing into the realm of full-scale operation, where the technology could potentially tackle pressing problems that no standard supercomputer can solve.

Or at least they can’t solve them alone. The idea is that the 100,000 qubits will work alongside the best “classical” supercomputers to achieve new breakthroughs in drug discovery, fertilizer production, battery performance, and a host of other applications. “I call this quantum-centric supercomputing,” IBM’s VP of quantum, Jay Gambetta, told MIT Technology Review in an in-person interview in London last week.

Quantum computing holds and processes information in a way that exploits the unique properties of fundamental particles: electrons, atoms, and small molecules can exist in multiple energy states at once, a phenomenon known as superposition, and the states of particles can become linked, or entangled, with one another. This means that information can be encoded and manipulated in novel ways, opening the door to a swath of classically impossible computing tasks.
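
In the standard notation (not spelled out in the article, but worth making concrete), a single qubit in superposition and a pair of entangled qubits look like this:

    \[ |\psi\rangle = \alpha\,|0\rangle + \beta\,|1\rangle, \qquad |\alpha|^2 + |\beta|^2 = 1 \]
    \[ |\Phi^{+}\rangle = \tfrac{1}{\sqrt{2}}\bigl(|00\rangle + |11\rangle\bigr) \]

The first state is neither 0 nor 1 until it is measured; in the second, measuring either qubit instantly fixes the outcome of the other.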

As yet, quantum computers have not achieved anything useful that standard supercomputers cannot do. That is largely because they haven’t had enough qubits and because the systems are easily disrupted by tiny perturbations in their environment that physicists call noise. 

Researchers have been exploring ways to make do with noisy systems, but many expect that quantum systems will have to scale up significantly to be truly useful, so that they can devote a large fraction of their qubits to correcting the errors induced by noise. 

IBM is not the first to aim big. Google has said it is targeting a million qubits by the end of the decade, though error correction means only 10,000 will be available for computations. Maryland-based IonQ is aiming to have 1,024 “logical qubits,” each of which will be formed from an error-correcting circuit of 13 physical qubits, performing computations by 2028. Palo Alto–based PsiQuantum, like Google, is also aiming to build a million-qubit quantum computer, but it has not revealed its time scale or its error-correction requirements. 
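
A back-of-the-envelope comparison, using only the figures quoted above, shows how much of each machine error correction would consume (a sketch; real overheads depend on the error-correcting code and on physical error rates):

    # Error-correction overheads implied by the figures quoted above.
    # Illustrative ratios only: actual overheads depend on the chosen
    # error-correcting code and on physical qubit error rates.
    plans = {
        # name: (physical qubits, logical/usable qubits)
        "Google": (1_000_000, 10_000),  # ~10,000 usable after error correction
        "IonQ": (1_024 * 13, 1_024),    # 13 physical qubits per logical qubit
    }

    for name, (physical, logical) in plans.items():
        print(f"{name}: {physical:,} physical -> {logical:,} logical "
              f"({physical // logical}:1 overhead)")
    # Google: 1,000,000 physical -> 10,000 logical (100:1 overhead)
    # IonQ: 13,312 physical -> 1,024 logical (13:1 overhead)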

Because of those requirements, citing the number of physical qubits is something of a red herring—the particulars of how they are built, which affect factors such as their resilience to noise and their ease of operation, are crucially important. The companies involved usually offer additional measures of performance, such as “quantum volume” and the number of “algorithmic qubits.” In the next decade advances in error correction, qubit performance, and software-led error “mitigation,” as well as the major distinctions between different types of qubits, will make this race especially tricky to follow.
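
For reference, IBM defines quantum volume roughly as follows (a standard definition, not quoted in the article):

    \[ \log_2 \mathrm{QV} = \max_{n} \, \min\bigl(n, \, d(n)\bigr) \]

where d(n) is the largest depth at which the machine can run random n-qubit circuits and still produce the statistically expected (“heavy”) outputs more than two-thirds of the time. A device with quantum volume 64, say, can faithfully run square circuits six qubits wide and six layers deep.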

Refining the hardware

IBM’s qubits are currently made from rings of superconducting metal, which follow the same rules as atoms when operated at millikelvin temperatures, just a tiny fraction of a degree above absolute zero. In theory, these qubits can be operated in a large ensemble. But according to IBM’s own road map, quantum computers of the sort it’s building can only scale up to 5,000 qubits with current technology. Most experts say that’s not big enough to yield much in the way of useful computation. To create powerful quantum computers, engineers will have to go bigger. And that will require new technology.

One example of what’s needed is much more energy-efficient control of qubits. At the moment, each one of IBM’s superconducting qubits requires around 65 watts to operate. “If I want to do 100,000, that’s a lot of energy: I’m going to need something the size of a building, and a nuclear power plant and a billion dollars, to make one machine,” Gambetta says. “That’s obviously ludicrous. To get from 5,000 to 100,000, we clearly need innovation.”

IBM has already done proof-of-principle experiments showing that integrated circuits based on “complementary metal oxide semiconductor” (CMOS) technology can be installed next to the cold qubits to control them with just tens of milliwatts. Beyond that, he admits, the technology required for quantum-centric supercomputing does not yet exist: that is why academic research is a vital part of the project.
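
Scaling those per-qubit figures up makes the gap concrete (a sketch; the 50 mW value below is an assumed stand-in for “tens of milliwatts,” not an IBM specification):

    # Control-electronics power budget at 100,000 qubits, using the
    # per-qubit figures quoted above. The 50 mW value is an assumed
    # stand-in for "tens of milliwatts," not an IBM specification.
    n_qubits = 100_000
    watts_today = 65       # current control electronics, per qubit
    watts_cmos = 0.050     # assumed cryo-CMOS control, per qubit

    print(f"Current electronics: {n_qubits * watts_today / 1e6:.1f} MW")
    print(f"Cryo-CMOS control:   {n_qubits * watts_cmos / 1e3:.1f} kW")
    # Current electronics: 6.5 MW
    # Cryo-CMOS control:   5.0 kW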

The qubits will exist on a type of modular chip that is only just beginning to take shape in IBM labs. Modularity, essential when it will be impossible to put enough qubits on a single chip, requires interconnects that transfer quantum information between modules. IBM’s “Kookaburra,” a 1,386-qubit multichip processor with a quantum communication link, is under development and slated for release in 2025.

Other necessary innovations are where the universities come in. Researchers at Tokyo and Chicago have already made significant strides on components and communication technologies that could become vital parts of the final product, Gambetta says. He thinks there will likely be many more industry-academic collaborations to come over the next decade. “We have to help the universities do what they do best,” he says. Google is of the same mind: in a separate deal, it is devoting $50 million to fund quantum computing research at the same two universities.

Gambetta says the industry also needs more “quantum computational scientists,” people skilled in bridging the divide between the physicists creating the machine and the developers looking to design and implement useful algorithms.

Software that runs on quantum machines will be vitally important too. “We want to create the industry as fast as possible, and the best way to do that is to get people developing the equivalent of our classical software libraries,” Gambetta says. It’s why IBM has worked to make its systems available to academic researchers over the last few years, he says: IBM’s quantum processors can be put to work via the cloud using custom-built interfaces that require minimal understanding of the technicalities of quantum computing. He says there have been some 2,000 research papers written about experiments using the company’s quantum devices: “To me that’s a good indication of innovation happening.”
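
The best known of those interfaces is IBM’s open-source Qiskit library. A minimal sketch of the kind of program researchers write (run here on a local simulator rather than cloud hardware; the Bell-state circuit is our illustration, not an example from the article):

    # Build and sample a two-qubit entangled (Bell) state with Qiskit.
    # Runs on a local simulator; on IBM hardware the same circuit would
    # be submitted through the cloud instead.
    from qiskit import QuantumCircuit
    from qiskit_aer import AerSimulator

    qc = QuantumCircuit(2, 2)
    qc.h(0)                      # put qubit 0 into superposition
    qc.cx(0, 1)                  # entangle qubit 1 with qubit 0
    qc.measure([0, 1], [0, 1])   # read both qubits out

    counts = AerSimulator().run(qc, shots=1000).result().get_counts()
    print(counts)  # roughly {'00': 500, '11': 500}: the qubits always agree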

There is no guarantee that the $100 million earmarked for this project will be enough to achieve the 100,000-qubit goal. “There’s definitely risk,” Gambetta says.

Joe Fitzsimons, CEO of Horizon Quantum, a Singapore-based quantum software developer, agrees. “This is unlikely to be a completely smooth journey without surprises,” he says. 

But, he adds, it’s a risk that has to be taken: the industry has to confront the possibility of failure and attempt to overcome the technical challenges facing large-scale quantum computing. IBM’s plan seems reasonable, Fitzsimons says, although there are plenty of potential roadblocks. “At this scale, control systems will be a limiting factor and will need to evolve significantly to support such a large number of qubits in a reasonably efficient way,” he says.