We finally understand what IBM’s quantum volume means

IBM presented its quantum computing roadmap at the beginning of September: by the end of 2023 it intends to build a quantum processor, Condor, with 1,121 qubits. That sounds far more impressive than IBM’s previous announcement of a quantum volume of 64, except that the two figures do not measure the same thing at all.


“The Condor’s 1,121 qubits will be physical qubits, aluminum molecules, called transmons, to which we send pulses that correspond to the instructions of an algorithm. The more qubits there are, the more pulses can be sent in parallel and the more complex the algorithms that can be written. The challenge is to reach the quantum advantage, i.e., the threshold above which a quantum processor becomes more efficient than a conventional processor at solving such complex problems,” Olivier Hess, head of quantum activities in France for IBM Q, the IBM division responsible for quantum computers, explains to LeMagIT.

“The problem is that the algorithm can only run as long as the transmons remain in a superposed state. That duration depends on some twenty parameters, including the number of qubits but also the type of each pulse. The performance of a quantum processor therefore cannot be measured by its number of qubits alone. This is why we conceived quantum volume: a measurement that averages these twenty parameters and lets us situate a processor’s performance more accurately.”

Quantum volume, however, is calculated a posteriori. We do not yet know what quantum volume the Condor will reach, any more than we know it for the quantum processors IBM will develop in the meantime: Eagle, with 127 qubits, in 2021, and Osprey, with 433 qubits, in 2022. Nor do we yet know the quantum volume of Hummingbird, the 65-qubit processor IBM unveiled a few days ago. It was only last August that IBM had gathered enough observations to calculate that its 27-qubit Falcon processor, launched a year earlier, would eventually reach a quantum volume of 64.

Quantum volume for standardizing the measurement of power…

For now, the quantum volume put forward by IBM looks set to become the standard for measuring the performance of quantum processors. The other players developing this type of machine no longer hide their intention to achieve better scores on it.

Except that it can be a misleading yardstick for comparing quantum computers that have nothing in common. We have since learned that Honeywell, which boasted last June of being the first to reach a quantum volume of 64, was in fact talking about a measurement made on a home-grown quantum processor with just… 5 qubits.

One could conclude from this that IBM’s technology, based on the superconductivity of aluminum molecules cooled cryogenically in a 3.5-meter-high vessel, is better at etching a large number of qubits, and that Honeywell’s, which consists of trapping ions with lasers, is better at keeping them working for a long time. But that is not the issue. The issue is that an algorithm that sends only a few consecutive instructions to a large number of parallel computing units (IBM’s case) presumably has little in common with an algorithm that sends a large number of consecutive instructions to a few parallel computing units (Honeywell’s case).

… and for the standardization of quantum algorithms

Yet beyond raw performance and the race to the famous quantum advantage, one of the quantum computer’s greatest economic challenges lies precisely in how its algorithms are written.

“This is why I am based in Montpellier, a stone’s throw from the university of science and technology with which the IBM Q unit works closely. We have two missions. The first is to work on the use cases that lend themselves to quantum processing and to program their algorithms. The second is to work out how we will train the programmers of these algorithms,” says Olivier Hess.

Olivier Hess is enthusiastic about the idea that these efforts will help make Montpellier the cradle of the next ecosystem of startups dedicated to quantum computing, which would make France one of the industry’s main epicenters. (The IBM Q research centers in Watson, USA, and Zurich, Switzerland, are devoted more to the design of the processors themselves.) Provided, that is, that IBM’s quantum computer prevails: it is still entirely unclear whether the algorithmic know-how built up in Montpellier would transfer to Honeywell’s quantum machine, or to the other quantum architectures that France’s Atos has identified with a view to initially selling quantum accelerators for supercomputers.

Finding straight away the solution that best fits a problem

Let us recap. The goal of the quantum computer is to find the right solution to a problem more quickly, not by evaluating every possibility one by one as a classical computer does, but by immediately finding the one that best fits the problem. This ability is theoretically made possible by the properties of quantum physics: at the microscopic scale, a particle can be in all energy states at once (“quantum superposition”) until it is sent an electromagnetic pulse that freezes it in a definite state (“quantum decoherence”), the state that best matches the pulse it received.

To transfer this physical principle to computing, researchers hit on the idea of using particles the way electronics uses transistors. The point is to arrange them in a given topology so that freezing some particles in a given state influences, according to the algorithm, the kind of pulses then sent to the other particles. At the end, all the particles are frozen in a definite state, either 0 or 1, and together they spell out the result of the operation in binary.
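The superposition-then-collapse behavior described above can be mimicked numerically. The sketch below is a toy single-qubit statevector simulation in plain Python; it has nothing to do with IBM’s actual control stack, and the function names are ours, chosen for illustration:

```python
import math
import random

# A one-qubit state is a pair of complex amplitudes (alpha, beta)
# with |alpha|^2 + |beta|^2 = 1; measuring yields 0 or 1 with
# probabilities |alpha|^2 and |beta|^2.
def hadamard(state):
    """The pulse that puts a basis state into an equal superposition."""
    a, b = state
    s = 1 / math.sqrt(2)
    return (s * (a + b), s * (a - b))

def measure(state, rng=random.random):
    """'Decoherence': the state collapses to 0 or 1 with probability |amp|^2."""
    a, _ = state
    return 0 if rng() < abs(a) ** 2 else 1

qubit = (1 + 0j, 0 + 0j)      # starts frozen in state |0>
qubit = hadamard(qubit)       # now superposed: 50/50
print(round(abs(qubit[0]) ** 2, 3))  # probability of reading 0 -> 0.5
```

Calling `measure(qubit)` repeatedly would return 0 about half the time and 1 the other half, which is the binary readout the article describes.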

In classical computing, the connections between transistors are physically etched into the silicon of processors. Logic gates (AND, OR, XOR, etc.) are drawn and grouped into electronic circuits, to perform an arithmetic operation here, store information in a register there, compare two registers, jump to another point in the program, and so on. These electronic circuits correspond to the instructions a program can execute.
To be precise, we are talking about assembly instructions, which the processor interprets directly and which are generated automatically when an application is compiled. Nowadays that application is written by a developer in a high-level language (Java, C, Python… or even in an entirely graphical low-code environment) that is far more understandable to a human.
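To make the idea of gates grouped into a fixed circuit concrete, here is a minimal Python sketch of a half-adder, the kind of small arithmetic circuit that is physically etched into a classical processor (the functions simply model the gates named above):

```python
# Elementary logic gates, as Python functions over bits 0/1.
def NOT(a):
    return 1 - a

def AND(a, b):
    return a & b

def OR(a, b):
    return a | b

# XOR built by wiring the gates above together, the way a circuit would.
def XOR(a, b):
    return AND(OR(a, b), NOT(AND(a, b)))

def half_adder(a, b):
    """Adds two bits: returns (sum, carry)."""
    return XOR(a, b), AND(a, b)

print(half_adder(1, 1))  # (0, 1): 1 + 1 = binary 10
```

Chaining such circuits is what turns etched silicon into executable instructions; the quantum developer, as the article explains next, has no such etched circuits to rely on.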

No circuit is etched in quantum computing. “The only thing we etch are transmons, that is, collections of particles that behave like an aluminum atom, but are larger than an aluminum atom so as to remain in a superposed state for as long as possible,” explains Olivier Hess.

Quantum programming: more FPGA than data science

“Our current know-how lets us place 65 transmons on a single surface; they can remain superposed for about a hundred microseconds and, during that time, receive a series of electromagnetic pulses at 5 GHz. By improving our cryogenics and microscopic assembly techniques, we aim to reach 1,121 transmons by 2023, which in theory should remain in a superposed state for at least as many microseconds,” he explains.

In short, the quantum algorithm developer’s task is tedious in that he must first define the connections between the transmons himself, so that they behave like logic gates in the first place. The expertise required at this stage is closer to that of an electronics engineer programming the circuitry of an FPGA than to that of a data scientist juggling high-level languages to predict the most likely trajectory of a stock price, one of the quantum computer’s target applications.

“The developer has about fifteen instructions for writing his algorithms. Among them are the one that puts a transmon into a superposed state at the very start of the algorithm, the one that synchronizes the state of one transmon with that of another (“quantum entanglement”), and the one that triggers decoherence at the end of the algorithm,” says the IBM manager.
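The three instruction families Olivier Hess lists (superposition, entanglement, final readout) can be illustrated with a toy two-qubit statevector in plain Python. This is our own didactic sketch, not IBM’s instruction set; the state is a list of four amplitudes for the outcomes 00, 01, 10, 11:

```python
import math

S = 1 / math.sqrt(2)

def h_on_first(v):
    """Superposition: a Hadamard-style pulse on the first qubit."""
    a00, a01, a10, a11 = v
    return [S * (a00 + a10), S * (a01 + a11), S * (a00 - a10), S * (a01 - a11)]

def cnot(v):
    """Entanglement: flip the second qubit whenever the first is 1."""
    a00, a01, a10, a11 = v
    return [a00, a01, a11, a10]

def probabilities(v):
    """Readout: the chance of each binary outcome 00, 01, 10, 11."""
    return [abs(a) ** 2 for a in v]

state = [1, 0, 0, 0]             # both qubits start in |0>
state = cnot(h_on_first(state))  # a Bell state: only 00 and 11 survive
print([round(p, 3) for p in probabilities(state)])  # [0.5, 0.0, 0.0, 0.5]
```

The point of the example: after just two instructions the qubits are correlated, so a final measurement reads either 00 or 11, never 01 or 10.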

The quantum volume primarily measures the possible complexity of algorithms

It should be noted that, contrary to what the famous Schrödinger’s cat thought experiment, which supposedly sums up quantum physics on its own, might suggest, it is possible here to send several electromagnetic pulses one after another to a superposed transmon without immediately causing it to collapse into a given state.

The number of consecutive electromagnetic pulses is limited, however: beyond a certain number, the transmon collapses into a definite state before its 100 microseconds are up. That number is also said to depend on the pulses themselves: some instructions wear the particles out faster than others. Account must also be taken of the fact that the device transmitting the 5 GHz pulses to the transmons is no more capable than a conventional computer of sending a new signal every 0.0000000002 seconds (one second divided by five billion), which in theory leaves at most 2,000 electromagnetic pulses per transmon over the 100 microseconds. Combining all these parameters with the number of available transmons is what makes it possible to calculate the quantum volume.
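IBM’s published definition condenses all of this into one number: log2 of the quantum volume is the largest width m at which “square” random circuits, m qubits deep by m qubits wide, still pass a statistical (heavy-output) test. A minimal illustration in Python, using entirely made-up benchmark depths rather than real measurements of any IBM processor:

```python
# Hypothetical benchmark results: for each circuit width m (number of
# qubits used), the greatest depth d(m) at which the processor still
# passes the heavy-output test. These figures are illustrative only.
achievable_depth = {2: 12, 3: 9, 4: 6, 5: 5, 6: 3, 7: 2}

# Quantum volume keeps the largest m for which a "square" circuit
# (width m, depth m) still succeeds: log2(QV) = max over m of min(m, d(m)).
log2_qv = max(min(m, d) for m, d in achievable_depth.items())
quantum_volume = 2 ** log2_qv

print(log2_qv, quantum_volume)  # widest square circuit, then 2 to that power
```

With these invented figures the widest workable square circuit is 5 by 5, giving a quantum volume of 32; Falcon’s announced quantum volume of 64 would correspond to 6-qubit, 6-deep circuits.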

“I invite everyone to try developing these algorithms free of charge by registering on the IBM Quantum Experience portal. We have tried to build very graphical tools that represent the qubits, and the sequences of instructions sent to them, like musical staves. It’s a lot of fun,” concludes Olivier Hess.