The role of a powerful classical computer in the evolution of quantum systems
Hello, this is Elena Kuznetsova, automation specialist at Sherpa Robotics. Today I have translated an article for you about why every quantum computer will need a powerful classical computer. Let's figure out together why a quantum computer needs a capable classical partner.
Fixing bugs in quantum computers: processing data at 100 TB per second
Quantum computing is one of those technologies that, despite promising prospects, has so far failed to demonstrate much practical use. Nevertheless, an ecosystem of startups has already formed around it, seeking to create more than just qubits. This may look like an attempt to capitalize on the hype around quantum technology, but it is worth paying attention to the problems these companies are solving. They can give us insight into the difficulties in quantum computing that even giants such as Amazon, Google, IBM, or Intel have not yet tackled.
One example is the British company Riverlane, which focuses on the huge amount of classical computing required for quantum hardware to work correctly. In particular, the company is targeting the massive volumes of data needed for a critical aspect of quantum error correction: determining when an error has occurred. Effective error correction in quantum systems requires not only a fast response, but also significant resources for analyzing the state of the qubits. It is estimated that achieving the desired level of correction will require processing data at enormous speeds, up to 100 terabytes per second.

This opens up new horizons for computing technology and highlights just how challenging the problems in quantum information science remain. Yet despite current limitations and uncertainties, the work of startups like Riverlane shows that, beyond the theory, practical solutions are already being actively developed to realize the potential of quantum computers. These efforts will not only help us better understand quantum computing, but also bring us closer to the point where it becomes truly useful for science and industry.
Error detection and data
All qubits, regardless of the underlying technology – be it cold atoms, superconducting transmons, or something else – are highly fragile and prone to losing their state both during operations and over time. These error rates place severe limits on the amount of computation that can be performed before an error becomes inevitable, making it virtually impossible to run most useful calculations directly on existing hardware qubits.
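To see why even small error rates are so restrictive, consider a simple back-of-envelope model: if each operation fails independently with probability p, the chance that a circuit survives N operations error-free is (1 − p)^N, which collapses once N approaches 1/p. The sketch below illustrates this with purely hypothetical numbers; the error rate is an assumption, not a figure from the article.

```python
# Back-of-envelope illustration (hypothetical numbers, not figures from the article):
# if each operation fails independently with probability p, the probability that a
# circuit of n_ops operations finishes without any error is (1 - p) ** n_ops.

def error_free_probability(p: float, n_ops: int) -> float:
    """Probability that n_ops operations all succeed, assuming independent errors."""
    return (1 - p) ** n_ops

p = 1e-3  # assumed physical error rate of 0.1% per operation
for n_ops in (100, 1_000, 10_000):
    print(f"{n_ops:>6} ops -> P(no error) = {error_free_probability(p, n_ops):.4f}")
# Once the operation count approaches 1/p, an error is almost inevitable, which is why
# useful algorithms with millions of operations need error correction.
```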
The generally accepted solution to this problem is to use so-called logical qubits. These qubits are created by linking multiple hardware qubits and distributing quantum information among them. Additional hardware qubits are connected to monitor for errors, which allows the errors to be corrected. Creating a single logical qubit can require dozens of hardware qubits, meaning that even the largest existing systems could only support about 50 reliable logical qubits.
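The article does not say which error-correcting code these counts assume, but the surface code is a common reference point. Here is a minimal sketch of how the physical-qubit count grows with the code distance; the specific code choice is an assumption made only for illustration.

```python
# Illustrative physical-qubit counts for one common scheme (the rotated surface code);
# the article does not specify which code Riverlane's figures assume.

def surface_code_qubits(distance: int) -> int:
    """Physical qubits per logical qubit for a rotated surface code of odd distance d."""
    data_qubits = distance ** 2          # hold the encoded quantum information
    ancilla_qubits = distance ** 2 - 1   # measured every round to produce syndrome bits
    return data_qubits + ancilla_qubits

for d in (3, 5, 11, 21):
    print(f"distance {d:>2}: {surface_code_qubits(d):>4} physical qubits per logical qubit")
# distance 3 -> 17, distance 21 -> 881: higher distance suppresses errors more strongly,
# but dozens to hundreds of hardware qubits go into every logical qubit.
```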
Riverlane founder and CEO Steve Brierley noted in an interview with Ars that error correction puts a strain not only on the hardware qubits, but also on the classical part of the system. Each qubit measurement used to monitor the system must be processed to detect and interpret possible errors. Performing even the simplest interesting calculations would require approximately 100 logical qubits, which means monitoring thousands of hardware qubits. More complex calculations may require thousands of logical qubits.
Error correction data (syndrome data, in the field's terminology) must be read out between operations, creating enormous amounts of data. “At scale, we're talking about a hundred terabytes per second,” Brierley said. “At a million physical qubits, we would process about a hundred terabytes per second, which is equivalent to global Netflix streaming.”
Moreover, this data must be processed in real time, otherwise the calculations will stall while waiting for error correction. For transmon-based qubits, syndrome data is generated roughly every microsecond, which means the data – potentially terabytes of it – must be processed at a rate of about a megahertz. Riverlane was founded to develop hardware capable of meeting this challenge.
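As a rough check on those figures, the arithmetic below multiplies the qubit count by the syndrome rate. The bare syndrome bits alone already amount to a fraction of a terabyte per second; with an assumed payload of raw digitized readout behind each measurement, the total lands near the quoted 100 TB/s. The per-measurement payload is an assumption chosen for illustration, not a number given in the article.

```python
# Rough arithmetic behind the quoted scale. The per-measurement payload below is an
# assumption chosen for illustration, not a number given in the article.

n_physical_qubits = 1_000_000   # the scale Brierley refers to
rounds_per_second = 1_000_000   # syndrome data roughly every microsecond (transmons)

# Even as bare bits, the syndrome stream is substantial:
bare_rate_bytes = n_physical_qubits * rounds_per_second / 8
print(f"bare syndrome bits: {bare_rate_bytes / 1e12:.3f} TB/s")   # ~0.125 TB/s

# The raw digitized readout behind each measurement is far larger; assuming on the
# order of 100 bytes of samples per measurement lands near the article's figure:
bytes_per_raw_measurement = 100  # assumed, for illustration only
raw_rate_bytes = n_physical_qubits * rounds_per_second * bytes_per_raw_measurement
print(f"raw readout data:   {raw_rate_bytes / 1e12:.0f} TB/s")    # ~100 TB/s
```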
Data processing
Riverlane has presented its system in a paper published on arXiv. It is designed to process syndrome data after other equipment has converted the analog signals to digital form. This allows Riverlane's hardware to operate independently of the low-temperature devices required for some types of physical qubits.
To detect errors, the paper uses an algorithm called the “Collision Clustering” decoder. As an example of its efficiency, the company implemented the decoder on a typical field-programmable gate array (FPGA) from Xilinx. Notably, it occupies only about 5% of the chip, yet is capable of handling a logical qubit built on top of almost 900 hardware qubits (in this case, simulated ones).
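The Collision Clustering decoder itself is described in the arXiv paper. As a much simpler illustration of what “decoding syndrome data” means in general, here is a lookup-table decoder for the three-qubit repetition code – a toy example, not Riverlane's algorithm.

```python
# A toy illustration of syndrome decoding, not Riverlane's Collision Clustering decoder:
# the 3-qubit repetition code protects one logical bit against a single bit flip.
# Two parity checks (syndrome bits) pinpoint which data qubit, if any, flipped.

SYNDROME_TABLE = {
    (0, 0): None,  # no error detected
    (1, 0): 0,     # parity check 0 fired alone -> qubit 0 flipped
    (1, 1): 1,     # both checks fired          -> qubit 1 flipped
    (0, 1): 2,     # parity check 1 fired alone -> qubit 2 flipped
}

def measure_syndrome(data: list[int]) -> tuple[int, int]:
    """Parity of qubits (0,1) and (1,2) -- the classical analogue of a syndrome round."""
    return (data[0] ^ data[1], data[1] ^ data[2])

def decode_and_correct(data: list[int]) -> list[int]:
    flipped = SYNDROME_TABLE[measure_syndrome(data)]
    if flipped is not None:
        data[flipped] ^= 1  # apply the correction
    return data

print(decode_and_correct([0, 1, 0]))  # [0, 0, 0]: the single flip on qubit 1 is repaired
```

A real decoder for a surface-code logical qubit faces the same task, but over hundreds of syndrome bits per round and under a hard microsecond-scale deadline, which is what makes dedicated hardware attractive.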
In addition, the company demonstrated its own chip that handles an even larger logical qubit while occupying only a tiny fraction of a square millimeter and consuming just 8 milliwatts of power.
Both versions of the system are highly specialized: they simply pass error information along for processing by other parts of the system. At the same time, the approach shows significant flexibility, working with different error correction codes. Critically, it integrates with systems designed to control qubits based on a variety of physical principles, including cold atoms, trapped ions, and transmons.
“It seemed like a real puzzle at first,” says Brierley. “We have all these different types of physics; how do we handle them all?” As it turned out, this did not become a serious problem. “One of our engineers was working in Oxford on superconducting qubits and spent the afternoon working on a trapped-ion system. He returned to Cambridge in an ecstatic mood and said: ‘They use the same control electronics.’” It turned out that regardless of the physics driving the qubits, everyone was borrowing the same hardware from other fields (according to Brierley, an RF system-on-chip from Xilinx designed for prototyping 5G base stations). This makes it easier for Riverlane's hardware to integrate with a variety of systems and opens up new horizons for quantum computing.
What's next?
Recently, Riverlane presented an ambitious plan to accelerate the development of its quantum chips. As Brierley told Ars, “We now have a single quantum error correction chip that supports a single logical qubit on up to a thousand physical qubits. The next generation will support 10,000 physical qubits. This is a major challenge – there is a lot of engineering work to be done. This will be the first step towards creating error-corrected quantum computers.” According to Brierley, the company plans to scale up by a factor of ten every 12 to 18 months.
The arXiv paper also mentions that the algorithm currently stores the entire data stream, but will eventually need to be adapted to “forget” outdated information and operate within a narrower time window. The system is designed so that individual functional units can be combined on a single chip (Brierley calls them “chiplets”), and, as complexity grows, multiple chips can be combined. He noted that the algorithm can run in parallel on the same data stream, provided there is some temporal overlap between the signals that different chiplets process.
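A minimal sketch of that “forgetting” idea is shown below: a bounded window over the syndrome stream, with all names, sizes, and the placeholder processing step chosen for illustration rather than taken from the paper.

```python
from collections import deque

# Minimal sketch of windowed syndrome processing (illustrative only; this is not
# Riverlane's implementation). Old rounds fall out of the buffer instead of being
# stored forever, so memory stays bounded as the stream runs.

WINDOW_ROUNDS = 1000  # assumed window length, in syndrome rounds

class SlidingSyndromeWindow:
    def __init__(self, window_rounds: int = WINDOW_ROUNDS):
        self.rounds: deque[bytes] = deque(maxlen=window_rounds)

    def push(self, syndrome_round: bytes) -> None:
        """Add the newest round; the oldest one is forgotten automatically."""
        self.rounds.append(syndrome_round)

    def process_window(self) -> int:
        """Placeholder for decoding: here it just counts fired syndrome bits in the window."""
        return sum(bin(b).count("1") for rnd in self.rounds for b in rnd)

window = SlidingSyndromeWindow()
window.push(b"\x01\x00")  # one example round of packed syndrome bits
print(window.process_window())  # 1
```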
Riverlane's interest in this area stems from the fact that every participant in the quantum computing market will have to solve similar problems to make progress on qubit error correction. As Brierley acknowledged, there is nothing stopping them from building their own solutions. However, he shared a personal motivation for why solving this problem matters to him:
“I was speaking at a conference about a new [quantum] algorithm I had developed and was very proud of it. Then the audience was asked who thought a useful quantum computer would appear in five, ten, or fifteen years. About a third of the participants voted that it would never happen. I was a little shocked and thought, ‘I've just invented an algorithm for a computer that will never exist.’”
What has changed?
Now Brierley is much more optimistic. “I think within the next 12 months we will see the first durable logical qubit, and in two to three years we will quickly get to hundreds of logical qubits,” he said. For a technology that some critics insist is forever “over the horizon,” that is a very short timeline indeed.