Shining a Light on QEC Controller Requirements: The Crucial Role of Fast Classical Processing for Quantum Computation with Quantum Error Correction
In the ever-evolving realm of quantum computing, the quest for fault-tolerant quantum computation has been one of the most challenging yet promising endeavors. At the heart of this pursuit lies the intricate field of Quantum Error Correction (QEC), a frontier where quantum gates and measurements dance in a delicate balance to stabilize physical qubits so that the quantum wavefunction of a single logical qubit is encoded within a set of them. The promise of QEC is that, given enough quantum resources and a low enough physical error probability, logical errors can be suppressed exponentially. While the spotlight has often been on the development and evaluation of QEC codes and decoding algorithms, a critical aspect has remained in the shadows – the classical control system running QEC codes.
In QEC, the quantum control system assumes a pivotal role. Beyond executing the specific physical gate sequence with the highest fidelity, the success of QEC computation depends on real-time classical computations that convert quantum measurements into estimated logical Pauli frame updates or logical measurement results. However, despite the control system’s pivotal role, its specifications and requirements have, until now, been largely overlooked. That is why our research scientists at Quantum Machines, together with our partners at NVIDIA, decided to put these control requirements and benchmarks for QEC on paper in their recent publication [1]. In this blog post, we take a first look at the new findings, emphasize the importance of integrating low-latency feed-forward quantum operations, and show how the newly defined near-term benchmarks aim to address classical bottlenecks in QEC computation.
Latency Requirements for Successful Fault-Tolerant Quantum Computation with QEC Feed-Forward
Achieving practical quantum computation with Quantum Error Correction (QEC), and specifically executing fault-tolerant non-Clifford gates, mandates that the control unit execute quantum gates contingent upon the decoding output. Furthermore, to avoid ever-growing logical gate clock cycles, it is imperative for the control system to establish a tightly closed loop, characterized by its latency, spanning from the physical quantum measurement through the classical decoding process to a conditional feed-forward quantum operation. Until this paper, the only stated condition for the success of a control system was the ability to process the classical data faster than it is created [2].
Employing an innovative dynamical-system analysis, QM researchers illustrated how the latency performance of the QEC control system dictates the operational scenario. These scenarios are: latency divergence, rendering quantum calculations unfeasible; classical-operation-limited runtime, where the control system determines the logical clock; and quantum-operation-limited runtime, where classical operations do not impede the quantum circuit.
As shown in the diagram above, the decoding utilization is a critical parameter whose value can lead to distinct operational scenarios. Depending on the processing performance, one may encounter a quantum-operation-limited regime (blue), where the logical clock is determined by the quantum operations; a classical-operation-limited regime (grey), where the logical clock is determined by the classical processing; or a latency-divergence regime (red), where the decoder’s throughput is smaller than the syndrome arrival rate, preventing the realization of any meaningful quantum computation.
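To make these regimes concrete, here is a minimal Python sketch, not taken from the paper, that classifies the operating point from two timescales: the time to generate one round of syndrome data and the time the decoder needs to process it. The simple latency comparison used to separate the last two regimes is our own illustrative simplification.

```python
# Illustrative sketch (not from the paper): classify the QEC control regime
# from the decoding utilization and the closed-loop feed-forward latency.

def classify_regime(syndrome_round_time: float,
                    decode_time_per_round: float,
                    feedforward_latency: float,
                    quantum_ops_time: float) -> str:
    """All times in the same units (e.g., microseconds).

    decoding utilization = decoding time per round / syndrome round time.
    If it exceeds 1, the decoder falls ever further behind the data.
    """
    utilization = decode_time_per_round / syndrome_round_time
    if utilization > 1.0:
        # Decoder throughput below the syndrome arrival rate: the backlog,
        # and hence the feed-forward latency, grows without bound.
        return "latency divergence"
    if feedforward_latency > quantum_ops_time:
        # Classical processing sets the logical clock cycle.
        return "classical-operation limited"
    # Quantum operations set the logical clock cycle.
    return "quantum-operation limited"


# Purely illustrative numbers: 1 us syndrome rounds, 0.8 us of decoding per
# round, 10 us feed-forward latency, 5 us of quantum operations per step.
print(classify_regime(1.0, 0.8, 10.0, 5.0))  # -> classical-operation limited
```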
Quantifying Quantum Error Correction Control Processor Performance: Latency Benchmarking
Deviating from the conventional practice of exclusively evaluating decoder performance, Kurman et al. defined for the first time benchmarks that evaluate the overall performance of the Quantum Error Correction Control Processor (QECCP). Acknowledging the critical impact of classical control and computation on fault-tolerant QEC computations, QM scientists have introduced benchmarks based on the feed-forward latency. Specifically, the benchmarks measure the time elapsed from a series of physical measurements implementing a logical measurement to the execution of a conditional set of decoding-dependent feed-forward operations, in simple logical circuits that stand as the building blocks of fault-tolerant quantum computation.
These new benchmarks go beyond our previous work on pulse-level benchmarks [3], encapsulating three crucial elements (illustrated in the sketch after the list):
- The feed-forward latency, spanning from the last RF measurement signal entering the controller to the first conditional RF output signal leaving it, which depends on complex classical processing in the form of QEC decoding.
- The capability to concurrently execute quantum operations during the decoding process.
- The decoding throughput, extracted through the feed-forward latency of syndromes that were generated during previous decoding tasks.
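As a rough illustration of how these three elements could be scored from experimental records, here is a short Python sketch; the record format, field names, and numbers are placeholders of our own, not definitions from the paper.

```python
from statistics import mean

def feedforward_latencies(events):
    """Per-shot feed-forward latency.

    Each event is a pair (t_last_measurement_in, t_first_conditional_out):
    the arrival time of the last RF measurement sample at the controller and
    the start time of the first decoding-conditioned output pulse (seconds).
    """
    return [t_out - t_in for t_in, t_out in events]

def decoding_throughput(rounds_decoded, wall_time):
    """Sustained decoding throughput, in syndrome rounds per second."""
    return rounds_decoded / wall_time

# Placeholder numbers for illustration only.
events = [(0.0, 5.2e-6), (1.0e-3, 1.0e-3 + 4.8e-6)]
print(f"mean feed-forward latency: {mean(feedforward_latencies(events)) * 1e6:.1f} us")
print(f"decoding throughput: {decoding_throughput(10_000, 0.02):,.0f} rounds/s")
```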
The authors note that the specific experiments defined in the benchmarks can be expanded in various ways to examine the different required aspects of the classical hardware. These criteria define the ability of a QECCP to support state-of-the-art QEC computation, and set the stage for scaling quantum hardware.
As the dimensions and applications of quantum computers expand, the role of the control system becomes crucial in optimizing QECCP performance. In the realm of benchmarking, our focus on quantum feedback takes center stage, proving pivotal for a diverse array of quantum algorithms. Quantum Machines’ OPX+ controller, with its specialized architecture and QUA programming language, facilitates the seamless integration of comprehensive feedback, enhancing the QECCP’s performance. The Pulse Processing Unit (PPU) within OPX+ dynamically responds to measurements, executes calculations, and intricately coordinates control flow and qubit drives based on measured data.
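For a flavor of what measurement-conditioned control flow looks like in QUA, here is a minimal sketch of a single feed-forward step. The element names ("resonator", "qubit"), operation names ("readout", "pi"), integration weight ("cos"), and the threshold are placeholders that depend entirely on one's own configuration, and in a real QEC experiment the simple threshold below would be replaced by a decoding-dependent decision.

```python
from qm.qua import *

with program() as feed_forward_step:
    I = declare(fixed)       # demodulated readout quadrature
    excited = declare(bool)  # estimated qubit state

    # Measure and demodulate the readout signal in real time
    measure("readout", "resonator", None, demod.full("cos", I))

    # Real-time classical processing on the PPU; a plain threshold stands in
    # here for what would be a decoding-dependent decision in QEC
    assign(excited, I > 0.0)

    # Conditional feed-forward: apply a pi pulse only if the qubit was excited
    with if_(excited):
        play("pi", "qubit")
```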
In essence, this strategic amalgamation of advanced control capabilities aligns seamlessly with the pursuit of quantifying Quantum Error Correction Control Processor Performance through latency benchmarking, underscoring the critical interplay between precise control and efficient quantum processing.
Learn more about quantum feedback and feed-forward with OPX on our dedicated page.
All the above-mentioned features and capabilities are now brought to scale, with Quantum Machines’ brand-new OPX1000, the best-performing, agile, and scaled-up quantum controller on the market.
[1] Kurman, Y., Ella, L., Szmuk, R., Wertheim, O., Dorschner, B., Stanwyck, S., & Cohen, Y. (2023). Control Requirements and Benchmarks for Quantum Error Correction.
[2] Terhal, B. M. (2015). Quantum Error Correction for Quantum Memories. Rev. Mod. Phys. 87, 307.
[3] Ella, L., Leandro, L., Wertheim, O., Romach, Y., Szmuk, R., Knol, Y., Ofek, N., Sivan, I., & Cohen, Y. (2023). Quantum-classical processing and benchmarking at the pulse-level.