How Will Quantum Computers Evolve?
Quantum computers are advancing faster than ever, but what are the challenges we still need to overcome?
The new White House budget calls for annual spending on quantum computing (QC) to increase to $860 million within two years. The announcement came just a few days after India declared that it would invest more than a billion dollars in QC, following similar investments by Russia, the EU, and many tech giants. The massive investments and interest in QC are well deserved. Quantum computers hold great promise for unprecedented computational power, well beyond the reach of current, non-quantum technology. Quantum systems could help solve currently intractable challenges in essential fields, including materials science, computational chemistry, optimization, AI, and cryptography. Moreover, with the continuous and rapid development of quantum technology over the last couple of decades, the once-distant dream of harnessing this power seems closer now than ever. Still, despite significant progress, building quantum computers powerful enough to solve real-world problems remains one of the most challenging tasks on today’s technology frontier.
What makes it so difficult to build a reliable and scalable quantum computer? What are the key challenges we need to overcome to make it a reality? I’ll try to tackle these questions in this post. But first, let’s establish how quantum computers differ from classical computers, and then examine how these differences simultaneously give rise to both the extra power of quantum computers and the challenges ahead.
Today’s Computers: let’s flip some coins
To understand the power of quantum computers, we must first understand regular, non-quantum computers, also commonly called classical computers. A classical computer stores information in bits. Each bit in the computer is like a coin sitting on a table – it is in one of two binary states: heads or tails, more commonly known as “0” and “1”. Since there are many bits in the computer, it can store A LOT of information. For example, with only 3 bits, we can create an alphabet with 8 letters: ‘a’=000 (all bits are zero, or all coins are heads), ‘b’=001, ‘c’=010, ‘d’=011, etc. With 300 bits acting as letters, we can write a 100-character text. And if you have a computer with 1 gigabyte of memory (about 8 billion bits), you could write a library of thousands of books.
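To make the encoding concrete, here is a minimal Python sketch of the toy scheme above (the `alphabet` dictionary and the `encode` helper are illustrative names invented for this example, not part of any real system):

```python
# Map each of the 8 possible 3-bit patterns (000 through 111)
# to one of 8 letters, 'a' through 'h'.
alphabet = {format(i, "03b"): chr(ord("a") + i) for i in range(8)}
# {'000': 'a', '001': 'b', '010': 'c', '011': 'd', ...}

def encode(text: str) -> str:
    """Encode a string over the 8-letter alphabet as a bit string."""
    to_bits = {letter: bits for bits, letter in alphabet.items()}
    return "".join(to_bits[ch] for ch in text)

print(encode("bad"))  # '001000011': 3 bits per letter, 9 bits in total
```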
A computer is much more than just a big data storage device, however. What makes computers unique is their ability to process this data. The processing is executed within the heart of the computer – the processor. If we get back to our coin analogy for a moment, you can think of the processor as a super-fast coin-flipping machine that flips bits according to particular programs that we feed it. For example, as you’re reading this blog post on your browser, a program runs on your computer: the processor receives bits of information sent from your mouse and flips the bits that encode the images and text on your screen accordingly. All programs are eventually made of a sequence of very small and basic coin-flipping operations, called gates. We measure the computer’s processing power by the number of such gate operations it can perform in one second. A single-core 3GHz processor, for instance, can perform billions of gates every second.
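As a toy illustration (the function names here are made up for the example), this sketch builds a NAND gate out of two elementary bit operations; any classical circuit can, in principle, be assembled from NAND gates alone:

```python
def not_gate(bit: int) -> int:
    """The simplest 'coin flip': turns 0 into 1 and 1 into 0."""
    return 1 - bit

def and_gate(a: int, b: int) -> int:
    """Outputs 1 only when both input bits are 1."""
    return a & b

# A tiny "program": NOT(a AND b), i.e., a NAND gate.
for a in (0, 1):
    for b in (0, 1):
        print(f"NAND({a}, {b}) = {not_gate(and_gate(a, b))}")
```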
But there is one thing that can destroy the computer’s operation – errors. Imagine a sudden wind gust blowing across our table full of coins. The wind causes one of the coins to flip. During the runtime of a computer program, a flipped bit can cause the program to produce wrong results, get stuck, or even lead to the dreaded blue screen of death. Fortunately, this does not happen very often. John Martinis, head of Google’s Quantum Computing development, gave a great analogy that explains why errors in classical computers are rare and why those in quantum computers are frequent, as we will see below. In this analogy, bits in a classical computer are similar to heavy coins – it is tough for random environmental changes (like the wind, or electrical noise in the computer’s power supply) to flip them over. The average bit in a modern computer would work for about a billion years before it accidentally flipped.
Quantum Computers: not lost in space
Quantum computers replace the classical bits and gates with quantum bits and quantum gates. While a classical bit can be either ‘0’ or ‘1’ at any given point in time, a quantum bit, also called a qubit, can be in BOTH ‘0’ and ‘1’ states simultaneously. In fact, it can be in any combination of ‘0’ and ‘1’. This phenomenon is called superposition.
In a way, instead of a coin sitting on a table, we are now dealing with a coin floating in outer space where there is no gravity to fix it to the table. It does not have to be heads or tails anymore, but instead can be exactly on its side, tilted at a 45-degree angle, etc. But wait, it gets even crazier. If we have 3 qubits and we use our 8-letter alphabet again (‘a’=000, ‘b’=001, etc.), then these 3 qubits can take the form of ALL 8 letters at the same time! The more qubits we add, the more powerful this parallelism is – it grows exponentially with the number of qubits in the quantum computer. Essentially, adding just one more qubit to an existing system doubles its computational power.
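If you would like to see that exponential growth in code, here is a small sketch (using NumPy purely as an illustration) that represents the joint state of n qubits as a vector of 2**n complex amplitudes, one amplitude per classical bit pattern:

```python
import numpy as np

n = 3
# An equal superposition over all 2**n bit patterns: every one of the
# 8 "letters" of our toy alphabet is present at once.
superposition = np.ones(2**n, dtype=complex) / np.sqrt(2**n)

for i, amp in enumerate(superposition):
    print(f"|{i:03b}>  amplitude {amp.real:.4f}")

# Each extra qubit doubles the number of amplitudes we must track:
for k in (3, 10, 50):
    print(f"{k} qubits -> {2**k:,} amplitudes")
```

A classical simulation has to store every one of those amplitudes explicitly, which is exactly why simulating even around 50 qubits strains the largest supercomputers.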
Now, just like with classical computers, the story does not end with storing data and information. Quantum computers use quantum gates to process the quantum information stored in their qubits. A quantum program, or quantum algorithm, is a sequence of quantum gates designed to solve a certain problem. And due to the massive parallelism of the quantum information stored in qubits, quantum computers can, in some situations, perform “exponential multi-tasking” and quickly solve difficult problems that even today’s strongest supercomputers could not solve in any reasonable amount of time.
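To make “quantum gate” less abstract, here is a minimal sketch of the underlying math: a gate is a unitary matrix, and applying it means multiplying the state vector by that matrix. The example applies the Hadamard gate, a standard single-qubit gate, to a qubit starting in the ‘0’ state:

```python
import numpy as np

# The Hadamard gate: a 2x2 unitary matrix.
H = np.array([[1,  1],
              [1, -1]]) / np.sqrt(2)

ket0 = np.array([1.0, 0.0])  # the qubit state |0>
state = H @ ket0             # applying the gate = matrix multiplication

print(state)                 # [0.7071 0.7071]: an equal superposition
print(np.abs(state) ** 2)    # measurement probabilities: [0.5 0.5]
```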
For example, most internet security today relies on large prime numbers. The underlying assumption is that our computers, no matter how powerful, cannot find the prime factors of these numbers quickly enough to pose a security risk. It would take the biggest computer in the world a billion years to decrypt a single message, let alone break advanced security protocols. But in 1994, Peter Shor developed a quantum algorithm that can find the prime factors of a very large number in a very short amount of time. Shor’s algorithm demonstrated just how disruptive quantum computers could be once they become operational, essentially proving that a large-scale quantum computer could break today’s security protocols in minutes or hours.
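To give a flavor of where the quantum speedup enters, here is a purely classical toy sketch of the number-theoretic reduction at the heart of Shor’s algorithm (the function names are illustrative; the brute-force period search below is precisely the step a quantum computer performs exponentially faster, so this toy version is only feasible for tiny numbers):

```python
from math import gcd
from random import randrange

def find_period(a: int, N: int) -> int:
    """Smallest r > 0 with a**r % N == 1, found by brute force.
    This is the step Shor's algorithm performs quantumly."""
    r, value = 1, a % N
    while value != 1:
        r += 1
        value = (value * a) % N
    return r

def shor_classical(N: int) -> int:
    """Find a nontrivial factor of N via period finding."""
    while True:
        a = randrange(2, N)
        if gcd(a, N) > 1:
            return gcd(a, N)          # lucky guess already shares a factor
        r = find_period(a, N)
        if r % 2 == 0 and pow(a, r // 2, N) != N - 1:
            return gcd(pow(a, r // 2) - 1, N)

print(shor_classical(15))  # prints 3 or 5
```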
Although able to perform functions that traditional computers cannot, quantum computers are not without their challenges. Going back to Martinis’ analogy: while classical bits are akin to heavy coins held to the table by gravity, qubits are coins floating in outer space that can be affected by almost anything. Any small disturbance can rotate the qubit, and even a one-degree rotation can cause errors in the computation. And while there are many challenges in scaling up quantum computers, the high error rate, which stems from this sensitivity of qubits to their environment, is by far the greatest.
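To put a number on that one-degree rotation: a qubit prepared in the ‘0’ state and rotated by an angle theta is misread as ‘1’ with probability sin²(theta/2). A small sketch:

```python
import numpy as np

for degrees in (1, 5, 45, 180):
    theta = np.radians(degrees)
    error_prob = np.sin(theta / 2) ** 2
    print(f"{degrees:3d} degree rotation -> error probability {error_prob:.1e}")
```

Even the one-degree case gives an error probability of roughly 8 in 100,000 per operation, astronomically high next to a classical bit that flips once in a billion years.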
Fighting Errors: the way towards quantum evolution
To move quantum computers forward to the point where they start living up to our high expectations, we must overcome their propensity for errors. Here are four important strategies that can be used to accomplish this.
Better Qubits: We must design better qubits that are less sensitive to noise. Everything affects the qubits’ sensitivity: their structure, the materials they are made from and the purity of those materials, and the fabrication processes used to produce them, so clever design can really help. For instance, in superconducting qubits (one of the most promising types of qubits), there has been a continuous and very dramatic improvement in qubit sensitivity over the last couple of decades. Qubits must continue to improve in order to allow for efficient scaling up of quantum computers.
Better Environments: Another strategy that has proved very effective, besides improving the qubits themselves, is to focus on the environment in which the qubits sit. Designing a better environment that isolates the qubits from the noise that disturbs them will allow the dramatic improvement of quantum computers to continue.
Better Gates: Currently, the very act of performing gates on the qubits introduces a lot of noise and causes further errors. This can be improved significantly using clever control methods such as optimal and robust control, an active topic being developed by some of the leading commercial players today.
Quantum Error Correction: The ultimate way of fighting errors is to perform Quantum Error Correction (QEC). The development of QEC in the late ’90s gave the quantum computing community hope of one day actually scaling up quantum computers. In quantum error correction, one takes a group of physical qubits and operates them together so that they behave as a single, more reliable qubit, known as a logical qubit. There are then protocols for detecting and correcting errors in this logical qubit during a quantum program’s runtime.
Quantum Error Correction is considered the ultimate way of dealing with errors. Once the base error rate of the physical qubits that make up the logical qubit is low enough, using enough physical qubits enables a significant reduction of the logical qubit’s error rate. The current error rate of physical qubits is already low enough to start doing QEC. However, for QEC to be effective, one would need thousands of physical qubits for each logical qubit, which is impractical due to the large overhead and other limiting bottlenecks that exist today. Thus, it’s essential that the other error-fighting strategies continue to reduce physical qubits’ error rates over the next several years. This way, the overhead of QEC can be reduced to the point where error correction, and thus useful quantum computing, can become a reality.
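As a rough illustration of the principle, here is a sketch using the classical 3-bit repetition code as a stand-in for a real quantum code (real QEC codes are considerably more involved, since they must also handle phase errors and must detect errors without directly measuring the data): encode one logical bit into three physical bits and decode by majority vote, which fails only when at least two of the three bits flip:

```python
def logical_error_rate(p: float) -> float:
    """Majority vote fails when 2 or 3 of the 3 bits flip."""
    return 3 * p**2 * (1 - p) + p**3

for p in (0.1, 0.01, 0.001):
    print(f"physical error rate {p:<6} -> "
          f"logical error rate {logical_error_rate(p):.1e}")
```

For small p, the logical error rate scales like 3p², so every improvement in the physical error rate pays off quadratically, which is why driving down physical error rates also shrinks the overhead QEC requires.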
This post was originally published on HackerNoon.