Decoherence: Quantum Computing’s Greatest Obstacle
Okay, so I explained the basic theory behind how quantum computers work in my previous article (A Brief Introduction to Quantum Computing), but how the heck do we get tiny atoms to solve math problems?
To understand the solution, we first need to understand the problem.
Decoherence: Quantum Computing’s Greatest Challenge
Superposition for subatomic particles is like balancing a coin on its edge: any small movement, vibration, or even sound can knock the coin out of its neutral state and make it collapse to heads or tails (0 or 1).
Decoherence theory describes how a quantum system reverts to classical behaviour through interactions with its environment, which gradually decay and destroy the quantum behaviour of its particles.
Because of decoherence, qubits are extremely fragile: their ability to stay in superposition or remain entangled is easily destroyed. Radiation, light, sound, vibrations, heat, magnetic fields, and even the act of measuring a qubit are all sources of decoherence.
Which basically means that if we don’t take precautions to minimize decoherence, there is no quantum system, aka no quantum computer.
Decoherence leads to errors in quantum computational systems where information is lost.
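What “losing quantum behaviour” means can be sketched numerically. Below is a toy pure-dephasing model (the exponential decay and the T2 constant are illustrative assumptions, not a real device model): the off-diagonal entries of the qubit’s density matrix, which encode superposition, shrink toward zero while the measurement probabilities on the diagonal survive.

```python
import numpy as np

# A qubit in equal superposition, written as a 2x2 density matrix.
plus = np.array([1.0, 1.0]) / np.sqrt(2)
rho = np.outer(plus, plus)

# Toy dephasing channel: environmental interaction shrinks the
# off-diagonal "coherences" by exp(-t / T2); diagonals are untouched.
def dephase(rho, t, T2=1.0):
    out = rho.copy()
    decay = np.exp(-t / T2)
    out[0, 1] *= decay
    out[1, 0] *= decay
    return out

# After several coherence times the state is effectively a
# classical 50/50 mixture: the superposition has decohered away.
almost_classical = dephase(rho, t=5.0)
```

Once the coherences are gone, the qubit behaves like a classical coin that is simply heads or tails with equal probability — that is the error decoherence introduces.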
Entanglement gives qubits more computational power because, theoretically, each extra qubit added to a system doubles the number of states the system can represent at once. This is why roughly 300 perfectly entangled qubits in superposition could, in principle, represent more basis states than there are atoms in the observable universe.
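That doubling claim is just exponential growth in the number of amplitudes needed to describe the system; a quick check:

```python
# Each added qubit doubles the number of basis states (amplitudes)
# needed to describe the system: n qubits -> 2**n states.
for n in (1, 2, 10, 300):
    print(f"{n} qubits -> 2**{n} basis states")

# 2**300 is roughly 2 x 10**90, comfortably more than the
# ~10**80 atoms estimated in the observable universe.
print(2 ** 300 > 10 ** 80)
```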
However, as a system of qubits grows, it starts to entangle with its environment, and decoherence disrupts the superposition state and the quantum information processing. This creates a dilemma, because the system must stay loosely connected to the outside world for us to read the processed data. There is a limit to how long qubits can retain their quantum properties before errors corrupt the computation. This limit is called the coherence time (sometimes loosely called coherence length), and it is a great way to measure the strength and stability of a qubit.
Coherence time is how long a qubit retains its quantum properties. The world’s longest-lasting qubit holds the record of 39 minutes in a superposition state; that may seem short, but in that time more than 200 million operations could be performed. The goal is to build a qubit system with a coherence time long enough to compute useful mathematical problems.
“The greatest hurdle in using quantum objects for computing is to preserve their delicate superpositions long enough to allow us to perform useful calculations,” ~ Andrea Morello
To increase coherence time and build fault-tolerant quantum computers, we must introduce the idea of Quantum Error Correction (QEC).
QEC is loosely built on conventional error correction, where information is copied and encoded into multiple bits. This guards against errors that flip a bit from 0 to 1 (or back): corrupted information can still be recovered by a majority vote among the copies, which realigns the faulty bit. However, this approach cannot be applied directly to qubits, because the no-cloning theorem forbids copying quantum information. Peter Shor’s quantum error-correcting code overcomes this: it uses entanglement to spread the information across several qubits, so errors can be detected and corrected while keeping the protected state intact. For a deeper treatment, the paper Quantum Error Correction for Beginners explains the basics of QEC.
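The classical majority-vote idea is a few lines of code. To be clear, the sketch below is the classical repetition code only — the no-cloning theorem rules out copying qubits like this, and Shor’s code achieves the quantum analogue through entanglement instead:

```python
# Classical 3-bit repetition code: encode one bit as three copies.
def encode(bit):
    return [bit, bit, bit]

# Majority vote recovers the original bit as long as
# at most one of the three copies was flipped.
def majority(bits):
    return 1 if sum(bits) >= 2 else 0

codeword = encode(0)
codeword[1] ^= 1                 # a single bit-flip error strikes
recovered = majority(codeword)   # vote of the copies realigns it
```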
Now we know QEC can prolong coherence time by correcting the errors caused by decoherence.
Another very important mechanism for computing with qubits is quantum gates.
Quantum gates make up quantum circuits; they perform operations on qubits to steer them toward a desired outcome. A huge difference between quantum gates and classical gates is that quantum gates are reversible. This means that as qubits travel through gates and are modified, their previous information can always be recovered (no information is lost).
Information loss is a huge problem in today’s computers, and you can witness it every time your computer heats up (erasing information releases heat). Quantum gates have to be reversible because losing prior information would affect entangled qubits: with information loss, qubits become unentangled.
Quantum gates are made reversible by using unitary operations. A unitary operation always has an exact inverse, so no information about the input is destroyed: if a qubit’s original state is needed again, applying the inverse of the gate recovers it exactly. Reversible gates therefore keep entangled qubits intact and also help with energy efficiency.
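Unitarity and reversibility are easy to see with 2×2 matrices. A minimal NumPy sketch, using the Hadamard gate as the example:

```python
import numpy as np

H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)  # Hadamard gate

# A unitary matrix satisfies U†U = I, so every quantum gate
# has an exact inverse and no input information is destroyed.
assert np.allclose(H.conj().T @ H, np.eye(2))

state = np.array([1.0, 0.0])        # qubit starts in |0>
after = H @ state                   # equal superposition of |0> and |1>
recovered = H.conj().T @ after      # apply the inverse gate
assert np.allclose(recovered, state)  # original state retrieved
```

Contrast this with a classical AND gate: from the output 0 alone you cannot tell which of three possible inputs produced it, so information (and, physically, heat) is lost.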
- Decoherence is the set of interactions a qubit has with its environment; these disturbances collapse superposition
- Decoherence leads to errors in quantum information, yet a qubit must interact loosely with its environment for us to read the processed data
- This introduces coherence time, the amount of time a qubit stays in superposition and can do computations
- Quantum error correction detects and corrects corrupted quantum information, prolonging coherence time and guarding against other computational faults
- Quantum gates are like classical gates in circuits, except they are reversible, so qubits can retain their original states (and stay entangled)
Types of Quantum Computers:
There are three types of quantum computers with different levels of computational power: quantum annealers, analog quantum simulators, and universal (gate-based) quantum computers.
How is the D-Wave Quantum Computer Built?
D-Wave’s quantum computer was the first commercially available quantum computer in the world, and the company is a leader in the quantum computing market. Their computer is classified as a quantum annealer. Let’s discuss how it is built and how it can operate while keeping decoherence at bay.
Quantum annealing is a method for solving problems by finding the global minimum (the lowest point in a valley) of a plotted function. All candidate solutions are mapped onto an energy landscape, and the quantum annealing algorithm searches for the optimal solution at the lowest point/energy. Imagine water poured onto a hilly landscape: it flows downward and pools in the lowest dip. That is essentially what qubits do under quantum annealing, and the lowest dip is the best possible solution.
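Real quantum annealers exploit quantum tunneling, but the “water flowing downhill” picture can be sketched with its classical cousin, simulated annealing. Everything below — the toy energy function, step size, and cooling schedule — is an illustrative assumption, not D-Wave’s actual algorithm:

```python
import math
import random

# Toy energy landscape with a local minimum near x ~ 1.35
# and the global minimum near x ~ -1.47 (chosen for illustration).
def energy(x):
    return x ** 4 - 4 * x ** 2 + x

# Classical simulated annealing: downhill moves are always accepted,
# uphill moves sometimes, which lets the search escape local minima.
def anneal(f, x0, steps=20000, t0=2.0, seed=42):
    rng = random.Random(seed)
    x = x0
    best_x, best_e = x, f(x)
    for i in range(steps):
        t = t0 * (1 - i / steps) + 1e-3     # cooling schedule
        candidate = x + rng.gauss(0, 0.5)   # small random move
        delta = f(candidate) - f(x)
        if delta < 0 or rng.random() < math.exp(-delta / t):
            x = candidate
        if f(x) < best_e:
            best_x, best_e = x, f(x)
    return best_x, best_e

# Start near the *local* minimum; annealing still finds the global one.
best_x, best_e = anneal(energy, x0=2.0)
```

A plain downhill search started at x = 2 would get stuck in the shallow valley on the right; the “thermal” uphill moves are what carry it over the barrier into the deepest dip.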
The computer must be built in a way that minimizes decoherence, so the quantum processor is heavily isolated from its environment. A massive refrigeration system cools it to a temperature extremely close to absolute zero (0 kelvin / −273.15 °C), colder than outer space, and the processor is shielded from outside disturbances such as magnetic fields and vibrations. The refrigerator circulates liquid helium, which is recycled to keep the temperature constant.
To learn more about the quantum processor, fault-tolerant material, and its software refer to this document The D-Wave Quantum Computer.
- Decoherence is one of the biggest obstacles in quantum computing
- Quantum error correction extends coherence time by counteracting the errors decoherence causes
- Quantum gates need to be reversible so that qubits can stay entangled
- D-Wave built the first commercial quantum computer; it uses quantum annealing algorithms and an extreme refrigeration system to compute problems while keeping its qubits intact
So now we have learned the obstacles to building quantum computers and the processes for overcoming them. Every day there are new research breakthroughs, but we are still in the early stages of quantum computer development, and it may take up to 10 years before we see a universal quantum computer.
I have tried to simplify all of these concepts. Please let me know if any of my information is incorrect, and I’d love suggestions on what areas of quantum computing I should explore next.
Thank you for reading and I hope you have a better understanding of quantum computers!