# Quantum Computing

Quantum computers are next-generation computers that take a novel approach to processing information. They're built on the principles of quantum mechanics. They can solve some difficult problems that are practically impossible on conventional computers. For such problems, certain quantum properties make quantum computers a lot faster.

Conventional computing, nowadays called classical computing, uses transistors to process information. Quantum computing works at the level of atoms and subatomic particles. In fact, there's a limit to how much we can miniaturize transistors since beyond a certain point quantum effects come into play. This is where quantum computing becomes relevant.

Quantum computing is a subfield of quantum information science. Many aspects of quantum computing are yet to be understood. Building such computers and making them practical is still a challenge. As such, we expect classical computers to serve their purpose for many more decades.

## Discussion

• What are some possible applications of quantum computers?

Cryptography is one of the applications. Today's digital systems are secure because classical computers can't do integer factorization within a reasonable time. A quantum computer can do this using Shor's algorithm, which would break much of today's public-key cryptography. Conversely, quantum techniques applied to cryptography, such as quantum key distribution, would lead to more secure systems.
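As an illustrative sketch, the number-theoretic part of Shor's algorithm can be shown in pure Python. The quantum speedup lies entirely in finding the period of a^x mod N; here that step is brute-forced classically, so this only works for tiny numbers:

```python
from math import gcd

def classical_period(a, N):
    """Find the period r of f(x) = a^x mod N by brute force.
    This is the step a quantum computer speeds up exponentially."""
    x, r = a % N, 1
    while x != 1:
        x = (x * a) % N
        r += 1
    return r

def shor_factor(N, a):
    """Recover factors of N from the period of a^x mod N."""
    g = gcd(a, N)
    if g != 1:
        return g, N // g          # lucky guess: a shares a factor with N
    r = classical_period(a, N)
    if r % 2 == 1:
        return None               # odd period: retry with another a
    y = pow(a, r // 2, N)
    p, q = gcd(y - 1, N), gcd(y + 1, N)
    if p * q == N and p != 1 and q != 1:
        return p, q
    return None

# Example: factor 15 with base a = 7; the period of 7^x mod 15 is 4
print(shor_factor(15, 7))  # (3, 5)
```

The function and variable names here are illustrative, not from any quantum library; the point is that once the period is known, factoring reduces to two gcd computations.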

Search problems, or more generally optimisation of multivariate problems, are not efficiently solved using classical computers. A quantum computer programmed with Grover's algorithm can search an unstructured list of N items in roughly √N steps, a quadratic speedup over classical search.
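Grover's amplitude amplification can be simulated classically on a small statevector. This pure-Python sketch (illustrative only; real implementations use quantum hardware or simulator libraries) phase-flips the marked item and inverts all amplitudes about their mean for about √N iterations:

```python
from math import sqrt, pi, floor

def grover(n_qubits, marked):
    """Classically simulate Grover's search over N = 2**n_qubits items."""
    N = 2 ** n_qubits
    amp = [1 / sqrt(N)] * N                     # uniform superposition
    for _ in range(floor(pi / 4 * sqrt(N))):    # ~sqrt(N) iterations
        amp[marked] = -amp[marked]              # oracle: phase-flip marked item
        mean = sum(amp) / N
        amp = [2 * mean - a for a in amp]       # diffusion: invert about the mean
    return [a * a for a in amp]                 # measurement probabilities

probs = grover(3, marked=5)
print(max(range(8), key=lambda i: probs[i]))    # 5
```

After just two iterations over eight items, the marked item is measured with probability above 94%, while a classical search would need four queries on average.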

In domains such as chemistry and nanotechnology, there's a need to simulate quantum systems. Classical computers are inefficient with such simulations. Quantum computers are better suited for this purpose. For example, modelling complex molecular interactions at atomic level can lead to drug discovery. Quantum simulations can help us build more efficient energy systems, better superconductors and newer materials.

Machine learning involves training models with lots of data. Quantum machine learning is being explored to speed up learning.

In general, where classical computers are overwhelmed by lots of data and endless calculations, quantum computers can fill the gap.

• Which are the essential quantum properties enabling quantum computing?

Three quantum properties are being harnessed for quantum computing:

• Superposition: Given two basis states, a quantum particle can exist in both states at the same time. Alternatively, we may say that the particle exists in a combination of the two states. This combination can be programmed such that, for example, a measurement finds the particle in one state 30% of the time and in the other state 70% of the time.
• Entanglement: Two quantum particles can form a single system and influence each other. Measurements on one are correlated with measurements on the other.
• Quantum Interference: The probability amplitudes of quantum states can reinforce or cancel one another. Measuring the current state of a quantum particle leads to a collapse; that is, the measured state is one of the two states, not something in between. External interference influences the probability of the particle collapsing to one state or the other, so quantum computing systems must be protected from external interference.
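The 30%/70% behaviour described above can be mimicked with a toy simulation: amplitudes are chosen so that their squared magnitudes give the measurement probabilities (an illustrative sketch in pure Python, not a real quantum simulation):

```python
import random
from math import sqrt

# Amplitudes whose squared magnitudes give the outcome probabilities
alpha, beta = sqrt(0.3), sqrt(0.7)      # |alpha|^2 = 0.3, |beta|^2 = 0.7

def measure():
    """Measurement collapses the state to 0 or 1 probabilistically."""
    return 0 if random.random() < alpha ** 2 else 1

random.seed(1)
counts = [0, 0]
for _ in range(10_000):
    counts[measure()] += 1
print(counts)  # roughly 30% zeros, 70% ones
```

Note that each individual measurement yields only 0 or 1; the 30/70 split shows up only across many repeated preparations and measurements.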

• What are qubits and pbits?

In classical computing, the unit of information is the bit. Its value is either 0 or 1. In quantum computing, due to superposition, the classical bit is inadequate to represent a continuum of states. Therefore, the quantum bit, or qubit, has been defined as the unit of quantum information.

To say that a qubit can be both 0 and 1 simultaneously can sound mystical. A simple analogy is to consider an x-y plane and a vector at 45 degrees to the x-axis. This vector has both x and y components at the same time. Likewise, a qubit is both 0 and 1 simultaneously. We can visualize it as any point on the Bloch sphere, with the poles representing 0 and 1.
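The vector analogy can be made concrete with a few lines of Python (illustrative only): the squared components of a unit vector behave like measurement probabilities, and at 45 degrees both outcomes are equally likely.

```python
from math import cos, sin, radians

theta = radians(45)                 # vector at 45 degrees from the x-axis
x, y = cos(theta), sin(theta)       # both components present at once
p0, p1 = x ** 2, y ** 2             # squared components act as probabilities
print(round(p0, 3), round(p1, 3))   # 0.5 0.5
```

Rotating the vector towards either axis biases the probabilities towards 0 or 1, just as preparing a qubit closer to a pole of the Bloch sphere does.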

Probabilistic bit or pbit is used as a simpler alternative to the qubit. The state of a pbit is a probabilistic combination of the two logical states 0 and 1.

A process called conditioning converts pbits to classical information. Similarly, another process called measurement converts qubits to classical information.

• What makes quantum computers compute faster than classical computers?

In classical computing, an n-bit number can represent a single value in the range $$[0, 2^n - 1]$$. In quantum computing, an n-qubit system represents all the possible $$2^n$$ values at the same time. Alternatively, if we add a single bit to a classical computer, its memory increases by one bit. The same change to a quantum computer doubles its state space.

Consider a 4-bit input. To process all 16 possible input patterns, a classical computer would execute its algorithm one input at a time. A 4-qubit quantum computer would process all 16 patterns at the same time. As more qubits are added to a quantum computer, its processing power increases exponentially.

This is why a 50-qubit system can be compared to a classical computer tracking about $$2^{50}$$ (roughly a quadrillion) values. A qubit counts for much more than a classical bit. Fifty qubits is an approximate threshold at which quantum computers are able to perform calculations not feasible with classical computers.
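A quick back-of-the-envelope check of this exponential growth (the 16 bytes per amplitude below assumes double-precision complex numbers):

```python
# Number of amplitudes needed to describe an n-qubit state classically
for n in range(1, 11):
    print(n, 2 ** n)                # adding one qubit doubles the state space

# A 50-qubit state needs 2**50 complex amplitudes. At 16 bytes each,
# storing the full statevector classically takes about 18 petabytes.
print(2 ** 50 * 16 / 1e15, "petabytes")
```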

• What does it take to build quantum computers?

To build a quantum computer, qubits must be housed in an environment that achieves minimum interference and maximum coherence. We need a method to transfer signals to qubits. We need classical computers to send instructions to qubits. Thus, it's not just about computing with qubits but also about converting from/to classical information.

Various physical systems can be used to build qubits: ion traps, crystal defects, semiconductor quantum dots, superconductors, topological nanowires, neutral atoms in optical lattices, nuclear magnetic resonance (NMR) and more.

Just as transistors are combined in classical computing to implement Boolean logic using logic gates, there are various quantum computing models. The quantum circuit is one such model; it uses quantum gates. Other models include the one-way or measurement-based quantum computer (MBQC), the adiabatic quantum computer (AQC), and the topological quantum computer. The Quantum Turing Machine (QTM) is of theoretical importance. All models are computationally equivalent to one another.
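In the quantum circuit model, a gate is a unitary matrix applied to the statevector. A minimal sketch of the Hadamard gate acting on one qubit (pure Python, illustrative only; real circuits are built with simulator or hardware libraries):

```python
from math import sqrt

H = [[1 / sqrt(2),  1 / sqrt(2)],
     [1 / sqrt(2), -1 / sqrt(2)]]       # Hadamard gate matrix

def apply(gate, state):
    """Multiply a 2x2 gate into a one-qubit statevector."""
    return [sum(gate[i][j] * state[j] for j in range(2)) for i in range(2)]

zero = [1.0, 0.0]                       # qubit prepared in state |0>
plus = apply(H, zero)                   # equal superposition of |0> and |1>
print([round(a, 3) for a in plus])      # [0.707, 0.707]

# Applying H again undoes the superposition: H is its own inverse
print([round(a, 3) for a in apply(H, plus)])  # [1.0, 0.0]
```

Unlike classical logic gates, every quantum gate is reversible, which is why applying the same gate twice can return the qubit to its starting state.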

Quantum algorithms have to be developed. These are often specific to the computing model.

• What does "quantum" refer to in quantum computing?

A simple answer is that the term "quantum" comes from quantum mechanics. In physics, it refers to properties of atomic or subatomic particles, such as electrons, neutrinos, and photons.

The real question is what aspects of quantum mechanics make quantum computing better than classical computing. There's no definitive answer though some theories have been put forward.

Interference, entanglement, quantum contextuality, the logical structure of quantum mechanics, and quantum parallelism have all been proposed. All of these have their critics. We don't understand this very well. Perhaps this is why we have so few quantum algorithms.

A dominant theory is the Quantum Parallelism Thesis (QPT), which in turn suggests the Many Worlds Interpretation (MWI). Due to superposition, quantum computers compute on different input values simultaneously. In some sense, computations happen in parallel in many classical universes. Decoherence is used to distinguish one world from another. But this is at odds with the quantum circuit model and its coherent superpositions. Moreover, any measurement collapses the output state. Quantum algorithms must therefore construct 'clever' superpositions to increase the chance of obtaining a useful result.

• What are some practical difficulties of building quantum computers?

Challenges include fabrication, verification and architecture. Since quantum states are fragile, fabrication must be precise. Since any measurement collapses the state in a probabilistic manner, verification is difficult. Errors are more common than in classical computing and error correction is essential to the architecture.

Interaction with the external environment results in decoherence. At best, we're able to maintain coherence for only a few microseconds. Coherence becomes harder to achieve with more qubits. For superconducting qubits, coherence requires cooling to near absolute zero.

Scaling up to more qubits on a single chip is another challenge. Today's systems need multiple control wires for every qubit.

For various reasons that we can simply call noise, qubits change state. We therefore need efficient error correction methods. One approach is to map many physical qubits (as many as 10,000) to a single logical qubit. This makes it even more difficult to build a useful quantum machine. Some consider quantum error correction to be the most important problem needing focused research. Some even estimate that 99% of quantum computation will be spent on error correction.
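The idea of trading many noisy physical units for one reliable logical unit has a classical analogue, the repetition code. This sketch (illustrative, pure Python; real quantum codes such as the surface code are far more involved) shows majority-vote decoding suppressing errors:

```python
import random

def encode(bit):
    """Map one logical bit to three physical bits (repetition code)."""
    return [bit] * 3

def noisy(bits, p):
    """Flip each physical bit independently with probability p."""
    return [b ^ (random.random() < p) for b in bits]

def decode(bits):
    """Majority vote recovers the logical bit if at most one bit flipped."""
    return int(sum(bits) >= 2)

random.seed(0)
p, trials = 0.05, 100_000
raw_errors = sum(random.random() < p for _ in range(trials))
coded_errors = sum(decode(noisy(encode(0), p)) != 0 for _ in range(trials))
print(raw_errors / trials, coded_errors / trials)  # ~0.05 vs ~0.007
```

With a 5% physical error rate, the logical error rate drops to roughly 3p² ≈ 0.7%, at the cost of tripling the number of bits. Quantum codes face the extra constraints that qubits can't be copied (no-cloning) and can't be read without collapsing them, which is why their overheads are so much larger.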

## Milestones

1975

Poplavskii shows that it's computationally difficult to simulate quantum systems using classical computers. This view is reiterated by Feynman at a conference at MIT in 1980.

1976

Polish mathematical physicist Roman Stanisław Ingarden publishes a seminal paper titled Quantum Information Theory. It's an attempt to generalize Shannon's information theory.

1980

Paul Benioff describes the first quantum mechanical model of a computer. He uses the Schrödinger equation to describe a Quantum Turing Machine (QTM). Yuri Manin proposes the idea of quantum computing. Benioff further develops his QTM in 1982.

1982

Wootters, Zurek and Dieks state the No-Cloning Theorem. This states that it's impossible to make identical copies of an arbitrary unknown quantum state. The theorem is consistent with the uncertainty principle in quantum mechanics. The theorem implies that a qubit can't be copied. This is a problem for taking backups or error correction. However, it's a useful property for cryptography.

1985

David Deutsch at the University of Oxford describes the first universal quantum computer. Like its classical counterpart, the Universal Turing Machine (UTM), it can simulate any other quantum computer in polynomial time.

1994

Peter Shor at AT&T Bell Labs invents a quantum algorithm that can factor large integers a lot faster than any known classical algorithm. Theoretically, this can break many cryptographic systems in current use. This sparks interest in quantum computing. In time, this is called Shor's Algorithm. Shor assumes that qubits maintain their states. In practice, decoherence is a problem.

1995

Shor and Steane independently invent the first quantum error correcting codes. This circumvents the no-cloning theorem.

1996

Lov Grover at Bell Labs invents the quantum database search algorithm. This is relevant to a variety of problems. In particular, brute-force search problems obtain quadratic speedup. In time, this is called Grover's Algorithm.

1998

Researchers at the University of Oxford demonstrate a quantum algorithm on a 2-qubit NMR quantum computer. IBM researchers achieve a similar demonstration. The same year Grover's algorithm is demonstrated on an NMR quantum computer.

2015

Cambridge Quantum Computing releases t|ket>, which could be the first operating system for quantum computing. It enables classical computers to interface with quantum computers.

2016

IBM's 5-qubit quantum computer becomes available on the cloud for academics and researchers, leading to demonstrations in 2017.

Oct
2019

Google claims to have achieved quantum supremacy with a 53-qubit programmable superconducting processor. Named Sycamore, it's based on quantum logic gates. The machine completes a benchmark test in 200 seconds, a task that Google estimates would take a classical supercomputer 10,000 years. However, IBM claims a supercomputer with more disk storage could solve the problem in 2.5 days. In August 2020, 12 qubits of Sycamore are used to simulate a chemical reaction.

Mar
2020

Honeywell reports a demonstration of a quantum computer architecture based on trapped-ion QCCD (Quantum Charge-Coupled Device). This year sees more interest from start-ups pursuing trapped-ion designs. Trapped-ion quantum computers can be traced to 1995, but most big companies (IBM, Intel, Google) focused on superconductors instead.

## Cite As

Devopedia. 2021. "Quantum Computing." Version 9, February 11. Accessed 2021-03-28. https://devopedia.org/quantum-computing