Quantum Computing

Comparing classical and quantum computing. Source: CB Insights 2021.

Quantum computers are next-generation computers that take a novel approach to processing information. Built on the principles of quantum mechanics, they can solve certain difficult problems that are practically impossible for conventional computers. For such problems, quantum properties such as superposition and entanglement make quantum computers dramatically faster.

Conventional computing, nowadays called classical computing, uses transistors to process information. Quantum computing works at the level of atoms and subatomic particles. In fact, there's a limit to how much we can miniaturize transistors: beyond a certain point, quantum effects come into play. This is where quantum computing becomes relevant.

Quantum computing is a subfield of quantum information science. Many aspects of quantum computing are yet to be understood. Building such computers and making them practical is still a challenge. As such, we expect classical computers to serve their purpose for many more decades.

Discussion

  • What are some possible applications of quantum computers?
    Use cases of quantum computing. Source: Consultancy.uk 2020.

    Cryptography is one application. Today's digital systems are secure because classical computers can't factor large integers within a reasonable time. A quantum computer can do this using Shor's algorithm. Conversely, quantum techniques applied to cryptography could lead to more secure systems.
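
    The quantum speedup lives entirely in the order-finding step. The sketch below is a purely classical toy (the function names are illustrative) that performs that step by brute force, so it only works for tiny numbers, but it shows the number-theoretic reduction that Shor's algorithm exploits:

    ```python
    from math import gcd
    from random import randrange

    def find_order(a, n):
        """Smallest r > 0 with a^r = 1 (mod n). This brute-force search is
        the step a quantum computer speeds up exponentially."""
        r, x = 1, a % n
        while x != 1:
            x = (x * a) % n
            r += 1
        return r

    def shor_factor(n):
        """Classical skeleton of Shor's algorithm for a small odd composite n."""
        while True:
            a = randrange(2, n)
            d = gcd(a, n)
            if d > 1:
                return d                 # lucky guess: a already shares a factor
            r = find_order(a, n)
            if r % 2 == 0 and pow(a, r // 2, n) != n - 1:
                return gcd(pow(a, r // 2) - 1, n)

    print(shor_factor(15))   # prints 3 or 5
    ```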

    Search problems, or more generally optimisation over many variables, are not efficiently solved using classical computers. A quantum computer running Grover's algorithm can search an unstructured space of \(N\) items in roughly \(\sqrt{N}\) steps, a quadratic speedup over classical brute-force search.
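
    To see the mechanics, here is a small state-vector simulation of Grover's algorithm (a sketch in NumPy, not a real device; the marked index is arbitrary). The oracle flips the sign of the marked item's amplitude and the diffusion step amplifies it:

    ```python
    import numpy as np

    n_qubits = 3                  # search over N = 2**3 = 8 items
    N = 2 ** n_qubits
    marked = 5                    # the item the oracle "recognizes"

    state = np.full(N, 1 / np.sqrt(N))    # uniform superposition over all items

    # The optimal number of Grover iterations is about (pi/4) * sqrt(N).
    for _ in range(int(np.pi / 4 * np.sqrt(N))):
        state[marked] *= -1               # oracle: flip the marked amplitude
        state = 2 * state.mean() - state  # diffusion: inversion about the mean

    probabilities = state ** 2
    print(probabilities.argmax())           # -> 5
    print(round(probabilities[marked], 3))  # -> 0.945
    ```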

    In domains such as chemistry and nanotechnology, there's a need to simulate quantum systems. Classical computers are inefficient at such simulations; quantum computers are better suited for this purpose. For example, modelling complex molecular interactions at the atomic level can lead to drug discovery. Quantum simulations can also help us build more efficient energy systems, better superconductors and new materials.

    Machine learning involves training models with lots of data. Quantum machine learning is being explored to speed up learning.

    In general, where classical computers are overwhelmed by lots of data and endless calculations, quantum computers can fill the gap.

  • Which are the essential quantum properties enabling quantum computing?

    Three quantum properties are being harnessed for quantum computing:

    • Superposition: Given two states, a quantum particle exists in both states at the same time. Alternatively, we may say that the particle exists in some combination of the two states. The particle can be programmed such that, for example, a measurement finds it in one state 30% of the time and in the other state 70% of the time.
    • Entanglement: Two quantum particles can form a single system and influence each other. Measurements on one are correlated with measurements on the other.
    • Quantum Interference: Trying to measure the current state of a quantum particle leads to a collapse; that is, the measured state is one of the two states, not something in between. External interference influences the probability of the particle collapsing to one state or the other. Quantum computing systems must therefore be protected from external interference. The toy simulation following this list illustrates superposition and measurement collapse.
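
    The following is only a probabilistic sketch in NumPy (it models measurement statistics, not quantum dynamics). It shows a qubit prepared for a 30/70 split and the collapse to a definite state on measurement:

    ```python
    import numpy as np

    rng = np.random.default_rng(0)

    # A qubit prepared so measurement yields |0> ~30% and |1> ~70% of the time.
    state = np.array([np.sqrt(0.3), np.sqrt(0.7)])   # amplitudes for |0>, |1>

    def measure(state):
        """Measurement collapses the superposition to one definite state."""
        outcome = 0 if rng.random() < abs(state[0]) ** 2 else 1
        collapsed = np.zeros(2)
        collapsed[outcome] = 1.0    # nothing 'in between' survives measurement
        return outcome, collapsed

    outcomes = [measure(state)[0] for _ in range(10_000)]
    print(sum(outcomes) / len(outcomes))   # ~0.7: fraction of |1> results
    ```
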
  • What are qubits and pbits?
    Illustrating the classical bit, the pbit and the qubit. Source: Knill et al. 2002, fig. 1.

    In classical computing, the unit of information is the bit. Its value is either 0 or 1. In quantum computing, due to superposition, the classical bit is inadequate to represent a continuum of states. Therefore, the quantum bit, or qubit, has been defined as the unit of quantum information.

    To say that a qubit can be both 0 and 1 simultaneously can sound mystical. A simple analogy is to consider an x-y plane and a vector at 45 degrees to the x-axis. This vector has both x and y components at the same time. Likewise, a qubit is both 0 and 1 simultaneously. We can visualize it as any point on the Bloch sphere, with the poles representing 0 and 1.
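
    In the standard notation, a qubit state is a normalized combination of the two basis states, which the Bloch sphere parametrizes with two angles:

    \[
    |\psi\rangle = \alpha\,|0\rangle + \beta\,|1\rangle, \qquad |\alpha|^2 + |\beta|^2 = 1
    \]

    \[
    |\psi\rangle = \cos\tfrac{\theta}{2}\,|0\rangle + e^{i\varphi}\sin\tfrac{\theta}{2}\,|1\rangle
    \]

    Here \(\theta\) and \(\varphi\) locate the point on the sphere: \(\theta = 0\) gives \(|0\rangle\) at one pole and \(\theta = \pi\) gives \(|1\rangle\) at the other.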

    The probabilistic bit, or pbit, is used as a simpler alternative to the qubit. The state of a pbit is a probabilistic combination of the two logical states 0 and 1.

    A process called conditioning converts pbits to classical information. Similarly, another process called measurement converts qubits to classical information.

  • What makes quantum computers compute faster than classical computers?
    Superposition gives quantum computing its speed. Source: bergy 2016.

    In classical computing, an n-bit register represents a single value in the range \([0, 2^n - 1]\). In quantum computing, an n-qubit system represents all \(2^n\) possible values at the same time. Alternatively, if we add a single bit to a classical computer, its memory increases by one bit. The same change to a quantum computer doubles the size of the state it can represent.

    Consider a 4-bit input. To process all 16 possible input patterns, a classical computer would execute its algorithm one input at a time. A 4-qubit quantum computer can process all 16 patterns at the same time. As more qubits are added, processing power grows exponentially.
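
    To make the doubling concrete, the sketch below counts the amplitudes a classical machine would need just to store an n-qubit state (assuming one 16-byte complex number per amplitude):

    ```python
    import numpy as np

    # Memory to hold the full state vector of an n-qubit system:
    # 2^n complex amplitudes, doubling with every added qubit.
    for n in [4, 10, 20, 30]:
        amplitudes = 2 ** n
        mib = amplitudes * 16 / 2 ** 20      # complex128 = 16 bytes
        print(f"{n:>2} qubits -> {amplitudes:>13,} amplitudes, {mib:,.1f} MiB")

    # A 4-qubit register holds amplitudes for all 16 classical patterns at once.
    state = np.full(16, 1 / 4, dtype=complex)   # uniform: 1/sqrt(16) each
    print(np.sum(np.abs(state) ** 2))           # probabilities sum to 1.0
    ```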

    This is why a 50-qubit system can be compared to a classical computer handling about \(2^{50}\), or roughly a million billion, values. A qubit counts for much more than a classical bit. Fifty qubits is an approximate threshold at which quantum computers can perform calculations not feasible with classical computers.

  • What does it take to build quantum computers?
    Material systems used to build qubits. Source: Levy Research 2013.

    To build a quantum computer, qubits must be housed in an environment that minimizes interference and maximizes coherence. We need a method to transfer signals to qubits, and classical computers to send instructions to them. Thus, it's not just about computing with qubits but also about converting from and to classical information.

    Various physical systems can be used to build qubits: ion traps, crystal defects, semiconductor quantum dots, superconductors, topological nanowires, neutral atoms in optical lattices, nuclear magnetic resonance (NMR) and more.

    Just as transistors are combined in classical computing to implement Boolean logic using logic gates, there are various quantum computing models. The quantum circuit model is one such model, using quantum gates. Other models include the one-way or measurement-based quantum computer (MBQC), the adiabatic quantum computer (AQC), and the topological quantum computer. The Quantum Turing Machine (QTM) is of theoretical importance. All these models are computationally equivalent to one another.
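
    As a minimal sketch of the quantum circuit model (a NumPy state-vector simulation, not a hardware description), the two-gate circuit below applies a Hadamard gate and then a CNOT to produce an entangled Bell state:

    ```python
    import numpy as np

    # Standard gate matrices: single-qubit Hadamard and two-qubit CNOT.
    H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)
    I = np.eye(2)
    CNOT = np.array([[1, 0, 0, 0],
                     [0, 1, 0, 0],
                     [0, 0, 0, 1],
                     [0, 0, 1, 0]])

    state = np.array([1, 0, 0, 0], dtype=float)   # two qubits, both |0>
    state = np.kron(H, I) @ state                 # Hadamard on the first qubit
    state = CNOT @ state                          # entangle the pair

    print(state)   # [0.707 0 0 0.707]: the Bell state (|00> + |11>) / sqrt(2)
    ```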

    Quantum algorithms have to be developed. These are often specific to the computing model.

  • What does "quantum" refer to in quantum computing?

    A simple answer is that the term "quantum" comes from quantum mechanics. In physics, it refers to properties of atomic or subatomic particles, such as electrons, neutrinos, and photons.

    The real question is what aspects of quantum mechanics make quantum computing better than classical computing. There's no definitive answer though some theories have been put forward.

    Interference, entanglement, quantum contextuality, the logical structure of quantum mechanics, and quantum parallelism have all been proposed. All of these have their critics. We don't understand this very well, and perhaps this is why we have very few quantum algorithms.

    A dominant theory is the Quantum Parallelism Thesis (QPT), which in turn suggests the Many Worlds Interpretation (MWI). Due to superposition, quantum computers compute on different input values simultaneously. In some sense, computations happen in parallel in many classical universes. Decoherence is used to distinguish one world from another. But this is at odds with the quantum circuit model and its coherent superpositions. Moreover, any measurement collapses the output state. Quantum algorithms must therefore construct 'clever' superpositions to increase the chance of obtaining a useful result.

  • What are some practical difficulties of building quantum computers?

    Challenges include fabrication, verification and architecture. Since quantum states are fragile, fabrication must be precise. Since any measurement collapses the state in a probabilistic manner, verification is difficult. Errors are more common than in classical computing and error correction is essential to the architecture.

    Interaction with the external environment results in decoherence. At best, we're able to maintain coherence for only a few microseconds. Coherence becomes harder to achieve with more qubits. For superconducting qubits, coherence requires cooling to near absolute zero.

    Scaling up to more qubits on a single chip is another challenge. Today's systems need multiple control wires for every qubit.

    For various reasons that we can simply call noise, qubits change state. We therefore need efficient error correction methods. One approach is to map many physical qubits (as many as 10,000) to a single logical qubit. This makes it even more difficult to build a useful quantum machine. Some consider quantum error correction the most important problem needing focused research. Some even estimate that 99% of quantum computation will be spent on error correction.
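
    The redundancy idea can be illustrated with a classical analogy (a sketch only: real quantum codes, such as surface codes, measure error syndromes instead, since qubits can't be copied or read out directly). One logical bit is encoded into three physical bits and decoded by majority vote:

    ```python
    import numpy as np

    rng = np.random.default_rng(1)

    def encode(bit):
        return np.array([bit, bit, bit])     # 1 logical bit -> 3 physical bits

    def noisy_channel(bits, flip_prob=0.1):
        return bits ^ (rng.random(bits.shape) < flip_prob)   # random bit flips

    def decode(bits):
        return int(bits.sum() >= 2)          # majority vote

    trials = 100_000
    errors = sum(decode(noisy_channel(encode(1))) != 1 for _ in range(trials))
    print(errors / trials)   # ~0.028, versus 0.1 for a single unprotected bit
    ```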

Milestones

1975

Poplavskii shows that it's computationally difficult to simulate quantum systems using classical computers. This view is reiterated by Feynman at a conference at MIT in 1980.

1976

Polish mathematical physicist Roman Stanisław Ingarden publishes a seminal paper titled Quantum Information Theory. It's an attempt to generalize Shannon's information theory.

1980

Paul Benioff describes the first quantum mechanical model of a computer. He uses the Schrödinger equation to describe a Quantum Turing Machine (QTM). Yuri Manin proposes the idea of quantum computing. Benioff further develops his QTM in 1982.

1982

Wootters, Zurek and Dieks state the No-Cloning Theorem: it's impossible to make identical copies of an arbitrary unknown quantum state. The theorem is consistent with the uncertainty principle in quantum mechanics. It implies that a qubit can't be copied, which is a problem for backups and error correction. However, it's a useful property for cryptography.

1985

David Deutsch at the University of Oxford describes the first universal quantum computer. Like its classical counterpart, the Universal Turing Machine (UTM), it can simulate any other quantum computer in polynomial time.

1994

Peter Shor at AT&T Bell Labs invents a quantum algorithm that can factor large integers much faster than classical computers. Theoretically, this can break many cryptographic systems in current use. This sparks interest in quantum computing. In time, this is called Shor's Algorithm. Shor assumes that qubits maintain their states; in practice, decoherence is a problem.

1995

Shor and Steane independently invent the first quantum error correcting codes. This circumvents the no-cloning theorem.

1996

Lov Grover at Bell Labs invents the quantum database search algorithm. This is relevant to a variety of problems. In particular, brute-force search problems obtain quadratic speedup. In time, this is called Grover's Algorithm.

1998
Progress of quantum computing in terms of number of qubits. Source: Emilio 2019.

Researchers at the University of Oxford demonstrate a quantum algorithm on a 2-qubit NMR quantum computer. IBM researchers achieve a similar demonstration. The same year Grover's algorithm is demonstrated on an NMR quantum computer.

2015

Cambridge Quantum Computing releases t|ket>, possibly the first operating system for quantum computing. It enables classical computers to interface with quantum computers.

2016
History of quantum computing till 2016 when it becomes available on IBM Cloud. Source: Gumann 2019.

IBM's 5-qubit quantum computer becomes available on the cloud for academics and researchers, leading to demonstrations in 2017.

Oct
2019

Google claims to have achieved quantum supremacy with a 53-qubit programmable superconducting processor. Named Sycamore, it's based on quantum logic gates. The machine completes a benchmark test in 200 seconds, a task that a classical supercomputer would take 10,000 years to finish. However, IBM claims a supercomputer with more disk storage could solve the problem in 2.5 days. In August 2020, 12 qubits of Sycamore are used to simulate a chemical reaction.

Mar
2020

Honeywell reports on a demonstration of a quantum computer architecture based on trapped-ion QCCD (Quantum Charge-Coupled Device). This year sees more interest from start-ups pursuing trapped-ion QCCD. Trapped-ion quantum computers can be traced back to 1995, but most companies (IBM, Intel, Google) focused on superconductors instead.

References

  1. Ball, Philip. 2018. "The Era of Quantum Computing Is Here. Outlook: Cloudy." Quanta Magazine, January 24. Accessed 2021-02-09.
  2. Busvine, Douglas, and Paresh Dave. 2019. "Google unveils quantum computer breakthrough; critics say wait a qubit." Reuters, October 23. Accessed 2021-02-09.
  3. CB Insights. 2021. "What Is Quantum Computing?" CB Insights, January 7. Updated 2021-01-25. Accessed 2021-02-10.
  4. Cho, Adrian. 2020. "The biggest flipping challenge in quantum computing." Science Magazine, American Association for the Advancement of Science, July 9. Accessed 2021-02-09.
  5. Clarke, Jim. 2019. "An Optimist’s View of the 4 Challenges to Quantum Computing." IEEE Spectrum, March 22. Accessed 2021-02-09.
  6. Consultancy.uk. 2020. "Quantum computing market to reach $1 trillion by 2035." Consultancy.uk, April 15. Accessed 2021-02-10.
  7. Cuthbertson, Anthony. 2015. "First quantum computer operating system developed by Cambridge researchers." International Business Times, May 5. Accessed 2021-02-09.
  8. Diamandis, Peter H. 2016. "Massive Disruption is coming with Quantum computing." SingularityHub, October 10. Accessed 2021-02-09.
  9. Dyakonov, Mikhail. 2018. "The Case Against Quantum Computing." IEEE Spectrum, November 15. Accessed 2021-02-09.
  10. Emilio, M. Di Paolo. 2019. "Quantum Computer Design: An introduction." EDN Asia, December 11. Accessed 2021-02-10.
  11. Franklin, Diana and Frederic T. Chong. 2004. "Challenges in Reliable Quantum Computing." In: Shukla S.K., Bahar R.I. (eds) Nano, Quantum and Molecular Computing. Springer, Boston, MA. doi: 10.1007/1-4020-8068-9_8. Accessed 2021-02-09.
  12. Galer, Susan. 2019. "Why Quantum Computers Won’t Replace Classical Computers Anytime Soon." Forbes, September 4. Accessed 2021-02-10.
  13. Gibney, Elizabeth. 2020. "Quantum computer race intensifies as alternative technology gains steam." Nature, November 17. Accessed 2021-02-09.
  14. Gimeno-Segovia, Mercedes. 2016. "Quantum computing and its models." Quanta for Breakfast, August 19. Accessed 2021-02-10.
  15. Gumann, Pat. 2019. "A System Overview of Quantum Computing." Part of Chapter 3 in: Domestic Manufacturing Capabilities for Critical DoD Applications, Emerging Needs in Quantum-enabled Systems, Proceedings of a Workshop, National Academies of Sciences, Engineering, and Medicine, The National Academies Press. Accessed 2021-02-10.
  16. Hightower, Ray. 2019. "Quantum Computing at Caltech." Blog, February 27. Accessed 2021-02-09.
  17. Honeywell. 2020. "Scientific Paper: Demonstration of the QCCD Trapped-Ion Quantum Computer Architecture." News, Honeywell, March. Accessed 2021-02-10.
  18. Knill, Emanuel, Raymond Laflamme, Howard N. Barnum, Diego A. Dalvit, Jacek J. Dziarmaga, James E. Gubernatis, Leonid Gurvits, Gerardo Ortiz, Lorenza Viola, and Wojciech H. Zurek. 2002. "Quantum Information Processing: A hands-on primer." Los Alamos Science, no. 27, November. Accessed 2021-02-09.
  19. Levy Research. 2013. "Materials Issues for Quantum Computation." Levy Research, October 15. Updated 2016-10-31. Accessed 2021-02-09.
  20. Lisitsa, Nikita. 2019. "The Greatest Challenge of Modeling Large Quantum Computers." Blog, Serokell Labs, April 11. Accessed 2021-02-09.
  21. Martinis, John and Sergio Boixo. 2019. "Quantum Supremacy Using a Programmable Superconducting Processor." Google AI Blog, Google, October 23. Accessed 2021-02-09.
  22. Microsoft. 2018. "What is quantum computing?" Microsoft Azure. Accessed 2021-02-09.
  23. Moutafis, Rhea. 2021. "Will Quantum Computers Replace Their Classical Counterparts?" Built In, January 23. Updated 2021-01-24. Accessed 2021-02-09.
  24. Quantiki. 2015. "The no-cloning theorem." Quantiki, October 26. Accessed 2021-02-10.
  25. Resch, Salonik, and Ulya R. Karpuzcu. 2019. "Quantum Computing: An Overview Across the System Stack." arXiv, v3, October 19. Accessed 2021-02-09.
  26. Savage, Neil. 2020. "Google’s Quantum Computer Achieves Chemistry Milestone." Scientific American, September 4. Accessed 2021-02-09.
  27. Stanford University. 2006. "Quantum Computing." The Stanford Encyclopedia of Philosophy, The Metaphysics Research Lab, Stanford University, December 3. Updated 2019-09-30. Accessed 2021-02-09.
  28. Wikipedia. 2021a. "Quantum computing." Wikipedia, February 9. Accessed 2021-02-09.
  29. Wikipedia. 2021b. "Timeline of quantum computing and communication." Wikipedia, January 28. Accessed 2021-02-09.
  30. bergy. 2016. "Quantum computing and cryptography: Are Steemit and bitcoin safe?" On Steemit, August 8. Accessed 2021-02-09.

Further Reading

  1. D-Wave Systems. 2021. "Quantum Computing Primer." Tutorial, D-Wave Systems. Accessed 2021-02-09.
  2. Chen, Frank. 2016. "Quantum Computing: A Primer." Andreessen Horowitz, June 26. Accessed 2021-02-09.
  3. Almudever, C. G., L. Lao, X. Fu, N. Khammassi, I. Ashraf, D. Iorga, S. Varsamopoulos, C. Eichler, A. Wallraff, L. Geck, A. Kruth, J. Knoch, H. Bluhm, and K Bertels. 2017. "The engineering challenges in quantum computing." DATE '17: Proceedings of the Conference on Design, Automation & Test in Europe, pp. 836-845, March. Accessed 2021-02-09.
  4. Knill, Emanuel, Raymond Laflamme, Howard N. Barnum, Diego A. Dalvit, Jacek J. Dziarmaga, James E. Gubernatis, Leonid Gurvits, Gerardo Ortiz, Lorenza Viola, and Wojciech H. Zurek. 2002. "Quantum Information Processing: A hands-on primer." Los Alamos Science, no. 27, November. Accessed 2021-02-09.
  5. Resch, Salonik, and Ulya R. Karpuzcu. 2019. "Quantum Computing: An Overview Across the System Stack." arXiv, v3, October 19. Accessed 2021-02-09.
  6. Cho, Adrian. 2020. "The biggest flipping challenge in quantum computing." Science Magazine, American Association for the Advancement of Science, July 9. Accessed 2021-02-09.

Cite As

Devopedia. 2021. "Quantum Computing." Version 9, February 11. Accessed 2024-06-25. https://devopedia.org/quantum-computing