Ozone Chain

The Arrival of Quantum Computing

"If you think you understand quantum mechanics, you don't understand quantum mechanics." – Richard P. Feynman
Quantum computing is a rapidly progressing technology that exploits the laws of quantum mechanics to solve problems that are intractable for classical computers. Quantum mechanics describes nature at the smallest scales, governing the interactions between atoms and subatomic particles. Quantum computing is a type of computation that harnesses the collective properties of quantum states, such as superposition, interference, and entanglement, to perform calculations.
Quantum computers are capable of solving certain computational problems, such as integer factorization (which underlies RSA encryption), substantially faster than classical computers.

Bits to Qubits

Classical Computers
Classical computers use binary digits, or bits (0s and 1s), to store, transfer, and manipulate data. A bit can be in only one of two states: a one or a zero. It is either on or off.
Physically, a bit is represented as a voltage inside a transistor, a magnetic domain on a hard disk, or light in an optical fiber. The binary nature of a bit arises because these physical quantities are manipulated as aggregate, classical states rather than at the level of individual quantum particles.

Quantum computers

In quantum computing, the smallest unit of information is the quantum bit, or qubit. Quantum computers manipulate subatomic particles in a non-binary way by exploiting their quantum properties.
Quantum superposition allows a qubit to occupy both zero and one at the same time. The superposition of a single qubit is commonly visualized on a Bloch sphere. A qubit's state encodes the probabilities of yielding a 1 or a 0; upon measurement, the qubit collapses to one of those two values.
Physically, a qubit can take the form of a photon, a subatomic particle such as a single trapped electron or ion, a superconducting circuit such as a Josephson junction, or nuclear magnetic resonance on molecules in solution. Efforts towards building a physical quantum computer focus on technologies such as transmons, ion traps, and topological quantum computers, which aim to create high-quality qubits.
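The superposition and measurement behavior described above can be sketched with a small state-vector simulation. The following is a minimal illustration in Python (assuming NumPy is available), not real quantum hardware: the qubit is modelled as a 2-component complex vector, a Hadamard gate creates the equal superposition, and the Born rule gives the measurement probabilities.

```python
import numpy as np

# A qubit is a unit vector in C^2; the basis state |0> is (1, 0).
ket0 = np.array([1, 0], dtype=complex)

# The Hadamard gate turns |0> into an equal superposition of |0> and |1>.
H = np.array([[1, 1], [1, -1]], dtype=complex) / np.sqrt(2)
psi = H @ ket0                      # state (|0> + |1>) / sqrt(2)

# Born rule: the probability of each outcome is the squared amplitude.
p0 = abs(psi[0]) ** 2               # probability of measuring 0 (~0.5)
p1 = abs(psi[1]) ** 2               # probability of measuring 1 (~0.5)

# Measurement "collapses" the superposition to a single classical bit.
rng = np.random.default_rng()
outcome = rng.choice([0, 1], p=[p0, p1])
print(p0, p1, outcome)
```

Before measurement the qubit genuinely carries both amplitudes; after measurement only a single classical bit remains, which is why repeated runs are needed to see the underlying probabilities.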

A chronology of Quantum Computing

During the late 19th and early 20th centuries, scientists' understanding of the atomic model shifted towards the concept of subatomic particles such as electrons, protons, and neutrons and their corresponding characteristics.
  • 1905 - Albert Einstein explains the photoelectric effect: when light is incident on certain materials, it causes the material to release electrons. Light itself consists of individual quantum particles, or photons. This is in contrast to classical electromagnetism, which predicts that continuous light waves transfer energy to electrons.
  • 1925 - A conceptually autonomous and logically consistent formulation of quantum mechanics called matrix mechanics is formulated by Werner Heisenberg, Max Born, and Pascual Jordan. Physical properties of particles are interpreted as matrices that evolve in time.
  • 1935 - Albert Einstein, Boris Podolsky, and Nathan Rosen (EPR) publish a paper highlighting the paradoxical nature of quantum superpositions and arguing that the description of physical reality provided by quantum mechanics is incomplete.
  • 1935 - Erwin Schrödinger, discussing quantum superposition with Albert Einstein and critiquing the Copenhagen interpretation of quantum mechanics, develops a thought experiment in which a cat (known as Schrödinger’s cat) is simultaneously dead and alive; Schrödinger also coins the term “quantum entanglement”.
  • 1985 - David Deutsch of the University of Oxford formulates a description of a quantum Turing machine. His accompanying principle states that a universal computing device can simulate every physical process.
  • 1992 - The Deutsch–Jozsa algorithm is one of the first deterministic quantum algorithms that is exponentially faster than any possible deterministic classical algorithm.
  • 1994 - Peter Shor of Bell Laboratories develops a quantum algorithm for factoring integers. It has the potential to break RSA-encrypted communications, a commonly-used method for securing data transmissions.
  • 1994 - The National Institute of Standards and Technology organizes the first US government-sponsored conference on quantum computing.
  • 1999 - Yasunobu Nakamura of the University of Tokyo and Jaw-Shen Tsai of Tokyo University of Science demonstrate that a superconducting circuit can be used as a qubit.
  • 2004 - First five-photon entanglement demonstrated by Jian-Wei Pan's group at the University of Science and Technology of China.
  • 2011 - The first commercially available quantum computer is offered by D-Wave Systems.
  • 2017 - Chinese researchers use quantum entanglement to accomplish the first quantum teleportation of independent single-photon qubits from a ground observatory to a low Earth orbit satellite, over a distance of up to 1,400 km.
  • 2018 - The National Quantum Initiative Act is signed into law by the US President, establishing the goals and priorities for a 10-year plan to accelerate the development of quantum information science and technology applications in the United States.
  • 2019 - Google's 53-qubit quantum processor, Sycamore, reaches quantum supremacy by performing in 200 seconds a series of operations that would take a classical supercomputer about 10,000 years to complete.
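The quantum speedup behind the 1992 Deutsch–Jozsa entry above can be illustrated for the single-bit case (Deutsch's algorithm): deciding whether a function f: {0,1} → {0,1} is constant or balanced with a single query, where any classical algorithm needs two. Below is a toy state-vector simulation in Python, assuming NumPy is available; `deutsch` and `oracle` are illustrative helper names for this sketch, not a real library API.

```python
import numpy as np
from itertools import product

H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)   # Hadamard gate
I = np.eye(2)                                  # identity gate

def oracle(f):
    """Build the 2-qubit unitary U_f: |x, y> -> |x, y XOR f(x)>."""
    U = np.zeros((4, 4))
    for x, y in product([0, 1], repeat=2):
        U[2 * x + (y ^ f(x)), 2 * x + y] = 1
    return U

def deutsch(f):
    """Classify f as 'constant' or 'balanced' with one oracle query."""
    state = np.kron([1, 0], [0, 1]).astype(float)   # start in |0>|1>
    state = np.kron(H, H) @ state                   # superpose both qubits
    state = oracle(f) @ state                       # the single query
    state = np.kron(H, I) @ state                   # interfere the amplitudes
    # Probability that the first qubit measures 0:
    p0 = state[0] ** 2 + state[1] ** 2
    return "constant" if np.isclose(p0, 1) else "balanced"

print(deutsch(lambda x: 0))   # constant
print(deutsch(lambda x: x))   # balanced
```

The interference step is the point: amplitudes from the two inputs cancel or reinforce so that one measurement reveals a global property of f that classically requires evaluating f on every input.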

Current status of Quantum Computing

Recent years have seen rising interest in quantum computing, fueled by several technological breakthroughs and a significant increase in investment from both the private sector and governments.
Quantum machine learning, quantum simulation, quantum computation, quantum artificial intelligence, quantum linear algebra, and quantum optimization and search are generating a lot of interest and now have industrial applications.
Quantum cryptography is being implemented in industries such as banking: Swiss private banks use it to protect sensitive data. Cloud providers have released quantum machine learning toolkits on GitHub. IBM, Google, Alibaba, Microsoft, Amazon, and others provide Quantum-as-a-Service (QaaS) cloud computing. Quantum computing services are being offered at affordable rates, marking the mainstreaming of the technology for industries and consumers.