QUANTUM 101
We have a man called Max Planck to blame for all this. His discovery of energy quanta, the idea that the heat or light coming off something is emitted not as a continuous flow but as a stream of little lumps, won him the Nobel Prize in 1918.
This threw open the whole discipline of quantum mechanics, in which you’ll find names such as Bohr, Fermi, Schrödinger, Pauli, Heisenberg, and Einstein.
Indeed, Einstein won his 1921 Nobel Prize for explaining the photoelectric effect, rather than for his more famous theories of relativity.
In the quantum world, particles behave both like points and like little waves at the same time, with the size of the wave in any given region of space telling you how likely you are to find the particle there. Quantum computing relies on these strange properties of subatomic particles to do calculations in ways no classical computer could, using qubits instead of bits.
These qubits can be set at 1, 0, or both at the same time (known as a superposition), and often have to be cooled to temperatures close to absolute zero to minimize the effect of noise, or outside interference, on the calculation.
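If you like, you can picture this with a few lines of ordinary Python, run on a perfectly classical computer. This is only a rough sketch of the bookkeeping (the names and numbers are ours, not anything a real quantum machine executes): a qubit is described by two "amplitudes", one for 0 and one for 1, and squaring each amplitude gives the odds of seeing that answer when you measure.

    import numpy as np

    # A qubit's state is a pair of complex "amplitudes" for 0 and 1.
    # The squared size of each amplitude is the probability of
    # measuring that value, so the two must add up to 1.
    zero = np.array([1, 0], dtype=complex)   # definitely 0
    one = np.array([0, 1], dtype=complex)    # definitely 1

    # An equal superposition: "both at the same time" until measured,
    # when it collapses to 0 or 1 with 50/50 odds.
    superposition = (zero + one) / np.sqrt(2)

    def measure(state, shots=1000):
        """Simulate repeatedly measuring a single-qubit state."""
        probs = np.abs(state) ** 2            # squared amplitudes
        outcomes = np.random.choice([0, 1], size=shots, p=probs)
        return np.bincount(outcomes, minlength=2)

    print(measure(zero))            # about [1000,   0]
    print(measure(superposition))   # roughly [500, 500]

Running it, the "definite" qubit always comes up 0, while the superposed one lands on 0 about half the time and 1 the other half, which is the flavour of behaviour quantum algorithms are built to exploit.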
The whole field is in its infancy, and quantum computers perform best on algorithms specially designed for them.
Quantum supremacy, where a quantum computer completes a calculation that no classical computer could manage in any practical amount of time, has been claimed but not yet conclusively proven, and we’re a long way from general-purpose quantum computers that can replace your PC.