Has the age of quantum computing arrived?
Ever since Charles Babbage’s conceptual, unrealised Analytical Engine in the 1830s, computer science has been trying very hard to race ahead of its time. Particularly over the last 75 years, there have been many astounding developments – the first electronic programmable computer, the first integrated circuit computer, the first microprocessor.
But the next anticipated step may be the most revolutionary of all.
Quantum computing is the technology that many scientists, entrepreneurs and big businesses expect to provide a, well, quantum leap into the future. If you’ve never heard of it, there’s a helpful video doing the social media rounds that’s got a couple of million hits on YouTube. It features the Canadian prime minister, Justin Trudeau, detailing exactly what quantum computing means.
Trudeau was on a recent visit to the Perimeter Institute for Theoretical Physics in Waterloo, Ontario, one of the world’s leading centres for the study of the field. During a press conference there, a reporter asked him, half-jokingly, to explain quantum computing.
Quantum mechanics is a conceptually counterintuitive area of science that has baffled some of the finest minds – as Albert Einstein said, “God does not play dice with the universe” – so it’s not something you expect to hear politicians holding forth on.
Throw it into the context of computing and let’s just say you could easily make Zac Goldsmith look like an expert on Bollywood. But Trudeau rose to the challenge and gave what many science observers thought was a textbook example of how to explain a complex idea in a simple way.
The concept of quantum computing is relatively new, dating back to ideas put forward in the early 1980s by the late Richard Feynman, the brilliant American theoretical physicist and Nobel laureate.
He conceptualised the possible improvements in speed that might be achieved with a quantum computer. But theoretical physics, while a necessary first step, leaves the real brainwork to practical application.
With normal computers, or classical computers as they’re now called, there are only two options – on and off – for processing information. A computer “bit”, the smallest unit into which all information is broken down, is either a “1” or a “0”.
And the computational power of a normal computer is dependent on the number of binary transistors – tiny power switches – that are contained within its microprocessor.
Back in 1971 the first Intel processor was made up of 2,300 transistors. Intel now produce microprocessors with more than 5bn transistors. However, they’re still limited by their simple binary options. But as Trudeau explained, with quantum computers the bits, or “qubits” as they are known, afford far more options owing to the uncertainty of their physical state.
In the mysterious subatomic realm of quantum physics, particles can act like waves, so that they can be particle or wave or particle and wave. This is what’s known in quantum mechanics as superposition. As a result of superposition a qubit can be a 0 or 1 or 0 and 1.
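The bookkeeping behind that idea can be sketched in a few lines of Python (an illustrative example, not anything from the article or from a real quantum device): a register of n qubits in equal superposition is described by 2**n amplitudes, one per possible bit pattern, which is where the exponential growth comes from. The function name `superposed_register` is invented here for the sketch.

```python
import math

def superposed_register(n):
    """State vector of n qubits, each in an equal superposition of 0 and 1.

    A classical n-bit register holds exactly one of 2**n values; the
    quantum register carries an amplitude for every one of them at once.
    """
    dim = 2 ** n                 # one amplitude per basis state
    amp = 1 / math.sqrt(dim)     # equal weights, normalised so the
    return [amp] * dim           # squared amplitudes sum to 1

for n in (1, 2, 3):
    state = superposed_register(n)
    print(n, "qubit(s) ->", len(state), "simultaneous basis states")
```

Running it prints 2, 4 and 8 basis states for 1, 2 and 3 qubits – the same doubling the article describes next.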
That means it can perform two equations at the same time. Two qubits can perform four equations. And three qubits can perform eight, and so on in an exponential expansion. That leads to some