DICK POUNTAIN
We need to be realistic about the gains offered by quantum computing, but that doesn’t mean we can’t look forward to a fantastic future.
Recently, I read Quantum Computing: How it Works and Why it Could Change the World by Amit Katwala. It’s a great intro to the present state of and prospects for quantum computing – not too technical, wasting little space on the basics but avoiding the hieroglyphics of quantum algorithms. It’s also honest about the fact that quantum computers barely exist and that their prospects remain rather dim.
Katwala offers a clear summary of the three major current research directions: laser-ion traps (as pursued by Amazon/IonQ), cryogenic Josephson junctions (Google and IBM), and “topological qubits” (Microsoft). As he goes, he points to their weaknesses: ion traps need too many lasers to be scalable; cooling to 0.01K requires monstrous cryostats that consume lots of energy; and Microsoft’s trick for dodging decoherence would depend on an undiscovered fundamental particle!
Katwala is candid about the problem that all three strands of research share, namely that environmental noise makes qubits rapidly lose their entanglement, leaving barely microseconds in which to perform useful computation. This decoherence also renders the results unreliable, so a huge amount of error-correction is required: each working qubit must be surrounded by dozens of error-correction qubits, and when an error is found the error qubits themselves must be corrected.
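To get a feel for what that overhead means, here’s a back-of-the-envelope sketch (the 50-to-1 ratio is an illustrative figure standing in for “dozens”, not a real surface-code number, and the function name is my own):

```python
def physical_qubits(logical: int, overhead: int = 50) -> int:
    # Illustrative arithmetic only: if each working (logical) qubit
    # needs roughly `overhead` error-correction qubits around it,
    # the physical qubit count balloons accordingly. Real
    # error-correction overheads vary widely with the scheme used.
    return logical * (1 + overhead)

# A hundred useful qubits would demand thousands of physical ones.
print(physical_qubits(100))  # 5100
```

Seen this way, today’s chips with tens of physical qubits buy you only a handful of reliable ones.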
To give an example, Google’s Sycamore chip, which is claimed to have achieved “quantum supremacy”, contains 53 qubits, but most will have been doing error-correction. This is the reason John Preskill, one of the leading researchers, has dubbed this the noisy intermediate-scale quantum (NISQ) era, where quantum computers exist but aren’t yet robust enough to fulfil their promise. Harsher critics suspect the second law of thermodynamics may make quantum computing inherently unfeasible.
That hardly matters, though, because the quantum bandwagon has become unstoppable. The promise of quantum computing, namely an exponential speed increase over classical computers, threatens to make public-key encryption – as used by the military, banks and even WhatsApp – crackable. This makes research a matter of national security, unlocking unlimited funding and starting a new Cold War-style arms race between China and the West.
But Katwala is candid that this promise is itself dubious. The known problem classes for which quantum algorithms give exponential speed-up are few, and the best-known quantum algorithms offer only a quadratic, not exponential, advantage over classical computers. This isn’t peanuts – reducing a million-step calculation to a thousand may be the difference between overnight and almost real-time – but it won’t satisfy the crypto-crowd, nor will it justify such massive research budgets.
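The million-to-a-thousand figure is just the square-root scaling of a Grover-style search, and a toy sketch makes the arithmetic concrete (these are illustrative step-counting functions, not real quantum code):

```python
import math

def classical_search_steps(n: int) -> int:
    # Unstructured search on a classical machine: in the worst
    # case you examine every one of the n entries.
    return n

def grover_search_steps(n: int) -> int:
    # Grover's algorithm finds the entry in on the order of
    # sqrt(n) queries - a quadratic, not exponential, advantage.
    return math.ceil(math.sqrt(n))

n = 1_000_000
print(classical_search_steps(n))  # 1000000
print(grover_search_steps(n))     # 1000
```

A thousand-fold gain is real money on a big enough workload, but it’s nothing like the exponential wins that Shor-style factoring promises for the few problems it applies to.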
On the other hand, conventional deep-learning networks are a source of concern over the colossal amount of power they consume when training on enormous datasets, so this is an area where even a “mere” quadratic speed advantage would be very welcome.
I’d recommend Katwala’s book as a quick read to bring you up to speed with mainstream quantum thinking, but it doesn’t cover any radically different “long shot” directions. I’m convinced that if quantum computing happens it will only be through room-temperature, solid-state technologies that are barely here yet, and I’m also an enthusiast for neuromorphic architectures that mimic the nervous systems of animals, using electronic components that may employ hybrid digital-and-analog computations.
Neuromorphic engineering was first pursued in the late 1980s by Carver Mead, who used it to design vision systems, auditory processors and autonomous robots. The convolutional neural networks that are driving the recent explosion in AI and machine learning are only one aspect of a far wider domain of neuromorphic computing models, and they obviously employ classical computing components.
Researchers at Purdue University have shown how spintronic quantum measurements might be used to implement neuromorphic processors, and many groups are investigating spin switching in nitrogen-vacancy centres within synthetic diamond lattices – diamond-based qubits could resist decoherence for milliseconds rather than microseconds, and at room temperature.
Were I writing a science-fiction screenplay, my quantum computers would be alternating sandwiched layers (think Tunnock’s Caramel Wafer) of epitaxially deposited diamond and twisted graphene, read and written by a flickering laser inside a dynamic, holographic magnetic field. And like a Caramel Wafer, the results could be delicious.