PC Pro

DICK POUNTAIN

We need to be realistic about the gains offered by quantum computing, but that doesn’t mean we can’t look forward to a fantastic future

dick@dickpountain.co.uk

Recently, I read Quantum Computing: How it Works and Why it Could Change the World by Amit Katwala. It’s a great intro to the present state of, and prospects for, quantum computing – not too technical, wasting little space on the basics while avoiding the hieroglyphics of quantum algorithms. It’s also honest about the fact that quantum computers barely exist and that their prospects remain rather dim.

Katwala offers a clear summary of the three major current research directions: laser-controlled ion traps (as pursued by Amazon/IonQ), cryogenic Josephson junctions (Google and IBM), and “topological qubits” (Microsoft). As he goes, he points out their weaknesses: ion traps need too many lasers to be scalable; cooling to 0.01 K (ten millikelvin) requires monstrous cryostats that consume lots of energy; and Microsoft’s trick for dodging decoherence would depend on an as-yet-undiscovered fundamental particle!

Katwala is candid about the problem that all three strands of research share, namely that environmental noise makes qubits rapidly lose their entanglement, leaving barely microseconds in which to perform useful computation. This decoherence also renders the results unreliable, so a huge degree of error correction is required: each working qubit must be surrounded by dozens of error-correction qubits, and when an error is found the error-correction qubits themselves must be corrected.
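
To get a feel for how punishing that overhead is, here’s a back-of-envelope sketch in Python. It assumes the commonly quoted surface-code rule of thumb of roughly 2d² physical qubits (strictly 2d² - 1) per logical qubit at code distance d; the machine size and code distance below are illustrative numbers of my own, not figures from Katwala’s book.

    def physical_qubits(logical_qubits: int, code_distance: int) -> int:
        """Estimate total physical qubits, assuming a surface code that
        needs d*d data qubits plus d*d - 1 measurement qubits for every
        logical qubit (a commonly quoted rule of thumb)."""
        per_logical = 2 * code_distance ** 2 - 1
        return logical_qubits * per_logical

    # A hypothetical 100-logical-qubit machine at code distance 17
    # already needs tens of thousands of physical qubits:
    print(physical_qubits(100, 17))  # 57700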

To give an example, Google’s Sycamore chip, which is claimed to have achieved “quantum supremacy”, contains 53 qubits – and in any fault-tolerant successor, most of the qubits would be consumed by error correction. This is why John Preskill, one of the field’s leading researchers, has dubbed this the noisy intermediate-scale quantum (NISQ) era: quantum computers exist, but they aren’t yet robust enough to fulfil their promise. Harsher critics suspect the second law of thermodynamics may make quantum computing inherently unfeasible.

That hardly matters, though, because the quantum bandwagon has become unstoppable. The promise of quantum computing, namely an exponential speed increase over classical computers, threatens to make public-key encryption – as used by the military, banks and even WhatsApp – crackable. This makes research a matter of national security, unlocking unlimited funding and starting a new Cold War-style arms race between China and the West.

But Katwala is candid that this promise is itself dubious. The known problem classes for which quantum algorithms give an exponential speed-up are few, and the best-known quantum algorithms offer only a quadratic, not exponential, advantage over classical computers. This isn’t peanuts – reducing a million-step calculation to a thousand steps may be the difference between overnight and almost real-time – but it won’t satisfy the crypto crowd, nor justify such massive research budgets.
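
To make that arithmetic concrete, here’s a minimal Python sketch of what a quadratic (Grover-style) speed-up buys: N classical steps shrink to roughly the square root of N. The step counts are my own illustrations, not figures from the book.

    import math

    def quadratic_speedup_steps(classical_steps: int) -> int:
        """A quadratic speed-up turns N steps into roughly sqrt(N)."""
        return math.isqrt(classical_steps)

    print(quadratic_speedup_steps(1_000_000))  # 1,000: overnight becomes near real-time
    print(quadratic_speedup_steps(10 ** 12))   # 1,000,000: helpful, but no miracle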

On the other hand, conventional deep-learning networks consume colossal amounts of power when training on enormous datasets, so this is one area where even a “mere” quadratic speed advantage would be very welcome.

I’d recommend Katwala’s book as a quick read to bring you up to speed with mainstream quantum thinking, but it doesn’t cover any radically different “long shot” directions. I’m convinced that if quantum computing happens, it will only be through room-temperature, solid-state technologies that are barely here yet, and I’m also an enthusiast for neuromorphic architectures that mimic the nervous systems of animals, using electronic components that may employ hybrid digital-and-analog computations.

Neuromorphic engineering was first pursued in the late 1980s by Carver Mead, who used it to design vision systems, auditory processors and autonomous robots. The convolutional neural networks that are driving the recent explosion in AI and machine learning are only one aspect of a far wider domain of neuromorphic computing models, and of course they employ classical computing components.

Researchers at Purdue University have shown how spintronic quantum measurements might be used to implement neuromorphic processors, and many groups are investigating spin switching in “nitrogen vacancies” within synthetic diamond lattices – diamond-based qubits could resist decoherence for milliseconds rather than microseconds, and at room temperature.

Were I writing a science-fiction screenplay, my quantum computers would be alternating sandwiched layers (think Tunnock’s Caramel Wafer) of epitaxially deposited diamond and twisted graphene, read and written by a flickering laser inside a dynamic, holographic magnetic field. And like a Caramel Wafer, the results could be delicious.

Dick Pountain is editorial fellow of PC Pro and well aware of the addiction risk posed by Tunnock’s Caramel Wafers.
