PC Pro

DICK POUNTAIN

Even Silicon Valley can’t break the second law of thermodynamics, and that’s bad news for the world’s electricity supply

dick@dickpountain.co.uk

If you enjoy programming as much as I do, you’re likely to have encountered the scenario I’m about to describe. You’re tackling a really knotty problem that involves novel data types and frequent forays into the manuals. You eventually crack it, then glance at the clock and realise that hours have passed in total concentration, with you oblivious to all other influences. You suddenly feel thirsty, rise to make a cup of tea and feel weak at the knees, as if you had just run a mile. That’s because using your brain so intensively is, energetically, every bit as demanding as running: it uses up your blood glucose at a ferocious rate. Our brains burn glucose faster than any of our other organs, accounting for up to 20% of our total consumption.

How come? The brain contains no muscles and doesn’t do any heavy lifting or moving – all it does is shift around images, ideas and intentions, which surely must be weightless? Not so. I will admit that a constant refrain of mine in this column has been that the brain isn’t a digital computer in the way the more naive AI enthusiasts believe, but that’s not to say that it isn’t a computing device of a very different (and not fully understood) kind – it most certainly is, and computing consumes energy. Even though a bit would appear to weigh nothing, the switching of silicon gates or salty water neurons requires energy to perform and is less than 100% efficient. It requires rather a lot of energy, actually, if a lot of bits or neurons are involved, and their very tiny nature makes it easy to assemble extremely large numbers of them in a smallish space.

This was brought home to me yesterday in a most dramatic way via an article in the MIT Technology Review (pcpro.link/299mit), which describes recent research into the energy consumption of state-of-the-art natural language processing (NLP) systems of the sort deployed online and behind gadgets such as Alexa. Training a single really large deep-learning system consumes colossal amounts of energy, generating up to five times the CO2 emitted by a car (including fuel burned) over its whole lifetime. How could that be possible? The answer is that it doesn’t all happen on the same computer, which would be vaporised in a millisecond. It happens in the cloud, distributed across the world in massively parallel virtual machine arrays working on truly humongous databases, over and over again as the system tweaks and optimises itself.
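To see how a training run stacks up against a car, a back-of-envelope sketch helps. All the input figures below are illustrative assumptions of mine, not numbers from the article: an assumed grid carbon intensity, an assumed energy budget for one large training run, and an assumed lifetime footprint for one car.

```python
# Hedged back-of-envelope for "up to five times the CO2 of a car".
# Every constant here is an assumption for illustration only.

GRID_KG_CO2_PER_KWH = 0.4      # assumed average grid carbon intensity
TRAINING_MWH = 650             # assumed energy for one large NLP training run
CAR_LIFETIME_TONNES_CO2 = 57   # assumed car footprint, incl. fuel burned

# Convert MWh -> kWh, multiply by intensity (kg), convert kg -> tonnes
training_tonnes = TRAINING_MWH * 1000 * GRID_KG_CO2_PER_KWH / 1000
cars_equivalent = training_tonnes / CAR_LIFETIME_TONNES_CO2

print(f"Training run: ~{training_tonnes:.0f} tonnes CO2, "
      f"roughly {cars_equivalent:.1f} car lifetimes")
```

With these assumed inputs the run comes out at a few car-lifetimes of CO2, which is the right order of magnitude for the claim in the article; swap in real figures and the shape of the calculation stays the same.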

We’ve already had a small glimpse of this fact through the mining of bitcoins, where the outfits that profit from this weirdly pathological activity have to balance the millions they mine against equally enormous electricity bills, and must increasingly resort to basing their servers in the Arctic or sinking them into lakes to cool them. Computing can consume a lot of energy when you have to do a lot of it, and the deceptive lightness of a graphical display hides this fact from us: even shoot ’em up games now demand multiple supercomputer-grade GPUs.

It was a hero of mine, Carver Mead, who first made me think about the energetics of computing in his seminal 1980 book Introduction to VLSI Systems. Chapter 9, on the physics of computational systems, not only explains, in thermodynamic terms, how logic gates operate as heat engines, but also employs the second law to uncover the constraints on consumption for any conceivable future computing technology. In particular, he demolished the hope of some quantum computing enthusiasts for “reversible computation”, which recovers the energy used – he expected this would use more still.
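The standard statement of the second-law floor on computation is Landauer’s principle: erasing one bit of information must dissipate at least kT·ln(2) joules, where k is Boltzmann’s constant and T the temperature. A minimal sketch of that number at room temperature (the 300 K figure is my assumption; the constant is the SI-defined value):

```python
import math

# Landauer's principle: the thermodynamic minimum energy to erase one bit
# is k * T * ln(2). Real CMOS gates dissipate many orders of magnitude more.

k_B = 1.380649e-23   # Boltzmann constant in J/K (exact SI 2019 value)
T = 300.0            # assumed room temperature in kelvin

landauer_joules_per_bit = k_B * T * math.log(2)
print(f"Minimum energy to erase one bit at {T:.0f} K: "
      f"{landauer_joules_per_bit:.2e} J")
```

That comes out around 3 × 10⁻²¹ J per bit – a reminder that the second law sets a hard floor, and that today’s hardware sits nowhere near it.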

The slice of total energy usage that goes into running the brains of the eight billion of us on the planet is way less than is used for transport or heating, thanks to billions of years of biological evolution that have forged our brains into remarkably compact computing devices. That evolution changed our whole anatomy, from the bulging braincase to the wide female pelvis needed to deliver it, and it also drove us to agriculture – extracting glucose fast enough to run it reliably forced us to invent cooking and to domesticate grass seeds.

Now AI devices are becoming important to our economies, and the Alexa on your table makes that feel feasible, but vast networks of energy-guzzling servers lie behind that little tube. Silicon technology just can’t squeeze such power into the space our fat-and-protein brains occupy. Alongside the introduction of the electric car, we’re about to learn some unpleasant lessons concerning the limits of our energy-generation infrastructure.


Dick Pountain is editorial fellow of PC Pro. His favourite energy drink is currently Meantime London Pale Ale. Visit dickpountain.co.uk