DICK POUNTAIN
Even Silicon Valley can’t break the second law of thermodynamics, and that’s bad news for the world’s electricity supply.
If you enjoy programming as much as I do, you’re likely to have encountered the scenario I’m about to describe. You’re tackling a really knotty problem that involves novel data types and frequent forays into the manuals. You eventually crack it, then glance at the clock and realise that hours have passed in total concentration, with you oblivious to all other influences. You suddenly feel thirsty, rise to make a cup of tea and feel weak at the knees, as if you had just run a mile. That’s because using your brain so intensively is, energetically, every bit as demanding as running: it uses up your blood glucose at a ferocious rate. Our brains burn glucose faster than any of our other organs, up to 20% of our total consumption rate.
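That 20% figure is easy to sanity-check with some rough arithmetic. The numbers below are my own ballpark assumptions, not from any study: take a typical adult intake of around 2,000 kcal a day, convert to watts, and the brain’s share comes out at roughly the power of a dim light bulb.

```python
# Back-of-envelope check of the brain's share of bodily energy use.
# All figures are rough, typical assumptions, not measured values.
KCAL_PER_DAY = 2000            # assumed adult daily energy intake
BRAIN_SHARE = 0.20             # brain's share of resting consumption
JOULES_PER_KCAL = 4184
SECONDS_PER_DAY = 24 * 60 * 60

total_watts = KCAL_PER_DAY * JOULES_PER_KCAL / SECONDS_PER_DAY
brain_watts = total_watts * BRAIN_SHARE

print(f"whole body: ~{total_watts:.0f} W, brain: ~{brain_watts:.0f} W")
# roughly 97 W for the whole body, about 19 W for the brain
```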
How come? The brain contains no muscles and does no heavy lifting or moving – all it does is shift around images, ideas and intentions, which surely must be weightless? Not so. A constant refrain of mine in this column has been that the brain isn’t a digital computer in the way the more naive AI enthusiasts believe, but that’s not to say it isn’t a computing device of a very different (and not fully understood) kind – it most certainly is, and computing consumes energy. A bit would appear to weigh nothing, yet switching silicon gates or salt-water neurons takes energy and is less than 100% efficient. It takes rather a lot of energy, in fact, when a lot of bits or neurons are involved – and their very tiny nature makes it easy to assemble extremely large numbers of them in a smallish space.
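Physics even puts a hard floor under the cost of each switch: Landauer’s principle, the usual statement of the second-law constraint on computing, says erasing one bit must dissipate at least kT·ln 2 of heat. A quick sketch with standard constants shows how tiny that floor is – and how far above it real silicon sits. The per-gate figure for modern CMOS below is my own order-of-magnitude guess, not a measured value.

```python
import math

K_BOLTZMANN = 1.380649e-23   # Boltzmann constant, J/K
T_ROOM = 300                 # kelvin, roughly room temperature

# Landauer limit: minimum heat to erase one bit at temperature T
landauer_j = K_BOLTZMANN * T_ROOM * math.log(2)

# Assumed order-of-magnitude switching energy for real CMOS logic:
cmos_j = 1e-15               # ~1 femtojoule per gate switch, a rough guess

print(f"Landauer limit at 300 K: {landauer_j:.2e} J per bit")
print(f"Assumed CMOS gate energy: {cmos_j:.0e} J, "
      f"~{cmos_j / landauer_j:.0e} times above the limit")
```

The point of the sketch is the gap: practical hardware burns something like five orders of magnitude more energy per operation than thermodynamics strictly demands, which is why doing a very great deal of computing gets expensive.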
This was brought home to me yesterday in a most dramatic way by an article in the MIT Technology Review (pcpro.link/299mit), which describes recent research into the energy consumption of state-of-the-art natural language processing (NLP) systems of the sort deployed online and behind gadgets such as Alexa. Training a single really large deep-learning system consumes colossal amounts of energy, generating up to five times the CO2 emitted by a car (including fuel burned) over its whole lifetime. How could that be possible? The answer is that it doesn’t all happen on the same computer, which would be vaporised in a millisecond. It happens in the cloud, distributed across the world in massively parallel virtual machine arrays working on truly humongous databases, over and over again as the system tweaks and optimises itself.
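For the curious, the arithmetic behind that “five cars” comparison is simple division. The figures below are the widely reported ones from the underlying study as I recall them – treat them as assumptions rather than quotations from the article:

```python
# Widely reported figures behind the "five times a car" comparison
# (my recollection of the underlying study -- assumptions, not quotes):
TRAINING_LBS_CO2 = 626_000       # large NLP model incl. architecture search
CAR_LIFETIME_LBS_CO2 = 126_000   # average car, incl. manufacture and fuel

ratio = TRAINING_LBS_CO2 / CAR_LIFETIME_LBS_CO2
print(f"training / car lifetime: ~{ratio:.1f}x")
```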
We’ve already had a small glimpse of this fact through the mining of bitcoins, where the outfits that profit from this weirdly pathological activity have to balance the millions they mine against equally enormous electricity bills, and must increasingly resort to basing their servers in the Arctic or sinking them into lakes to cool them. Computing can consume a lot of energy when you have to do a lot of it, and the deceptive lightness of a graphical display hides this fact from us: even shoot ’em up games now demand multiple supercomputer-grade GPUs.
It was a hero of mine, Carver Mead, who first made me think about the energetics of computing in his seminal 1980 book Introduction to VLSI Systems. Chapter 9, on the physics of computational systems, not only explains in thermodynamic terms how logic gates operate as heat engines, but also employs the second law to uncover the constraints on consumption for any conceivable future computing technology. In particular, he demolished the hope of some quantum computing enthusiasts for “reversible computation” that recovers the energy used – he expected it would consume more still.
The slice of total energy usage that goes into running the brains of the eight billion of us on the planet is way less than is used for transport or heating, thanks to almost four billion years of biological evolution that have forged our brains into remarkably compact computing devices. That evolution changed our whole anatomy, from the bulging braincase to the wide female pelvis needed to deliver it, and it also drove us to agriculture – extracting glucose fast enough to run our brains reliably forced us to invent cooking and to domesticate grass seeds.
Now AI devices are becoming important to our economies, and the Alexa on your table makes that feel feasible, but vast networks of energy-guzzling servers lie behind that little tube. Silicon technology just can’t squeeze such power into the space our fat-and-protein brains occupy. Alongside the introduction of the electric car, we’re about to learn some unpleasant lessons about the limits of our energy-generation infrastructure.