Business World

Technology improvements so dramatic it’s depressing... for energy statistics

By Nathaniel Bullard, Bloomberg

This column does not necessarily reflect the opinion of the editorial board or Bloomberg LP and its owners.

FIFTEEN YEARS ago, Japan’s Earth Simulator was the most powerful supercomputer on Earth. It had more than 5,000 processors. It consumed 6,400 kilowatts of electricity. It cost nearly $400 million to build.

Two weeks ago, a computer engineer built a “deep learning box,” using off-the-shelf processors and components, that handily exceeds the Earth Simulator’s capabilities. It uses a maximum of one kilowatt of power. It cost $3,122 to build.

For the first time in writing this column, I’m stumped for a chart. It is difficult, perhaps impossible, to show a 99.98% reduction in energy use and a 99.999% reduction in cost in any meaningful way. It is enough to say that information technology has decreased in cost and increased in computational and energy efficiency to striking degrees.
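Those percentages follow directly from the figures quoted above; a rough back-of-the-envelope check (treating the Earth Simulator’s cost as exactly $400 million, which the column gives only as “nearly”) looks like this:

```latex
% Rough check using the figures quoted in the column;
% the $400 million cost is stated only as an approximation.
\[
\text{energy reduction} = 1 - \frac{1\ \text{kW}}{6{,}400\ \text{kW}} \approx 99.98\%
\]
\[
\text{cost reduction} = 1 - \frac{\$3{,}122}{\$400{,}000{,}000} \approx 99.999\%
\]
```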

I would argue that this dramatic improvement has a flattening, or even depressing, economic influence on energy. Dramatically reduced inputs with dramatically increasing outputs are a boon for consumers and businesses, unless those businesses sell the energy that drives those inputs. We’ve already seen this: In 2007, US data centers consumed 67 terawatt-hours of electricity. Today, with millions of times more computing power, they consume … 72 terawatt-hours, with less than 1% growth forecast by 2020. Not the greatest news if you’re a power utility that has imagined that more and more information technology will mean more energy demand.

Information technology’s improvement over time has been largely a function of Moore’s Law (which is less a law than an observation). Now, with Moore’s Law potentially coming to its end, it would seem that the extraordinary improvements that got us from a room-sized, $400-million supercomputer to a $3,000 desktop box in 15 years could be coming to an end, too.

If technology companies are no longer able to jam more transistors into a chip, does that mean that improvements in energy consumption will also come to an end? If chip improvements plateau and deployment increases, can information technology find a way to provide a boost to energy demand?

I doubt it, for both hardware and software reasons.

Even as Moore’s Law is tapping out for general-purpose chips, hardware is becoming increasingly optimized for specific tasks. That optimization, for such things as graphics processing or neural network computations for machine learning, leads to greater energy efficiency, too. Google now has its own application-specific integrated circuit called the Tensor Processing Unit (TPU) for machine learning. The TPU “delivered 15-30x higher performance and 30-80x higher performance-per-watt” than central processing units and graphics processing units.

Then there is the software that runs on that custom hardware, which has direct applications for electricity in particular. Last year, Google unleashed its DeepMind machine learning on its own data centers and “managed to reduce the energy used for cooling those data centers by 40%.”

So, new special-purpose chips are much more energy-efficient than older general-purpose chips … and those efficient chips are now used to run algorithms that make the data centers housing them much more energy-efficient, too.

In a famous 1987 remark, the economist Robert Solow said “you can see the computer age everywhere but in the productivity statistics.” Today, we could say the same about the computer age and energy statistics.
