Cutting the power
HP Labs seeks ways to cut power consumption
IN THE WORLD of technology it’s often difficult to tell which problems are simple and which are exceedingly complex.
The cooling of data centres is an area that appears quite simple – make sure you have enough air-conditioning units to keep the room at the desired temperature. However, delving below the surface reveals myriad problems confronting companies today. At a media briefing at HP Labs in Bristol earlier this month, Rich Friedrich, director for HP’s enterprise systems and software lab, said the number of servers in use globally had doubled between 2000 and 2006 and, consequently, the amount of power consumed had increased dramatically.
Increases in computing power typically bring an increase in the heat these systems generate, and in a data centre environment cooling typically accounts for 50% of the energy bill, according to Friedrich. With the world’s data centres drawing an estimated 6GW of power, it doesn’t take a rocket scientist to figure out there are potentially big savings to be made.
The boffins at HP’s labs, spread over seven locations globally, have worked for the past eight years on how to cool large data centres more efficiently. They have come up with a number of solutions, the first of which is more accurate measurement of the temperature in the facility.
Conventional systems have measured the temperature of the air returning to the air-conditioning unit, but this only provides a global indication of the temperature of the room, with no indication of where the heat is being generated. Putting hundreds of sensors across the facility provides an accurate picture of which equipment is generating the heat and therefore which equipment needs the most cooling.
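The difference between the two approaches can be sketched in a few lines of Python. This is an illustration only, not HP’s system; the rack names and temperatures are invented:

```python
# Hypothetical per-rack readings (degrees C) from sensors spread across the room.
readings = {
    "rack-A1": 24.5, "rack-A2": 25.1,
    "rack-B1": 31.8, "rack-B2": 30.9,   # the hot aisle
    "rack-C1": 23.7, "rack-C2": 24.0,
}

# A single return-air probe effectively sees only the blended average...
blended = sum(readings.values()) / len(readings)

# ...while per-rack sensors pinpoint where the heat is actually generated.
hot_racks = [rack for rack, temp in readings.items() if temp > blended + 3.0]

print(f"blended average: {blended:.1f} C")
print(f"racks needing extra cooling: {hot_racks}")
```

The blended average here looks unremarkable at under 27°C, while the per-rack view flags the two racks running above 30°C, which is precisely the information the return-air reading hides.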
“The second step is to understand how air flows through the facility and where the zones of greatest cooling are,” he says.
“We were lucky enough to have a custom-built data centre in California, built as part of a project with film studio DreamWorks, where we could test the various scenarios in a real-world environment.”
“This analysis showed us which zones were covered by which air-conditioning units and we were able to move the equipment generating the most heat into those zones, creating a more equal distribution of heat throughout the facility.”
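The placement idea described above can be sketched as a simple greedy balancing pass: place the biggest heat producers first, each into whichever cooling zone currently carries the least load. The server names, heat loads (in kW) and zone names below are invented for the sketch:

```python
# Hypothetical heat output per server, in kW.
servers = {"render-1": 9.0, "render-2": 8.5, "db-1": 6.0,
           "web-1": 3.0, "web-2": 2.5, "backup-1": 2.0}

zones = {"zone-A": [], "zone-B": [], "zone-C": []}
zone_load = {z: 0.0 for z in zones}

# Greedy: hottest equipment first, each into the currently coolest zone,
# so heat ends up spread roughly evenly across the cooling units.
for name, kw in sorted(servers.items(), key=lambda item: -item[1]):
    coolest = min(zone_load, key=zone_load.get)
    zones[coolest].append(name)
    zone_load[coolest] += kw

print(zone_load)
```

Running this leaves the three zones within a couple of kilowatts of each other, rather than concentrating all the render servers under one air-conditioning unit.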
Another area that needs consideration is what happens if one of the cooling units fails. Friedrich says that if systems running mission-critical applications are located in that zone they could overheat and shut down, with disastrous consequences. Smart design would therefore put these systems in a zone overlapped by two units providing backup in case of breakdown.
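That redundancy rule is easy to express as a placement check. Again a hypothetical sketch, with invented zone-coverage data rather than anything from HP:

```python
# Which cooling units' airflow reaches each zone (invented example data).
coverage = {
    "zone-A": {"crac-1"},
    "zone-B": {"crac-1", "crac-2"},   # overlapping coverage: safe for critical load
    "zone-C": {"crac-2"},
}

critical = {"billing-db"}
placements = {"billing-db": "zone-B", "test-server": "zone-A"}

def placement_ok(system: str, zone: str) -> bool:
    """A mission-critical system needs coverage from at least two
    cooling units, so one can fail without the zone overheating."""
    required = 2 if system in critical else 1
    return len(coverage[zone]) >= required

for system, zone in placements.items():
    print(system, zone, "OK" if placement_ok(system, zone) else "AT RISK")
```

Here the critical database passes only because zone-B sits in the overlap of two units; placed in zone-A it would be flagged, while the test server is fine anywhere.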
Following these basic principles enabled the company to realise energy savings of 50% in its data centre, although he estimates that real-world examples would be closer to 30% or 40%. In a world where a large data centre could cost $1,2m (roughly R8,8m) a year just for cooling, this is money that could be well spent elsewhere.
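The back-of-envelope arithmetic behind that claim, using the figures quoted in the article ($1,2m annual cooling bill, and savings of 30% to 50%):

```python
annual_cooling_cost = 1_200_000  # USD, the large-data-centre figure cited above

# Savings at Friedrich's real-world estimate (30-40%) and HP's own result (50%).
for saving in (0.30, 0.40, 0.50):
    print(f"{saving:.0%} saving -> ${annual_cooling_cost * saving:,.0f} per year")
```

Even at the conservative end, that is several hundred thousand dollars a year back in the budget.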
The ultimate aim is to run computing facilities at temperatures much higher than at present, since the equipment is designed to tolerate this. How people working in those buildings will react to the hotter environment remains to be seen.
* Kelly visited HP Labs as a guest of HP.
[Photo caption: Thinking ahead. Rich Friedrich]