Cutting the power

HP’s Labs seek ways to cut power consumption

Finweek English Edition - Communication & technology - BENEDICT KELLY

IN THE WORLD of technology it’s often difficult to tell which problems are simple and which are exceedingly complex.

The cooling of data centres is an area that appears quite simple – make sure you have enough air-conditioning units to keep the room at the desired temperature. However, delving below the surface reveals myriad problems confronting companies today. At a media briefing at HP Labs in Bristol earlier this month, Rich Friedrich, director for HP’s enterprise systems and software lab, said the number of servers in use globally had doubled between 2000 and 2006 and, consequently, the amount of power consumed had increased dramatically.

Increases in computing power typically result in an increase in the heat these systems generate, and in a data centre environment the cost of cooling is typically 50% of the energy bill, according to Friedrich. With data centres globally drawing an estimated 6GW of electricity, it doesn’t take a rocket scientist to figure out there are potentially big savings to be made.

The boffins at HP’s labs, spread over seven locations globally, have worked for the past eight years on how to cool large data centres more efficiently. They have come up with a number of solutions, the first of which is more accurate measurement of the temperature in the facility.

Conventional systems have measured the temperature of the air returning to the air-conditioning unit, but this only provides a global indication of the temperature of the room, with no indication of where the heat is being generated. Putting hundreds of sensors across the facility provides an accurate picture of which equipment is generating the heat and therefore which equipment needs the most cooling.
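To make the approach concrete, the sketch below shows one way hundreds of point readings could be rolled up into a per-rack hotspot list. This is not HP’s actual software; the sensor IDs, rack names, readings and the 28°C threshold are all invented for illustration.

```python
from collections import defaultdict

# (rack, sensor_id) -> temperature in deg C, as a grid of in-room
# sensors might report it (hypothetical values)
readings = {
    ("rack-A1", "s01"): 24.1, ("rack-A1", "s02"): 25.0,
    ("rack-B3", "s03"): 31.8, ("rack-B3", "s04"): 33.2,
    ("rack-C2", "s05"): 22.4, ("rack-C2", "s06"): 23.0,
}

def hotspots(readings, threshold_c=28.0):
    """Average each rack's sensors and return the racks above the
    threshold, hottest first: the equipment that needs the most cooling."""
    per_rack = defaultdict(list)
    for (rack, _sensor), temp in readings.items():
        per_rack[rack].append(temp)
    averages = {rack: sum(ts) / len(ts) for rack, ts in per_rack.items()}
    return sorted(((r, t) for r, t in averages.items() if t > threshold_c),
                  key=lambda pair: pair[1], reverse=True)

print(hotspots(readings))  # [('rack-B3', 32.5)]
```

Unlike a single return-air reading, this pinpoints rack-B3 as the source of the heat rather than reporting one blended room temperature.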

“The second step is to understand how air flows through the facility and where the zones of greatest cooling are,” he says.

“We were lucky enough to have a custom-built data centre in California, built as part of a project with film studio DreamWorks, where we could test the various scenarios in a real-world environment.”

“This analysis showed us which zones were covered by which air-conditioning units and we were able to move the equipment generating the most heat into those zones, creating a more equal distribution of heat throughout the facility.”
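In code terms, the placement step Friedrich describes amounts to pairing the hottest equipment with the best-cooled zones. A minimal sketch of that idea, with invented heat loads and zone capacities standing in for the measured values:

```python
# Heat load per rack and cooling capacity per zone, in kW
# (hypothetical numbers, not HP's measurements)
rack_load_kw = {"rack-B3": 12.0, "rack-A1": 7.5, "rack-C2": 4.0}
zone_capacity_kw = {"zone-1": 14.0, "zone-2": 9.0, "zone-3": 5.0}

def assign(rack_load_kw, zone_capacity_kw):
    """Greedy pairing: the hottest rack goes into the best-cooled zone,
    spreading heat more evenly across the facility."""
    racks = sorted(rack_load_kw, key=rack_load_kw.get, reverse=True)
    zones = sorted(zone_capacity_kw, key=zone_capacity_kw.get, reverse=True)
    return dict(zip(racks, zones))

print(assign(rack_load_kw, zone_capacity_kw))
# {'rack-B3': 'zone-1', 'rack-A1': 'zone-2', 'rack-C2': 'zone-3'}
```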

Another area that needs consideration is what happens if one of the cooling units fails. Friedrich says that if systems running mission-critical applications are located in that zone they could overheat and shut down, with disastrous consequences. Smart design would therefore put these systems in a zone overlapped by two units, providing backup in case of breakdown.
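That design rule is easy to check mechanically. A hypothetical sketch (unit, zone and rack names are invented) that flags any mission-critical rack whose zone is reached by fewer than two cooling units:

```python
# Which air-conditioning units cover which zone, per the airflow analysis
zone_units = {"zone-1": {"crac-1", "crac-2"},
              "zone-2": {"crac-2"},
              "zone-3": {"crac-3"}}

# Where each mission-critical rack sits (hypothetical layout)
critical_racks = {"rack-B3": "zone-1", "rack-A1": "zone-2"}

def at_risk(critical_racks, zone_units):
    """Return the critical racks left exposed by a single cooling failure."""
    return [rack for rack, zone in critical_racks.items()
            if len(zone_units.get(zone, set())) < 2]

print(at_risk(critical_racks, zone_units))  # ['rack-A1'] -- relocate it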

Following these basic principles enabled the company to realise energy savings of 50% in its data centre, although he estimates that real-world examples would be closer to 30% or 40%. In a world where a large data centre could cost $1,2m (roughly R8,8m) a year just for cooling, this is money that could be well spent elsewhere.
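The arithmetic is straightforward. Taking the article’s own figures, a worked sketch of what those percentages mean for a $1,2m annual cooling bill:

```python
cooling_bill_usd = 1_200_000  # the article's large-data-centre figure

for saving in (0.30, 0.40, 0.50):
    print(f"{saving:.0%} saving -> ${cooling_bill_usd * saving:,.0f} a year")
# 30% saving -> $360,000 a year
# 40% saving -> $480,000 a year
# 50% saving -> $600,000 a year
```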

The ultimate aim is to run the computing facilities at temperatures much higher than at present, as the equipment is designed for this. How people working in those buildings will react to the hotter environment remains to be seen.

* Kelly visited HP Labs as a guest of HP.

Thinking ahead: Rich Friedrich
