Linux Format

Power consumption of AI servers nears that of a small country

Fresh report analysing total power usage of AI servers suggests consumption is through the roof!


French firm Schneider Electric reports that the power consumption of AI servers will total around 4.3GW in 2023, slightly less than the entire nation of Cyprus consumed in 2021 (4.7GW). The company anticipates that the power consumption of AI workloads will grow at a compound annual growth rate (CAGR) of 26% to 36%, meaning that by 2028 AI workloads will consume between 13.5GW and 20GW, more than Iceland consumed in 2021.
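As a quick sanity check, those 2028 figures follow from compounding the 2023 baseline at the reported growth rates. Here's a minimal Python sketch of that arithmetic; the 4.3GW baseline and the 26-36% CAGR come from the report, while the five-year horizon from 2023 to 2028 is our reading of it:

# Project the 2023 AI power demand forward at the reported CAGR bounds.
baseline_gw = 4.3          # estimated AI workload demand in 2023 (GW)
years = 2028 - 2023        # projection horizon (assumed)

for cagr in (0.26, 0.36):
    projected = baseline_gw * (1 + cagr) ** years
    print(f"CAGR {cagr:.0%}: ~{projected:.1f}GW by 2028")

# Prints roughly 13.7GW and 20.0GW, matching the 13.5-20GW range quoted.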

In 2023, the total power consumption of all data centres is estimated at 54GW, with AI workloads accounting for 4.3GW of this demand, according to Schneider Electric. Within those AI workloads, roughly 20% of the power is consumed by training and 80% by inference. This means AI workloads will be responsible for approximately 8% of the total power consumption of all data centres this year.
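For context, that 8% share drops straight out of the report's numbers. The short sketch below is our arithmetic only; the 20/80 training/inference split applied to the 4.3GW figure is an illustration, not a separate figure from the report:

total_dc_gw = 54.0    # estimated total data centre demand in 2023 (GW)
ai_gw = 4.3           # of which AI workloads (GW)

print(f"AI share of data centre power: {ai_gw / total_dc_gw:.0%}")  # ~8%
print(f"Training (20%): {0.20 * ai_gw:.2f}GW, inference (80%): {0.80 * ai_gw:.2f}GW")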

Schneider Electric recommends transitioning from the conventional 120/208V distribution to 240/415V to better accommodate the high power densities of AI workloads. For cooling, a shift from air cooling to liquid cooling is advised to enhance processor reliability and energy efficiency, although immersion cooling might produce even better results. The racks used should also be more capacious, with specifications such as being at least 750mm wide and having a static weight capacity greater than 1,800kg. You can read the full report here: https://bit.ly/lxf309ai.

The likes of the Nvidia H100 GPU consume up to 700W alone, never mind all the other components around it.
