The Guardian (USA)

Turns out there’s another problem with AI – its environmen­tal toll

- Chris Stokel-Walker

Technology never exists in a vacuum, and the rise of cryptocurrency in the last two or three years shows that. While plenty of people were making extraordinary amounts of money from investing in bitcoin and its competitors, there was consternation about the impact those get-rich-quick speculators had on the environment.

Mining cryptocurrency was environmentally taxing. The core principle behind it was that you had to expend effort to get rich. To mint a bitcoin or another cryptocurrency, you had to first “mine” it. Your computer would be tasked with completing complicated equations that, if successfully done, could create a new entry on to the blockchain.

People began working on an industrial scale, snapping up the high-powered computer chips, called GPUs (graphics processing units), that could mine for crypto faster than off-the-shelf computer components, at such a pace that Goldman Sachs estimated 169 industries were affected by the 2022 chip shortage. And those computer chips required more electricity to power them; bitcoin mining alone uses more electricity than Norway and Ukraine combined.

The environmental cost of the crypto craze is still being tallied – including by the Guardian this April.

The AI environmental footprint

A booming part of tech – one that uses the exact same GPUs as intensively as crypto mining, if not more so – has got away with comparatively little scrutiny of its environmental impact. We are, of course, talking about the AI revolution.

Generative AI tools are powered by GPUs, which are complex computer chips able to handle the billions of calculations a second required to power the likes of ChatGPT and Google Bard. (Google uses its own similar technology, called tensor processing units, or TPUs.)

There should be more conversation about the environmental impact of AI, says Sasha Luccioni, a researcher in ethical and sustainable AI at Hugging Face, which has become the de facto conscience of the AI industry. (Meta recently released its Llama 2 open-source large language model through Hugging Face.)

“Fundamentally speaking, if you do want to save the planet with AI, you have to consider also the environmental footprint [of AI first],” she says. “It doesn’t make sense to burn a forest and then use AI to track deforestation.”

Counting the carbon cost

Luccioni is one of a number of researchers trying – with difficulty – to quantify AI’s environmental impact. It’s hard for several reasons, among them that the companies behind the most popular tools, as well as the companies selling the chips that power them, aren’t very willing to share details of how much energy their systems use.

There’s also an intangibility to AI that stymies proper accounting of its environmental footprint. “I think AI is not part of these pledges or initiatives, because people think it’s not material, somehow,” she says. “You can think of a computer or something that has a physical form, but AI is so ephemeral. Even for companies trying to make efforts, I don’t typically see AI on the radar.”

That ephemerality also exists for end users. We know that we’re causing harm to the planet when we turn on our cars because we can see or smell the fumes coming out of the exhaust after we turn the key. With AI, you can’t see the cloud-based servers being queried, or the chips rifling through their memory to complete the processing tasks asked of them. For many, the huge volumes of water coursing through pipes inside datacentres, deployed to keep the computers powering the AI tools cool, are invisible.

You just type in your query, wait a few seconds, then get a response. Where’s the harm in that?

Putting numbers to the problem

Let’s start with water use. Training GPT-3 used 3.5m litres of water through datacentre usage, according to one academic study – and that’s assuming it was trained in Microsoft’s more efficient US datacentres. If it was trained in Microsoft’s datacentres in Asia, the water usage balloons to closer to 5m litres.

Prior to the integration of GPT-4 into ChatGPT, researchers estimated that the generative AI chatbot would use 500ml of water – a standard-sized water bottle – every 20 questions and corresponding answers. And ChatGPT was only likely to get thirstier with the release of GPT-4, the researchers forecast.

Estimating energy use, and the resulting carbon footprint, is trickier. One third-party analysis by researchers estimated that training GPT-3, a predecessor of ChatGPT, consumed 1,287 MWh and led to emissions of more than 550 tonnes of carbon dioxide equivalent – similar to flying 550 return journeys between New York and San Francisco.

Reporting suggests GPT-4 has around 570 times more parameters than GPT-3. That doesn’t mean it uses 570 times more energy, of course – efficiencies improve – but it does suggest that these models are getting more energy intensive, not less.

For better or for worse

Tech boffins are trying to find ways to maintain AI’s intelligence without the huge energy use. But it’s difficult. One recent study, published earlier this month, suggests that many of the workarounds already tabled end up trading off performance for environmental good.

It leaves the AI sector in an unenviable position. Users are already antsy about what they see as the worsening performance of generative AI tools like ChatGPT (whether that’s just down to their perception or rooted in reality isn’t yet certain).

Sacrificing performance to reduce ecological impact seems unlikely. But we need to rethink AI’s use – and fast. Technology analysts Gartner believe that by 2025, unless a radical rethink takes place in how we develop AI systems to better account for their environmental impact, the energy consumption of AI tools will be greater than that of the entire human workforce. By 2030, machine learning training and data storage could account for 3.5% of all global electricity consumption. Before the AI revolution, datacentres used about 1% of the world’s electricity in any given year.

So what should we do? Treating AI more like cryptocurrency – with an increased awareness of its harmful environmental impacts, alongside awe at its seemingly magical powers of deduction – would be a start.

The wider TechScape

Rupert Murdoch’s News Corp is using AI to publish 3,000 local news stories a week in Australia.

It’s not just journalists at risk: AI is replacing comedians at the Edinburgh fringe.

Elon Musk’s X (formerly known as Twitter) is threatening to sue a media monitoring organisation that tracks hate speech on the social network.

Meanwhile, the giant glowing X sign on the company’s San Francisco HQ has been removed after neighbours complained. This came after Musk reportedly blocked building inspectors from accessing the office to look at the sign.

The UK’s Competition and Markets Authority has opened up an avenue to backtrack (£) on its block of a massive $75bn merger of Microsoft and Activision Blizzard.

A 1970s programming language called Prolog, which helped the early development of AI, has been put to another use: deciphering how to game the UK’s national lottery for the maximum chance of success.

Should you take your phone to the bathroom? Scroll during a film? Paula Cocozza runs through the 10 rules of smartphone etiquette.

Servers to mine crypto or power AI take an enormous amount of electricity. Photograph: Erik Isakson/Getty Images/Blend Images
Workers dismantle Twitter’s new X sign mere days after it was installed. Photograph: Justin Sullivan/Getty Images
