The Guardian (USA)

The Guardian view on artificial intelligence's revolution: learning, but not as we know it

- Editorial

Bosses don’t often play down their products. Sam Altman, the CEO of artificial intelligence company OpenAI, did just that when people went gaga over his company’s latest software: the Generative Pre-trained Transformer 3 (GPT-3). For some, GPT-3 represented a moment in which one scientific era ends and another is born. Mr Altman rightly lowered expectations. “The GPT-3 hype is way too much,” he tweeted last month. “It’s impressive … but it still has serious weaknesses and sometimes makes very silly mistakes.”

OpenAI’s software is spookily good at playing human, which explains the hoopla. Whether penning poetry, dabbling in philosophy or knocking out comedy scripts, the general verdict is that GPT-3 is probably the best non-human writer ever. Given a sentence and asked to write another like it, the software performs the task flawlessly. But this is a souped-up version of the auto-complete function that most email users are familiar with.
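To see why “souped-up auto-complete” is a fair description, consider a toy next-word predictor. The sketch below, a minimal and entirely invented illustration, is nothing like GPT-3’s transformer: it merely counts which word follows which in a tiny corpus. But it shows the same underlying move, namely picking the likeliest continuation of whatever has been written so far.

```python
# A toy "auto-complete" in the spirit of the editorial's analogy. GPT-3 uses
# a 175bn-parameter transformer, not word counts, but both pick a likely next
# word given the words so far. The corpus and names here are illustrative.
from collections import Counter, defaultdict

corpus = (
    "the cat sat on the mat . the dog sat on the rug . "
    "the cat chased the dog ."
).split()

# For every word, count which words tend to follow it.
next_word_counts = defaultdict(Counter)
for current, following in zip(corpus, corpus[1:]):
    next_word_counts[current][following] += 1

def complete(prompt: str, length: int = 6) -> str:
    """Greedily extend a prompt with the most frequent next word."""
    words = prompt.split()
    for _ in range(length):
        candidates = next_word_counts.get(words[-1])
        if not candidates:
            break  # never seen this word before; stop completing
        words.append(candidates.most_common(1)[0][0])
    return " ".join(words)

print(complete("the dog"))  # a plausible, if repetitive, continuation
```

GPT-3 makes the same kind of bet at incomprehensible scale, with 175bn learned settings standing in for these raw counts.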

GPT-3 stands out because it has been trained on more information – about 45TB worth – than anything else. Because the software can remember each and every combination of words it has read, it can work out – through lightning-fast trial and error across its 175bn settings – where thoughts are likely to go. Remarkably, it can transfer its skills: trained as a language translator, GPT-3 worked out that it could convert English to JavaScript as easily as it does English to French. It’s learning, but not as we know it.
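The skill transfer described above is usually drawn out by prompting: show the model a few worked examples and let it continue the pattern. A minimal sketch follows, assuming the openai Python client and the Completion endpoint as OpenAI exposed them around GPT-3’s launch; the engine name, placeholder key and example prompts are assumptions for illustration, not verified settings.

```python
# Sketch: the same few-example prompt scaffold aimed at two different "target
# languages". Assumes the circa-2020 openai client; the engine name, key and
# examples are illustrative assumptions.
import openai

openai.api_key = "YOUR_API_KEY"  # placeholder, not a real credential

def complete(prompt: str) -> str:
    response = openai.Completion.create(
        engine="davinci",    # the base GPT-3 engine at launch
        prompt=prompt,
        max_tokens=32,
        temperature=0.0,     # always take the likeliest next word
        stop=["\n"],         # stop at the end of the answer line
    )
    return response.choices[0].text.strip()

to_french = (
    "English: Good morning\nFrench: Bonjour\n"
    "English: Thank you\nFrench: Merci\n"
    "English: See you tomorrow\nFrench:"
)
to_javascript = (
    "English: add two numbers\nJavaScript: (a, b) => a + b\n"
    "English: square a number\nJavaScript: (x) => x * x\n"
    "English: negate a boolean\nJavaScript:"
)

print(complete(to_french))      # e.g. "À demain"
print(complete(to_javascript))  # e.g. "(b) => !b"
```

Nothing in the model is rewired between the two calls; only the examples in the prompt change.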

But this is not intelligence or creativity. GPT-3 doesn’t know what it is doing; it is unable to say how or why it has decided to complete sentences; it has no grasp of human experience; and it cannot tell if it is making sense or nonsense. What GPT-3 represents is a triumph of one scientific paradigm over another. Once, machines were taught to think like humans. They struggled to beat chess grandmasters. Then they began to be trained with data to, as one observer pointed out, “discover like we can” rather than “contain what we have discovered”. Grandmasters started getting beaten. These days they cannot win.

The reason is Moore’s law, the exponentially falling cost of number-crunching. AI’s “bitter lesson” is that the more data that can be consumed, and the more models can be scaled up, the more a machine can emulate or surpass humans in quantitative terms. If scale truly is the solution to human-like intelligence, then GPT-3’s 175bn settings are still about 1,000 times fewer than the brain’s 100 trillion-plus synapses. Human beings can learn a new task by being shown how to do it only a few times. That ability to learn complex tasks from only a few examples, or no examples at all, has so far eluded machines. GPT-3 is no exception.
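The “about 1,000 times” comparison can be checked on the back of an envelope, granting the loose and much-contested assumption that one trainable setting is comparable to one synapse:

```python
# Back-of-envelope version of the editorial's scale comparison. Treating a
# parameter as roughly equivalent to a synapse is a loose, contested analogy.
gpt3_parameters = 175e9   # 175bn settings
brain_synapses = 100e12   # "100 trillion-plus"

ratio = brain_synapses / gpt3_parameters
print(ratio)  # ~571; with the "plus", roughly the editorial's factor of 1,000
```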

All this raises big questions that seldom get answered. Training GPT-3’s neural nets is costly. A $1bn investment by Microsoft last year was doubtless needed to run and cool GPT-3’s massive “server farms”. The bill for the carbon footprint – training one large neural net can emit as much as the lifetime emissions of five cars – is due.

Fundamental, too, is the regulation of a for-profit OpenAI. The company initially delayed the launch of its earlier GPT-2, with a mere 1.5bn parameters, because it fretted over the implications. It had every reason to be concerned; such AI will emulate the racist and sexist biases of the data it swallows. In an era of deepfakes and fake news, GPT-style devices could become weapons of mass destruction: engaging and swamping political opponents with divisive disinformation. Worried? If you aren’t then remember that Dominic Cummings wore an OpenAI T-shirt on his first day in Downing Street.

Photograph: Tommy London/Alamy Stock Photo. ‘Worried? If you aren’t then remember that Dominic Cummings wore an OpenAI T-shirt on his first day in Downing Street.’
