PC GAMER (US)

NERVES OF STEEL

How Intel is taking inspiration from our neurons for next-gen chips


A single board from within the Pohoiki Springs box. Intel hasn’t given details about how the multiple chips, or their boards, are connected together.

Intel loves a good codename. Who remembers Dragontail Peak? Or Lizard Head Pass? Or even 2008’s White Salmon? Great days. All of those refer to motherboards, but Pohoiki Beach is different—it’s a new way of building computers that’s based on the human brain. Neuromorphic computing—literally ‘nerve shaped’—uses insights from neuroscience to create chip architectures.

By simulating the way human brains work in silicon, calculations can be carried out faster while using less energy. The training of neural networks can be more efficient too: in some cases a single viewing of an object is enough for the network to recognize it from then on.

Mike Davies, director of Intel’s Neuromorphic Computing Lab, puts it more precisely. “Neuromorphic computing entails nothing less than a bottom-up rethinking of computer architecture,” he says. “The goal is to create chips that function less like a classical computer and more like a human brain. Neuromorphic chips model how the brain’s neurons communicate and learn, using spikes and plastic synapses that can be modulated based on the timing of events. These chips are designed to self-organize and make decisions in response to learned patterns and associations.”
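The two ideas Davies describes—neurons that communicate in spikes, and synapses whose strength changes based on spike timing—can be sketched in a few lines of Python. This is a toy illustration, not Intel’s Loihi design: the leaky integrate-and-fire model, the threshold, and the learning rate below are all invented for the example.

```python
class SpikingNeuron:
    """Leaky integrate-and-fire neuron: accumulates weighted input
    spikes, leaks charge each step, and fires when a threshold is hit."""
    def __init__(self, threshold=1.0, leak=0.9):
        self.threshold = threshold
        self.leak = leak          # fraction of potential kept each step
        self.potential = 0.0

    def step(self, weighted_input):
        self.potential = self.potential * self.leak + weighted_input
        if self.potential >= self.threshold:
            self.potential = 0.0  # reset after firing
            return True           # spike!
        return False


def stdp_update(weight, dt, rate=0.05):
    """Plastic synapse rule (spike-timing-dependent plasticity):
    strengthen if the input spike preceded the output spike (dt > 0),
    weaken if it followed it (dt < 0)."""
    if dt > 0:
        return weight + rate * weight   # causal pairing: potentiate
    return weight - rate * weight       # anti-causal pairing: depress


# Drive one neuron through one synapse with an input spike every step.
neuron = SpikingNeuron()
weight = 0.4
for t in range(10):
    if neuron.step(weight):
        # Input arrived just before the output fired: strengthen it.
        weight = stdp_update(weight, dt=1)
print(round(weight, 3))   # the synapse has been potentiated above 0.4
```

Each time the neuron fires, the synapse that helped cause the spike gets a little stronger—so the chip’s wiring reorganizes itself around the patterns it sees, which is what “self-organize” means here.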

Which all sounds a bit Cyberdyne, but we’re sure this will be fine. The goal is that one day neuromorphic chips may be able to learn as fast and efficiently as the brain, which still far outperforms today’s most powerful computers. According to Intel, neuromorphic computing could lead to advancements in robotics, smart city infrastructure, and other applications that require continuous learning and adaptation to evolving data.

“The inspiration for neuromorphic computing goes back to the earliest days of computing itself,” says Davies. “If you look at the early papers by John von Neumann or Alan Turing, they actually talk about neurons and synapses, because back in the ’40s they hadn’t invented the terminology of conventional computing. The brain was the one example they had.”

And while classical computing has been solving problems for 80 years, mother nature has been at it for billions, and has got quite good at making brains. “If you look at the human brain,” says Davies, “it operates at 20W. Everything we do—simultaneously processing data streams, coming up with new ideas and insights—all that is being done at just 20W of power.” For context, a Raspberry Pi 4B pulls 7.6W under load, but an i7-7700K can draw 77W while doing a bit of light gaming. Now, the i7’s Kaby Lake cores probably do a bit more work per second than the Cortex-A72s powering the Pi, but it goes to show how power efficient the human brain is. Also, it runs on glucose and doesn’t need a fan and heatsink arrangement bolted to the top.

The 768 processors in Pohoiki Springs—which is the second generation of the technology after Pohoiki Beach—are

