BBC Science Focus

“If you look to the human brain for inspiration, it’s very impressive”

Devices that mimic synapses – the junctions between neurons – could help us to produce more powerful computers. Physicist Dr Mike Schneider describes how he’s building them


What is the idea behind creating an artificial synapse?

When you have a connection between two neurons, whether or not one triggers the next is determined by the synapse. This mechanism is believed to be responsible for things like memory. Lots of neurons are connected and the strength of their connection is varied by synapses. We wanted to see if we could make physical devices that match that, as opposed to the transistors and switches used in traditional computing architecture. If you look to the human brain for inspiration for computing, it’s very impressive: you have 100 billion neurons and 100 trillion synapses, and yet it consumes just 20 watts of power. And it excels at tasks that our modern computers, which are fantastic at multiplying and dividing numbers, don’t do very well.

How did you build an artificial synapse?

The structures we have are based on niobium, a metal, with the synapse itself made from silicon and nanoclusters of manganese. We’re running everything at 4 Kelvin [-269°C], the temperature of liquid helium. When you get niobium cold, it becomes superconducting, so it has zero resistance to electric current.

How closely does this mimic the human brain?

Our system is based on something called a ‘Josephson junction’. These are made by taking a superconductor and making a break in it using an electrical insulator. There are all kinds of interesting properties about them, but people have proposed that they could be used as an artificial neuron element because they produce a voltage surge that looks like the spike at a synapse, except it’s much faster and lower in energy. These artificial synapses could be put into machines modelled after the brain.
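
The sketch below illustrates the spiking behaviour Schneider describes, using the textbook overdamped resistively-shunted-junction (RSJ) model of a single Josephson junction rather than anything from his actual devices; all parameter values are illustrative. When the bias current briefly exceeds the junction’s critical current, the phase slips by 2π and the junction emits a fast, microvolt-scale voltage pulse.

    import numpy as np

    # Minimal sketch (not Schneider's device model): overdamped RSJ equation
    # for one Josephson junction. A brief bias-current pulse above the critical
    # current Ic makes the phase slip by 2*pi, emitting a fast, tiny voltage
    # spike. All numbers below are illustrative, not measured values.

    HBAR_2E = 3.29e-16      # hbar / (2e), in volt-seconds
    Ic = 10e-6              # critical current, 10 microamps (illustrative)
    R = 1.0                 # shunt resistance, 1 ohm (illustrative)

    dt = 1e-13              # 0.1 ps time step
    steps = 20000           # 2 ns of simulated time
    phi = 0.0
    voltage = np.zeros(steps)

    for n in range(steps):
        t = n * dt
        # Bias current: a brief pulse above Ic rides on a sub-critical background
        I_bias = 15e-6 if 0.5e-9 < t < 0.8e-9 else 5e-6
        # RSJ dynamics: (hbar/2eR) * dphi/dt = I_bias - Ic*sin(phi)
        dphi_dt = (R / HBAR_2E) * (I_bias - Ic * np.sin(phi))
        phi += dphi_dt * dt
        voltage[n] = HBAR_2E * dphi_dt   # Josephson relation: V = (hbar/2e) dphi/dt

    print(f"phase slips (flux quanta emitted): {int(phi // (2 * np.pi))}")
    print(f"peak spike voltage: {voltage.max() * 1e6:.1f} microvolts")

Because the pulse is set only just above the critical current, the junction fires once or twice and then retraps, which is the single-spike behaviour likened to a neuron firing.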

How could such ‘neuromorphic’ computers be used?

We are living in very exciting times where computing is concerned, with artificial intelligence and machine learning. Within the latter, you have algorithms written in software starting to solve problems that have traditionally been very difficult, like image recognition or language translation. These have a large ‘state space’ – the number of possible solutions to a problem. For image recognition, that’s roughly the number of all possible pixel configurations, which is far too large to calculate explicitly. Over the past few years, deep ‘neural networks’ have made huge inroads. What if we could make hardware that could run these algorithms natively? The operations in the algorithm map well to neurons and synapses, so if you make a more efficient implementation, you can attack more complex problems.
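
As a rough illustration of the mapping Schneider alludes to, here is a minimal Python sketch (with arbitrary shapes and values, not drawn from his work) of one layer of a deep network: the weight matrix plays the role of the synapses, and the thresholded units play the role of the neurons. Neuromorphic hardware aims to perform this same multiply-accumulate-and-threshold step in physical devices rather than in software.

    import numpy as np

    # Minimal sketch: one dense layer, read as neurons and synapses.
    # Shapes and values are arbitrary placeholders.
    rng = np.random.default_rng(0)

    inputs = rng.random(784)                                # e.g. a flattened 28x28 image
    synapse_weights = rng.normal(size=(784, 128)) * 0.01    # one weight per "synapse"

    # Each output "neuron" sums its weighted inputs and fires through a nonlinearity
    summed = inputs @ synapse_weights
    activations = np.maximum(summed, 0.0)                   # ReLU: fire only above threshold

    print(activations.shape)                                # 128 neuron outputs for the next layer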

Nerve synapses at work in the human brain
Algorithms can make disordered artificial synapses function in a more orderly fashion
