The Lowdown: AI in chips
AI is increasingly finding its way into chipsets, but what effect will it have?
AI ON CHIPS?
Think of artificial intelligence and you will probably think of robots or clever software backed up by the power of the internet and cloud computing. And you’d be right. But AI tech is also being embedded into the heart of various processors and chipsets, and entering smartphones and headsets. Take Qualcomm’s new Snapdragon 845 mobile chip (see News,
Shopper 363): it not only offers gutsy processor and graphics performance, it’s also tuned to run machine-learning algorithms – effectively the clever code behind AI systems – on the chipset.
It uses a combination of chip parts, such as the Adreno GPU and the Hexagon vector processor, to power Qualcomm’s AI engine. This allows smart features such as the processing of voice commands in natural language to work at quite a lick and image recognition to be carried out on a smartphone, without relying on a consistent internet connection.
Having the tech on a widely available chipset gives developers more scope to put smart AI features into their apps and services.
HOW CAN A SINGLE CHIP POWER AI?
That question needs answering in two parts. First, AI systems are trained using machine-learning algorithms to spot patterns in data, so they can answer questions or solve problems rather than follow a hard-coded set of instructions. This training can be supervised, with human experts labelling the data the system learns from, or unsupervised, with the AI left to draw its own conclusions from unlabelled data sets.
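To make that distinction concrete, here's a toy Python sketch – entirely illustrative, with made-up message lengths and labels – contrasting a hard-coded rule with one 'learned' from labelled examples:

```python
# Hypothetical task: flag long messages as spam (1) or not (0).

def hard_coded(length):
    # A programmer picks the rule by hand.
    return 1 if length > 100 else 0

def learn_threshold(examples):
    """Pick the length cutoff that best separates the labelled data –
    a crude stand-in for supervised machine learning."""
    best_cut, best_score = 0, -1
    for cut in range(0, 201):
        score = sum(1 for length, label in examples
                    if (1 if length > cut else 0) == label)
        if score > best_score:
            best_cut, best_score = cut, score
    return best_cut

# Made-up labelled examples: (message length, spam?)
examples = [(20, 0), (35, 0), (60, 0), (140, 1), (180, 1), (200, 1)]
cut = learn_threshold(examples)   # the rule now comes from the data
```

A real machine-learning system fits thousands or millions of parameters rather than a single cutoff, but the principle is the same: the rule is derived from the data, not written by the programmer.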
More advanced AI systems use a process called deep learning, which is reliant on an artificial neural network.
WAIT, A NEURAL WHAT?
In much the same way a human’s brain has a neural network of neurons connected by synapses, an artificial neural network is made up of layers of nodes.
When presented with data – for example, an image – the neural network breaks it up into many pieces, or indeed pixels, and each node tries to work out what it’s seeing, such as the colour of an image segment or its shape.
As the data flows through the network, each node weighs up how relevant its piece of the data is to the question at hand – working out whether a dog is in a picture, for example – and assigns it a corresponding weight, or value.
Often data will filter through many layers of nodes, hence the term deep learning. The process is repeated, with the nodes' weightings adjusted each time, until the system learns to produce an answer that's more than likely the correct one.
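For the curious, that weight-nudging loop can be sketched with a single artificial node in plain Python – a deliberately tiny stand-in, since real deep-learning networks stack many layers of such nodes:

```python
import math, random

# One artificial "node": weighted inputs squashed through a sigmoid.
# Toy task (our invention): learn to answer "is a + b greater than 1?".

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

random.seed(0)
w1, w2, bias = random.random(), random.random(), 0.0

# A grid of example inputs with the correct answers attached.
data = [((a / 10, b / 10), 1.0 if a + b > 10 else 0.0)
        for a in range(11) for b in range(11)]

lr = 0.5
for _ in range(2000):              # repeat many passes over the data
    for (a, b), target in data:
        out = sigmoid(w1 * a + w2 * b + bias)
        err = out - target         # how wrong the node currently is
        grad = err * out * (1 - out)
        w1 -= lr * grad * a        # nudge each weight to shrink the error
        w2 -= lr * grad * b
        bias -= lr * grad
```

After enough passes the weights settle so that clear-cut inputs land on the right side of the node's decision boundary.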
This process takes a lot of data and requires servers with hefty amounts of processor power to carry out AI training at scale. But once a machine-learning model is smart enough to solve a base level of tasks, the AI features it supports can be put into action. This stage is called inference: the AI still learns as it goes along, but it’s more likely to tailor its capabilities to individual users, such as learning to better understand their voice or what they like to buy or eat.
At this level, AI doesn’t need nearly as much compute power to figure things out, so a powerful mobile chip is enough to support it. That being said, more chip-makers and tech firms are adding specific hardware to better power AI features and allow for smart devices to get smarter by themselves.
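The split matters because inference is cheap: once training has produced a set of weights, a device only needs a few multiplies and adds per query. A hypothetical sketch, with hand-picked weights standing in for the output of a server-side training run on a toy "is a + b greater than 1?" task:

```python
import math

# Weights assumed to come from an earlier, server-side training run;
# the values here are chosen by hand purely for illustration.
TRAINED = {"w1": 8.0, "w2": 8.0, "bias": -8.4}   # shipped with the app

def infer(a, b, p=TRAINED):
    """One cheap forward pass: a handful of multiplies and adds,
    easily within reach of a mobile chip."""
    z = p["w1"] * a + p["w2"] * b + p["bias"]
    return 1.0 / (1.0 + math.exp(-z)) > 0.5
```

The heavy lifting – finding those weights – happened once, elsewhere; the phone just evaluates the finished model.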
SOUNDS INTRIGUING. WHO’S WORKING ON SUCH TECH?
A whole raft of tech firms. The likes of Intel, AMD, ARM and Nvidia are all working on chipsets that can either handle AI algorithms or have dedicated processors for powering smart software.
Microsoft’s next HoloLens augmented-reality headset is set to contain a coprocessor dedicated to running deep learning neural networks, with the goal of delivering better movement tracking and voice recognition on the device.
And Apple’s latest A11 Bionic chip, found in the iPhone X and iPhone 8, features the Neural Engine, which has circuits dedicated to accelerating AI-based software, image processing and speech recognition, as well as the Face ID facial-recognition feature. Basically, there are a whole load of tech firms looking at putting AI into people’s hands and gadgets rather than leaving it stuck in a data centre.
GREAT, BUT HOW DOES ALL THIS AI TECH BENEFIT ME?
It not only makes a lot of existing features smarter, such as the automatic organisation of photos, but it also has a few other tricks up its sleeve. For example, researchers from the Massachusetts Institute of Technology have designed a neural network chip that could cut the power consumed by neural-network processing by up to 95 per cent – a saving that could do wonders for smartphone battery life, something people with long commutes would welcome.
Google is using its AI tech to power Lens, a feature on the latest Pixel phones that can recognise what’s in a picture or camera feed and serve up useful information about it – literally judging a book by its cover, for example, to offer a precis on the author and where the book can be purchased.
In a nutshell, with the right training AI can handle all manner of human-assisting tasks, from things we need to things we don’t normally notice. And no, AI is not yet advanced enough to rise up and enslave us all.