Android Advisor

Google preparing Android for an AI future

TensorFlow is going on a diet to optimize for smartphones and other lightweight devices. BLAIR HANLEY FRANK reports


The future of Android will be a lot smarter, thanks to new programming tools that Google unveiled recently. The company announced TensorFlow Lite, a version of its machine learning framework that’s designed to run on smartphones and other mobile devices, during the keynote address at its Google I/O developer conference.

“TensorFlow Lite will leverage a new neural network API to tap into silicon-specific accelerators, and over time we expect to see [digital signal processing chips] specifically designed for neural network inference and training,” said Dave Burke, Google’s vice president of engineering for Android. “We think these new capabilities will help power a next generation of on-device speech processing, visual search, augmented reality, and more.”

The Lite framework will be made a part of the open source TensorFlow project soon, and the neural network API will come to the next major release of Android later this year.

The framework has serious implications for what Google sees as the future of mobile hardware. AI-focused chips could make it possible for smartphones to handle more advanced machine learning computations without consuming as much power. With more applications using machine learning to provide intelligent experiences, making that sort of work feasible on the device itself is key.

Right now, building advanced machine learning into applications – especially when it comes to training models – demands computational power that typically requires beefy hardware, a lot of time and a lot of energy. That’s not really practical for consumer smartphone applications, which means they often offload that processing to massive data centres by sending images, text and other data in need of processing over the internet.

Processing that data in the cloud comes with several downsides, according to Patrick Moorhead, principal analyst at Moor Insights and Strategy: users must be willing to transfer their data to a company’s servers, and they have to be in an environment with rich enough connectivity to make sure the operation is low-latency.

There’s already one mobile processor with a machine learning-specific DSP on the market today. The Qualcomm Snapdragon 835 system-on-a-chip sports the Hexagon DSP that supports TensorFlow. DSPs are also used for providing functionality like recognizing the “OK, Google” wake phrase for the Google Assistant, according to Moorhead.

Users should expect to see more machine learning acceleration chips in the future, Moorhead said. “Ever since Moore’s Law slowed down, it’s been a heterogeneous computing model,” he said. “We’re using different kinds of processors to do different types of things, whether it’s a DSP, whether it’s a [field-programmable gate array], or whether it’s a CPU. It’s almost like we’re using the right golf club for the right hole.”

Google is already investing in ML-specific hardware with its line of Tensor Processing Unit chips, which are designed to accelerate both the training of new machine learning models and data processing using existing ones. The company recently announced the second generation of that hardware.

The company is also not the only one with a smartphone-focused machine learning framework. Facebook showed off a mobile-oriented ML framework called Caffe2Go last year; it is used to power applications like the company’s live style transfer feature.

