San Francisco Chronicle

Google to offer special AI chips to others

By Cade Metz

A few years ago, Google created a computer chip to help power its giant artificial intelligence systems. These chips were designed to handle the complex processes that some believe will be a key to the future of the computer industry.

On Monday, the Mountain View company said it would allow other companies to buy access to those chips through its cloud-computing service. Google hopes to build a business around the chips, called tensor processing units, or TPUs.

“We are trying to reach as many people as we can as quickly as we can,” said Zak Stone, who works alongside the small team of engineers that designs these chips.

Google’s move highlights several sweeping changes in the way modern technology is built and operated. Google is in the vanguard of a movement to design chips specifically for artificial intelligence, a worldwide push that includes dozens of start-ups as well as familiar names like Intel, Qualcomm and Nvidia.

And these days, companies like Google, Amazon and Microsoft are not just big Internet firms. They are big hardware makers.

As a way of cutting costs and improving the efficiency of the multibillion-dollar data centers that underpin its online empire, Google designs much of the hardware inside these massive facilities, from the computer servers to the networking gear that ties these machines together. Its rivals do much the same.

In addition to its TPU chips, which sit inside its data centers, the company has designed an AI chip for its smartphones.

Right now, Google’s new service is focused on a way to teach computers to recognize objects, called computer vision technology. But as time goes on, the new chips will also help businesses build a wider range of services, Stone said.

At the end of last year, hoping to accelerate its work on driverless cars, Lyft began testing Google’s new chips.

Using the chips, the San Francisco ride-hailing company wanted to accelerate the development of systems that allow driverless cars to, say, identify street signs or pedestrians. “Training” these systems can take days, but with the new chips, the hope is that this will be reduced to hours.

“There is huge potential here,” said Anantha Kancherla, who oversees software for the Lyft driverless car project.

TPU chips have helped accelerate the development of everything from the Google Assistant, the service that recognizes voice commands on Android phones, to Google Translate, the app that translates one language into another.

They are also reducing Google’s dependence on chipmakers like Nvidia and Intel. In a similar move, it designed its own servers and networking hardware, reducing its dependence on hardware makers like Dell, HP and Cisco.

This keeps costs down, said Casey Bisson, who helps oversee a cloud computing service called Joyent, which is owned by Samsung. At times, the only way to build an efficient service is to build your own hardware.

“This is about packing as much computing power as possible within a small area, within a heat budget, within a power budget,” Bisson said.

A new wave of artificial intelligence, including services like Google Assistant, is driven by “neural networks,” which are complex algorithms that can learn tasks on their own by analyzing vast amounts of data. By analyzing a database of old customer support phone calls, for example, a neural network can learn to recognize commands spoken into a smartphone. But this requires serious computing power.
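The idea of “learning a task by analyzing data” can be sketched in a few lines of plain Python. The toy network below learns a simple function (XOR) by repeatedly nudging its internal weights to reduce its prediction error; the network size, learning rate and training data are illustrative assumptions for this sketch, not anything from Google’s systems, which train far larger networks on vastly more data.

```python
import math
import random

random.seed(0)

# Toy task: learn XOR from four labeled examples.
X = [[0, 0], [0, 1], [1, 0], [1, 1]]
y = [0, 1, 1, 0]

H = 8  # number of hidden units (illustrative choice)
W1 = [[random.gauss(0, 1) for _ in range(H)] for _ in range(2)]
b1 = [0.0] * H
W2 = [random.gauss(0, 1) for _ in range(H)]
b2 = 0.0

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

def forward(x):
    # Hidden layer, then a single output unit.
    h = [sigmoid(x[0] * W1[0][j] + x[1] * W1[1][j] + b1[j]) for j in range(H)]
    p = sigmoid(sum(h[j] * W2[j] for j in range(H)) + b2)
    return h, p

def mean_squared_error():
    return sum((forward(x)[1] - t) ** 2 for x, t in zip(X, y)) / len(X)

initial_loss = mean_squared_error()
lr = 1.0
for epoch in range(5000):
    for x, t in zip(X, y):
        h, p = forward(x)
        # Gradient of squared error through the output sigmoid.
        dp = (p - t) * p * (1 - p)
        for j in range(H):
            # Compute the hidden-unit gradient before updating W2[j].
            dh = dp * W2[j] * h[j] * (1 - h[j])
            W2[j] -= lr * dp * h[j]
            W1[0][j] -= lr * dh * x[0]
            W1[1][j] -= lr * dh * x[1]
            b1[j] -= lr * dh
        b2 -= lr * dp
final_loss = mean_squared_error()

print(round(initial_loss, 4), round(final_loss, 4))
```

The “serious computing power” the article mentions comes from scale: real networks have millions or billions of weights, and each training pass repeats arithmetic like the loop above across enormous datasets, which is why specialized chips such as GPUs and TPUs matter.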

Typically, engineers train these algorithms using graphics processing units, chips that were originally designed for rendering images for games and other graphics-heavy software. Most of these chips are supplied by Nvidia.

In designing its own AI chips, Google was looking to exceed what was possible with these graphics-oriented chips, speed up its own AI work and lure more businesses onto its cloud services.

At the same time, Google has gained some independence from Nvidia and an ability to negotiate lower prices with its chip suppliers.

“Google has become so big, it makes sense to invest in chips,” said Fred Weber, who spent a decade as the chief technology officer at the chipmaker AMD. “That gives them leverage. They can cut out the middleman.”

This does not mean that Google will stop buying chips from Nvidia and other chipmakers. But it is altering the market. “Who’s buying and who’s selling has changed,” Weber said.

Over the years, Google has even flirted with the possibility of designing its own version of the chips it buys from Intel.

Weber and other insiders question whether Google would ever do this, just because a CPU is so complex and it would be so much more difficult to design and maintain one of these chips. But at a private event in San Francisco last fall, David Patterson, a computer science professor at UC Berkeley who now works on chip technologies at Google, was asked if the company would go that far.

“That’s not rocket science,” he said.

Photo (Google): This data center houses Google’s tensor processing units, designed to handle complex processes.
