Mint Mumbai

Google expands in-house chip efforts in costly AI battle

The tech giant develops new chips to cut reliance on outside vendors as the AI arms race intensifies

- Miles Kruppa feedback@livemint.com

we need if the power requirements for these large data centers for people to do research on keeps going up and up and up,” Haas said.

He expressed hope that the U.S.-Japan research partnership would work on solutions to the power issue. Arm’s funding is going to a collaboration between Carnegie Mellon University in Pittsburgh and Keio University in Japan.

The partnership is a project of the U.S. ambassador to Japan, Rahm Emanuel, who last year organized a $150 million U.S.-Japan research program in quantum computing backed by IBM and Google. Elsewhere in the AI partnership, Amazon.com and Nvidia are set to spend $25 million each to support AI research at the University of Washington and Japan’s University of Tsukuba.

Emanuel said it was important for the two allies to work together in the face of competing AI and quantum-computing research in China. He said the U.S. and Japan would harness the efforts of industry and academia together, contrasting that with Chinese leader Xi Jinping’s moves in recent years to rein in the country’s tech giants.

“One of the downsides of what Xi has done is squash entrepreneurship,” Emanuel said. “Our model is going to be more successful.”

Google is making more of its own chips, rolling out new hardware that can handle everything from YouTube advertising to big data analysis as the company tries to combat rising artificial-intelligence costs.

The new chip, called Axion, adds to Google’s efforts stretching back more than a decade to develop new computing resources, beginning with specialized chips used for AI work. Google has leaned into that strategy since the late 2022 release of ChatGPT kicked off an arms race that has threatened its dominant position as a gateway to the internet.

The chip efforts promise to reduce Google’s reliance on outside vendors and bring it into competition with longtime partners such as Intel and Nvidia, analysts said. Google officials said they didn’t view it as a competition.

“I see this as a basis for growing the size of the pie,” said Amin Vahdat, the Google vice president overseeing the company’s in-house chip operations.

Google’s larger competitors in the cloud, Amazon.com and Microsoft, have also poured money into making their own chips as the AI boom has intensified demand for computing resources.

Google owed much of its early success to an investment in the chips necessary to fuel the company’s web search algorithm. That often meant piecing together cheap, commercially available hardware in novel ways.

The boom in AI and its need for vastly more computing resources has pushed Google further in the direction of custom solutions. It has credited specialized AI chips it built, known as tensor processing units, or TPUs, with helping save money on services that make heavy use of AI.

Google has worked closely with semiconductor company Broadcom since 2016 to produce bespoke hardware.

Broadcom’s custom-chip division had a surge in business after Google rapidly increased production of TPUs recently, Chief Executive Hock Tan said during a March internal presentation. The increase, he said, was partly in response to Microsoft incorporating AI features into its Bing search engine, going directly after Google’s core business.

“They bought a ton,” Tan said, according to a recording viewed by The Wall Street Journal. “They sure did.”

The Broadcom division’s operating profit of more than $1 billion in a single recent quarter came mostly from Google’s business, Tan said. Broadcom didn’t respond to a request for comment.

Google Chief Financial Officer Ruth Porat told investors in January to expect notably larger spending on technical infrastructure such as AI chips this year. Parent company Alphabet’s fourth-quarter capital expenditures rose by almost half to $11 billion from a year earlier.

Known as central processing units, or CPUs, Axion chips are suitable for a range of tasks including powering Google’s search engine and AI-related work. They can play an important supporting role in AI by helping to process large amounts of data and handling the deployment of the services to billions of users, Google officials said.

Axion is based on circuitry from the British chip-design firm Arm, making Google the third big tech company after Amazon and Microsoft to use that framework for a data center CPU. The shift has supplanted an old status quo where big operators of server farms bought their CPUs almost exclusively from Intel and Advanced Micro Devices.

Google has resisted selling chips directly to customers to install in their own data centers. The move would push the company more directly into competitio­n with Intel and Nvidia, the biggest winner of the AI boom so far with more than 80% of the market for chips used to develop and serve the technology.

“Becoming a great hardware company is very different from becoming a great cloud company or a great organizer of the world’s informatio­n,” Google’s Vahdat said.

Google has chosen instead to rent custom chips to cloud customers. It said the Axion chips will become accessible to external customers later this year, and the latest generation of its TPUs is now widely available.

In November, Google said it successfully connected more than 50,000 TPUs to build AI systems, what it called the largest effort of its kind. Google created Gemini using TPUs and will exclusively use the chips for processing user queries.

The company’s growing cloud business has required balancing the competing demands of internal teams and demand from AI startups such as Anthropic, a situation made more difficult by widespread supply constraints.

Some teams inside Google have been told they won’t get any additional computing resources this year, partly because of growing demand for AI services, people familiar with the matter said.

Vahdat said Google prioritizes the fastest-growing products and services when deciding which areas should get more computing resources.

Google’s in-house chip efforts began with a 2013 breakthrough in voice-recognition technology.

Jeff Dean, a longtime engineering leader, told the systems infrastructure division that Google would need to roughly double the number of chips held in its data centers if the tech became widely used. “That was really the first taste of this impending issue,” he said.

When Google designed the first version of the TPU a few years later, Dean lobbied executives to purchase more than the company originally budgeted. Researchers later used them to create a software system called Transformers that became the basis for generative AI products such as ChatGPT.

Google has had mixed success opening access to outside customers. While Google has signed up prominent startups including chatbot maker Character and image-generation business Midjourney, some developers have found it difficult to build software for the chips.

Google said it has worked with Nvidia and other tech companies on a software project, OpenXLA, aiming to make it easier to develop AI systems across different chip types.

Anthropic, one of the largest users of TPUs, in September began moving some AI needs to custom chips developed by Amazon after the cloud giant agreed to invest up to $4 billion. Google later committed $2 billion in funding to Anthropic and said it had expanded its partnership with the startup.

AssemblyAI, a startup working on speech-to-text products, built the latest version of its technology on TPUs after encountering issues securing GPUs early last year, said CEO Dylan Fox. “From an availability perspective, we’ve been really happy,” he said.

The new Axion processors improve performance by up to 30% compared with the fastest similar Arm-based chips available in the cloud, according to Google’s internal data. It said customers including Snap were planning to test the new hardware.

Google’s investment in Axion would be worth it if the company achieved only half of its claimed performance improvements, said Forrester principal analyst Mike Gualtieri. It still faces intense competition from the other large cloud companies for new business, he said.

“This is going to be like any other set of web services that these hyperscalers offer,” Gualtieri said. “It’s going to be sort of tit-for-tat back and forth.”


AP: Google’s larger competitors in the cloud, Amazon.com and Microsoft, have also poured money into making their own chips.

X/@RENEHAAS237: Rene Haas, chief executive of Arm.
