San Francisco Chronicle

Willing to chip in

As AI takes off, venture capitalists are tempted by nimble startups that now see opportunity alongside giants like Intel, Nvidia

By Cade Metz

For years, tech industry financiers showed little interest in startups that made computer chips.

How on earth could a startup compete with a Goliath like Intel, which made the chips that ran more than 80 percent of the world’s personal computers? Even in the areas where Intel didn’t dominate, like smartphones and gaming devices, there were companies like Qualcomm and Nvidia that could squash an upstart.

But then came the tech industry’s latest big thing: artificial intelligence. AI, it turned out, works better with new kinds of computer chips. Suddenly, venture capitalists forgot all those forbidding roadblocks to success for a young chip company.

Today, at least 45 startups are working on chips that can power tasks like speech recognition and self-driving cars, and at least five of them have raised more than $100 million from investors. Venture capitalists invested more than $1.5 billion in chip startups last year, nearly doubling the investments made two years ago, according to research firm CB Insights.

The explosion is akin to the sudden proliferation of PC and hard-drive makers in the 1980s. While these are small companies, and not all will survive, they have the power to fuel a period of rapid technological change.

It is doubtful that any of the companies fantasize about challenging Intel head-on with their own chip factories, which can take billions of dollars to build. (The startups contract with other companies to make their chips.) But in designing chips that can provide the particular kind of computing power needed by machines learning how to do more and more things, these startups are racing toward one of two goals: find a profitable niche or get acquired. Fast.

“Machine learning and AI has reopened questions around how to build computers,” said Bill Coughran, who helped oversee the global infrastructure at Google for several years and is now a partner at Sequoia, a Menlo Park venture capital firm. Sequoia has invested in Graphcore, a British startup that recently joined the $100 million club.

By the summer of 2016, the change was apparent. Google, Microsoft and other Internet giants were building apps that could instantly identify faces in photos and recognize commands spoken into smartphones by using algorithms, known as neural networks, that can learn tasks by identifying patterns in large amounts of data.

Nvidia was best known for making graphics processing units, which were designed to help render complex images for games and other software — and it turned out they worked really well for neural networks, too. Nvidia sold $143 million in chips for the massive computer data centers run by companies like Google in the year leading up to that summer — double the year before.

Intel scrambled to catch up. It acquired Nervana, a 50-employee startup that had started building an AI chip from scratch, for $400 million, according to a report from Recode.

After that, Los Altos startup Cerebras Systems grabbed five Nervana engineers as it, too, designed a chip just for AI.

By early 2018, according to a report by Forbes, Cerebras had raised more than $100 million in funding. So had four other firms: Campbell’s Wave Computing; Graphcore; and two Beijing companies, Horizon Robotics and Cambricon, which is backed by the Chinese government.

Raising money in 2015 and early 2016 was a nightmare, said Mike Henry, chief executive at AI chip startup Mythic. But “with the big, acquisition-hungry tech companies all barreling toward semiconductors,” that has changed, he said.

China has shown a particular interest in developing new AI chips. A third Beijing chip startup, DeePhi, has raised $40 million, and the country’s Ministry of Science and Technology has explicitly called for the production of Chinese chips that challenge Nvidia’s.

Because it’s a new market — and because there is such hunger for this new kind of processing power — many believe this is one of those rare opportunities when startups have a chance against entrenched giants.

The first big change will most likely come in the data center, where companies like Graphcore and Cerebras, which has been quiet about its plans, hope to accelerate the creation of new forms of AI. Among the goals are bots that can carry on conversations and systems that can automatically generate video and virtual reality.

Researchers at places like Microsoft and Google, which has built its own chip just for AI, “train” neural networks by extreme trial and error, testing the algorithms across vast numbers of chips for hours and even days on end. They often sit at their laptops, staring at graphs that show the progress of these algorithms as they learn from data. Chip designers want to streamline this process, packing all that trial and error into a few minutes.
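To give a rough sense of what that trial and error looks like, here is a minimal sketch, not drawn from the article, of a tiny neural network learning a toy problem and printing its error as it goes; the network size, learning rate and step count are illustrative choices, and real training runs do the same thing across millions of examples and many chips.

```python
# Minimal sketch of the trial-and-error loop researchers watch: a tiny
# neural network learns the XOR function by gradient descent and prints
# its error as it improves -- the "progress graph" idea, in miniature.
import numpy as np

rng = np.random.default_rng(0)
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([[0], [1], [1], [0]], dtype=float)

# One hidden layer of 8 units; weights start out random.
W1, b1 = rng.normal(size=(2, 8)), np.zeros(8)
W2, b2 = rng.normal(size=(8, 1)), np.zeros(1)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

for step in range(5000):
    # Forward pass: make a guess with the current weights.
    h = sigmoid(X @ W1 + b1)
    out = sigmoid(h @ W2 + b2)
    loss = np.mean((out - y) ** 2)

    # Backward pass: nudge every weight to shrink the error slightly.
    d_out = 2 * (out - y) * out * (1 - out) / len(X)
    d_h = (d_out @ W2.T) * h * (1 - h)
    W2 -= 0.5 * (h.T @ d_out); b2 -= 0.5 * d_out.sum(axis=0)
    W1 -= 0.5 * (X.T @ d_h);   b1 -= 0.5 * d_h.sum(axis=0)

    if step % 1000 == 0:
        print(f"step {step:5d}  loss {loss:.4f}")  # the curve on the laptop screen
```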

Today, Nvidia’s GPUs can efficiently execute all the tiny calculations that go into training neural networks, but shuttling data between these chips is still inefficient, said Scott Gray, who was an engineer at Nervana before joining OpenAI, an artificial intelligence lab whose founders include Tesla CEO Elon Musk.

So in addition to building chips specifically for neural networks, startups are rethinking the hardware that surrounds them.

Graphcore, for example, is building chips that include more built-in memory so that they don’t need to send as much data back and forth. Others are looking at ways of widening the pipes between chips so that data exchange happens faster.
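A back-of-envelope sketch shows why keeping data on the chip matters; every number below is an assumption chosen for illustration, not a figure from Graphcore or any other vendor.

```python
# Illustrative arithmetic (assumed numbers, not vendor figures): compare the
# time spent moving a model's weights over an off-chip link versus reading
# them from on-chip memory during training.
weights_bytes = 200e6          # assume a 200 MB model
steps_per_sec = 100            # assume 100 training steps per second

offchip_bw = 32e9              # assume ~32 GB/s over a PCIe-class link
onchip_bw  = 10e12             # assume ~10 TB/s of aggregate on-chip memory bandwidth

traffic = weights_bytes * steps_per_sec      # bytes touched per second of training
print(f"off-chip transfer time per second of work: {traffic / offchip_bw:.3f} s")
print(f"on-chip  transfer time per second of work: {traffic / onchip_bw:.5f} s")
```

With those assumed numbers, moving the weights off-chip eats most of each second, while keeping them on the chip makes the data movement negligible, which is the trade the paragraph above describes.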

“This is not just about building chips but looking at how these chips are connected together and how they talk to the rest of the system,” Coughran, of Sequoia, said.

But this is only part of the change. Once neural networks are trained for a task, additional gear has to execute that task. At Toyota, autonomous car prototypes are using neural networks as a way of identifying pedestrians, signs and other objects on the road. After training a neural network in the data center, the company runs this algorithm on chips installed on the car.

A number of chipmakers — including startups like Mythic, DeePhi and Horizon Robotics — are tackling this problem as well, pushing AI chips into devices ranging from phones to cars.

It is still unclear how well any of these new chips will work. Designing and building a chip takes about 24 months, which means even the first viable hardware relying on them won’t arrive until this year. And the chip startups will face competition from Nvidia, Intel, Google and other industry giants.

But everyone is starting from about the same place: the beginning of a new market.

Robert Beatty / New York Times
