Albany Times Union

To power AI, startup creates a giant computer chip

By Cade Metz

SAN FRANCISCO — The largest computer chips would usually fit in the palm of your hand. Some could rest on the tip of your finger. Conventional wisdom says anything bigger would be a problem.

Now a Silicon Valley startup, Cerebras, is challenging that notion. Recently, the company unveiled what it claims is the largest computer chip ever built. As big as a dinner plate — about 100 times the size of a typical chip — it would barely fit in your lap.

The engineers behind the chip believe it can be used in giant data centers and help accelerate the progress of artificial intelligence in everything from self-driving cars to talking digital assistants like Amazon’s Alexa.

Many companies are building new chips for AI, including traditional chipmakers like Intel and Qualcomm and other startups in the United States, Britain and China.

Some experts believe these chips will play a key role in the race to create artificial intelligence, potentially shifting the balance of power among tech companies and even nations. They could feed the creation of commercial products and government technologies, including surveillance systems and autonomous weapons.

Google has already built such a chip and uses it in a wide range of AI projects, including the Google Assistant, which recognizes voice commands on Android phones, and Google Translate, which translates one language into another.

“There is monstrous growth in this field,” said Cerebras’ chief executive and founder, Andrew Feldman, a chip industry veteran who previously sold a company to the chip giant AMD.

New AI systems rely on neural networks. Loosely based on the network of neurons in the human brain, these complex mathematical systems can learn tasks by analyzing vast amounts of data. By pinpointing patterns in thousands of cat photos, for instance, a neural network can learn to recognize a cat.
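For readers who want to see that idea in miniature, the sketch below (a made-up example, not software from Cerebras or Google) trains a tiny neural network to tell two clusters of points apart simply by analyzing labeled examples and adjusting its internal weights.

```python
# A toy illustration of "learning by analyzing examples": a tiny two-layer
# neural network (all sizes and data invented for illustration) learns to
# separate two clusters of points by repeatedly adjusting its weights.
import numpy as np

rng = np.random.default_rng(0)

# Stand-ins for "photos": 200 two-dimensional points, half from each class.
x = np.vstack([rng.normal(-1.0, 0.5, (100, 2)),   # class 0
               rng.normal(+1.0, 0.5, (100, 2))])  # class 1
y = np.concatenate([np.zeros(100), np.ones(100)])

# Tiny network: 2 inputs -> 8 hidden units -> 1 output probability.
w1 = rng.normal(0, 0.5, (2, 8)); b1 = np.zeros(8)
w2 = rng.normal(0, 0.5, (8, 1)); b2 = np.zeros(1)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

for step in range(500):
    # Forward pass: mostly matrix multiplication, the math GPUs excel at.
    h = np.tanh(x @ w1 + b1)
    p = sigmoid(h @ w2 + b2).ravel()

    # Backward pass: nudge the weights to reduce the prediction error.
    grad_out = (p - y)[:, None] / len(y)
    grad_w2 = h.T @ grad_out
    grad_h = grad_out @ w2.T * (1 - h ** 2)
    grad_w1 = x.T @ grad_h

    w2 -= 1.0 * grad_w2; b2 -= 1.0 * grad_out.sum(0)
    w1 -= 1.0 * grad_w1; b1 -= 1.0 * grad_h.sum(0)

accuracy = np.mean((p > 0.5) == y)
print(f"accuracy after training: {accuracy:.2f}")  # close to 1.0
```

The same pattern-finding recipe, scaled up to millions of photos and billions of weights, is what the chips described in this article are built to speed up.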

That requires a particular kind of computing power. Today, most companies analyze data with help from graphics processing units, or GPUs. These chips were originally designed to render images for games and other software, but they are also good at running the math that drives a neural network.

About six years ago, as tech giants like Google, Facebook and Microsoft doubled down on artificial intelligence, they started buying enormous numbers of GPUs from the Silicon Valley chipmaker Nvidia. In the year leading up to the summer of 2016, Nvidia sold $143 million in GPUs. That was more than double the year before.

But the companies wanted even more processing power. Google built a chip specifically for neural networks — the tensor processing unit, or TPU — and several other chipmakers chased the same goal.

AI systems operate with many chips working together. The trouble is that moving big chunks of data between chips can be slow, and can limit how quickly chips analyze that information.

“Connecting all these chips together actually slows them down — and consumes a lot of energy,” said Subramanian Iyer, a professor at the University of California, Los Angeles, who specializes in chip design for artificial intelligence.

Hardware makers are exploring many different options. Some are trying to broaden the pipes that run between chips. Cerebras, a 3-year-old company backed by more than $200 million in funding, has taken a novel approach. The idea is to keep all the data on a giant chip so a system can operate faster.
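A rough, hedged way to see why that matters: with assumed (not measured) bandwidth figures, the same block of data takes far longer to cross a chip-to-chip link than to stay in on-chip memory. Every number in the sketch below is an illustrative placeholder, not a specification for any real chip.

```python
# Back-of-envelope sketch of the bottleneck described above. All figures are
# illustrative assumptions, not measured numbers for any real hardware.
data_to_move_gb     = 4.0      # assumed data shuttled per training step
off_chip_link_gb_s  = 50.0     # assumed chip-to-chip link bandwidth
on_chip_memory_gb_s = 5000.0   # assumed on-chip memory bandwidth

t_off_chip = data_to_move_gb / off_chip_link_gb_s   # seconds moving data off chip
t_on_chip  = data_to_move_gb / on_chip_memory_gb_s  # seconds if data stays on chip

print(f"off-chip transfer: {t_off_chip * 1000:.1f} ms per step")
print(f"on-chip access:    {t_on_chip * 1000:.2f} ms per step")
print(f"advantage of keeping data on one chip: {t_off_chip / t_on_chip:.0f}x")
```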

Working with one big chip is very hard to do. Computer chips are typically built onto round silicon wafers that are about 12 inches in diameter. Each wafer usually contains about 100 chips.

Many of these chips, when removed from the wafer, are thrown out and never used. Etching circuits into the silicon is such a complex process that manufacturers cannot eliminate defects. Some circuits just don’t work. This is part of the reason that chipmakers keep their chips small — less room for error, so they do not have to throw as many of them away.
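That trade-off can be put in rough numbers with the standard Poisson yield approximation, in which the chance that a chip contains no fatal defects falls off exponentially with its area. The defect density used below is an assumed round figure, not a real process specification.

```python
# Rough illustration of why chipmakers keep chips small, using the simple
# Poisson yield approximation: P(defect-free) = exp(-defect_density * area).
# The defect density is an assumed round number, not a real process figure.
import math

defects_per_cm2 = 0.1

def yield_fraction(area_cm2: float) -> float:
    """Probability a chip of the given area contains no fatal defects."""
    return math.exp(-defects_per_cm2 * area_cm2)

print(f"1 cm^2 chip:  {yield_fraction(1):.1%} usable")    # roughly 90%
print(f"8 cm^2 chip:  {yield_fraction(8):.1%} usable")    # roughly 45%
print(f"wafer-scale chip (hundreds of cm^2): {yield_fraction(450):.2e}")
# Essentially zero: a wafer-sized chip only works if it can tolerate defects,
# which is why Cerebras routes data around broken cores, as described below.
```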

Cerebras said it had built a chip the size of an entire wafer.

Others have tried this, most notably a startup called Trilogy, founded in 1980 by the well-known IBM chip engineer Gene Amdahl. Though it was backed by over $230 million in funding, Trilogy ultimately decided the task was too difficult and it folded after five years.

Nearly 35 years later, Cerebras plans to start shipping hardware to a small number of customers next month. Feldman said the chip could train AI systems between 100 and 1,000 times faster than existing hardware.

He and his engineers have divided their giant chip into smaller sections, or cores, with the understanding that some cores will not work. The chip is designed to route information around these defective areas.
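As a toy illustration of that idea (not Cerebras’s actual routing logic), the sketch below treats the chip as a small grid of cores, marks a few as defective, and uses a simple breadth-first search to find a path for data that detours around them.

```python
# Toy illustration of routing information around defective cores on a grid.
# The grid size, defect positions, and algorithm (plain breadth-first search)
# are illustrative choices, not Cerebras's actual design.
from collections import deque

SIZE = 8                                                 # an 8x8 grid of cores
defective = {(2, 2), (2, 3), (3, 3), (5, 5), (6, 1)}     # assumed bad cores

def route(start, goal):
    """Return a path of working cores from start to goal, or None."""
    queue = deque([[start]])
    seen = {start}
    while queue:
        path = queue.popleft()
        r, c = path[-1]
        if (r, c) == goal:
            return path
        for nr, nc in ((r + 1, c), (r - 1, c), (r, c + 1), (r, c - 1)):
            cell = (nr, nc)
            if (0 <= nr < SIZE and 0 <= nc < SIZE
                    and cell not in defective and cell not in seen):
                seen.add(cell)
                queue.append(path + [cell])
    return None  # no working route exists

path = route((0, 0), (7, 7))
print(f"data detours through {len(path)} cores:", path)
```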

Significant questions hang over the company’s hardware. Feldman’s performance claims have not been independently verified, and he did not reveal how much the chip will cost.

The price will depend on how efficiently Cerebras and its manufacturing partner, the Taiwan-based TSMC, can build the chip.

The process is a “lot more labor intensive,” said Brad Paulsen, a senior vice president with TSMC. A chip this large also consumes large amounts of power, which means that keeping it cool will be difficult — and expensive. In other words, building the chip is only part of the task.

“This is a challenge for us,” Paulsen said. “And it is a challenge for them.”

Cerebras plans to sell the chip as part of a much larger machine that includes elaborate equipment for cooling the silicon with chilled liquid. It is nothing like what the big tech companies and government agencies are used to working with.

“It is not that people have not been able to build this kind of a chip,” said Rakesh Kumar, a professor at the University of Illinois who is also exploring large chips for AI. “The problem is that they have not been able to build one that is commercially feasible.”

Cerebras claims this giant computer chip is the largest one ever built.

Andrew Feldman, the chief executive and founder of Cerebras, holds a computer chip that could improve how quickly artificial intelligence systems can learn tasks, but its complexity and size could be a challenge. (Photos by Jessica Chou / New York Times)
