PC Advisor

Time to dump Moore’s Law

An end to Moore’s Law will prompt chipmakers to think outside the box, reveals Agam Shah

Dumping Moore’s Law is perhaps the best thing that could happen to computers, as it’ll hasten the move away from an aging computer architecture that is holding back hardware innovation.

That’s the view of prominent scientist R. Stanley Williams, a senior fellow at Hewlett Packard Labs. Williams played a key role in HP’s creation of the memristor in 2008.

Moore’s Law is an observation made by Intel co-founder Gordon Moore in 1965 that has helped make devices smaller and faster. It predicts that the density of transistors will double roughly every 18 to 24 months, even as the cost of making chips goes down.
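The compounding effect of that doubling cadence can be sketched in a few lines. This is a minimal illustration, not anything from the article itself; the function name and the 24-month period are assumptions for the example.

```python
def transistor_density(years, base_density=1.0, doubling_months=24):
    """Return relative transistor density after `years`, assuming
    density doubles once every `doubling_months` (here, 24 months)."""
    return base_density * 2 ** (years * 12 / doubling_months)

# At a 24-month cadence, a decade brings five doublings: 2**5 = 32x.
print(transistor_density(10))  # → 32.0
```

The same exponential is why a fixed budget buys markedly faster hardware each year: a ~41% annual density gain compounds into an order-of-magnitude improvement within a decade.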

Every year, computers and mobile devices that are significantly faster can be bought for the same amount of money, thanks in part to guidance from Moore’s Law. The observation has helped drive up device performance on a predictable basis while keeping costs down.

But the predictions tied to Moore’s Law are reaching their limits as it becomes harder to make chips at smaller geometries. That’s a challenge facing all top chipmakers, including Intel, which is changing the way it interprets Moore’s Law as it clings to it for dear life.

Williams is the latest to join a growing cadre of scientists who predict Moore’s Law is dying. The end of Moore’s Law “could be the best thing that has happened to computing in decades,” Williams wrote in a research paper published in the latest issue of IEEE’s Computing in Science & Engineering.

The end of Moore’s Law will bring creativity to chip and computer design and help engineers and researchers think outside the box, Williams said. The law has bottled up innovation in computer design, he suggested.

So what’s next? Williams predicted there would be computers with a series of chips and accelerators patched together, much like early supercomputers. Computing could also be memory-driven, with a much faster bus delivering speedier computation and throughput.

The idea of a memory-driven computer plays to the strengths of HPE, which has built The Machine along those lines. The initial version of The Machine has persistent memory that can be used as both DRAM and flash storage, but it could eventually be based on memristors, an intelligent form of memory and storage that can track data patterns.

Memory-driven computing could also break down the current architecture-based, processor-centric domination of the computer market.

In the longer term, neuromorphic chips that are designed around the way the brain works could drive computing. HPE is developing a chip designed to mimic a human brain, and similar chips are being developed by IBM, Qualcomm, and universities in the US and Europe.

“Although our understanding of brains today is limited, we know enough now to design and build circuits that can accelerate certain computational tasks,” Williams wrote.

Applications such as machine learning highlight the need for new types of processors. IBM has benchmarked its neuromorphic chip, called TrueNorth, as being faster and more power-efficient than conventional deep-learning hardware such as GPUs.

Williams suggested ASICs (application-specific integrated circuits) and FPGAs (field-programmable gate arrays) could play a role in driving computing beyond Moore’s Law. These technologies will use superfast interconnects such as Gen-Z, which was introduced last year and will be supported by major chipmakers and server makers, including Dell and Hewlett Packard Enterprise.

Quantum computers are also emerging as a way to replace today’s PCs and servers, but they are still decades away from running everyday applications.
