IBM inventor created chip that changed computing
Robert H. Dennard, an engineer who invented the silicon memory technology that plays an indispensable role in every smartphone, laptop and tablet computer, died April 23 in Sleepy Hollow, New York. He was 91.
The cause of death, at a hospital, was a bacterial infection, said his daughter, Holly Dennard.
Robert Dennard’s pioneering work began at IBM in the 1960s, when the equipment used to hold and store computer data was expensive, slow and hulking – often room-size machines. He was studying the emerging field of microelectronics, which used silicon-based transistors to store digital bits of information.
In 1966, Dennard invented a way to store one digital bit on a single transistor – a technology called dynamic random-access memory, or DRAM, which holds the information as an electrical charge that slowly fades over time and must be refreshed periodically. His invention opened the door to previously unimaginable improvements in data capacity, at lower cost and higher speed, all on tiny silicon chips.
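The mechanism is simple to picture: writing a 1 stores a small electrical charge that leaks away within milliseconds, so the memory must read and rewrite each cell on a fixed schedule. The toy sketch below (in Python, with invented names and numbers rather than anything from Dennard’s actual design) illustrates why a cell keeps its bit only when it is refreshed.

```python
# Toy model of a single DRAM cell: a stored charge that leaks away and must be
# periodically refreshed to remain readable. Illustrative only; the constants
# and decay rate are invented for this example, not taken from Dennard's design.

LEAK_PER_MS = 0.05       # fraction of charge lost each millisecond (hypothetical)
READ_THRESHOLD = 0.5     # below this, the cell can no longer be read as a "1"
REFRESH_INTERVAL_MS = 8  # how often the memory rewrites the cell

def bit_survives(bit: int, total_ms: int, refresh: bool) -> bool:
    """Return True if the stored bit is still readable after total_ms."""
    charge = float(bit)  # writing a 1 charges the cell; a 0 leaves it empty
    for t in range(1, total_ms + 1):
        charge *= (1.0 - LEAK_PER_MS)            # the charge slowly fades
        if refresh and t % REFRESH_INTERVAL_MS == 0:
            charge = float(bit)                  # a refresh rewrites the original value
    return (charge >= READ_THRESHOLD) == bool(bit)

if __name__ == "__main__":
    print("with refresh:   ", bit_survives(1, 64, refresh=True))   # True: bit survives
    print("without refresh:", bit_survives(1, 64, refresh=False))  # False: bit is lost
```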
DRAM has been the basis of steady progress in the decades since. High-speed, high-capacity memory chips hold and quickly shuttle data to a computer’s microprocessor, which converts it into text, sound and images. Streaming videos on YouTube, playing music on Spotify or Apple Music and using AI chatbots like ChatGPT all depend on them.
“DRAM has made much of modern computing possible,” said John Hennessy, chair of Alphabet, Google’s parent company.
Dennard also devised a concept that has served as a road map for future advances in microelectronics. In an initial paper in 1972, fleshed out in another two years later, he described the physics that would allow transistors to shrink and become faster and less costly, even as the power consumed per unit of chip area would remain almost constant.
The principle, known as Dennard scaling, was complementary to a prediction made in 1965 by Gordon Moore, who went on to co-found Intel. Moore claimed that the number of transistors that could be crammed onto a silicon chip could be doubled about every two years – and that computing power and speeds would accelerate on that trajectory. His prediction became known as Moore’s Law.
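One way to see how the two ideas fit together is to work through the idealized scaling rules. In the textbook version of Dennard scaling, shrinking every dimension and the supply voltage by a factor k makes each transistor faster and lets k-squared as many fit in the same area, while the power drawn per unit of chip area stays roughly constant. The short sketch below (Python) runs those rules for k = 2; it is an illustrative simplification, not a model of any particular chip.

```python
# Idealized (constant-field) Dennard scaling: shrink every linear dimension and
# the supply voltage by a factor k. These are the textbook scaling rules, not
# measurements from any specific IBM process.

def dennard_scaling(k: float) -> dict:
    return {
        "transistor area":      1 / k**2,  # each device occupies 1/k^2 the area
        "switching delay":      1 / k,     # devices switch k times faster
        "power per transistor": 1 / k**2,  # lower voltage and current per device
        "transistors per area": k**2,      # k^2 as many devices fit in the same space
        "power per chip area":  1.0,       # k^2 more devices x 1/k^2 power each
    }

if __name__ == "__main__":
    for quantity, factor in dennard_scaling(2.0).items():
        print(f"{quantity:>20}: x{factor:g}")
```

With k = 2, four times as many transistors fit in the same area at the same power budget – roughly the doubling-every-two-years cadence of Moore’s Law compounded twice.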
Moore’s Law concerned the density of transistors on a chip, whereas Dennard scaling mainly concerned power consumption. By 2005, Dennard scaling had reached its limits: Transistors had become so tiny that they began to leak electrons, causing chips to heat up and consume more energy.