If not quantum computing, then what?
Moore’s Law is dead and quantum computing isn’t coming to the rescue, at least not any time soon. So, is there anything that can keep conventional computing moving forward? One promising technology is carbon nanotubes, one of those wonder materials with seemingly infinite possible applications, from space elevators to solar cells. But how can they help computing? One of the key theoretical advantages of transistors built from carbon nanotubes is much lower power consumption. That would allow more powerful chips to operate within the limited energy budget of a device like a smartphone. Physically, it’s easy to fit a desktop-class processor die inside a smartphone. Powering it is the problem, and carbon nanotubes might solve it.
Optical interconnects, both between and within chips, are another big opportunity. Just as fibre-optic technology is boosting network speeds, including domestic Internet connections, building optical technology into computer chips could unlock a huge leap in bandwidth and energy efficiency. But of all the options, more efficient circuit design is likely to bring the most immediate benefits to compute power. Graphics chips are a handy yardstick here. Already, Nvidia’s GPUs are significantly more efficient than AMD’s competing graphics chips in terms of overall rendering performance per unit of die area, thanks to better optimised chip design.
For future Nvidia GPUs, including the brand new Turing architecture, that refinement and efficiency are set to improve further thanks to architectural tweaks that combine multiple small operations into a single larger operation. AMD is taking a similar approach with its Rapid Packed Math technology in the latest Vega GPUs, even if it remains behind on overall efficiency. With Moore’s Law consigned to history, that kind of optimised chip design will be more critical than ever before.