WILL COMPUTERS KEEP GETTING FASTER?
Marc Nichols
Historically, the main driver of the constant increase in computer speed has been ‘Moore’s law’ – the observation that the number of transistors that can be crammed onto a single microchip doubles roughly every two years. This trend must stop eventually, once individual processing elements approach atomic scales, but other developments – such as parallel computing, smarter algorithms and even the prospect of quantum computers – mean that speeds will continue to increase for the foreseeable future.
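To get a feel for the compounding this implies, here is a minimal sketch in Python. The doubling period of two years and the starting transistor count are illustrative assumptions, not real chip data:

```python
# Illustrative Moore's-law compounding: transistor count doubling
# every two years (assumed figures, not measurements).
def transistor_count(start_count, years, doubling_period=2):
    """Projected transistor count after `years`, doubling every `doubling_period` years."""
    return start_count * 2 ** (years / doubling_period)

# Starting from an assumed 1 million transistors, 20 years of
# doubling every 2 years gives a 2**10 = 1024-fold increase:
print(f"{transistor_count(1_000_000, 20):,.0f}")  # → 1,024,000,000
```

Exponential growth like this is why two decades of steady doubling turns a million-transistor chip into a billion-transistor one.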