Business Day

Quantum computing ushers in a new paradigm

- Steyn is a human-centred AI advocate and thought leader. He is the founder of AIforBusiness.net.

For more than 50 years, Moore’s Law, named after Intel co-founder Gordon Moore, has been a guiding principle in the world of computing. The law holds that computing power doubles about every 18 months, a prediction that has held remarkably true, fuelling the explosive growth in computer capabilities.
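As a back-of-the-envelope illustration of what that doubling rate implies (a sketch only; the 18-month figure is the popular paraphrase of Moore’s observation), consider how the growth compounds:

```python
# Back-of-the-envelope: if computing power doubles every 18 months,
# how much growth does that compound to over a span of years?

def growth_factor(years: float, doubling_months: float = 18.0) -> float:
    """Growth factor after `years`, assuming one doubling per period."""
    doublings = years * 12.0 / doubling_months
    return 2.0 ** doublings

for span in (10, 30, 50):
    print(f"{span} years of doubling -> about {growth_factor(span):,.0f}x")
# 30 years alone compounds to 2**20, about a million-fold.
```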

This unprecedented growth has affected society profoundly, transforming how we work, communicate and entertain ourselves.

However, this era of rapid growth in silicon-based computing is approaching its physical limits. Microchips have become so compact that their thinnest transistor layers are nearing atomic scales. With layers only about 20 atoms across, we are fast approaching a critical threshold: at about five atoms across, the unpredictable behaviour of electrons, a consequence of the fundamental principles of quantum mechanics, threatens the practicality and efficiency of silicon chips.

The decline of Moore’s Law, heralding the end of the Silicon Age, coincides with the rise of a groundbreaking new technology: quantum computing. Traditional computers, including the most advanced supercomputers, operate using bits, the basic units of digital information represented as 0s and 1s. Each bit holds one definite value at a time, and together they form the backbone of digital computing.

Quantum computing, however, leverages the principles of quantum mechanics to transcend these limitations. In quantum computers, the basic unit of information is the qubit, or quantum bit. Unlike a traditional bit, a qubit can exist in multiple states simultaneously due to the quantum phenomenon of superposition.
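For readers who want to see the idea concretely, a qubit can be described by two complex numbers called amplitudes. The following minimal Python sketch (the equal amplitudes are illustrative, not drawn from any particular machine) shows how superposition encodes the probabilities of a measurement:

```python
import numpy as np

# A qubit is described by two complex amplitudes (alpha, beta):
# |psi> = alpha|0> + beta|1>, with |alpha|^2 + |beta|^2 = 1.
# |alpha|^2 and |beta|^2 are the probabilities of measuring 0 or 1.

alpha, beta = 1 / np.sqrt(2), 1 / np.sqrt(2)   # an equal superposition
qubit = np.array([alpha, beta], dtype=complex)

p0 = abs(qubit[0]) ** 2   # probability of reading 0
p1 = abs(qubit[1]) ** 2   # probability of reading 1
print(f"P(0) = {p0:.2f}, P(1) = {p1:.2f}")     # P(0) = 0.50, P(1) = 0.50

# Measurement collapses the superposition to a single classical bit:
outcome = np.random.choice([0, 1], p=[p0, p1])
print("measured:", outcome)
```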

Quantum computers also exploit a second quantum property known as entanglement. When qubits become entangled, their states are linked: a measurement on one is instantly correlated with the state of the other, no matter the distance between them. Together with superposition, this property helps quantum computers perform certain complex calculations at speeds unattainable by classical computers.
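Again purely as an illustrative sketch, simulating repeated measurements of a standard two-qubit Bell state shows the correlation described above: each qubit’s outcome is individually random, yet the pair always agrees.

```python
import numpy as np

# The Bell state (|00> + |11>) / sqrt(2): two entangled qubits.
# Amplitudes are indexed by the four joint outcomes 00, 01, 10, 11.
bell = np.array([1, 0, 0, 1], dtype=complex) / np.sqrt(2)
probs = np.abs(bell) ** 2   # [0.5, 0, 0, 0.5]

rng = np.random.default_rng(0)
outcomes = rng.choice(["00", "01", "10", "11"], size=10, p=probs)
print(list(outcomes))
# Each qubit's result alone is a fair coin flip, yet the pair always
# agrees: you only ever see "00" or "11", never "01" or "10".
```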

The potential of quantum computing was vividly demonstrated in 2019 by Google’s Sycamore quantum computer, which achieved what is known as “quantum supremacy”.

With 53 qubits, Sycamore performed in minutes a calculation that would be practically impossible for the most powerful supercomputers, showcasing the immense potential of the technology.

Despite their potential, quantum computers face significant challenges. Keeping qubits in a stable state, known as quantum coherence, is incredibly difficult: interaction with the external environment destroys their quantum states, a process called decoherence. The sheer complexity of entangling multiple qubits in a controlled manner is a further major technical hurdle.
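To give a feel for how fragile coherence is, here is a toy model with an assumed coherence time (the figure is illustrative only; real devices vary widely). The usable “quantumness” of a qubit decays roughly exponentially, so every operation must finish well inside that window:

```python
import math

# Toy model of decoherence: a qubit's coherence decays roughly
# exponentially with a characteristic time T2.
# The T2 value below is purely illustrative.
T2_MICROSECONDS = 100.0

def coherence_remaining(t_us: float) -> float:
    """Fraction of coherence left after t_us microseconds."""
    return math.exp(-t_us / T2_MICROSECONDS)

for t in (1, 10, 100, 500):
    print(f"after {t:>3} us: {coherence_remaining(t):.1%} coherence left")
# Every operation, and any error correction, must complete well
# within this window; that is the engineering heart of the challenge.
```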

The implications of quantum computing extend beyond sheer computational power. The ability to process vast amounts of data at unprecedented speeds raises concerns in areas such as cybersecurity, as quantum computers could theoretically break many of the cryptographic systems currently in use.
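The concern is concrete: widely used schemes such as RSA rest on the classical difficulty of factoring large numbers, a task Shor’s quantum algorithm could in principle perform efficiently. A toy sketch of the asymmetry (the tiny primes are for illustration only):

```python
# Toy illustration of why factoring matters to cryptography.
# RSA-style security rests on this asymmetry: multiplying two primes
# is easy, but recovering them from the product is classically hard.
# Shor's algorithm would erase that asymmetry on a quantum computer.

p, q = 61, 53            # secret primes (absurdly small, for illustration)
n = p * q                # public modulus: easy to compute (3233)

def factor_by_trial_division(n: int) -> tuple[int, int]:
    """Brute-force attack: time grows exponentially in the digit count."""
    d = 2
    while d * d <= n:
        if n % d == 0:
            return d, n // d
        d += 1
    return n, 1

print(factor_by_trial_division(n))   # (53, 61): trivial here,
# but infeasible for the 600-digit moduli used in practice.
```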

As the world transitions from the age of silicon to the quantum era, the stakes are extraordinarily high. The development of quantum computing promises to solve some of the world’s most complex problems, from climate modelling to drug discovery. However, it also poses new challenges and raises important ethical and security considerations.

While the end of Moore’s Law signifies a turning point in the history of computing, it also marks the beginning of a new, exciting era. Quantum computing, with its unparalleled potential, is set to redefine the landscape of technology and its impact on society. As we stand on the brink of this quantum revolution, it is imperative to navigate these uncharted waters with foresight and responsibility, ensuring that this powerful technology is harnessed for the greater good of humanity.

Revolutionary shifts: Intel co-founder Gordon Moore predicted rapid advances in computer chip technology. /Justin Sullivan/Getty Images
JOHAN STEYN
