The Guardian (USA)

Covid-19 has shown humanity how close we are to the edge

- Toby Ord

It is profoundly difficult to grapple with risks whose stakes may include the global collapse of civilisation, or even the extinction of humanity. The pandemic has shattered our illusions of safety and reminded us that despite all the progress made in science and technology, we remain vulnerable to catastrophes that can overturn our entire way of life. These are live possibilities, not mere hypotheses, and our governments will have to confront them.

As Britain emerges from Covid-19, it could find itself at the forefront of the response to future disasters. The government’s recent integrated review, Britain’s taking of the G7 presidency and the Cop26 climate conference, which will be hosted in Glasgow later this year, are all occasions to address global crises. But in order to ensure that the UK really is prepared, we need to first identify the biggest risks that we face in the coming decades.

Technological progress since the Industrial Revolution has ultimately increased the risk of the most extreme events, putting humanity’s future at stake through nuclear war or climate breakdown. One technology that may pose the greatest threat this century is artificial intelligence (AI) – not the current crop of narrowly intelligent networks, but more mature systems with a general intelligence that surpasses our own. AI pioneers from Alan Turing to Stuart Russell have argued that unless we develop the means to control such systems or to align them with our values, we will find ourselves at their mercy.

By my estimation, the chances of such a risk causing an existential catastrophe in the next century are about one in six: like Russian roulette. If I’m even roughly right about the scale of these threats, then this is an unsustainable level of risk. We cannot survive many centuries without transforming our resilience.

The government’s recent integrated review highlighted the importance of these “catastrophic-impact threats”, paying attention to four of the most extreme risks: the threats from AI, global pandemics, the climate crisis and nuclear annihilation. It rightly noted the crucial role that AI systems will play in modern warfare, but was silent about the need to ensure that the AI systems we deploy are developed safely and aligned with human values. It underscored the likelihood of a successful biological attack in the coming years, but could have said more about the role science and technology can play in protecting us. And although it mentioned the threat of other countries increasing and diversifying their nuclear capabilities, the decision to expand the UK’s own nuclear arsenal is both disappointing and counterproductive.

To really transform our resilience to extreme risks, we need to go further. First, we must urgently address biosecurity. As well as the possibility of a new pandemic spilling over from animals, there is the even worse prospect of an engineered pandemic, designed by foreign states or non-state actors, with a combination of lethality, transmissibility and vaccine resistance beyond any natural pathogen. With the rapid improvements in biotechnology, the number of parties who could create such a weapon is only growing.

To meet this risk, the UK should launch a new national centre for biosecurity, as has been recommended by the joint committee on the National Security Strategy and my own institute at Oxford University. This centre would counter the threat of biological weapons and laboratory escapes, develop effective defences against biological threats and foster talent and collaboration across the UK biosecurity community. There is a real danger that the legacy of Covid-19 does not go beyond preparing for the next naturally occurring pandemic, neglecting the possibilities of a human-made pandemic that keep experts up at night.

Second, the UK needs to transform its resilience to the full range of extreme risks we face. We don’t know what the next crisis on the scale of Covid-19 will be, so we need to be prepared for all such threats. The UK’s existing risk management system, within the Cabinet Office’s civil contingencies secretariat, is strong in many ways, but it only addresses risks that pose a clear danger in the next two years – making it impossible to adequately evaluate dangers that would take more than two years to prepare for, such as those posed by advanced AI. We also suffer from the lack of a chief risk officer, or equivalent position, who could take sole responsibility for the full range of extreme threats across government.

Third, we need to put extreme risks on the international agenda. These are global problems that require global solutions. The legal scholar Guglielmo Verdirame argues that while the climate emergency and nuclear weapons are covered by at least some international law, there is no global legal regime in force that grasps the gravity of other extreme risks, or that has the necessary breadth to deal with the changing landscape of such risks. The G7 presidency is the perfect opportunity to remedy this. Rather than settle for a treaty on pandemic preparedness, as is being proposed by the prime minister, the UK could set its ambitions higher, and lead the call for a new treaty on risks to the future of humanity, with a series of UN security council resolutions to place this new framework on the strongest possible legal footing.

There is an understandable tendency for even the most senior people in government to see extreme risks as too daunting to take on. But there are concrete steps that the UK can take to transform its resilience to these threats, and there is no better time to do so than now. Covid-19 has given us the chance to make decades’ worth of progress in a matter of months. We must seize this opportunity.

Toby Ord is a senior research fellow in philosophy at Oxford University, and author of The Precipice: Existential Risk and the Future of Humanity

Eagle Creek wildfire, close to Beacon Rock golf course, Washington, US, in 2017: ‘We need to transform resilience to the full range of extreme risks we face. We don’t know what the next crisis will be.’ Photograph: Reuters
