Sunday Star-Times

Why is Covid modelling often ‘wrong’?

Some 50,000 infections by Waitangi Weekend, thousands of deaths... Is there something up with Covid modelling? Well, that depends on what you think it is. Keith Lynch explains.

-

In mid-December, UK media reported that between 600 and 6000 people could die every day with Omicron. These numbers prompted some scientists to urge Prime Minister Boris Johnson’s government to immediately introduce more stringent public health measures.

The government rejected this advice, instead sticking with light-touch restrictions (its so-called Plan B), and while infections skyrocketed, the number of deaths never reached those levels.

The fallout was inevitable. Modellers were accused of scaremongering. Those modellers defended their work.

In recent interviews, Professor Graham Medley, a member of the Scientific Advisory Group for Emergencies (Sage), went to great lengths to make clear that models were not predictions.

‘‘We cannot quantitatively predict what’s going to happen,’’ he said in one. ‘‘Our job is to provide the envelope of possibilities from the best to the worst, and then to give some indication to policymakers about that uncertainty, and about what factors drive that uncertainty.’’

The situation was not helped, Sage member Professor John Edmunds told The Independent, by media focusing on the most shocking numbers – the worst-case scenarios that, in reality, would never eventuate because that would never be allowed.

A critique by Michael Simmons of The Spectator was simple: modelling has real-world consequences. He wrote in a recent piece: ‘‘If the Sage summer reopening scenarios had been believed (as they were by [Labour leader] Keir Starmer) lockdown could have been extended – with all the social and economic damage that would entail.’’

UK modellers suggested 100,000 daily cases were ‘‘almost certain’’ last summer. That did not eventuate.

Aotearoa is no stranger to Covid-19 modelling controversies.

Most recently, international modelling that suggested New Zealand could face 50,000 daily infections (not confirmed cases) by Waitangi Weekend was widely publicised. That modelling had problems, as Auckland University statistician Thomas Lumley explained. Covid modeller Rodney Jones also told RNZ that it simply wasn’t anchored to New Zealand’s reality.

Yet, the 50,000 number understandably grabbed the public’s attention. And why not? It was a memorable headline, a nice round number, something to talk about with friends. Even Professor Michael Plank was asked: ‘‘Why aren’t we seeing 50,000 cases by now?’’ upon the release of new Omicron modelling by his team this week.

‘‘It does create a perception that these modellers just produce crazy numbers all the time. And, you know, we just can’t take them seriously at all. That’s been a bit frustrating,’’ he told me.

It’s hard to be sure what the public makes of Covid modelling in early 2022.

Perhaps some New Zealanders look at the outcomes of Covid-19 models as if they suggest pre-ordained fates. Maybe some New Zealanders think every policy decision is entirely guided by modelling.

It’s true that early research by the University of Auckland’s Te Pūnaha Matatini, which suggested tens of thousands of Kiwis could die from Covid, helped inform the government’s initial decision to shut the borders and stamp out the virus in early 2020.

It’s also true that at a 1pm press conference in September last year, Professor Shaun Hendy – flanked by Prime Minister Jacinda Ardern and Director-General of Health Ashley Bloomfield – warned that 7000 people could die in a single year (in a particular scenario, mind).

But in October, Hendy and a range of other public health experts called for a Level 4 ‘‘circuit breaker’’ – advice the government did not follow.

Let’s start with the basics. What are Covid models?

How about we start with an old aphorism generally attributed to the statistician George Box.

‘‘All models are wrong. But some of them are useful.’’

I asked Plank about this. ‘‘All models are simplifications of reality. So in that sense, you know, yes, they are all wrong, because they don’t try to capture every last detail of what’s going on,’’ he said.

That’s fine, by the way. It’d be impossible to model every eventuality. Covid models are not predictions or guarantees. They typically offer a range of scenarios based on underlying assumptions.
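To make that concrete, here’s a deliberately bare-bones sketch, in Python, of the kind of ‘‘compartmental’’ model epidemiologists build on. Every number in it – the population, the reproduction numbers, the recovery rate – is invented for illustration, not drawn from any New Zealand model. The point is only that changing one assumption changes the whole scenario.

# A toy SIR (susceptible-infected-recovered) model. All numbers are
# invented for illustration; no real Covid model is this simple.
def peak_infections(r0, population=5_000_000, seed_cases=100,
                    recovery_rate=0.2, days=365):
    # Start with almost everyone susceptible.
    s, i, r = population - seed_cases, seed_cases, 0
    peak = i
    for _ in range(days):
        # Daily transmission implied by the assumed reproduction number.
        new_infections = r0 * recovery_rate * s * i / population
        new_recoveries = recovery_rate * i
        s -= new_infections
        i += new_infections - new_recoveries
        r += new_recoveries
        peak = max(peak, i)
    return peak

# The same model under three different assumptions about transmissibility
# produces three very different "scenarios".
for r0 in (1.5, 2.5, 4.0):
    print(f"R0 = {r0}: peak of ~{peak_infections(r0):,.0f} concurrent infections")

None of those outputs is a prediction; each is simply what follows if its assumptions hold.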

The assumptions are the building blocks of the model and not just plucked out of the air. A model released this week by Covid-19 Modelling Aotearoa – a group that includes Plank and Hendy – leans on widely respected UK data.

There’s an assumption that a concurrent Delta outbreak will not take place. There are also assumptions about how severe Omicron is.

Assumptions like these are a touch more subjective, as there is a range of studies exploring Omicron’s severity. So how do modellers decide which data to include?

‘‘It’s partly to do with the biases of the people doing the modelling,’’ Plank said.

The key, he said, is to be as transparent as possible, acknowledging that models are based on evidence at a particular point in time – and that the data could evolve.

‘‘We write down where we got the data from and what the assumptions are, so that it’s contestable.’’

Indeed, I shared the model with a number of overseas infectious disease experts. One said the severity estimates may be a touch high. But another said they could be low, given New Zealand has very little immunity from previous infections.

OK. So they’re not prediction­s?

Let’s go back then to another Covid model, this one from March 2020. It suggested that 80,000 deaths were possible in New Zealand.

There was no way 80,000 people were going to die. But the modellers didn’t say that was going to happen. No, they said – based on a range of assumptions and what they knew at the time – that tens of thousands of people could die if New Zealand did nothing. (They also said there was great uncertainty around all this.)

Of course, we were always going to do something. Even if the government had inexplicably done zilch, people were hardly going to live their normal lives while hundreds died daily of a brand-new virus.

Plank saw this as an example of modelling sounding the alarm, an illustration of a model being used to inform a smart policy decision.

‘‘If you didn’t do that [modelling], you really wouldn’t have any idea as to what could be coming. And so that would be quite dangerous. If we didn’t have a sense of how much of the threat that the virus posed, you know, would we have gone into lockdown when we did?’’

A question worth asking here is: what if the modelling had suggested only 40,000, or even 10,000, would die if the government did nothing? Would that have made any real difference to the policy decisions the government enacted in the end?

Another British scientist made the point that modellers are generally asked to model in times of crisis and great uncertainty – like when Omicron hit UK shores. Therefore, it’s hardly a surprise the models are somewhat limited.

Which brings us to the fact that the public often latch on to precise numbers as a prophecy of what’s to come, as Professor Paul Hunter of the University of East Anglia told me.

Jones said the same – people may well view modelling as an escape from uncertaint­y in periods of crisis.

Auckland modeller Dion O’Neale added it may well be that some people think a model has to hit the right numbers on every single day to count as correct. His key takeaway from this week’s modelling wasn’t that there would be a particular number of cases at a particular point in time. Yes, that may be how media – including the Sunday Star-Times – reported on the data, but the real takeaway, for O’Neale, was the necessity to boost a high proportion of the population before Omicron ramps up.

This is akin to thinking of modelling as something to guide decision-making, not a guarantee of what’s to come.

But the big numbers are so much more interesting!

I hear you... and this lands us on the issue of how models are reported. In the UK, Hunter said, news media (and sometimes modellers themselves) highlight worst-case scenarios. We’ve certainly seen reporting focused on more pessimistic outcomes in New Zealand.

Sometimes it’s very much necessary for modellers to focus on the worst-case scenario, he said. If public health officials, for instance, want to know how many hospital beds to keep free, modellers had better not offer up an underestimate, or a heap of people could die.
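Here’s a rough sketch of that logic, again in Python and again with made-up numbers: run the calculation many times over the plausible range of inputs, then size capacity to the top of the envelope rather than the middle.

import random
import statistics

random.seed(1)

# Invented uncertainty ranges, purely for illustration.
peak_bed_demand = []
for _ in range(10_000):
    hospitalisation_rate = random.uniform(0.002, 0.010)  # uncertain severity
    peak_infections = random.uniform(20_000, 120_000)    # uncertain peak size
    peak_bed_demand.append(hospitalisation_rate * peak_infections)

median = statistics.median(peak_bed_demand)
# A planner reads off a high percentile, not the median.
worst = statistics.quantiles(peak_bed_demand, n=100)[94]
print(f"median scenario: ~{median:.0f} beds; 95th percentile: ~{worst:.0f} beds")

By construction, the number that gets planned to – and often reported – sits above what actually happens most of the time.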

‘‘The worst thing for me,’’ Medley said in a recent interview, ‘‘would be for the government to turn round to me and say, ‘you didn’t tell me it could be as bad as this’. But the consequence of that is that the top level of whatever we do is always worse than what actually happens – by design. It has to be.’’

What happens after that, though, as Hunter said, is that people look at the model and think: ‘‘well, they got that wrong.’’

Let’s just linger on that point – modelling focused on the worst-case scenario. In his critique for The Spectator, Simmons asked whether the biggest issue with Sage modelling was that it did not attempt to take into account what individuals do in response to Covid risk.

This, he suggested, would make the outcomes of that modelling more pessimistic. ‘‘Sage presents this as a neutral decision (it does ‘not attempt to predict how individuals will change their behaviour’) but that is incorrect. The scenarios instead suggest to policymakers that people will not change their behaviour at all.’’

The thing is that even if the government does nothing, people will. This is what played out in the UK over Christmas, where modelling did not attempt to predict how people would react to Omicron. But many did react, restricting what they did despite there being no formal lockdown.

There’s a little more to tease out here. As epidemiologist Adam Kucharski points out, a better-than-expected reality may well mean some just assume modellers simply overestimated how severe the virus was. The reality could be a touch more complicated, though: it may be that fewer people ended up in hospital or died because people changed their behaviour.

As an aside, the latest New Zealand modelling does not account for ‘‘any behavioural changes that may arise dynamically as a result of the epidemic’’. It continues: ‘‘If such measures have a substantial impact on transmission, this would be expected to flatten the curve of cases, hospitalisations and deaths.’’

(The model does offer up different scenarios with different rates of spread, which would account in part for how people behave.)

O’Neale said it’s simply very difficult to factor in how people will act.

Are models political?

A modelling scenario suggesting that ramping up a booster programme is vital is one thing. A model that suggests businesses need to be shut down is entirely different.

This is where modelling gets contentious – when scenarios are used to lobby for stringent public health policies, particularly at times of great uncertainty. This is precisely what played out in the UK in December.

This is why Jones believes models and modellers have to be unimpeachably apolitical, wholly removed from partisanship. The data, he said, should stand alone, and the public should then interpret it.

His feeling is that throughout the pandemic some modellers have been keen to make policy recommendations based on their work. This erodes public trust.

What’s more, he warns that subconscious bias can skew modelling. These mathematical frameworks are sensitive creatures: even tiny changes to the inputs can throw up very different results, as the sketch below illustrates. If, for example, someone thinks there should be fewer or more restrictions and unintentionally lets that view shape their assumptions, the outcomes can shift dramatically.

‘‘And that may not be deliberate, it may just mean you’re passionate.’’

This is how a model becomes part of a ‘‘Covid narrative’’, he said. ‘‘If you’re not careful, a model can become the vehicle you use to express your views. Globally that’s been the issue.’’
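Jones’ point about sensitivity is easy to demonstrate. In this toy Python calculation – the growth rates are invented, not drawn from any real model – nudging a single assumption by two percentage points inflates the 30-day projection by more than 70 per cent.

cases_today = 1000
# Two assumptions that differ only slightly...
for daily_growth in (0.10, 0.12):
    projected = cases_today * (1 + daily_growth) ** 30
    # ...compound into very different outcomes a month out.
    print(f"{daily_growth:.0%} daily growth -> ~{projected:,.0f} cases in 30 days")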

There’s also the issue of people picking and choosing models they like to support their world view. People who think Covid is no big deal may well just find the model that suggests the lowest number of deaths and cling to it.

And our politicians? Well, recently both Covid-19 Response Minister Chris Hipkins and National Party Covid-19 response spokesman Chris Bishop have voiced comparable scepticism about modelling. Hipkins compared modelling to weather forecasting, telling Newstalk ZB: ‘‘Some nights they say it’s going to rain tomorrow, and it turns out being a nice sunny day.’’

That said, both acknowledged modelling was useful – better than having nothing at all.

Are models still useful then?

It’s fair to ask: if models offer up a range of scenarios with considerable uncertainty, just how useful are they at this point in time?

In an interview with Stuff last week, University of Otago (Wellington) epidemiologist Professor Nick Wilson said looking at comparable countries overseas may offer up better results in early 2022.

On this point Plank said: ‘‘There’s only so far that international comparisons can take you, because different places do have different circumstances.’’ We all know we have uniquely vulnerable populations in South Auckland.

I asked the Ministry of Health and the Covid-19 Group within the Department of the Prime Minister and Cabinet (DPMC) to explain what they use modelling for. They told me recent modelling helped inform the Reconnecting New Zealand plan, and that it helps with planning for beds, testing and workforce preparation. District health boards used the data to plan locally.

They both said modelling has limitations, and that they also factor in a range of other sources – like experiences from other countries.

This, to be clear, echoes what Covid-19 Modelling Aotearoa say on their website. They’re fairly blunt about all this stuff: ‘‘A model on its own can’t tell you what to do, but it can help weigh up the pros and cons of alternative options.’’

ILLUSTRATION/KATHRYN GEORGE: Professor Shaun Hendy speaks as Prime Minister Jacinda Ardern and Director-General of Health Dr Ashley Bloomfield look on during a Covid announcement in the Beehive theatrette.
