Experts acknowledge data limitations
THE world is arguably in its strongest position in history to fight a pandemic, as data collection, skills, expertise and technology have advanced significantly over the centuries.
Yet countries from every corner of the globe are battling to contain the spread of Covid-19, which has infected millions and killed hundreds of thousands of people within a few months.
This week, international and local multidisciplinary experts explained during a webinar, hosted by the University of Johannesburg (UJ) and titled “Data and Delusion after Covid-19”, that many of the shortcomings of both First and Third World countries’ approaches to the deadly virus related to data.
The information, which is typically numerical and collected through observation, is what world leaders have relied on to create lockdown regulations in a bid to curb the spread of the virus.
Data is also used to impose other measures that seek to protect citizens’ lives while keeping economies going.
But as Professor Alex Broadbent, the director of UJ’s Institute for the Future of Knowledge (IFK) and chair of the webinar, said, the problem with the data was one of quantity over quality.
“The scientific research machine is many times larger than ever before when facing serious pandemics such as the Asian flu in the middle of the 20th century.
“Our ability to share and analyse data has increased astonishingly as well, so how then can we not have enough data? And how can we apparently be no better able to understand our current situation and predict its outcome?
“Is this a case of misconception, the case of science being a victim of the unrealistic expectations created by its own success?”
Those engaged in the debate, including Professor Charis Harley, an academic in UJ’s Faculty of Engineering and the Built Environment, said data was used by scientists to create models that predicted how the coronavirus could potentially affect a population.
Harley said the data, which was used to make pronouncements on the lives of millions of different types of people, was subject to human error and could be interest-driven.
“When you are using a mathematical model, there are often hidden assumptions that you are employing to be able to model something,” she said.
“Without stating these assumptions very clearly, and without being open and honest about the thought framework you are employing, using the model for predictive purposes introduces a piece of the puzzle without providing a vision of what that model represents.”
Harley said the urgency demanded in fighting the spread of the pandemic could also be to blame for experts and scientists not always getting their models right.
The time factor also skewed data collection. Harley listed Covid-19 testing delays as an example.
She said that despite the influx of information continuously being released about the pandemic, everyday people do not necessarily get the best possible data available.
“It is very well known that the quality of data surrounding Covid-19 is not that good and it’s not that reliable. We know that it has been withheld by various agencies and we are aware of the fact that there are time lags in the data being released,” she said.
The sentiments were echoed by Broadbent who said the government had not always made data available for others to assess.
“We can’t make good predictions about Covid-19 because we don’t have enough data,” he said.
Professor Olaf Dammann, the vice-chair of Public Health at Tufts University in Boston in the US, said the difficulty with data and models was that there was no one-size-fits-all system that could be employed to fight the virus in every country.
“During this global pandemic, any variable or factor that plays a role locally in countries, or on continents, doesn’t necessarily play a role globally as well.”
He said that while some models were helpful, most were limited because they were global.
“Our interests are local and have to be local, and decision-making needs to be local, and that is very difficult when you are using global models with only a few variables that are affected by the interests of modellers and, therefore, they don’t necessarily give us what we want.”
The panel said that while time was of the essence, the irony was that researchers needed more time to compile more valuable data which could be shared with world leaders and populations at large.
“Research takes its time, it doesn’t rush, and there is a systematic process that must be employed to be able to say something accurately,” Harley said.
Dammann agreed and said even medical experts, despite their qualifications, were experiencing the pandemic for the first time together with the rest of the world.
“This kind of pandemic is something that is really novel to most epidemiologists because we have not had an outbreak of this kind of magnitude for a long, long time and when we did, for example the 1918 Spanish Flu, we didn’t have the data skills we have now.
“We are gathering data to understand what is going on and then we are using this data to make predictions, and it is really hard to model something that we have never encountered before.”