Weekend Argus (Saturday Edition)
Experts acknowledge data limitations
THE world is arguably in its strongest position in history to fight a pandemic as data collection, skills, expertise and technology have advanced significantly over the centuries.
Yet countries from every corner of the globe are battling to contain the spread of Covid-19, which has infected millions and killed hundreds of thousands of people within a few months.
This week, international and local multidisciplinary experts explained in a webinar, hosted by the University of Johannesburg (UJ) and titled “Data and Delusion after Covid-19”, that many of the shortcomings in both First and Third World countries’ approaches to the virus concerned data.
The information, which is typically numerical and collected through observation, is what world leaders have relied on to create lockdown regulations in a bid to curb the spread of the virus.
But as Professor Alex Broadbent, director of UJ’s Institute for the Future of Knowledge and chair of the webinar, said, the problem with data was one of quantity over quality.
“The scientific research machine is far larger than it was when facing pandemics such as the Asian flu in the middle of the 20th century. Our ability to share and analyse data has increased astonishingly as well, so how then can we not have enough data? And how can we apparently be no better able to understand our current situation and predict its outcome?”
Those engaged in the debate, including Professor Charis Harley, an academic in UJ’s Faculty of Engineering and the Built Environment, said data was used by scientists to create models that predicted how the coronavirus could potentially affect a population.
Harley said the data, which was used to make pronouncements on the lives of millions of different types of people, was subject to human error and could be interest-driven.
“When you are using a mathematical model, there are often hidden assumptions that you are employing to be able to model something,” she said.
“Without stating these assumptions very clearly, and without being open and honest about the thought framework you are employing when you use the model for predictive purposes, you introduce a piece of the puzzle but you are not providing a vision of what that model represents.”
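Harley’s point can be sketched with a toy SIR (Susceptible-Infected-Recovered) epidemic model, a standard textbook construction rather than any model discussed in the webinar. The parameter values below are made up for illustration; the point is that the model’s assumptions are written down explicitly instead of left hidden:

```python
def sir_model(s0, i0, r0, beta, gamma, days, dt=0.1):
    """Simulate a basic SIR epidemic with simple Euler steps.

    Assumptions stated explicitly, in Harley's spirit:
      - homogeneous mixing: everyone contacts everyone equally
      - constant transmission (beta) and recovery (gamma) rates
      - closed population: no births, deaths, or migration
      - recovery confers permanent immunity (no reinfection)
    Change any one of these and the predictions change too.
    """
    n = s0 + i0 + r0
    s, i, r = float(s0), float(i0), float(r0)
    history = [(s, i, r)]
    for _ in range(int(days / dt)):
        new_infections = beta * s * i / n * dt
        new_recoveries = gamma * i * dt
        s -= new_infections
        i += new_infections - new_recoveries
        r += new_recoveries
        history.append((s, i, r))
    return history

# Illustrative run with hypothetical parameters (not fitted to Covid-19):
trajectory = sir_model(s0=999, i0=1, r0=0, beta=0.3, gamma=0.1, days=160)
peak_infected = max(i for _, i, _ in trajectory)
```

Even this simple sketch shows why transparency matters: the predicted peak depends entirely on assumed values of beta and gamma, which in practice must be estimated from exactly the kind of imperfect data the panellists described.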
Harley said the urgency demanded in fighting the spread of the pandemic could also explain why experts and scientists did not always get their models right.
The time factor also skewed data collection. Harley listed Covid-19 testing delays as an example.
She said that despite the flood of information released about the pandemic every day, people did not necessarily get the best data available.
“It is very much well known that the quality of data surrounding Covid-19 is not that good and it’s not that reliable. We know that it has been withheld by various agencies and we are aware of the fact that there are time lags in the data being released,” she said.
Professor Olaf Dammann, the vice-chair of Public Health at Tufts University in Boston in the US, said the difficulty with data and models was that there was no one-size-fits-all system that could be employed to fight the virus in every country.
“During this global pandemic, any variable or factor that plays a role locally in countries, or on continents, doesn’t necessarily play a role globally as well.”
He said that while some models were helpful, most were limited because they were global.
“Our interests are local and have to be local, and decision-making needs to be local, and that is very difficult when you are using global models with only a few variables that are affected by the interests of modellers and, therefore, they don’t necessarily give us what we want.”