Mail & Guardian

Shaky data skews literacy results

South African education does have a good story to tell — but not the one told by misleading statistics

- Nic Spaull

Every few years South Africa participates in international tests of reading, maths and science to see what learners know and how this is changing over time. These tests are independently set and usually comparable over time. Every four or five years we test our grade nines in maths and science and our grade fours and fives in reading (Progress in International Reading Literacy Study, Pirls), and every six years our grade six learners in reading and maths (Southern and Eastern African Consortium for Monitoring Educational Quality, Sacmeq).

This year the results of these assessments are being released to the public. The 2015 grades four, five and nine results will be released in November and December and the 2013 grade six results were presented to Parliament earlier this month.

In what should have been the biggest news of the post-apartheid period, Parliament heard that the primary education system improved faster than any other education system in the history of international testing — that is, since 1967. Our alleged improvement was more than twice that of the fastest improving country in the world, Brazil.

To be specific, grade six pupils’ test scores improved by more than 0.9 standard deviations between 2007 and 2013, or the equivalent of an extra three years’ worth of learning. To put this in perspective, this is the same as taking Thailand’s or Mexico’s education system and making it equal to Finland’s or Canada’s in six years.

It makes for a great story, or a wish granted by a fairy godmother, but not for plausible results from a psychometrically rigorous international test. Note that it is not only South Africa that experienced these colossal “gains” but all Sacmeq countries, which is even more suspicious.

A big part of the alleged Sacmeq improvements arises from the different methodologies employed in 2007 and 2013, making the results incomparable until they are properly equated.

The results presented to Parliament compare data from 2007 and 2013, yet the way these results were calculated in each period was not the same, and I should know. I was appointed by Sacmeq earlier this year to analyse the results for the international Sacmeq report.

After analysing the data I raised a number of serious technical concerns that significantly affect the comparability and validity of the findings, especially the fact that the weaker learners had been excluded from the final analysis.

I advised the Sacmeq secretariat to address these concerns before publishing the results, because publishing them as they stood would be misleading. Based on the subsequent response from the Sacmeq secretariat indicating that this would not happen, I said that I could not, in good conscience, continue with the analysis, and I resigned on technical grounds in August.

The issues I raised have not been addressed — the results presented to Parliament were the same as those I identified as problematic. While this was going on I emailed the department of basic education to flag my concerns and cautioned against publishing the results.

The department was shocked by the unprecedented improvements. In the presentation to Parliament it said: “Given the significant improvements, the South African national research team requested Sacmeq to double-check the results and were subsequently reassured on their accuracy.” This is simply not good enough.

The lack of comparability between 2007 and 2013 is so glaringly obvious one doesn’t need inside knowledge of the data to see how implausible the results are.

At the same time that the learner reading scores soared (rising by 0.9 standard deviations), the teacher reading scores plummeted (dropping by 0.8 standard deviations), which is extremely peculiar.

If we are to believe the results, by 2013 basically all South African learners could read, with illiteracy rates dropping from 27% in 2007 to 3% in 2013.

This is at odds with Pirls, the other international test South Africa does.

In 2011 it showed 29% of grade four students were reading-illiterate and 58% could not read for meaning, confirming a host of smaller studies showing the same thing.

If we dig a little deeper, the department’s presentation to Parliament apparently showed that the biggest improvers were Limpopo and the Eastern Cape. These are the very same provinces that were placed under administration (Section 100) in 2011 because they were dysfunctional. To use the minister’s own words, these are the education system’s “pockets of disaster”. Yet Sacmeq would have us believe that illiteracy in Limpopo has been eradicated, dropping from 49% in 2007 to 5% in 2013. In stark contrast, our other international test, Prepirls, showed that, of the more than 2 900 grade fours tested in Limpopo in 2011, 50% were reading-illiterate and 83% could not read for meaning.

The sad thing about all of this is that it does seem that South Africa is really improving — other reliable evidence points to this — but not nearly as fast as the Sacmeq test scores would have us believe.

According to the presentation, the Sacmeq questionnaire data also shows that learners’ access to their own textbooks increased substantially over the period, from 45% to 66% for reading textbooks and from 36% to 66% for maths textbooks. This is good news.

In the latest turn of events the department explained that the results presented to Parliament were “preliminary”, that an “extensive verification process” is underway, and that it is “fully aware of the issues raised in this regard”.

Why then did it choose to go ahead and present questionable results to Parliament?

Apparently researchers — aka me — have “misled the public” and my motives are “unclear”.

There is nothing unclear about my motives — there is a major technical concern and the public should not be misled into trusting these results presented to Parliament.

There is also no uncertainty about whether the Sacmeq results should have been presented to Parliament: they should not have been, while so much doubt remains about the comparability of the results.

The department has been aware of the serious technical concerns about the results since I emailed a number of members of the department’s research team many months ago. I drew attention to these problems and cautioned against publishing any results until they could be rectified.

What I do not understand is why the department would undermine its own technical credibility by presenting questionable results to Parliament.

I would also not be surprised if the Sacmeq data — once comparable — did show an improvement in line with those of other studies.

Soon we will also have the 2015 Pirls results for grades four and five as another data point to verify what is going on.

In South African education there is probably a good story to tell, so why muddy the waters by reporting impossible improvements based on dodgy data?

The department and Sacmeq must make sure the results of Sacmeq 2007 and 2013 are strictly comparable before reporting any further results and causing additional confusion.
