NAPLAN has lost sight of its main purpose
DUE to a number of challenges with this year’s NAPLAN test, many people are questioning its value. From computer glitches for online test-takers to high absentee rates in year 9, it is difficult to know whether to trust the results.
While we are told to interpret the results with care, it is easy to see why the public is losing faith in the system. The concerns are real. For standardised testing to be reliable, strict procedures must be followed. Such procedures guarantee that all students have fair and equal conditions when sitting for the test.
What we know about this year’s test is that many students didn’t have that opportunity. So we must use tremendous caution when reviewing the results and we must refrain from making major policy decisions on the basis of such outcomes. One of the greatest problems is the transition from paper tests to online tests. This year, about 50 per cent of students took the online test and 50 per cent took the paper test. We have been told to trust that the two tests are comparable, yet we don’t know exactly how the results have been made comparable.
Perhaps a bigger problem, however, is that many of the online students faced a number of computer issues during their testing time. Right now, we have no idea how many students were affected by computer glitches, or to what degree. Some students dealt with minor disruptions, while others had to completely start over on another day.
That means the “fair and equal” conditions necessary for making a test trustworthy have been violated. Even minor disruptions can lead to frustration, anxiety and apathy, especially among students who already deal with test anxiety. That leads us to question whether we should even take this year’s results seriously.
What is even more troubling is how state politicians are proposing massive policy changes on the basis of these questionable results. Victorian parents should be deeply troubled by the proposal of Education Minister James Merlino to “ensure our year 9 students are more engaged” by linking performance on NAPLAN to their future job prospects.
That suggestion is misguided for many reasons. First, it assumes that year 9 student performance on NAPLAN is strongly affected by student motivation. While conflating poor motivation with poor performance might make sense at first glance, there is little actual evidence to support that levels of year 9 disengagement from NAPLAN are any different from those seen in years 3, 5 or 7.
Second, we see moves to further increase the stakes associated with NAPLAN performance as worrying. Decades of research, both in Australia and internationally, have found that attaching higher stakes to testing does nothing to improve the performance being measured. The research finds that high stakes do, however, increase the likelihood of unintended, and often perverse, consequences. Such negative outcomes have included teaching to the test and a narrowing of the curriculum, as well as broader concerns around student anxiety.
To this end, we see the suggested introduction of a “proficiency certificate” tied to year 9 student NAPLAN results as entirely unnecessary. That possibly harmful move would only ratchet up the already considerable pressure faced by students and families.
What’s more, such measures will, in all likelihood, produce no improvements to student performance on NAPLAN.
We also find the highly politicised debate around NAPLAN entirely unhelpful to addressing the core business of student learning. For more than a decade, NAPLAN performance has served as a political football, a reason for governments and oppositions of all political persuasions to blame one another. The response from Tanya Plibersek, Shadow Minister for Education and Training, to the 2019 NAPLAN results was to note that, “on some measures, Russia is achieving better than Australia”.
Such commentary is entirely unhelpful and distracts us from attending to what is actually required — achieving more equitable funding and more equitable conditions for students and schools across Australia.
Perhaps this is a good time to recall the original intention of NAPLAN, which was to “take the temperature” of the Australian education system more broadly. We can admire pursuits to better understand how the overall system is operating and whether particular areas of education need greater attention. However, over the past decade, we have seen a widening distance between the original purpose of NAPLAN and how it is actually being used.
New proposals that only increase the stakes, without any evidence that they address the right problem, take us further away from what the tests were designed to do.
It is time we looked at systemic conditions and stopped blaming individual students, teachers or schools.