AQ: Australian Quarterly

Gonski 2.0: A controlled flight into terrain


There are many reasons why aircraft crash and burn. Often the explanation lies in a combination of a mechanical problem and the flight crew's response. One of the most baffling types of crash is a controlled flight into terrain. This occurs when an aircraft that is not experiencing any mechanical problem, and is under the complete control of the pilot, is flown into the ground. The latest Gonski Report (Through Growth to Achievement: Report of the Review to Achieve Educational Excellence in Australian Schools, inevitably christened Gonski 2.0) provides a spectacular example of a crash and burn after flying straight into terrain.

The release of the Gonski 2.0 report in early 2018 provoked a chorus of criticism, much of it derisive, itemising its reliance on platitudes and clichés and its failure to address the terms of reference in any meaningful way.

Particularly baffling is that the Review was established with everything in working order. It had just one job, which was to provide advice on how funding should be used to improve student achievement. It was in the blissful position of not needing to argue the case for extra funding, because $24.5 billion over ten years had already been committed by government. The ‘pilot' enjoyed enviable public esteem. And, not least, there is now an extensive literature, drawing on evidence from high-performing countries, on the policies required for improved educational performance.

What, as they say, could possibly go wrong?

Then-minister Simon Birmingham promised that the Gonski recommendations would be implemented. Little of substance has happened since then, but it would be rash to conclude that Gonski 2.0 has been shelved and can safely be left to collect dust.

The Australian Curriculum, Assessment and Reporting Authority (ACARA) has started work on a proposed new curriculum that appears heavily influenced by Gonski 2.0. Agreement on a new method of funding Catholic and independent schools has prompted Opposition promises of an additional $15 billion for public schools, but these ‘funding wars' have been unaccompanied by any evidence that the money will promote better performance.

These developments provide renewed impetus for checking whether Gonski 2.0 was on the right track for improved achievement by our schools. We start by summarising the worrisome levels of student performance that provided the rationale for Gonski 2.0. We then review the international evidence on school performance, and follow this by asking whether Gonski 2.0 was consistent with that evidence.

Australia's school performance: same old story but a fresh perspective

Australia's poor academic achievement is neither fake news nor mere politicking over funding. It is a well-known story that has been given fresh and dramatic reinforcement in new work from the World Bank.

We see from Figure 1 that in quantitative terms Australia is performing well. An Australian child can expect to complete 13.8 years of schooling by his or her eighteenth birthday. This quantity of schooling is higher than would be predicted for our income level, and it puts us well into the top rank of countries.

The story becomes less benign when we turn to the issue of how much our children learn in school. A major feature of the World Bank's recent work is the production of a globally comparable database of learning outcomes. Conversion factors are used to put the considerable number of international and regional assessment tests on a common, or harmonised, scale. Figure 2 shows these harmonised scores for the same sample as Figure 1.

We can see that, in contrast to its high quantitative rank, Australia's performance in the quality of learning puts it scarcely better than the middle of the pack.

Australia's position in a league table of assessment scores may be of no great importance in itself. No one suggests that standardised assessments such as PISA fully measure the multiple aspects of learning outcomes. Psychometrically designed achievement tests have, however, become very sophisticated and can reasonably be regarded as a proxy for learning outcomes.

Whether we view learning mainly in educational terms, or in the social and economic terms of human capital and future productivity, there is not much point in sitting behind a school desk if learning falls short. We can measure this directly by combining Figures 1 and 2 to estimate learning-adjusted years of schooling.

From Figure 1, children in Australia can expect to complete 13.8 years of schooling. When adjusted for relative performance on international achievement tests, the amount of effective schooling drops to only 11.6 years, a learning gap of 2.2 years. This is equivalent to saying that Australian students lose more than 2 years of learning due to the inadequate quality of their schooling, effectively learning less than students in Asia and Europe despite similar years of education.
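The adjustment itself is simple arithmetic. A minimal sketch of the World Bank's calculation, assuming its published "advanced attainment" benchmark score of 625, with a harmonised score of roughly 525 for Australia back-calculated from the figures above (not a number quoted in this article):

```python
# Learning-adjusted years of schooling (LAYS): scale expected years of
# schooling by the ratio of a country's harmonised test score to the
# World Bank's advanced-attainment benchmark of 625.

ADVANCED_BENCHMARK = 625  # top-of-scale harmonised score

def learning_adjusted_years(expected_years: float, harmonised_score: float) -> float:
    """Discount years spent in school by relative learning per year."""
    return expected_years * (harmonised_score / ADVANCED_BENCHMARK)

expected = 13.8   # expected years of schooling (Figure 1)
score = 525       # approximate harmonised score implied by the text
lays = learning_adjusted_years(expected, score)
print(f"Learning-adjusted years: {lays:.1f}")        # ~11.6
print(f"Learning gap: {expected - lays:.1f} years")  # ~2.2
```

On these assumptions the 13.8 nominal years shrink to about 11.6 effective years, reproducing the 2.2-year learning gap in the text.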

Severe practical consequences follow from the disparity between our quantitative and qualitative performance. There is clear evidence that learning outcomes, as measured by such tests, matter substantially for economic well-being.


The sources of performance improvement

There is no mystery about the factors conducive to better educational performance. Since the ‘effective schools' literature of the 1980s, there has been a steady accumulation of evidence drawing upon international experience of “what works”. There is no uniquely correct recipe, but a core of agreed performance evidence has emerged.

This evidence is widely accessible, whether it is pathbreaking academic work on the importance of cognitive skills by Hanushek and Woessmann, management insights from consultants McKinsey & Co, or the OECD's statistical analysis drawing upon the massive PISA database. Six key issues make up this core evidence.

1. Effective teachers. Teachers are the most important factor by which policy makers can directly improve student achievement. Today, all teachers in OECD countries are qualified, but just like any other occupational group there is a distribution of effectiveness. The difference between good and bad teachers is very large. On UK evidence, during one year with a very effective maths teacher, pupils gain 40% more in their learning than they would with a poorly performing teacher. The effects of high-quality teaching are especially significant for pupils from disadvantaged backgrounds. Over a school year, these pupils gain 1.5 years' worth of learning with very effective teachers, compared with 0.5 years with poorly performing teachers. In short, for pupils from a low socio-economic background the difference between a good teacher and a bad teacher is a whole year's learning.

2. Adaptive instruction. Terminology varies, but the basic concept is direct instruction, sometimes called explicit teaching. With explicit teaching, the teacher shows students what to do and how to do it. Recognising that learning is a cumulative and systematic process, the teacher decides the learning intentions and success criteria, demonstrates them, and evaluates whether students understand what they have been told. To this basic concept must be added whole-class teaching, which self-evidently means that a teacher teaches an entire class of children at one and the same time. Whole-class teaching is typically delivered through direct instruction. Adaptive instruction describes the process of incorporating individual help into explicit/whole-class teaching when a student has difficulty understanding a topic or task. Kirschner, Sweller and Clark insist that “the empirical research [supporting explicit teaching is] overwhelming and unambiguous.” This conclusion is confirmed in the OECD's list of 38 factors associated with science performance on the 2015 PISA tests: adaptive instruction and direct instruction rank second and third on that list.

3. The importance of cognitive skills. Parents have always known that basic skills are crucial, and economists have long known that a country's development is connected to the skills of its workers. The problem is that we did not have research-based evidence about the contribution of basic cognitive skills. There was, understandably, a tendency to think that raising enrolment rates or keeping children in school for longer would be sufficient. Recent research has filled that gap. Evidence from Hanushek and Woessmann (a precursor to the World Bank's work on the Human Capital Index cited earlier) is both clear and highly influential. From East Asia, with high educational performance and high economic growth, down to sub-Saharan Africa, with low scores on each measure, there is a clear and consistent correlation between mastery of basic cognitive skills and educational/economic performance.

4. High expectations: High expectations are linked with higher performance. Most teachers would assert that they already have high expectations of their students, but research demonstrates that in practice there is a wide range of attitudes. A persistent research finding is that students from disadvantaged backgrounds may achieve less than their potential due to low expectations of their ability.

5. Measurement of effective learning and feedback: Continuing data-driven analysis, assessment and evaluation often underpin school performance. They are critical in monitoring the impact of policy, and they provide a basis for teacher feedback (formative assessment) to the student.

6. Collaboration: Collaborative practices between teachers within and across schools are important features of many high-performing schooling systems. In countries such as Finland and Japan, teachers are encouraged to work together through joint lesson planning, observing each other's lessons, and helping each other improve. In China, teaching and development teams, or Jiaoyanzu, work together within and across schools to plan how the curriculum will be taught, to share learning, and to observe each other's practice.

Performance evidence in a nutshell

These core factors are not separate items on a shopping list: they overlap and reinforce each other. The NSW Centre for Education Statistics and Evaluation points out that providing timely and effective feedback to students (item 5 above) is another element of explicit teaching (item 2). Focusing students' attention on the task at hand and on the way they are processing that task are two effective types of direct feedback.

Similarly, being explicit about the learning goals of a lesson and the criteria for success (item 2) gives high expectations (item 4) a concrete form that students can understand and aim for. As a further example, the literature indicates that teachers are more likely to make effective use of student data (item 5) when working collaboratively (item 6) than when working alone.

No doubt, some highly effective teachers (item 1) are born teachers, but many more can be trained to become effective by adopting better pedagogical practices (item 2) or learning from their colleagues (item 6).

Once we allow for such interactions, it becomes clear that the core performance factors offer an integrated and coherent model of education. It is a model consisting of objective achievement standards, high expectations of all students, a focus on cognitive skills, and explicit teaching by effective and collaborating teachers using direct and adaptive instruction. It is a model grounded in the international evidence and experience.


Performance evidence and the Gonski approach

The core performance factors and their interaction should have provided baseline evidence for Gonski 2.0, but virtually no such evidence appears in the report. Instead of at least reviewing the evidence, Gonski makes the extraordinary claim (page ix) that there is “a lack of research-based evidence on what works best in education”. The consequence is that, in place of a judicious discussion of the international evidence, Gonski 2.0 offers a model in which (to paraphrase the line of argument):

1. The major constraint is the rigidity of curriculum delivery, because all students receive the same fixed year-level diet of knowledge, skill and understanding.


2. Lockstep delivery of the year-level based curriculum makes it difficult to develop teaching programs for students who are above or below year-level expectations. Australian students from low socio-economic backgrounds are less likely to have growth mindsets, that is, a belief that they can succeed if they work hard. At the other end of the spectrum, some students may not be challenged enough.

3. Many students in our schools are not realising their full potential because our school system prevents teachers from putting individualised, growth-focused teaching and learning into practice.

4. Therefore (so the argument runs) Australia should move from a year-based curriculum to a curriculum expressed as learning progressions independent of year or age. Instead of content and achievement standards, Australia would adopt a structured roadmap of long-term learning progress.

5. To support this, a new online formative assessment system would give teachers the tools with which to identify individual learning growth.

6. Shifts in technology and jobs are changing the balance of the skills our students need to develop, so there should be increased emphasis on general capabilities in the curriculum.

Some of this argument is incontestable. It has long been understood that socio-economic differences have a major bearing on academic performance. Teaching a class of widely differing abilities is very demanding. These strengths acknowledged, for the most part the Gonski model is either wrong or not supported by any evidence.


Teachers and pedagogy

Aside from much verbiage (“teachers deserve greater recognition and higher esteem”), Gonski has little to say about the role of teachers or pedagogy in explaining Australia's low performance. Despite extensive international evidence about the role of teachers and pedagogy, the report does not explore the issue of alternative approaches to teaching. This leaves the report advocating a gee-whiz technological fix for the assessment system, while nowhere outlining the pedagogical approaches to be used for the ensuing interventions.

This is a serious omission, because in this country we persist with a teaching approach that is known to be ineffective. For many years inquiry-based teaching has been the predominant approach in Australia's schools.

Inquiry learning is a constructivist, student-centred approach, with the teacher as facilitator and the students themselves making meaning. Guided inquiry may set parameters for class activity, but the essence is that students are actively involved, often through small-group activity, in constructing their own understanding and learning.

It might be thought that a small-group, student-centred approach was ideally suited to engaging groups of varying abilities in a class, thereby extracting maximum performance. Nothing could be further from the truth. Inquiry-based learning ranks near the bottom (34th out of 38) of the OECD's performance-enhancing factors, and in fact has a strong negative association with performance scores.

Gonski is simply wrong in asserting that “it is impractical to expect that the same curriculum content can adequately cater to each student's different learning needs”. Whole-class direct instruction has been the dominant style in most Asian countries, and a major feature of East Asian test results is that they do not in general show the long tail of non-performing students seen in so many other countries.

The spectacular results achieved by Noel Pearson with direct instruction for Indigenous students are entirely consistent with the international evidence. That evidence is unambiguous: whereas adaptive and direct instruction rank near the top in their measured impact on educational effectiveness, inquiry-based teaching ranks near the bottom.

There is abundant evidence that socio-economic (dis)advantage is a major determinant of educational performance. But hand-wringing about social disadvantage achieves nothing. There is not much that any of us can do to change our parents, and planned policy changes in socio-economic structure take years to materialise. Direct and adaptive instruction are effective means of doing something about the problem – indeed, on the OECD's ranking they are by far the most important means of doing something useful to overcome social disadvantage.

Mention direct instruction in any faculty of education and the tea-room will erupt as though you are advocating a return to Dotheboys Hall and Wackford Squeers. With direct and adaptive instruction ranking second and third in the OECD's list of 38 factors associated with science performance, we are long past the time for a grown-up discussion of explicit instruction in Australia.

General capabilities

The recommendation to emphasise general capabilities rather than specific cognitive skills is back to the future with a vengeance. In 2013, the Draft National Curriculum had to be rewritten because it consisted largely of unsupported rhetoric about general capabilities and cross-curriculum themes.


It was not difficult to see that a national curriculum which filtered maths, science and literacy subjects through a perspective of general capabilities would not come close to giving our students the necessary level of achievement. Zombie-like, general capabilities have risen from the dead in Gonski 2.0, with claims that “general capabilities need to be at the core of our curriculum” (page 38).

No sensible person wants an education system that lacks, say, the study of history, or that pays no attention to wider personal development in drama, music and art. The importance of critical and creative thinking, personal and social capability, and ethical understanding is well understood. That much accepted, there is unambiguous evidence (cited earlier as item 3) that what counts for our long-term wellbeing is high performance in the mainstream subject areas.

Subject content knowledge is sometimes dismissed as rote learning and set in opposition to critical thinking, but general capabilities need to build upon specific subject knowledge, not replace it. Critical thinking processes depend on some knowledge of the topic. Schwartz has pointed out, waspishly but accurately, that “children cannot learn to be critical thinkers until they have actually learned something to think about”.

This is exactly the approach taken by Singapore in its recently announced reform package. Singapore has long had a reputation for academic excellence, but the system is not known for encouraging critical thinking processes. Singapore now wants to produce more well-rounded students.

Crucially, this will not be at the expense of continued high performance in specific subjects: in some grades students “will be exposed to new subjects and/or higher content rigour and expectation”. Despite these requirements, students will have more time for self-directed thinking and to “develop 21st century competencies”, because a substantial reduction in the number of school-based assessments and high-stakes examinations will free up much class time presently taken up with cramming for the tests.

This careful balancing of assessments, specific subjects and general capabilities, undertaken from a position of great educational strength, is a far cry from waffle about giving general capabilities pride of place in the Australian curriculum. Doing so will further reduce our students' achievement.

A new system of formative assessment

There is convincing evidence that data-driven assessment and feedback are vital for student performance. There can, of course, be too much testing as well as too little. There are indications that Australian parents welcome both the diagnostic information about their child and the school performance data provided by NAPLAN, but there is constant debate.

The United States and Israel have reduced the amount of testing and, as we have seen, Singapore is following suit from 2019. The problem is not so much the frequency of testing as such: the dilemma is that assessments often do double duty, partly as formative assessment for each student but also as high-stakes performance indicators for each school.

It follows that the frequency of assessments should depend on a judicious appraisal of the evidence, so it is extraordinary that Gonski proposed a new and more elaborate formative assessment tool based on no evidence at all. This new tool would switch from NAPLAN's measurement of achievement to measuring a student's learning progression, or growth.

There is not a shred of evidence that the rigidity of curriculum delivery is the major explanation of low academic performance, or that assessment geared to more flexible learning progressions will fix the problem. Gonski 2.0 nowhere poses, let alone answers, the question of why many Asian and European countries operate a traditional curriculum based on year-by-year assessment, yet score in the top 10 on the 2015 PISA.

Putting ‘snapshot' achievement data online is one thing. Assessment of growth, or learning progressions, for the entire curriculum is quite another. Making it usable by teachers is yet another. Assessment scales can be hard to interpret: if Year 5 students in one school score 50 scale points below students in another school, this means very little to most teachers or parents.

Comparisons are further complicated because assessment scales are non-linear: in general, students show greater increases in scores in earlier rather than later years of schooling. Comparing the relative progress of different groups of students can be misleading unless we know the starting point for each group.

There are ways to solve these technical issues. It has become standard in the research literature to measure student progress by converting assessment scores to equivalent years of progress. The Grattan Institute has used this technique with NAPLAN data, but a glance at the technical calculations demonstrates that this is indeed a research tool. Its interpretation needs more statistical finesse than the average school teacher or parent is likely to possess. And, like most statistical calculations, it works well when we compare groups, whereas measurement errors limit its applicability to individual students.

The Gonski vision of teachers arriving in the classroom and jumping nimbly online to look up curriculum-wide “achievement data calibrated against learning progressions, to diagnose a student's current level of knowledge, skill and understanding, [and] to identify the next steps in learning to achieve the next stage in growth” has good entertainment value. It is, for the most part, fantasy.

It goes without saying that a student who is not academically gifted but who is doing his or her damnedest to make progress needs to be encouraged and supported. This is quite different from substituting progress for objective attainment standards across the entire system. Learning progressions can be characterised as a proposal to supplant objective standards of attainment with the notion of the personal best.

Nowhere in the report is there any recognition of the paradox that a focus on relative progress can worsen measured performance. Relative measures can lead you not to expect enough of your students, by accepting a ceiling on achievement that is far below what is possible. Low expectations then become self-fulfilling.


Salvaging the wreckage

So where does the Government go from here? A further lengthy inquiry is probably not the answer. Perhaps we could seek world's best practice by holding a competition. That sounds flippant, but the marketing slogan writes itself: "We sought the world's best for the architecture of the Opera House and got one of the great buildings of the 20th century; now we want the best educational architecture for the new century".

One vital issue needs to be considered in any future review of the allocation of school funding. Gonski 2.0 seems to have depended heavily on submissions from the State departments of education, with many of its proposals apparently originating in the State administrations. It is entirely proper that State thinking should figure prominently, but the problem lies in what is revealed about that thinking.

As Hewett has noted, Gonski 2.0 has unwittingly revealed that most State departments of education remain “devoted to education fads long since discarded in other countries”.

Proposals for general capabilities, learning progressions and a new system of formative assessment all appear to be based on State submissions.

This is where the single best recommendation from Gonski 2.0 comes in. A research and evidence institute to provide practical advice for teachers, school leaders and decision makers to drive better practice should be implemented as urgently as possible. It should be at the national level.

It is clear from the report that we cannot rely on State-based administrations to develop the necessary policies for evidence-based performance improvement. At State level, only the NSW Centre for Education Statistics and Evaluation ‘has form'. The model could be the Productivity Commission in Canberra, whose recommendations are not always accepted but which has a reputation for analytical, evidence-based work.

Finally, barely tackled in the Review is the question of how we actually deliver programs for performance improvement. It is clear from the core performance factors that improvements must be made at the school level, with a focus on teachers and pedagogy. That much is obvious, but there is an arithmetic ‘wrinkle'.

In 2017 there were some 282,000 full-time equivalent teachers in Australia. Annual entry into the profession varies, but between 2016 and 2017 an additional 5,600 were employed. Improvement in teaching methods, such as adaptive instruction or phonics for reading, will be painfully slow if we rely on changes to what is taught in pre-service teacher education. Without major investment in professional development for existing teachers, it will take many years for proven better ways of teaching to percolate through the system.
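The scale of the wrinkle is easy to sketch. Treating the 5,600 figure as a rough annual intake (it is in fact net growth, so true gross entry is somewhat higher) and ignoring attrition, a deliberately crude back-of-envelope calculation runs:

```python
# Back-of-envelope: how slowly do pre-service reforms percolate through
# the teaching workforce? Figures from the text; the constant-intake,
# no-attrition assumption is an illustrative simplification.
workforce = 282_000      # full-time-equivalent teachers, 2017
annual_entrants = 5_600  # additional teachers employed, 2016-17

entry_rate = annual_entrants / workforce
years_to_half = 0.5 / entry_rate  # years until post-reform entrants make up
                                  # half the workforce (linear approximation)
print(f"Annual entry rate: {entry_rate:.1%}")
print(f"Years for new entrants to reach half the workforce: {years_to_half:.0f}")
```

At roughly 2% entry a year, it would take on the order of a quarter of a century for reformed pre-service training alone to reach even half the workforce, which is the case for professional development made in the next paragraph.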

Australia already offers a wide variety of professional development courses, but survey evidence indicates that Australian teachers are less likely than teachers elsewhere to report favourably on the classroom benefits. From the variety of courses offered, it seems likely that much professional development in Australia lacks focus and has little relevance to the core business of performance-oriented classroom teaching.

An important element of the additional expenditure promised by Canberra should be a reform of professional development, making such development the umbrella for updating existing teachers on adaptive instruction, collaboration with colleagues, the importance of cognitive skills, phonics, classroom management, and inculcating evidence-based approaches.


IMAGE: © Simon Fraser University

IMAGE: © Brisbane City Council-flickr
