Thom Marchbank

Part two: NAPLAN Results Haven’t Collapsed – But Media Interpretations Have

This is the second instalment of our two-part series addressing claims made about NAPLAN data. The first part is here.

We begin this section by addressing a comment made in ABC media reporting on 2025 NAPLAN results.

“We’ve seen declines in student achievement despite significant investment in funding of the school system.”

This comment echoes a broader theme that re-surfaces regularly in public discussions of student achievement on standardised tests. There are two aspects of this comment to unpack, and we address each in turn.

No evidence

First, the claim that student achievement is declining is demonstrably untrue if we evaluate NAPLAN data alone. There is no evidence that student achievement in NAPLAN declined between 2008 and 2022 – and indeed there were some notable gains for Year 3 and Year 5 students in several domains. Results from 2023 to 2025 have remained stable across all year levels and all domains.

By contrast, there have been well-documented declines in average achievement in the Reading, Mathematics and Scientific Literacy tests implemented by the Programme for International Student Assessment (PISA). PISA tests are taken by Australian 15-year-old students every three years. The most recent data, from the 2022 assessment round, showed that these declines had flattened out in all three test domains since the 2015 round: in other words, the average decline has not continued after 2015.

There’s plenty of speculation as to why there have been declines in PISA test scores specifically, and there are enough plausible explanations to suggest that no single change in schools, curriculum, pedagogy or funding will reverse this trend. Nonetheless, it is important to highlight the contrast between PISA and NAPLAN and not conflate the two in public discussion about student performance on standardised tests.

Before Gonski, schools were relatively underfunded

The second aspect of the claim above is that increases in school funding should have resulted in improvements in NAPLAN achievement (notwithstanding the fact that average results are not trending downwards). School funding has increased since the publication of the first Gonski report in 2011 and the subsequent government efforts to adequately fund schools as per the agreed model. This is one reason why the total amount of money spent on schooling has increased over the last 10-15 years: prior to Gonski, government schools were relatively underfunded across the board (and many remain so).

A second reason relates to government policies resulting in more children staying in school for longer (arguably a good thing). The 2009 National Report on Schooling in Australia (ACARA, 2009) produced a handy table identifying new state and territory policies aimed at increasing the proportions of students engaged with education, training or employment after the age of 15 (p. 36). For example, in NSW (the largest jurisdiction by student numbers), the new policy from 2010 was as follows:

“(a) From 2010 all NSW students must complete Year 10. After Year 10, students must be in school, in approved education or training, in full-time employment or in a combination of training and employment until they turn 17.”

Students stay at school longer

This and similar policies across all states and territories had the effect of retaining more students in school for longer – which costs more money.

The other reason total school funding has increased is simple: growth in total student numbers. If there are more students in the school system, then schools will cost more to operate.

According to enrolment statistics published on the ACARA website, from 2006 to 2024 the number of children aged 6–15 enrolled in schools increased from 2,720,866 to 3,260,497 – a total increase of 539,631 students, or 20% on 2006 numbers. These gains in total student numbers were gradual but consistent year on year. It is a pretty simple calculation: more students = higher cost to schools.
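For readers who want to check the arithmetic, here is a minimal sketch in Python using the ACARA enrolment figures quoted above:

```python
# ACARA enrolment figures quoted above (children aged 6-15).
enrolled_2006 = 2_720_866
enrolled_2024 = 3_260_497

increase = enrolled_2024 - enrolled_2006
pct_increase = increase / enrolled_2006 * 100

print(f"Total increase: {increase:,} students")   # 539,631
print(f"Relative to 2006: {pct_increase:.1f}%")   # ~19.8%, i.e. roughly 20%
```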

Students who ‘start behind, stay behind’

The design of the NAPLAN tests provides an excellent opportunity to test claims that children who start with poor achievement never ‘catch up’. Interestingly, the Australian Education Research Organisation (AERO) published a report in 2023 that calls this idea into question. The AERO report demonstrated that of all the children at or below the National Minimum Standard (NMS) in Year 3 (187,814 students in their national sample), only 33-37% remained at or below the NMS to Year 9.

We can explain this another way using the terminology from the new NAPLAN proficiency standards. Of the ~10% of students highlighted as needing additional support in Year 3, it is likely that around one third will need that additional support throughout their schooling – or around 3.5% of the total student population. The remainder did in fact make gains and moved up the achievement bands as they progressed from Year 3 to Year 9.
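A quick back-of-the-envelope check of these proportions (using the midpoint of AERO’s 33-37% range; the variable names are ours):

```python
# Proportions as summarised in the text.
needs_support_year3 = 0.10   # ~10% of students flagged as needing additional support
remain_below = 0.35          # midpoint of the 33-37% who stay at or below NMS to Year 9

persistent = needs_support_year3 * remain_below
catch_up = needs_support_year3 * (1 - remain_below)

print(f"Persistently needing support: ~{persistent:.1%} of all students")  # ~3.5%
print(f"Catch up by Year 9:           ~{catch_up:.1%} of all students")    # ~6.5%
```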

AERO’s analyses supported other research that had used different methods to analyse longitudinally matched NAPLAN data. This research also showed no evidence that students starting at the bottom of NAPLAN distributions in Year 3 fell further behind. In fact, on average, students starting with the poorest achievement made the most progress to Year 9.

Sweeping inaccurate claims

Consistently supporting students who need additional help throughout their school years is something that teachers do and will continue to do as part of their core business. Making sweeping claims that are not supported by the available data is problematic and doesn’t ultimately support schools and teachers to do their jobs well. 

In recent weeks, there have been some excellent and thoughtful pieces calling for a more careful interpretation of NAPLAN data, for example here and here. It is disappointing to see the same claims recycled in the media year after year, when published, peer-reviewed research and sophisticated data analyses don’t support the conclusions. 

Sally Larsen is a senior lecturer in Education at the University of New England. She researches reading and maths development across the primary and early secondary school years in Australia, interrogating NAPLAN. Thom Marchbank is deputy principal academic at International Grammar School, Sydney and a PhD candidate at UNE supervised by Sally Larsen and William Coventry. His research focuses on academic achievement and growth using quantitative methods for understanding patterns of student progress.

Part one: NAPLAN Results Haven’t Collapsed – But Media Interpretations Have

Each year, the release of NAPLAN results is accompanied by headlines that sound the alarm – about policy failures, teacher training and classroom shortcomings, and ever-steeper slides in student achievement.

In this two-part series, we address four claims that have made the rounds of media outlets over the last couple of weeks. We show how each is, at best, a simplification of NAPLAN achievement data, and that different interpretations (not different numbers) can easily lead to different conclusions. 

Are a third of students really failing to meet benchmarks?

Claims that “one-third of students are failing to meet benchmarks” have dominated recent NAPLAN commentary in The Guardian, the ABC, and The Sydney Morning Herald. While such headlines generate clicks, fuel public concern and make for political soundbites, they rest on a shallow and statistically naive reading of how achievement is reported.

The root of the problem is a change in how a continuous distribution is cut up. 

In 2023, ACARA shifted NAPLAN reporting from a 10-band framework to a new four-level “proficiency standard” model, in conjunction with the test moving from paper-based administration to an online adaptive format.

Under the older system, students meeting the National Minimum Standard (NMS) were in Band 2 or above in Year 3, Band 4 or above in Year 5, and Band 6 or above in Years 7 and 9. Those students were not “failing”; rather, they were on the lower end of a normative distribution. Now, with fewer reporting categories (just four instead of ten), the same distribution of achievement is compressed. Statistically, when you collapse a scale with many levels into one with fewer – and set the upper boundaries high – more students will cluster below the top thresholds. But that doesn’t mean their achievement has declined.

A particular target

Take, for example, the 2022 Year 9 Writing results. These were a particular target for media commentary this year and last.

That year, about one in seven Year 9 students (14.3%) were in Band 5 or below – that is, below the National Minimum Standard. In 2025, by contrast, 40.2% of students were in the categories “Needs additional support” and “Developing”, the new categories for perceived shortfall.

This looks like a nearly threefold jump. But it is a jump only if ‘below NMS’ and ‘in the bottom two proficiency groupings’ are considered qualitatively equivalent. That’s a naive interpretation.

But let’s look at how those two groups actually scored.

In 2022, the NMS for Writing in Year 9 was Band 6, which began at a NAPLAN scale score of about 485 (Band 7 began at ~534.9). In 2025, by contrast, the “Developing”/“Strong” boundary sits at 553 – above the 2022 Band 6 cut-off, and roughly equivalent to partway through 2022’s Band 7.

This means that what was previously considered solid performance (Band 6 or low Band 7) is now labelled “Developing”, not “Strong”. The “Strong” range (553–646), by contrast, corresponds roughly to upper Band 7 and most of Band 8 on the 2022 scale, and the “Exceeding” range (647+) overlaps mostly with Band 9 and above. Students now have to reach what was previously considered top-quartile performance to be classified as “Strong” or higher. A student scoring 500 in Year 9 Writing was in Band 6 in 2022 – above the NMS – but now falls short of the “Strong” cut-point of 553, and would be labelled “Developing” even if their skills haven’t changed.
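To make the relabelling concrete, here is a small hypothetical classifier in Python. The cut-points are the ones quoted above (approximated); the function names and structure are ours, not ACARA’s:

```python
def band_2022_year9_writing(score: float) -> str:
    """Approximate 2022 Year 9 Writing bands, using the cut-points quoted above."""
    if score < 485:       # below the Band 6 start: below the NMS
        return "Band 5 or below (below NMS)"
    if score < 534.9:     # Band 6
        return "Band 6 (at or above NMS)"
    return "Band 7 or above"

def proficiency_2025_year9_writing(score: float) -> str:
    """Approximate 2025 Year 9 Writing proficiency levels."""
    if score < 553:       # below the "Developing"/"Strong" boundary
        return "Needs additional support / Developing"
    if score < 647:       # the "Strong" range (553-646)
        return "Strong"
    return "Exceeding"    # 647+

score = 500
print(band_2022_year9_writing(score))         # Band 6 (at or above NMS)
print(proficiency_2025_year9_writing(score))  # Needs additional support / Developing
```

The same score of 500 moves from “at or above the minimum standard” to the bottom grouping, purely through relabelling.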

The boundaries have changed

The results are the same. It’s the boundaries that have changed.
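A toy simulation illustrates the point at the population level. The mean and standard deviation below are invented so that the output lands near the published percentages; they are not ACARA’s actual scale statistics:

```python
import numpy as np

rng = np.random.default_rng(0)

# One fixed, roughly normal achievement distribution. Parameters are
# invented for illustration -- not ACARA's actual scale statistics.
scores = rng.normal(loc=573.6, scale=83, size=1_000_000)

old_nms_cut = 485     # approximate 2022 Band 5/6 boundary (below = below NMS)
new_strong_cut = 553  # 2025 "Developing"/"Strong" boundary

print(f"Below the old NMS cut:      {np.mean(scores < old_nms_cut):.1%}")    # ~14.3%
print(f"Below the new 'Strong' cut: {np.mean(scores < new_strong_cut):.1%}") # ~40.2%
```

Identical scores, identical distribution: only the boundary moved, and the “shortfall” headline nearly tripled.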

What’s also missing in the new scheme is the ability to compare between year levels. The historical bands allowed a direct vertical comparison across year levels; you could say a Year 3 student in Band 6 was at the same proficiency as a Year 5 student in Band 6. Proficiency categories, in comparison, are year-specific labels. “Strong” in Year 3 is not the same raw proficiency as “Strong” in Year 9; it’s the same relative standing within year expectations. 

Vertical comparison is still possible with the raw scale scores, but not with the categories. This shift makes the categories more communicative for parents (“Your child is Strong for Year 5”), but less useful for direct cross-year growth statements without going back to the underlying scale.
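A small sketch of the difference (the scores and cut-points here are hypothetical):

```python
# Two students with the same scale score in different year levels.
students = [
    {"year": 3, "scale_score": 480},
    {"year": 5, "scale_score": 480},
]

# On the old vertical scale, equal scores meant equal measured proficiency.
# Under year-specific proficiency categories, the same score maps to
# different labels because expectations differ by year level.
# These cut-points are invented for illustration only:
strong_cut_by_year = {3: 420, 5: 510}

for s in students:
    label = ("Strong or above" if s["scale_score"] >= strong_cut_by_year[s["year"]]
             else "Developing or below")
    print(f"Year {s['year']}: score {s['scale_score']} -> {label}")
# Year 3: score 480 -> Strong or above
# Year 5: score 480 -> Developing or below
```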

Surprisingly, there has been commentary suggesting that we should expect 90% of students to score in the top two bands – “Strong” and “Exceeding”.

How would that work?

Population distributions always contain variability around their mean, and NAPLAN’s achievement distributions are generally consistent and similar from year to year. Expecting 90% of students to be in the top two categories is therefore statistically unrealistic, especially when those categories represent higher-order competencies.

As we saw earlier, the “Strong” range (553–646) corresponds roughly to upper Band 7 and most of Band 8 on the 2022 scale, and the “Exceeding” range (647+) overlaps mostly with 2022’s Band 9 and above. Students now have to reach what was previously considered top-quartile performance to be classified as “Strong” or higher. This is a very exacting target.

The bell curve

Most assessment distributions are approximately normal (shaped like a “bell curve”), so high achievement on the NAPLAN scale naturally includes fewer students, just as low achievement bands do. Without an intensive increase in needs-based resourcing that might change things such as class sizes and teacher-to-student ratios, the availability of school-based materials, resources and training, or one-to-one support for struggling learners, the shape of the population distribution is likely to remain disappointingly stable.
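To see how exacting the 90% target is, assume (purely for illustration) a normal score distribution; the mean and standard deviation below are invented, not ACARA’s:

```python
from scipy.stats import norm

# Invented parameters for an illustrative normal achievement distribution.
mean, sd = 550, 70

# If the "Strong" cut sits at what used to be top-quartile performance,
# by definition only ~25% of an unchanged distribution can reach it.
strong_cut = norm.ppf(0.75, loc=mean, scale=sd)
print(f"Share at or above a top-quartile cut: {norm.sf(strong_cut, loc=mean, scale=sd):.0%}")

# For 90% of students to clear the cut, the boundary would instead have to
# sit at the 10th percentile of the distribution.
cut_for_90 = norm.ppf(0.10, loc=mean, scale=sd)
print(f"Cut-point needed for 90% to pass: {cut_for_90:.0f} (vs {strong_cut:.0f})")
```

No plausible distribution puts 90% of students above a boundary pegged to former top-quartile performance; the distribution itself would have to shift dramatically.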

The overall message is that students haven’t suddenly lost ground in their learning; we’ve just changed the categories that we use to understand their achievement. To interpret NAPLAN data accurately, we must consider how the framework has shifted, and avoid drawing simplistic conclusions. Even more importantly, this change in reporting is not any kind of evidence of educational failure – it’s just a shift in how we describe student progress, not in what that progress is. Misrepresenting it fuels anxiety without helping schools or students.

Most Year 9 Students Do Not Write at a Year 4 Level

Another headline resurfacing with troubling regularity is the claim that “a majority of Year 9 students write at a Year 4 level”. It’s a “major crisis” in students’ writing skills, reflecting a “thirty year policy failure”. 

This is based on distortions of analysis from the Australian Education Research Organisation (AERO) that selectively focused on persuasive writing as a text type. Unlike the media coverage, AERO’s recent report, “Writing development: What does a decade of NAPLAN data reveal?”, did not conclude that Year 9 students’ writing is at “an all-time low”. Instead, the report found a slight historical decline in writing achievement, and only when examining persuasive writing as a text type.

AERO’s report is somewhat misleading, though, because it focuses only on persuasive writing, even though NAPLAN can and does assess narrative writing as well, and the two text types are treated as equivalent from the point of view of test design.

In fact, in publicly available NAPLAN data from the Australian Curriculum, Assessment and Reporting Authority (ACARA), and in academic analyses of this data undertaken last year, Australian students’ Writing achievement – taken as both persuasive and narrative writing – has been quite stable over time. 

Consistent for more than a decade

For example, without separating narrative from persuasive writing, mean Year 9 NAPLAN Writing achievement is quite stable, with 2022 representing the strongest year of achievement since 2011:

[Figure: Mean Year 9 NAPLAN Writing achievement, 2011–2022]

Year 9 students’ average achievement may have been consistent for more than a decade, but what about the claim that the majority of Australian Year 9s write at a Year 4 level?

For Year 5, mean writing achievement ranged from 464 to 484 nationally between 2008 and 2022 on the NAPLAN scale. For Year 3, the mean score range for the same period was 407 to 425. With developmental growth, mean Year 4 achievement might be expected to sit somewhere in the 440 to 450 range.

However, the cut-point for the Year 9 NMS historically tended to be around 484. Why is this important? Because national NAPLAN reporting always supplied the proportion of students falling below the National Minimum Standard (Band 5 and below in Year 9), which tells us how many students are demonstrating lower levels of writing achievement.
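The rough interpolation behind the ‘Year 4 level’ benchmark can be checked directly. This sketch uses midpoints of the quoted ranges and a simple linear interpolation, which is our simplification:

```python
# Midpoints of the mean-score ranges quoted above (2008-2022).
year3_mean = (407 + 425) / 2   # ~416
year5_mean = (464 + 484) / 2   # ~474

# Simple linear interpolation between Year 3 and Year 5 (our simplification).
year4_estimate = (year3_mean + year5_mean) / 2
print(f"Estimated Year 4 mean: ~{year4_estimate:.0f}")  # ~445, inside the 440-450 range

# Historical Year 9 NMS cut-point quoted above.
year9_nms_cut = 484
print(f"Year 9 NMS cut-point: ~{year9_nms_cut}")
# Any Year 9 student at or above the NMS already scores well above the
# estimated Year 4 mean.
```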

Where are most Year 9 students?

In 2022, only 14.3% of Year 9 students were in Band 5 or below (below a NAPLAN scale score of about 484), which was the second-best year on record since 2011, as illustrated in the figure below.

[Figure: Proportion of Year 9 students in Band 5 or below for Writing, 2011–2022]

Contrast that with the 65.9% of students who scored in Band 7 or above in 2022 (cut-point 534.9), clearly indicating that most Year 9 students have writing proficiency far beyond primary levels.

Across 2008–2022, the proportion of Year 9 students in Band 5 and below – that is, below the NMS – ranged from 13.7% to 18.6%. This is quite different from “the majority”.

In Part 2 (published tomorrow) of this series we go on to address the perennial claim that students’ results are declining, the argument that additional school funding should result in ‘better’ NAPLAN results, and the idea that children who start behind will never ‘catch up’.

Sally Larsen is a senior lecturer in Education at the University of New England. She researches reading and maths development across the primary and early secondary school years in Australia, interrogating NAPLAN. Thom Marchbank is deputy principal academic at International Grammar School, Sydney and a PhD candidate at UNE supervised by Sally Larsen and William Coventry. His research focuses on academic achievement and growth using quantitative methods for understanding patterns of student progress.