August 11, 2025

NAPLAN Results Haven’t Collapsed – But Media Interpretations Have

By Thom Marchbank and Sally Larsen

Each year, the release of NAPLAN results is accompanied by headlines that sound the alarm – about policy failures, teacher training and classroom shortcomings, and ever-deepening slides in student achievement. 

In this two-part series, we address four claims that have made the rounds of media outlets over the last couple of weeks. We show how each is, at best, a simplification of NAPLAN achievement data, and that different interpretations (not different numbers) can easily lead to different conclusions. 

Are a third of students really failing to meet benchmarks?

Claims that “one-third of students are failing to meet benchmarks” have dominated recent NAPLAN commentary in The Guardian, the ABC, and The Sydney Morning Herald. While such headlines generate clicks, fuel public concern and make for political soundbites, they rest on a shallow and statistically naive reading of how achievement is reported.

The root of the problem is a change in how a continuous distribution is cut up. 

In 2023, ACARA shifted NAPLAN reporting from a 10-band framework to a new four-level “proficiency standard” model, in conjunction with the test’s move from paper-based delivery to an online adaptive format. 

Under the older system, students meeting the National Minimum Standard (NMS) were in Band 2 or above in Year 3, Band 4 or above in Year 5, and Band 6 or above in Years 7 and 9. Those students were not “failing”; rather, they were on the lower end of a normative distribution. Now, with fewer reporting categories (just four instead of ten), the same distribution of achievement is compressed. Statistically, when you collapse a scale with many levels into one with fewer, more students will cluster below the top thresholds; but that doesn’t mean their achievement has declined.

A particular target

Take, for example, the 2022 Year 9 Writing results. This was a particular target for media commentary this year and last.

That year, about one in seven students (14.3%) scored in Band 5 or below, that is, below the National Minimum Standard. In 2025, by contrast, 40.2% of students fell into “Needs additional support” and “Developing”, the new categories for perceived shortfall. 

This represents a nearly threefold jump. But that holds only if “below the NMS” and the bottom two proficiency groupings are treated as qualitatively equivalent, and that is a naive interpretation.

But let’s look at how those two groups actually scored.

In 2022, the NMS for Writing in Year 9 was Band 6, i.e., a NAPLAN score of roughly 485 or above (with Band 7 starting at ~534.9). In 2025, by contrast, the “Developing”/“Strong” boundary sits at 553: above the 2022 Band 6 cut-off, and roughly equivalent to midway through 2022’s Band 7. 

This means that what was previously considered solid performance (Band 6 or low Band 7) is now seen as “Developing”, not “Strong.” The “Strong” range (553–646), by contrast, corresponds roughly to upper Band 7 and most of Band 8 from the 2022 scale, and the “Exceeding” range (647+) overlaps mostly with Band 9+. Students now have to reach what was previously considered top‑quartile performance to be classified as “Strong” or higher. A student scoring 500 in Year 9 Writing was in Band 6 in 2022 – above the NMS – but now falls just short of “Strong” (cut = 553). That same student would now be labelled as “Developing,” even if their skills haven’t changed.
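The relabelling can be sketched in a few lines. This is illustrative only: the cut-points are the ones quoted above (2022 NMS at roughly 485; 2025 “Strong” at 553 and “Exceeding” at 647), and the mapping is our simplification for the example, not ACARA’s official scheme.

```python
# Illustrative sketch of the relabelling described above.
# Cut-points are the ones quoted in the article, not an official ACARA mapping.

def label_2022(score: float) -> str:
    """2022 framing: was the score at or above the Year 9 Writing NMS (Band 6, ~485)?"""
    return "below NMS" if score < 485 else "at or above NMS"

def label_2025(score: float) -> str:
    """2025 proficiency category, using the boundaries quoted in the article."""
    if score >= 647:
        return "Exceeding"
    if score >= 553:
        return "Strong"
    # The article quotes 553 as the Developing/Strong boundary; the boundary
    # between the bottom two categories is not given, so we group everything
    # below 553 together here.
    return "Developing or below"

# The same score of 500 changes label without any change in skill:
print(label_2022(500))  # "at or above NMS"
print(label_2025(500))  # "Developing or below"
```

The point of the sketch is that nothing about the student changes between the two calls; only the threshold against which the score is read.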

The boundaries have changed

The results are the same; it’s the boundaries that have changed.

What’s also missing in the new scheme is the ability to compare between year levels. The historical bands allowed a direct vertical comparison across year levels; you could say a Year 3 student in Band 6 was at the same proficiency as a Year 5 student in Band 6. Proficiency categories, in comparison, are year-specific labels. “Strong” in Year 3 is not the same raw proficiency as “Strong” in Year 9; it’s the same relative standing within year expectations. 

Vertical comparison is still possible with the raw scale scores, but not with the categories. This shift makes the categories more communicative for parents (“Your child is Strong for Year 5”), but less useful for direct cross-year growth statements without going back to the underlying scale.

Surprisingly, some commentary has suggested that we should expect 90% of students to score in the top two proficiency categories, “Strong” and “Exceeding”. 

How would that work?

Population distributions always contain variability around their mean, and NAPLAN achievement distributions are generally consistent from year to year. Expecting 90% of students to reach the top two categories is therefore statistically unrealistic, especially when those categories represent higher-order competencies. 

As we saw earlier, the “Strong” range (553–646) corresponds roughly to an upper Band 7 and most of Band 8 from the 2022 scale, and the “Exceeding” range (647+) overlaps mostly with 2022’s Band 9+. Students now have to reach what was previously considered top‑quartile performance to be classified as “Strong” or higher. This is a very exacting target. 

The bell curve

Most assessment distributions are approximately normal (shaped like a “bell curve”), so high achievement on the NAPLAN scale naturally includes fewer students, just as low achievement bands do. Without an intensive increase in needs-based resourcing (smaller class sizes and better teacher-to-student ratios, more school-based materials, resources and training, or one-to-one support for struggling learners), the shape of the population distribution is likely to remain disappointingly stable.
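A rough calculation shows the scale of the problem with the 90% expectation. The mean (565) and standard deviation (75) below are hypothetical values, chosen only to sit roughly in line with the 2022 Year 9 Writing proportions quoted earlier; they are not official ACARA parameters.

```python
# Illustrative only: why "90% Strong or Exceeding" is statistically unrealistic
# under a roughly normal score distribution. Mean and SD are assumptions for
# the sketch, not official ACARA figures.
from statistics import NormalDist

scores = NormalDist(mu=565, sigma=75)
strong_cut = 553  # 2025 "Developing"/"Strong" boundary, from the article

share_strong_or_above = 1 - scores.cdf(strong_cut)
print(f"Share at Strong or above: {share_strong_or_above:.1%}")  # roughly 56%

# For 90% of students to clear the cut, the population mean would need to sit
# about 1.28 standard deviations above it:
required_mean = strong_cut + NormalDist().inv_cdf(0.90) * 75
print(f"Mean needed for 90% above the cut: {required_mean:.0f}")  # roughly 649
```

Under these assumptions, the whole population mean would have to rise by the better part of a standard deviation, a shift no education system achieves quickly, which is the point the bell-curve argument is making.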

The overall message is that students haven’t suddenly lost ground in their learning; we’ve just changed the categories that we use to understand their achievement. To interpret NAPLAN data accurately, we must consider how the framework has shifted, and avoid drawing simplistic conclusions. Even more importantly, this change in reporting is not any kind of evidence of educational failure – it’s just a shift in how we describe student progress, and not what that progress is. Misrepresenting it fuels anxiety without helping schools or students.

Most Year 9 Students Do Not Write at a Year 4 Level

Another headline resurfacing with troubling regularity is the claim that “a majority of Year 9 students write at a Year 4 level”. It’s a “major crisis” in students’ writing skills, reflecting a “thirty year policy failure”. 

This claim is based on distorted readings of analysis from the Australian Education Research Organisation (AERO), which focused selectively on persuasive writing as a text type. Unlike the media coverage, AERO’s recent report, “Writing development: What does a decade of NAPLAN data reveal?”, did not conclude that Year 9 students’ writing is at “an all-time low”. Instead, the report found a slight historical decline in writing achievement, and only when examining persuasive writing as a text type. 

AERO’s report is somewhat misleading, though, because it focuses only on persuasive writing, even though NAPLAN can and does assess narrative writing as well, and the two text types are considered equivalent from a test-design point of view. 

In fact, in publicly available NAPLAN data from the Australian Curriculum, Assessment and Reporting Authority (ACARA), and in academic analyses of this data undertaken last year, Australian students’ Writing achievement – taken as both persuasive and narrative writing – has been quite stable over time. 

Consistent for more than a decade

For example, when narrative writing is considered alongside persuasive writing rather than excluded, mean Year 9 NAPLAN Writing achievement is quite stable, with 2022 representing the strongest year of achievement since 2011:

[Figure: mean Year 9 NAPLAN Writing achievement, 2011–2022]

Year 9 students’ average achievement may have been consistent for more than a decade, but what about the claim that the majority of Australian Year 9s write at a Year 4 level?

For Year 5, mean writing achievement has ranged between 464 and 484 nationally between 2008 and 2022 on the NAPLAN scale. For Year 3, the mean score range for the same period was 407 to 425. With developmental growth, mean Year 4 achievement might be expected to be somewhere in the 440 to 450 range. 
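As a quick sanity check on that range, the arithmetic is simple: take the midpoint of each year level’s mean range quoted above, then average them. This is a rough interpolation, not an official figure.

```python
# Rough interpolation of an expected Year 4 mean from the national mean
# ranges quoted above (2008-2022). Not an official NAPLAN statistic.
year3_mid = (407 + 425) / 2   # midpoint of the Year 3 mean range: 416
year5_mid = (464 + 484) / 2   # midpoint of the Year 5 mean range: 474
year4_estimate = (year3_mid + year5_mid) / 2
print(year4_estimate)  # 445.0, consistent with the 440-450 range above
```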

However, the cut-point for the NMS for Year 9 tended to be around 484, historically. Why is this important? Because national NAPLAN reporting always supplied the proportion of students falling below the National Minimum Standard (Band 5 and below in Year 9). This tells us how many students are demonstrating lower levels of writing achievement.

Where are most year nine students?

In 2022, only 14.3% of Year 9 students were in Band 5 or below (or below a NAPLAN scale score of about 484), which was the second-best year on record since 2011, as illustrated in the figure below. Contrast that with the 65.9% of students who scored in Band 7 or above in 2022 (with a cut point of 534.9), clearly indicating most Year 9 students have writing proficiency far beyond primary levels.

Since the NMS reporting captures the proportion of Year 9 students falling into Band 5 and below, we know that between 13.7% and 18.6% of students fell in this range from 2008 to 2022. This is quite different to “the majority”.

In Part 2 (published tomorrow) of this series we go on to address the perennial claim that students’ results are declining, the argument that additional school funding should result in ‘better’ NAPLAN results, and the idea that children who start behind will never ‘catch up’.

Sally Larsen is a senior lecturer in Education at the University of New England. She researches reading and maths development across the primary and early secondary school years in Australia, interrogating NAPLAN. Thom Marchbank is deputy principal academic at International Grammar School, Sydney and a PhD candidate at UNE supervised by Sally Larsen and William Coventry. His research focuses on academic achievement and growth using quantitative methods for understanding patterns of student progress.

Republish this article for free, online or in print, under Creative Commons licence.

5 thoughts on “NAPLAN Results Haven’t Collapsed – But Media Interpretations Have”

  1. AL says:

    Thank goodness we have a top-100 schools list from News Corp, which coincidentally were also the most wealthy.

  2. Tom says:

    At Lake Wobegon “… all the children are above average”. 😉

  3. Thom Marchbank says:

    Thanks for your comment, Tom, but I don’t think that’s at all what NAPLAN shows.

    The data reliably reveal the full range of student achievement and ability, with lower numbers of students at the high and low end of the range. The issue with recent media coverage is that a revamp of how achievement is categorised for NAPLAN has meant a revitalisation of unhelpful and inaccurate “failure” narratives.

  4. Fiona Edwards says:

    Finally someone who understands the Naplan results. I personally do not like Naplan. But I also have not agreed with the news paper reporting.

Comments are closed.