The Hidden NAPLAN Swamp

Is there an education reporter in Australia who is aware of how bad the NAPLAN Numeracy tests are? Is there a reporter who cares that my claim, although true beyond any reasonable doubt, cannot even be properly investigated, cannot readily be proven to be right or wrong? I have seen zero evidence of any reporter ever bothering to look or to ask. Even if a reporter were to try, there is now close to zero chance they would find anything. ACARA can now write whatever nonsense questions they wish, and it is practically impossible to call them on it.

A decade ago, I fought some battles with ACARA to have past NAPLAN tests released under FOI. The battles included some wins and some losses, and more than a little lunacy. The purpose of the FOI battles was not to gain access to the tests: I already had copies of them. The tests were pretty easy to get in those days, with paper copies floating around. My purpose was to make the tests properly visible, for the tests to be readily accessible by the public and the media. I was sick to death of the media reporting on NAPLAN as if the test questions were somehow meaningful and I hoped that if the tests were properly open for critique then at least some proper critique would be forthcoming. That hope was, of course, in vain.

Simply, no one cares that the NAPLAN Numeracy tests are bad. No one cares to look at what “numeracy” really means, particularly in the hands of ACARA’s ignorami. No one cares about the gratuitous, absurd and muddying “real world” scenarios. No one cares that the tests are predominantly trivial, with barely any testing of arithmetic let alone anything of any substance on primes or Pythagoras or algebra or anything else solidly mathematical. No one cares that the test is calculator dominated. No one cares about the errors or the slapdash graphics or the garbled wording. The Numeracy tests are simply accepted, unviewed and unreviewed. Parents and teachers either ignore the test or treat it seriously but as a fait accompli: it is what it is.

The education media could do something about this. They could apply some pressure if they cared, but they don’t and they don’t. It’s easier to remain ignorant, to write, year after year, the same potted stories about results being up or down, about public schools versus private, or to swallow whole some contrived ACARA media release. Still, at least the tests have been public. At least those who have wanted to see can see. Not any longer.

All the tests up to 2016 are now online, but beginning around 2016 the paper tests became significantly more difficult to obtain. Presumably this has been due to a tightening of protocols, with schools being more watchful of spare copies, together with more and more schools conducting the test online. Still, enough of the papers were around for me to hammer a number of questions from the 2017-2021 tests; I chose the worst, of course, but I could have hammered many more. More recent test questions, however, appear to be entirely inaccessible.

Beginning in 2022, NAPLAN went fully online, meaning there is no longer a paper trail, and no longer a natural way to access the questions from a NAPLAN test. Moreover, there is not even a NAPLAN test anymore. Each “test” now consists of a branching sequence of “testlets”, with better-performing and worse-performing kids then being directed to harder and easier testlets.

What are the test questions now like? How is the branching determined? How are the testlets determined? There appears to be no way that anyone outside of ACARA can confirm anything. But there are good guesses one can make and there are definitely things to be said.
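
By way of one such guess, here is a minimal sketch, in Python, of how a branching “testlet” structure typically works in multistage adaptive testing. To be clear, this is a generic illustration only: the testlet names, the cut-off score and the two-way branching are all invented for the example, and ACARA’s actual branching rules are, as noted, not public.

```python
# A generic, hypothetical sketch of multistage adaptive ("testlet") branching.
# NOT ACARA's actual design (which is not public): the testlet names, cut-off
# and structure below are invented purely for illustration.

# Each non-terminal testlet routes a student to an easier or a harder
# follow-up testlet, depending on how well they scored on it.
TESTLETS = {
    "start":  {"branch": {"low": "easier", "high": "harder"}},
    "easier": {"branch": {"low": "easiest", "high": "middle"}},
    "harder": {"branch": {"low": "middle", "high": "hardest"}},
    # "easiest", "middle" and "hardest" are terminal in this toy example.
}

def next_testlet(current: str, score: float, cutoff: float = 0.6):
    """Return the name of the next testlet, or None if there is no further branching."""
    node = TESTLETS.get(current)
    if node is None:
        return None  # terminal testlet: no further routing
    band = "high" if score >= cutoff else "low"
    return node["branch"][band]

# A student scoring 70% on the first testlet is routed to a harder one;
# a student scoring 40% is routed to an easier one.
print(next_testlet("start", 0.7))   # -> harder
print(next_testlet("start", 0.4))   # -> easier
```

One consequence of any such structure, whatever ACARA’s actual version looks like, is that different students sit different sequences of questions, which is exactly why no single “past paper” can exist for these tests.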

To begin, the NAPLAN tests matter to students and to their parents. For a long time governments and ACARA tried to pretend that students had nothing to fear, that NAPLAN was all about gauging schools and not kids, but no one buys that whopper anymore, if they ever did. Private schools typically ask for NAPLAN results; they are assholes for doing so, but they do. The Victorian High-Ability Program substantially bases eligibility on NAPLAN results; they are idiots for doing so, but they do. Everyone other than ACARA treats NAPLAN as if it were a test of the students and so of course it is. It is an appalling test, and anyone in their right mind realises it is an appalling test, but if everyone else treats the test seriously then students and parents have no choice but to treat it seriously as well.

Secondly, since NAPLAN matters it follows that students and parents and teachers have every right to know enough about the test to be able to study meaningfully for it. How are they to do that? Sure, there are thousands of third party grifters offering practice materials of questionable worth. But ACARA has a professional and ethical responsibility to offer guidance. Which they do not. ACARA simply writes,

NAPLAN is not a test of content. The tests are constructed to give students an opportunity to demonstrate skills they have learned over time through the school curriculum, and NAPLAN test days should be treated as just another routine event on the school calendar. The best way you can help your child prepare for NAPLAN is to reassure them that NAPLAN tests are just one part of their school program, and to urge them to simply do the best they can on the day.

This is bullshit and everyone knows it’s bullshit. But at least the 2008-2016 tests are online for students and teachers and parents to use. Except, these tests are of unknown but clearly limited assistance.

All tests, in any context, will evolve. Even if the written or de facto curriculum remains the same, the test will evolve as the educational culture evolves, as writers change. There is simply no reason to assume that a 2016 NAPLAN test is a reliable indicator of a 2024 NAPLAN testlet, and there is good reason to assume otherwise. Moreover, the curriculum has not remained the same, and it matters. ACARA has explicitly written,

Since 2016, NAPLAN testing has been aligned to the Australian Curriculum: English and the Australian Curriculum: Mathematics.

For those just returning from Mars, the Australian Mathematics Curriculum changed radically in 2023, for the way worse. Of particular relevance, the curriculum contains much more emphasis on real-world numeracy twaddle. It would be absurd to imagine that ACARA has ignored its own explicit statement and has failed to adjust the NAPLAN numeracy tests to reflect this increased real worlding. But students and teachers and parents can determine nothing of this adjustment.

Thirdly, it has been suggested to me privately that ACARA’s numeracy testlets are seriously flawed. The suggestion is that, in an attempt to make the harder testlets sufficiently harder, numeracy questions now go outside the curriculum. If true, that is of course entirely inappropriate. Is it true? No one outside of ACARA can have any proper idea. Which brings up the fourth and final, the most obvious and most important point.

Simply, critically, the public and the media must be able to hold ACARA to account. Even if ACARA weren’t demonstrably an arrogant and incompetent body, and they are, even if there were not a solid history of error and ambiguity and inanity on NAPLAN tests, and there is, ACARA still must be accountable, each and every year, for what they are testing. Simply, they are not.

In the next couple of weeks, roughly a zillion kids will sit the NAPLAN tests. They will do this, but the teachers, the parents, the media and the public cannot have any idea what is being tested. This is unconscionable. It is madness.

UPDATE (12/02/24)

ACARA issued a media release today:

SCHOOLS AND STUDENTS SET TO BENEFIT FROM EARLIER NAPLAN RESULTS

Stephen Gniel [yes, that Stephen Gniel], acting Chief Executive Officer at ACARA, said that providing earlier results to schools supports teachers to help every student reach their potential.

“NAPLAN is invaluable as a national assessment that allows us to see whether young Australians are developing critical literacy and numeracy skills for learning, using a national, objective scale.

“Getting the results to schools sooner is a key benefit of having moved the assessment from May to March last year, as well as delivering the tests fully online.”

The media release was stenographed here, here, here, here, here, here, here, here, here and here, and counting.

UPDATE (14/03/24)

I generally refrain from linking to or noting news articles about myself or this blog; it feels too much like self-promotion. But, given the whack I took at education reporters, it is fair to note that Robyn Grace from The Age picked up the story yesterday. It is much appreciated.

UPDATE (15/03/24)

Sarah Duggan from EducationHQ has also written on the issue, here. She included my whack at the education media, which I think is pretty impressive. Again, the coverage is appreciated.

UPDATE (20/03/24)

Rachel Wilson contacted me, indicating that she and her Gonski colleagues have made similar criticisms of NAPLAN. Their paper is here. I’ve had no time to look at it. Also, “Peter Adams”, presumably but not statedly the ex-ACARA Peter Adams, has commented on Duggan’s EducationHQ article, including some remarks about my ignorance of psychometric analysis and whatnot. I’ve responded, but my response appears to be stuck in moderation.

64 Replies to “The Hidden NAPLAN Swamp”

  1. You apparently have a never-ending supply of brick walls, Marty, and one uniquely tough skull. I thank you for what you are doing.

      1. Not no-one. I care, and so do many others. For your interest, at the last secondary school I taught at before retiring in disgust, we teachers weren’t allowed to see our students’ NAPLAN results?????

        1. Thanks, Libby. Of course I understand there are people such as yourself who care. But you are vastly outnumbered.

  2. I don’t know anything about the Victorian High-Ability Program which you mention. According to their website, “In the VHAP maths course, students embrace the “why” of maths, not just the “what”. Students and teachers dive into the philosophical implications of mathematical topics.” But I can’t see any detail. Do any readers know about this program?

    1. I had planned to write about VHAP and looked into it quite a bit. I even had the title picked out: VHAPless. But it appears that VHAP is better than that, at least from my daughter’s experience. The VHAP maths program is not profoundly good, not remotely the same as having a Terry or a Marty in the room (rather than by Zoom), but it appeared to be solidly beneficial. The VHAP English program, less so.

      The main problems with VHAP are that it’s too small a program for the very many kids undernourished by the standard curriculum gruel and, especially, that the selection process is flawed. I had an extended exchange with a very friendly and very patient administrator, who explained as best he could how VHAP selection works. It is a weird combination of NAPLAN results, school recommendations and (strongly) concern for underprivileged kids. Of course I hammered the use of NAPLAN results for the selection, and the administrator replied, weakly, “That’s all we have to go on”. I replied that this argument was just as idiotic as SCoMoFo’s Robodebt farce, the use of yearly income data to infer fortnightly debts because “That’s all we have to go on”. To his credit, the administrator laughed and accepted the point. I also accepted the point that the VHAP guys have nothing else to go on, and it was unfair of me in the post above to call them idiots for using NAPLAN data; it is more accurate to refer to the selection process as idiotic.

      This is a really important point. NAPLAN is not simply an appallingly bad test. It is an appallingly bad test that occupies the only space for national testing, and thus precludes the possibility of having genuinely sensible and useful national testing.

      1. Now that I have PAT results for nearly all my students, I use them in making decisions because PAT data are the only reliable and valid data (according to ACER) that I have available. PAT does make the point that one should use more than a single number. So I complement PAT data with my experience with the students and common sense.

          1. No.

            The booklet “Number and Algebra”, the 5th in the series of 6 entitled “PAT Essentials”, gives examples of how one might structure lessons to bring a student up to the next band in their learning.

            Like NAPLAN, PAT uses adaptive tests. The tests are adapted as the student progresses through the test. All my Year 9 students may do different tests. So while it may be possible for ACER to publish individual questions, there is no such thing as a single past paper for Year 9 students.

            All the above is based on my understanding of PAT.

              1. I make these decisions based on what I know of PAT tests from my reading, webinars I have attended, the data that they produce, and my experience in teaching the students. I accept their claim that PAT tests are valid and reliable – although these terms have many variations. Of course, my conclusions are not “slam-dunk” (as the PAT people acknowledge) but they are based on the best evidence that I have at present.

                1. Testing kids... yeah.
                  “Here, 1.5 years ahead (of what?), but here 2 years behind (behind what?).”
                  It’s about the same student, in maths.
                  Can it be right? Does it make sense?

          2. Any teacher who conducts a PAT has full access to every question that any student has sat, with a reasonably complete set of analytic information.
            If a student you are responsible for has sat a PAT, and you don’t have access to this information, someone in your school needs a solid kicking.

            1. Thank you, CC. Have you in particular seen the questions for the PAT tests sat by your students? If so, can you comment on them?

  3. NAPLAN fits into a clear pattern: people get very excited about data, often when the quality is utterly appalling. Very, very often, when you start to find out how the data have been gathered, what questions have actually been asked and how they were actually answered, you discover that conclusions way above their pay grade are drawn from them.

    But this isn’t about the quality of whatever NAPLAN data measures; it’s about politics. NAPLAN fits a political agenda – that’s all that the politicians who make these decisions care about. It needs a big scandal or scare to change it, I would guess (like the VCE exams), but then how is it different from PISA in that regard?

    One can accept that it starts out not so good and then improves (that’s normal for data implementation). It’s the lack of transparency and of a feedback loop that is the real problem, as you’ve noted, Marty.

    Data makes money and gives the appearance of something – that’s what matters to the decision-makers.

    And – guess how influential teachers are in parliament – bet it’s worse now than this from 2018 – from https://www.aeufederal.org.au/news-media/news/2019/fewer-teachers-entering-federal-politics – “According to the report ‘The way in: representation in the Australian Parliament’ by policy group Per Capita, in 1988 teaching was the most common career path for members of federal parliament. Nearly a quarter (23.2%) of MPs in 1988 (including a third of Labor MPs), came from teaching backgrounds. In 2018 this figure had dropped to only 12.4% (including one fifth of all Labor MPs who still come from teaching backgrounds)”

  4. Marty (and others), if I may take a slight tangent – a point you half-raised that I think is worth exploring the more I think about it: parents have a right to know what these NAPLAN results mean.

    To some parents, seeing the colorful graphics is too much. For others though, if a report says X is underperforming in skill A then I suggest the parents of X have every right to see what tool was used to make this diagnosis and, if they wish, to then seek a second opinion.

    Teachers and students seem to be the enduring center of the NAPLAN discussion each year, but if I remember correctly, the original argument for NAPLAN was to give parents easy to understand information.

    So, how about giving parents the evidence to draw their own conclusions?

    1. Anything that lacks transparency should never be trusted. And anything that works very hard to actively avoid transparency should be trusted even less. It’s a house of cards waiting for a strong enough wind to blow it down.

  5. If you want to see how NAPLAN is actually used by parents – check out the Melbourne Schools Discussion Group on Facebook – a lot of parents take it very, very seriously indeed.

    1. Thanks, JJ. What do you mean “used by”? You mean the parents take the test results seriously, or that they actually use the test results?

      1. They compare the NAPLAN scores of schools assiduously – suspect they don’t care much about the tests themselves.

        1. As I recall, ACARA has said that one should not use NAPLAN results to compare schools.

          I have also heard it said that NAPLAN results affect government funding in Victorian schools. Is this true? If so, how do the results affect funding?

          1. Re: TM comment. If the latter is true, then certainly one should use NAPLAN results to compare schools. But of course, if ACARA says that “one should not use NAPLAN results to compare schools”, then obviously we shouldn’t, because ACARA knows best ….

            Re: JJ comment: “They compare the NAPLAN scores of schools assiduously – suspect they don’t care much about the tests themselves.”
            Of course they do and of course they don’t. It’s all about boasting for many parents. That’s why so many students have tutors, not because they need one but because many parents have a compulsion to compare and feel they have to ‘keep up’.

            Re: Marty update. I wonder why results could not be delivered so quickly in the past.
            If I were cynical I would think this is more about trying to create a narrative: a spandex-wearing superhero with fluttering cape swooping in to save the day. The dawn of a new age. And the media are happy to blow the trumpets, playing the part of the faithful stooge. Perish the thought that tough questions would get asked by the media and that there would be careful analysis. A smart kid could rip this press release to shreds. To me it reeks of a spin-doctored public interview, with any positive educational outcomes a fortunate side effect. But I’m a cynic.

              1. What do you think the purpose of the press release is?
                Who is the intended audience, what is the intended message, and why? At best I think it’s a PR stunt, spin doctoring 101, with the media playing their part as the faithful stooge. (I suppose it takes an extra-special skill to defend the media, towards which the opinion of “lazy and silly” is better directed). At worst, I stand by my cynicism (there are some people at whom I’ll always look through jade-coloured glasses).

                And the question stands – why all of a sudden can results be released so much earlier than in the past? Suddenly all the states have unblocked a drain pipe of some previous impediment? Teachers, parents and students of the (recent) past have been shortchanged.

                1. The press release can have more than one purpose. You just look foolish by refusing to consider that there might be a legitimate purpose.

            1. I have never understood why online tests such as NAPLAN take so long to come out. LANTITE was the same when I did it a few years ago. On the other hand, the results of PAT tests are available almost immediately.

              1. It can be added to the list of ‘Things That Take Too Long To Come Out’.

                I think the answer is a combination of layers of bureaucracy, not caring, little accountability and bureaucratic laziness/inertia. Sheep that simply accept it certainly don’t help either.

                Self-interest and self-preservation are the only factors I can see that motivate bureaucrats to do better. JJ might have more insight.

                1. Speed is not a driver in the public service, unless the Minister wants it – then it’s pretty much the only driver.

                  The role of politics is the most likely driver for anything (these days more than ever). Before anything like this is released, the Minister will have to have been briefed and agreed to its release. Drops in NAPLAN scores etc. will have to have agreed messages etc. so the Minister is prepared. Add to that no real driver for speed and it will take a long time.

                  You used to motivate bureaucrats by relating their work to real world outcomes – that’s harder and harder these days as the politics drives everything.

                  1. It would be good if bureaucrats were held accountable and motivated by genuine non-trivial consequences.

                    (The fact that they’re pushing for a 4 day working week by saying that all the work of a 5 day working week would still get done says a lot).

                    Imagine the outcry if, say, teachers didn’t hand back SACs for a couple of months and kept giving the same lame-ass excuses as these bureaucrats … The buck stops with the Ministers to demand better.

                    1. Hi BiB – I’m afraid it doesn’t work like that – in a world of considerable complexity, it’s impossible to judge and if there is any judgement, it’s those who have the power who make the judgement (ie Ministers/execs etc). If there were accountability, no public servant would ever take any risk, let alone the few taken these days. They already have performance pay for public servants (ie does your manager like you) and it’s as effective as performance pay would be for teachers (except that it drives up public servants’ pay – we had the last laugh on Jeff!).

                      Let me assure you that if the Minister wanted it released quickly, it would be released quickly. In fact the State Liberal Government decided to put timelines on the release of health data – easily done: just reduce even further the already insufficient amount of cleaning and quality control, to speed things up.

                      Be careful what you wish for ….

                    2. Thanks, JJ.

                      I’m referring less to the quick release of data and more to:

                      (a) the quality of the questions (I know bureaucrats don’t write them but they could and should do a better job getting competent writers and vetters),

                      (b) the lack of transparency (such as refusing to make the actual NAPLAN questions available).

                      These are things controlled by bureaucrats. The acting CEO and previous CEOs of ACARA have a very poor track record – recent events with the VCAA exams amply demonstrate how little value some people place on (a) and (b), and the amount of spin they disgorge. I find it astounding that, after the VCAA exam debacles of the last few years, the person with whom the buck stopped is now responsible for the NAPLAN. I’d love to see education reporters ask how and why that happened.

                    3. Hi BiB

                      Fair enough – the main thing I noticed in my 30 years in the place was the decline of the bureaucracy as a result of many factors, particularly the increasing politicisation of the place. In addition, it’s hard to convey just how little experience and competence count anymore in the bureaucracy.

                      To get the right people to write and vet questions, you need people who really understand the subject doing the choosing. There may not be many, if any, of those left. If you’re a generic bureaucrat who’s never done it yourself, it’s very easy to get the wool pulled over your eyes by consultants. While the question writing is in the control of bureaucrats, the release of questions and results may not be – that may very well be a political decision.

                      Unfortunately once you reach a certain level in the bureaucracy (or companies it seems), you seem to always be able to fail upwards.

                    4. Thanks, JJ. But sorry …

                      You are surely wrong about the politicisation of the public service. Despite the thorough investigation and findings of Ombudsman Deborah Glass (OBE), our Premier Jacinta Allan, Treasurer Tim Pallas and Department of Premier and Cabinet Secretary Jeremi Moule categorically deny any suggestion that the public service is politicised. We should trust what our supreme leaders say.

                      And you must surely be wrong about just how little experience and competence count anymore in the bureaucracy. Such a situation is inconceivable because it would be sheer lunacy. I cannot believe our supreme leaders would be so negligent as to let this happen or to facilitate it in any way.

                      The notion that a bureaucrat could easily get the wool pulled over their eyes by consultants is surely ludicrous. These bureaucrats are responsible for advising our supreme leaders of policy that has significant consequences to the public, they are highly competent professionals appointed by our supreme leaders and would never be so easily fooled. They are not gullible chumps.

              2. I could argue that ACARA is competent and has always been concerned to get the NAPLAN results out as quickly as possible, but I forgot to take my LSD this morning. I will make a couple of points, however.

                Firstly, NAPLAN is a number of tests, taken at the same time by a billion kids. Then, the results are analysed to the nth degree, to see if the results for left-handed gay hispanics have gone up or down, or whatever. It’s a *lot* to do, and I would assume way more than PAT or LANTITE.

                Secondly, there have long been strong calls for NAPLAN to be conducted earlier in the year and for the results to be out quicker. Not by me: I voted for never and never. But people cared.

                Thirdly, with NAPLAN now completely online it is obviously much easier to process the answers. Yes, that was true last year as well, but it’s not unreasonable to assume that the processes have improved.

                With all that, I think it’s fair enough for ACARA to issue a press release on the New And Faster NAPLAN. Yes, ACARA also then benefits from the thoughtlessness and fluffiness of the stories, but it’s still a reasonable media release.

                What’s not reasonable is the lazy education media eating it up, and thinking no further, or at all.

                1. I don’t recall NAPLAN asking students which hand is their dominant or preferred…

                  But you make a fair point: a lot of the data analysis being done misses the main point – if the test sucks then the data sucks.

                2. Thanks, Marty.

                  I don’t understand why the results have to be “analysed to the nth degree” before they can be made available to teachers, parents and students. Surely this could/can be done concurrently with the release of results, and findings from the analysis drip fed …? Then again, that would mean results got handed out before they had been thoroughly analysed for anything embarrassing and no opportunity given for spin.

                  Yes, I agree that “it’s fair enough for ACARA to issue a press release on the New And Faster NAPLAN.” What was issued was not a press release in my opinion – it was a trailer for a Christopher Nolan movie about how wonderful ACARA and NAPLAN are. And I’ve read nothing that explains how this wonderful many-weeks-earlier-than-in-the-past release is being accomplished. Maybe the fact that “with NAPLAN now completely online it is obviously much easier to process the answers” is the answer. But I cannot believe that alone explains the huge disparity between past release dates and the release date promised for 2024. I prefer to believe that the ACARA bureaucrats have been forced to take off their cardigans, roll up their sleeves, get rid of the BS and actually do their job properly.

                  1. I’m not for a minute excusing ACARA’s use of the press release, here or ever, in order to manipulate media coverage. Of course they timed the press release to coincide with the beginning of NAPLAN, and of course they did that to generate favourable coverage. I just don’t think ACARA’s use of press releases in this manner is particularly egregious. Pretty much all press releases by government and official bodies are written and timed in the same manner.

                    I pick on ACARA (and other such groups) because their conduct and their product is way beyond standardly bad, even accepting that they are an official body. When ACARA is simply bad in the standard manner, as I think is the case with the press release, I’m not going to make a big deal about it.

                    Compare, for example, ACARA’s latest press release with ACARA’s involvement with Why Maths Must Change. The former was a little greasy but the latter was absolutely outrageous.

                    As for analysing to the nth degree and so on, I can see there is an argument for doing this and that first, instead of that and this. But I think you are too arrogant and presumptuous about how such a large set of data should or can be handled.

                  2. Hi BiB – you did answer your own question: “Then again, that would mean results got handed out before they had been thoroughly analysed for anything embarrassing and no opportunity given for spin.”

                    However there is a valid reason for some delay – in any large data exercise, you need to clean the data, organise it and analyse it etc before you release it. Inevitably there will be weird things that were hard to predict eg a question that in hindsight (to ACARA at least) made no sense.

                    People who haven’t analysed large data sets never realise just how much effort goes into cleaning data, analysing etc and presenting it before it has ever been released. You do need to get your story straight before releasing any data. (Not excusing the time it takes, just explaining that it is by no means instantaneous even with modern computers).

          2. If ACARA does not want people to compare schools using NAPLAN data, why did the MySchool website ever come into being?

            It made comparison very “easy” (in that a lot of people thought comparing schools was easy, but the data was, as we found out over time, incomplete, rather oddly presented and not entirely meaningful in the first place).

            So yes, ACARA says many things. They also do a few things which do not always support what they say.

            1. A more pointed question would be: Why does MySchool present NAPLAN data? The website has other useful data.

                1. ACARA tells us that you cannot compare current NAPLAN data with data of past years. Suppose further that NAPLAN data should not be used to compare schools. Then it appears that the NAPLAN data posted on MySchool is not helpful in any way.

                    1. “This means you can’t compare NAPLAN achievement prior to 2023 to that from 2023 onwards.” (MySchool)

                1. Thanks. Of course NAPLAN will result in teaching to the test. As I keep saying, I don’t think that is a problem, unless the test sucks. Which of course NAPLAN does, but then that is the real problem on which to focus: the test sucks.

                  The Committee had the nous to realise that ACARA accepts no responsibility for anything. I doubt they had the nous to realise the test sucks.

  6. The Robyn Grace article was very good. She reported the facts. It is both troubling and fascinating that ACARA could not give straight answers to simple questions – I think even the general reader can draw the correct conclusions from this.

    I hope that Minister Carroll is asking ACARA the same simple questions and demanding answers, that will in turn translate into significant reform. Maybe Dr Bennett could be convinced to conduct a review into ACARA’s practices, at least when it comes to the NAPLAN.

    1. No. I like Gniel no more than you do. He was bad at VCAA and there is no reason to believe he’ll be any better at ACARA. Your comment was still obtuse. Don’t be so damn lazy.

  7. “Please note in 2023 NAPLAN testing moved from May to March and the NAPLAN scale was reset. This means you can’t compare NAPLAN achievement prior to 2023 to that from 2023 onwards. ” (MySchool)
