The NAPLAN Numeracy Test Test

The NAPLAN Numeracy Test Test is intended for education academics and education reporters. The test consists of three questions:

Q1. Are you aware that “numeracy”, to the extent that it is anything, is different from arithmetic and much less than solid school mathematics?

Q2. Do you regard it as important to note and to clarify these distinctions?

Q3. Are you aware of the poverty in NAPLAN testing numeracy rather than mathematics?

The test is simple, and the test is routinely failed. NAPLAN is routinely represented as testing the “basics”, which is simply false. As a consequence, the interminable conflict between “inquiry” and “basics” has been distorted beyond sense. (A related and similarly distorting falsity is the representation of current school mathematics texts as “traditional”.) This framing of NAPLAN leaves no room for the plague-on-both-houses disdain which, we’d argue, is the only reasonable position.

Most recently this test was failed, and dismally so, by the writers of the Interim Report on NAPLAN, which was prepared for the NSW state government and was released last week. The Interim Report is short, its purpose being to prepare the foundations for the final report to come, to “set out the major concerns about NAPLAN that we have heard or already knew about from our own work and [to] offer some preliminary thinking”. The writers may have set out to do this, but either they haven’t been hearing or they haven’t been listening.

The Interim Report considers a number of familiar and contentious aspects of NAPLAN: delays in reporting, teaching to the test, misuse of test results, and so on. Mostly reasonable concerns, but what about the tests themselves, what about concerns over what the tests are testing? Surely the tests’ content is central? On this, however, at least before a limited correction, the Report implies that there are no concerns whatsoever.

The main section of the Report is titled Current concerns about NAPLAN, which begins with a subsection titled Deficiencies in tests. This subsection contains just two paragraphs. The first paragraph raises the issue that a test such as NAPLAN “will” contain questions that are so easy or so difficult that little information is gained by including them. However, “Prior experimental work by ACARA [the implementers of NAPLAN] showed that this should not be so.” In other words, the writers are saying “If you think ACARA got it wrong then you’re wrong, because ACARA told us they got it right”. That’s just the way one wishes a review to begin, with a bunch of yes men parroting the organisation whose work they are supposed to be reviewing. But, let’s not dwell on it; the second paragraph is worse.

The second “deficiencies” paragraph is concerned with the writing tests. Except it isn’t; it is merely concerned with the effect of moving NAPLAN online on the analysis of students’ tests. There’s not a word on the content of the tests. True, in a later, “Initial thinking” section the writers have an extended discussion of issues with the writing tests. But why are these issues not front and centre? Still, it is not our area and so we’ll leave it, comfortable in our belief that ACARA is mucking up literacy testing and will continue to do so.

And that’s it for “deficiencies in tests”, without a single word about suggested or actual deficiencies of the numeracy tests. Anywhere. Moreover, the term “arithmetic” never appears in the Report, and the word “mathematics” appears just once, as a semi-synonym for numeracy: the writers echo a suggested deficiency of NAPLAN, that one effect of the tests may be to “reduce the curriculum, particularly in primary schools, to a focus on literacy/English and numeracy/mathematics …”. One can only wish it were true.

How did this happen? The writers boast of having held about thirty meetings in a four-day period and having met with about sixty individuals. Could it possibly be the case that not one of those sixty individuals raised the issue that numeracy might be an educational fraud? Not a single person?

The short answer is “yes”. It is possible that the Report writers were warned that “numeracy” is snake oil and that testing it is a foolish distraction, with the writers then, consciously or unconsciously, simply filtering out that opinion. But it is also entirely possible that the writers heard no dissenting voice. Who did the writers choose to meet? How were those people chosen? Was the selection dominated by the predictable maths ed clowns and government hacks? Was there consultation with a single competent and attuned mathematician? It is not difficult to guess the answers.

The writers have failed the test, and the result of that failure is clear. The Interim Report is nonsense, setting the stage for a woefully misguided review that in all probability will leave the ridiculous NAPLAN numeracy tests still firmly in place and still just as ridiculous.

17 Replies to “The NAPLAN Numeracy Test Test”

  1. Is there no room for a Q4: Do you agree that having schools laser-focussed on numeracy is to the detriment of the mathematical education of our children?

    1. Good point, Glen. Although, given that pretty much all reporters/ed-clowns get Q1 wrong, I doubt it’d change the mark distribution.

  2. An honest question (I promise): what is numeracy, actually? I know plenty of things it isn’t, but apart from a quasi-mathematical answer to the term “literacy” I’m not really sure that numeracy is even a thing.

    If I’m correct (and I have yet to hear a convincing argument otherwise) there is no point in trying to test it, because it doesn’t exist!

    1. Hi RF. I think JF’s references spell it out as much as is possible. Numeracy is a thing, but a vague and very limited thing. I think the simplest short answer is to think of numeracy as “functional numeracy”, in the same manner as one has functional literacy: the bare bones of numerical understanding to get by in everyday life.

  3. A test that has an identity crisis is always doomed. This is the case with NAPLAN: it either doesn’t know what it’s meant to be testing, or it thinks it knows but it actually doesn’t.

    And a test – particularly one that has an identity crisis – whose results can be (mis)used by different interest groups to mean different things is always going to cause nothing but trouble.

    And a test that becomes the sole focus of teaching is the worst test of all.

    But what amazes me, and I see this in schools, is that the idiots clearly shown to be incapable of organising a piss-up in a brewery keep getting asked to organise the piss-up.

    1. Thanks, JF. I strongly agree with almost all of that. The incompetence (and mendacity) of Australia’s education authorities is astonishing and important and almost totally unreported.

  4. And just when you thought it couldn’t get any worse, this afternoon the department sent around a bloated mess of documentation about “literacy in mathematics”. I look forward to next week’s email about “numeracy in English” and how students studying Shakespeare need to be taught “numeracy” skills to properly understand iambic pentameter.

    1. Jesus. By “department” do you mean your school department? Is there a way of sharing the documents that won’t get you whacked?

  5. After thinking about the meaning of “numeracy” for some time, I have concluded that “numeracy” should be defined as “applied mathematics”.

    1. That’s very funny and a great insight. Terry, I think you’re absolutely correct. Of course it’s a very impoverished form of “applied” and “mathematics”, more applied arithmetic. But I can’t see that it’s anything else.
