ACARA Didn’t Expect the Sydney Inquisition

Catholic Schools NSW has begun some serious interrogation of ACARA. Led by CEO Dallas McInerney, their chief weapon is sense, and truth. Their two weapons are sense and truth, and intelligence. Their three weapons are sense and truth and intelligence, and a pastoral ethic. Amongst their weapons are …

Shuffling NAPLAN’s Deckchairs

We’re late to this, but it’s gotta be done.

Some State education ministers, unhappy with NAPLAN, commissioned a review, which appeared a couple of weeks ago. The Review considers many contentious aspects of NAPLAN, but we'll focus upon "numeracy", NAPLAN's homeopathic proxy for mathematics. We'll leave others to debate "literacy" and the writing tests, and the timing and reporting and so forth.

So, what might the Review entail for the Son-of-NAPLAN testing of mathematics? Bugger all.

Which was always going to happen. For all the endless public and pundit whining about NAPLAN, which is what prompted this latest Review, none of the criticism has been aimed at the two elephants: the Australian Curriculum, which underpins NAPLAN, is a meatless mass of gristle and fat; and “numeracy” is not mathematics, is not arithmetic, and is barely anything. The inevitable consequence is that NAPLAN amounts to the aimless testing of untestable fuzz. As Gertrude Stein would have put it, there is no there there to test.

This misdirection of the Review was locked in by the terms of reference. No mention is made of “mathematics” or “arithmetic”. The single reference in the Terms to “numeracy” is a deadpan call for “the most efficient and effective system for assessing key literacy and numeracy outcomes”, as if this were a clear and unproblematic and worthy goal. It is no surprise, therefore, that the Review gives almost no attention to arithmetic and mathematics, and the meaning(lessness) of numeracy, and indeed works actively to avoid it.

The Review includes a capsule summary of the Numeracy tests, a superficial comparison to PISA and TIMSS, and Australia’s relative performance over time on these tests (pp 34-42). There is no proper exposition, however, of the nature of the tests. There is nothing reflecting the hard fact that NAPLAN and PISA are pseudomathematical garbage. TIMSS, on the other hand, is decidedly not garbage, so what does the Review do with that? That is interesting.

In what could have been a beacon paragraph, the Review compares the Australian Curriculum with expectations on TIMSS:

“… The Australian Curriculum emphasis on knowing and applying is similar to TIMSS but the Australian Curriculum does not appear to cover some of the complexity that is described in the TIMSS framework under reasoning. It seems likely, too, that a substantial number of TIMSS mathematics items are beyond Australian Curriculum expectations for achievement, especially at the Year 4 level.”

In summary, the emphasis on “knowing and applying” mathematics in the Australian Curriculum is just like TIMSS, as long as you don’t really care how much students know, or how deeply they can apply it, or how successful you “expect” them to be at it. Yep, two peas in a pod.

What does the Review then do with this critical paragraph? Nothing. They just drone ahead. Here is the indication that their entire Review is doomed to idiot trivialities, but they can’t see it, or won’t admit it. They see the smoke, note the smoke, but it doesn’t occur to them, or they just can’t be bothered, or it wasn’t in their idiot Terms, to look for the damn gun.

Finally, what of the recommendations proposed by the Review? There are two that concern the testing of numeracy and/or mathematics. The first, Recommendation 2.2, is that authorities

“Rename the numeracy test as mathematics …”

Huh. And what would be the purpose of that? Well, supposedly it would “clarify that [the test] assesses the content and proficiency strands of the Australian Curriculum: Mathematics”. Except, of course, and as the Review itself acknowledges, the Numeracy test doesn’t do anything of the sort. And, even to the minimal extent that it does, it just points back to Elephant Number One, that the Australian Curriculum is not a properly sound basis for anything.

The isolated suggestion to rename a test is of course a distracting triviality. Alas, not all of the Review’s recommendations are so trivial. Recommendation 2.3 proposes a new test, for

“… [the] assessment of critical and creative thinking in science, technology, engineering and mathematics (STEM) …”

Ah, yes. Let's test whether ten-year-old Tommy is the new Einstein.

This is a monumentally stupid recommendation. Is Jenny the next Newton? Maybe. But can she manipulate numbers and expressions with sufficient speed and accuracy to hold, let alone mould, a substantial mathematical thought in her head? Just maybe you might want to test for that first? Is Carol the new Capote? Then perhaps first teach her the basics of grammar, first teach her how to construct a clear and correct sentence. Then you can think to tease out all the great works inside her. Is Fritz another Mozart? Gee, I dunno. How are his scales? And on and on.

This constant, idiot call for the teaching of and, worse, the testing of "higher order" thinking, this mindless genuflection to reasoning and creativity, is maddening. It ignores the stubborn fact that deeper thought and creativity in any discipline can only be built upon the craft, upon the basic knowledge and skills of that discipline. The Review's call is even worse for that, since STEM isn't a discipline; it's just a foggy con job.

This Godzilla versus Mothra battle is never likely to end, nor likely to end well. On the one side are the numeracy nuts, who can’t see the value of skills independent of some ridiculous application. On the other side are the creativity clowns, who ludicrously denigrate “the basics”, and ludicrously paint NAPLAN as the basics they’re denigrating. Neither side exhibits any understanding of what the basics are, or their critical importance. Neither side has a clue. Which means, unless and until these two monsters somehow destroy each other, we’re all doomed.

The NAPLAN Numeracy Test Test

The NAPLAN Numeracy Test Test is intended for education academics and education reporters. The test consists of three questions:

Q1. Are you aware that “numeracy”, to the extent that it is anything, is different from arithmetic and much less than solid school mathematics?

Q2. Do you regard it important to note and to clarify these distinctions?

Q3. Are you aware of the poverty of NAPLAN testing numeracy rather than mathematics?

The test is simple, and the test is routinely failed. NAPLAN is routinely represented as testing the “basics”, which is simply false. As a consequence, the interminable conflict between “inquiry” and “basics” has been distorted beyond sense. (A related and similarly distorting falsity is the representation of current school mathematics texts as “traditional”.) This framing of NAPLAN leaves no room for the plague-on-both-houses disdain which, we’d argue, is the only reasonable position.

Most recently this test was failed, and dismally so, by the writers of the Interim Report on NAPLAN, which was prepared for the NSW state government and was released last week. The Interim Report is short, its purpose being to prepare the foundations for the final report to come, to "set out the major concerns about NAPLAN that we have heard or already knew about from our own work and [to] offer some preliminary thinking". The writers may have set out to do this, but either they haven't been hearing or they haven't been listening.

The Interim Report considers a number of familiar and contentious aspects of NAPLAN: delays in reporting, teaching to the test, misuse of test results, and so on. Mostly reasonable concerns, but what about the tests themselves, what about concerns over what the tests are testing? Surely the tests’ content is central? On this, however, at least before limited correction, the Report implies that there are no concerns whatsoever.

The main section of the Report is titled Current concerns about NAPLAN, which begins with a subsection titled Deficiencies in tests. This subsection contains just two paragraphs. The first paragraph raises the issue that a test such as NAPLAN "will" contain questions that are so easy or so difficult that little information is gained by including them. However, "Prior experimental work by ACARA [the implementers of NAPLAN] showed that this should not be so." In other words, the writers are saying "If you think ACARA got it wrong then you're wrong, because ACARA told us they got it right". That's just the way one wishes a review to begin, with a bunch of yes men parroting the organisation whose work they are supposed to be reviewing. But, let's not dwell on it; the second paragraph is worse.

The second "deficiencies" paragraph is concerned with the writing tests. Except it isn't; it is merely concerned with the effect of moving NAPLAN online on the analysis of students' tests. There's not a word on the content of the tests. True, in a later, "Initial thinking" section the writers have an extended discussion about issues with the writing tests. But why are these issues not front and centre? Still, it is not our area and so we'll leave it, comfortable in our belief that ACARA is mucking up literacy testing and will continue to do so.

And that’s it for “deficiencies in tests”, without a single word about suggested or actual deficiencies of the numeracy tests. Anywhere. Moreover, the term “arithmetic” never appears in the Report, and the word “mathematics” appears just once, as a semi-synonym for numeracy: the writers echo a suggested deficiency of NAPLAN, that one effect of the tests may be to “reduce the curriculum, particularly in primary schools, to a focus on literacy/English and numeracy/mathematics …”. One can only wish it were true.

How did this happen? The writers boast of having held about thirty meetings in a four-day period and having met with about sixty individuals. Could it possibly be the case that not one of those sixty individuals raised the issue that numeracy might be an educational fraud? Not a single person?

The short answer is “yes”. It is possible that the Report writers were warned that “numeracy” is snake oil and that testing it is a foolish distraction, with the writers then, consciously or unconsciously, simply filtering out that opinion. But it is also entirely possible that the writers heard no dissenting voice. Who did the writers choose to meet? How were those people chosen? Was the selection dominated by the predictable maths ed clowns and government hacks? Was there consultation with a single competent and attuned mathematician? It is not difficult to guess the answers.

The writers have failed the test, and the result of that failure is clear. The Interim Report is nonsense, setting the stage for a woefully misguided review that in all probability will leave the ridiculous NAPLAN numeracy tests still firmly in place and still just as ridiculous.

NAPLAN’s Latest Last Legs

The news is that NAPLAN is on its way out. An article from SMH Education Editor Jordan Baker quotes Boston College’s Andy Hargreaves claiming tests such as NAPLAN are on their “last legs”. This has the ring of truth, since Professor Hargreaves is … who knows? We’re not told anything about who Hargreaves is, or why we should bother listening to him.

Perhaps Professor Hargreaves is correct, but we have reason to doubt it. And, Jordan Baker has been administering NAPLAN’s last rites for a while now. Last year, Baker wrote another article, on NAPLAN’s “death knell”.

Regular readers of this blog would be aware that this writer would love nothing more than to see ACARA sink into the sea, taking its idiotic tests and clueless curriculum with it. But it’s important to understand why, and why the argument for getting rid of NAPLAN is no gimme. It is here that we disagree with Hargreaves and (we suspect) Baker.

Baker quotes Hargreaves on national testing such as NAPLAN and its "unintended impact on students' well-being and learning":

[They include] students’ anxiety, teaching for the test, narrowing of the curriculum and teachers avoiding innovation in the years when the tests were conducted.

Let’s consider Hargreaves’ points in reverse order.

  • Innovation. Yes, a focus on NAPLAN would discourage innovation. Which would be a bad thing if the innovation wasn’t poisonous, techno-fetishistic nonsense. Hargreaves, someone, has to give a convincing argument that current educational innovation is generally positive. We’ll wait. We won’t hold our breath.    
  • Narrowing of the curriculum? We can only wish. The Australian Curriculum is a blivit, a bloated mass of pointlessness.
  • Teaching to the test is of course a bad thing. Except if it isn’t. If you have a good test then teaching to the test is a great thing.
  • Finally, we have to deal with students’ anxiety, a concern for which has turned into an academic industry. All those poor little petals having their egos bruised. Heaven forbid that we require students to struggle with the hard business of learning.

There is plenty to worry about with any national testing scheme: the age of the students, the frequency of the tests, the reporting and use of test results, and the ability to have an informed public discussion of all of this. But all of this is secondary.

The problem with the NAPLAN tests isn’t their “unintended consequences”. The problem with the NAPLAN tests is the tests. They’re shithouse.


NAPLAN’s Numeracy Test

NAPLAN has been much in the news of late, with moves for the tests to go online while simultaneously there have been loud calls to scrap the tests entirely. And, the 2018 NAPLAN tests have just come and gone. We plan to write about all this in the near future, and in particular we’re curious to see if the 2018 tests can top 2017’s clanger. For now, we offer a little, telling tidbit about ACARA.

In 2014, we submitted FOI applications to ACARA for the 2012-2014 NAPLAN Numeracy tests. This followed a long and bizarre but ultimately successful battle to formally obtain the 2008-2011 tests, now available here: some, though far from all, of the ludicrous details of that battle are documented here. Our requests for the 2012-2014 papers were denied by ACARA, then denied again after ACARA’s internal “review”. They were denied once more by the Office of the Australian Information Commissioner. We won’t go into OAIC’s decision here, except to state that we regard it as industry-capture idiocy. We lacked the energy and the lawyers, however, to pursue the matter further.

Here, we shall highlight one hilarious component of ACARA’s reasoning. As part of their review of our FOI applications, ACARA was obliged under the FOI Act to consider the public interest arguments for or against disclosure. In summary, ACARA’s FOI officer evaluated the arguments for disclosure as follows:

  • Promoting the objects of the FOI Act — 1/10
  • Informing a debate on a matter of public importance — 1/10
  • Promoting effective oversight of public expenditure — 0/10

Yes, the scoring is farcical and self-serving, but let’s ignore that.

ACARA’s FOI officer went on to “total” the public interest arguments in favour of disclosure. They obtained a “total” of 2/10.

Seriously.
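For what it's worth, here is the arithmetic that "total" mangles, sketched in plain Python (the three scores are from the list above; the interpretation as simple fractions is ours):

```python
from fractions import Fraction

# ACARA's three public-interest scores, each out of 10.
scores = [Fraction(1, 10), Fraction(1, 10), Fraction(0, 10)]

# A genuine total of three scores out of 10 is a score out of 30:
total = sum(s * 10 for s in scores)   # 2 points ...
assert total == 2                     # ... out of a possible 30.

# And the sensible single out-of-10 figure is the average, not the sum:
average = sum(scores) / len(scores)
assert average == Fraction(1, 15)     # i.e. about 0.7 out of 10.
```

Either way, "a total of 2/10" is not a thing.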

We then requested an internal review, pointing out, along with much other nonsense, ACARA’s FOI officer’s dodgy scoring and dodgier arithmetic. The internal “review” was undertaken by ACARA’s CEO. His “revised” scoring was as follows:

  • Promoting the objects of the FOI Act — 1/10
  • Informing a debate on a matter of public importance — 1/10
  • Promoting effective oversight of public expenditure — 0/10

And his revised total? Once again, 2/10.

Seriously.

These are the clowns in charge of testing Australian students’ numeracy.

NAPLAN’s Numerological Numeracy

This year Australia celebrates ten years of NAPLAN testing, and Australians can ponder the results. Numerous media outlets have reported “a 2.55% increase in numeracy” over the ten years. This is accompanied by a 400% increase in the unintended irony of Australian education journalism.

What is the origin of that 2.55% and precisely what does it mean to have “an increase in numeracy” by that amount? Yes, yes, it clearly means “bugger all”, but bugger all of what? It is a safe bet that no one reporting the percentage has a clue, and it is not easy to determine.

The media appear to have taken the percentage from a media release from Simon Birmingham, the Federal Education and Training Minister. (Birmingham, it should be noted, is one of the better ministers in the loathsome Liberal government; he is merely hopeless rather than malevolent.) Attempting to decipher that 2.55%, it seems to refer to the “% average change in NAPLAN mean scale score [from 2008 to 2017], average for domains across year levels”. Whatever that means.

ACARA, the administrators of NAPLAN, issued their own media release on the 2017 NAPLAN results. This release does not quote any percentages but indicates that the "2017 summary information" can be found at the NAPLAN reports page. Two weeks after ACARA's media release, no such information is contained on or linked on that page, nor on the page titled NAPLAN 2017 summary results. Both pages link to a glossary, to explain "mean scale score", which in turn explains nothing. The 2016 NAPLAN National Report contains the expression 207 times, without once even pretending to explain what it means. The 609-page Technical Report from 2015 (the latest available on ACARA's website) appears to contain the explanation, though the precise expression is never used and nothing remotely resembling a user-friendly summary is included.

To put it very briefly, each student's submitted test is given a "scaled score". One purpose of this is to be able to compare tests and test scores from different years. The statistical process is massively complicated and in particular it includes a weighting for the "difficulty" of each test question. There is plenty that could be queried here, particularly given ACARA's peculiar habit of including test questions that are so difficult they can't be answered. But, for now, we'll accept those scaled scores as a thing. Then, for example, the national average for 2008 Year 3 numeracy scaled scores was 396.9. This increased to 402.0 in 2017, amounting to a percentage increase of 1.29%. The percentage increases from 2008 to 2017 can then be averaged over the four year levels, and (we think) this results in that magical 2.55%.
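As a sketch of how that headline figure appears to be manufactured, here is the computation as we understand it. Only the Year 3 pair (396.9 to 402.0) comes from the reports quoted above; the other three pairs are hypothetical placeholders, included purely to show the averaging step:

```python
# Mean scaled scores by year level, 2008 and 2017.
# Only the Year 3 pair is from ACARA's reports; the other
# three pairs are made-up placeholders for illustration.
scores = {
    "Year 3": (396.9, 402.0),
    "Year 5": (475.0, 490.0),   # hypothetical
    "Year 7": (545.0, 552.0),   # hypothetical
    "Year 9": (582.0, 590.0),   # hypothetical
}

# Step 1: percentage change per year level.
pct_changes = {
    year: 100 * (new - old) / old
    for year, (old, new) in scores.items()
}
print(f"Year 3: {pct_changes['Year 3']:.2f}%")   # roughly 1.3%

# Step 2: average those percentages across the four year levels
# to get a single headline figure of the "2.55%" sort.
headline = sum(pct_changes.values()) / len(pct_changes)
print(f"Headline: {headline:.2f}%")
```

Averaging percentages of incommensurable scaled scores across year levels is itself a questionable move, which is rather the point.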

It is anybody's guess whether that "2.55% increase in numeracy" corresponds to anything real, but the reporting of the figure is simply hilarious. Numeracy, to the very little extent it means anything, refers to the ability to apply mathematics effectively in the real world. To then report on numeracy in such a manner, with a who-the-hell-cares, free-floating percentage, is beyond ironic; it's perfect.

But of course the stenographic reportage is just a side issue. The main point is that there is no evidence that ten years of NAPLAN testing, and ten years of shoving numeracy down teachers’ and students’ throats, has made one iota of difference.

NAPLAN’s Mathematical Nonsense, and What it Means for Rural Peru

The following question appeared on Australia’s Year 9 NAPLAN Numeracy Test in 2009:

y = 2x – 1

y = 3x + 2

Which value of x satisfies both of these equations?

It is a multiple choice question, but unfortunately “The question is completely stuffed” is not one of the available answers.

Of course the fundamental issue with simultaneous equations is the simultaneity. Both equations and both variables must be considered as a whole, and it simply makes no sense to talk about solutions for x without reference to y. Unless y = -7, and there is no reason to assume that, no value of x satisfies both equations. The NAPLAN question is way beyond bad.
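For the record, a minimal check of the algebra, in plain Python (the equations are from the question above; the variable names are ours):

```python
from fractions import Fraction

# The two equations: y = 2x - 1 and y = 3x + 2.
# Equating the right-hand sides: 2x - 1 = 3x + 2, so x = -3.
x = Fraction(-3)
y1 = 2 * x - 1
y2 = 3 * x + 2

# The unique simultaneous solution is the PAIR (x, y) = (-3, -7).
assert y1 == y2 == -7

# For any other y there is no x satisfying both: e.g. with y = 0,
# the first equation forces x = 1/2 and the second forces x = -2/3.
x_from_eq1 = Fraction(0 + 1, 2)   # solve 0 = 2x - 1
x_from_eq2 = Fraction(0 - 2, 3)   # solve 0 = 3x + 2
assert x_from_eq1 != x_from_eq2
```

So x = -3 "satisfies both equations" only as half of the pair (-3, -7); asking for a value of x alone quietly presupposes the value of y that the question never mentions.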

It is always worthwhile pointing out NAPLAN nonsense, as we’ve done before and will continue to do in the future. But what does this have to do with rural Peru?

In a recent post we pointed out an appalling question from a nationwide mathematics exam in New Zealand. We flippantly remarked that one might expect such nonsense in rural Peru but not in a wealthy Western country such as New Zealand. We were then gently slapped in the comments for the Peruvian references: Josh queried whether we knew anything of Peru’s educational system; and, Dennis questioned the purpose of bringing up Peru, since Australia’s NAPLAN demonstrates a “level of stupidity” for all the World to see. These are valid points.

It would have been prudent to have found out a little about Peru before posting, but we seem to be safe. Peru’s economy has been growing rapidly but is not nearly as strong as New Zealand’s or Australia’s. Peruvian school education is weak, and Peru seems to have no universities comparable to the very good universities in New Zealand and Australia. Life and learning in rural Peru appears to be pretty tough.

None of this is surprising, and none of it particularly matters. Our blog post referred to “rural Peru or wherever”. The point was that we can expect poorer education systems to throw up nonsense now and then, or even typically; in particular, lacking ready access to good and unharried mathematicians, it is unsurprising if exams and such are mathematically poor and error-prone.

But what could possibly be New Zealand’s excuse for that idiotic question? Even if the maths ed crowd didn’t know what they were doing, there is simply no way that a competent mathematician would have permitted that question to remain as is, and there are plenty of excellent mathematicians in New Zealand. How did a national exam in New Zealand fail to be properly vetted? Where were the mathematicians?

Which brings us to Australia and to NAPLAN. How could the ridiculous problem at the top of this post, or the question discussed here, make it into a nationwide test? Once again: where were the mathematicians?

One more point. When giving NAPLAN a thoroughly deserved whack, Dennis was not referring to blatantly ill-formed problems of the type above, but rather to a systemic and much more worrying issue. Dennis noted that NAPLAN doesn’t offer a mathematics test or an arithmetic test, but rather a numeracy test. Numeracy is pedagogical garbage and in the true spirit of numeracy, NAPLAN’s tests include no meaningful evaluation of arithmetic or algebraic skills. And, since we’re doing the Peru thing, it seems worth noting that numeracy is undoubtedly a first world disease. It is difficult to imagine a poorer country, one which must weigh every educational dollar and every educational hour, spending much time on numeracy bullshit.

Finally, a general note about this blog. It would be simple to write amusing little posts about this or that bit of nonsense in, um, rural Peru or wherever. That, however, is not the purpose of this blog. We have no intention of making easy fun of people or institutions honestly struggling in difficult circumstances; that includes the vast majority of Australian teachers, who have to tolerate and attempt to make sense of all manner of nonsense flung at them from on high. Our purpose is to point out the specific idiocies of arrogant, well-funded educational authorities that have no excuse for screwing up in the manner in which they so often do.