Foundation Stoned

The VCAA is reportedly planning to introduce Foundation Mathematics, a new, lower-level year 12 mathematics subject. According to Age reporter Madeleine Heffernan, “It is hoped that the new subject will attract students who would not otherwise choose a maths subject for year 12 …”. Which would be good why, exactly?

Predictably, the VCAA is hell-bent on not solving the wrong problem. It simply doesn’t matter whether more students continue with mathematics in Year 12. What matters is that so many students learn bugger all mathematics in the previous twelve years. And why should anyone believe that, at that final stage of schooling, one more year of Maths-Lite will make any significant difference?

The problem with Year 12 that the VCAA should be attempting to solve is that so few students are choosing the more advanced mathematics subjects. Heffernan appears to have interviewed AMSI Director Tim Brown, who noted the obvious, that introducing the new subject “would not arrest the worrying decline of students studying higher level maths – specialist maths – in year 12.” (Tim could have added that Year 12 Specialist Mathematics is also a second-rate subject, but one can expect only so much from AMSI.)

It is not clear that anybody other than the VCAA sees any wisdom in their plan. Professor Brown’s extended response to Heffernan is one of quiet exasperation. The comments that follow Heffernan’s report are less quiet and are appropriately scathing. So who, if anyone, did the VCAA find to endorse this distracting silliness?

But, is it worse than silly? VCAA’s new subject won’t offer significant improvement, but could it make matters worse? According to Heffernan, there’s nothing to worry about:

“The new subject will be carefully designed to discourage students from downgrading their maths study.”

Maybe. We doubt it.

Ms. Heffernan appears to be a younger reporter, so we’ll be so forward as to offer her a word of advice: if you’re going to transcribe tendentious and self-serving claims provided by the primary source for, and the subject of, your report, it is accurate, and prudent, to avoid reporting those claims as if they were established fact.

A Quick Message for Holden and Piccoli

A few days ago the Sydney Morning Herald published yet another opinion piece on Australia’s terrific PISA results. The piece was by Richard Holden, a professor of economics at UNSW, and Adrian Piccoli, formerly a state Minister for Education and now director of the Gonski Institute at UNSW. Holden and Piccoli’s piece was titled

‘Back to basics’ is not our education cure – it’s where we’ve gone wrong

Oh, really? And what’s the evidence for that? The piece begins,

A “back to basics” response to the latest PISA results is wrong and ignores the other data Australia has spent more than 10 years obsessing about – NAPLAN. The National Assessment Program – Literacy and Numeracy is all about going back to basics ...

The piece goes on, arguing that the years of emphasis on NAPLAN demonstrate that Australia has concentrated upon and is doing fine with “the basics”, and at the expense of the “broader, higher-order skills tested by PISA”.

So, here’s our message:

Dear Professors Holden and Piccoli, if you are so ignorant as to believe NAPLAN and numeracy are about “the basics”, and if you can exhibit no awareness that the Australian Curriculum has continued the trashing of “the basics”, and if you are so stuck in the higher-order clouds as to be unaware of the lack of, and critical need for, properly solid lower-order foundations, and if you can write an entire piece on PISA without a single use of the words “arithmetic” and “mathematics”, then please, please just shut the hell up and go away.

The NAPLAN Numeracy Test Test

The NAPLAN Numeracy Test Test is intended for education academics and education reporters. The test consists of three questions:

Q1. Are you aware that “numeracy”, to the extent that it is anything, is different from arithmetic and much less than solid school mathematics?

Q2. Do you regard it as important to note and to clarify these distinctions?

Q3. Are you aware of the poverty in NAPLAN testing numeracy rather than mathematics?

The test is simple, and the test is routinely failed. NAPLAN is routinely represented as testing the “basics”, which is simply false. As a consequence, the interminable conflict between “inquiry” and “basics” has been distorted beyond sense. (A related and similarly distorting falsity is the representation of current school mathematics texts as “traditional”.) This framing of NAPLAN leaves no room for the plague-on-both-houses disdain which, we’d argue, is the only reasonable position.

Most recently this test was failed, and dismally so, by the writers of the Interim Report on NAPLAN, which was prepared for the NSW state government and was released last week. The Interim Report is short, its purpose being to prepare the foundations for the final report to come, to “set out the major concerns about NAPLAN that we have heard or already knew about from our own work and [to] offer some preliminary thinking”. The writers may have set out to do this, but either they haven’t been hearing or they haven’t been listening.

The Interim Report considers a number of familiar and contentious aspects of NAPLAN: delays in reporting, teaching to the test, misuse of test results, and so on. Mostly reasonable concerns, but what about the tests themselves, what about concerns over what the tests are testing? Surely the tests’ content is central? On this, however, apart from one limited qualification, the Report implies that there are no concerns whatsoever.

The main section of the Report is titled Current concerns about NAPLAN, which begins with a subsection titled Deficiencies in tests. This subsection contains just two paragraphs. The first paragraph raises the issue that a test such as NAPLAN “will” contain questions that are so easy or so difficult that little information is gained by including them. However, “Prior experimental work by ACARA [the implementers of NAPLAN] showed that this should not be so.” In other words, the writers are saying “If you think ACARA got it wrong then you’re wrong, because ACARA told us they got it right”. That’s just the way one wishes a review to begin, with a bunch of yes men parroting the organisation whose work they are supposed to be reviewing. But, let’s not dwell on it; the second paragraph is worse.

The second “deficiencies” paragraph is concerned with the writing tests. Except it isn’t; it is merely concerned with the effect of moving NAPLAN online on the analysis of students’ tests. There’s not a word on the content of the tests. True, in a later, “Initial thinking” section the writers have an extended discussion about issues with the writing tests. But why are these issues not front and centre? Still, it is not our area and so we’ll leave it, comfortable in our belief that ACARA is mucking up literacy testing and will continue to do so.

And that’s it for “deficiencies in tests”, without a single word about suggested or actual deficiencies of the numeracy tests. Anywhere. Moreover, the term “arithmetic” never appears in the Report, and the word “mathematics” appears just once, as a semi-synonym for numeracy: the writers echo a suggested deficiency of NAPLAN, that one effect of the tests may be to “reduce the curriculum, particularly in primary schools, to a focus on literacy/English and numeracy/mathematics …”. One can only wish it were true.

How did this happen? The writers boast of having held about thirty meetings in a four-day period and having met with about sixty individuals. Could it possibly be the case that not one of those sixty individuals raised the issue that numeracy might be an educational fraud? Not a single person?

The short answer is “yes”. It is possible that the Report writers were warned that “numeracy” is snake oil and that testing it is a foolish distraction, with the writers then, consciously or unconsciously, simply filtering out that opinion. But it is also entirely possible that the writers heard no dissenting voice. Who did the writers choose to meet? How were those people chosen? Was the selection dominated by the predictable maths ed clowns and government hacks? Was there consultation with a single competent and attuned mathematician? It is not difficult to guess the answers.

The writers have failed the test, and the result of that failure is clear. The Interim Report is nonsense, setting the stage for a woefully misguided review that in all probability will leave the ridiculous NAPLAN numeracy tests still firmly in place and still just as ridiculous.

A PISA Crap

The PISA results were released on Tuesday, and Australians have been losing their minds over them. Which is admirably consistent: the country has worked so hard at losing minds over the last 20+ years, it seems entirely reasonable to keep on going.

We’ve never paid much attention to PISA. We’ve always had the sense that the tests were tainted in a NAPLANesque manner, and in any case we can’t imagine the results would ever indicate anything about Australian maths education that isn’t already blindingly obvious. As Bob Dylan (almost) sang, you don’t need a weatherman to know which way the wind is blowing.

And so it is with PISA 2018. Australia’s mathematical decline is undeniable, astonishing and entirely predictable. Indeed, for the NAPLANesque reasons suggested above, the decline in mathematics standards is probably significantly greater than is suggested by PISA. Greg Ashman raises the issue in this post.

So, how did this happen, and what are we to do? Unsurprisingly, there has been no reluctance from our glorious educational leaders to proffer warnings and solutions. AMSI, of course, is worrying their bone, whining for about the thirtieth time about unqualified teachers. The Lord of ACER thinks that Australia is focusing too much on “the basics”, at the expense of “deep understandings”. If only the dear Lord’s understanding were a little deeper.

Others suggest we should “focus systematically on student and teacher wellbeing”, whatever that means. Or, we should reduce teachers’ “audit anxiety”. Or, the problem is that “teachers [tend] to focus on content rather than student learning”. Or, the problem is a “behaviour crisis”. Or, we should have “increased scrutiny of university education degrees” and “support [students’] schooling at home”. And, we could introduce “master teachers”. But apparently “more testing is not the answer”. In any case, “The time for talk is over”, according to a speech by Minister Tehan.

Some of these suggestions are, of course, simply ludicrous. Others, and others we haven’t mentioned, have at least a kernel of truth, and a couple we can strongly endorse.

No institution we can see, however, no person we have read, seems ready to face up to the systemic corruption, to see the PISA results in the light of the fundamental perversion of mathematics education in Australia. Not a word we could see questioning the role of calculators and the fetishisation of their progeny. Not a note of doubt about the effect of computers. Not a single suggestion that STEM may not be an antidote but, rather, a poison. Barely a word on the “inquiry” swampland that most primary schools have become. And, barely a word on the loss of discipline, on the valuable and essential meanings of that word. What possible hope is there, then, for meaningful change?

We await PISA 2021 with unbated breath.

NAPLAN’s Numeracy Test

NAPLAN has been much in the news of late, with moves for the tests to go online while simultaneously there have been loud calls to scrap the tests entirely. And, the 2018 NAPLAN tests have just come and gone. We plan to write about all this in the near future, and in particular we’re curious to see if the 2018 tests can top 2017’s clanger. For now, we offer a little, telling tidbit about ACARA.

In 2014, we submitted FOI applications to ACARA for the 2012-2014 NAPLAN Numeracy tests. This followed a long and bizarre but ultimately successful battle to formally obtain the 2008-2011 tests, now available here: some, though far from all, of the ludicrous details of that battle are documented here. Our requests for the 2012-2014 papers were denied by ACARA, then denied again after ACARA’s internal “review”. They were denied once more by the Office of the Australian Information Commissioner. We won’t go into OAIC’s decision here, except to state that we regard it as industry-capture idiocy. We lacked the energy and the lawyers, however, to pursue the matter further.

Here, we shall highlight one hilarious component of ACARA’s reasoning. As part of their review of our FOI applications, ACARA was obliged under the FOI Act to consider the public interest arguments for or against disclosure. In summary, ACARA’s FOI officer evaluated the arguments for disclosure as follows:

  • Promoting the objects of the FOI Act — 1/10
  • Informing a debate on a matter of public importance — 1/10
  • Promoting effective oversight of public expenditure — 0/10

Yes, the scoring is farcical and self-serving, but let’s ignore that.

ACARA’s FOI officer went on to “total” the public interest arguments in favour of disclosure. They obtained a “total” of 2/10.

Seriously.

We then requested an internal review, pointing out, along with much other nonsense, ACARA’s FOI officer’s dodgy scoring and dodgier arithmetic. The internal “review” was undertaken by ACARA’s CEO. His “revised” scoring was as follows:

  • Promoting the objects of the FOI Act — 1/10
  • Informing a debate on a matter of public importance — 1/10
  • Promoting effective oversight of public expenditure — 0/10

And his revised total? Once again, 2/10.

Seriously.

These are the clowns in charge of testing Australian students’ numeracy.

A Lack of Moral Authority

The Victorian Minister for Education has announced that the state’s senior school curriculum will undergo a review. The stated focus of the review is to consider whether “there should be a more explicit requirement for students to meet minimum standards of literacy and numeracy …”. The review appears to be strongly supported by industry, with a representative of the Australian Industry Group noting that “many companies complained school leavers made mistakes in spelling and grammar, and could not do basic maths”.

Dumb and dumber.

First, let’s note that Victorian schools have 12 years (plus prep) to teach the 3 Rs. That works out to 4 years (plus prep/3) per R, yet somehow it’s not working. Somehow the standards are sufficiently low that senior students can scale an exhausting mountain of assignments and exams, and still too many students come out lacking basic skills.

Secondly, the Minister has determined that the review will be conducted by the VCAA, the body already responsible for Victorian education.

If the definition of insanity is doing the same thing over and over and expecting different results, then the definition of insane governance is expecting the arrogant clown factory responsible for years of educational idiocy to have any willingness or ability to fix it.

Accentuate the Negative

Each year about a million Australian school students are required to sit the Government’s NAPLAN tests. Produced by ACARA, the same outfit responsible for the stunning Australian Curriculum, these tests are expensive, annoying and pointless. In particular it is ridiculous for students to sit a numeracy test, rather than a test on arithmetic or more broadly on mathematics. It guarantees that the general focus will be wrong and that specific weirdnesses will abound. The 2017 NAPLAN tests, conducted last week, have not disappointed. Today, however, we have other concerns.

Wading into NAPLAN’s numeracy quagmire, one can often find a nugget or two of glowing wrongness. Here is a question from the 2017 Year 9 test:

In this inequality, n is a whole number.

\color{blue} \dfrac{7}{n} < \dfrac{5}{7}

What is the smallest possible value for n to make this inequality true?

The wording is appalling, classic NAPLAN. They could have simply asked:

What is the smallest whole number n for which \color{red} \dfrac{7}{n} < \dfrac{5}{7}\, ?

But of course the convoluted wording is the least of our concerns. The fundamental problem is that the use of the expression “whole number” is disastrous.

Mathematicians would avoid the expression “whole number”, but if pressed would most likely consider it a synonym for “integer”, as is done in the Australian Curriculum (scroll down) and some dictionaries. With this interpretation, where the negative integers are included, the above NAPLAN question obviously has no solution. Sometimes, including in, um, the Australian Curriculum (scroll down), “whole number” is used to refer to only the nonnegative integers or, rarely, to only the positive integers. With either of these interpretations the NAPLAN question is pretty nice, with a solution n = 10. But it remains the case that, at best, the expression “whole number” is irretrievably ambiguous and the NAPLAN question is fatally flawed.
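For the record, the arithmetic behind these claims takes a line or two. For positive n, multiplying both sides by the positive quantity 7n gives

\dfrac{7}{n} < \dfrac{5}{7} \iff 49 < 5n \iff n > \dfrac{49}{5} = 9.8\,,

and so the smallest positive whole number solution is n = 10. For negative n, however, the inequality holds automatically, since \dfrac{7}{n} < 0 < \dfrac{5}{7}; every negative integer is then a solution, and there is no smallest.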

Pointing out an error in a NAPLAN test is like pointing out one of Donald Trump’s lies: you feel you must, but doing so inevitably distracts from the overall climate of nonsense and nastiness. Still, one can hope that ACARA will be called on this, will publicly admit that they stuffed up, and will consider employing a competent mathematician to vet future questions. Unfortunately, ACARA is just about as inviting of criticism and as open to admitting error as Donald Trump.