Two PISA Crap

Below are two “units” (scenarios) used in the PISA 2012 testing of mathematics. The units appeared in this collection of test questions and sample questions, and appear to be the most recent questions publicly available. The units are intended to be read in conjunction with this post (see also here), but of course readers are free to comment here as well. The two units below are, in our estimation, the most difficult or conceptually involved of the PISA 2012 units publicly available; most questions in most other units are significantly more straightforward.

SAILING SHIPS

[PISA unit reproduced in the original post]

REVOLVING DOORS

[PISA unit reproduced in the original post]

The Slanted Tower of PISA

Here’s an interesting tidbit: PISA’s mathematics testing doesn’t test mathematics. Weird, huh? Who knew?

Well, we kinda knew. Trustworthy colleagues had suggested to us that PISA was slanted, but finding out the extent of that slant, like lying-on-the-ground slant, was genuinely surprising. (We’re clearly just too optimistic about the world of education.) Not that we had any excuse for being surprised; there were clues of mathematical crime in plain sight, and it was easy enough to locate the bodies.

The first clues are on PISA’s summary page on “Mathematics Performance”. The title is already a concern; qualifications and elaborations of “mathematics” usually indicate some kind of dilution, and “performance” sounds like a pretty weird elaboration. Perhaps “mathematics performance” might be dismissed as an eccentricity, but what follows cannot be so dismissed. Here is PISA’s summary of the meaning of “mathematical performance”:

Mathematical performance, for PISA, measures the mathematical literacy of a 15 year-old to formulate, employ and interpret mathematics in a variety of contexts to describe, predict and explain phenomena, recognising the role that mathematics plays in the world. The mean score is the measure. A mathematically literate student recognises the role that mathematics plays in the world in order to make well-founded judgments and decisions needed by constructive, engaged and reflective citizens.

The alarms are set off by “mathematical literacy”, a pompous expression that promises more than, while signalling we’ll be getting much less than, straight mathematics. All doubt is then ended with the phrase “the role that mathematics plays in the world”, which is so fundamental that it is repeated verbatim.

What this sums to, of course, is numeracy, the noxious weed that inevitably chokes everything whenever there’s an opportunity to discuss the teaching of mathematics. What this promises is that, akin to NAPLAN, PISA’s test of “mathematical performance” will centre on shallow and contrived scenarios, presented with triple the required words, and demanding little more than simple arithmetic. Before investigating PISA’s profound new world, however, there’s another aspect of PISA that really could do with a whack.

We have been told that the worldly mathematics that PISA tests is needed by “constructive, engaged and reflective citizens”. Well, there’s nothing like irrelevant and garishly manipulative salesmanship to undermine what you’re selling. The puffing up of PISA’s “world” mathematics has no place in what should be a clear and dispassionate description of the nature of the testing. Moreover, even on its own terms, the puffery is silly. The whole point of mathematics is that it is abstract and transferable, that the formulas and techniques illustrated with one setting can be applied in countless others. Whatever the benefits of PISA’s real world mathematics for constructive, engaged and reflective citizens, there will be the exact same benefits for destructive, disengaged psychopaths. PISA imagines Florence Nightingale calculating drip rates? We imagine a CIA torturer calculating drip rates.

PISA’s flamboyant self-promotion seems part and parcel of its reporting. Insights and Interpretations, PISA’s summary of the 2018 test results, comes served with many flavours of Kool-Aid. It includes endless fussing about “the digital world” which, we’re told, “is becoming a sizeable part of the real world”. Reading has changed, since it is apparently “no longer mainly about extracting information”. And teaching has changed, because there’s “the race with technology”. The document wallows in the growth mindset swamp, and on and on. But not to fear, because PISA, marvellous PISA, is on top of it, and has “evolved to better capture these demands”. More accurately, PISA has evolved to better market itself clothed in modern educational fetishism.

Now, to the promised crimes. The PISA test is administered to 15-year-old students (typically Year 9 or, more often, Year 10 in Australia). What mathematics, then, does PISA consider worth asking these fifteen-year-olds? PISA’s test questions page directs to a document containing questions from the PISA 2012 test, as well as sample questions and questions from earlier PISAs; these appear to be the most recent questions made publicly available, and are presumably representative of PISA 2018. In total, the document provides eleven scenarios or “units” from the PISA 2012 test, comprising twenty-six questions.

To illustrate what is offered in those twenty-six questions from PISA 2012, we have posted two of the units here, and a third unit here. It is not difficult, however, to indicate the general nature of the questions. First, as evidenced by the posted units (and the reason for posting them elsewhere), the questions are long and boring; the main challenge of these units is to suppress the gag reflex long enough to digest them. As for the mathematical content, as we flagged, there is very little; indeed, there is less mathematics than there appears to be, since students are permitted to use a calculator. Predictably, every unit is a “context” scenario, without a single straight mathematics question. Then, for about half of the twenty-six questions, we would categorise the mathematics required as somewhere between easy and trivial, involving a very simple arithmetic step (with calculator) or simple geometric idea, or less. About a quarter of the questions are computationally longer, involving a number of arithmetic steps (with calculator), but contain no greater conceptual depth. The remaining questions are in some sense more conceptual, though that “more” should be thought of as “not much more”. None of the questions could be considered deep, or remotely interesting. Shallowness aside, the breadth of mathematics covered is remarkably small. These are fifteen-year-old students being tested, but no geometry is required beyond the area of a rectangle, Pythagoras’s theorem and very simple fractions of a circle; there is no trigonometry or similarity; there is no probability; there are no primes or powers or factorisation; there are no explicit functions, and the only implicit functional behaviour is linear.

Worst of all, PISA’s testing of algebra is evidently close to non-existent. There is just one unit, comprising two questions, requiring any algebra whatsoever. That unit concerns a nurse (possibly a CIA torturer) calculating drip rates. Minus the tedious framing and the pointless illustration, the scenario boils down to consideration of the formula

D = \dfrac{dv}{60n} .

(The meaning of the variables and the formula needn’t concern us here, although we’ll note that it takes a special type of clown to employ an upper case D and a lower case d in the same formula.)

There are two questions on this equation, the first asking for the change in D if n is doubled. (There is some WitCH-like idiocy in the suggested grading for the question, but we’ll leave that as a puzzle for the reader.) For the second question (labelled “Question 3” for God knows what reason), students are given specific, simple values of D, d and n, and they are required to calculate v (with a calculator). That’s it. That is the sum total of the algebra in the twenty-six questions, and that is disgraceful.
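For the record, and so the reader can gauge the depth, here is the entirety of that algebra (a quick sketch of our own, working from the formula above rather than from PISA’s wording). Doubling n gives

\dfrac{dv}{60(2n)} = \dfrac{1}{2}\cdot\dfrac{dv}{60n} = \dfrac{D}{2} ,

so D is simply halved. And, making v the subject,

v = \dfrac{60nD}{d} ,

so the second question amounts to a single substitution and division.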

Algebra is everything in mathematics. Algebra is how we name the quantity we’re after, setting the stage for its capture. Algebra is how we signify pattern, allowing us to hunt for deeper pattern. Algebra is how we indicate the relationship between quantities. Algebra is how Descartes captured geometry, and how Newton and Leibniz captured calculus.

It is not difficult to guess why PISA sidelines algebra, since it is standard, particularly from numeracy fanatics, to stereotype algebra as abstract, as something only within mathematics. But of course, even from PISA’s blinkered numeracy perspective, this is nonsense. You want to think about mathematics in the world? Then the discovery and the analysis of patterns, and the analysis of relationships, of functions, is the heart of it. And what makes the heart beat is algebra.

Does PISA offer anything of value? Well, yeah, a little. It is a non-trivial and worthwhile skill to be able to extract intrinsically simple mathematics from a busy and wordy scenario. But it’s not that important, and it’s hardly the profound “higher order” thinking that some claim PISA offers. It is a shrivelled pea of an offering, which completely ignores vast fields of mathematics and mathematical thought.

PISA’s disregard of algebra is ridiculous and shameful, the final stake in PISA’s thoroughly nailed coffin. It demonstrates that PISA isn’t “higher” or “real”, it is just other, and it is an other we would all be much better off without.

Foundation Stoned

The VCAA is reportedly planning to introduce Foundation Mathematics, a new, lower-level year 12 mathematics subject. According to Age reporter Madeleine Heffernan, “It is hoped that the new subject will attract students who would not otherwise choose a maths subject for year 12 …”. Which is good, why?

Predictably, the VCAA is hell-bent on solving the wrong problem. It simply doesn’t matter whether or not more students continue with mathematics in Year 12. What matters is that so many students learn bugger all mathematics in the previous twelve years. And why should anyone believe that, at that final stage of schooling, one more year of Maths-Lite will make any significant difference?

The problem with Year 12 that the VCAA should be attempting to solve is that so few students are choosing the more advanced mathematics subjects. Heffernan appears to have interviewed AMSI Director Tim Brown, who noted the obvious, that introducing the new subject “would not arrest the worrying decline of students studying higher level maths – specialist maths – in year 12.” (Tim could have added that Year 12 Specialist Mathematics is also a second rate subject, but one can expect only so much from AMSI.)

It is not clear that anybody other than the VCAA sees any wisdom in their plan. Professor Brown’s extended response to Heffernan is one of quiet exasperation. The comments that follow Heffernan’s report are less quiet and are appropriately scathing. So who, if anyone, did the VCAA find to endorse this distracting silliness?

But, is it worse than silly? VCAA’s new subject won’t offer significant improvement, but could it make matters worse? According to Heffernan, there’s nothing to worry about:

“The new subject will be carefully designed to discourage students from downgrading their maths study.”

Maybe. We doubt it.

Ms. Heffernan appears to be a younger reporter, so we’ll be so forward as to offer her a word of advice: if you’re going to transcribe tendentious and self-serving claims provided by the primary source for, and the subject of, your report, it is accurate, and prudent, to avoid reporting those claims as if they were established fact.

A Quick Message for Holden and Piccoli

A few days ago the Sydney Morning Herald published yet another opinion piece on Australia’s terrific PISA results. The piece was by Richard Holden, a professor of economics at UNSW, and Adrian Piccoli, formerly a state Minister for Education and now director of the Gonski Institute at UNSW. Holden and Piccoli’s piece was titled

‘Back to basics’ is not our education cure – it’s where we’ve gone wrong

Oh, really? And what’s the evidence for that? The piece begins,

A “back to basics” response to the latest PISA results is wrong and ignores the other data Australia has spent more than 10 years obsessing about – NAPLAN. The National Assessment Program – Literacy and Numeracy is all about going back to basics ...

The piece goes on, arguing that the years of emphasis on NAPLAN demonstrate that Australia has concentrated upon, and is doing fine with, “the basics”, at the expense of the “broader, higher-order skills tested by PISA”.

So, here’s our message:

Dear Professors Holden and Piccoli, if you are so ignorant as to believe NAPLAN and numeracy are about “the basics”, and if you can exhibit no awareness that the Australian Curriculum has continued the trashing of “the basics”, and if you are so stuck in the higher-order clouds as to be unaware of the lack of, and the critical need for, properly solid lower-order foundations, and if you can write an entire piece on PISA without a single use of the words “arithmetic” or “mathematics”, then please, please just shut the hell up and go away.

The NAPLAN Numeracy Test Test

The NAPLAN Numeracy Test Test is intended for education academics and education reporters. The test consists of three questions:

Q1. Are you aware that “numeracy”, to the extent that it is anything, is different from arithmetic and much less than solid school mathematics?

Q2. Do you regard it important to note and to clarify these distinctions?

Q3. Are you aware of the poverty in NAPLAN testing numeracy rather than mathematics?

The test is simple, and the test is routinely failed. NAPLAN is routinely represented as testing the “basics”, which is simply false. As a consequence, the interminable conflict between “inquiry” and “basics” has been distorted beyond sense. (A related and similarly distorting falsity is the representation of current school mathematics texts as “traditional”.) This framing of NAPLAN leaves no room for the plague-on-both-houses disdain which, we’d argue, is the only reasonable position.

Most recently this test was failed, and dismally so, by the writers of the Interim Report on NAPLAN, which was prepared for the NSW state government and was released last week. The Interim Report is short, its purpose being to prepare the foundations for the final report to come, to “set out the major concerns about NAPLAN that we have heard or already knew about from our own work and [to] offer some preliminary thinking”. The writers may have set out to do this, but either they haven’t been hearing or they haven’t been listening.

The Interim Report considers a number of familiar and contentious aspects of NAPLAN: delays in reporting, teaching to the test, misuse of test results, and so on. Mostly reasonable concerns, but what about the tests themselves, what about concerns over what the tests are testing? Surely the tests’ content is central? On this, however, apart from one limited later correction, the Report implies that there are no concerns whatsoever.

The main section of the Report is titled Current concerns about NAPLAN, which begins with a subsection titled Deficiencies in tests. This subsection contains just two paragraphs. The first paragraph raises the issue that a test such as NAPLAN “will” contain questions that are so easy or so difficult that little information is gained by including them. However, “Prior experimental work by ACARA [the implementers of NAPLAN] showed that this should not be so.” In other words, the writers are saying “If you think ACARA got it wrong then you’re wrong, because ACARA told us they got it right”. That’s just the way one wishes a review to begin, with a bunch of yes-men parroting the organisation whose work they are supposed to be reviewing. But, let’s not dwell on it; the second paragraph is worse.

The second “deficiencies” paragraph is concerned with the writing tests. Except it isn’t; it is merely concerned with the effect of moving NAPLAN online on the analysis of students’ tests. There’s not a word on the content of the tests. True, in a later “Initial thinking” section the writers have an extended discussion about issues with the writing tests. But why are these issues not front and centre? Still, it is not our area and so we’ll leave it, comfortable in our belief that ACARA is mucking up literacy testing and will continue to do so.

And that’s it for “deficiencies in tests”, without a single word about suggested or actual deficiencies of the numeracy tests. Anywhere. Moreover, the term “arithmetic” never appears in the Report, and the word “mathematics” appears just once, as a semi-synonym for numeracy: the writers echo a suggested deficiency of NAPLAN, that one effect of the tests may be to “reduce the curriculum, particularly in primary schools, to a focus on literacy/English and numeracy/mathematics …”. One can only wish it were true.

How did this happen? The writers boast of having held about thirty meetings in a four-day period and having met with about sixty individuals. Could it possibly be the case that not one of those sixty individuals raised the issue that numeracy might be an educational fraud? Not a single person?

The short answer is “yes”. It is possible that the Report writers were warned that “numeracy” is snake oil and that testing it is a foolish distraction, with the writers then, consciously or unconsciously, simply filtering out that opinion. But it is also entirely possible that the writers heard no dissenting voice. Who did the writers choose to meet? How were those people chosen? Was the selection dominated by the predictable maths ed clowns and government hacks? Was there consultation with a single competent and attuned mathematician? It is not difficult to guess the answers.

The writers have failed the test, and the result of that failure is clear. The Interim Report is nonsense, setting the stage for a woefully misguided review that in all probability will leave the ridiculous NAPLAN numeracy tests still firmly in place and still just as ridiculous.

NAPLAN’s Numeracy Test

NAPLAN has been much in the news of late, with moves for the tests to go online while simultaneously there have been loud calls to scrap the tests entirely. And, the 2018 NAPLAN tests have just come and gone. We plan to write about all this in the near future, and in particular we’re curious to see if the 2018 tests can top 2017’s clanger. For now, we offer a little, telling tidbit about ACARA.

In 2014, we submitted FOI applications to ACARA for the 2012-2014 NAPLAN Numeracy tests. This followed a long and bizarre but ultimately successful battle to formally obtain the 2008-2011 tests, now available here: some, though far from all, of the ludicrous details of that battle are documented here. Our requests for the 2012-2014 papers were denied by ACARA, then denied again after ACARA’s internal “review”. They were denied once more by the Office of the Australian Information Commissioner. We won’t go into OAIC’s decision here, except to state that we regard it as industry-capture idiocy. We lacked the energy and the lawyers, however, to pursue the matter further.

Here, we shall highlight one hilarious component of ACARA’s reasoning. As part of their review of our FOI applications, ACARA was obliged under the FOI Act to consider the public interest arguments for or against disclosure. In summary, ACARA’s FOI officer evaluated the arguments for disclosure as follows:

  • Promoting the objects of the FOI Act — 1/10
  • Informing a debate on a matter of public importance — 1/10
  • Promoting effective oversight of public expenditure — 0/10

Yes, the scoring is farcical and self-serving, but let’s ignore that.

ACARA’s FOI officer went on to “total” the public interest arguments in favour of disclosure. They obtained a “total” of 2/10.

Seriously.

We then requested an internal review, pointing out, along with much other nonsense, ACARA’s FOI officer’s dodgy scoring and dodgier arithmetic. The internal “review” was undertaken by ACARA’s CEO. His “revised” scoring was as follows:

  • Promoting the objects of the FOI Act — 1/10
  • Informing a debate on a matter of public importance — 1/10
  • Promoting effective oversight of public expenditure — 0/10

And his revised total? Once again, 2/10.

Seriously.

These are the clowns in charge of testing Australian students’ numeracy.

A Lack of Moral Authority

The Victorian Minister for Education has announced that the state’s senior school curriculum will undergo a review. The stated focus of the review is to consider whether “there should be a more explicit requirement for students to meet minimum standards of literacy and numeracy …”. The review appears to be strongly supported by industry, with a representative of the Australian Industry Group noting that “many companies complained school leavers made mistakes in spelling and grammar, and could not do basic maths”.

Dumb and dumber.

First, let’s note that Victorian schools have 12 years (plus prep) to teach the 3 Rs. That works out to 4 years (plus prep/3) per R, yet somehow it’s not working. Somehow the standards are sufficiently low that senior students can scale an exhausting mountain of assignments and exams, and still too many students come out lacking basic skills.

Second, the Minister has determined that the review will be conducted by the VCAA, the body already responsible for Victorian education.

If the definition of insanity is doing the same thing over and over and expecting different results, then the definition of insane governance is expecting the arrogant clown factory responsible for years of educational idiocy to have any willingness or ability to fix it.

Accentuate the Negative

Each year about a million Australian school students are required to sit the Government’s NAPLAN tests. Produced by ACARA, the same outfit responsible for the stunning Australian Curriculum, these tests are expensive, annoying and pointless. In particular it is ridiculous for students to sit a numeracy test, rather than a test on arithmetic or more broadly on mathematics. It guarantees that the general focus will be wrong and that specific weirdnesses will abound. The 2017 NAPLAN tests, conducted last week, have not disappointed. Today, however, we have other concerns.

Wading into NAPLAN’s numeracy quagmire, one can often find a nugget or two of glowing wrongness. Here is a question from the 2017 Year 9 test:

In this inequality, n is a whole number.

\color{blue} \dfrac{7}{n} \boldsymbol{<} \dfrac{5}{7}

What is the smallest possible value for n to make this inequality true?

The wording is appalling, classic NAPLAN. They could have simply asked:

What is the smallest whole number n for which \color{red} \dfrac{7}{n} \boldsymbol{<} \dfrac{5}{7}\,?

But of course the convoluted wording is the least of our concerns. The fundamental problem is that the use of the expression “whole number” is disastrous.

Mathematicians would avoid the expression “whole number”, but if pressed would most likely consider it a synonym for “integer”, as is done in the Australian Curriculum (scroll down) and some dictionaries. With this interpretation, where the negative integers are included, the above NAPLAN question obviously has no solution. Sometimes, including in, um, the Australian Curriculum (scroll down), “whole number” is used to refer to only the nonnegative integers or, rarely, to only the positive integers. With either of these interpretations the NAPLAN question is pretty nice, with a solution n = 10. But it remains the case that, at best, the expression “whole number” is irretrievably ambiguous and the NAPLAN question is fatally flawed.
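To spell out the arithmetic (a quick calculation of our own, for concreteness): if n is restricted to the positive integers then, since n > 0, the inequality can be safely cross-multiplied,

\dfrac{7}{n} < \dfrac{5}{7} \;\Longleftrightarrow\; 49 < 5n \;\Longleftrightarrow\; n > \dfrac{49}{5} = 9.8 ,

giving the smallest solution n = 10. If, however, negative integers are permitted then 7/n < 0 < 5/7 for every negative n and, since there is no smallest negative integer, the question has no answer at all.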

Pointing out an error in a NAPLAN test is like pointing out one of Donald Trump’s lies: you feel you must, but doing so inevitably distracts from the overall climate of nonsense and nastiness. Still, one can hope that ACARA will be called on this, will publicly admit that they stuffed up, and will consider employing a competent mathematician to vet future questions. Unfortunately, ACARA is just about as inviting of criticism and as open to admitting error as Donald Trump.