The Mysterious Wisdom of the East

According to The Australian newspaper (paywalled), a bunch of “education and policy experts” have headed to China in an attempt to address Australia’s educational woes:

Frustrated by stagnating maths and STEM standards, [they] are travelling to China for lessons on how to boost maths and science in local classrooms. 

Gee, I wonder what they might learn. What secret path to mathematical facility could those inscrutable Chinese possess? A wonderful new app, maybe. Or perhaps Chinese schools flip their classrooms in some really special way. 

But, whatever their secret, it may not help us to learn it. The worth of “importing other countries’ teaching practices” is apparently questionable, “given that education is woven within the cultural fabric of nations.”

There’s plenty woven within (?) the cultural fabric of Australia, but whether one should refer to it as education is open to debate.

NAPLAN’s Mathematical Nonsense, and What it Means for Rural Peru

The following question appeared on Australia’s Year 9 NAPLAN Numeracy Test in 2009:

y = 2x – 1

y = 3x + 2

Which value of x satisfies both of these equations?

It is a multiple-choice question, but unfortunately “The question is completely stuffed” is not one of the available answers.

Of course the fundamental issue with simultaneous equations is the simultaneity. Both equations and both variables must be considered as a whole, and it simply makes no sense to talk about solutions for x without reference to y. Unless y = -7 in the above equations, and there is no reason to assume that, no value of x satisfies both equations. The NAPLAN question is way beyond bad.
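For the skeptical, here is a minimal check of the simultaneity, as a Python/sympy sketch (the code is our illustration, obviously, not NAPLAN’s): the system has the single solution pair (x, y) = (-3, -7), and x = -3 “satisfies both equations” only in tandem with y = -7.

```python
# A minimal sympy sketch: the system is solved by the *pair* (x, y) = (-3, -7).
from sympy import symbols, Eq, solve

x, y = symbols("x y")
solution = solve([Eq(y, 2*x - 1), Eq(y, 3*x + 2)], [x, y])
print(solution)  # {x: -3, y: -7}
```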

It is always worthwhile pointing out NAPLAN nonsense, as we’ve done before and will continue to do in the future. But what does this have to do with rural Peru?

In a recent post we pointed out an appalling question from a nationwide mathematics exam in New Zealand. We flippantly remarked that one might expect such nonsense in rural Peru but not in a wealthy Western country such as New Zealand. We were then gently slapped in the comments for the Peruvian references: Josh queried whether we knew anything of Peru’s educational system, and Dennis questioned the purpose of bringing up Peru, since Australia’s NAPLAN demonstrates a “level of stupidity” for all the world to see. These are valid points.

It would have been prudent to have found out a little about Peru before posting, but we seem to be safe. Peru’s economy has been growing rapidly but is not nearly as strong as New Zealand’s or Australia’s. Peruvian school education is weak, and Peru seems to have no universities comparable to the very good universities in New Zealand and Australia. Life and learning in rural Peru appears to be pretty tough.

None of this is surprising, and none of it particularly matters. Our blog post referred to “rural Peru or wherever”. The point was that we can expect poorer education systems to throw up nonsense now and then, or even typically; in particular, lacking ready access to good and unharried mathematicians, it is unsurprising if exams and such are mathematically poor and error-prone.

But what could possibly be New Zealand’s excuse for that idiotic question? Even if the maths ed crowd didn’t know what they were doing, there is simply no way that a competent mathematician would have permitted that question to remain as is, and there are plenty of excellent mathematicians in New Zealand. How did a national exam in New Zealand fail to be properly vetted? Where were the mathematicians?

Which brings us to Australia and to NAPLAN. How could the ridiculous problem at the top of this post, or the question discussed here, make it into a nationwide test? Once again: where were the mathematicians?

One more point. When giving NAPLAN a thoroughly deserved whack, Dennis was not referring to blatantly ill-formed problems of the type above, but rather to a systemic and much more worrying issue. Dennis noted that NAPLAN doesn’t offer a mathematics test or an arithmetic test, but rather a numeracy test. Numeracy is pedagogical garbage, and, in the true spirit of numeracy, NAPLAN’s tests include no meaningful evaluation of arithmetic or algebraic skills. And, since we’re doing the Peru thing, it seems worth noting that numeracy is undoubtedly a first-world disease. It is difficult to imagine a poorer country, one which must weigh every educational dollar and every educational hour, spending much time on numeracy bullshit.

Finally, a general note about this blog. It would be simple to write amusing little posts about this or that bit of nonsense in, um, rural Peru or wherever. That, however, is not the purpose of this blog. We have no intention of making easy fun of people or institutions honestly struggling in difficult circumstances; that includes the vast majority of Australian teachers, who have to tolerate and attempt to make sense of all manner of nonsense flung at them from on high. Our purpose is to point out the specific idiocies of arrogant, well-funded educational authorities that have no excuse for screwing up in the manner in which they so often do.

Obtuse Triangles

Whatever the merits of undertaking a line-by-line critique of the Australian Curriculum, it would take a long time, it would be boring, and it would probably overshadow the large, systemic problems. (Also, no one in power would take any notice, though that has never really slowed us down.) Still, the details should not be ignored, and we’ll consider here one of the gems of Homer Simpson cluelessness.

In 2010, Burkard Polster and I wrote an Age newspaper column about a draft of the Australian Curriculum. We focused on one line of the draft, an “elaboration” of Pythagoras’s Theorem:

recognising that right-angled triangle calculations may generate results that can be integral, fractional or irrational numbers known as surds

Though much can be said about this line, the most important thing to say is that it is wrong. Seven years later, the line is still in the Australian Curriculum, essentially unaltered, and it is still wrong.

OK, perhaps the line isn’t wrong. Depending upon one’s reading, it could instead be meaningless. Or trivial. But that’s it: wrong and meaningless and trivial are the only options.

The weird grammar and punctuation are standard for the Australian Curriculum. It takes a special lack of effort, however, to produce phrases such as “right-angled triangle calculations” and “generate results”. Any student who offered up such vague nonsense in an essay would know to expect big red strokes and a lousy grade. Still, we can take a guess at the intended meaning.

Pythagoras’s Theorem can naturally be introduced with 3-4-5 triangles and the like, with integer sidelengths. How does one then obtain irrational numbers? Well, “triangle calculations” on the triangle below, with its sidelengths involving \pi, can definitely “generate” irrational “results”:

Yeah, yeah, \pi is not a “surd”. But of course we can replace each \pi by √7 or 1/7 or whatever, and get sidelengths of any type we want. These are hardly “triangle calculations”, however, and this reading makes the elaboration utterly trivial: fractions “generate” fractions, and irrationals “generate” irrationals. Well, um, wow.

We assume that the point of the elaboration is that if two sides of a right-angled triangle are integral then the third side “generated” need not be. So, the Curriculum writers presumably had in mind 1-1-√2 triangles and the like, where integers unavoidably lead us into the world of irrationals. Fair enough. But how, then, can we similarly obtain the promised (non-integral) fractional sidelengths? The answer is that we cannot.

It is of course notable that two sides of a right-angled triangle can be integral with the third side irrational. It is also notable, however, that two integral sides cannot result in the third side being a non-integral fraction. This is not difficult to prove, and makes a nice little exercise; the reader is invited to give a proof in the comments. The reader may also wish to forward their proof to ACARA, the producers of the Australian Curriculum.
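For readers who would like a nudge, here is a small Python illustration; it is not the invited proof, and it leans on exactly the fact one must prove, that the square root of an integer is either an integer or irrational.

```python
# Classify the hypotenuse of a right-angled triangle with integer legs a, b.
# Since a^2 + b^2 is an integer, its square root is either an integer or
# irrational (the fact to be proved); a non-integral fraction cannot occur.
from math import isqrt

def hypotenuse_type(a: int, b: int) -> str:
    c_squared = a*a + b*b
    root = isqrt(c_squared)
    return "integer" if root*root == c_squared else "irrational"

print(hypotenuse_type(3, 4))  # integer    (the 3-4-5 triangle)
print(hypotenuse_type(1, 1))  # irrational (the 1-1-√2 triangle)
```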

How does such nonsense make it into a national curriculum? How does it then remain there, effectively unaltered, for seven years? True, our 2010 column wasn’t on the front page of the New York Times. But still, in seven years did no one at ACARA ever get word of our criticism? Did no one else ever question the elaboration to anyone at ACARA?

But perhaps ACARA did become aware of our or others’ criticism, reread the elaboration, and decided “Yep, it’s just what we want”. It’s a depressing thought, but this seems as likely an explanation as any.

Factoring in the Stupidity

It is very brave to claim that one has found the stupidest maths exam question of all time. And the claim is probably never going to be true: there will always be some poor education system, in rural Peru or wherever, doing something dumber than anything ever done before. For mainstream exams in wealthy Western countries, however, New Zealand has come up with something truly exceptional.

Last year, New Zealand students at Year 11 sat one of two algebra exams administered by the New Zealand Qualifications Authority. The very first question on the second exam reads:

A rectangle has an area of x^2+5x-36. What are the lengths of the sides of the rectangle in terms of x.

The real problem here is to choose the best answer, which we can probably all agree is sides of length \pi and (x^2+5x-36)/\pi.

OK, clearly what was intended was for students to factorise the quadratic and to declare the factors as the sidelengths of the rectangle. Which is mathematical lunacy. It is simply wrong.

Indeed, the question would arguably still have been wrong, and would definitely still have been awful, even if it had been declared that x has a unit of length: who wants students to be thinking that the area of a rectangle uniquely determines its sidelengths? But, even that tiny sliver of sense was missing.
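For concreteness, here is a sympy sketch (ours, not NZQA’s) of the intended factorisation, along with the reasons it cannot honestly be called “the lengths of the sides”:

```python
# The factorisation the examiners presumably intended.
from sympy import symbols, factor

x = symbols("x")
print(factor(x**2 + 5*x - 36))  # (x - 4)*(x + 9)

# The intended "sides" are x - 4 and x + 9. But for x <= 4 the "side" x - 4
# is not even positive, and in any case a rectangle's area does not determine
# its sides: pi by (x**2 + 5*x - 36)/pi works just as well.
```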

So, what did students do with this question? (An equivalent question, 3(a)(i), appeared on the first exam.) We’re guessing that, seeing no alternative, the majority did exactly what was intended and factorised the quadratic. So, no harm done? Hah! It is incredible that such a question could make it onto a national exam, but it gets worse.

The two algebra exams were widely and strongly criticised, by students and teachers and the media. People complained that the exams were too difficult and too different in style from what students and teachers had been led to expect. Both types of criticism may well have been valid. For all of the public criticism of the exams, however, we could find no evidence of the above question or its Exam 1 companion being flagged. Plenty of complaining about hard questions, plenty of complaining about unexpected questions, but not a word about straight out mathematical crap.

So, not only do questions devoid of mathematical sense appear on a nationwide exam; it then appears that an entire nation of students is being left to accept that this is what mathematics is: meaningless autopilot calculation. Well done, New Zealand. You’ve made the education authorities in rural Peru feel very much better about themselves.

UPDATE (04/02/19): Lightning strikes twice, and thrice.

Accentuate the Negative

Each year about a million Australian school students are required to sit the Government’s NAPLAN tests. Produced by ACARA, the same outfit responsible for the stunning Australian Curriculum, these tests are expensive, annoying and pointless. In particular it is ridiculous for students to sit a numeracy test, rather than a test on arithmetic or more broadly on mathematics. It guarantees that the general focus will be wrong and that specific weirdnesses will abound. The 2017 NAPLAN tests, conducted last week, have not disappointed. Today, however, we have other concerns.

Wading into NAPLAN’s numeracy quagmire, one can often find a nugget or two of glowing wrongness. Here is a question from the 2017 Year 9 test:

In this inequality, n is a whole number.

\dfrac{7}{n} < \dfrac{5}{7}

What is the smallest possible value for n to make this inequality true?

The wording is appalling, classic NAPLAN. They could have simply asked:

What is the smallest whole number n for which \dfrac{7}{n} < \dfrac{5}{7}?

But of course the convoluted wording is the least of our concerns. The fundamental problem is that the use of the expression “whole number” is disastrous.

Mathematicians would avoid the expression “whole number”, but if pressed would most likely consider it a synonym for “integer”, as is done in the Australian Curriculum and some dictionaries. With this interpretation, where the negative integers are included, the above NAPLAN question obviously has no solution. Sometimes, including in, um, the Australian Curriculum, “whole number” is used to refer to only the nonnegative integers or, rarely, to only the positive integers. With either of these interpretations the NAPLAN question is pretty nice, with a solution n = 10. But it remains the case that, at best, the expression “whole number” is irretrievably ambiguous and the NAPLAN question is fatally flawed.
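For the doubters, here is a quick Python check of the two readings, using exact fraction arithmetic (our sketch, of course, not ACARA’s):

```python
from fractions import Fraction

# Reading 1: "whole number" means positive (or nonnegative) integer.
# The smallest n with 7/n < 5/7 is then 10.
n = 1
while not Fraction(7, n) < Fraction(5, 7):
    n += 1
print(n)  # 10

# Reading 2: "whole number" means integer. Every negative n works, since
# 7/n is then negative, and there is no smallest negative integer.
print(Fraction(7, -1) < Fraction(5, 7))      # True
print(Fraction(7, -10**9) < Fraction(5, 7))  # True, and so on forever
```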

Pointing out an error in a NAPLAN test is like pointing out one of Donald Trump’s lies: you feel you must, but doing so inevitably distracts from the overall climate of nonsense and nastiness. Still, one can hope that ACARA will be called on this, will publicly admit that they stuffed up, and will consider employing a competent mathematician to vet future questions. Unfortunately, ACARA is just about as inviting of criticism and as open to admitting error as Donald Trump.

The Median is the Message

Our first post concerns an error in the 2016 Mathematical Methods Exam 2 (year 12 in Victoria, Australia). It is not close to the silliest mathematics we’ve come across, and not even the silliest error to occur in a Methods exam. Indeed, most Methods exams are riddled with nonsense. For several reasons, however, whacking this particular error is a good way to begin: the error occurs in a recent and important exam; the error is pretty dumb; it took a special effort to make the error; and the subsequent handling of the error demonstrates the fundamental (lack of) character of the Victorian Curriculum and Assessment Authority.

The problem, first pointed out to us by teacher and friend John Kermond, is in Section B of the exam and concerns Question 3(h)(ii). This question relates to a probability distribution with “probability density function”

\[ f(x) = \begin{cases} \dfrac{(210-x)\,e^{\frac{x-210}{20}}}{400} & 0\leqslant x \leqslant 210,\\ 0 & \text{elsewhere.} \end{cases} \]

Now, anyone with a good nose for calculus is going to be thinking “uh-oh”. It is a fundamental property of a PDF that the total integral (underlying area) should equal 1. But how are all those integrated powers of e going to cancel out? Well, they don’t. What has been defined is only approximately a PDF, with a total area of 1 - \frac{23}{2}e^{-21/2} \approx 0.9997. (It is easy to calculate the area exactly using integration by parts.)
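For anyone who would rather not do the integration by parts, a sympy computation (our sketch) confirms the shortfall:

```python
# Integrate the would-be PDF over its support; the total is not 1.
from sympy import symbols, exp, integrate

x = symbols("x")
f = (210 - x) * exp((x - 210) / 20) / 400

total = integrate(f, (x, 0, 210))
print(total)          # 1 - 23*exp(-21/2)/2, up to rearrangement
print(total.evalf())  # 0.999683...
```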

Below we’ll discuss the absurdity of handing students a non-PDF, but back to the exam question. 3(h)(ii) asks the students to find the median of the “probability distribution”, correct to two decimal places. Since the question makes no sense for a non-PDF, of course the VCAA has shot itself in the foot. However, we can still attempt to make some sense of the question, which is when we discover that the VCAA has also shot itself in the other foot.

The median m of a probability distribution is the half-way point. So, in the integration context here we want the m for which

a) \int\limits_0^m f(x)\,{\rm d}x = \dfrac12.

As such, this question was intended to be just another CAS exercise, and so both trivial and pointless: push the button, write down the answer and on to the next question. The problem is, the median can also be determined by the equation

b) \int\limits_m^{210} f(x)\,{\rm d}x = \dfrac12,

or by the equation

c) \int\limits_0^m f(x)\,{\rm d}x = \int\limits_m^{210} f(x)\,{\rm d}x.

And, since our function is only approximately a PDF, these three equations necessarily give three different answers: to the demanded two decimal places the answers are respectively 176.45, 176.43 and 176.44. Doh!
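For those playing along at home, a numerical sketch (we have used scipy; students, of course, were stuck with their CAS) reproduces the three incompatible “medians”:

```python
# Solve each of the three median equations numerically.
import numpy as np
from scipy.integrate import quad
from scipy.optimize import brentq

def f(x):
    return (210 - x) * np.exp((x - 210) / 20) / 400

total = quad(f, 0, 210)[0]  # about 0.99968, not 1

m_a = brentq(lambda m: quad(f, 0, m)[0] - 0.5, 1, 209)      # equation a)
m_b = brentq(lambda m: quad(f, m, 210)[0] - 0.5, 1, 209)    # equation b)
m_c = brentq(lambda m: quad(f, 0, m)[0] - total/2, 1, 209)  # equation c)

print(round(m_a, 2), round(m_b, 2), round(m_c, 2))  # 176.45 176.43 176.44
```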

What to make of this? There are two obvious questions.

1. How did the VCAA end up with a PDF which isn’t a PDF?

It would be astonishing if all of the exam’s writers and checkers failed to notice that the integral was not 1. It would be even more astonishing if all the writers and checkers recognised and were comfortable with a non-PDF. Especially since the VCAA can be notoriously, absurdly fussy about the form and precision of answers (see below).

2. How was the error in 3(h)(ii) not detected?

It should have been routine for this mistake to have been detected and corrected with any decent vetting. Yes, we all make mistakes. Mistakes in very important exams, however, should not be so common, and the VCAA seems to make a habit of it.

OK, so the VCAA stuffed up. It happens. What happened next? That’s where the VCAA’s arrogance and cowardice shine bright for all to see. The one and only sentence in the Examiners’ Report that remotely addresses the error is:

“As [the] function f is a close approximation of the [???] probability density function, answers to the nearest integer were accepted”.

The wording is clumsy, and no concession has been made that the best (and uniquely correct) answer is “The question is stuffed up”, but it seems that solutions to all of a), b) and c) above were accepted. The problem, however, isn’t with the grading of the question.

It is perhaps too much to expect an insufferably arrogant VCAA to apologise, to express anything approximating regret for yet another error. But how could the VCAA fail to understand the necessity of a clear and explicit acknowledgement of the error? Apart from demonstrating total gutlessness, it is fundamentally unprofessional. How are students and teachers, especially new teachers, supposed to read the exam question and report? How are students and teachers supposed to approach such questions in the future? Are they still expected to employ the precise definitions that they have learned? Or, are they supposed to now presume that near enough is good enough?

For a pompous finale, the Examiners’ Report follows up by snarking that, in writing the integral for the PDF, “The dx was often missing from students’ working”. One would have thought that the examiners might have dispensed with their finely honed prissiness for that one paragraph. But no. For some clowns it’s never the wrong time to whine about a missing dx.

UPDATE (16 June): In the comments below, Terry Mills has made the excellent point that the prior question on the exam is similarly problematic. 3(h)(i) asks students to calculate the mean of the probability distribution, which would normally be calculated as \int_0^{210} xf(x)\,{\rm d}x. For our non-PDF, however, we should normalise by dividing by \int_0^{210} f(x)\,{\rm d}x. To the demanded two decimal places, that changes the answer from the Examiners’ Report’s 170.01 to 170.06.
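Again, for the record, a quick scipy check (our sketch) of the two means:

```python
# The Examiners' Report's mean versus the properly normalised mean.
import numpy as np
from scipy.integrate import quad

def f(x):
    return (210 - x) * np.exp((x - 210) / 20) / 400

raw_mean = quad(lambda x: x * f(x), 0, 210)[0]
total = quad(f, 0, 210)[0]

print(round(raw_mean, 2))          # 170.01, the report's answer
print(round(raw_mean / total, 2))  # 170.06, once the non-PDF is normalised
```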

UPDATE (05/07/22): The examination report was updated on 18/07/20, and now (mostly) fesses up to the nonsense in 3(h)(ii). There is still no admission of the parallel nonsense in 3(h)(i).