Theorem: Let V be the set of valid arguments against marriage equality. Then V is empty.
Proof: Let P be a valid argument. Then, by now, someone would have argued P. This has not occurred. (Proof: by exhaustion.) By contradiction, it follows that P does not exist, and thus V is empty. QED.
An alternative, direct proof of the theorem was provided by the California Supreme Court; their proof applied the definition of equality.
Consideration of the many straightforward corollaries of this theorem is left to the reader.
The key findings of Australia’s 2016 National Drug Strategy Household Survey were released earlier this year, and they made for sobering reading. The NDSHS reported that over 15% of Australians had used illicit drugs in the previous year, including such drugs as cannabis, ice and heroin. Shocking, right?
Wrong. Of course.
We’re being silly in a way that the NDSHS reporting was not. Yes, the NDSHS reported that 15% had used illicit drugs at least once (including the possibility of exactly once) in the previous year, but NDSHS also emphasised the composition of that 15%. By far the most commonly used drug was cannabis, at about 10% of the population. Ice use was around 1%, and heroin didn’t register in the summary.
Illicit drug use is a serious problem, and a problem exacerbated by idiotic drug laws. Nothing can be learned, however, and nothing can be solved if one focuses upon a meaningless 15% multicategory. Whatever the specific threats or the reasonableness of concerns over the broad use of cannabis, such concerns pale in comparison to the problems of ice and heroin. The NDSHS makes no such categorical mistake. Unfortunately, there are plenty of clowns who do.
Last week, the Federal Ministers for Social Services and Human Services announced the location of a drug testing trial for job seekers who receive federal benefits. The ironically named Christian Porter and the perfectly named Alan Tudge announced that recipients would be tested “for illicit substances including ice (methamphetamine), ecstasy (MDMA) and marijuana (THC) … People who test positive to drug tests will continue to receive their welfare payment but 80 per cent of their payment will only be accessible through Income Management.” The plan is deliberately nasty and monumentally stupid, and it has been widely reported as such. For all the critical reporting, however, we could find no instance of the media noting the categorical lunacy of effectively equating the use of ice and ecstasy and THC.
Still, one should be fair to Porter and Tudge. They are undeniably dickheads, but Porter and Tudge are hardly exceptional. They are members of a very large group of thuggish, victim-blaming politicians, which includes Malcolm Turnbull, and Peter Dutton, and Adolf Hitler.
This year Australia celebrates ten years of NAPLAN testing, and Australians can ponder the results. Numerous media outlets have reported “a 2.55% increase in numeracy” over the ten years. This is accompanied by a 400% increase in the unintended irony of Australian education journalism.
What is the origin of that 2.55% and precisely what does it mean to have “an increase in numeracy” by that amount? Yes, yes, it clearly means “bugger all”, but bugger all of what? It is a safe bet that no one reporting the percentage has a clue, and it is not easy to determine.
The media appear to have taken the percentage from a media release from Simon Birmingham, the Federal Education and Training Minister. (Birmingham, it should be noted, is one of the better ministers in the loathsome Liberal government; he is merely hopeless rather than malevolent.) Attempting to decipher that 2.55%, we take it to refer to the “% average change in NAPLAN mean scale score [from 2008 to 2017], average for domains across year levels”. Whatever that means.
ACARA, the administrators of NAPLAN, issued their own media release on the 2017 NAPLAN results. This release does not quote any percentages but indicates that the “2017 summary information” can be found at the NAPLAN reports page. Two weeks after ACARA’s media release, no such information is contained on or linked from that page, nor on the page titled NAPLAN 2017 summary results. Both pages link to a glossary, to explain “mean scale score”, which in turn explains nothing. The 2016 NAPLAN National Report contains the expression 207 times, without once even pretending to explain what it means. The 609-page Technical Report from 2015 (the latest available on ACARA’s website) appears to contain the explanation, though the precise expression is never used and nothing remotely resembling a user-friendly summary is included.
To put it very briefly, each student’s submitted test is given a “scaled score”. One purpose of this is to be able to compare tests and test scores from different years. The statistical process is massively complicated and, in particular, it includes a weighting for the “difficulty” of each test question. There is plenty that could be queried here, particularly given ACARA’s peculiar habit of including test questions that are so difficult they can’t be answered. But, for now, we’ll accept those scaled scores as a thing. Then, for example, the national average for 2008 Year 3 numeracy scaled scores was 396.9. This increased to 402.0 in 2016, amounting to a percentage increase of about 1.3%. The average percentage increases from 2008 to 2017 can then be further averaged over the four year levels, and (we think) this results in that magical 2.55%.
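For the curious, here is the arithmetic as we understand it, in a few lines of Python. The 396.9 and 402.0 are the quoted Year 3 figures; the numbers in the final average are hypothetical placeholders, not ACARA’s, since ACARA doesn’t deign to show its working.

```python
# Percentage increase in mean scaled score, using the quoted Year 3 figures.
def pct_increase(old, new):
    return (new - old) / old * 100

print(round(pct_increase(396.9, 402.0), 2))  # 1.28, i.e. about 1.3%

# The headline 2.55% then appears to be such percentages averaged over
# domains and over the four tested year levels (3, 5, 7 and 9). The numbers
# below are placeholders only, to indicate the shape of the calculation.
year_level_changes = [1.3, 2.8, 3.2, 2.9]  # hypothetical
print(round(sum(year_level_changes) / len(year_level_changes), 2))  # 2.55
```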
It is anybody’s guess whether that “2.55% increase in numeracy” corresponds to anything real, but the reporting of the figure is simply hilarious. Numeracy, to the very little extent it means anything, refers to the ability to apply mathematics effectively in the real world. To then report on numeracy in such a manner, with a who-the-hell-cares free-floating percentage, is beyond ironic; it’s perfect.
But of course the stenographic reportage is just a side issue. The main point is that there is no evidence that ten years of NAPLAN testing, and ten years of shoving numeracy down teachers’ and students’ throats, has made one iota of difference.
Australia’s Prime Minister tends to be pretty pleased with himself, and plenty of other people seem to think of Malcolm Turnbull as the smartest guy in the room. Perhaps sometimes he is. Malcolm didn’t appear so smart, however, when presenting Australia’s proposal to require the tech giants to decrypt their customers’ encrypted messages. When ZDNet reporter Asha McLean suggested that “the laws of mathematics [might] trump the laws of Australia”, Malcolm was unfazed:
The laws of Australia prevail in Australia, I can assure you of that. The laws of mathematics are very commendable but the only law that applies in Australia is the law of Australia.
And yes, the Government’s plan (for want of a better word) is as clueless as Malcolm makes it sound.
According to The Australian newspaper (paywalled), a bunch of “education and policy experts” have headed to China in an attempt to address Australia’s educational woes:
Frustrated by stagnating maths and STEM standards, [they] are travelling to China for lessons on how to boost maths and science in local classrooms.
Gee, I wonder what they might learn. What secret path to mathematical facility could those inscrutable Chinese possess? A wonderful new app, maybe. Or perhaps Chinese schools flip their classrooms in some really special way.
But, whatever their secret, it may not help us to learn it. The worth of “importing other countries’ teaching practices” is apparently questionable, “given that education is woven within the cultural fabric of nations.”
There’s plenty woven within (?) the cultural fabric of Australia, but whether one should refer to it as education is open to debate.
Which value of x satisfies both of these equations?
It is a multiple choice question, but unfortunately “The question is completely stuffed” is not one of the available answers.
Of course the fundamental issue with simultaneous equations is the simultaneity. Both equations and both variables must be considered as a whole, and it simply makes no sense to talk about solutions for x without reference to y. Unless y = -7 in the above equations, and there is no reason to assume that, no value of x satisfies both equations. The NAPLAN question is way beyond bad.
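To belabour the simultaneity, here is a toy pair of equations (ours, not NAPLAN’s, since the exam’s equations are not reproduced above) thrown at sympy: the solution is a pair (x, y), and asking for “the x” on its own is meaningless.

```python
# A hypothetical pair of simultaneous equations, solved as a pair.
from sympy import symbols, Eq, solve

x, y = symbols('x y')
solution = solve([Eq(3*x + y, 2), Eq(x - y, 6)], [x, y])
print(solution)  # {x: 2, y: -4}: x only makes sense together with y
```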
It is always worthwhile pointing out NAPLAN nonsense, as we’ve done before and will continue to do in the future. But what does this have to do with rural Peru?
In a recent post we pointed out an appalling question from a nationwide mathematics exam in New Zealand. We flippantly remarked that one might expect such nonsense in rural Peru but not in a wealthy Western country such as New Zealand. We were then gently slapped in the comments for the Peruvian references: Josh queried whether we knew anything of Peru’s educational system, and Dennis questioned the purpose of bringing up Peru, since Australia’s NAPLAN demonstrates a “level of stupidity” for all the world to see. These are valid points.
It would have been prudent to have found out a little about Peru before posting, but we seem to be safe. Peru’s economy has been growing rapidly but is not nearly as strong as New Zealand’s or Australia’s. Peruvian school education is weak, and Peru seems to have no universities comparable to the very good universities in New Zealand and Australia. Life and learning in rural Peru appears to be pretty tough.
None of this is surprising, and none of it particularly matters. Our blog post referred to “rural Peru or wherever”. The point was that we can expect poorer education systems to throw up nonsense now and then, or even typically; in particular, lacking ready access to good and unharried mathematicians, it is unsurprising if exams and such are mathematically poor and error-prone.
But what could possibly be New Zealand’s excuse for that idiotic question? Even if the maths ed crowd didn’t know what they were doing, there is simply no way that a competent mathematician would have permitted that question to remain as is, and there are plenty of excellent mathematicians in New Zealand. How did a national exam in New Zealand fail to be properly vetted? Where were the mathematicians?
Which brings us to Australia and to NAPLAN. How could the ridiculous problem at the top of this post, or the question discussed here, make it into a nationwide test? Once again: where were the mathematicians?
One more point. When giving NAPLAN a thoroughly deserved whack, Dennis was not referring to blatantly ill-formed problems of the type above, but rather to a systemic and much more worrying issue. Dennis noted that NAPLAN doesn’t offer a mathematics test or an arithmetic test, but rather a numeracy test. Numeracy is pedagogical garbage and, in the true spirit of numeracy, NAPLAN’s tests include no meaningful evaluation of arithmetic or algebraic skills. And, since we’re doing the Peru thing, it seems worth noting that numeracy is undoubtedly a first-world disease. It is difficult to imagine a poorer country, one which must weigh every educational dollar and every educational hour, spending much time on numeracy bullshit.
Finally, a general note about this blog. It would be simple to write amusing little posts about this or that bit of nonsense in, um, rural Peru or wherever. That, however, is not the purpose of this blog. We have no intention of making easy fun of people or institutions honestly struggling in difficult circumstances; that includes the vast majority of Australian teachers, who have to tolerate and attempt to make sense of all manner of nonsense flung at them from on high. Our purpose is to point out the specific idiocies of arrogant, well-funded educational authorities that have no excuse for screwing up in the manner in which they so often do.
Whatever the merits of undertaking a line-by-line critique of the Australian Curriculum, it would take a long time, it would be boring and it would probably overshadow the large, systemic problems. (Also, no one in power would take any notice, though that has never really slowed us down.) Still, the details should not be ignored, and we’ll consider here one of the gems of Homer Simpson cluelessness.
In 2010, Burkard Polster and I wrote an Age newspaper column about a draft of the Australian Curriculum. We focused on one line of the draft, an “elaboration” of Pythagoras’s Theorem:
recognising that right-angled triangle calculations may generate results that can be integral, fractional or irrational numbers known as surds
Though much can be said about this line, the most important thing to say is that it is wrong. Seven years later, the line is still in the Australian Curriculum, essentially unaltered, and it is still wrong.
OK, perhaps the line isn’t wrong. Depending upon one’s reading, it could instead be meaningless. Or trivial. But that’s it: wrong and meaningless and trivial are the only options.
The weird grammar and punctuation are standard for the Australian Curriculum. It takes a special lack of effort, however, to produce phrases such as “right-angled triangle calculations” and “generate results”. Any student who offered up such vague nonsense in an essay would know to expect big red strokes and a lousy grade. Still, we can take a guess at the intended meaning.
Pythagoras’s Theorem can naturally be introduced with 3-4-5 triangles and the like, with integer sidelengths. How does one then obtain irrational numbers? Well, “triangle calculations” on a right-angled triangle with both legs of length π can definitely “generate” an irrational “result”: the hypotenuse is π√2.
Yeah, yeah, π√2 is not a “surd”. But of course we can replace each π by √7 or 1/7 or whatever, and get sidelengths of any type we want. These are hardly “triangle calculations”, however, and they make the elaboration utterly trivial: fractions “generate” fractions, and irrationals “generate” irrationals. Well, um, wow.
We assume that the point of the elaboration is that if two sides of a right-angled triangle are integral then the third side “generated” need not be. So, the Curriculum writers presumably had in mind 1-1-√2 triangles and the like, where integers unavoidably lead us into the world of irrationals. Fair enough. But how, then, can we similarly obtain the promised (non-integral) fractional sidelengths? The answer is that we cannot.
It is of course notable that two sides of a right-angled triangle can be integral with the third side irrational. It is also notable, however, that two integral sides cannot result in the third side being a non-integral fraction. This is not difficult to prove, and makes a nice little exercise; the reader is invited to give a proof in the comments. The reader may also wish to forward their proof to ACARA, the producers of the Australian Curriculum.
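For readers who would like a hint before heading to the comments, here is a sketch of one standard route (not the only one; we assume the two given sides are the legs, and the same argument handles the case where one given side is the hypotenuse):

```latex
Suppose $a^2 + b^2 = c^2$ with $a$ and $b$ positive integers, and suppose
$c = p/q$ is rational, written in lowest terms. Then $c^2 = p^2/q^2$ is also
in lowest terms: a prime dividing $q$ cannot divide $p$, and hence cannot
divide $p^2$. But $c^2 = a^2 + b^2$ is an integer, forcing $q = 1$. So a
rational third side must be integral: the third side is either an integer
or irrational, and a non-integral fraction is impossible.
```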
How does such nonsense make it into a national curriculum? How does it then remain there, effectively unaltered, for seven years? True, our 2010 column wasn’t on the front of the New York Times. But still, in seven years did no one at ACARA ever get word of our criticism? Did no one else ever question the elaboration to anyone at ACARA?
But perhaps ACARA did become aware of our or others’ criticism, reread the elaboration, and decided “Yep, it’s just what we want”. It’s a depressing thought, but this seems as likely an explanation as any.
It is very brave to claim that one has found the stupidest maths exam question of all time. And the claim is probably never going to be true: there will always be some poor education system, in rural Peru or wherever, doing something dumber than anything ever done before. For mainstream exams in wealthy Western countries, however, New Zealand has come up with something truly exceptional.
A rectangle has an area of x² + 5x − 36. What are the lengths of the sides of the rectangle in terms of x?
The real problem here is to choose the best answer, which we can probably all agree is sides of length 1 and x² + 5x − 36.
OK, clearly what was intended was for students to factorise the quadratic and to declare the factors as the sidelengths of the rectangle. Which is mathematical lunacy. It is simply wrong.
Indeed, the question would arguably still have been wrong, and would definitely still have been awful, even if it had been declared that x has a unit of length: who wants students to be thinking that the area of a rectangle uniquely determines its sidelengths? But even that tiny sliver of sense was missing.
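If one wants to see both the intended button-push and the underlying lunacy in one hit, a few lines of sympy suffice (assuming our reconstruction of the quadratic above):

```python
from sympy import symbols, factor, expand

x = symbols('x')
area = x**2 + 5*x - 36

# The intended answer: factorise and read off "the" sides.
print(factor(area))  # (x - 4)*(x + 9)

# But an area determines no unique pair of sides: scale one factor by any
# positive k and the other by 1/k, or just take sides 1 and the area itself.
k = 2
print(expand(k*(x - 4) * (x + 9)/k))  # x**2 + 5*x - 36: same area, new "sides"
```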
So, what did students do with this question? (An equivalent question, 3(a)(i), appeared on the first exam.) We’re guessing that, seeing no alternative, the majority did exactly what was intended and factorised the quadratic. So, no harm done? Hah! It is incredible that such a question could make it onto a national exam, but it gets worse.
The two algebra exams were widely and strongly criticised, by students and teachers and the media. People complained that the exams were too difficult and too different in style from what students and teachers had been led to expect. Both types of criticism may well have been valid. For all of the public criticism of the exams, however, we could find no evidence of the above question or its Exam 1 companion being flagged. Plenty of complaining about hard questions, plenty of complaining about unexpected questions, but not a word about straight out mathematical crap.
So, not only do questions devoid of mathematical sense appear on a nationwide exam; an entire nation of students is then left to accept that this is what mathematics is: meaningless autopilot calculation. Well done, New Zealand. You’ve made the education authorities in rural Peru feel very much better about themselves.
UPDATE (04/02/19): Lightning strikes twice, and thrice.
Each year about a million Australian school students are required to sit the Government’s NAPLAN tests. Produced by ACARA, the same outfit responsible for the stunning Australian Curriculum, these tests are expensive, annoying and pointless. In particular, it is ridiculous for students to sit a numeracy test, rather than a test on arithmetic or more broadly on mathematics. It guarantees that the general focus will be wrong and that specific weirdnesses will abound. The 2017 NAPLAN tests, conducted last week, have not disappointed. Today, however, we have other concerns.
Wading into NAPLAN’s numeracy quagmire, one can often find a nugget or two of glowing wrongness. Here is a question from the 2017 Year 9 test:
In this inequality n is a whole number.
What is the smallest possible value for n to make this inequality true?
The wording is appalling, classic NAPLAN. They could simply have asked:
What is the smallest whole number n satisfying the inequality?
But of course the convoluted wording is the least of our concerns. The fundamental problem is that the use of the expression “whole number” is disastrous.
Mathematicians would avoid the expression “whole number”, but if pressed would most likely consider it a synonym for “integer”, as is done in the Australian Curriculum (scroll down) and some dictionaries. With this interpretation, where the negative integers are included, the above NAPLAN question obviously has no solution. Sometimes, including in, um, the Australian Curriculum (scroll down), “whole number” is used to refer to only the nonnegative integers or, rarely, to only the positive integers. With either of these interpretations the NAPLAN question is pretty nice, with a solution n = 10. But it remains the case that, at best, the expression “whole number” is irretrievably ambiguous and the NAPLAN question is fatally flawed.
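To see the ambiguity in action, consider a stand-in inequality with the behaviour described above (the exam’s actual inequality is not reproduced here): n² > 81, say, which has smallest positive solution 10 but no smallest integer solution.

```python
# Hypothetical stand-in for the exam's inequality: n^2 > 81.
def holds(n):
    return n * n > 81

# "Whole number" read as nonnegative (or positive) integer: a smallest
# solution exists.
print(next(n for n in range(0, 100) if holds(n)))  # 10

# "Whole number" read as integer: -10, -11, -12, ... all satisfy the
# inequality, so there is no smallest solution.
print(all(holds(n) for n in range(-1000, -10)))  # True
```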
Pointing out an error in a NAPLAN test is like pointing out one of Donald Trump’s lies: you feel you must, but doing so inevitably distracts from the overall climate of nonsense and nastiness. Still, one can hope that ACARA will be called on this, will publicly admit that they stuffed up, and will consider employing a competent mathematician to vet future questions. Unfortunately, ACARA is just about as inviting of criticism and as open to admitting error as Donald Trump.
Our first post concerns an error in the 2016 Mathematical Methods Exam 2 (year 12 in Victoria, Australia). It is not close to the silliest mathematics we’ve come across, and not even the silliest error to occur in a Methods exam. Indeed, most Methods exams are riddled with nonsense. For several reasons, however, whacking this particular error is a good way to begin: the error occurs in a recent and important exam; the error is pretty dumb; it took a special effort to make the error; and the subsequent handling of the error demonstrates the fundamental (lack of) character of the Victorian Curriculum and Assessment Authority.
The problem, first pointed out to us by teacher and friend John Kermond, is in Section B of the exam and concerns Question 3(h)(ii). This question relates to a probability distribution with a given “probability density function” f.
Now, anyone with a good nose for calculus is going to be thinking “uh-oh”. It is a fundamental property of a PDF that the total integral (underlying area) should equal 1. But how are all those integrated powers of e going to cancel out? Well, they don’t. What has been defined is only approximately a PDF, with a total area slightly less than 1. (It is easy to calculate the exact area using integration by parts.)
Below we’ll discuss the absurdity of handing students a non-PDF, but back to the exam question. 3(h)(ii) asks the students to find the median of the “probability distribution”, correct to two decimal places. Since the question makes no sense for a non-PDF, of course the VCAA has shot itself in the foot. However, we can still attempt to make some sense of the question, which is when we discover that the VCAA has also shot itself in the other foot.
The median m of a probability distribution is the half-way point. So, in the integration context here we want the m for which
a) ∫_{-∞}^{m} f(x) dx = 1/2
As such, this question was intended to be just another CAS exercise, and so both trivial and pointless: push the button, write down the answer and on to the next question. The problem is, the median can also be determined by the equation
b) ∫_{m}^{∞} f(x) dx = 1/2
or by the equation
c) ∫_{-∞}^{m} f(x) dx = ∫_{m}^{∞} f(x) dx
And, since our function is only approximately a PDF, these three equations necessarily give three different answers: to the demanded two decimal places the answers are respectively 176.45, 176.43 and 176.44. Doh!
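The three-answers phenomenon is easy to replicate. Here is a quick numerical check with a made-up almost-PDF of total area 0.999 (the exam’s f is not reproduced above): the three equations give three different “medians”.

```python
import numpy as np
from scipy.integrate import quad
from scipy.optimize import brentq

A = 0.999                      # total area, deliberately just under 1
f = lambda x: A * np.exp(-x)   # a made-up almost-PDF on [0, infinity)

F = lambda m: quad(f, 0, m)[0]   # area to the left of m
total = quad(f, 0, np.inf)[0]    # 0.999, not 1

m_a = brentq(lambda m: F(m) - 0.5, 0.01, 50)          # a) left area = 1/2
m_b = brentq(lambda m: total - F(m) - 0.5, 0.01, 50)  # b) right area = 1/2
m_c = brentq(lambda m: 2*F(m) - total, 0.01, 50)      # c) left = right

print(m_a, m_b, m_c)  # three distinct values (about 0.694, 0.692, 0.693)
```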
What to make of this? There are two obvious questions.
1. How did the VCAA end up with a PDF which isn’t a PDF?
It would be astonishing if all of the exam’s writers and checkers failed to notice the integral was not 1. It is even more astonishing if all the writers-checkers recognised and were comfortable with a non-PDF. Especially since the VCAA can be notoriously, absurdly fussy about the form and precision of answers (see below).
2. How was the error in 3(h)(ii) not detected?
It should have been routine for this mistake to have been detected and corrected with any decent vetting. Yes, we all make mistakes. Mistakes in very important exams, however, should not be so common, and the VCAA seems to make a habit of it.
OK, so the VCAA stuffed up. It happens. What happened next? That’s where the VCAA’s arrogance and cowardice shine bright for all to see. The one and only sentence in the Examiners’ Report that remotely addresses the error is:
“As [the] function f is a close approximation of the [???] probability density function, answers to the nearest integer were accepted”.
The wording is clumsy, and no concession has been made that the best (and uniquely correct) answer is “The question is stuffed up”, but it seems that solutions to all of a), b) and c) above were accepted. The problem, however, isn’t with the grading of the question.
It is perhaps too much to expect an insufferably arrogant VCAA to apologise, to express anything approximating regret for yet another error. But how could the VCAA fail to understand the necessity of a clear and explicit acknowledgement of the error? Apart from demonstrating total gutlessness, it is fundamentally unprofessional. How are students and teachers, especially new teachers, supposed to read the exam question and report? How are students and teachers supposed to approach such questions in the future? Are they still expected to employ the precise definitions that they have learned? Or, are they supposed to now presume that near enough is good enough?
For a pompous finale, the Examiners’ Report follows up by snarking that, in writing the integral for the PDF, “The dx was often missing from students’ working”. One would have thought that the examiners might have dispensed with their finely honed prissiness for that one paragraph. But no. For some clowns it’s never the wrong time to whine about a missing dx.
UPDATE (16 June): In the comments below, Terry Mills has made the excellent point that the prior question on the exam is similarly problematic. 3(h)(i) asks students to calculate the mean of the probability distribution, which would normally be calculated as ∫ x f(x) dx. For our non-PDF, however, we should normalise by dividing by ∫ f(x) dx. To the demanded two decimal places, that changes the answer from the Examiners’ Report’s 170.01 to 170.06.
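As a sanity check on Terry’s point: with a raw “mean” of 170.01 and a total area just a touch under 1 (we use 0.9997, a hypothetical value consistent with the two quoted figures), the normalisation does exactly this.

```python
# Normalising the raw "mean" by the (not-quite-1) total area. The area value
# here is hypothetical, chosen to be consistent with the quoted figures.
raw_mean = 170.01
area = 0.9997
print(round(raw_mean / area, 2))  # 170.06
```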
UPDATE (05/07/22): The examination report was updated on 18/07/20, and now (mostly) fesses up to the nonsense in 3(h)(ii). There is still no admission of the parallel nonsense in 3(h)(i).