By overwhelming demand,* we have decided, much belatedly, to put up a post for discussion of the 2021 Further Mathematics exams. We have no particular plans to update this post, although we will do so if anything of interest arises. We’ll just note the two excerpts below, from Exam 2, the first of which is discussed here, at 5:30. Thanks to Simon and SRK for bringing these to our attention.**

# Category: exams

## Untried Methods

We’re sure we’ll live to regret this post, but yesterday’s VCE Methods Exam 1 looked like a good exam.

No, that’s not a setup for a joke. It actually looked like a nice exam. (It’s not online yet. **Update: now online.**) Sure, there were some meh questions, the inevitable consequence of an incompetent study design. And yes, there was a minor Magritte aspect to the final question. And yes, it’s much easier to get an exam right if it’s uncorrupted by the idiocy of CAS, with the acid test being Exam 2. And yes, we could be plain wrong; we only gave the exam a cursory read, and if there’s a dodo it’s usually in the detail.

But for all that the exam genuinely looked good. The questions in general seemed mathematically natural. A couple of the questions also appeared to be difficult in a good, mathematical way, rather than in the familiar “What the Hell do they want?” manner.

What happened?

## Inferiority Complex

This one is long, a real Gish gallop. Question 4, Part 2 from the 2017 VCE Specialist Mathematics Exam 2 is a mess. The Examiners’ Report is, predictably, worse.

Part (a) of Question 4 is routine, requiring students to express in polar form. One wonders how a quarter of the students could muck up this easy 1-mark question, but the question is fine.

The issues begin with 4(b), for which students are required to

*Show that the roots of are and .*

The question can be answered with an easy application of completing the square or the quadratic formula. So, why did almost half of the students get it wrong? Were so many students really so clueless? Perhaps, but there is good reason to suspect a different source of the cluelessness.

The Examiners’ Report indicates three general issues with students’ answers. First,

*students confused factors with solutions or did not proceed beyond factorising the quadratic.*

Maybe the students were confused, but maybe not. Maybe some students simply thought that, once having factorised the quadratic, the microstep to then write “Therefore z = …”, to note the roots *written on the exam in front of them*, was too trivial in response to a 1-mark question.

Second, some students reportedly erred by

*not showing key steps in their solution.*

Really? The Report includes the following calculation as a sample solution:

Was this whole tedious, snail-paced computation required for one measly mark? It’s impossible to tell, but the Report remarks generally on ‘show that’ questions that

*all steps that led to the given result needed to be clearly and logically set out.*

As we have noted previously, demanding “all steps” is both meaningless and utterly mad. For a year 12 advanced mathematics student the identification of the roots is pretty much immediate and a single written step should suffice. True, in 4(b) students are instructed to “show” stuff, but it’s hardly the students’ fault that what they were instructed to show is pretty trivial.

Third, and by far the most ridiculous,

*some students did not correctly follow the ‘show that’ instruction … by [instead] solely verifying the solutions given by substitution.*

Bullshit.

VCAA examiners love to worry that word “show”. In true Princess Bride fashion, however, the word does not mean what they think it means.

There is nothing in standard English usage nor in standard mathematical usage, nor in at least occasional VCE usage (see Q2(a)), that would distinguish “show” from “prove” in this context. And, for 4(b) above, substitution of the given values into the quadratic is a perfectly valid method of proving that the roots are as indicated.

It appears that VCE has a special non-English code, in which “show” has a narrower meaning, akin to “derive”. This cannot alter the fact that the VCE examiners’ use of the word is linguistic and mathematical crap. It also cannot alter the fact that students being penalised for not following this linguistic and mathematical crap is pedagogical and mathematical crap.

Of course all the nonsense of 4(b) could have been avoided simply by asking the students to *find* the roots. The examiners declined to do so, however, probably because this would have violated VCAA’s policy of avoiding asking any mathematical question with some depth or difficulty or further consequence. The result is a question amounting to no more than an infantile and infantilising ritual, penalising any student with the mathematical common sense to answer with the appropriate “well, duh”.

***************************

Onwards we trek to 4(c):

*Express the roots of in terms of .*

Less than a third of students scored the mark for this question, and the Report notes that

*Misunderstanding of the question was apparent in student responses. Many attempts at solutions were not expressed in terms of as required.*

Funny that. The examiners pose a question that borders on the meaningless and somehow this creates a sea of misunderstanding. Who would’ve guessed?

4(c) makes little more sense than to ask someone to write 3 in terms of 7. Given any two numbers there’s a zillion ways to “express” one number “in terms of” the other, as in 3 = 7 – 4 or whatever. Without further qualification or some accepted convention, without some agreed upon definition of “expressed in terms of”, any expression is just as valid as any other.

What was expected in 4(c)? To approach the question cleanly we can first set , as the examiners could have and should have and did not. Then, the intended answers were and .

These expressions for the roots are simple and natural, but even if one accepts a waffly interpretation of 4(c) that somehow requires “simple” solutions, there are plenty of other possible answers. The expressions and and and are all reasonable and natural, but nothing in the Examiners’ Report suggests that these or similar answers were accepted. If not, that is a very nasty cherry on top of an incredibly silly question.

***************************

The pain now temporarily lessens (though the worst is yet to come). 4(d) asks students to show that the relation has the Cartesian form , and in 4(e) students are asked to draw this line on an Argand diagram, together with the roots of the above quadratic.

These questions are routine and ok, though 4(d) is weirdly aimless, the line obtained playing no role in the final parts of Q4. The Examiners’ Report also notes condescendingly that “the ‘show that’ instruction was generally followed”. Yes, people do tend to follow the intended road if there’s only one road.

The final part, 4(g), is also standard, requiring students to find the area of the major segment of the circle |z| = 4 cut off by the line through the roots of the quadratic. The question is straight-forward, the only real trick being to ignore the weird line from 4(d) and 4(e).

***************************

Finally, the debacle of 4(f):

*The equation of the line passing through the two roots of can be expressed as , where . Find in terms of .*

The Report notes that

*This question caused significant difficulty for students.*

That’s hilarious understatement given that 99% of students scored 0/1 on the question. The further statements acknowledging and explaining and apologising for the stuff-up are unfortunately non-existent.

So, what went wrong? The answer is both obvious and depressingly familiar: the exam question is essentially meaningless. Students failed to comprehend the question because it is close to incomprehensible.

The students are asked to write b in terms of a. However, similar to 4(c) above, there are many ways to do that, and how one does it depends upon the initial number a chosen. The line through the two roots has equation . So, for example, with a = -4 we have b = 0 and we can write b = a + 4 or b = 0 × a or whatever. If a = -5 then b = 1 and we can write b = -a – 4, and so on.

Anything of this nature is a reasonable response to the exam question *as written*, and none of it resembles the answer in the Report. Instead, what was expected was for students to consider all complex numbers a – *except those on the line itself* – and to consider all associated complex b. That is, in appropriate but non-Specialist terminology, we want to determine b as a function f(a) of a, with the domain of f being most but not all of the complex plane.

With the question suitably clarified we can get down to work (none of which is indicated in the Report). Easiest is to write . Since must be symmetrically placed about the line , it follows that . Then . This gives , and finally

which is the answer indicated in the Examiners’ Report.

In principle 4(f) is a nice question, though 1 mark is pretty chintzy for the thought required. More importantly, the exam question as written bears only the slightest resemblance to the intended question, or to anything coherent, with only the slightest, inaccurate hint of the intended generality of a and b.

99% of 2017 Specialist students have a right to be pissed off.

***************************

That’s it, we’re done. One more ridiculous VCE exam question, and one more ridiculously arrogant Report, unsullied by an ounce of self-reflection or remorse.

## The Arc Enemy

Our previous post was on good guys making a silly, funny and inconsequential mistake. This post is not.

Question B1 of Exam 2 for 2018 Northern Hemisphere Specialist Mathematics begins innocently enough. In part (a), students are required to graph the function over its maximal domain. Then, things begin to get stupid.

In part (b), the graph of f is rotated around the y-axis, to model a vase. Students are required to find the volume of this stupid vase, by setting up the integral and then pushing the stupid buttons on their stupid calculators. So, a reasonable integration question lost in ridiculous pseudomodelling and brainless button-pushing. Whatever. Just standard VCE crap. Then, things stay stupid.

Part (c) is a related rates question. In principle a good problem, though it’s hard to imagine anyone ever requiring dh/dt when the water depth is exactly cm. Whatever. Standard VCE crap. Then, things get really, really stupid.

Part (d) of the problem has a bee climbing from the bottom of the vase to the top. Students are required to find the minimum distance the bee needs to travel.

Where to begin with this idiotic, 1-mark question. Let’s begin with the bee.

Why is it a bee? Why frame a shortest walk question in terms of a bug with wings? Sure, the question states that the bug is climbing, and the slight chance of confusion is overshadowed by other, much greater issues with the question. But still, why would one choose a flying bug to crawl up a vase? It’s not importantly stupid, but it is gratuitously, hilariously stupid.

Anyway, we’re stuck with our stupid bee climbing up our stupid vase. What distance does our stupid bee travel? Well, obviously our stupid, non-flying bee should climb as “up” as possible, without veering left or right, correct?

No and yes.

It is true that a bottom-to-top shortest path (geodesic) on a surface of revolution is a meridian. The proof of this, however, is very far from obvious; good luck explaining it to your students. But of course this is only Specialist Mathematics, so it’s not like we should expect the students to be inquisitive or critical or questioning assumptions or anything like that.

Anyway, our stupid non-flying bee climbs “up” our stupid vase. The distance our stupid bee travels is then the arc length of the graph of the original function f, and the required distance is given by the integral
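
In general form (the exam’s specific f and limits are not reproduced here), the distance is the standard arc-length integral:

```latex
L \;=\; \int_a^b \sqrt{1 + \left(f'(x)\right)^2}\,\mathrm{d}x .
```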

The integral is ugly. More importantly, the integral is (doubly) improper and thus has no required meaning for Specialist students. Pretty damn stupid, and a stupidity we’ve seen not too long ago. It gets stupider.

Recall that this is a 1-mark question, and it is clearly expected to have the stupid calculator do the work. Great, sort of. The calculator computes integrals that the students are not required to understand but, apart from being utterly meaningless crap, everything is fine. Except, the calculators are really stupid.

Two brands of CAS calculators appear to be standard in VCE. Brand A will readily compute the integral above. Unfortunately, Brand A calculators will also compute improper integrals that don’t exist. Which is stupid. Brand B calculators, on the other hand, will not directly compute improper integrals such as the one above; instead, one first has to de-improper the integral by changing the limits to something like 0.50001 and 1.49999. Which is ugly and stupid. It also requires students to recognise the improperness in the integral, which they are supposedly not required to understand. Which is really stupid. (The lesser known Brand C appears to be less stupid with improper integrals.)
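
Brand B’s nudge-the-limits trick is easy to mimic on a toy doubly improper integral (our own example, with a made-up nudge, not the exam’s integral): the integrand below blows up at both endpoints, clipping the limits systematically loses a chunk of the answer, while a substitution that removes the singularities does not. A sketch in Python:

```python
import math

# Toy doubly improper integral: the integral of 1/sqrt(x(1-x)) over [0, 1] equals pi.
# The integrand blows up at both endpoints, just like the exam's arc-length integrand.
def f(x):
    return 1.0 / math.sqrt(x * (1.0 - x))

def midpoint(g, a, b, n=100_000):
    """Plain midpoint-rule quadrature."""
    h = (b - a) / n
    return sum(g(a + (i + 0.5) * h) for i in range(n)) * h

# Brand-B style: "de-improper" the integral by nudging the limits inward.
eps = 1e-6
clipped = midpoint(f, eps, 1.0 - eps)

# Honest alternative: substitute x = sin^2(t), which turns the integrand
# into the constant 2 on [0, pi/2], with no singularities at all.
exact = midpoint(lambda t: 2.0, 0.0, math.pi / 2)

print(clipped, exact)  # clipped falls short of pi; exact nails it
```

Each clipped endpoint discards an area of roughly 2√ε, so the nudged version is systematically low, and how low depends entirely on how the nudging is done — exactly the sort of thing a 1-mark button-push question shouldn’t hinge on.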

There is a stupid way around this stupidity. The arc length can also be calculated in terms of the inverse function of f, which avoids the improperness, and then all is good. All is good, that is, except for the thousands of students who happen to have a Brand B calculator and who naively failed to consider that a crappy, 1-mark button-pushing question might require them to hunt for a Specialist-valid and B-compatible approach.

The idiocy of VCE exams is truly unlimited.

## Inverted Logic

The 2018 Northern Hemisphere Mathematical Methods exams (1 and 2) are out. We didn’t spot any Magritte-esque lunacy, which was a pleasant surprise. In general, the exam questions were merely trivial, clumsy, contrived, calculator-infested and loathsomely ugly. So, all in all not bad by VCAA standards.

There, was, however, one notable question. The final multiple choice question on Exam 2 reads as follows:

**Let f be a one-to-one differentiable function such that f(3) = 7, f(7) = 8, f′(3) = 2 and f′(7) = 3. The function g is differentiable and g(x) = f^{–1}(x) for all x. g′(7) is equal to …**

The wording is hilarious, at least it is if you’re not a frazzled Methods student in the midst of an exam, trying to make sense of such nonsense. Indeed, as we’ll see below, the question turned out to be too convoluted even for the examiners.

Of course *f*^{–1} is a perfectly fine and familiar name for the inverse of *f*. It takes a special cluelessness to imagine that renaming *f*^{–1} as *g* is somehow required or remotely helpful. The obfuscating wording, however, is the least of our concerns.

The exam question is intended to be a straight-forward application of the inverse function theorem. So, in Leibniz form dx/dy = 1/(dy/dx), though the exam question effectively requires the more explicit but less intuitive function form,
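
In function form, this is the standard identity (valid wherever the denominator is nonzero):

```latex
\left(f^{-1}\right)'(x) \;=\; \frac{1}{f'\!\left(f^{-1}(x)\right)} .
```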

The inverse function theorem is typically stated, and in particular the differentiability of *f*^{–1} can be concluded, with suitable hypotheses. In this regard, the exam question’s needless hypothesis that the function *g* is differentiable is somewhat artificial. However, it is not so simple in the school context to discuss natural hypotheses for the theorem. So, underlying the ridiculous phrasing is a reasonable enough question.

What, then, is the problem? The problem is that the inverse function theorem is not explicitly in the VCE curriculum. Really? Really.

Even ignoring the obvious issue this raises for the above exam question, the subliminal treatment of the inverse function theorem in VCE is absurd. One requires plenty of derivatives of inverses, even in a first calculus course. Yet, there is never any explicit mention of the theorem in either Specialist or Methods, not even a hint that there is a common question with a universal answer.

All that appears to be explicit in VCE, and more in Specialist than Methods, is application of the chain rule, case by isolated case. So, one assumes the differentiability of *f*^{–1} and then differentiates *f*^{–1}(f(x)) in Leibniz form. For example, in the most respected Methods text the derivative of *y* = log(*x*) is somewhat dodgily obtained using the chain rule from the (very dodgily obtained) derivative of *x* = e^{*y*}.
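
Spelled out, the textbook’s implicit Leibniz argument for the log example runs:

```latex
x = e^{y} \;\Longrightarrow\; \frac{dx}{dy} = e^{y} = x
\;\Longrightarrow\; \frac{dy}{dx} = \frac{1}{dx/dy} = \frac{1}{x} .
```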

It is all very implicit, very case-by-case, and *very* Leibniz. Which makes the above exam question effectively impossible.

How many students actually obtained the correct answer? We don’t know since the Examiners’ Report doesn’t actually report anything. Being a multiple choice question, though, students had a 1 in 5 chance of obtaining the correct answer by dumb luck. Or, sticking to the more plausible answers, maybe even a 1 in 3 or 1 in 2 chance. That seems to be how the examiners stumbled upon the correct answer.

The Report’s solution to the exam question reads as follows (as of September 20, 2018):

**f(3) = 7, f'(3) = 8, g(x) = f^{–1}(x), g'(x) = 1/2 since f'(x) × f'(y) = 1, g(x) = f'(x) = 1/f'(y).**

The awfulness displayed above is a wonder to behold. Even if it were correct, the suggested solution would still bear no resemblance to the Methods curriculum, and it would still be unreadable. And the answer is not close to correct.

To be fair, the Report warns that its sample answers are “not intended to be exemplary or complete”. So perhaps they just forgot the further warning, that their answers are also not intended to be correct or comprehensible.

It is abundantly clear that the VCAA is incapable of putting together a coherent curriculum, let alone one that is even minimally engaging. Apparently it is even too much to expect the examiners to be familiar with their own crappy curriculum, and to be able to examine it, and to report on it, fairly and accurately.

## VCAA Plays Dumb and Dumber

Late last year we posted on Madness in the 2017 VCE mathematics exams, on blatant errors above and beyond the exams’ predictably general clunkiness. For one (Northern Hemisphere) exam, the subsequent VCAA Report had already appeared; this Report was pretty useless in general, and specifically it was silent on the error and the surrounding mathematical crap. None of the other reports had yet appeared.

Now, finally, all the exam reports are out. God only knows why it took half a year, but at least they’re out. We have already posted on one particularly nasty piece of nitpicking nonsense, and now we can review the VCAA‘s own assessment of their five errors:

- Mathematical Methods Exam 1 contained an utter hash of a question, with a fundamentally misleading graph and a (for Methods) undefined endpoint derivative. The Report makes no mention of the error or the dodginess of the graph. Students scored an average of 5% on the question.
- Mathematical Methods Exam 2 contained, twice, a type of question for which previous years’ reports had suggested an invalid solution method. This year’s Report approaches the question validly the first time and then ducks the question with a dubious technique the second time. (More on this in a future post.) The Report makes no mention of the previously sanctioned invalid approach, or of whether that approach is still considered acceptable. Students scored an average of 66% on the first question (suggesting the invalid technique is secretly still acceptable) and 5% on the second.
- Specialist Mathematics Exam 1 contained a doubly improper integral, which cannot be done with Specialist techniques. The Report makes no comment on the error and, in light of the error, the Report provides no clue as to what was expected of the students, nor what was accepted from the students as correct. Students scored an average of 35% on the question.
- Further Mathematics Exam 1 included a multiple choice question with no correct answer. The Report makes no comment, simply indicating one answer (the trickiest answer to prove wrong) as correct, and that 59% of students gave the “correct” answer.
- Further Mathematics Exam 2 included an ill-posed question with erroneously transposed information. The Report makes no comment on the error. The Report does note that 72% of students received 0/1 for the question, with “many” students giving a 3 × 1 matrix rather than the correct 1 × 3, and presumably scoring 0 as a consequence. That’s pretty damn cute, given the examiners had already mixed up the rows and columns.

So, the VCAA responds to five blatant errors with five Trumpian silences. How should one describe such conduct? Unprofessional? Arrogant? Cowardly? VCAA-ish? All of the above?

## Little Steps for Little Minds

Here’s a quick but telling nugget of awfulness from Victoria’s 2017 VCE maths exams. Q9 of the first (non-calculator) Methods Exam is concerned with the function

In Part (b) of the question students are asked to show that “*the gradient of the tangent to the graph of f”* equals .

A normal human being would simply have asked for the derivative of f, but not much can go wrong, right? Expanding and differentiating, we have

Easy, and done.

So, how is it that 65% of Methods students scored 0 on this contrived but routine 1-point question? Did they choke on “the gradient of the tangent to the graph of f” and go on to hunt for a question written in English?

The Examiners’ Report pinpoints the issue, noting that the exam question “*required a step-by-step demonstration …*”. And, “*[w]hen answering ‘show that’ questions, students should include all steps to demonstrate exactly what was done*” (emphasis added). So the Report implies, for example, that our calculation above would have scored 0 because we didn’t explicitly include the step of obtaining a common denominator.

Jesus H. Christ.

Any suggestion that our calculation is an insufficient answer for a student in a senior maths class is pedagogical and mathematical lunacy. This is obvious, even ignoring the fact that Methods questions way too often are flawed and/or require the most fantastic of logical leaps. And, of course, the instruction that “all steps” be included is both meaningless and utterly mad, and the solution in the Examiners’ Report does nothing of the sort. (Exercise: Try to include all steps in the computation and simplification of f’.)

This is just one 1-point question, but such infantilising nonsense is endemic in Methods. The subject is saturated with pointlessly prissy language and infuriating, nano-step nitpicking, none of which bears the remotest resemblance to real mathematical thought or expression.

What is the message of such garbage? For the vast majority of students, who naively presume that an educational authority would have some expertise in education, the message is that mathematics is nothing but soulless bookkeeping, which should be avoided at all costs. For anyone who knows mathematics, however, the message is that Victorian maths education is in the clutches of a heartless and entirely clueless antimathematical institution.

## The Wild and Woolly West

So, much crap, so little time.

OK, after a long period of dealing with other stuff (shovelled on by the evil Mathologer), we’re back. There’s a big backlog, and in particular we’re working hard to find an ounce of sense in Gonski, Version N. But, first, there’s a competition to finalise, and an associated educational authority to whack.

It appears that no one pays any attention to Western Australian maths education. This, as we’ll see, is a good thing. (Alternatively, no one gave a stuff about the prize, in which case, fair enough.) So, congratulations to Number 8, who wins by default. We’ll be in touch.

A reminder, the competition was to point out the nonsense in Part 1 and Part 2 of the 2017 West Australian Mathematics Applications Exam. As with our previous challenge, this competition was inspired by one specifically awful question. The particular Applications question, however, should not distract from the Exam’s very general clunkiness. The entire Exam is amateurish, as one rabble rouser expressed it, plagued by clumsy mathematics and ambiguous phrasing.

The heavy lifting in the critique below is due to the semi-anonymous Charlie. So, a very big thanks to Charlie, specifically for his detailed remarks on the Exam, and more generally for not being willing to accept that a third rate exam is simply par for WA’s course. (Hello, Victorians? Anyone there? Hello?)

We’ll get to the singularly awful question, and the singularly awful formal response, below. First, however, we’ll provide a sample of some of the examiners’ lesser crimes. None of these other crimes are hanging offences, though some slapping wouldn’t go astray, and a couple questions probably warrant a whipping. We won’t go into much detail; clarification can be gained by referring to the Exam papers. We also don’t address the Exam as a whole in terms of the adequacy of its coverage of the Applications curriculum, though there are apparently significant issues in this regard.

**Question 1**, the phrasing is confusing in parts, as was noted by Number 8. It would have been worthwhile for the examiners to explicitly state that the first term *T*_{n} corresponds to *n* = 1. Also, when asking for the first term (i.e. the first *T*_{n}) less than 500, it would have helped to have specifically asked for the corresponding index *n* (which is naturally obtained as a first step), and then for *T*_{n}.

**Question 2(b)(ii)**, it is a little slack to claim that “an allocation of delivery drivers cannot be made yet”.

**Question 5** deals with a survey, a table of answers to a yes-or-no question. It grates to have the responses to the *question* recorded as “agree” or “disagree”. In part (b), students are asked to identify the explanatory variable; the answer, however, depends upon what one is seeking to explain.

**Question 6(a) **is utterly ridiculous. The choice for the student is either to embark upon a laborious and calculator-free and who-gives-a-damn process of guess-and-check-and-cross-your-fingers, or to solve the travelling salesman problem.
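
For what it’s worth, “solving the travelling salesman problem” at exam scale just means brute force over every ordering of the towns. A sketch in Python, with an invented distance matrix (not the exam’s network):

```python
from itertools import permutations

def shortest_tour(dist):
    """Brute-force TSP: try every ordering of the non-start cities."""
    n = len(dist)
    best_len, best_tour = float("inf"), None
    for perm in permutations(range(1, n)):
        tour = (0,) + perm + (0,)  # round trip starting and ending at city 0
        length = sum(dist[a][b] for a, b in zip(tour, tour[1:]))
        if length < best_len:
            best_len, best_tour = length, tour
    return best_len, best_tour

# Invented symmetric distances between 4 towns.
dist = [
    [0, 2, 9, 10],
    [2, 0, 6, 4],
    [9, 6, 0, 3],
    [10, 4, 3, 0],
]

print(shortest_tour(dist))  # (18, (0, 1, 3, 2, 0))
```

With n towns this is (n − 1)! tours, which is exactly why calculator-free guess-and-check is such a miserable ask.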

**Question 8(b)** is clumsily and critically ambiguous, since it is not stated whether the payments are to be made at the beginning or the end of each quarter.

**Question 10** involves some pretty clunky modelling. In particular, starting with 400 bacteria in a dish is out by an order of magnitude, or six.

**Question 11(d)** is worded appallingly. We are told that one of two projects will require an extra three hours to complete. Then we have to *choose* which project “for the completion time to be at a minimum”. Yes, one can make sense of the question, but it requires a monster of an effort.

**Question 14** is fundamentally ambiguous, in the same manner as Question 8(b); it is not indicated whether the repayments are to be made at the beginning or end of each period.
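
The beginning-versus-end ambiguity in Questions 8(b) and 14 is not a nitpick: the two readings give genuinely different answers. A sketch with invented figures (any level-payment question behaves the same way):

```python
def annuity_fv(payment, rate, periods, due=False):
    """Future value of level payments; due=True means payments at the start of each period."""
    fv = 0.0
    for _ in range(periods):
        if due:
            fv = (fv + payment) * (1 + rate)  # payment earns interest this period
        else:
            fv = fv * (1 + rate) + payment    # payment arrives after interest
    return fv

# Invented example: $1000 per quarter at 2% per quarter for 8 quarters.
end = annuity_fv(1000, 0.02, 8)              # payments at period end
start = annuity_fv(1000, 0.02, 8, due=True)  # payments at period start

print(round(end, 2), round(start, 2))
```

The two figures differ by a full period’s interest, so a student who guesses the wrong convention loses the marks through no fault of their own.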

That was good fun, especially the slapping. But now it’s time for the main event:

# QUESTION 3

Question 3(a) concerns a planar graph with five faces and five vertices, *A*, *B*, *C*, *D* and *E*:

- (i) Determine the number of edges of the graph.
- (ii) Draw the planar graph.
- (iii) Determine a Hamiltonian cycle for the graph.
- (iv) Determine whether the graph is Eulerian, semi-Eulerian or neither.

What is wrong with this question? As evinced by the variety of graphs that satisfy the given conditions, pretty much everything.

As pointed out by Number 8, Part (i) can only be answered (by Euler’s formula) if the graph is assumed to be connected. In Part (ii), it is weird and it turns out to be seriously misleading to refer to “the” planar graph. Next, the Hamiltonian cycle requested in Part (iii) is only guaranteed to exist if the graph is assumed to be both connected and simple. Finally, in Part (iv) *any* answer is possible, and the answer is not uniquely determined even if we restrict to simple connected graphs.

It is evident that the entire question is a mess. Most of the question, though not Part (iv), is rescued by assuming that any graph should be connected and simple. There is also no reason, however, why students should feel free or obliged to make that assumption. Moreover, any such reading of 3(a) would implicitly conflict with 3(b), which explicitly refers to a “simple connected graph” three times.
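
The non-uniqueness is easy to exhibit. Both graphs below are our own (not the examiners’): each is K5 with two edges deleted, hence simple, connected and planar, with 5 vertices and 8 edges and therefore 5 faces by Euler’s formula; yet one is semi-Eulerian and the other is neither. A sketch in Python:

```python
from collections import Counter, deque

def classify(edges):
    """Return 'Eulerian', 'semi-Eulerian' or 'neither' for a connected graph."""
    deg = Counter()
    adj = {}
    for u, v in edges:
        deg[u] += 1
        deg[v] += 1
        adj.setdefault(u, set()).add(v)
        adj.setdefault(v, set()).add(u)
    # Connectivity check via BFS (Eulerian-ness only makes sense if connected).
    start = next(iter(adj))
    seen, queue = {start}, deque([start])
    while queue:
        for w in adj[queue.popleft()] - seen:
            seen.add(w)
            queue.append(w)
    assert seen == set(adj), "graph must be connected"
    odd = sum(1 for d in deg.values() if d % 2)
    return {0: "Eulerian", 2: "semi-Eulerian"}.get(odd, "neither")

# All 10 edges of K5 on vertices A..E.
K5 = [(a, b) for i, a in enumerate("ABCDE") for b in "ABCDE"[i + 1:]]

g1 = [e for e in K5 if e not in {("B", "C"), ("C", "D")}]  # delete two adjacent edges
g2 = [e for e in K5 if e not in {("B", "C"), ("D", "E")}]  # delete two disjoint edges

for g in (g1, g2):
    assert len(g) == 8  # so F = E - V + 2 = 5 faces, as the question demands

print(classify(g1), classify(g2))  # semi-Eulerian neither
```

Two graphs meeting every stated condition, two different answers to Part (iv): the question simply has no determined answer.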

So, how has WA’s Schools Curriculum and Standards Authority subsequently addressed their mess? This is where things get ridiculous, and seriously annoying. The only publicly available document discussing the Exam is the summary report, which is an accomplished exercise in saying nothing. Specifically, this report makes no mention of the many issues with the Exam. More generally, the summary report says little of substance or of interest to anyone, amounting to little more than admin box-ticking.

The first document that addresses Question 3 in detail is the non-public graders’ Marking Key. The Key begins with the declaration that it is “an *explicit* statement about [sic] what the examining panel expect of candidates when they respond to particular examination items.” [emphasis added].

What, then, are the explicit expectations in the Marking Key for Question 3(a)? In Part (i) Euler’s formula is applied without comment. For Part (ii) a sample graph is drawn, which happens to be simple, connected and semi-Eulerian; no indication is given that other, fundamentally different graphs are also possible. For Part (iii), a Hamiltonian cycle is indicated for the sample graph, with no indication that non-Hamiltonian graphs are also possible. In Part (iv), it is declared that “the” graph is semi-Eulerian, with no indication that the graph may be non-Eulerian (even if simple and connected) or Eulerian.

In summary, the Marking Key makes not a single mention of graphs being simple or connected, nor what can happen if they are not. If the writers of the Key were properly aware of these issues they have given no such indication. The Key merely confirms and compounds the errors in the Exam.

Question 3 is also addressed, absurdly, in the non-public Examination Report. The Report notes that Question 3(a) failed to explicitly state “the” graph was assumed to be connected, but that “candidates made this assumption [but not the assumption of simplicity?]; particularly as they were required to determine a Hamiltonian cycle for the graph in part (iii)”. That’s it.

Well, yes, it’s obviously the students’ responsibility to look ahead at later parts of a question to determine what they should assume in earlier parts. Moreover, if they do so, they may, unlike the examiners, make proper and sufficient assumptions. Moreover, they may observe that no such assumptions are sufficient for the final part of the question.

Of course what almost certainly happened is that the students constructed the simplest graph they could, which in the vast majority of cases would have been simple and connected and Hamiltonian. But we simply cannot tell how many students were puzzled, or for how long, or whether they had to start from scratch after drawing a “wrong” graph.

In any case, the presumed fact that most (but not all) students were unaffected does not alter the other facts: that the examiners bollocksed the question; that they then bollocksed the Marking Key; that they then bollocksed the explanation of both. And, that SCSA‘s disingenuous and incompetent ass-covering is conveniently hidden from public view.

The SCSA is not the most dishonest or inept educational authority in Australia, and their Applications Exam is not the worst of 2017. But one has to hand it to them, they’ve given it the old college try.

## Polynomially Perverse

What, with its stupid curricula, stupid texts and really monumentally stupid exams, it’s difficult to imagine a wealthy Western country with worse mathematics education than Australia. Which is why God gave us New Zealand.

Earlier this year we wrote about the first question on New Zealand’s 2016 Level 1 algebra exam:

*A rectangle has an area of x² + 5x – 36. What are the lengths of the sides of the rectangle in terms of x?*

Obviously, the expectation was for the students to declare the side lengths to be the linear factors *x* – 4 and *x* + 9, and just as obviously this is mathematical crap. (Just to hammer the point, set *x* = 5, giving an area of 14, and think about what the side lengths “must” be.)
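To make the objection concrete, here is the arithmetic as a tiny Python check. (The quadratic here is the one implied by the stated factors; the expansion is ours, not quoted from the exam.)

```python
# Numeric check of the complaint above. The product of the "expected" factors
# (x - 4) and (x + 9) expands to x^2 + 5x - 36; at x = 5 the area is 14.
x = 5
area = (x - 4) * (x + 9)   # expands to x^2 + 5x - 36
print(area)                # 14
print(x - 4, x + 9)        # 1 14: the side lengths the factoring "forces"
# But nothing forces a rectangle of area 14 to be 1 x 14; it could just as
# well be 2 x 7, or sqrt(14) x sqrt(14), or infinitely many other shapes.
print(2 * 7)               # 14 as well
```

The factorisation tells you one pair of expressions whose product is the area; it says nothing about which rectangle you actually have.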

One might hope that, having inflicted this mathematical garbage on a nation of students, the New Zealand Qualifications Authority would have been gently slapped around by a mathematician or two, and that the error would not be repeated. One might hope this, but, in these idiot times, it would be very foolish to expect it.

A few weeks ago, New Zealand maths education was in the news (again). There was lots of whining about “disastrous” exams, with “impossible” questions, culminating in a pompous petition, and ministerial strutting and general hand-wringing. Most of the complaints, however, appear to be pretty trivial; sure, the exams were clunky in certain ways, but nothing that we could find was overly awful, and nothing that warranted the subsequent calls for blood.

What makes this recent whining so funny is the comparison with the deafening silence in September. That’s when the 2017 Level 1 Algebra Exams appeared, containing the exact same rectangle crap as in 2016 (Question 3(a)(i) and Question 2(a)(i)). And, as in 2016, there is no evidence that anyone in New Zealand had the slightest concern.

People like to make fun of all the sheep in New Zealand, but there’s many more sheep there than anyone suspects.

**UPDATE (04/02/19):** An Oxford school text joins in the fun.

## The Treachery of Images

*Harry scowled at a picture of a French girl in a bikini. Fred nudged Harry, man-to-man. “Like that, Harry?” he asked.*

*“Like what?”*

*“The girl there.”*

*“That’s not a girl. That’s a piece of paper.”*

*“Looks like a girl to me.” Fred Rosewater leered.*

*“Then you’re easily fooled,” said Harry. “It’s done with ink on a piece of paper. That girl isn’t lying there on the counter. She’s thousands of miles away, doesn’t even know we’re alive. If this was a real girl, all I’d have to do for a living would be to stay at home and cut out pictures of big fish.”*

Kurt Vonnegut, *God Bless You, Mr. Rosewater*

It is fundamental to be able to distinguish appearance from reality. That it is very easy to confuse the two is famously illustrated by Magritte’s *The Treachery of Images* (*La Trahison des Images*):

The danger of such confusion is all the greater in mathematics. Mathematical images, graphs and the like, have intuitive appeal, but these images are mere illustrations of deep and easily muddied ideas. The danger of focussing upon the image, with the ideas relegated to the shadows, is a fundamental reason why the current emphasis on calculators and graphical software is so misguided and so insidious.

Which brings us, once again, to Mathematical Methods. Question 5 on Section Two of the second 2015 Methods exam is concerned with the function , where

Here, is a constant, with initially; students are asked to find the minimum (which occurs at ), and to graph . All this is par for the course: a reasonable calculus problem thoroughly trivialised by CAS calculators. Predictably, things get worse.

In part (c)(i) of the problem students are asked to find “the set of possible values of ” for which the minimum of occurs at . (Part (c)(ii) similarly, and thus boringly and pointlessly, asks for which the minimum occurs at ). Arguably, the set of *possible* values of is , which of course is not what was intended; the qualification “possible” is just annoying verbiage, in which the examiners excel.

So, on to considering what the students were expected to have done for (c)(i), a 2-mark question, equating to three minutes. The Examiners’ Report pointedly remarks that “[a]dequate working must be shown for questions worth more than one mark.” What, then, constituted “adequate working” for 5(c)(i)? The Examiners’ solution consists of first setting and solving to give , and then … well, nothing. Without further comment, the examiners magically conclude that the answer to (c)(i) is .

Only in the Carrollian world of Methods could the examiners’ doodles be regarded as a summary of or a signpost to any adequate solution. In truth, the examiners have offered no more than a mathematical invocation, barely relevant to the question at hand: why should having a stationary point at for have any bearing on for other values of ? The reader is invited to attempt a proper and substantially complete solution, and to measure how long it takes. Best of luck completing it within three minutes, and feel free to indicate how you went in the comments.
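To get a feel for what a proper solution actually demands, here is a numeric sketch using a stand-in family f_a(x) = x³ − 3ax on [0, 2] (the exam's actual function is not reproduced here, so this family and the target interval (0, 1] are our own illustrative choices). Solving f_a′(1) = 0 yields only the single value a = 1; determining *every* a for which the minimum lies in (0, 1] requires tracking where the minimiser actually falls as a varies.

```python
# Stand-in family: f_a(x) = x^3 - 3*a*x on [0, 2].
# f_a'(x) = 3x^2 - 3a, so f_a'(1) = 0 pins down only a = 1.
# To find ALL a whose minimum lies in (0, 1], brute-force the minimiser.

def f(a, x):
    return x**3 - 3 * a * x

def argmin_on_grid(a, lo=0.0, hi=2.0, n=2001):
    """Grid minimiser of f(a, .) on [lo, hi]."""
    xs = [lo + (hi - lo) * i / (n - 1) for i in range(n)]
    return min(xs, key=lambda x: f(a, x))

# Scan a from -1.00 to 2.00 in steps of 0.01; keep those a whose
# minimiser lands in the target interval (0, 1].
good = [a / 100 for a in range(-100, 201)
        if 0 < argmin_on_grid(a / 100) <= 1]
print(min(good), max(good))  # 0.01 1.0, i.e. the set is (approximately) 0 < a <= 1
```

The stand-in makes the examiners' leap visible: f_a′(1) = 0 supplies only the boundary value a = 1; the full answer 0 < a ≤ 1 needs a further argument (or, as here, a brute check) that the minimiser x = √a moves monotonically with a and stays inside the interval.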

It is evident that the vast majority of students couldn’t make heads or tails of the question, which says more for them than for the examiners. Apparently about half the students solved and included in some form in their answer, earning them one mark. Very few students got further; 4% of students received full marks on the question (and similarly on (c)(ii)).

What did the examiners actually hope for? It is pretty clear that what students were expected to do, and the most that students could conceivably do in the allotted time, was: solve (i.e. press SOLVE on the machine); then, look at the graphs (on the machine) for two or three values of ; then, simply presume that the graphs of for all are sufficiently predictable to “conclude” that is the largest value of for which the (unique) turning point of lies in . If it is not immediately obvious that any such approach is mathematical nonsense, the reader is invited to answer (c)(i) for the function where .

Once upon a time, Victorian Year 12 students were taught mathematics, were taught to *prove* things. Now, they’re taught to push buttons and to gaze admiringly at pictures of big fish.