This is our post for discussion of the 2021 NHT Mathematical Methods exams (here and here). See also the accompanying 2021 NHT Specialist exams post, here. See also the relevant Witches: here, here and here.

UPDATE (22/09/21) The exam reports are now out, here and here (VCAA-Word-Stupid). We’ve added a couple of comments below, in green.

We’ve looked a little more closely at VCAA’s Draft for the new mathematics VCE subjects. Yes, the time for feedback has ended, unless it hasn’t: the MAV are offering a Zoom session TODAY (Thursday 25/3) for members. God knows how or why. But in any case, it’ll be a while before VCAA cements the thing in place: plenty of time to ignore everyone’s suggestions.

The following are our thoughts on the Draft and Overview. They will be brief and disorganised, since there is no point in doing more; as we wrote, the content doesn’t matter as much as the fact that, whatever the content, VCAA will undoubtedly screw it up. Still, there are some clear and depressing points to be made. We haven’t paid much specific attention to what is new nonsense and what is the same old nonsense; nonsense is nonsense.

GENERAL POINTS ON THE DRAFT

The draft looks like a primary school book report. Someone at VCAA really should learn .

“Computational Thinking” is meaningless buzzery, and will be endemic, insidious and idiotic. It will poison everything. Every step of Methods and Specialist is subject to the scrutiny of Outcome 3:

“On completion of this unit the student should be able to apply computational thinking and use numerical, graphical, symbolic and statistical functionalities of technology to develop mathematical ideas, produce results and carry out analysis in practical situations requiring problem-solving, modelling or investigative techniques or approaches.”

“Statistical functionalities of technology”. And, there’s way more:

“key elements of algorithm design: sequencing, decision-making, repetition, and representation including the use of pseudocode.”

“use computational thinking, algorithms, models and simulations to solve problems related to a given context”

“the role of developing algorithms and expressing these through pseudocode to help determine and understand mathematical ideas and results”

“the purpose and effect of sequencing, decision-making and repetition statements on relevant functionalities of technology, and their role in the design of algorithms and simulations”

“design and implement simulations and algorithms using appropriate functionalities of technology”

This will all be the same aimless, pseudo-exploratory, CAS-drenched garbage that currently screws VCE, but much, much worse. Anybody who signs off on this idiocy should hang their head in shame.

CAS shit will now be worse than ever.

There should be no CAS exam, at all.

There should be no bound notes permitted in any exam.

Don’t write “technology”. It is pompous and meaningless. If you mean “CAS” then write “CAS”.

SACs have always been shit and will always be shit. The increased weight on them is insane.

The statistics is the same pointless bullshit it always was.

The presence of “proof” as a topic in Specialist highlights the anti-mathematical insanity of VCAA and ACARA curricula: proof has zero existence elsewhere. Much of what appears in the proof topic could naturally and engagingly and productively be taught at much lower levels. But of course, that would get in the way of VCAA’s constructivist fantasy, now with New and Improved Computational Thinking.

MATHEMATICAL METHODS

Not including integration by substitution is still and will always be the most stupid aspect of Methods.

Dilations must be understood as expressed both “parallel to an axis” and “from an axis”? But not in terms of the direction the damn points are moving? Cute.

The definition of independent events is wrong.

The demand that, for the composition f(g(x)), the range of g must be a subset of the domain of f is as pedantic and as pointless as ever.

“literal equations” is the kind of blather that only a maths ed clown could think has value.

The derivative of the inverse is still not in the syllabus, and everyone will still cheat and use it anyway.

“trapezium rule” is gauche but, more importantly, what is the purpose of teaching such integral approximation here? Yes, one can imagine a reasonable purpose, but we’ll lay odds there is no such purpose here.
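For what it’s worth, the rule itself is a few lines. Here is a minimal sketch; the function names and the test integrand are ours, not anything from the Draft:

```python
def trapezium(f, a, b, n):
    """Approximate the integral of f on [a, b] using n trapezia."""
    h = (b - a) / n
    # Endpoints get weight 1/2, interior sample points get weight 1.
    total = (f(a) + f(b)) / 2
    for i in range(1, n):
        total += f(a + i * h)
    return total * h

# Example: integrate x^2 on [0, 3]; the exact value is 9.
approx = trapezium(lambda x: x * x, 0, 3, 1000)
```

Whether anything this concrete is intended, or whether it will just be a CAS button, is anyone’s guess.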

SPECIALIST MATHEMATICS

The killing of mechanics is a crime.

The inclusion of logic and proof and the discrete topics could be good. But it won’t be. It will be shallow and formulaic and algorithmised, and graded in a painfully pedantic manner. Just imagine, for example, how mathematical induction will be assessed on exams: “Students often wrote instead of . Students should be aware of the proper use of these variables.”

There is no value here in “proof by contrapositive”, and it is confusing. Proof by contradiction suffices.

They’re really including integration by parts? Incredible.

The inclusion of cross products and plane equations makes some sense.

Each question was (arguably) last year’s most difficult exam question on the most difficult mathematics subject in that state. Each question was effectively allocated just under 20 minutes to complete (11/100 x 180 and 13/80 x 120).

Now, you must choose: which question is better, in any sense of the word “better”?

NSW (Formula Marking guide and sample solution are here.)

VIC (Briefly discussed here, marking guide and sample solution are in your dreams.)

We’ve been remiss in not writing further on VCAA’s draft for the new mathematics VCE subjects. It’s just, for reasons we’ll explain briefly here and flesh out elsewhere, we’ve struggled to face up to this new nonsense.

But, feedback is due TODAY (midnight? – see links below), and we really oughta say something. So, here are our brief thoughts and then, after that, why we believe none of it really matters:

“Computational thinking and algorithms” is pure snake oil. Inevitably, it will be nothing but wafer-thin twaddle for the training of data monkeys.

The increased weight on these meaningless, revolting SACs is insidious.

If we read it correctly, more weight will be placed on the non-CAS Methods/Specialist exams; it is not remotely close to enough, but it is good.

Statistics was and is and will always be an insane topic to emphasise in school.

The deletion of mechanics from Specialist Mathematics is criminal, but the topic had already been so bled to meaninglessness that it hardly matters.

In principle, the inclusion in SM of “logic” and “proof and number” and “combinatorics” is a good thing. We’ll see.

Similarly, in principle the making of SM12 presumed knowledge for SM34 is good; in practice, it is almost certainly bad. Currently, a good teacher at a good school will take the freedom in SM12 to go to town, to show their students some genuine mathematics and real mathematical thought. In the future, that will be close to impossible, and SM12 will likely become as predictable and as dull as MM12 (and MM34 and SM34).

And now, why doesn’t any of it matter? Because, fundamentally it doesn’t matter what you teach, it matters how you teach. What matters is the manner in which you approach your subject and your students, and none of that will change in other than a microscopic manner. Nothing in VCAA has changed, nothing in the general culture of Victorian education has changed. So, why the Hell would twiddling a few dials on utterly insane subjects assessed in an utterly insane manner make any meaningful difference?

Everything VCAA touches, they will turn to shit. That will continue to be true until there is a fundamental cultural shift, in VCAA and generally.

I hate this place.

LINKS

The current (pre-COVID) study design (pdf) is here.

The draft for the new study design (word) is here.

UPDATE (11/09/21) The examination report is here (Word-VCAA-stupid). Corresponding updates are included with the associated question, in green; see also here, here and here.

UPDATE (09/09/21) The examination report is here (a Word document, because VCAA is stupid). Corresponding updates, including the noting of blatant errors (MCQ20, and see here, and Q5), are included with the associated question, in green.

UPDATE (21/11/20) A link to a parent complaining about the Methods Exam 2 on 774 is here.

UPDATE (24/11/20 – Corrected) A link to VCAA apparently pleading guilty to a CAS screw-up (from 2010) is here. (Sorry, my goof to not check the link, and thanks to Worm and John Friend.)

UPDATE (05/12/2020)

We’ve now gone through the multiple choice component of the exam, and we’ve read the comments below. In general the questions seemed pretty standard and ok, with way too much CAS and other predictable irritants. A few questions were a bit weird, generally to good effect, although one struck us as off-the-planet weird.

Here are our question-by-question thoughts:

MCQ1. A trivial composition of functions question.

MCQ2. A simple remainder theorem question.

(09/09/21) 56% got this correct, which ain’t great.

MCQ3. A simple antidifferentiation question, although the 2x under the root sign will probably trick more than a few students.

(09/09/21) 86%, so students (or, more accurately, their CASes) weren’t tricked.

MCQ4. A routine trig question made ridiculous in the standard manner. Why the hell write the solutions to other than in the form ?

MCQ5. A trivial asymptotes question.

MCQ6. A standard and easy graph of the derivative question.

MCQ7. A nice chain rule question. It’s easy, but we’re guessing plenty of students will screw it up.

MCQ8. A routine and routinely depressing binomial CAS question.

(09/09/21) 50%, for God knows what reason.

MCQ9. A routine transformation of an integral question. Pretty easy, with John Friend’s gaming of the question or otherwise, but these questions seem to cause problems.

(09/09/21) 35%, for Christ’s sake. The examination report contains pointless gymnastics, and presumably students tried this and fell off the beam. (The function is squished by a factor of 2, so the area is halved. Done.)
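The squishing argument can be sanity-checked with any stand-in function (the exam’s actual integrand isn’t reproduced here): squishing horizontally by a factor of 2 halves the area.

```python
def riemann(f, a, b, n=100000):
    """Crude midpoint Riemann sum for the integral of f on [a, b]."""
    h = (b - a) / n
    return sum(f(a + (i + 0.5) * h) for i in range(n)) * h

f = lambda x: x ** 2 + 1      # stand-in for the exam's function
g = lambda x: f(2 * x)        # horizontal squish by a factor of 2

area_f = riemann(f, 0, 2)     # area under f on [0, 2]
area_g = riemann(g, 0, 1)     # area under g on the squished interval [0, 1]
```

No gymnastics required.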

MCQ10. An unusual but OK logarithms question. It’s easy, but the non-standardness will probably confuse a number of students.

MCQ11. A standard Z distribution question.

MCQ12. A pretty easy but nice trigonometry and clock hands question.

MCQ13. The mandatory idiotic matrix transformation question, made especially idiotic by the eccentric form of the answers.

(09/09/21) 25%, obviously a direct consequence of VCAA’s idiotically cute form of answer.

MCQ14. Another standard Z distribution question: do we really need two of these? This one has a strangely large number of decimal places in the answers, the last of which appears to be incorrect.

(09/09/21) In the General comments, the examination report admonishes students for poor decimalising:

“Do not round too early …”

It seems their idea is actually “do not round at all”.

MCQ15. A nice average value of a function question. It can be done very quickly by first raising and then lowering the function by units.

(09/09/21) 36%, presumably because everyone attempted the thoughtless and painfully slow approach indicated in the examination report.

MCQ16. A routine max-min question, which would be nice in a CAS-free world.

(09/09/21) 53%, for God knows what reason. The question is simple to do in one’s head.

MCQ17. A really weird max-min question. The problem is to find the maximum vertical intercept of a tangent to . It is trivial if one uses the convexity, but that is far from trivial to think of. Presumably some Stupid CAS Trick will also work.

(09/09/21) 42%, which is not surprising. The examination report gives absolutely no clue why the maximising tangent should be at x = 0.
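For the record, the missing justification is two lines of calculus; we take a generic twice-differentiable convex f, since the exam’s function isn’t reproduced here:

```latex
% Tangent to f at x = t:
y = f(t) + f'(t)(x - t)
% Vertical intercept (set x = 0):
c(t) = f(t) - t\,f'(t)
% Differentiate:
c'(t) = f'(t) - f'(t) - t\,f''(t) = -t\,f''(t)
% So if f'' > 0 everywhere, c'(t) > 0 for t < 0 and c'(t) < 0 for t > 0,
% and the vertical intercept is maximised by the tangent at t = 0.
```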

MCQ18. A somewhat tangly range of a function question. A reasonable question, and not hard if you’re guided by a graph, but we suspect students won’t do the question that well.

MCQ19. A peculiar and not very good “probability function” question. In principle the question is trivial, but it’s made difficult by the weirdness, which outweighs the minor point of the question.

(09/09/21) 15%, which is worse than throwing darts. Although the darts would be better saved to throw at the writers of this stupid question.

MCQ20. All we can think is the writers dropped some acid. See here.

(09/09/21) 18%. Now, what did we do with those darts? As follows from the discussion here, the suggested solution in the examination report is fundamentally invalid. One simply cannot conclude that a = 2π in the manner indicated, which means that the question is at minimum a nightmare, and is best described as wrong. Either the report writers do not know what they are writing about, or they are consciously lying to avoid admitting the question is screwed. As to which of the two it is, we dunno. Maybe throw a dart.

UPDATE (06/12/2020)

And, we’re finally done, thank God. We’ve gone through Section B of the exam and read the comments below, and we’re ready to add our thoughts.

This update will be pretty brief. Section B of Methods Exam 2 is typically the Elephant Man of VCE mathematics, and this year is no exception. The questions are long and painful and aimless and ridiculous and CAS-drenched, just as they always are. There’s not much point in saying anything but “No”.

Here are our question-by-question thoughts:

Q1. What could be a nice question about the region trapped between two functions becomes pointless CAS shit. Finding “the minimum value of the graph of ” is pretty weird wording. The sequence of transformations asked for in (d) is not unique, which is OK, as long as the graders recognise this. (Textbooks seem to typically get this wrong.)

Q2. Yet another fucking trig-shaped river. The subscripts are unnecessary and irritating.

Q3. Ridiculous modelling of delivery companies, with clumsy wording throughout. Jesus, at least give the companies names, so we don’t have to read “rival transport company” ten times. And, yet again with the independence:

“Assume that whether each delivery is on time or earlier is independent of other deliveries.”

Q4. Aimless trapping of area between a function and line segments.

Q5. The most (only) interesting question, concerning tangents of , but massively glitchy and poorly worded, and it’s still CAS shit. The use of subscripts is needless and irritating. More Fantasyland computation, calculating in part (a), and then considering the existence of in part (b). According to the commenters, part (d)(ii) screws up on a Casio. Part (e) could win the Bulwer-Lytton contest:

“Find the values of for which the graphs of and ,
where exists, are parallel and where “

We have no clue what was intended for part (g), a 1-marker asking students to “find” which values of w result in p having a tangent at some x = t with x-intercept at -t. We can’t even see the Magritte for this one; is it just intended for students to guess? Part (h) is a needless transformation question, needlessly in matrix form, which is really the perfect way to end.

(09/09/21) This could be a horror movie: Revenge of the 1-Pointers.

Part (c) was clearly intended to be an easy 1-pointer: just note that a (tangent) line without an x-intercept is horizontal. But, somehow 77% of students stuffed it up. So, how? The examination report sermonises, thusly:

“The concept of the ‘nature of a tangent line’ was not obvious for many students.”

This suggests that the report writers don’t understand the concept of a concept. It also indicates that the report writers didn’t read their own exam. Q5(c) reads as follows:

“State the nature of the graph of g_{a} when b does not exist. [emphasis added]”

Three sub-questions ago, g_{a} is defined to be the tangent to a function, and b is defined to be the x-intercept of this tangent (even though it may not exist). So, what was obviously not obvious to the students was the meaning of a vaguely worded question framed in poor notation. God, these people are dumb. And sanctimonious. And dumb.

Part (g), another 1-pointer, is concerned with the function p(x) = x^{3} + wx. The question asks, badly, for which values of w is there a positive t such that the tangent to p at x = t will have x-intercept at -t. 3% of students were smart enough (or lucky enough) to get the right answer, and 97% of students were very smart enough to go “Fuck this for a joke”, and skip it.

The solution in the examination report is lazily incomprehensible, but the idea was to just do the work: equate p'(t) with the rise/run slope of the tangent and see for which w there is a positive t that solves the equation. It turns out that the equation simplifies to w + 5t^{2} = 0, and so as long as w < 0 there will be a solution. It also turns out that specifying t positive is entirely irrelevant. Which is what they do.
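The computation is easy to sanity-check. Taking w = -5 (our choice, purely for checking), the equation w + 5t^{2} = 0 gives t = 1, and the tangent there should have x-intercept at -1:

```python
w, t = -5, 1                  # chosen so that w + 5*t**2 == 0

def p(x):
    return x ** 3 + w * x

def p_dash(x):                # derivative of p
    return 3 * x ** 2 + w

# Tangent to p at x = t:  y = p(t) + p'(t)(x - t).  Its x-intercept:
x_intercept = t - p(t) / p_dash(t)
```

Here p(1) = -4 and p'(1) = -2, giving x-intercept 1 - (-4)/(-2) = -1 = -t, as claimed.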

It is worth noting the question can also be nutted out qualitatively, if one knows what graphs like y = x^{3} + x and y = x^{3} – x look like. If w ≥ 0 then it is easy to see the tangent to p(x) = x^{3} + wx will always hit the x-axis before crossing the y-axis, so no chance of giving a solution for the exam question. If w < 0 then look at tangents at points t between the two turning points. At a turning point the slope of the tangent is horizontal. Then as the slope goes negative the x-intercept of the tangent comes in from ∞ (or -∞), until eventually the x-intercept is 0. Somewhere along the way, there has to be a t that gives a solution for the exam question.

Part (h), another 1-pointer, is the final question on the exam, and is a stuff up. This time, 98% of students were very smart enough to skip the damn thing. As well, as mystery student PURJ pointed out to us, the examination report is also stuffed. The question again concerns the function p. It had been noted earlier that the tangents to p at t and -t are always parallel. Then Part (h) asks, in idiotic matrix notation, how p can not be translated or dilated so that the new function still has this parallel tangent property. Yep, the question is framed in a negative manner, and the examiners whine about it:

“The key word in this part is restrictions.”

Nope. The key word in this part is “idiotic”. (And, “ungrammatical”.) In any case, the examination report indicates that the key to solving the question is that the transformed function must still be odd. This, as PURJ pointed out, is wrong. A vertical translation is just fine but the resulting function will not be odd. The correct characterisation is that the derivative of the transformed function must be even. In sum, this means one can transform as one wishes, except for horizontal translations. Which equates to the report’s matrix answer, without any noting of the “odd”, but not odd, contradiction.
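PURJ’s point is easy to illustrate numerically (the specific cubic and translation here are ours): vertically translating p gives a function that is not odd, but its derivative is still even, so the tangents at t and -t stay parallel.

```python
def q(x):
    # p(x) = x**3 - 5*x, vertically translated up by 7
    return x ** 3 - 5 * x + 7

def q_dash(x):
    # derivative of q: 3x^2 - 5, an even function
    return 3 * x ** 2 - 5

t = 1.3
not_odd = q(-t) != -q(t)              # q is not odd ...
parallel = q_dash(t) == q_dash(-t)    # ... but tangents at t and -t are parallel
```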

OK, we should have thought of this earlier. This post is for teachers and students (and fellow travellers) to discuss Methods Exam 1, which was held a few days ago. (There are also posts for Methods Exam 2, Specialist Exam 1 and Specialist Exam 2. We had thought of also putting up posts for Further, but decided to stick to mathematics.) We’ll also update with our brief thoughts in the near future.

Our apologies to those without access to the exam, and unfortunately VCAA is only scheduled to post the 2020 VCE exams sometime in 2023. The VCAA also has a habit of being dickish about copyright (and in general), so we won’t post the exam or reddit-ish links here. If, however, a particular question or two prompts sufficient discussion, we’ll post those questions. And, we might allow (undisplayed) links to the exams stay in the comments.

UPDATE (07/09/21) We’ve finally had a chance to look at the examination report (a Word document, because it’s VCAA). Anything we’ve considered worth remarking upon we’ve included as an update to the associated question, in green.

UPDATE (31/12/20) The exam, and all the exams, are now online. (Thanks to Red Five for the flag.)

UPDATE (21/11/20) The link to the parent complaining about the Methods Exam 1 on 3AW is here. If you see any other media commentary, please note that in a comment (or email me), and we’ll add a link.

UPDATE (23/11/20) OK, we’ve now gone through the first Methods exam quickly but pretty thoroughly, have had thoughts forwarded by commenters Red Five and John Friend, and have pondered the discussion below. Question by question, we didn’t find the exam too bad, although we didn’t look to judge length and coverage of the curriculum. There was a little Magritteishness but we didn’t spot any blatant errors, and the questions in general seemed reasonable enough (given the curriculum, and see here). Here are our brief thoughts on each question, with no warranty for fairness or accuracy. Again, apologies to those without access to the exam.

Q1. Standard and simple differentiation.

Q2. A “production goal” having the probability of requiring an oil change be m/(m+n) … This real-world scenarioising is, of course, idiotic. The intrinsic probability questions being asked are pretty trivial, indeed so trivial and same-ish that we imagine many students will be tricked. It’s not helped by a weird use of “State” in part (a), and a really weird and gratuitous use of “given” in part (b), for a not-conditional probability question.

(07/09/21) On Part (b), the examination report states that

“Students generally recognised the conditional probability.”

Given there is no conditional probability in the question, we have absolutely no idea what they’re talking about.

Q3. An OK question on the function tan(ax+b). Stating “the graph is continuous” is tone-deaf and, given they’ve drawn the damn thing, a little weird. The information a > 0 and 0 < b < 1 should have been provided when defining the function, not as part of the eventual question. Could someone please send the VCAA guys a copy of Strunk and White, or Fowler, or Gowers, or Dr. Seuss?

(07/09/21) The average score on Part (b) was 0.5/2, the examination report noting

[Students] need to be aware that simply quoting a rule or formula is not sufficient; they are required to demonstrate how it is used within the context of the question (i.e. in this case, give evaluations of Pr(X=2) and Pr(X≥1)). Many students did not present their answer in the required form.

This smells of lunacy. Putting aside VCAA’s demand of an idiotic and non-unique form for the answer, if a student works through such a question to get a correct answer then they obviously know exactly what they’re doing. Unlike VCAA examiners.

Q6. An OK graphing-integration question, incorporating VCAA’s fetish. Interestingly, solving the proper equation in (b) is, for a change, straight-forward (although presumably the VCAA will still permit students to cheat, and solve instead). As discussed in the comments, the algebra in part (c) is a little heavier than usual, and perhaps unexpected, although hardly ridiculous. The requirement to express the final answer in the form , however, is utterly ridiculous.

(07/09/21) Part (a) requires finding the inverse of the function f(x) = √x/√2, obviously expecting the usual x-y flipping. On this, the examination report notes

Many students left their answer as y = 2x^{2}, thus assuming f(x) was the same as f^{ -1}(x), in contradiction to their prior working.

Bullshit. The word the examiners are looking for is “implying”, not “assuming”, and leaving the answer in y-x form implies nothing of the sort. Just more obsessive nastiness.
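For anyone checking the algebra, the inverse is easily confirmed by composing the two functions (on x ≥ 0):

```python
from math import sqrt, isclose

def f(x):
    return sqrt(x) / sqrt(2)  # the exam's function, x >= 0

def f_inv(x):
    return 2 * x ** 2         # its inverse, x >= 0

# Both compositions should return the input unchanged.
ok = all(isclose(f_inv(f(x)), x) and isclose(f(f_inv(x)), x)
         for x in (0.5, 1.0, 3.7))
```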

The average on Part (c) was 1.6/4. Presumably the majority of this was getting lost in (by button-happy Methods standards) the heavy algebra, but the examination report also makes clear VCAA’s culpability:

Errors in … answering in the format specified by the question were common.

Q7. This strikes us as a pretty simple tangents-slopes question, although maybe the style of the question will throw students off. Part (c) is in effect asking, in a convoluted manner, the closest point from the x-axis to a no-intercepts parabola. Framed this way, the question is easy. The convolution, however, combined with the no-intercepts property having only appeared implicitly in a pretty crappy diagram, will probably screw up plenty of students.

(07/09/21) Unsurprisingly, Part (c) was a write-off, with an average score of 0.2/2. It is unclear how much of this was due to the crappy diagram.

Q8. A second integration question featuring VCAA’s fetish. Did we really need two? The implicit hint in part (d)(i) and the diagram are probably enough to excuse the Magritteness of part (d), but it’s a close call. Much less excusable is part (c):

“Find the area of the region that is bounded by f, the line x = a and the horizontal axis for x in [a,b], where b is the x-intercept of f.”

Forget Dr. Seuss. Someone get them some Ladybird books.

(07/09/21) The average score on Part (c), an easy follow-on from (b), was 0.5/2, the examination report noting

Many students were unsure of which terminals to use for the definite integral …

The examiners failed to mention that the students were probably “unsure” of the terminals because they couldn’t decipher the ridiculous, stream of consciousness question.

Part (d) may as well not have existed, with an average score of 0.2/3. Students may have run out of time, or have given up.

Yeah, it’s the same joke, but it’s not our fault: if people keep screwing up trials, we’ll keep making “trial” jokes. In this case the trial is MAV’s Trial Exam 1 for Mathematical Methods. The exam is, indeed, a trial.

Regular readers of this blog will be aware that we’re not exactly a fan of the MAV (and vice versa). The Association has, on occasion, been arrogant, inept, censorious, and demeaningly subservient to the VCAA. The MAV is also regularly extended red carpet invitations to VCAA committees and reviews, and they have somehow weaseled their way into being a member of AMSI. Acting thusly, and treated thusly, the MAV is a legitimate and important target. Nonetheless, we generally prefer to leave the MAV to their silly games and to focus upon the official screwer upperers. But, on occasion, someone throws some of MAV’s nonsense our way, and it is pretty much impossible to ignore; that is the situation here.

As we detail below, MAV’s Methods Trial Exam 1 is shoddy. Most of the questions are unimaginative, unmotivated and poorly written. The overwhelming emphasis is not on testing insight but, rather, on tedious computation towards a who-cares goal, with droning solutions to match. Still, we wouldn’t bother critiquing the exam, except for one question. This question simply must be slammed for the anti-mathematical crap that it is.

The final question, Question 10, of the trial exam concerns the function

on the domain . Part (a) asks students to find f^{-1} and its domain, and part (b) then asks,

Find the coordinates of the point(s) of intersection of the graphs of f and f^{-1}.

Regular readers will know exactly the Hellhole to which this is heading. The solutions begin,

Solve f(x) = x for x,

which is suggested without a single accompanying comment, nor even a Magrittesque diagram. It is nonsense.

It was nonsense in 2010 when it appeared on the Methods exam and report, and it was nonsense again in 2011. It was nonsense in 2012 when we slammed it, and it was nonsense again when it reappeared in 2017 and we slammed it again. It is still nonsense, it will always be nonsense and, at this stage, the appearance of the nonsense is jaw-dropping and inexcusable.

It is simply not legitimate to swap the equation f(x) = f^{-1}(x) for f(x) = x, unless a specific argument is provided for the specific function. When valid, that can usually be done. Easily. We laid it all out, and if anybody in power gave a damn then this type of problem could be taught properly and tested properly. But, no.
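To make the point concrete, here is our stock counterexample (nothing to do with the trial exam’s particular function): for f(x) = -x^{3}, the graphs of f and f^{-1} intersect at (1, -1) and (-1, 1), but solving f(x) = x finds only x = 0.

```python
def f(x):
    return -x ** 3

def f_inv(x):
    # inverse of f: solve y = -x**3 for x, i.e. x = -sign(y) * |y|**(1/3)
    return -(abs(x) ** (1 / 3)) * (1 if x >= 0 else -1)

# (1, -1) lies on the graph of f and on the graph of f_inv ...
on_both = f(1) == -1 and f_inv(1) == -1
# ... but it does not satisfy f(x) = x, so "solve f(x) = x" misses it
missed_by_shortcut = f(1) != 1
```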

What were the exam writers thinking? We can only see three possibilities:

a) The writers are too dumb or too ignorant to recognise the problem;

b) The writers recognise the problem but don’t give a damn;

c) The writers recognise the problem and give a damn, but presume that VCAA don’t give a damn.

We have no idea which it is, but we can see no fourth option. Whatever the reason, there is no longer any excuse for this crap. Even if one presumes or knows that VCAA will continue with the moronic, ritualistic testing of this type of problem, there is absolutely no excuse for not also including a clear and proper justification for the solution. None.

What of the rest of the MAV, what of the vetters and the reviewers? Did no one who checked the trial exam flag this nonsense? Or, were they simply overruled by others who were worse-informed but better-connected? What about the MAV Board? Is there anyone at all at the MAV who gives a damn?

*********************

Postscript: For the record, here, briefly, are other irritants from the exam:

Q2. There are infinitely many choices of integers and with equal to the indicated answer of .

Q3. This is not, or at least should not be, a Methods question. Integrals of the form with non-linear are not, or at least are not supposed to be, examinable.

Q4. The writers do not appear to know what “hence” means. There are, once again, infinitely many choices of and .

Q5. “Appropriate mathematical reasoning” is a pretty fancy title for the trivial application of a (stupid) definition. The choice of the subscripted is needlessly ugly and confusing. Part (c) is fundamentally independent of the boring nitpicking of parts (a) and (b). The writers still don’t appear to know what “hence” means.

Q6. An ugly question, guided by a poorly drawn graph. It is ridiculous to ask for “a rule” in part (a), since one can more directly ask for the coefficients , and .

Q7. A tedious question, which tests very little other than arithmetic. There are, once again, infinitely many forms of the answer.

Q8. The endpoints of the domain for are needlessly and confusingly excluded. The sole purpose of the question is to provide a painful, Magrittesque method of solving , which can be solved simply and directly.

Q9. A tedious question with little purpose. The factorisation of the cubic can easily be done without resorting to fractions.

Q10. Above. The waste of a precious opportunity to present and to teach mathematical thought.

UPDATE (28/09/20)

John (no) Friend has located an excellent paper by two Singaporean maths ed guys, Ng Wee Leng and Ho Foo Him. Their paper investigates (and justifies) various aspects of solving f(x) = f^{-1}(x).