A few days ago, the Sydney Morning Herald had an article on “the hardest question” from this year’s NSW Extension 2 Exam. The question is worth 5 marks, which equates to 9 minutes of a 3-hour exam (accessible here). The question, and another question (6-ish minutes), which apparently came along for the ride, are posted below. Readers’ homework exercise is to Compare and Contrast. Continue reading “How the Other Maths Lives”
WitCH 74: Aargh!
WitCH 73: Independent Thinking
Here is an old Methods one for a change, from the 2011 Exam 2. The examination report gives the answer as E and indicates that 15% of students selected this answer.
WitCH 72: High Significance
Here is one more from the 2019 Specialist Exam 2, once again courtesy of student PURJ (who is vetting the exams much better than the exam vetters). It consists of the last four parts of Q6, the final question on the exam, concerning a machine that packages noodles. The answers from the examination report appear below the questions.
WitCH 71: Plane Wrong
Here is another from the 2019 Specialist Exam 2, also courtesy of student PURJ. Again, we’ll semi-update from the examination report once people have had a chance to ponder and to comment.
WitCH 70: Troubled Relations
This is not an old one. It is from the 2019 Specialist Exam 2 and comes courtesy of student PURJ, who previously contributed to the discussion here. PURJ noted one, glaring, issue with the question below and its grading, but we think there are other issues as well. As with our previous WitCH, we’ll semi-update with excerpts from the examination report once people have had a chance to ponder and to comment.
Discussion of the 2021 NHT Methods Exams
This is our post for discussion of the 2021 NHT Mathematical Methods exams (here and here). See also the accompanying 2021 NHT Specialist exams post, here. See also the relevant WitCHes: here, here and here.
UPDATE (08/10/21) The exam reports have reappeared, as PDFs, here and here. The exam 2 report has important updates, and important non-updates; see the comments in blue, below.
UPDATE (22/09/21) The exam reports are now out, here and here (VCAA-Word-Stupid). We’ve added a couple of comments below, in green.
MitPY 8: Verification Code
It’s a long time since we’ve had a MitPY. But, the plague goes on (including the plague of right-wing Creightons).
This one comes from frequent commenter Red Five, and we apologise for the huge delay in posting. It is targeted at those familiar with and, more likely, struggling with Victoria’s VCE rituals:
WitCH 43: Period Piece
This one comes courtesy of a smart VCE student, the issue having been flagged to them by a fellow student. It is a multiple choice question from the 2009 Mathematical Methods, Exam 2; the Examination Report indicates, without comment, that the correct answer is D.

UPDATE (08/12/23)
The exam report was amended on 18/09/2020, after this post appeared. The report now includes a note, with no indication that the note was added a decade later:
B was also accepted as it leads to an equivalent expression.
If true, then the original exam report was consciously deceptive.
The Troubling Cosiness of the VCAA and the MAV
It seems that what amounts to VCE exam marking schemes may be available for purchase through the Mathematical Association of Victoria. This seems very strange, and we’re not really sure what is going on, but we shall give our current sense of it. (It should be noted at the outset that we are no fan of the MAV in its current form, nor of the VCAA in any form: though we are trying hard here to be straightly factual, our distaste for these organisations should be kept in mind.)
Each year, the MAV sells VCE exam solutions for the previous year’s exams. It is our understanding that it is now the MAV’s strong preference that these solutions be written by VCAA assessors. Further, the MAV is now advertising that these solutions are “including marking allocations”. We assume that the writers are paid by the MAV for this work, and we assume that the MAV is profiting from the sale of the product, which is not cheap. Moreover, the MAV also hosts Meet the Assessors events which, again, are not cheap and are less cheap for non-members of the MAV. Again, it is reasonable to assume that the assessors and/or the MAV profit from these events.
We do not understand any of this. One would think that simple equity requires that any official information regarding VCE exams and solutions should be freely available. What we understand to be so available are very brief solutions as part of VCAA’s examiners’ reports, and that’s it. In particular, it is our understanding that VCAA marking schemes have been closely guarded secrets. If the VCAA is loosening up on that, then that’s great. If, however, VCAA assessors and/or the MAV are profiting from such otherwise unavailable information, we do not understand why anyone should regard that as acceptable. If, on the other hand, the MAV and/or the assessors are not so profiting, we do not understand the product and the access that the MAV is offering for sale.
We have written previously of the worrying relationship between the VCAA and the MAV, and there is plenty more to write. On more than one occasion the MAV has censored valid criticism of the VCAA, conduct which makes it difficult to view the MAV as a strong or objective or independent voice for Victorian maths teachers. The current, seemingly very cosy relationship over exam solutions would only appear to make matters worse. When the VCAA stuffs up an exam question, as it does on a depressingly regular basis, why should anyone trust the MAV solutions to provide an honest summary or evaluation of that stuff-up?
Again, we are not sure what is happening here. We shall do our best to find out, and commenters, who may have a better sense of MAV and VCAA workings, may comment (carefully) below.
UPDATE (13/02/20)
As John Friend has indicated in his comment, the “marking allocations” appears to be nothing but the trivial annotation of solutions with the allotted marks, not a break-down of what is required to achieve those marks. So, simply a matter of the MAV over-puffing their product. As for the appropriateness of the MAV being able to charge to “meet” VCAA assessors, and for solutions produced by assessors, those issues remain open.
We’ve also had a chance to look at the MAV 2019 Specialist solutions (not courtesy of JF, for those who like to guess such things). More pertinent would be the Methods solutions (because of this, this, this and, especially, this). Still, the Specialist solutions were interesting to read (quickly), and some comments are in order. In general, we thought the solutions were pretty good: well laid out, with usually, though not always, the seemingly best approach indicated. There were a few important theoretical errors (see below), although not errors that affected the specific solutions. The main general and practical shortcoming is the lack of diagrams for certain questions; diagrams would have made those solutions significantly clearer and, for the same reason, should be encouraged as standard practice.
For the benefit of those with access to the Specialist solutions (and possibly minor benefit to others), the following are brief comments on the solutions to particular questions (with section B of Exam 2 still to come); feel free to ask for elaboration in the comments. The exams are here and here.
Exam 1
Q5. There is a Magritte element to the solution and, presumably, the question.
Q6. The stated definition of linear dependence is simply wrong. The problem is much more easily done using a 3 × 3 determinant (see the note below these Exam 1 comments).
Q7. Part (a) is poorly set out and employs a generally invalid relationship between Arg and arctan (again, see the note below). Parts (c) and (d) are very poorly set out, not relying upon the much clearer geometry.
Q8. A diagram, even if generic, is always helpful for volumes of revolution.
Q9. The solution to part (b) is correct, but there is an incorrect reference to the forces on the mass, rather than the ring. The expression “… the tension T is the same on both sides …” is hopelessly confused.
Q10. The question is stupid, but the solutions are probably as good as one can do.
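To spell out the Q6 and Q7 points in generic terms (the vectors and the complex number below are placeholders, not the exam’s): for Q6, the standard criterion is that three vectors a, b and c in three dimensions are linearly dependent if and only if

\[\det\begin{pmatrix} a_1 & b_1 & c_1 \\ a_2 & b_2 & c_2 \\ a_3 & b_3 & c_3 \end{pmatrix} = 0\,.\]

For Q7, writing z = x + iy, the relationship

\[\operatorname{Arg}(z) = \arctan\left(\frac{y}{x}\right)\]

is valid only when x > 0; for x < 0 one must add or subtract π, and x = 0 must be handled separately.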
Exam 2 (Section A)
MCQ5. The answer is clear, and much more easily obtained, from a rough diagram.
MCQ6. The formula Arg(a/b) = Arg(a) – Arg(b) is used, which is not in general true (see the note below these Section A comments).
MCQ11. A very easy question for which two very long and poorly expressed solutions are given.
MCQ12. An (always) poor choice of formula for the vector resolute leads to a solution that is longer and significantly more prone to error (again, see the note below). (UPDATE 14/2: For more on this question, go here.)
MCQ13. A diagram is mandatory, and the cosine rule alternative should be mentioned.
MCQ14. It is easier to first solve for the acceleration, by treating the system as a whole (again, see the note below).
MCQ19. A slow, pointless use of CAS to check (not solve) the solution of simultaneous equations.
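Again, to spell out a few of the points above in generic terms (the numbers, vectors and masses below are illustrative only, not the exam’s): for MCQ6, the identity Arg(a/b) = Arg(a) – Arg(b) holds only modulo 2π. Taking a = –1 and b = –i, for example,

\[\operatorname{Arg}\!\left(\frac{a}{b}\right) = \operatorname{Arg}(-i) = -\frac{\pi}{2}\,, \qquad \operatorname{Arg}(a) - \operatorname{Arg}(b) = \pi - \left(-\frac{\pi}{2}\right) = \frac{3\pi}{2}\,.\]

For MCQ12, and just for reference, the (equivalent) standard forms of the vector resolute of a in the direction of b are

\[(\mathbf{a}\cdot\hat{\mathbf{b}})\,\hat{\mathbf{b}} = \frac{\mathbf{a}\cdot\mathbf{b}}{\mathbf{b}\cdot\mathbf{b}}\,\mathbf{b}\,.\]

For MCQ14, consider a generic connected system: masses m1 and m2 joined by a taut string, with a driving force F applied to m1 and with resistances ignored. Treating the system as a whole gives

\[F = (m_1 + m_2)\,a\,,\]

and any required tension then follows from Newton’s second law applied to a single mass: T = m2a.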
UPDATE (14/02/20)
For more on MCQ12, go here.
Exam 2 (Section B)
Q1. In Part (a), the graphs are pointless, or at least a distant second choice; the choice of root is trivial, since y = tan(t) > 0. For Part (b), the factorisation should be noted. In Part (c), it is preferable to begin with the chain rule in a form that requires no inverses. Part (d) is one of those annoyingly vague VCE questions, where it is impossible to know how much computation is required for full marks; the solutions include a couple of simplifications after the definite integral is established, but God knows whether these extra steps are required.
Q2. The solution to Part (c) is very poorly written. The question is (pointlessly) difficult, which means clear signposts are required in the solution; the key point is that the zeroes of the polynomial will be symmetric around (-1,0), the centre of the circle from part (b). The output of the quadratic formula is necessarily a mess, and may be real or imaginary, but is manipulated in a clumsy manner. In particular, a factor of -1 is needlessly taken out of the root, and the expression “we expect” is used in a manner that makes no sense. The solution to the (appallingly written) Part (d) is ok, though the centre of the circle is clear just from symmetry, and we have no idea what “ve(z)” means.
Q3. There is an aspect to the solution of this question that is so bad, we’ll make it a separate post. (So, hold your fire.)
Q4. Part (a) is much easier than the notation-filled solution makes it appear.
Q5. Part (c)(i) is weird. It is a 1-point question, and so presumably just writing down the intuitive answer, as is done in the solutions, is what was expected and is perhaps reasonable. But the intuitive answer is not that intuitive, and an easy argument from considering the system as a whole (see MCQ14) seems (mathematically) preferable. For Part (c)(ii), it is more straightforward to consider the system as a whole, making the tension redundant (see MCQ14). The first (and less preferable) solution to Part (d) is very confusing, because the two stages of computation required are not clearly separated.
Q6. It’s statistical inference: we just can’t get ourselves to care.
UPDATE (26/06/20)
The Specialist Maths examination reports are finally, finally out (here and here), so it seems worth revisiting the MAV “Assessor” solutions. In summary, the clumsiness of and errors in the MAV solutions as indicated above (and see also here and here) do not appear in the reports; in the main this is because the reports are pretty much silent on any aspect involving some subtlety. Sigh.
Some specific comments:
EXAM 1
Q5 Yes, Magritte-ish. Justifying that the critical points are extrema was not expected, meaning conscientious students wasted their time.
Q6 The error in the MAV solutions is ducked in the report.
Q7 The error in the MAV solutions is ducked in the report.
EXAM 2 (Section A)
MCQ6 The error in the MAV solutions is ducked in the report.
MCQ11 The report is silent.
MCQ12 A huge screw-up of a question, to which the report hemidemisemi confesses: see here.
MCQ14 The report suggests the better method for solving this problem.
EXAM 2 (Section B)
Q2 Jesus. This question was intrinsically confusing and very badly worded, with the students inevitably doing poorly. So, why the hell is the examination report almost completely silent? The MAV solutions were a mess, but the absence of comment in the report is disgraceful.
Q3 The solution in the report is ok, although more could have been written. But, it’s not the garbled nonsense of the MAV solution, as detailed here.