It seems that what amounts to VCE exam marking schemes may be available for purchase through the Mathematical Association of Victoria. This seems very strange, and we’re not really sure what is going on, but we shall give our current sense of it. (It should be noted at the outset that we are no fans of the MAV in its current form, nor of the VCAA in any form: though we are trying hard here to be strictly factual, our distaste for these organisations should be kept in mind.)
Each year, the MAV sells VCE exam solutions for the previous year’s exams. It is our understanding that it is now the MAV’s strong preference that these solutions be written by VCAA assessors. Further, the MAV is now advertising that these solutions are “including marking allocations”. We assume that the writers are paid by the MAV for this work, and we assume that the MAV profits from the sale of the product, which is not cheap. Moreover, the MAV also hosts Meet the Assessors events which, again, are not cheap, and are less cheap still for non-members of the MAV. Again, it is reasonable to assume that the assessors and/or the MAV profit from these events.
We do not understand any of this. One would think that simple equity requires that any official information regarding VCE exams and solutions should be freely available. What we understand to be so available are very brief solutions as part of VCAA’s examiners’ reports, and that’s it. In particular, it is our understanding that VCAA marking schemes have been closely guarded secrets. If the VCAA is loosening up on that, then that’s great. If, however, VCAA assessors and/or the MAV are profiting from such otherwise unavailable information, we do not understand why anyone should regard that as acceptable. If, on the other hand, the MAV and/or the assessors are not so profiting, we do not understand the product and the access that the MAV is offering for sale.
We have written previously of the worrying relationship between the VCAA and the MAV, and there is plenty more to write. On more than one occasion the MAV has censored valid criticism of the VCAA, conduct which makes it difficult to view the MAV as a strong or objective or independent voice for Victorian maths teachers. The current, seemingly very cosy, relationship over exam solutions would only appear to make matters worse. When the VCAA stuffs up an exam question, as they do on a depressingly regular basis, why should anyone trust the MAV solutions to provide an honest summary or evaluation of that stuff-up?
Again, we are not sure what is happening here. We shall do our best to find out, and commenters, who may have a better sense of MAV and VCAA workings, may comment (carefully) below.
As John Friend has indicated in his comment, the “marking allocations” appear to be nothing but the trivial annotation of solutions with the allotted marks, not a breakdown of what is required to achieve those marks. So, simply a matter of the MAV over-puffing their product. As for the appropriateness of the MAV being able to charge to “meet” VCAA assessors, and for solutions produced by assessors, those issues remain open.
We’ve also had a chance to look at the MAV 2019 Specialist solutions (not courtesy of JF, for those who like to guess such things). More pertinent would be the Methods solutions (because of this, this, this and, especially, this). Still, the Specialist solutions were interesting to read (quickly), and some comments are in order. In general, we thought the solutions were pretty good: well laid out, with the seemingly best approach usually, though not always, indicated. There were a few important theoretical errors (see below), although not errors that affected the specific solutions. The main general and practical shortcoming is the lack of diagrams for certain questions; diagrams would have made those solutions significantly clearer and, for the same reason, should be encouraged as standard practice.
For the benefit of those with access to the Specialist solutions (and possibly minor benefit to others), the following are brief comments on the solutions to particular questions (with section B of Exam 2 still to come); feel free to ask for elaboration in the comments. The exams are here and here.
Q5. There is a Magritte element to the solution and, presumably, the question.
Q6. The stated definition of linear dependence is simply wrong. The problem is much more easily done using a 3 × 3 determinant.
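For three vectors in R³, linear dependence is equivalent to the 3 × 3 determinant of the vectors being zero. A minimal sketch of the determinant approach, with made-up vectors (the exam’s actual vectors are not reproduced here):

```python
# Dependence test for three vectors in R^3: the vectors are
# dependent exactly when det[u v w] = 0.
# The vectors below are hypothetical, chosen only for illustration.

def det3(u, v, w):
    # Cofactor expansion along the first row.
    return (u[0] * (v[1] * w[2] - v[2] * w[1])
            - u[1] * (v[0] * w[2] - v[2] * w[0])
            + u[2] * (v[0] * w[1] - v[1] * w[0]))

u, v, w = (1, 2, 3), (2, 4, 6), (0, 1, 1)   # v = 2u, so the set is dependent
print(det3(u, v, w))  # 0
```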
Q7. Part (a) is poorly set out and employs a generally invalid relationship between Arg and arctan. Parts (c) and (d) are very poorly set out, not relying upon the much clearer geometry.
Q8. A diagram, even if generic, is always helpful for volumes of revolution.
Q9. The solution to part (b) is correct, but there is an incorrect reference to the forces on the mass, rather than the ring. The expression “… the tension T is the same on both sides …” is hopelessly confused.
Q10. The question is stupid, but the solutions are probably as good as one can do.
Exam 2 (Section A)
MCQ5. The answer is clear, and much more easily obtained, from a rough diagram.
MCQ6. The formula Arg(a/b) = Arg(a) – Arg(b) is used, which is not in general true.
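The formula fails whenever Arg(a) – Arg(b) falls outside (–π, π]; a quick check with hypothetical values (not the exam’s):

```python
import cmath
import math

# Arg(a/b) = Arg(a) - Arg(b) fails when Arg(a) - Arg(b) leaves (-pi, pi].
a = -1 + 1j   # Arg(a) = 3*pi/4
b = -1 - 1j   # Arg(b) = -3*pi/4

lhs = cmath.phase(a / b)               # Arg(a/b) = -pi/2
rhs = cmath.phase(a) - cmath.phase(b)  # 3*pi/2, outside (-pi, pi]
print(lhs, rhs)                        # equal only modulo 2*pi
```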
MCQ11. A very easy question for which two very long and poorly expressed solutions are given.
MCQ12. An (always) poor choice of formula for the vector resolute leads to a solution that is longer and significantly more prone to error. (UPDATE 14/2: For more on this question, go here.)
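The solutions’ formula isn’t quoted here, but for what it’s worth the resolute of a on b can be computed entirely with dot products, as ((a·b)/(b·b))b, avoiding magnitudes and hence square roots. A small sketch with made-up vectors:

```python
# Resolute (projection) of a on b via ((a.b)/(b.b)) b, which
# needs no magnitudes and hence no square roots.
# The vectors are hypothetical, for illustration only.

def dot(u, v):
    return sum(x * y for x, y in zip(u, v))

def resolute(a, b):
    k = dot(a, b) / dot(b, b)
    return [k * x for x in b]

print(resolute([3, 4], [1, 2]))  # [2.2, 4.4]
```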
MCQ13. A diagram is mandatory, and the cosine rule alternative should be mentioned.
MCQ14. It is easier to first solve for the acceleration, by treating the system as a whole.
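The system-as-a-whole idea, sketched with hypothetical figures (a connected-particle setup of our own invention, not the exam’s):

```python
# Hypothetical setup: a mass m2 on a smooth table is joined by a
# light string, over a smooth pulley, to a hanging mass m1.
g = 9.8
m1, m2 = 2.0, 3.0

# Treating the system as a whole, the only driving force is m1*g,
# and the total mass is m1 + m2:
a = m1 * g / (m1 + m2)

# The tension, if wanted, then comes from either mass alone:
T = m2 * a
print(a, T)
```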
MCQ19. A slow, pointless use of CAS to check (not solve) the solution of simultaneous equations.
Exam 2 (Section B)
Q1. In Part (a), the graphs are pointless, or at least a distant second choice; the choice of root is trivial, since y = tan(t) > 0. For part (b), the factorisation should be noted. In part (c), it is preferable to begin with the chain rule written so that no inverses are required. Part (d) is one of those annoyingly vague VCE questions, where it is impossible to know how much computation is required for full marks; the solutions include a couple of simplifications after the definite integral is established, but God knows whether these extra steps are required.
Q2. The solution to Part (c) is very poorly written. The question is (pointlessly) difficult, which means clear signposts are required in the solution; the key point is that the zeroes of the polynomial will be symmetric around (-1,0), the centre of the circle from part (b). The output of the quadratic formula is necessarily a mess, and may be real or imaginary, but is manipulated in a clumsy manner. In particular, a factor of -1 is needlessly taken out of the root, and the expression “we expect” is used in a manner that makes no sense. The solution to the (appallingly written) Part (d) is ok, though the centre of the circle is clear just from symmetry, and we have no idea what “ve(z)” means.
Q3. There is an aspect to the solution of this question that is so bad, we’ll make it a separate post. (So, hold your fire.)
Q4. Part (a) is much easier than the notation-filled solution makes it appear.
Q5. Part (c)(i) is weird. It is a 1-point question, and so presumably just writing down the intuitive answer, as is done in the solutions, is what was expected and is perhaps reasonable. But the intuitive answer is not that intuitive, and an easy argument from considering the system as a whole (see MCQ14) seems (mathematically) preferable. For Part (c)(ii), it is more straightforward to consider the system as a whole, making the tension redundant (see MCQ14). The first (and less preferable) solution to Part (d) is very confusing, because the two stages of computation required are not clearly separated.
Q6. It’s statistical inference: we just can’t get ourselves to care.
The Specialist Maths examination reports are finally, finally out (here and here), so it seems worth revisiting the MAV “Assessor” solutions. In summary, the clumsiness of and errors in the MAV solutions as indicated above (and see also here and here) do not appear in the reports; in the main this is because the reports are pretty much silent on any aspect involving some subtlety. Sigh.
Some specific comments:
Q5 Yes, Magritte-ish. Justifying that the critical points are extrema was not expected, meaning conscientious students wasted their time.
Q6 The error in the MAV solutions is ducked in the report.
Q7 The error in the MAV solutions is ducked in the report.
EXAM 2 (Section A)
MCQ6 The error in the MAV solutions is ducked in the report.
MCQ11 The report is silent.
MCQ12 A huge screw-up of a question, to which the report hemidemisemi confesses: see here.
MCQ14 The report suggests the better method for solving this problem.
EXAM 2 (Section B)
Q2 Jesus. This question was intrinsically confusing and very badly worded, with the students inevitably doing poorly. So, why the hell is the examination report almost completely silent? The MAV solutions were a mess, but the absence of comment in the report is disgraceful.
Q3 The solution in the report is ok, although more could have been written. But, it’s not the garbled nonsense of the MAV solution, as detailed here.