## PoSWW 30: This Does Not Compute

The above was described by a VCAA apparatchik as a “beautiful poster”. Proving, it seems, that beauty is in the eye of the beholden.

## The State of Such Things

A few days ago, the QEDcat received an email from a Victorian teacher, someone we don’t know and who, it seems, doesn’t follow this blog. The teacher and his colleagues had been discussing a rather weird exercise, and he wanted our opinion on it. We were happy to oblige, of course, and the issue turned out to be related to this screw-up. We also took the opportunity to suggest that the teacher’s choice was either to ask us or to ask the VCAA. The teacher replied,

Yep. One of the other teachers said we should check what vcaa say. But I actually want to know the correct answer.

Which is where we are. A ratbag blogger is considered, correctly, to be a more reliable authority than the actual Authority.

## Discussion: VCAA’s Blunt Implement

This is not one we’ve had time to look at but it seems important. We don’t intend to comment but we’re providing this post as a forum for discussion.

In November last year, VCAA released its draft of the new mathematics study design, to begin in 2023. The draft is no longer linked on VCAA’s website, but we wrote about the draft here, here and here. The current study design, ostensibly in operation until the end of 2022, is here.

## VCAA’s MayDay

VCAA’s corona-compliant study design for 2020 has just appeared, here. We’ve had no chance to look at it yet. We’ll be curious to see the response.

## The Troubling Cosiness of the VCAA and the MAV

It seems that what amounts to VCE exam marking schemes may be available for purchase through the Mathematical Association of Victoria. This seems very strange, and we’re not really sure what is going on, but we shall give our current sense of it. (It should be noted at the outset that we are no fan of the MAV in its current form, nor of the VCAA in any form: though we are trying hard here to be straightly factual, our distaste for these organisations should be kept in mind.)

Each year, the MAV sells VCE exam solutions for the previous year’s exams. It is our understanding that it is now the MAV’s strong preference that these solutions be written by VCAA assessors. Further, the MAV is now advertising that these solutions come “including marking allocations”. We assume that the writers are paid by the MAV for this work, and we assume that the MAV profits from the selling of the product, which is not cheap. Moreover, the MAV also hosts Meet the Assessors events which, again, are not cheap and are less cheap for non-members of the MAV. Again, it is reasonable to assume that the assessors and/or the MAV profit from these events.

We do not understand any of this. One would think that simple equity requires that any official information regarding VCE exams and solutions should be freely available. What we understand to be so available are very brief solutions as part of VCAA’s examiners’ reports, and that’s it. In particular, it is our understanding that VCAA marking schemes have been closely guarded secrets. If the VCAA is loosening up on that, then that’s great. If, however, VCAA assessors and/or the MAV are profiting from such otherwise unavailable information, we do not understand why anyone should regard that as acceptable. If, on the other hand, the MAV and/or the assessors are not so profiting, we do not understand the product and the access that the MAV is offering for sale.

We have written previously of the worrying relationship between the VCAA and the MAV, and there is plenty more to write. On more than one occasion the MAV has censored valid criticism of the VCAA, conduct which makes it difficult to view the MAV as a strong or objective or independent voice for Victorian maths teachers. The current, seemingly very cosy, relationship over exam solutions would only appear to make matters worse. When the VCAA stuffs up an exam question, as they do on a depressingly regular basis, why should anyone trust the MAV solutions to provide an honest summary or evaluation of that stuff up?

Again, we are not sure what is happening here. We shall do our best to find out, and commenters, who may have a better sense of MAV and VCAA workings, may comment (carefully) below.

UPDATE (13/02/20)

As John Friend has indicated in his comment, the “marking allocations” appear to be nothing but the trivial annotation of solutions with the allotted marks, not a break-down of what is required to achieve those marks. So, it is simply a matter of the MAV over-puffing its product. As for the appropriateness of the MAV being able to charge to “meet” VCAA assessors, and for solutions produced by assessors, those issues remain open.

We’ve also had a chance to look at the MAV 2019 Specialist solutions (not courtesy of JF, for those who like to guess such things.) More pertinent would be the Methods solutions (because of this, this, this and, especially, this.) Still, the Specialist solutions were interesting to read (quickly), and some comments are in order. In general, we thought the solutions were pretty good: well laid out with usually, though not always, the seemingly best approach indicated. There were a few important theoretical errors (see below), although not errors that affected the specific solutions. The main general and practical shortcoming is the lack of diagrams for certain questions, which would have made those solutions significantly clearer and, for the same reason, should be encouraged as standard practice.

For the benefit of those with access to the Specialist solutions (and possibly minor benefit to others), the following are brief comments on the solutions to particular questions (with section B of Exam 2 still to come); feel free to ask for elaboration in the comments. The exams are here and here.

Exam 1

Q5. There is a Magritte element to the solution and, presumably, the question.

Q6. The stated definition of linear dependence is simply wrong. The problem is much more easily done using a 3 × 3 determinant.
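To illustrate the determinant test (with hypothetical vectors, since we're not reproducing the exam's): three vectors in R³ are linearly dependent exactly when the 3 × 3 determinant of the matrix they form vanishes. A minimal sketch:

```python
# Hypothetical example (not the exam's actual vectors): three vectors in
# R^3 are linearly dependent exactly when the determinant of the 3 x 3
# matrix formed from them is zero.

def det3(m):
    """Determinant of a 3 x 3 matrix, by cofactor expansion along row 0."""
    (a, b, c), (d, e, f), (g, h, i) = m
    return a * (e * i - f * h) - b * (d * i - f * g) + c * (d * h - e * g)

# Here w = u + 2v, so {u, v, w} is dependent and the determinant vanishes.
u = (1, 2, 3)
v = (0, 1, 4)
w = (1, 4, 11)
print(det3([u, v, w]))  # 0  -> linearly dependent
```

No row reduction, no solving for scalars: one determinant settles it.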

Q7. Part (a) is poorly set out and employs a generally invalid relationship between Arg and arctan. Parts (c) and (d) are very poorly set out, failing to use the much clearer geometry.

Q8. A diagram, even if generic, is always helpful for volumes of revolution.

Q9. The solution to part (b) is correct, but there is an incorrect reference to the forces on the mass, rather than the ring. The expression “… the tension T is the same on both sides …” is hopelessly confused.

Q10. The question is stupid, but the solutions are probably as good as one can do.

Exam 2 (Section A)

MCQ5. The answer is clear, and much more easily obtained, from a rough diagram.

MCQ6. The formula Arg(a/b) = Arg(a) – Arg(b) is used, which is not in general true.
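For readers wanting a concrete counterexample (our numbers, not the exam's): the right-hand side of the "identity" can land outside the principal range (−π, π], in which case it differs from Arg(a/b) by 2π. A quick check with Python's cmath, whose phase function returns the principal argument:

```python
# The identity Arg(a/b) = Arg(a) - Arg(b) can fail by a multiple of 2*pi,
# because the right-hand side need not lie in the principal range (-pi, pi].
# Illustrative counterexample: a = -1, b = -i.
import cmath

a, b = -1 + 0j, -1j
lhs = cmath.phase(a / b)               # Arg(a/b) = Arg(-i) = -pi/2
rhs = cmath.phase(a) - cmath.phase(b)  # pi - (-pi/2) = 3*pi/2
print(lhs, rhs)  # the two values differ by 2*pi
```

The correct statement is Arg(a/b) = Arg(a) − Arg(b) + 2kπ, with k chosen to bring the result back into (−π, π].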

MCQ11. A very easy question for which two very long and poorly expressed solutions are given.

MCQ12. An (always) poor choice of formula for the vector resolute leads to a solution that is longer and significantly more prone to error. (UPDATE 14/2: For more on this question, go here.)

MCQ13. A diagram is mandatory, and the cosine rule alternative should be mentioned.

MCQ14. It is easier to first solve for the acceleration, by treating the system as a whole.
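To sketch the system-as-a-whole shortcut with made-up numbers (we are not reproducing the exam question): for connected particles, Newton's second law applied to the combined mass gives the acceleration immediately, and any internal tension then follows from one particle alone.

```python
# A generic connected-particles sketch (hypothetical values, not the
# exam's): two masses joined by a light inextensible string, pulled
# along a smooth surface by a force F.

m1, m2, F = 3.0, 2.0, 10.0  # kg, kg, N

a = F / (m1 + m2)  # whole system: F = (m1 + m2) * a, tension is internal
T = m2 * a         # then Newton's second law on m2 alone gives the tension
print(a, T)        # 2.0 4.0
```

Treating each particle separately instead means solving two simultaneous equations for a and T, with twice the chance of a sign error.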

MCQ19. A slow, pointless use of CAS to check (not solve) the solution of simultaneous equations.

UPDATE (14/02/20)

Exam 2 (Section B)

Q1. In Part (a), the graphs are pointless, or at least a distant second choice; the choice of root is trivial, since y = tan(t) > 0. For part (b), the factorisation should be noted. In part (c), it is preferable to begin with the appropriate form of the chain rule, since no inverses are then required. Part (d) is one of those annoyingly vague VCE questions, where it is impossible to know how much computation is required for full marks; the solutions include a couple of simplifications after the definite integral is established, but God knows whether these extra steps are required.

Q2. The solution to Part (c) is very poorly written. The question is (pointlessly) difficult, which means clear signposts are required in the solution; the key point is that the zeroes of the polynomial will be symmetric around (-1,0), the centre of the circle from part (b). The output of the quadratic formula is necessarily a mess, and may be real or imaginary, but is manipulated in a clumsy manner. In particular, a factor of -1 is needlessly taken out of the root, and the expression “we expect” is used in a manner that makes no sense. The solution to the (appallingly written) Part (d) is ok, though the centre of the circle is clear just from symmetry, and we have no idea what “ve(z)” means.

Q3. There is an aspect to the solution of this question that is so bad, we’ll make it a separate post. (So, hold your fire.)

Q4. Part (a) is much easier than the notation-filled solution makes it appear.

Q5. Part (c)(i) is weird. It is a 1-point question, and so presumably just writing down the intuitive answer, as is done in the solutions, is what was expected and is perhaps reasonable. But the intuitive answer is not that intuitive, and an easy argument from considering the system as a whole (see MCQ14) seems (mathematically) preferable. For Part (c)(ii), it is more straight-forward to consider the system as a whole, making the tension redundant (see MCQ14). The first (and less preferable) solution to Part (d) is very confusing, because the two stages of computation required are not clearly separated.

Q6. It’s statistical inference: we just can’t get ourselves to care.

UPDATE (26/06/20)

The Specialist Maths examination reports are finally, finally out (here and here), so it seems worth revisiting the MAV “Assessor” solutions. In summary, the clumsiness of and errors in the MAV solutions as indicated above (and see also here and here) do not appear in the reports; in the main this is because the reports are pretty much silent on any aspect involving some subtlety. Sigh.

EXAM 1

Q5 Yes, Magritte-ish. Justifying that the critical points are extrema was not expected, meaning conscientious students wasted their time.

Q6 The error in the MAV solutions is ducked in the report.

Q7 The error in the MAV solutions is ducked in the report.

EXAM 2 (Section A)

MCQ6   The error in the MAV solutions is ducked in the report.

MCQ11 The report is silent.

MCQ12 A huge screw-up of a question, to which the report hemidemisemi confesses: see here.

MCQ14 The report suggests the better method for solving this problem.

EXAM 2 (Section B)

Q2 Jesus. This question was intrinsically confusing and very badly worded, with the students inevitably doing poorly. So, why the hell is the examination report almost completely silent? The MAV solutions were a mess, but the absence of comment in the report is disgraceful.

Q3 The solution in the report is ok, although more could have been written. But, it’s not the garbled nonsense of the MAV solution, as detailed here.

## VCAA’s Mathematical Reasoning

OK, Dear Readers, turn off the footy and/or the cricket. You have work to do.

We have written before of VCAA‘s manipulative “review” of Victoria’s senior mathematics curriculum, complete with scale-thumbing, push-polling and hidden, hand-picked “experts”. Now, according to their latest Bulletin,

[t]he VCAA will undertake a second phase of Stage 1 consultation …

Good. With any luck, the VCAA will subsequently get stuck on the nth phase of Stage 1, and Victoria can be spared their Potemkin Mathematics for another decade or so.

Still, it is strange. The VCAA has indicated nothing of substance about the results of the first phase of consultation. Why not? And, what is the supposed purpose of this second phase? What is the true purpose? According to the VCAA, one of two reasons for Phase Two is

to further investigate [t]he role of aspects of mathematical reasoning and working mathematically in each of the types of mathematics studies.

(The second reason concerns “Foundation Mathematics”, in which, try as we might, we just cannot pretend any interest.)

As part of this new consultation, VCAA has posted a new paper, and set up a new questionnaire (and PDF here), until 16 September.

• Please fill in the questionnaire.
• Please (attempt to) read VCAA’s new paper and, if you can make any sense of it whatsoever, please comment to this effect below.

We suspect, however, that this is all a game, disguising the true purpose of Phase Two. It’d be easier to be sure if the VCAA had reported anything of substance about the results of Phase One, but we can still hazard a pretty good guess. As one of our colleagues conjectured,

“There was probably sufficient lack of support [in Phase One] for some radical departure from the norm, and so they will take longer to figure out how to make that happen.”

That is, although the VCAA’s nonsense received significant pushback, the VCAA haven’t remotely given up on it and are simply trying to wait out and wear down the opposition. And, since the VCAA controls the money and the process and the “experts” and the “key stakeholders” and the reporting and everything else except public sentiment, they will probably win.

But they should be made to earn it.

## VCAA Puts the “Con” into Consultation

As we have written, the Victorian Curriculum and Assessment Authority is “reviewing” Victoria’s senior secondary maths, which amounts to the VCAA attempting to ram through a vague and tendentious computer-based curriculum, presented with no evidence of its benefit apart from change for the sake of change. Readers can and should respond to the VCAA’s manipulative questionnaire before May 10. In this post we shall point out the farcical nature of VCAA’s “consultation”, as evidenced by VCAA’s overview and questionnaire.

The overview begins by framing VCAA’s review with the following question:

What could a senior secondary mathematics curriculum for a liberal democratic society in a developed country for 2020–2030 look like?

This is peculiar framing, since it is difficult to imagine how a society being “liberal” or “democratic” or otherwise has any bearing on the suitability of a mathematics curriculum. Why would a good curriculum for China not also be good for Victoria?

One could easily write off this framing as just jingoistic puffery; neither word reappears in VCAA’s overview. It is, however, more insidious than that. The framing is, except for the odd omission of the word “suitable”, identical to the title of the Wolfram-CBM paper promoting “computer-based mathematics” in general and Wolfram-CBM in particular. This paper is the heavy propaganda gun VCAA has procured in furtherance of its struggle to liberate us all from the horrors of mathematical calculation. Though the Wolfram-CBM paper never states it explicitly, this makes clear the purpose of the framing:

“[L]iberal” and “democratic” and “developed” amounts to “rich enough to assume, demand and forever more have us beholden to the omnipresence of computers”.

The VCAA overview continues by noting the VCAA’s previous review in 2013-2014 and then notes the preliminary work undertaken in 2018 as part of the current review:

… the VCAA convened an expert panel to make recommendations in preparation for broad consultation in 2019.

Really? On whose authority does this anonymous panel consist of experts? Expert in what? How was this “expert panel” chosen, and by whom? Were there any potential or actual conflicts of interest on the “expert panel” that were or should have been disclosed? How or how not was this “expert panel” directed to conduct its review? Were there any dissenters on this “expert panel”?

The only thing clear in all this is the opacity.

The overview provides no evidence that VCAA’s “expert panel” consists of appropriately qualified or sufficiently varied or sufficiently independent persons, nor that these persons were selected in an objective manner, nor that these persons were able to and encouraged to conduct the VCAA review in an objective manner.

Indeed, any claim to breadth, independence or expertise is undermined by the constrained formulation of the questionnaire, the poverty of and the bias in the proposed curriculum structures and the overt slanting of the overview towards one particular structure. Which brings us to the issue of consultation:

There is no value in “broad consultation” if discussion has already been constrained to the consideration of three extremely poor options.

But, “consult” the VCAA will:

The VCAA will consult with key stakeholders and interested parties to ensure that feedback is gained from organisations, groups and individuals.

Well, great. The writer of this blog is a keenly interested stakeholder, and an individual well known to the VCAA. Should we be waiting by the phone? Probably not, but it hardly matters:

The VCAA has provided no indication that the consultation with “key stakeholders” and “interested parties” will be conducted in a manner to encourage full and proper critique. There is very good reason to doubt that any feedback thus gained will be evaluated in a fair or objective manner.

The overview then outlines three “key background papers” (links here). Then:

… stakeholders are invited to consider and respond to the consultation questionnaire for each structure.

Simply, this is false. Question 1 of VCAA’s questionnaire asks

Which of the proposed structures would you prefer to be implemented for VCE Mathematics?

Questions 2-8 then refer to, and only to, “this structure”. It is only in the final, catch-all Question 9 that a respondent is requested to provide “additional comments or feedback with respect to these structures”. Nowhere is it possible to record in a proper, voting, manner that one wishes to rank the Wolfram-CBM Structure C last, and preferably lower. Nowhere is there a dedicated question to indicate what is bad about a bad structure.

The VCAA questionnaire explicitly funnels respondents away from stating which structures the respondents believe are inferior, and why.

The good news is that the manipulativeness of the questionnaire probably doesn’t matter, since the responses will presumably just be considered by another VCAA “expert panel”.

The VCAA overview gives no indication how the responses to the questionnaire will be considered and provides no commitment that the responses will be made public.

The VCAA overview goes on to provide outlines of the three structures being considered, which we’ll write upon in future posts. We’ll just comment here that, whereas Structures A and (to a lesser extent) B are laid out in some reasonable detail, Structure C looks to be the work of Chauncey Gardiner:

What is written about Structure C in the VCAA overview could mean anything and thus means nothing.

True, for a “detailed overview” the reader is directed to the Wolfram-CBM paper. That, however, only makes matters worse:

A 28-page sales pitch that promotes particular software and particular commercial links is much more and much less than a clear, factual and dispassionate curriculum structure, and such a pitch has absolutely no place in what VCAA describes as a “blue-sky” review. By giving prominence to such material, the VCAA fails to treat the three proposed structures in anything close to a comparable or fair manner.

If there were any doubt, the overview ends with the overt promotion of Structure C:

The distinctive proposal … contain[s] aspects which the Expert Panel found valuable … There was support for these aspects, indeed, many of the invited paper respondents [to the 2018 paper] independently included elements of them in their considerations, within more familiar structures and models.

Nothing like putting your thumb on the scales.

It is entirely inappropriate for a VCAA overview purportedly encouraging consultation to campaign for a particular structure. A respondent having “included elements” of an extreme proposal is a country mile short of supporting that proposal lock, stock and barrel. In any case, the cherry-picked opinions of unknown respondents selected in an unknown manner have zero value.

Though woefully short of good administrative practice, we still might let some of the above slide if we had trust in the VCAA. But, we do not. Nothing in VCAA’s recent history or current process gives us any reason to do so. We can also see no reason why trust should be required. We can see no reason why the process lacks the fundamental transparency essential for such a radical review.

In summary, the VCAA review is unprofessional and the consultation process a sham. The review should be discarded. Plans can then be made for a new review, to be conducted in the professional and transparent manner that Victoria has every right to expect.

## Reviewing the VCAA Review – Open Discussion

The VCAA is currently conducting a “review” of VCE mathematics. We’ve made our opinion clear, and we plan to post further in some detail. (We’ll update this post with links when and as seems appropriate.) We would also appreciate, however, as much input as possible from readers of (especially critics of) this blog.

This post is to permit and to encourage as much discussion as possible about the various structures the VCAA is considering. People are free to comment generally (but carefully) about the VCAA and the review process, but the intention here is to consider the details of the proposed structures and the arguments for and against them. We’re interested in anything and everything people have to say. Except for specific questions addressed to us, we’ll be pretty much hands-off in the comments section. The relevant links are

## The Wolfram at the Door

(Note added 20/4: A VCAA questionnaire open until May 10 is discussed at the end of this post. Anyone is permitted to respond to this questionnaire, and anyone who cares about mathematics education should do so. It would be appreciated if those who have responded to the questionnaire indicate so in the comments below.)

Victoria’s maths education is so awful and aimless that it’s easy to imagine it couldn’t get much worse. The VCAA, however, is in the process of proving otherwise. It begins, and it will almost certainly end, with Conrad Wolfram.

We’ve long hoped to write about Wolfram, the slick salesman for Big Brother‘s Church. Conrad Wolfram is the most visible and most powerful proponent of computer-based maths education; his Trumpian sales pitch can be viewed here and here. Wolfram is the kind of ideologue who can talk for an hour about mathematics and the teaching of mathematics without a single use of the word “proof”. And, this ideologue is the current poster boy for the computer zealots at the VCAA.

The VCAA is currently conducting a “review” of VCE mathematics, and is inviting “consultation”. There is an anonymous overview of the “review”, and responses to a questionnaire can be submitted until May 10. (Below, we give some advice on responding to this questionnaire. Update 25/4: Here is a post on the overview and the questionnaire.) There is also a new slanted (and anonymous) background paper, a 2017 slanted (and anonymous) background paper, a 2014 slanted (and anonymous) background paper, and some propaganda by Wolfram-CBM.

In the next few weeks we will try to forego shooting Cambridge fish in the barrel (after a few final shots …), and to give some overview and critique of the VCAA overview and the slanted (and anonymous) background papers. (We hope some readers will assist us in this.) Here, we’ll summarise the VCAA’s proposals.

The VCAA has stated that it is considering three possible structures for a new VCE mathematics study design:

• Structure A.1 – the same warmed over swill currently offered;
• Structure A.2 – tweaking the warmed over swill currently offered;
• Structure B – compactifying the warmed over swill currently offered, making room for “options”;
• Structure C – a “problem-centred computer-based mathematics incorporating data science”.

What a wealth of choice.

There is way, way too much to write about all this, but here’s the summary:

1. Structure C amounts to an untested and unscripted revolution that would almost certainly be a disaster.

2. The VCAA are Hell-bent on Structure C, and their consultation process is a sham.

So, what can we all do about it? Pretty much bugger all. The VCAA doesn’t give a stuff what people think, and so it’s up to the mathematical heavy hitters to hit heavily. Perhaps, for example, AMSI will stop whining about unqualified teachers and other second order trivia, and will confront these mathematical and cultural vandals.

But, the one thing we all can do and we all should do is fill in the VCAA’s questionnaire. The questionnaire is calculatedly handcuffing but there are two ways to attempt to circumvent VCAA’s push-polling. One approach is to choose Structure C in Q1 as the “prefer[red]” option, and then to use the subsequent questions to critique Structure C. (Update 25/4: this was obviously a poor strategy, since the VCAA could simply count the response to Q1 as a vote for Structure C.) The second approach is to write pretty much anything until the catch-all Q9, and then go to town. (20/4 addition: It would be appreciated if those who have responded to the questionnaire indicate so below with a comment.)

We shall have much more to write, and hopefully sooner rather than later. As always, readers are free to and encouraged to comment, but see also this post, devoted to general discussion.

## The VCAA Dies Another Death

A while back we pointed out two issues with the 2018 Specialist Mathematics Exams. The Exam Reports (though, strangely, not Exam 1) are now online (here and here). (Update 27/02/19: Exam 1 is now also online.) Ignoring some fresh Hell suggested by the Exam 2 Report (B2(b), B3(c)(i), B6(e)), how did the VCAA address these issues?

Question 3(f) on Section B of Exam 2 was a clumsy and eccentrically worded question that covered material outside the curriculum. Unsurprisingly the Report made no mention of these issues. But, what about a blatant error by the Examiners? Would they remain silent in the face of such an error? Again?

Question 6 on Exam 1 (not online) required students to find the “change in momentum” of an accelerating particle. Unfortunately, the students were required to express this change in kg m s⁻², the units of force rather than of momentum (kg m s⁻¹). The Exam had included the wrong units, just a careless typo, but a blatant error. The Report addressed this blatant error with the following:

Students who interpreted this question as asking for the average rate of change of momentum to be dimensionally consistent with the units and did this correctly were awarded marks accordingly.

That’s it. Not an honest word of having stuffed up. Not a hint of regret or apology. Just some weasely no-harm-no-foul bullshit.
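For the record, the dimensional point can be checked mechanically. Momentum mv carries units kg m s⁻¹; kg m s⁻² are the units of force, that is, of the rate of change of momentum. A toy sketch, tracking SI base-unit exponents:

```python
# A quick dimensional check of the exam's error: momentum (mass * velocity)
# has units kg m s^-1, while kg m s^-2 are the units of force, i.e. of the
# *rate of change* of momentum. Units here are dicts of base-unit exponents.
from collections import Counter

def mul(d1, d2):
    """Multiply two quantities' units by adding their exponents."""
    out = Counter(d1)
    out.update(d2)
    return {k: v for k, v in out.items() if v}

KG, M, PER_S = {"kg": 1}, {"m": 1}, {"s": -1}

momentum = mul(KG, mul(M, PER_S))  # mass * velocity
force = mul(momentum, PER_S)       # momentum per second

print(momentum)  # {'kg': 1, 'm': 1, 's': -1}
print(force)     # {'kg': 1, 'm': 1, 's': -2}
```

So the exam's stated units were not for the requested "change in momentum" but for a rate of change, which is precisely the back-door the Report's "dimensionally consistent" weaselry relies upon.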