## VCAA’s Welcome, Cowardly Redactions

It’s difficult to get a handle on VCAA these days, which seems to be suffering from Multiple Personality Disorder. There are some indications of significant and positive change, but there are also signs of business as usual. To be clear, the MPD is a dramatic improvement over VCAA’s previous decades of unceasing sociopathy and narcissism. But it also makes it difficult to comment on VCAA doings right now, since the mixed messages mean that we don’t know which VCAA is in control. Such is the case with VCAA’s recent exam redactions. Continue reading “VCAA’s Welcome, Cowardly Redactions”

## VCAA’s Mathematical Flea Circus, Part Two

Yesterday’s post was on the 2023 Methods Exam 1 report (Word, idiots), on its foggy focus on trivia and on the ludicrous non-solution of one particular question. Now it’s on to the Exam 2 report (Word, idiots), which is what prompted these posts.

Here are the Exam 2 General Comments in their entirety. I have commented elsewhere on a couple of aspects (here on Q3(a), here on Q3(e) and here on Q5(b)). I won’t comment further now, except to note that it seems to me to be nothing but fleas. I can detect no sign whatsoever of any underlying host.

This is the first year of the new study design and most students were able to respond effectively to the questions involving the introduced concepts, such as Newton’s method in Questions 3f. and 3g. and the point of inflection in Question 3d.

As the examination papers were scanned, students needed to use a HB or darker pencil or a dark blue or black pen.

Some students had difficulty understanding some of the concepts, such as the difference between average value of a function and average rate of change (Questions 2b. and 2c.). This year two questions involved strictly increasing and strictly decreasing functions. In Question 3e. many students used round brackets instead of square brackets. In Question 5b. either bracket type was appropriate as the maximal domain was not required. Some students did not attempt Question 3a. They appeared to be confused by the limit notation .

Students need to ensure that they show adequate working where a question is worth more than one mark, as communication is important in mathematics. A number of questions could be answered using trial and error in this year’s paper, especially in relation to probability, Questions 4f. and 4g. Drawing a diagram can often be helpful to show the output for some of the trials. Students are allowed to use their technology to find the area between two curves using the bounded area function, but they must show some relevant working if the question is worth more than one mark. Often, questions require that a definite integral is written down, such as in Question 1cii. In Question 4d. students were expected to identify and write down the n and p values for the binomial distribution, not just the answer.

There were a number of transcription errors, and incorrect use of brackets and vinculums, in Questions 1b., 1ci., 1d. and 3ci. Students needed to take more care when reading the output from their technology. For example, some students wrote instead of  in Question 1b. Others wrote instead of  in Question 1ci.

Students need to make sure they give their answers to the required accuracy. In most of Question 1, Questions 3b. and 3h., and Question 4i., exact values were required. In all questions where a numerical answer is required, an exact value must be given unless otherwise specified. A number of students gave approximate answers as their final response to these questions. In Question 3f. some students gave their answers correct to two decimal places when the response required three decimal places.

There were a number of rounding errors, especially in Questions 1ciii., Questions 3d. and 3f., Questions 4a. and 4h., and Question 5ci. Students must make sure they have their technology set to the correct float or take more care when reading and transcribing the output.

Students need to improve their communication with ‘show that’ and transformation questions. There was only one ‘show that’ question this year, Question 2a. Sufficient working out, presented as a set of logical steps with a conclusion, needed to be shown. In Question 5a., the transformation question, the correct wording needed to be used, such as ‘reflect in the y-axis’.

In general, students appeared to have made good use of their technology, for example in finding the equations of tangent lines, finding bounded areas and using graph sliders to get approximate answers to complicated questions. Some students, however, need more practice at interpreting the output from their technology, especially when the technology uses numerical methods to find solutions. In Question 3cii., the tangent line passes through the origin, but some students gave y = 4.255x + 8.14E-10 as their final answer, not appearing to recognise that 8.14E-10 should be zero.

Most students made a good attempt at the probability questions. Some students, however, still misinterpreted the wording. Errors occurred in Questions 4a., b., c. and d.

Students are reminded to read questions carefully before responding and then to reread questions after they have answered them to ensure that they have given the required response. Question 1a. required the answers to be in coordinate form. Question 3ci. required an equation. In Question 3cii. many students only found the value of a and did not continue to find the equation of the tangent.
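For what it’s worth, the 8.14E-10 in Question 3cii. is garden-variety numerical residue, the kind any iterative solver leaves behind when it stops at a tolerance rather than at an exact root. A minimal Python sketch (using f(x) = eˣ as a stand-in, since the exam’s actual function is not reproduced here) shows Newton’s method producing exactly this sort of almost-but-not-quite-zero intercept:

```python
import math

# Stand-in function (assumption: NOT the exam's actual function).
f = math.exp    # f(x) = e^x
df = math.exp   # f'(x) = e^x
d2f = math.exp  # f''(x) = e^x

# The tangent at x = a passes through the origin exactly when its
# y-intercept g(a) = f(a) - a*f'(a) equals zero.
def g(a):
    return f(a) - a * df(a)

def dg(a):
    # d/da [f(a) - a*f'(a)] = -a*f''(a)
    return -a * d2f(a)

# Newton's method on g, stopped at a modest tolerance,
# much as a CAS's numerical solver would be.
a = 0.5
for _ in range(50):
    step = g(a) / dg(a)
    a -= step
    if abs(step) < 1e-9:
        break

slope = df(a)
intercept = g(a)  # mathematically zero; numerically a tiny residue
print(f"y = {slope:.3f}x + {intercept:.2e}")
```

The printed intercept is on the order of 10⁻¹³ rather than exactly 0: the machine’s way of writing y = ex. The point of the report’s comment stands, of course: a student reading such output should recognise the residue as zero.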

## VCAA’s Mathematical Flea Circus, Part One

To people not involved in Mathematical Methods, it is difficult to convey the awfulness of the subject. Even regular readers of this blog, who will be somewhat aware of our million or so posts, cannot properly appreciate Methods’ unrelenting shallowness and pedantry and clunkiness and CASiness. The subject, to the extent there is one, is infested with fleas.

## WitCH 139: Increasing Irritation

This one is from Q3 of the 2023 Methods Exam 2 and the subsequent exam report (Word, idiots). The entire question is terrible, as discussed here, but I’ll be interested in what people have to say about the specific part below.

## The New Report on the VCE Exams is Out

The report of the VCE exam review conducted by Dr. John Bennett has been handed down. The media release is below and the Executive Summary of the report is here (and in Word). Presumably, there is no intention to release the full report. (23/03/24. And my presumption was wrong. See the update below.) I haven’t yet had a chance to look at anything, and I may update this post later.

## UPDATE (20/02/24)

VCAA’s acting CEO, Kylie White, gave a lengthy ABC interview this afternoon. There are also reports in the Age, the Herald Sun and EducationHQ.

I’ll leave off writing any detailed thoughts about the report until the dust and my brain have somewhat settled. In brief, while there are significant nits to pick, the report is very good on some important recommendations. In particular: employ competent mathematicians; and report in a timely manner. These are not magic wands, but they would be big and important steps.

## The Complex Roots of VCAA’s Defence

### 1. Introduction

Sometimes VCAA is their own worst enemy. Well, no: we all know the identity of VCAA’s worst enemy. But on occasion VCAA places runner up. Maybe third.

I’ve been pondering VCAA’s major 2022 errors, how they could have occurred and how VCAA could continue to defend the flawed questions for so long, against all reason. Yes, “VCAA is nuts” springs to mind. But that’s not enough. VCAA being nuts is necessary but not sufficient to explain this extended episode of madness. Continue reading “The Complex Roots of VCAA’s Defence”

## VCAA’s Defence of the Indefensible 2022 Exam Questions

The most maddening aspect of VCAA is their steadfast refusal to admit error. Typically, VCAA simply will not engage and whatever nonsense they’ve written remains written. 2022 was different, however, in that Burkard and I badgered VCAA sufficiently for VCAA to decide to conduct an external review(s). As well, a mathematics teacher complained strongly enough and unwaveringly enough about the four bad Specialist Mathematics questions that a substantial response to his specific objections was required. Continue reading “VCAA’s Defence of the Indefensible 2022 Exam Questions”

## Did VCAA Misgrade the 2022 Exams and, if so, Why?

This will be my last VCAA post for a while. I’m sick of it all. And, anyway, it’s now a matter of waiting to see what the Minister and the newly acting CEO of VCAA will do. I think both of them have demonstrated some genuine interest in repairing the mathematics exams, and I think both of them deserve an honest shot at it, without a nasty blogger sniping at them. But, there’s one last post I think must be written.

One of the lingering puzzles of VCAA’s Deloitte Debacle is why VCAA were so determined to game their mathematics exam reviews. VCAA were clearly determined beyond all plausibility to ensure that Deloitte examined nothing beyond trivialities, and it seems VCAA didn’t let a proper mathematician get within a country mile of the exam questions. But why? Sure, the 2022 exams were, as always, bad, with, as always, bad errors, and any halfway honest review would have been plenty embarrassing. But bureaucratic deceit can only hide so much, and not nearly as much as VCAA had to hide. So now, rather than simply looking like a gang of clowns, VCAA looks like a gang of deceitful clowns. Continue reading “Did VCAA Misgrade the 2022 Exams and, if so, Why?”

## A Student Petition on the VCE Exams

A group called High School Activate has launched a petition, to the VCAA and the Victorian Minister for Education, on the VCE errors business. I obviously agree with and support plenty (but not all) in the letter. But just to be clear, I have no idea who HSA are, and they definitely aren’t me or associated with me.

Presumably the intention is for the petition to be signed by current and finishing school students, but it’s up to you/HSA.