And you’re done (unless you’re doing Foundation). I hope the exams went well for all of you, and all of your students.

## UPDATE (12/11/23)

And, finally, my thoughts on Part B. In brief, it’s shallow drivel. One 3-mark question, and nothing above. The entire exam is a disgrace, just poorly written semi-maths. With errors.

**Q1.** Yet another idiotic walking track. The “minimum turning point” should be “turning point”, which still means pretty much nothing for a walking track. In part (b), the curves meeting “smoothly” would be sort of ok, since this has a casually accepted meaning of the derivatives matching, *except* the uncorrected 2021 exam report might screw kids up. The entire problem is aimless and as boring as dirt.

**Q2**. An OK but too easy complex question. In (a) and (b), the writers use “root” to refer to a solution of a polynomial equation: the kind of thing examiners love to whine about in the reports. Part (c) has the points “represent” complex numbers, and in (d) a point is “represented” by a complex number: both versions are needlessly clumsy, but at least make a choice and stick to it. The wording of (d)(ii) is atrocious. Part (e) is just depressing, but (f) is better. In (f)(i), specifying that A and B are positive reals is oddly neither here nor there: it does not uniquely determine either, most importantly B, and could have been much better targeted for (f)(ii). The “[g]iven that w = cis(2π/7)” in (f)(ii) doesn’t really mean anything.
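As an aside on (f)(ii): whatever the “given” was meant to add, the standard facts about w = cis(2π/7) are easy to sanity-check numerically. A sketch (the identities checked here are the usual roots-of-unity ones, an assumption about what the question uses, since the question itself isn’t reproduced):

```python
import cmath

# w = cis(2*pi/7), a primitive seventh root of unity
w = cmath.exp(2j * cmath.pi / 7)

# w^7 = 1, and the seven seventh roots of unity sum to zero
print(abs(w ** 7 - 1) < 1e-12)                     # True
print(abs(sum(w ** k for k in range(7))) < 1e-12)  # True: 1 + w + ... + w^6 = 0
```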

**Q3.** Jesus. Rotating a curve does not give a solid. VCAA can keep saying it, and it will keep being wrong. Part (b) should just be “the surface area”, not “the curved surface area”, but of course the idiots screwed up the preamble. Why is (b) split into two parts, which also means (b)(i) has a zillion different answers? Yes, one answer is more natural, but it’s bad mathematical style. The “[h]ence or otherwise” in (b)(ii) is weird and unhelpful: what could possibly be the “otherwise”? Parts (c) and (d) might have been interesting algebra, but are just boring numerical crap.
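For what it’s worth, the surface-vs-solid point is easy to make concrete numerically. A toy sketch (a unit sphere, not the exam’s curve): rotating the curve y = √(1 − x²) about the x-axis sweeps out a spherical *surface* of area 4π, via S = 2π∫ y√(1 + (dy/dx)²) dx; the solid ball only arises from rotating the *region* under the curve.

```python
import math

def surface_area_of_revolution(f, df, a, b, n=100_000):
    """Midpoint-rule approximation of S = 2*pi * integral of f(x)*sqrt(1 + f'(x)^2) dx."""
    h = (b - a) / n
    total = 0.0
    for i in range(n):
        x = a + (i + 0.5) * h
        total += f(x) * math.sqrt(1 + df(x) ** 2)
    return 2 * math.pi * total * h

# Rotating the CURVE y = sqrt(1 - x^2) gives a spherical surface, area 4*pi.
f = lambda x: math.sqrt(1 - x * x)
df = lambda x: -x / math.sqrt(1 - x * x)  # derivative, found by hand
area = surface_area_of_revolution(f, df, -1.0, 1.0)
print(area)  # ≈ 4*pi ≈ 12.566
```

The midpoint rule is used deliberately: it avoids the endpoints x = ±1, where dy/dx blows up.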

**Q4.** A boring and clumsy logistic question. The population Q is “modelled” by the solution in the preamble to (c), and then the differential equation is given in (e). Part (e)(i) could be nice algebra, but is presumably just 1-mark CAS crap. The wording of (e)(ii) is atrocious. Part (g) is poorly worded and, although better than the related WitCH, is still borderline wrong: what does “the maximum number of fish that could be supported” mean? One can have more fish in the pond than the equilibrium number, but this seems to be no part of VCAA’s presentation of the logistic model.
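The above-equilibrium point is easy to see numerically. A minimal sketch (the values r = 0.5 and K = 1000 are made up, not the exam’s): a logistic population starting above the carrying capacity is perfectly possible, and simply decreases toward it.

```python
def logistic_steps(p0, r=0.5, K=1000.0, dt=0.01, t_end=50.0):
    """Euler-integrate dP/dt = r*P*(1 - P/K) and return the trajectory."""
    p = p0
    traj = [p]
    for _ in range(int(t_end / dt)):
        p += dt * r * p * (1 - p / K)
        traj.append(p)
    return traj

under = logistic_steps(200.0)   # starts below the carrying capacity
over = logistic_steps(1500.0)   # starts ABOVE it: more fish than equilibrium

print(under[-1], over[-1])  # both approach K = 1000
```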

**Q5.** A pedestrian lines and planes question. Parts (b) and (e) should refer to “distance”, not “shortest distance”. More importantly, as discussed here, why are these distance equations examinable? Given they are examinable, why are the relevant formulas not included on the formula sheet? This really seems remarkably incompetent. As Sentinel has pointed out, (d) asks for “an equation [singular] of the line in parametric form”: it should be three equations, and it is not clear a single vector equation will be marked correct (and it damn well should be). Lastly, and it’s no big deal, but the choice of ψ to refer to a plane is eccentric.

**Q6.** It’s stats crap. I don’t do stats crap. But, famously, VCAA screwed up (h).

## UPDATE (12/11/23)

Thank you to everyone for your comments. Here are my thoughts on the multiple choice questions. In brief, they suck.

**MCQ 3.** Excruciatingly bad wording, a genuine literary offence. ~~The endpoints of the interval are included, even though the function is undefined there.~~ (**12/11/23**. The function is fine at the endpoints, but not at all points in the interval. That means the wording is a little poor, but we’ll leave it be.)

**MCQ 4.** A poor question, fundamentally simple but with gratuitous, distracting noise.

**MCQ 5**. In principle a reasonable question but, as discussed below, the specification that … is weirdly pointless: students who falsely conclude that … will still likely wind up with the correct answer. Plus, for the thousandth time, the proper word is “equals”, not “equivalent”.

**MCQ 6**. It’s pseudocode crap. I don’t do pseudocode crap. But commenters have pointed out that the question is badly flawed: the code does not follow VCAA’s own style guide, and following the code will not print out what is claimed will be printed out.

**MCQ 7.** Gradient field or slope field is more accurate than direction field.

**MCQ 9.** An OK question, but it is slightly odd, and clumsy, to ask for “the slope of the tangent to the path of the particle”. This is not particularly meaningful for a path in space, and “slope” is an oddly casual word.

**MCQ 10.** A very good question with extraordinarily clumsy wording.

**MCQ 11.** In principle a good question, but very clumsily written and too busy. There is no need to say “the curved surface”: it is simply “the surface”. Similarly, “part of the curve” should just be “the curve”, since the domain is immediately specified. Then, it is simpler and preferable to say “the area equals …”, rather than “the area can be found by evaluating …”. Finally, the last two multiple choice options are different in character, making the question improperly confusing, and the specification “where u = sin(y)” is mathematically meaningless (even if the reason to include it is clear and was well-intentioned).

**MCQ 13.** What trivial SUVAT is doing on an SM exam, God only knows.

**MCQ 14.** An odd and interesting question, but probably not a good question. The point is that **c** must be perpendicular to both **i** and **j**, and then the question is easy. But it is not clear how SM students are supposed to get there. (Plus, the dots for the dot product should be centred.)

**MCQ 15**. A good question.

**MCQ 16.** More trivial SUVAT. Also, “vertical distance” is not really a thing, but ok. Why specify that **i** points east and **j** points north? Who the hell cares?

Hmmm… extended response looks easier than Methods… >.<

The indentation on MCQ 6 is so weird — why are the statements after the function definition and before the start of the while loop indented? The ‘print y’ is also not indented inside the while loop, so at first glance it may look like the code does not print anything, if not for the ‘end while’ at the end.

I agree, the formatting for the pseudocode is unacceptably bad. It gives the impression that the person writing that question has never written pseudocode before. I’m guessing the person writing this question looked up an example of pseudocode for reference, and completely misunderstood the formatting of the example. And then no one proofread it, as this is the sort of thing you’d expect a middle school student with even the slightest interest in programming to pick up on.

The annoying thing about this is that VCAA specified that their pseudocode follows these basic principles:

– Have only one statement per line.

– Use indentation to show the hierarchy of processes within an algorithm, such as repeating sections and conditional decisions.

– End nested processes with an end keyword (end if, end while).

They missed an EndIf in the Methods paper and mucked up the indentation in Specialist… it really is not so hard. Even if the writer mucked it up, the proofreaders should have picked it up.

The Euler’s method question in specialist is easily done using the euler(…) function on the TI Nspire, and the Newton’s method question in methods converged so fast you could just solve the cubic and get the right multiple choice answer…
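For readers without the CAS: the euler(…) function is doing nothing more than the following few lines. A sketch (the toy DE dy/dx = y with y(0) = 1 is an assumption for illustration, not the exam’s equation):

```python
def euler(f, x0, y0, h, steps):
    """Basic Euler's method: y_{n+1} = y_n + h * f(x_n, y_n)."""
    x, y = x0, y0
    for _ in range(steps):
        y += h * f(x, y)
        x += h
    return y

# Toy example: dy/dx = y, y(0) = 1, step h = 0.1, so y(1) is approximated
# by (1.1)^10 ≈ 2.5937 (the exact value is e ≈ 2.7183).
approx = euler(lambda x, y: y, 0.0, 1.0, 0.1, 10)
print(approx)
```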

Thanks, Simon. I pay no attention to pseudocode, but this sounds pretty bad.

The more I look at it, the worse it gets.

The first line is incredibly confusing. Anyone who’s ever read pseudocode before would expect that to be the definition of a procedure, but no, it’s just a random definition of a function. Don’t mind the fact that the next four lines are indented for no reason! The confusion is compounded because the variables used in this function definition are used throughout the rest of the pseudocode.

“define”, “while”, and “end while” are control structures. “print” is nothing like those. If you are going to have different levels of indentation (which you should), do it properly, not like this. The purpose of indentation is to make the boundaries of blocks of code more clear, not to just have different types of statements arbitrarily indented differently for no reason!

There is no specification of what internal number system is being used here, so there is no guarantee at all that the pseudocode will print 2.709. Most likely it will not, as computers typically don’t print things to three decimal places. Why not write “print y to 3 decimal places”? Or specify “a value in [2.708, 2.710].”
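The printing point in a couple of lines of Python (the value of y here is a stand-in, not the exam’s):

```python
y = 2.7092566  # a stand-in value; the exam's actual y is not reproduced here

print(y)           # prints every stored digit: 2.7092566, not "2.709"
print(f"{y:.3f}")  # prints "2.709": rounding must be asked for explicitly
```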

I feel like how the supposed “machine” stores numbers, whether it follows IEEE 754 or whatever, is irrelevant for pseudocode (given the “pseudo”). In my opinion, it is perfectly valid to assume that we can deal in exact values for pseudocode purposes, without needing to be explicit about it.

The question asks something along the lines of “after how many iterations will the pseudocode print 2.709.”

The behaviour of how the “machine” stores/prints numbers here is relevant, as if the machine doesn’t output to exactly three decimal places, the machine won’t print 2.709.

I’m with Tungsten here. I think it’s safe to assume that all values will be exact, especially as this is a maths exam and students will certainly not be expected to know about floating point representations.

Another weird thing is the ‘n’ variable that doesn’t do anything. I’m not sure what the intention was to have ‘while n ≥ 0: n ← n + 1’. Maybe something to do with the number of iterations? I still think determining the number of iterations is clear enough without the ‘n’ variable.

I initially thought that the inclusion of a value for n was just intentional obfuscation from VCAA, but I see that it could be used for the number of iterations.

The decimal point thing isn’t as big of a deal as the other issues, but it’s still there. Since this is a multiple choice question, it’s obvious what they are expecting. But what if it wasn’t a multiple choice question? The answer “never” would be a correct answer.

Very good point.

Given everything else weird with the question, I reckon the ‘n’ bit could have been a copy paste or something.

I had a look at the question again, and I do agree that it would be best to specify the number of decimal places.

I believe there is an error in ERQ6 – have they labelled the distributions in part h the wrong way around?

Several people in my class lost over five minutes trying to figure out what was going on with the labelling, pathetic from VCAA to have such an error.

We were thinking the same thing at our school.

H0 and H1 labelled wrongly!

It is shocking! Two of my good students noticed!

Lots of one-mark questions, which is different from the normal Specialist Exam 2 style.

It is terrible. They do not care about the process/understanding of the topic; correct working out won’t get any marks. It is not good for anyone, except VCAA markers!

Cannot believe there was an error after all this attention over the General paper. I lost 15 minutes doing and then redoing the last question with my hypotheses swapped (convinced there was no way they hadn’t caught an error after all this) and then deciding to re-do the question once more properly and re-label the normal distributions provided. Likely lost several marks that could’ve been gained from using that time to check MC and the rest of the paper.

That is horrible. The paper would have been finalised and printed a while ago before the General exam, but it is inexcusable in the best of times and after the debacle of last year and media coverage since you would think…

True but they’d caught an error on Methods and one of the two errors in General in time and instructed students to cross out the error and amend it. I’m shocked that, days after the education minister held a press conference and the CEO of the VCAA was on ABC radio claiming this won’t happen again, no one thought to finally rigorously proofread the Specialist paper yet to be sat. This isn’t even like the errors last year, which a student on “autopilot” wouldn’t notice, which is likely why they weren’t caught by VCAA. Any student with half a brain, and certainly any professional mathematician hired to test sit this paper before its release, should catch this error under any circumstances.

I wouldn’t have caught it. But I wouldn’t have pretended I was qualified to vet a stats question.

True I mean I’m exaggerating. The reality is more tragic: most students wouldn’t catch this, only those at the top of the spectrum meaning my entire class who were all pursuing 50s are distraught at having wasted so much time whilst most on public forums seem to just be celebrating an easy exam unaware of the error.

Exactly. But VCAA and its allies will imply or say outright, like always, that an error affects everyone equally.

No they won’t. Not here. But they’ll probably say they’ll “ensure no student is disadvantaged”. As I noted in my “quick comments” post, it is impossible for them to ensure that.

I just want to point out that, à la Mike Williamson, I tipped this.

Nah, you would have caught it. Even someone who is not a stats expert (only working at year 12 SM level) who sat down to actually answer the paper as printed would have found it.

You don’t know the way stats and I get along.

For the curious, the hypothesis testing question is written as follows:

It is thought that the mean mass of adult male koalas in the forest is 12kg. The ranger thinks that the true mean mass is less than this and decides to apply a one-tailed statistical test. A random sample of 40 adult male koalas is taken and the sample mean is found to be 11.6 kg.

…

The question then goes on to test type II errors by asking:

Suppose that the true mean mass of adult male koalas in the forest is 11.4 kg, and the standard deviation is 1 kg….

Part h has H1 labelled next to the normal distribution centered about 12, while H0 is labelled next to the normal distribution centered about 11.4. Whoops. I’d imagine a few students would be very confused by this error, since the task was to shade the type II error on the diagram.

Attached is the diagram from part h.

The normal distribution curves in the figure indicate identical variances / standard deviations under the two hypotheses considered, is this also reflected in the question text somewhere? The first paragraph only mentions the mean, 12 kg. (An aside: very considerate of Mother Nature to choose a standard deviation of 1 kg to make computations and inspection of the figure easier.)
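For the curious, β is a short computation from the numbers quoted above (H0: μ = 12 versus H1: μ < 12, σ = 1 kg, n = 40, true mean 11.4 kg). A sketch using Python’s stdlib, assuming a 5% significance level (the exam’s α is not quoted in this thread):

```python
from math import sqrt
from statistics import NormalDist

mu0, mu_true, sigma, n = 12.0, 11.4, 1.0, 40
alpha = 0.05  # ASSUMPTION: the exam's significance level is not quoted above
se = sigma / sqrt(n)

# One-tailed test: reject H0 when the sample mean falls below this critical value.
crit = mu0 + NormalDist().inv_cdf(alpha) * se

# Type II error: fail to reject H0 even though the true mean is 11.4.
beta = 1 - NormalDist(mu_true, se).cdf(crit)
print(crit, beta)  # crit ≈ 11.74, beta ≈ 0.016 under these assumptions
```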

I guess it’s understandable if at the moment no one cares that a rotated curve doesn’t form a solid.

Ah, ERQ3…. A curve being rotated about the x-axis does not yield a solid of revolution, the wording should be something along the lines of the region bounded by … forms a solid of revolution. And then again in part d, they repeat the same error.

Yeah, no one cares but me. But I care.

I’m not free to look at this right now, but I’m happy for anyone to post a screenshot of the relevant part of QB6.

Here is the whole question.

Part h is the particular issue, although it draws from several earlier parts.

SpecMath2023E2B6


am i stupid/delusional/insane or was the graph in the last question the wrong way around? i spent like 10 minutes trying to figure out what they wanted from me lmfao

Commenting in relation to WitCH 113: this exam, this time in the ER section, had a question to find the values of a and b for which the piecewise function is “smooth”. I’m sure Marty may have some comments on this once again.

Indeed. I think it says “smoothly”. Will look properly when I can.

Specifically, the question asks to “Verify that the two curves meet smoothly at point C” (where …), in reference to a piecewise function:

The a and b thing is in the prior part where you're required to show that … and …. Still, a use of "smooth" nonetheless…

“Meet smoothly” is not quite the same as “smooth”. I’ll think about it.

I’ve thought about this more, and had pondered adding it to the “smooth” WitCH, but decided against it. There’s a colloquial use of “smoothly”, and that slides into “join smoothly” meaning the gradients match. That was made explicit in a question on the 2006 MM2 exam:

“The track passes smoothly from one section of the track to the other at B (that is, the gradients of the curves are equal at B).”

That’s also the clear enough meaning and task here.

The recent usage of “smooth” in Methods has much more the sense of a formal (and wrong) definition, and a confusion over what is required to prove a function is smooth.
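For concreteness, “meet smoothly” in the 2006 MM2 sense is just two checks: equal values and equal gradients at the join. A toy example (not the exam’s curves): f(x) = x² and g(x) = 2x − 1 meet smoothly at x = 1.

```python
# Toy piecewise join: f(x) = x**2 for x <= 1, g(x) = 2*x - 1 for x >= 1.
f = lambda x: x ** 2
g = lambda x: 2 * x - 1
df = lambda x: 2 * x   # derivatives found by hand
dg = lambda x: 2.0

print(f(1) == g(1))    # True: the curves meet at x = 1 ...
print(df(1) == dg(1))  # True: ... and meet smoothly (the gradients match)
```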

I gather that logic was part of the new syllabus in Specialist Mathematics. Was this examined? (I don’t have copies of the examinations.)

Unless I missed something, it was examined only in one question in Exam 1, which seems to be a fairly standard induction problem, though I haven’t looked at it closely.

I thought that there was going to be a lot more, considering logic and proof is supposed to be an entire area of study.

I noticed a trivial (but at least correct) “contrapositive” MCQ, but I haven’t looked properly.

MCQ 1 asks you to find the contrapositive of the statement “If my football team plays badly, then they are not training enough”. No other logic questions that I can see.
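For the record, the contrapositive swaps and negates: “If they are training enough, then my football team does not play badly.” That a statement and its contrapositive are logically equivalent is a four-row truth table, which a few lines of Python can brute-force:

```python
from itertools import product

implies = lambda p, q: (not p) or q

# p: "my football team plays badly"; q: "they are not training enough".
# Check that (p -> q) and its contrapositive (not q -> not p) agree everywhere.
same = all(
    implies(p, q) == implies(not q, not p)
    for p, q in product([False, True], repeat=2)
)
print(same)  # True
```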

Susie O’Brien here from the H Sun. VCAA told me tonight they are investigating today’s spec exam error to make sure no one is disadvantaged. Will kids get an extra mark???

They still say there’s no issue with 2022. They’re sticking with the Deloitte line.

However, they want to use the maths professors from the open letter to help them with a new review.

Bit of a mess.

Here’s tonight’s story. And thanks for all your input. I got a letter from a student tonight saying today’s error in spec cost him 20 minutes in the exam.

An error in the second VCE specialist maths paper – the seventh maths mistake so far in the 2023 exams – is under investigation by the Victorian Curriculum and Assessment Authority.

A spokesman for the VCAA apologised for the “undue stress this has caused schools and students and will take steps to ensure that no student is disadvantaged”.

Teachers are calling for the drafting mistake in a one-mark question about type II errors to lead to all students who sat the exam being awarded a bonus mark.

The mis-labelled graph in Monday’s exam brings the 2023 maths error tally to at least seven, including three errors in general maths and three errors and a recycled question in maths methods.

A VCAA spokesman apologised for the undue stress the exam errors have caused schools and students. Picture: Nicole Cleary


The VCAA said it “accepts full responsibility for errors in the 2023 VCE examination papers. These do not meet the high standards that the VCAA sets and that the community rightly expect”.

One of the errors in the general maths exam led to an apology from Education Minister Ben Carroll last week and the awarding of one bonus mark to all students who sat the exam.

The new specialist maths error in section B, question 6, part h was picked up by students and teachers on Monday and involved the mislabelling of a graph, with H1 and H0 switched around. One leading teacher labelled the mistake “very careless and potentially confusing”.

“The question is worth one mark, I wonder if the VCAA will give everyone one mark like they did with general maths?” he told the Herald Sun.

The error-ridden 2023 exams come as 69 eminent professors and university mathematicians signed an open letter to Mr Carroll asking for further investigation into five serious errors in the 2022 maths exams on Sunday night.

The VCAA last year commissioned a review to examine errors in the 2022 specialist and methods maths exams raised by Monash University mathematicians Professor Burkard Polster and Dr Marty Ross.

The VCAA review found the language in the exams could have been more clear, but there were no serious errors that affected students.

This finding was not supported by Professor Polster and Dr Ross, who even offered to vet the exams for free. This was not accepted by the VCAA.

Similar concerns were raised by leading teacher John Kermond in a submission to a state parliamentary inquiry this year.

As a result of the lack of action by the VCAA, Professor Polster and Dr Ross wrote an open letter to Education Minister Ben Carroll expressing concern about the errors and the mishandling of the issues by state officials.

It reads: “We are university mathematicians, we have read the exam questions at issue, and we agree with Polster and Ross. Whatever term one wishes to use, all five questions are unacceptably flawed”.

“Each question exhibits some fundamental misunderstanding or misrepresentation of the underlying mathematics,” the open letter states.

“Each question, at minimum, would have created unnecessary confusion, with a subsequent loss of time for at least some students, and probably many.

“It is difficult to imagine how the questions could have been graded in a fair and consistent manner, and in any case the flaws are simply, on a mathematical basis, unacceptable,” the letter said.

On Monday the VCAA restated its view that the 2022 errors were not serious.

However, it will work with the signatories of the open letter on another review of maths exams “to ensure that the mathematics in future examinations meets the highest possible standard”.

Mr Carroll told the Herald Sun on Monday: “It’s my expectation that all curriculum and assessment is high-quality and grounded in academic integrity.”

Nearly 50,000 Victorian students sat at least one maths exam this year. There is one more exam to go – foundation maths – a new subject in 2023.

Thanks very much, Susie. Re VCAA:

“They still say there’s no issue with 2022. They’re sticking with the Deloitte line.”

“On Monday the VCAA restated its view that the 2022 errors were not serious.”

VCAA can say whatever they want: there’s not a person in the state who believes them.

No great shakes on this sort of crisis but here is my guess: someone (at least one) at VCAA is not having a holiday today and is busy writing a briefing for the Minister which he has demanded be on his desk first thing tomorrow (and will go up the chain tonight via the DET Secretary).

VCAA/DET is busy wondering how to control the process and potential contagion (is it just maths that has this problem – we don’t know what the Deloitte review covered but anything it said was OK is now clearly up for grabs – does this apply to other subjects? is it just errors on exams or is the exam process more generally flawed? are the exams high quality in general? What does this say about VCAA’s general processes of quality if they can’t even get exams without typos, let alone serious academic errors). What does it say about their ability to get rigorous input on exams as well as curriculum development more broadly etc etc? What does it say about their relationships with universities? They will definitely not want to release the Deloitte review. A number of people will be thinking (as they often do) about their careers.

Am guessing another review will be set up, which DET and VCAA will already be drafting terms of reference for and will want to be very very tightly focussed on just errors in maths exams and the process of their development, conducted by a trusted consultant, with minimal input from an external advisory committee.

If you want more details, look up Utopia on ABC iview.

JJ, After reading the posts detailing Marty and Burkard’s correspondence with VCAA, I think your Utopia comment is perfectly on point!

Unfortunately.

Thanks very much, JJ. Two quick questions (and a bunch of slow ones, which I’ll email).

1) VCAA obviously don’t want to release the Deloitte review (and ToR) (and the “state education bodies” review (and ToR)), since it was, even on the basis of the summary, an utter farce(s). But, can they avoid it? Can it likely be obtained either by FOI (very slow, of course) or the Minister demanding it (if he wants it)?

2) Obviously VCAA will seek to game any second review, to the extent they think they can get away with it. How much can they get away with it?

Of course “the signatories of the open letter” needn’t be part of anything they regard as a sham, and it would take one false peep from one of the VCAA clowns to convince any and all of them. (One professor who signed the letter subsequently described the Deloitte story as “blood curdling”.) But presumably also the Minister could demand a review be properly kosher (if he wants it)?

Hi Marty

In politics, speed is of the essence. Issues lose traction very quickly. Being proven right after the event and the decisions have been made is of little use (as I learnt the hard way).

The main game is now the ToR and the review. The Minister can do PRETTY MUCH ANYTHING he wants (but he will presumably want to fix things without too much embarrassment especially as it’s his Government). The usual process would be that the agency would propose a process including terms of reference (may already have gone to him). Prior to this there have likely been phone calls from advisers setting out how the Minister feels and parameters etc.

The current key debate is likely what are the terms of reference of the review (particularly how broad) and then will be who does it and who oversees it (eg by VCAA or DET) .

VCAA will say it’s just proofing. Your job is to show that it’s not just proofing, it’s the whole process and even taking out the typos, the questions would still be wrong. And to show just how unfair and awful it is. Given that VCAA has managed to do this all AFTER you went to them, it’s harder for them, but the Minister will want to make the problem go away. So you need to ensure he realises that fixing things deeply is required, not just a better typo detection. This may be the key now and stories from this website and the public may help (particularly if forwarded to the Minister).

How much VCAA can get away with depends on how the politics goes – that’s the Minister, you, the public and them. It’s all contingent on day by day decisions.

FOI – delay can be almost as good as denial. There are many many tricks to delay it and make it almost the same as blank by blacking out lots of bits on ‘commercial in confidence’ etc etc. By the time the Deloitte review is released if it ever is, it won’t matter because the decisions have been taken. The Minister could easily order its release (he can pretty much do anything) but will he want the embarrassment that causes?

Even when a Minister wants to act, they need to be driven to it – see here: https://www.dailykos.com/stories/2009/05/15/731660/-Make-Me-Do-It

Also worth reading – particularly inaugural speech – gives you some idea of the man – https://new.parliament.vic.gov.au/members/ben-carroll/

Thanks tons, JJ. And please expect an email soon.

Here are my solutions – some comments within

https://www.mathcha.io/editor/JXP0YuKpSJ6i6jsGgWQJDIlLV4XTJ10OzouDxeEGn

Feedback and corrections more than welcome

Thanks for pointing out that MCQ7 does not show a direction field (nor a vector field) but rather a slope field.

Words are important.

Section B, Q2b, I’m not sure if the examiners will require the arguments be in the domain and I would hope your answers would be marked as correct. I’m not sure though.

“Slope field” is more accurate and preferred, but “direction field” is also sometimes used. So, not great, but not in the same ballpark as VCAA’s perversion of “smooth”.

They won’t demand principal arguments in 2(b). Even VCAA is not that stupid.

Thanks!

On the “words are important” strain, there is the whole roots vs solutions confusion in ERQ2.

Oh, God. Really?

Yep – I’m pretty sure that you’ve hammered this one before, but “root” makes too good a pun for the titles of your posts, so I couldn’t find it.

Maybe we should make reading your blog (at least the WiTCH and exam posts) compulsory for exam writers and proof readers…

I think I’ve shared it before – but every time I see or discuss the problems with VCAA papers (not just in maths) I think of this interview about how AQA writes their GCSE and A level maths exams #027 Chief Examiner Trevor Senior: How GCSE Maths exams are written [also see here]

A clear and structured process to help catch problems and create fair assessments.

Thanks, Simon.

The root-solution thing was brought up in the 2023 NHT Methods discussion, here. In fact, I didn’t look to hammer it, and downplayed it as more of a “gauche” grammar thing. But, a commenter replied that the examiners have whined about this very issue. So, sauce for the goose, and I added it as a major (red) error on the Methods error list.

As for my blog not being compulsory reading for exam writers and vetters (and textbook authors), with all due humility I’m surprised. Of course I understand how irritating this blog is for them, and they probably also morally disapprove of my scathing and public approach, perhaps rightly so. But the simple fact is that I know what I am talking about. And, way too often, they do not.

Yes, I engage in plenty of slagging off on this blog (and maybe I should write a post on how it came to be like this). But I also work really, really hard to make the substantive criticism as clear as I possibly can. If these people would read this criticism with an open brain, they would be much less likely to repeat the same nonsense over and over and over. The latest WitCH is a perfect illustration of this.

Checking a little, there are slightly different whinings going on.

In the past, the examiners have whined about confusing “roots” and “factors”, for example, QB2(a)(iii) in 2014 SM2. (I only hunted quickly, and there may be more recent whining.)

In MCQ 17 of 2023 NHT MM2, the writers used “solution” instead of “zero” (or “root”, since it’s a polynomial).

Now, in the exam being discussed here, the writers are using “root” instead of “solution”.

Again, I don’t overly care about any of this. It should be corrected in students’ work as a matter of accuracy and style, but it’s not the end of the world. But, the examiners are so predictably pedantic about all manner of trivialities, including this stuff, they have no business screwing it up.

Just checking: we call a root of f a value a such that f(a) = 0, while a solution is usually written as “a solution to the equation f(x) = 0”? Would that be what you’re referring to, in ERQ2a-c?

In that case, writing “a is a root of f(x) = 0” would be better written as “a is a solution to the equation f(x) = 0”, assuming that the terminology should be distinct.

Hi Sai – yes that’s pretty much it.

A **zero** of a function f is an x such that f(x) = 0.

A **root** is usually reserved to mean a zero of a polynomial (generalising the idea of square and cube roots).

A **solution** requires an equation to solve. So a zero of f is a solution to f(x) = 0; or the solutions to f(x) = 0 are the zeros of f.

Q3c) should the area of the 2 circles use the y values of 1 and 2 instead of the x values?

In general, I am disappointed with this paper as I feel it did not give questions that requires more critical or deeper thought. Especially question 3 , part B, it really feels like a MM questions than a specialist one ( good MM student with provided formula can crack this easily).

I don’t want to look at the pseudocode question, what a mess. And of course, the normal curves. How on Earth this could happened?

What about all the logic, proof and so on that was considered a new part of SD, replacing mechanics? I sort of expected at least some vector proof to be incorporated into the vector question.

This paper is really bad from the point of view of teaching Specialist Mathematics. I would need to review how I structure my Units 1 and 2; otherwise, my students will be at a disadvantage in covering all the dot points in SD.

Thanks, Victoria. I’ve so far had zero chance to look properly at this paper (or Methods 2). Maybe someone above has already flagged this, but what was the “normal curves” thing?

Never mind. Someone pointed out offline that I’m being an idiot.

Hi Victoria – thanks for pointing out Q3c – fixed now.

And I agree – the paper was too broken down into small steps – as mentioned elsewhere, lots of 1 mark questions.

And the amount of proof was a bit underwhelming. That was the only part I did not predict well after Exam 1 when talking to my students about Exam 2.

The whole Unit 1 & 2 is so full now and teaching it all well is tricky. Time spent on Boolean algebra, Matrices (transformations in the plane), and Graph theory is pretty hard to justify…

Thanks very much, Simon.

Anyone else think that no one actually sits these papers under timed conditions before they are set aside and declared ready?

It should be completed by 3 independent people minimum who use Ti, Classpad and Mathematica to solve the problems.

It REALLY should not be that hard if they are slightly competent.

State the contrapositive.

From FB – the SAC audit process would be sensible to review (definitely Utopia-like for VCAA to be telling teachers off for their SACs).

“Sensible” like it was “sensible” to review RoboDebt. The SAC auditing is insanely nasty, and insanely insane. But the entire SAC system should be ended. It has zero value and massive cost, for both teachers and students.

Hi Marty – true – of course “sensible” tends to depend on whose interests it is “sensible” for. It took many years and overwhelming evidence to stop Robodebt, with the full extent only coming to light because of a change in Government.

The challenge now is to convince the Minister that the problems go way beyond “the vetting and proofing process for VCE examinations” – which is phrased incredibly tightly.

You’ve done an amazing job getting this far – but bureaucracies are incredibly tenacious. It will take a lot to get it broader than that – convincing the Minister that the rot goes much deeper will be hard.

Some hard examples of how crazy the system is (like the SAC auditing process) may help.

Thanks, JJ. I understand the point. And I understand that, if Burkard and I get to talk to the Minister’s advisors, we’ll have to think very carefully about all this.

With all the scaffolding, it’s depressing.

Plain boring for good students? Or for good teachers?

If the strongest maths students cannot (some surely can) perform multi-stage reasoning, and so must be provided with scaffolding, what’s the point of it all?

Then errors.

By the way, how many times across Exam 1 and Exam 2 should the vector product be used? And surface area?

Of course. The awfulness of the exams goes way, way, way, way, way beyond the omnipresence of errors. They are utter garbage.

In MCQ Q2, would it be fairer to say that some of the asymptotes are the ones mentioned?

For the surface area question from Exam 1, would it not have been easier to put it in context: say, an open-ended pipe in the shape of the given graph, rotated – how much sheeting is needed? That would settle the issue of inside/outside surface, or both.

I agree re MCQ2 – I started answering that question assuming that the denominator had a double root

Jesus. That is a truly bad question.

That’s really funny. Yes, it’s one instance where VCAA’s incessant real-worlding would have helped them, but for once they ceased.

But honest to God, these questions as straight mathematics have been bog standard since Newton died. It is just not that hard.

Yes! I am so used to VCAA’s multiple choice questions being worded so poorly that you need to assume the word “only” is in the question somewhere (so that answers which are correct, but don’t contain every intended correct “solution”, are marked wrong). So I was thrown for a good couple of minutes by this question, since it was now correct: the two asymptotes given were on the graph, in addition to a third.

It’s a very bad question for a couple of reasons, and I’ll WitCH it. Is it an error?

Not sure if it’s been posted anywhere here (only did a quick scan of all of the posts) but VCAA have officially acknowledged an error in Question 6h and said that all students will be awarded the mark. Email came through 10am today.

Dear Principal and VCE Coordinator

Following the identification of incorrect labelling in the diagram in section B, question 6h of the 2023 Specialist Mathematics Examination 2 paper, the Victorian Curriculum and Assessment Authority (VCAA) has taken the decision to award all students who attempted the exam a correct score for this question. This question was worth 1 mark out of a possible 80 for the exam. This decision has been made on the principle that this is the most effective and appropriate way to ensure no student will be disadvantaged and that the assessment process is fair, valid and reliable.

The VCAA reiterates our apology to students for this error and for the undue stress this has caused.

The VCAA has committed to a comprehensive review of the vetting and proofing process for VCE examinations with any recommended changes to be implemented for 2024 examinations.

Please use the text below to notify students who sat the exam of this outcome.

If you have any questions, please contact the VCAA Examinations Unit on (03) 7022 5550 or examinations.vcaa@education.vic.gov.au

Student Communication

“Students are advised that, following the identification of incorrect labelling in the diagram in section B, question 6h of the 2023 Specialist Mathematics Examination 2 paper, the Victorian Curriculum and Assessment Authority (VCAA) has taken the decision to award all students who attempted the exam a correct score for this question.

This question was worth 1 mark out of a possible 80 for the exam. This decision has been made on the principle that this is the most effective and appropriate way to ensure no student will be disadvantaged and that the assessment process is fair, valid and reliable.”

VCAA cannot ensure “no student will be disadvantaged”, and they obviously cannot.

VCAA can do what they can, which is award the mark to all, but that is neither fair nor valid. This isn’t hard.

I agree… What about students who spent 10 minutes trying to work out what the hell was going on? Then went to do multiple choice with 10 minutes less to complete it… started to get anxious due to the lack of time and choked on many of the questions because they were rushing… It’s too late…

*But they get an extra mark, which everyone else also gets, rendering it redundant.

A student who spent 10 minutes trying to figure out a 1-mark question used bad exam technique. I agree that just awarding the mark sounds fair but really is not, but it’s our responsibility as teachers to teach kids not to spend excess time on a single question under exam conditions.

Yes, but the student’s bad exam technique pales in comparison to VCAA’s.

I disagree with this comment. I had finished all the other questions of the paper, so why would I go back and check aspects of the paper that I believed to have been completed correctly, when I was presented with a question whose diagram was telling me I had done it completely incorrectly? Many students in my class went back and changed all their responses to Question 6 with swapped hypotheses, which is a completely valid response to seeing what we did not know at the time to be an error. In the rush to do so within the time allowed, many students likely made mistakes in redoing the question that they had not made in their initial attempt.

I don’t want to excuse VCAA for one microsecond, but I agree with Claire.

All students who attempted the exam, or all students who attempted the question?

I suppose neither option is “fair”.

As for “valid” and “reliable”…

No comment.

All students who attempted the exam. Has to be. Not that it matters.

Perhaps students who feel insulted by VCAA’s response should write to the minister.

I think the Minister is well aware, but it can never hurt.

It’s not so much about the Minister being aware – it’s about him being convinced this is a real issue with traction and being able to convince colleagues. Letters to the Minister will be invaluable. The more the better. Ministers act on political pressure – they track letters and if someone is sufficiently angry to write, it’s a good indication.

There might be a big shake-up in VCE mathematics examinations in the next 12 months.

Why?

Judging by the publicity surrounding this matter, I think that the minister might just say “Fix it.”

To whom?

VCAA perhaps. The ins-and-outs of decision-making at that level are beyond my pay grade.

VCAA are about as capable of fixing this as [choose your analogy].

Out of interest, where does one access the exam? Are they only available for teachers? (Sorry for posting this twice, I don’t think it posted the first time)

Hi, Jay. Sorry, I don’t know what happened to your first comment.

That’s one of the nasty, maddening things about VCAA: you cannot get the exams for ages except by people PDF-ing the thing around and so forth. Other states take a couple days. It is inexcusable.

I haven’t hunted, but if people want to share a reddit link or whatever, I won’t object.

Exam available on Reddit here.

One of the questions was missing from the original scan, but a link to the missing question is provided in the comments.

Thanks, Bugle.

For obvious reasons I’m having difficulty finding time to take a proper look at the exams. But today I had a quick peek at the MCQ of SM2. Do they suck as much as I think they suck?

The best I could say for them is that they are tedious. Many can be done by process of elimination with efficient use of the technology – no real thinking required.

I am often struck by the fact that the people who write the Methods and Specialist exams clearly have no understanding of the cohort of students who undertake each one (I cannot comment on General as I do not teach it). The Methods MCQ contained many multi-step processes that take most students a long time to work through, as they find the subject very challenging. In Spesh there was not a single multi-step question, or even one that required deeper thinking – in a cohort that contains many students who love a challenge, and such questions are needed to “rank” them.

Thanks, EM. I will try to look more closely soon. But looking quickly yesterday, I was appalled.

(I did a quick scan and no one has mentioned it, so either I’m being an idiot – again – or I’m being too pedantic)

MCQ11. It says from to so should the integral have as the lower terminal, which gives a negative result, which we then interpret as “take the positive because this is an area”?

There is no error.

When you draw a graph of you can see that the lower terminal of the required integral is 0 and the upper terminal is (since you’re integrating ‘upwards’ along the y-axis). You substitute and get option E as the intended and correct answer.
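The exam’s actual function isn’t reproduced in this thread, but as a hypothetical illustration of integrating “upwards” along the y-axis: for the curve y = x³ between (0, 0) and (1, 1), the area between the curve and the y-axis is

```latex
A = \int_{0}^{1} x \, dy
  = \int_{0}^{1} y^{1/3} \, dy
  = \left[ \tfrac{3}{4}\, y^{4/3} \right]_{0}^{1}
  = \tfrac{3}{4},
```

with lower terminal 0 and upper terminal 1, and no negative value to reinterpret.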

Sure. I get the graphical approach and the need to integrate “upwards”.

I’m just questioning the use of the word “from” here but looks like I’m over-thinking it.

Of course, none of the other options worked, although some were well chosen as distractors.

I think they simply wanted to define the part of the curve over the domain [0, 1].

Likely the case.

The more I read the question (4th time now…) the more I think it is OK, but possibly by accident rather than deliberation.

The question clearly asks for an area and the integral in answer E does give a positive result.

So… all fine.

Whether all of this was considered when the question was written… I’m willing to accept it as a possibility.

When you lose trust in something or someone, it’s very natural to be suspicious and start seeing things that may or may not be there.

(Which possibly becomes a problem for students).

MCQ5 – two small matters I am not totally OK with:

1. The use of the phrase “is equivalent to” I feel is quite wrong here; the word “equals” is a more appropriate choice, but I get the sense this is a lost battle.

2. Is this made OK by the use of the lowercase “arg” to signal a non-principal argument? It just seems to me that, because the quantity is a complex number, the argument should be Arg, not arg. The formula sheet, I would argue, supports me here… I don’t think anything is lost by using the principal argument.

Or am I just over-thinking again?

1. Yes, of course the appropriate word is “equals”. And yes, it is a losing battle, but I’m still going to hammer “equivalent” every time some prominent idiot uses it incorrectly (which is always).

2. Setting is legitimate, unusual, deliberately complicating and weird. It is inviting students to conclude , which is incorrect. But then what happens?

Nothing really happens; you can still get option E as the correct answer.

Is that the issue?

And “arg” is legitimate because of the lowercase? (As in, “Arg” would be wrong?)

Yes, the fact that the wrong early assumption gets you to the right final answer is weird. Why offer students the opportunity to get the wrong halfway answer if, in the end, it makes no difference?

And yes, the arg rather than Arg makes it (sort of) ok. Definitely Arg would be wrong. I guess, technically arg refers to the set of all possible arguments, and so should also include +2kπ. But I assume a double-think is allowed, and arg is permitted to refer to any one of these values.
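For reference, the usual convention (which is what is being assumed here):

```latex
\operatorname{Arg} z \in (-\pi, \pi]
  \quad \text{(the principal argument, single-valued)}
\qquad
\arg z = \{\, \operatorname{Arg} z + 2k\pi : k \in \mathbb{Z} \,\}
  \quad \text{(the set of all arguments)}
```

so writing arg z = θ for any one value θ in that set is the “double-think” referred to above.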

Is it an error though?

There is a correct answer and the correct working will get you there.

No. Which makes the deeper point. The issues with these exams go way beyond the presence of errors.

That’s very interesting because VCAA consistently says that “Correct answers obtained using incorrect mathematics receive no marks.” It’s certainly true for this question that the correct answer can be obtained using incorrect mathematics. But there’s no consequence here. For me, that’s the second error (the first being the incorrect use of “equivalent”). We have mathematical errors and typographical errors – what do you call an error where you contradict your own policy? (A hypocritical error?)

I noticed that the itute solutions use correct reasoning to get the answer.

I wouldn’t call either an error, although both are strong evidence that no mathematician came near this paper.

No, itute also gets it wrong, or at least offers no explanation for the key step.

Thank you all. I’ve updated the post with my many grumpy thoughts on the multiple choice questions.

I hope your “grumpy thoughts” about MCQ 10 include evil thoughts about the gratuitous brackets around the integrand. They are irrelevant clutter that makes things so much harder to read. So much for the VCAA’s advice once upon a time that the integral sign and the “dx” act as de facto brackets.

And yet these brackets are not included in Exam 1 Question 5 (I suppose the writers compensated for this with the awful slab of text in Question 6). Does anyone know if the VCAA has an exam writing ‘style guide’, or do the writers just follow their own personal whims?

This seems like complaining for the sake of complaining. Q5 on SM1 is fine, so don’t confuse it with a “yet”. Decide on your point and make it. Clearly and without obscure and florid touches.

I’m not complaining about Question 5 on Exam 1. I agree it’s fine. I’m simply noting that it doesn’t use brackets around the integrand (which is good) whereas MCQ 10 on Exam 2 does (which I think is not good). And I’m noting the lack of consistency in the use of this ‘notation’ across both exams.

I’ve updated the post with my thoughts on Part B. It is as bad as Part A, probably worse.

Re: “Part (b) should just be ‘the surface area’, not ‘the curved surface area’.”

What do you think of surface area? That seems to be the common phrase used in university textbooks.

A curve sweeps out a surface. A surface has a surface area. Done.
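For the record, the standard formula: if y = f(x) ≥ 0 is rotated about the x-axis for a ≤ x ≤ b, the curve sweeps out a surface with area

```latex
S = 2\pi \int_{a}^{b} f(x) \sqrt{1 + \big(f'(x)\big)^{2}} \, dx .
```

(No “solid”, and no need for “curved”.)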

I think there might be an issue with Q2(d). I haven’t seen it mentioned so maybe I’m missing something. The instruction in part (i) is to sketch a ray (no reference yet to the form Arg(z-z_0)=theta), so wouldn’t this include z=1? Then this wouldn’t have the form required in part (ii). It seems that students need to use the equation given in part (ii) to see whether to exclude z=1 in part (i).

Rae, you are correct. Very well spotted. There is nothing in part (d)(i) that excludes the point representing z = 1 from the ray.

In fact, I’d argue this makes part (ii) wrong because there are choices of answer to part (i) that make it impossible to find the equation of the ray in the form requested. Part (i) should have been worded something like:

“On the Argand diagram below, sketch the ray that originates at the real root of …”

The writers are assuming a ray of the form given in part (ii) will (or should?) be drawn, which is a load of old bollocks.

For what it’s worth, rays in the complex plane can include the terminus (a counterexample preempts any fallacious argument that rays in the complex plane never include the terminus).
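As an illustrative example (not necessarily the one BiB had in mind): the set

```latex
R = \{\, 1 + t e^{i\pi/4} : t \ge 0 \,\}
```

is a ray in the complex plane that includes its terminus z = 1, whereas the locus Arg(z − 1) = π/4 excludes it, since Arg 0 is undefined.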

Hi BiB. What’s the definition of “ray” according to VCAA? For me, and many others, it means a half-line. But that’s not particularly relevant.

I’m asking because if VCAA say that a ray must be an infinitely long straight line, then isn’t there only one ray connecting the two points?

BTW “… measured in radians in terms of ” is quite funny.

Hi, Glen. It’s an issue of whether the ray includes the starting point z = 1.

Ah, got it. Hopefully it won’t matter.

It’s difficult to imagine even VCAA being that stupid, but …

Thanks, Rae and thanks BiB. Flat out today, but I’ll look tomorrow.

It took two weeks after the exam, and only Rae noticed… It’s obvious now, but it was subtle.

We owe her. I mean, students owe her.

Should we chip in for a prize?

Now that’s a very good idea. Next year, I’ll institute some prize(s) system for the discussion posts. This year? I’m open to suggestion.

But it should also be noted that Rae is almost certainly not the first person to have noticed this glitch. It’s difficult to believe that the glitch wouldn’t have been noticed early on in the grading of the thousands of exam papers.

Thanks for the replies. I’m glad I was able to add something more to the discussion of the exam. As you say marty, others probably noticed too.

A prize system is a fun idea for next year. I used to have a lecturer who offered small prizes to students who spotted errors in their lecture notes. He ended up with a class of highly motivated proof readers (who learnt the material a lot better too!)

Thanks, Rae. Definitely seems clumsy, and amounts to a (minor) error. I guess in practice the issue is whether VCAA will demand the ray drawn in (i) agree with the equation given in (ii). I don’t think even VCAA is that crazy but, after 2022, one can hardly be sure.

If I was a student, I would definitely be obtaining a “Statement of Examination Marks” (free) for all my exams.

After looking at the statements, if I thought I’d been stiffed on an exam I’d pay the small fee to inspect my “Examination Response Materials”.

If I could demonstrate “that a clear error has occurred in the assessment of a question/s”, I’d apply for a “VCE Examination Score Review Application, which must be made by your Principal and supported by your subject teacher.”

See https://acedvce.com/my-vce-results-appear-to-be-wrong-can-my-exam-mark-be-rechecked-or-appealed/#:~:text=In%20exceptional%20cases%2C%20where%20it,supported%20by%20your%20subject%20teacher.

This is possibly the process followed by the student mentioned in Tony Guttmann’s unpublished letter to The Age.

There is nothing in this exam paper that is worse than this. That is, if including (1, 0) deprives you of a mark. But it is bad in any case.

Thanks, Banacek. Which comes back to the insidiousness of VCAA’s secrecy.

It’s definitely a minor error but we won’t know for months, if ever, whether VCAA misgraded the question, and thus whether the error is major.

Marty, I assume the “noise” you refer to is

“… where a is a non-zero real constant”,

which could be better stated as “where a ≠ 0” (or perhaps as a ∈ ℝ\{0}).

I would describe MCQ 4 as one of the all-time poor CAS questions:

Did anyone else find that if you type the expression into a CAS you immediately and trivially get the simplified form, and the correct option is obvious (though if you wanted to avoid doing any maths at all, a CAS will convert it into option B)?

MCQ4? The entire question is noise. The +1 and -1 immediately cancel out. The a on top and bottom immediately cancel out. Even without CAS, which of course poisons everything, the question is absurd.
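The exam expression isn’t reproduced here, but as a purely hypothetical illustration of that kind of noise:

```latex
\frac{a x + a - a}{a} = x \qquad (a \neq 0),
```

where the +a and −a cancel and the a on top and bottom cancel, leaving nothing for the question to test.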

I have some questions about MCQ10:

If n ∈ N, then how are any of the options defined when n = 0 (they all involve I_{n−1})?

Is this suggesting that the natural numbers include 0? If so, is it reasonable to take this as canon for all future maths exams?

How should the question be reworded to avoid its extraordinarily clumsy wording?

Thanks, Anothermouse. This is the perfect example of a question that, although not in “error”, makes mathematicians, and everyone, want to scream. The wording is simply appalling.

First of all, yes, it is implicit in the question that the natural numbers N include 0. Then, the question asks about I_n for n ≥ 1. The question asked is then mathematically fine, since I_0 makes sense.

Secondly, is this question then a precedent for N including 0 for VCE in the future? Absolutely not.

As I wrote here, it is practically impossible to rely upon a fixed meaning for N. In particular, note that in VCAA’s sample induction materials it is implicit that N begins with 1. The only way that N can have a fixed, accepted meaning in VCE is for VCAA to declare a meaning. (And, until they do, using the notation N in an exam question is thoroughly idiotic.)

Thirdly, how would I reword this question? It’s like that old joke:

“Excuse me, how do I get to Geelong?”

“Well, I wouldn’t start here.”

It’d be very helpful to have a functioning convention for N. But assuming not, I’d probably do something like:

“For n a nonnegative integer, let I_n = … . If n ≥ 1, then I_n equals …”

MCQ 20:

A 99% confidence interval is given but the population parameter the interval applies to is never stated. We are obviously meant to very reasonably assume that the parameter is the population mean, but I dislike having to make this assumption. Where do you draw the line on the slippery slope? And it should have said that the CI was approximate.

In my opinion the question should have said:

“Given that an approximate 99% confidence interval for μ, based on …”

(Churlish, but I’m beyond giving them an even break. And 2023 is such a vintage year).
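BiB’s suggested wording aside, the computation itself is routine; a minimal sketch, with hypothetical sample statistics (the exam’s actual numbers are not reproduced here):

```python
import math

def approx_ci_99(xbar, s, n):
    """Approximate 99% confidence interval for the population mean mu,
    using the normal critical value z = 2.576."""
    half_width = 2.576 * s / math.sqrt(n)
    return (xbar - half_width, xbar + half_width)

# Hypothetical sample: mean 50, standard deviation 10, n = 100.
lo, hi = approx_ci_99(50.0, 10.0, 100)
print(lo, hi)  # approximately (47.424, 52.576)
```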

Thanks, BiB. Obviously very poor wording, but not in practice unclear, right? In SM, confidence intervals are not calculated for any other parameter, are they?

Yes.

No.

There’s a fine line between poor wording and error, and churlish or not, I have to admit the question doesn’t cross that line.

Re: Question 5 and “Lastly, and it’s no big deal, but the choice of ψ to refer to a plane is eccentric.”

It’s no small deal either. The choice of ψ is ridiculous. The relevant Exam 2 sample questions (Q3 and Q5) both use Π₁ and Π₂. The preamble to part (a) uses Π₁, and there’s no good reason why it couldn’t have continued with Π₁ and called the second plane Π₂, thereby using a logical notation consistent with that used in the sample questions.

It’s not an error and it’s not “eccentric” (or even *ahem* plain eccentric). I think it’s indicative of either not giving a damn, or deliberately being a smart-arse. It’s low-level BS in my book. Worth more than a big … sigh.

I’m not going to hammer it too hard. I hate gratuitous subscripts much more than obscure Greek letters.

Given a baseline of eccentricity, and given you hate gratuitous subscripts, I’m curious what notation you would have used. There’s no reference to the first plane after part (b), so perhaps the question could have simply been re-worded in such a way that neither plane needed a name. (Or maybe no name given to the first plane, but the second plane gets named Π.)

Christ. You force me to look at this question again, and of course I find that the question is even more absurd than anyone had suggested. Beyond the first plane not needing a name, the question does not need the first plane.

These people are nuts.

Holy cow, you’re right! It’s diabolical. My eyes glided right over it and I had a false memory of it asking to find an equation of the plane.

Section B Question 4 part (f):

The question says:

“Label any axis intercepts and any asymptotes with their equations.”

I have at least two issues with this:

1) It suggests that axis intercepts are to be labelled with an equation, which is plainly wrong.

2) It is inconsistent with the phrasing on all past exams which generically says to “label points with their coordinates and asymptotes with their equations.” (And it is 2 that causes 1).

Gratuitously clumsy wording, no question.

Re: Section B Question 5.

“Given [the distance equations] are examinable, why are the relevant formulas not included on the formula sheet? This really seems remarkably incompetent.”

In fairness, students can include as many distance equation formulae as they want in their bound reference so this doesn’t bother me too much. I think the real questions are:

1) Why does Exam 2 come with a Formula Sheet?

2) Why is the distance of a point/line/plane from a point/line/plane not explicitly included in the Study design (at least as a dot point under Outcome Key Knowledge or Skills)?

3) Why did the implementation advice not include examples that covered all of 2)?

Thanks, BiB. In response to (3), I think the sample materials reasonably covered the distance examples, and were, in themselves, a reasonable flag that any such distances were examinable. But, whatever one thinks of formula sheets (bad) and “bound references” (insane), it is simply absurd not to include the distance formulas on the formula sheet. If you’re not too bothered by such absurdity, then fine, but I am.

And yes, the study design is, as ever, disastrously uninformative.
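For the record, the formulas in question are standard; e.g. the distance from a point P(x₀, y₀, z₀) to the plane ax + by + cz = d, and from P to a line through a point A with unit direction vector d̂:

```latex
D_{\text{plane}} = \frac{|a x_{0} + b y_{0} + c z_{0} - d|}{\sqrt{a^{2} + b^{2} + c^{2}}},
\qquad
D_{\text{line}} = \big| \overrightarrow{AP} \times \hat{\mathbf{d}} \big| .
```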

If these sorts of questions appeared in Exam 1, not having the distance formulas on the formula sheet would bother me (a lot). Having said this, I think another question is raised:

What should be on (and what should be deleted from) the current formula sheet?

I would suggest the following stuff is useless (*) on it:

All the so-called ‘Algebra, number and structure (complex numbers)’

The first two rows of ‘Vectors in two and three dimensions’ and the ‘vector scalar product’.

The first two blocks of ‘Data analysis, probability and statistics’.

Getting rid of the above stuff creates plenty of room for distance formulas. As well as:

The vector resolute of a in the direction of b, and critical values of z

(whose absence from the formula sheet I find absurd).

* Useless in the sense that this is NOT stuff that I would expect even a weak student to need to look up. (If they do, I don’t think there’s much hope for them).

I hate formula sheets, and I don’t care about the details. But obviously a formula sheet, if it exists, should contain the significant formulas that a student might be expected to use, to a roughly consistent level of “significant”. For the formula sheet to not contain the distance formulas is plain nuts.

You’re interested in the practicalities. I’m interested when things are plain nuts.