This is our post for teachers and students to discuss Methods Exam 2 (not online). There are also posts for Methods Exam 1, Specialist Exam 1 and Specialist Exam 2.

**UPDATE (21/11/20)** A link to a parent complaining about the Methods Exam 2 on 774 is here.

**UPDATE (24/11/20 – Corrected)** A link to VCAA apparently pleading guilty to a CAS screw-up **(from 2010)** is here. (Sorry, my goof to not check the link, and thanks to Worm and John Friend.)

It’s far too quiet here so I’ll kick things off with some general comments on the questions in Section A:

Q1 – Q2: Simple Methods Unit 1 questions.

Q3: Press some buttons on a CAS. Very simple for Specialist students who would just use DSolve (in Mathematica) or similar on a CAS calculator.

Q4: Button pushing.

Q5: Methods Unit 1.

Q6: Two turning points at x < 0 so only one viable option. Very obvious.

Q7: Something that requires intelligence at last.

Q8: Simple binomial with button pushing.

Q9: Advantages Specialist students. Nevertheless simple if you choose f(x) = 5/4 without loss of generality.

Q10: Trivial if you test each option.

Q11: Trivial if you simplify to Pr(X < 259) = Pr(Z < 1.5) and solve (259 – 250)/sd = 1.5, giving sd = 6. Can also easily be done without intelligence using Mathematica and one line of generic code.
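For what it's worth, the Q11 arithmetic can be sanity-checked in a few lines of Python (assuming, as the matched z-score suggests, a mean of 250 and Pr(X < 259) = Pr(Z < 1.5)):

```python
from statistics import NormalDist

# Presumed reading of Q11: X ~ N(250, sd) with Pr(X < 259) = Pr(Z < 1.5).
# Matching z-scores: (259 - 250)/sd = 1.5, so:
sd = (259 - 250) / 1.5
print(sd)  # 6.0

# Cross-check: with sd = 6, Pr(X < 259) should equal Pr(Z < 1.5).
p_x = NormalDist(250, sd).cdf(259)
p_z = NormalDist(0, 1).cdf(1.5)
print(round(p_x, 4), round(p_z, 4))  # both 0.9332
```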

Q12: Wordy but simple Methods Unit 2.

Q13: Simple to narrow it down to E using only the translation.

Q14: Trivial using Mathematica.

Q15: Trivial to get the equation of each line, apply the definition and press some buttons.

Q16: Simple Methods Unit 1 max-min problem.

Q17: Tricky but obvious if you draw a graph of y = f(x). Nice.

Q18: Simple Methods Unit 1.

Q19: A bit tricky. The graph is irrelevant.

Q20: Tricky but obvious if you see f(x) = 1 is required. Nice.

So the way I see it, Section A can easily be finished in 30 minutes and there should only be three questions that cause a reasonable student any difficulty (Q17, Q19, Q20).

Verdict: Benign.

I haven't checked carefully for errors.

Since there doesn’t seem to be any, err, love for this exam, I suppose I’ll start by saying that the greater presence of explicit CAS questions (i.e. questions that asked for a numerical answer) was a bit unusual. I believe it could be to compensate for the probability section, which is primarily CAS-active, but it leads to the magic question of “did using a particular CAS, i.e. Mathematica over a TI-nspire, help students?”. Maybe it’s my inexperience with the TI-nspire, since I tried doing this year’s exam with both calculators, but for various parts of SA (Q1e-f, Q2c, Q4, Q5a-e) it felt like a lot of CAS was in play. I did have a listen to the radio and it really just seems like a mishmash of “the format was unexpected”, among other not-so-specific comments. Maybe anyone who has more expertise with these calculators can chime in.

Hmpph. This whole friggin *year* was unexpected but we all just had to suck it up.

This mishmash of “the format was unexpected”. Cry me a friggin river. Didn’t they see the Cover pages and Amended Formula Sheet? If they didn’t, then all the mewling needs to be directed elsewhere, NOT at the exam and its so-called “unexpected format”. And maybe some tough questions need to be aimed in the same direction …

One thing I will say as someone who teaches at a TI school but uses Wolfram products for my computational needs (including playing around with ideas for SACs…) – a computer keyboard and interface (be it Mathematica or the TI emulator) is ALWAYS easier to use than the hand-held.

The new TI calculators (the blue ones that have a new file format, so are pretty much incompatible with the older black model, including rendering the old docking stations useless except as charging stations…) have more memory and are a bit quicker to process things, but all the shortcomings relative to a larger screen device (computer) remain.

Indeed. Far too quiet. No-one even nibbled the (unintended) bait at Q13. Simple to narrow it down to the WRONG answer of E using only the translation.

Simpler to narrow it down to the correct answer (of A) by starting with x = 2x’ + 4. No knowledge of transformations required. Trivial with Mathematica.

A pointless question foisted on students because of the pointless inclusion of transformation matrices.

(Stupid in fact because:

1) matrices are not on the course and yet they are required.

2) it’s another disconnected piece of mathematics welded onto the course).

Re Q7:

It requires intelligence, should one try, or else just the CAS (Mathematica and the TI-nspire can evaluate it as is).

On the other questions, Q19 is a good question, although the graph is pointless, as you said. Q17 in my opinion is just another optimisation problem, in the same way as Q16, and one could use the fMax function (TI-nspire) or the Maximize function (Mathematica) to hammer the question without further thought. Of course, getting there could be a step of its own…

I also had a look at a walk-through of the extended response solutions, courtesy of a friendly neighborhood Worm, and saw that the Casio ClassPad does not capture all the solutions for an inequality (Q5dii). Now, one may argue that you could (and should) use the previous results to determine that the ClassPad was not functioning properly (in my opinion), whereas both the TI-nspire and Mathematica will return the complete inequality…. While any student could sketch a graph of this, they would have to go to the graph menu and type in the graph along with the two lines… all for one mark. From what I can see, it has no implications for the following questions either.
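Since the actual Q5dii inequality isn't reproduced here, a generic way to catch a CAS dropping a solution interval is to sample the inequality numerically, which is essentially the "sketch a graph" check without the graph menu. A hypothetical Python sketch (the inequality below is made up purely for illustration):

```python
import math

def holds(x):
    # Made-up inequality with two solution intervals on [0, 2*pi]:
    # sin(x)^2 > 1/2 holds on (pi/4, 3*pi/4) and (5*pi/4, 7*pi/4).
    # A CAS that silently dropped one interval of a messier inequality
    # would be caught by the same scan.
    return math.sin(x) ** 2 > 0.5

def solution_intervals(lo, hi, steps=100000):
    # Scan a fine grid and record where the inequality switches on/off.
    xs = [lo + (hi - lo) * i / steps for i in range(steps + 1)]
    intervals, start = [], None
    for x in xs:
        if holds(x) and start is None:
            start = x
        elif not holds(x) and start is not None:
            intervals.append((start, x))
            start = None
    if start is not None:
        intervals.append((start, hi))
    return intervals

print(solution_intervals(0, 2 * math.pi))
# → two intervals, approx (pi/4, 3*pi/4) and (5*pi/4, 7*pi/4)
```

If the solver's answer and the scan disagree on the number of intervals, the solver is the one to distrust.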

The topic of matrices in the context of transformations annoys me a lot, since those questions feel disconnected from the other topics. Linear algebra is such an interesting subject, and the watering down of it into transformations of functions is a real injustice to the field. I would also say Further does similar things with topics such as least squares (a good way of motivating inner products), but of course all of it is button mashing.

Sorry… LOL, I missed your comment preceding mine!

That Worm guy seems like a jerk! 😉

Q19: I think the graph was possibly meant to spark symmetry in the heads of students?

ER Q5dii… solving the inequality…

The TI-nspire gives the correct intervals… the Casio ClassPad is missing one of the intervals…

How can they have this on the exam? Surely it should be tested beforehand… 98% of students using a Casio will not pick up on this, given the previous question… Very unfair… and how will it be marked?

There are ongoing issues with the non-level playing field of technology.

VCAA gives assurances that all exam questions are tested with all three CAS calculators (the TI-Insipid, the Crapio, and the Spewlett Hackard) for ‘equity’. This is obvious bullshit (https://atarnotes.com/forum/index.php?topic=34681.0) and your comment is the latest evidence. Furthermore, Mathematica leaves the calculators for dead (https://mathematicalcrap.com/2020/07/16/guest-post-mathematica-and-the-potential-gaming-of-vce/), but VCAA does not care (for obvious reasons – it wants Mathematica Methods, not Mathematical Methods).

Imagine getting those teachers who already can’t teach Methods or Spesh to then learn Mathematica! They shy away from the basics of a CAS calculator… and would never consider writing a small program….

Some schools have a mini orgasm when they find out that you can use both the TI-nspire… and Casio and…. Mathematica…

I figure that if they force it on you, you may as well become competent in it!

I would love a harder two-hour exam on both, where they have more time to work on the paper…

I don’t know if you guys know of a student called Alex Gunning? He got a perfect score at the International Mathematical Olympiad, yet he could never finish a paper in Specialist or Methods… and got about ~40 in both subjects….

He’s currently studying mathematics (Masters) at Trinity College, Cambridge. Probably because of his Olympiad success, no thanks to the VCAA exams.

More proof of how stupid the VCAA exams really are (although getting 40s in Methods and Specialist indicates something, I suppose). Answering a bunch of questions within *unrealistic* time constraints is just plain dumb. But *apparently* the exams are subjected to a blind review, which, among other things, is meant to check that the exam is of an appropriate length for a reasonable student. Either more bullshit from VCAA or the blind reviewer gets ignored. It’s my experience (given the number of errors in the exams each year) that the VCAA vetters have been asleep at the wheel for many years.

What I’d like is for the VCAA exam writers, vetters and DuLL to all sit one of *my* exams under normal examination conditions.

The Methods exams were not overly difficult, the format was not the big surprise many are alleging, and there were no questions outside the scope of the Adjusted 2020 Study Design. But Exam 1 was undoubtedly too long (by my estimate, about 6 marks = 10 minutes more time was needed) – I blame the VCAA vetters.

I haven’t worked through Exam 2 Section B but I think Section A could easily be done within 30 minutes by a reasonable student (therefore not too long).

I completely agree…

There was a new head of the panel in Methods this year… I would suggest that they probably thought they knew best…

So you mean a ‘knew head’ …

A tough year to be new (whether it be teacher, HoD or exam setting panel), although I’m sure work had started on the 2020 exam before the pandemic struck. If exam 1 had been shorter (by deleting a couple of the 1 mark questions and distributing those marks to other questions), I’d say that the ‘knew head’ had done pretty well.

https://www.heraldsun.com.au/news/victoria/answers-just-didnt-add-up-on-year-12-maths-methods-exam/news-story/fcdf73d3cb1f17e82978573a6208b116?sv=644530f908f30c274e59b2ec6adaf9af

Thanks, John. I haven’t yet worked through the comments on this post, and the article is weirdly vague, but I gather this is the issue Worm raised? I’ll post the link above as well.

The newspaper report is related (I remember this – there was a big uproar). The report was evidence that VCAA isn’t doing the technology checks it claims to be doing – the Crapio seems to fall short relative to the TI-Insipid on some VCAA exam questions. Worm has raised ER Q5dii from the 2020 exam – it adds to this evidence.

And we certainly know that VCAA have no clue (or don’t care) about the huge advantage Mathematica-using students have relative to CAS-calculator using students.

Ah, I see. Stupid me, and I’ve corrected the update. I’ll check out the 2020 issue, and the whole exam, in the next day or so.

Question 4eiii

For the TI-nspire CX CAS, when equating the areas and solving for n without specifying a domain, the calculator returns 1.087 (to 3 decimal places), but when the domain 1 < n < 3 is specified, it returns 1.088. Will both answers be accepted?
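Without the actual area equation it's hard to say which answer is intended, but one common cause of this behaviour is that the same equation has roots both inside and outside the stated domain, and a numeric solver started without a domain restriction can converge to a different root. A made-up Python illustration (the function g is hypothetical, not the exam's):

```python
# Hypothetical equation g(n) = 0 with roots inside and outside 1 < n < 3,
# so an unrestricted numeric solve can land on the "wrong" root.
def g(n):
    return (n + 1) * (n - 2)  # made-up: roots at n = -1 and n = 2

def newton(f, n, h=1e-6, iters=50):
    # Crude Newton's method with a numeric derivative: mimics an
    # unrestricted numeric solve from a default starting guess.
    for _ in range(iters):
        n = n - f(n) * h / (f(n + h) - f(n))
    return n

def bisect(f, lo, hi, iters=60):
    # Bisection confined to a bracketing interval: mimics solving
    # with the domain restriction supplied.
    for _ in range(iters):
        mid = (lo + hi) / 2
        if f(lo) * f(mid) <= 0:
            hi = mid
        else:
            lo = mid
    return (lo + hi) / 2

print(round(newton(g, -3), 3))    # from a guess left of zero: -1.0
print(round(bisect(g, 1, 3), 3))  # restricted to (1, 3): 2.0
```

A 0.001 discrepancy like 1.087 vs 1.088 could equally be solver tolerance rather than distinct roots; either way, supplying the domain is the safer habit.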

Who knows. VCAA probably tossed a coin last Sunday (assessor training day). It seems to be how it does most things.