It Doesn’t GEL: The General Error List

This is the home for Further Mathematics (24/11/23 – now called General Maths) exam errors. The guidelines are given on the Methods error post, and there is also a Specialist error post.


2023 Exam 2 (Here, discussed here)

Q7(c). The question cannot be properly answered as written. 

Q7(d). There are two solutions, one of which will reportedly not be accepted as correct. 

Q9(d). An extra 1 appeared in the matrix. (31/01/23. The published exam has corrected the error but with no acknowledgment of the error. This is sleazy.)

Q11. A poorly written question, with (at least) two correct answers.

Q14(d). There is an extra “of” in the preamble. (24/11/23) As has been pointed out in a comment, since this correction was announced at the beginning of the exam, it’s not kosher to list it here as an error.

2023 Exam 1 (Here, discussed here)

MCQ26 The expression “Q multiplied by P” is absolutely fatal in the context of matrices and, if there is a clear meaning to be had, the meaning is Q x P. The question, however, requires P x Q.
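To see the order issue concretely, here is a small sketch in Python (the matrices are invented for illustration, and are not taken from the exam): matrix multiplication is not commutative, so “Q multiplied by P” and “P multiplied by Q” are different animals.

```python
def matmul(A, B):
    """Multiply two matrices, each given as a list of rows."""
    return [[sum(A[i][k] * B[k][j] for k in range(len(B)))
             for j in range(len(B[0]))] for i in range(len(A))]

# Hypothetical matrices, chosen only to make the point.
P = [[1, 2],
     [3, 4]]
Q = [[0, 1],
     [1, 0]]

QP = matmul(Q, P)  # the natural reading of "Q multiplied by P"
PQ = matmul(P, Q)  # what the question apparently requires

print(QP)  # [[3, 4], [1, 2]]
print(PQ)  # [[2, 1], [4, 3]]
```

The two products differ (here, one swaps the rows of P and the other swaps its columns), which is exactly why the wording matters.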

2023 NHT Exam 2 (Exam here, report here (Word, idiots))

Q(3) (added 15/08/23) – The stem plot is incorrectly drawn, which can muck up the calculation of the five-number summary, asked for in part (b).

2022 Exam 2 (Exam here, report here (Word, idiots) – discussed here)

QA(3)(a)(i) (added 27/12/22) – This is not an error yet, but it will be. The issue is whether “Day Number” (for eight consecutive days) is a “numerical variable”: based on their past idiocy, it is clear that VCAA will claim, falsely, that it is not. (15/08/23 – Yep, VCAA screwed up. The report notes, “[the answer] 6 was a common error by students who had chosen ‘day number’ as a numerical variable”.)

2022 Exam 1 (Here, report here (Word, idiots) – discussed here)

QA(1) (added 26/12/22) – A badly borderline question on the shape of a distribution. A really great way to start an exam.

QA(21) (added 26/12/22) – A badly ambiguous question on nominal interest rates. It is unclear whether negative rates and/or (more plausibly) periods greater than a year were to be considered.

2022 NHT Exam 2 (Exam here, report here)

QA(2)(a) (added 27/12/22) – The report indicates that “year” is regarded as a “categorical variable”. This is absurd.

2021 Exam 2 (Exam here, report here (Word, idiots) – discussed here)

QA(1)(f) (added 24/11/21 – discussed here) The question asks for a minimum value to be an outlier, which, by definition of an outlier, cannot exist. (26/12/22  – The answer in the exam report is simply false.)

QB(1)(2)(c) (added 24/11/21 – discussed here) The indicated matrix M2 is not the square of the matrix M provided earlier in the question.

2021 NHT, Exam 2 (Here, and report here)

QA(18) (added 02/11/22) A bad question, hingeing on whether “Event”, listed as “1” or “2”, is a “nominal variable”. The report indicates that it is, which is probably best considered wrong. (Compare QA(2) on 2016 Exam 1, and the report.)

2021 NHT, Exam 1 (Here, and report here)

QA(18) (added 02/11/22) Madness. A multiple choice question on compound interest, for which none of the available answers is correct (or even close). The examination report indicates no answer, simply noting

As a result of psychometric analysis, the question was invalidated.

Some psychometric analysis is probably in order, but VCAA appears to be pointing their psych gun at the wrong target.
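For what it’s worth, here is a sketch of the arithmetic presumably at issue (the rate is hypothetical, since the exam’s figures aren’t reproduced here): if interest compounds quarterly but the recurrence steps yearly, the growth factor R is the fourth power of the quarterly factor, not the quarterly factor itself.

```python
def yearly_factor(r_percent, periods_per_year=4):
    """Growth factor over one whole year for a nominal annual rate of
    r_percent, compounded periods_per_year times per year."""
    return (1 + r_percent / (100 * periods_per_year)) ** periods_per_year

r = 8  # hypothetical nominal annual rate of 8%
print(yearly_factor(r))  # (1.02)**4 ≈ 1.0824, the correct yearly R
print(1 + r / 400)       # 1.02, the per-quarter factor
```

Conflating the two factors is an easy way for an exam writer to end up with no correct option on offer.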

2019, Exam 2 (Here, and report here)

QA(1)(a) (added 27/12/22) The question asks which of the two variables in a table is “ordinal”. The report indicates that “day number” (for fifteen consecutive days) is ordinal. Given the other choice was “temperature”, there wasn’t much alternative, but the better answer, and the only properly correct answer, is “neither”. The exam report notes “A small number of students answered ‘neither’”, without indicating whether this answer was deemed correct.

2019, Exam 1 (Here, and report here)

QB(6) (added 21/09/20) The solution requires that a Markov process is involved, although this is not stated, either in the question or in the report.

2018 NHT, Exam 1 (Here, and report here)

MCQ4 (added 23/09/20) The question provides a histogram for a continuous distribution (bird beak sizes), and asks for the “closest” of five listed values to the interquartile range. As the examination report almost acknowledges (presumably in time for the grading), this cannot be determined from the histogram; three of the listed values may be closest, depending upon the precise distribution. The report suggests one of these values as the “best” estimate, but does not rely upon this suggestion. See the comments below.

2017 Exam 2 (Here, and report here)

Q1(c)(ii) (added 13/11/20) – discussed here. The question is fundamentally nonsense, since there are infinitely many 1 x 3 matrices L that will solve the equation. As well, the 3 x 1 matrix given in the question does not represent the total value of the three products as indicated in Q(c)(i). The examination report does not acknowledge either error, but does add irony to the error by whining about students incorrectly answering with a 3 x 1 matrix. (30/10/22) The examination report has finally been amended to acknowledge the obvious error, albeit in a snarky “no harm, no foul” manner. The fundamental nonsense of the question remains unacknowledged. As commenter DB has noted, the examination report also makes the hilarious claim,

The overwhelming majority [of students] answered the question in the manner intended without a problem.

We’re not sure the “minority” of 72% of students who scored 0/1 on the question would agree.

2017 Exam 1 (Here, and report here)

MCQ11  (added 13/11/20) – discussed here. None of the available answers is correct, since seasonal indices can be negative. The examination report does not acknowledge the error.
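A quick sketch of why a seasonal index can be negative (the figures below are hypothetical): a seasonal index is a seasonal value divided by the overall average, and data such as profits can themselves be negative.

```python
# Hypothetical quarterly profits, one of them negative.
data = {"Q1": 10, "Q2": 30, "Q3": -5, "Q4": 25}

overall_mean = sum(data.values()) / len(data)  # 15.0
indices = {q: v / overall_mean for q, v in data.items()}

print(indices["Q3"])          # -0.333..., a negative seasonal index
print(sum(indices.values()))  # the indices still sum (up to rounding) to 4,
                              # the number of seasons
```

Nothing in the usual definition rules the negative value out, which is the point of the error listed above.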

MCQ6 Module 2 (added 05/09/22) – discussed here. The intention of the question is reasonably clear, but the expression “how many different ways” is, at minimum, clumsily ambiguous, and one can argue for either C or D or E being correct. The intended answer was E, but many students also answered C or D. The examination report suggests that the incorrect answers were due to “simple counting errors”, which is possible but far from definite. The report also writes “Most students answered option B, C or D”, which is contradicted by the statistics; presumably the statistics are correct and the sentence is wrong, but it is unclear.

2015 Exam 1 (Here, and report here)

MCQ9 Module 2 (added 30/09/20) The question refers to cutting a wedge of cheese to make a “similar” wedge of cheese, but the new wedge is not (mathematically) similar. The exam report states that the word “similar” was intended “in its everyday sense” but noted the confusion, albeit in a weasely, “who woulda thought?” manner. A second answer was marked correct, although only after a fight over the issue. 14/12/23. The similarity argument in the report refers to d – 8, rather than 8 – d.

2011 Exam 1 (Here, and report here)

MCQ3  (added 23/11/22). The question asks for the “closest” value for the median, but this cannot be determined from the provided histogram; two of the provided answers (B and C) may be correct. The examination report is silent on the issue.
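To illustrate the problem (with hypothetical bins, since the exam’s histogram isn’t reproduced here): a histogram only locates the median within a bin, not within that bin, so the “closest” listed value can be genuinely undetermined.

```python
def median_bounds(bins):
    """Given a histogram as ordered ((lo, hi), count) pairs with an odd
    total count, return the bin interval that must contain the median
    (the value of rank (n + 1) / 2)."""
    n = sum(count for _, count in bins)
    rank = (n + 1) / 2
    cumulative = 0
    for (lo, hi), count in bins:
        cumulative += count
        if cumulative >= rank:
            return lo, hi
    raise ValueError("empty histogram")

# Hypothetical bin counts: 13 values, so the median is the 7th.
bins = [((10, 15), 4), ((15, 20), 5), ((20, 25), 4)]
print(median_bounds(bins))  # (15, 20)
```

All the histogram tells us is that the median lies in [15, 20): it could be 15.1 (closest to 15) or 19.9 (closest to 20), so two of the offered answers may be correct.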

18 Replies to “It Doesn’t GEL: The General Error List”

  1. 2018 NHT Further Maths Exam 1
    Module – Data Analysis Q4.

    The assessors’ report states that options A, B and C are all accepted because:
    “ The best estimate of the IQR from this histogram is 1.5, therefore option B.
    Under particular conditions it would be possible for the IQR to be as small as 1.0 or as large as 2.0, therefore options A and C were also accepted.”

    What are the “particular conditions”?
    That is an interesting phrase to be sighted in the official exam report.

      1. In fact, (one of) the authors of this NHT paper revealed his identity somewhere.
        (I am not sure whether it is a good idea to do so – shouldn’t any (previous or future) member of an exam-setting panel stay confidential?)

        1. Interesting. I think it’s arguable that examiners should be permitted to remain anonymous, and perhaps wise. I’m not sure I see that it should be mandatory.

  2. 2015 Paper 1 Geometry module Q9.

    The use of the word “similar” in the question is wrong, misleading and insanely inappropriate.

    1. This one has a history, some of which I can repeat, and some I cannot. Here is the history I can repeat.

      Two weeks after the exam was held, Burkard and I and the MAV were contacted by an upset teacher, all of whose students had misinterpreted the question. The teacher had been fobbed off by the Examinations Unit, which took two weeks to come back with an idiotic, “nothing wrong” reply. The teacher was then hoping the MAV and/or the Caped Crusaders (whatever) might go in to fight.

      The MAV did … wait for it … nothing.

      I did not do nothing. I emailed the Examinations Unit, and also got a bullshit reply, although they indicated the Examinations Unit was, two weeks after the exam, “continuing to monitor student responses to this question”. (No such indication had been given to the teacher.) I slammed the reply for the crap that it was, and requested the contact information for the Boss. After a couple emails back and forth, the Boss indicated the resolution, as indicated in the post above.

      Why did the grading get changed? It’s unclear, and I don’t want to take undue credit. But, it is clear that two things did *not* get the grading changed: (1) The Examination Unit’s knee-jerk willingness to fob off teachers; (2) The MAV’s knee-jerk willingness to do nothing.

      1. Yes, the MAV … Once again, providing “a voice, leadership and professional support for mathematics education” (its words, not mine). Bravo.

  3. On the 2021 NHT Exam 1 Question 18, the scenario involved an investment that was compounding quarterly and the value of the investment after n years was modelled using a recurrence relation. Students had to find the value of R in the recurrence relation, but none of the options were correct.

    On the Examiners Report, the problem was addressed thus:

    Question 18
    As a result of psychometric analysis, the question was invalidated.

    1. “As a result of psychometric analysis, the question was invalidated.”

      VCAA code for “VCAA made a mistake but will NEVER admit this.”

      I’ve seen the “psychometric analysis” phrase used a few times – the first time I saw it (and yes, it was in the context of VCAA responding to an error) I honestly thought it was made up gobbledygook. I now know better, it’s real gobbledygook.

  4. 2011, Exam 1, Question 3. I believe this question is erroneous, because – for all that we know from the histogram – the median could be 17.5%, and as such it’s not possible to state which value the median *is* closest to (of 15% or 20%).

    I suppose it could be rewritten as “the median *could be* closest to”, but then options B and C should be accepted as correct, and the question is still stuffed for that reason.

    (A similar issue exists in 2018 NHT Exam 1, Question 4. In this case at least all three possible IQRs (A, B, C) were accepted).

    1. Thanks, SRK. I finally got a chance to look at these. The 2018 error was already listed. You’re obviously correct about the 2011 error, and I’ve added it.

      I never cease to wonder about such decade-old errors. It can’t be that no one has noticed them previously. So, is it simply that in times past no one bothered to tell VCAA when they stuffed up? Or just that no one demanded they fess up?

      1. A copy of the exam is made available to teachers 30 minutes after reading time. In times past, you could find an error while the exam was still in progress and report it to VCAA. VCAA would send a correction out to all schools, which then got passed on to the invigilator. The invigilator would stop the exam and announce the correction. So I think some errors were reported back in the day and ‘corrected’ while the exam was live (*). But the culture has always been to deny errors in the Examination Reports (**).

        VCAA would be aware of many of these old errors but steadfastly refuses to amend the Examination Reports unless absolutely forced to. (Even then, the amendment never uses the word error).

        I would posit that since 2011, VCAA has rubbed teachers up the wrong way to such an extent that finding errors on exams became a blood sport for some (and their students (***)). And old errors have a habit of rising up from the slime when you get a bright and hard working student who works through those old exams (**) … (Show me the teacher that willingly goes back to ‘old’ exams, works through them, and re-discovers these errors)

        These days, VCAA prefers to think that errors on old exams are irrelevant (because the exams are, like, you know, old) and not even worth a mealy-mouthed amendment to the Reports. Having said this, it’s interesting to note the number of amendments made to multiple choice questions – a second correct option suddenly appearing without comment (see multiple choice Q12 here: ), ‘correct’ options deleted without comment ( ) etc.

        Errors on VCAA exams are the big VCAA cover-up.

        * I doubt this happens anymore, for several reasons.

        ** And because the Reports are deceitfully silent on errors, you would think that the error went unnoticed.

        *** Getting students to find the VCAA error in a question can be a fun and useful way of teaching (the danger is overdoing it because there are sooooo many errors). VCAA’s right-triangle with a hypotenuse smaller than one of the other sides ( ) still entertains. As does the strictly decreasing with a union of intervals ( )
