Same old thing. Exam 1 discussion is here.
Continue reading “Secret 2022 Further Business: Exam 2 Discussion”
Further Mathematics is not our thing, but it seemed possible that some frequent visitors might want to discuss today’s exam. (We haven’t seen the exam, and don’t plan to seek it out unless something interesting is flagged.)
Have fun.
UPDATE (31/10/22): Exam 2 discussion is here.
Continue reading “Secret 2022 Further Business: Exam 1 Discussion”
This one is courtesy of frequent commenter, wst. It’s really a PoSWW, but the ideas may be sufficiently unfamiliar for it to be worth encouraging WitCH-like discussion. It comes from the Cambridge textbook General Mathematics 1&2.
If nothing else, people can enjoy a great song.
This one puzzled us when we gave the 2021 Further Mathematics Exams a three-minute scan, but we didn’t bother then to think, um, further. Now, however, since graph theory is likely a mandatory part of Specialist 1&2, and thus also, at least technically, a mandatory part of Specialist 3&4, we have thought a little harder. So, after discussions with John Friend and with a colleague whom we shall refer to as Professor Combo, here we are.
By overwhelming demand,* we have decided, much belatedly, to put up a post for discussion of the 2021 Further Mathematics exams. We have no particular plans to update this post, although we will do so if anything of interest arises. We’ll just note the two excerpts below, from Exam 2, the first of which is discussed here, at 5:30. Thanks to Simon and SRK for bringing these to our attention.**
It seems the VCAA has just released their draft of the new study design for Mathematics:
We haven’t yet looked at the draft, because we’re scared. But, don’t let that stop others. May the discussion and the throwing of brickbats begin.
UPDATE (09/03/21)
We’ve written a post with some brief thoughts here.
This is the home for Further Mathematics exam errors. The guidelines are given on the Methods error post, and there is also a Specialist error post.
*******************
2023, NHT Exam 2 (Exam here)
Q(3) (added 15/08/23) – The stem plot is incorrectly drawn, which can muck up the calculation of the five-number summary, asked for in part (b).
2022, Exam 2 (Exam here, report here (Word, idiots) – discussed here)
QA(3)(a)(i) (added 27/12/22) – This is not an error yet, but it will be. The issue is whether “Day Number” (for eight consecutive days) is a “numerical variable”: based on their past idiocy, it is clear that VCAA will claim, falsely, that it is not. (15/08/23 – Yep, VCAA screwed up. The report notes, “[the answer] 6 was a common error by students who had chosen ‘day number’ as a numerical variable”.)
2022, Exam 1 (Here, report here (Word, idiots) – discussed here)
QA(1) (added 26/12/22) – A badly borderline question on the shape of a distribution. A really great way to start an exam.
QA(21) (added 26/12/22) – A badly ambiguous question on nominal interest rates. It is unclear whether negative rates and/or (more plausibly) periods greater than a year were to be considered.
2022, NHT Exam 2 (Exam here, report here)
QA(2)(a) (added 27/12/22) – The report indicates that “year” is regarded as a “categorical variable”. This is absurd.
2021, Exam 2 (Exam here, report here (Word, idiots) – discussed here)
QA(1)(f) (added 24/11/21 – discussed here) The question asks for a minimum value to be an outlier, which, by the definition of an outlier, cannot exist; see the note below. (26/12/22 – The answer in the exam report is simply false.)
QB(1)(2)(c) (added 24/11/21 – discussed here) The indicated matrix M² is not the square of the matrix M provided earlier in the question.
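A note on QA(1)(f): assuming the standard textbook definition, a value x is an outlier exactly when

\[
x < Q_1 - 1.5\times\mathrm{IQR} \qquad \text{or} \qquad x > Q_3 + 1.5\times\mathrm{IQR}.
\]

Presumably the intended answer was the fence value itself. But with outliers defined by strict inequalities, a fence is not an outlier while every value beyond it is, and so no minimum outlying value exists.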
2021 NHT, Exam 2 (Here, and report here)
QA(18) (added 02/11/22) A bad question, hingeing on whether “Event”, listed as “1” or “2”, is a “nominal variable”. The report indicates that it is, which is probably best considered wrong. (Compare QA(2) on 2016 Exam 1, and the report.)
2021 NHT, Exam 1 (Here, and report here)
QA(18) (added 02/11/22) Madness. A multiple choice question on compound interest, for which none of the available answers is correct (or even close). The examination report indicates no answer, simply noting
As a result of psychometric analysis, the question was invalidated.
Some psychometric analysis is probably in order, but VCAA appears to be pointing their psych gun at the wrong target.
2019, Exam 2 (Here, and report here)
QA(1)(a) (added 27/12/22) The question asks which of the two variables in a table is “ordinal”. The report indicates that “day number” (for fifteen consecutive days) is ordinal. Given that the other choice was “temperature”, there wasn’t much alternative, but the better answer, and the only properly correct answer, is “neither”. The exam report notes “A small number of students answered ‘neither’”, without indicating whether this answer was deemed correct.
2019, Exam 1 (Here, and report here)
QB(6) (added 21/09/20) The solution requires that a Markov process is involved, although this is not stated, either in the question or in the report.
2018 NHT, Exam 1 (Here, and report here)
MCQ4 (added 23/09/20) The question provides a histogram for a continuous distribution (bird beak sizes), and asks for the “closest” of five listed values to the interquartile range. As the examination report almost acknowledges (presumably in time for the grading), this cannot be determined from the histogram; three of the listed values may be closest, depending upon the precise distribution. The report suggests one of these values as the “best” estimate, but does not rely upon this suggestion. See the comments below.
2017 Exam 2 (Here, and report here)
Q1(c)(ii) (added 13/11/20) – discussed here. The question is fundamentally nonsense, since there are infinitely many 1 x 3 matrices L that will solve the equation; see the illustration below this entry. As well, the 3 x 1 matrix given in the question does not represent the total value of the three products as indicated in Q(c)(i). The examination report does not acknowledge either error, but does add irony to the error by whining about students incorrectly answering with a 3 x 1 matrix. (30/10/22) The examination report has finally been amended to acknowledge the obvious error, albeit in a snarky “no harm, no foul” manner. The fundamental nonsense of the question remains unacknowledged. As commenter DB has noted, the examination report also makes the hilarious claim,
The overwhelming majority [of students] answered the question in the manner intended without a problem.
We’re not sure the “minority” of 72% of students who scored 0/1 on the question would agree.
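To illustrate the first point, take an illustrative 3 x 1 matrix (not the one on the exam). The equation

\[
\begin{pmatrix} a & b & c \end{pmatrix}
\begin{pmatrix} 10 \\ 20 \\ 30 \end{pmatrix}
= \begin{pmatrix} 60 \end{pmatrix}
\]

is solved by L = (1 1 1), as presumably intended, but also by (6 0 0), (0 3 0), (0 0 2), and by any other choice of a, b and c with 10a + 20b + 30c = 60.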
2017 Exam 1 (Here, and report here)
MCQ11 (added 13/11/20) – discussed here. None of the available answers is correct, since seasonal indices can be negative. The examination report does not acknowledge the error.
MCQ6 Module 2 (added 05/09/22) – discussed here. The intention of the question is reasonably clear, but the expression “how many different ways” is, at minimum, clumsily ambiguous, and one can argue for C, D or E being correct. The intended answer was E, but many students also answered C or D. The examination report suggests that the incorrect answers were due to “simple counting errors”, which is possible but far from definite. The report also writes “Most students answered option B, C or D”, which is contradicted by the statistics; presumably the statistics are correct and the sentence is wrong, but it is unclear.
2015 Exam 1 (Here, and report here)
MCQ9 Module 2 (added 30/09/20) The question refers to cutting a wedge of cheese to make a “similar” wedge of cheese, but the new wedge is not (mathematically) similar. The exam report states that the word “similar” was intended “in its everyday sense”, but notes the confusion, albeit in a weaselly, “who woulda thought?” manner. A second answer was marked correct, although only after a fight over the issue.
2011 Exam 1 (Here, and report here)
MCQ3 (added 23/11/22). The question asks for the “closest” value for the median, but this cannot be determined from the provided histogram; two of the provided answers (B and C) may be correct. The examination report is silent on the issue.
Last week, the New South Wales government came out with the next great plan to Save Mathematics Education: make mathematics compulsory up until the end of high school. Why? According to Premier Gladys Berejiklian, this will “ensure students have the numeracy skills required to succeed in today’s society”.
Yes, of course. In exactly the same way, for example, that compulsory instruction in ethics ensures that lawyers and cops act ethically.
What’s the source for this latest nonsense? Well, it’s kind of, sort of from the Interim Report of the NSW Curriculum Review, which was released a few days earlier, and which is prominent in the Government’s media release. Like all such reports, the NSW Report is barely readable, the predictable mishmosh of pseudoscience, unweighted survey, statistics of undeterminable worth and contradictory motherhoodisms. Thankfully, there’s no reason to read the Report, since the NSW Government hasn’t bothered to read it either; nothing in the Report points to making mathematics compulsory throughout high school.
Still, it was easy enough to find “maths experts” who “applauded the move”. Jordan Baker, the Sydney Morning Herald‘s education reporter, quoted four such “experts”, although the only expert appearing to say much of substance was doing anything but applauding. Greg Ashman, who is always worth reading (especially when he is needling nitwits), pointed to the need for specialist teachers in lower years. He is then quoted:
“You need to move away from the fashion for inquiry learning and problem-based learning and instead focus on high quality, interactive, explicit teaching of mathematics. Do that, and I believe numbers in year 12 would organically grow.”
In other words, if you stop having shit teachers teaching shit maths in a shit manner in lower years then maybe more kids will choose to stick around a little longer. (Ashman is more collegial than this writer.)
The NSW government’s compulsion will undoubtedly push mathematics in the exact opposite direction, into ever more directionless playing and mathematical trivia dressed up as real world saviour. You know the stuff: figuring out credit cards and, God help us, “how to choose cancer treatment”.
To illustrate the point perfectly, Melbourne’s Age has just published one of its fun exam-time pieces. Titled “Are you smarter than a 12th grader?”, the piece challenged readers to solve the following problem from yesterday’s Further Mathematics exam:
A shop sells two types of discs: CDs and DVDs. CDs are sold for $7.00 each and DVDs are sold for $13.00 each. Bonnie bought a total of 16 discs for $178.00. How many DVDs did Bonnie buy?
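For the record, the intended working is presumably a couple of lines of Year 7 algebra: writing d for the number of DVDs, Bonnie bought 16 – d CDs, and so

\[
7(16 - d) + 13d = 178 \;\Longrightarrow\; 112 + 6d = 178 \;\Longrightarrow\; d = 11.
\]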
The question this problem raises isn’t whether you are smarter than a 12th grader. The real question is: are you smart enough to realise that making mathematics compulsory to 12th grade will doom way too many students to doing 7th grade mathematics for six years in a row? For the NSW government and their cheer squad of “maths experts”, the answer appears to be “No”.
Late last year we posted on Madness in the 2017 VCE mathematics exams, on blatant errors above and beyond the exams’ predictably general clunkiness. For one (Northern Hemisphere) exam, the subsequent VCAA Report had already appeared; this Report was pretty useless in general, and specifically it was silent on the error and the surrounding mathematical crap. None of the other reports had yet appeared.
Now, finally, all the exam reports are out. God only knows why it took half a year, but at least they’re out. We have already posted on one particularly nasty piece of nitpicking nonsense, and now we can review the VCAA‘s own assessment of their five errors:
So, the VCAA responds to five blatant errors with five Trumpian silences. How should one describe such conduct? Unprofessional? Arrogant? Cowardly? VCAA-ish? All of the above?
Our fourth post on the 2017 VCE exam madness will be similar to our previous post: a quick whack of a straight-out error. This error was flagged by a teacher friend, David. (No, not that David.)
The 11th multiple choice question on the first Further Mathematics Exam reads as follows:
Which one of the following statistics can never be negative?
A. the maximum value in a data set
B. the value of a Pearson correlation coefficient
C. the value of a moving mean in a smoothed time series
D. the value of a seasonal index
E. the value of a slope of a least squares line fitted to a scatterplot
Before we get started, a quick word on the question’s repeated use of the redundant “the value of”.
Bleah!
Now, on with answering the question.
It is pretty obvious that the statistics in A, B, C and E can all be negative, so presumably the intended answer is D. However, D is also wrong: a seasonal index can also be negative. Unfortunately the explanation of “seasonal index” in the standard textbook is lost in a jungle of non-explanation, so to illustrate we’ll work through a very simple example.
Suppose a company’s profits and losses over the four quarters of a year are as follows:
So, the total profit over the year is $8,000, and the average quarterly profit is therefore $2,000. The seasonal index (SI) for each quarter is then that quarter’s profit (or loss) divided by the average quarterly profit:
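For instance, suppose the quarterly figures were $6,000, -$2,000, -$1,000 and $5,000 (illustrative numbers, chosen to match the totals above). Then

\[
\mathrm{SI}_1 = \frac{6000}{2000} = 3, \qquad
\mathrm{SI}_2 = \frac{-2000}{2000} = -1, \qquad
\mathrm{SI}_3 = \frac{-1000}{2000} = -0.5, \qquad
\mathrm{SI}_4 = \frac{5000}{2000} = 2.5.
\]

The four indices still sum to 4, as quarterly seasonal indices must under this definition, but two of them are negative.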
Clearly this example is general, in the sense that in any scenario where the seasonal data are both positive and negative, some of the seasonal indices will be negative. So, the exam question is not merely technically wrong, with a contrived example raising issues: the question is wrong wrong.
Now, to be fair, this time the VCAA has a defense. It appears to be more common to apply seasonal indices in contexts where all the data are of one sign, or to use absolute values and then consider the magnitudes of deviations. It also appears that most or all of the examples Further students would have studied included only positive data.
So, yes, the VCAA (and the Australian Curriculum) don’t bother to clarify the definition or permitted contexts for seasonal indices. And yes, the definition in the standard textbook implicitly permits negative seasonal indices. And yes, by this definition the exam question is plain wrong. But, hopefully most students weren’t paying sufficient attention to realise that the VCAA weren’t paying sufficient attention, and so all is ok.
Well, the defense is something like that. The VCAA can work on the wording.