# Tag: Further Mathematics

## Secret 2022 Further Business: Exam 2 Discussion

## Secret 2022 Further Business: Exam 1 Discussion

## WitCH 82: Graphic Evidence

## WitCH 77: Road to Nowhere

## Further Exams and Further Errors

## Feeling VCAA’s Draft: Discussion

## One FEL Swoop: The Further Error List

## Obsessive Compulsion

## VCAA Plays Dumb and Dumber

## A Madness for all Seasons

Same old thing. Exam 1 discussion is here.

Continue reading “Secret 2022 Further Business: Exam 2 Discussion”

Further Mathematics is not our thing, but it seemed possible that some frequent visitors may want to discuss today’s exam. (We haven’t seen the exam, and don’t plan to seek it out unless something interesting is flagged.)

Have fun.

**UPDATE (31/10/22):** Exam 2 discussion is here.

Continue reading “Secret 2022 Further Business: Exam 1 Discussion”

This one is courtesy of frequent commenter wst. It’s really a PoSWW, but the ideas may be sufficiently unfamiliar for it to be worth encouraging WitCH-like discussion. It comes from the Cambridge textbook *General Mathematics 1&2.*

If nothing else, people can enjoy a great song.

This one puzzled us when we gave the 2021 Further Mathematics Exams a three-minute scan, but we didn’t bother then to think, um, further. Now, however, since graph theory is likely a mandatory part of Specialist 1&2, and thus also, at least technically, a mandatory part of Specialist 3&4, we have thought a little harder. So, after discussions with John Friend and with a colleague who we shall refer to as Professor Combo, here we are.

By overwhelming demand,* we have decided, much belatedly, to put up a post for discussion of the 2021 Further Mathematics exams. We have no particular plans to update this post, although we will do so if anything of interest arises. We’ll just note the two excerpts below, from Exam 2, the first of which is discussed here, at 5:30. Thanks to Simon and SRK for bringing these to our attention.**

It seems the VCAA has just released their draft of the new study design for Mathematics:

- The current (pre-COVID) study design (pdf) is here.
- The draft for the new study design (word) is here.
- The key changes overview (word) is here.
- The link for feedback (**until March 9, 2021**) is here.

We haven’t yet looked at the draft, because we’re scared. But, don’t let that stop others. May the discussion and the throwing of brickbats begin.

**UPDATE (09/03/21)**

We’ve written a post with some brief thoughts here.

This is the home for Further Mathematics exam errors. The guidelines are given on the Methods error post, and there is also a Specialist error post.

***********************

**2021, Exam 2 (no exam or report yet available – discussed here)**

**QA(1)(f) (added 24/11/21 – discussed here)** The question asks for a minimum value to be an outlier, which, by definition of an outlier, cannot exist.

**QB(1)(2)(c) (added 24/11/21 – discussed here)** The indicated matrix *M*^{2} is not the square of the matrix *M* provided earlier in the question.

**2021 NHT, Exam 1 (Here, and report here)**

**QA(18) (added 02/11/22)** Madness. A multiple choice question on compound interest, for which none of the available answers is correct (or even close). The examination report indicates no answer, simply noting

*As a result of psychometric analysis, the question was invalidated.*

Some psychometric analysis is probably in order, but VCAA appears to be pointing their psych gun at the wrong target.

**2019, Exam 1 (Here, and report here)**

**QB(6) (added 21/09/20)** The solution requires that a Markov process is involved, although this is not stated, either in the question or in the report.

**2018 NHT, Exam 1 (Here, and report here)**

**MCQ4 (added 23/09/20)** The question provides a histogram for a continuous distribution (bird beak sizes), and asks for the “closest” of five listed values to the interquartile range. As the examination report almost acknowledges (presumably in time for the grading), this cannot be determined from the histogram; three of the listed values may be closest, depending upon the precise distribution. The report suggests one of these values as the “best” estimate, but does not rely upon this suggestion. See the comments below.

**2017 Exam 2 (Here, and report here)**

**Q1(c)(ii) (added 13/11/20) – discussed here.** The question is fundamentally nonsense, since there are infinitely many 1 x 3 matrices *L* that will solve the equation. As well, the 3 x 1 matrix given in the question does not represent the total value of the three products as indicated in Q(c)(i). The examination report does not acknowledge either error, but does add irony to the error by whining about students incorrectly answering with a 3 x 1 matrix. **(30/10/22)** The examination report has finally been amended to acknowledge the obvious error, albeit in a snarky “no harm, no foul” manner. The fundamental nonsense of the question remains unacknowledged. As commenter DB has noted, the examination report also makes the hilarious claim,

*The overwhelming majority [of students] answered the question in the manner intended without a problem.*

We’re not sure the “minority” of 72% of students who scored 0/1 on the question would agree.
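The “infinitely many solutions” point is easy to see with a minimal sketch; the numbers below are our own hypothetical stand-ins, not the exam’s:

```python
# Hypothetical 3 x 1 column of values, standing in for the exam's matrix.
v = [2, 3, 5]

def row_times_column(L, v):
    """Multiply a 1 x 3 row matrix L by a 3 x 1 column matrix v (as flat lists)."""
    return sum(a * b for a, b in zip(L, v))

# Two different row matrices L, each satisfying L * v = 30; indeed any point
# (x, y, z) on the plane 2x + 3y + 5z = 30 gives another solution, so there
# are infinitely many.
print(row_times_column([15, 0, 0], v))  # → 30
print(row_times_column([0, 10, 0], v))  # → 30
```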

**2017 Exam 1 (Here, and report here)**

**MCQ11 (added 13/11/20) – discussed here.** None of the available answers is correct, since seasonal indices can be negative. The examination report does not acknowledge the error.

**MCQ6 Module 2 (added 05/09/22) – discussed here.** The intention of the question is reasonably clear, but the expression “how many different ways” is, at minimum, clumsily ambiguous, and one can argue for either C or D or E being correct. The intended answer was E, but many students also answered C or D. The examination report suggests that the incorrect answers were due to “simple counting errors”, which is possible but far from definite. The report also writes “Most students answered option B, C or D”, which is contradicted by the statistics; presumably the statistics are correct and the sentence is wrong, but it is unclear.

**2015 Exam 1 (Here, and report here)**

**MCQ9 Module 2 (added 30/09/20)** The question refers to cutting a wedge of cheese to make a “similar” wedge of cheese, but the new wedge is not (mathematically) similar. The exam report states that the word “similar” was intended “in its everyday sense”, but notes the confusion, albeit in a weaselly, “who woulda thought?” manner. A second answer was marked correct, although only after a fight over the issue.

**2011 Exam 1 (Here, and report here)**

**MCQ3 (added 23/11/22).** The question asks for the “closest” value for the median, but this cannot be determined from the provided histogram; two of the provided answers (B and C) may be correct. The examination report is silent on the issue.

Last week, the New South Wales government came out with the next great plan to Save Mathematics Education: make mathematics compulsory up until the end of high school. Why? According to Premier Gladys Berejiklian, this will “ensure students have the numeracy skills required to succeed in today’s society”.

Yes, of course. In exactly the same way, for example, that compulsory instruction in ethics ensures that lawyers and cops act ethically.

What’s the source for this latest nonsense? Well, it’s kind of, sort of from the Interim Report of the NSW Curriculum Review, which was released a few days earlier, and which is prominent in the Government’s media release. Like all such reports, the NSW Report is barely readable, the predictable mishmosh of pseudoscience, unweighted survey, statistics of undeterminable worth and contradictory motherhoodisms. Thankfully, there’s no reason to read the Report, since the NSW Government hasn’t bothered to read it either; nothing in the Report points to making mathematics compulsory throughout high school.

Still, it was easy enough to find “maths experts” who “applauded the move”. Jordan Baker, the *Sydney Morning Herald*‘s education reporter, quoted four such “experts”, although the only expert appearing to say much of substance was doing anything but applauding. Greg Ashman, who is always worth reading (especially when he is needling nitwits), pointed to the need for specialist teachers in lower years. He is then quoted:

*“You need to move away from the fashion for inquiry learning and problem-based learning and instead focus on high quality, interactive, explicit teaching of mathematics. Do that, and I believe numbers in year 12 would organically grow.”*

In other words, if you stop having shit teachers teaching shit maths in a shit manner in lower years then maybe more kids will choose to stick around a little longer. (Ashman is more collegial than this writer.)

The NSW government’s compulsion will undoubtedly push mathematics in the exact opposite direction, into ever more directionless playing and mathematical trivia dressed up as real world saviour. You know the stuff: figuring out credit cards and, God help us, “how to choose cancer treatment”.

To illustrate the point perfectly, Melbourne’s *Age* has just published one of its fun exam-time pieces. Titled “Are you smarter than a 12th grader?”, the piece challenged readers to solve the following problem from yesterday’s Further Mathematics exam:

*A shop sells two types of discs: CDs and DVDs. CDs are sold for $7.00 each and DVDs are sold for $13.00 each. Bonnie bought a total of 16 discs for $178.00. How many DVDs did Bonnie buy?*
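For the record, the problem amounts to two linear equations (discs: c + d = 16; dollars: 7c + 13d = 178), and a brute-force check settles it; the variable names c and d are ours, not the exam’s:

```python
# Try every possible number of DVDs (d); the CDs (c) make up the rest.
# Equations: c + d = 16 and 7c + 13d = 178.
solutions = [(16 - d, d) for d in range(17) if 7 * (16 - d) + 13 * d == 178]
print(solutions)  # → [(5, 11)]: Bonnie bought 11 DVDs (and 5 CDs)
```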

The question this problem raises isn’t are you smarter than a 12th grader. The real question is, are you smart enough to realise that making mathematics compulsory to 12th grade will doom way too many students to doing 7th grade mathematics for six years in a row? For the NSW government and their cheer squad of “maths experts”, the answer appears to be “No”.

Late last year we posted on Madness in the 2017 VCE mathematics exams, on blatant errors above and beyond the exams’ predictably general clunkiness. For one (Northern Hemisphere) exam, the subsequent VCAA Report had already appeared; this Report was pretty useless in general, and specifically it was silent on the error and the surrounding mathematical crap. None of the other reports had yet appeared.

Now, finally, all the exam reports are out. God only knows why it took half a year, but at least they’re out. We have already posted on one particularly nasty piece of nitpicking nonsense, and now we can review the VCAA‘s own assessment of their five errors:

- Mathematical Methods Exam 1 contained an utter hash of a question, with a fundamentally misleading graph and a (for Methods) undefined endpoint derivative. The Report makes no mention of the error or the dodginess of the graph. Students scored an average of 5% on the question.
- Mathematical Methods Exam 2 contained, twice, a type of question for which previous years’ reports had suggested an invalid method of solution. This year’s Report approaches the question validly the first time and then ducks the question with a dubious technique the second time. (More on this in a future post.) The Report makes no mention of the previously acceptable invalid approach, or whether that approach is still considered acceptable. Students scored an average of 66% on the first question (suggesting the invalid technique is secretly still acceptable) and 5% on the second.
- Specialist Mathematics Exam 1 contained a doubly improper integral, which cannot be done with Specialist techniques. The Report makes no comment on the error and, in light of the error, the Report provides no clue as to what was expected of the students, nor what was accepted from the students as correct. Students scored an average of 35% on the question.
- Further Mathematics Exam 1 included a multiple choice question with no correct answer. The Report makes no comment, simply indicating one answer (the trickiest answer to prove wrong) as correct, and that 59% of students gave the “correct” answer.
- Further Mathematics Exam 2 included an ill-posed question with erroneously transposed information. The Report makes no comment on the error. The Report does note that 72% of students received 0/1 for the question, with “many” students giving a 3 x 1 matrix rather than the correct 1 x 3, and presumably scoring 0 as a consequence. That’s pretty damn cute, given the examiners had already mixed up the rows and columns.

So, the VCAA responds to five blatant errors with five Trumpian silences. How should one describe such conduct? Unprofessional? Arrogant? Cowardly? VCAA-ish? All of the above?

Our fourth post on the 2017 VCE exam madness will be similar to our previous post: a quick whack of a straight-out error. This error was flagged by a teacher friend, David. (No, not that David.)

The 11th multiple choice question on the first Further Mathematics Exam reads as follows:

*Which one of the following statistics can never be negative? *

**A**. the maximum value in a data set

**B**. the value of a Pearson correlation coefficient

**C**. the value of a moving mean in a smoothed time series

**D**. the value of a seasonal index

**E**. the value of a slope of a least squares line fitted to a scatterplot

Before we get started, a quick word on the question’s repeated use of the redundant “the value of”.

Bleah!

Now, on with answering the question.

It is pretty obvious that the statistics in A, B, C and E can all be negative, so presumably the intended answer is D. However, D is also wrong: a seasonal index can also be negative. Unfortunately the explanation of “seasonal index” in the standard textbook is lost in a jungle of non-explanation, so to illustrate we’ll work through a very simple example.

Suppose a company’s profits *and losses* over the four quarters of a year are as follows:

So, the total profit over the year is $8,000, and then the **average quarterly profit** is $2,000. The **seasonal index** (**SI**) for each quarter is then that quarter’s profit (or loss) divided by the average quarterly profit:
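A minimal sketch of the computation, with hypothetical quarterly figures chosen to match the stated total ($8,000) and average ($2,000):

```python
# Hypothetical quarterly profits/losses (our numbers, chosen so the total is
# $8,000 and the average quarterly profit is $2,000, as in the text).
profits = [6000, -2000, 1000, 3000]

average = sum(profits) / len(profits)  # $2,000
seasonal_indices = [p / average for p in profits]
print(seasonal_indices)  # → [3.0, -1.0, 0.5, 1.5] (note the negative index)
```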

Clearly this example is general, in the sense that in any scenario where the seasonal data are both positive and negative, some of the seasonal indices will be negative. So, the exam question is not merely technically wrong, with a contrived example raising issues: the question is *wrong* wrong.

Now, to be fair, this time the VCAA has a defense. It appears to be more common to apply seasonal indices in contexts where all the data are of one sign, or to use absolute values to then consider magnitudes of deviations. It also appears that most or all examples Further students would have studied included only positive data.

So, yes, the VCAA (and the Australian Curriculum) don’t bother to clarify the definition or permitted contexts for seasonal indices. And yes, the definition in the standard textbook implicitly permits negative seasonal indices. And yes, by this definition the exam question is plain wrong. *But*, hopefully most students weren’t paying sufficient attention to realise that the VCAA weren’t paying sufficient attention, and so all is ok.

Well, the defense is something like that. The VCAA can work on the wording.