AMSI to the Rescue

The Australian Mathematical Sciences Institute has just released its Report on Australian Year 12 students’ participation in mathematics from 2008 to 2017. The Report indicates, of course, that the percentage of girls doing a “higher level” maths subject is lower than the percentage of boys. (One headline trumpets that “Less girls are studying maths than boys”, proving only that fewer journalists are studying grammar.) More generally, the overall participation in higher level maths is reportedly the lowest for 20 years.


Who would have thought that a boring and aimless curriculum, and lousy texts, and the crappy training of teachers by clowns who have in turn had crappy training, and the belittling of Mathematics by the S and the T and the E of STEM, and the faddish genuflection to technical gods, and decades of just plain dumbing down would have pissed off so many students?

AMSI’s Report of the bleeding obvious doesn’t consider the causes of the decline in participation. Fair enough. The Report is seriously flawed, however, in failing to note that the meaning of “higher level mathematics” is not a constant. The “higher level mathematics” of 2017 is significantly lower than that of 1997, which is lower again than that of 1977. The problem is much, much worse than AMSI’s Report suggests.

AMSI is not just reporting on the decline in participation; it is supposedly working to fix it. AMSI’s new director, Tim Brown, has been out and about, discussing the Report. Professor Brown is reported as saying that the reasons for the decline are “varied”, but of these varied reasons he appears to have indicated just two to the media: first, “a shortage in qualified maths teachers”; second, “teaching from the textbook” rather than “active learning”.

Really? With all those plump targets, AMSI chooses these two? Yes, the lack of qualified teachers is a problem, and a problem AMSI apparently enjoys talking about. And yes, the current textbooks are appalling. But such low-fruit targets are not the substantive problem, and false fixes of second order issues will do little or nothing to improve matters. The real issue is one of systemic cultural decline.

We believe Professor Brown knows this. The question is, will Professor Brown drag AMSI, finally, into waging the genuine, important fights that need to be fought?

NAPLAN’s Latest Last Legs

The news is that NAPLAN is on its way out. An article from SMH Education Editor Jordan Baker quotes Boston College’s Andy Hargreaves claiming tests such as NAPLAN are on their “last legs”. This has the ring of truth, since Professor Hargreaves is … who knows? We’re not told anything about who Hargreaves is, or why we should bother listening to him.

Perhaps Professor Hargreaves is correct, but we have reason to doubt it. And Jordan Baker has been administering NAPLAN’s last rites for a while now. Last year, Baker wrote another article, on NAPLAN’s “death knell”.

Regular readers of this blog would be aware that this writer would love nothing more than to see ACARA sink into the sea, taking its idiotic tests and clueless curriculum with it. But it’s important to understand why, and why the argument for getting rid of NAPLAN is no gimme. It is here that we disagree with Hargreaves and (we suspect) Baker.

Baker quotes Hargreaves on national testing such as NAPLAN and its “unintended impact on students’ well-being and learning”:

[They include] students’ anxiety, teaching for the test, narrowing of the curriculum and teachers avoiding innovation in the years when the tests were conducted.

Let’s consider Hargreaves’ points in reverse order.

  • Innovation. Yes, a focus on NAPLAN would discourage innovation. Which would be a bad thing if the innovation weren’t poisonous, techno-fetishistic nonsense. Hargreaves, or someone, has to give a convincing argument that current educational innovation is generally positive. We’ll wait. We won’t hold our breath.    
  • Narrowing of the curriculum? We can only wish. The Australian Curriculum is a blivit, a bloated mass of pointlessness.
  • Teaching to the test is of course a bad thing. Except if it isn’t. If you have a good test then teaching to the test is a great thing.
  • Finally, we have to deal with students’ anxiety, a concern for which has turned into an academic industry. All those poor little petals having their egos bruised. Heaven forbid that we require students to struggle with the hard business of learning.

There is plenty to worry about with any national testing scheme: the age of the students, the frequency of the tests, the reporting and use of test results, and the ability to have an informed public discussion of all of this. But these concerns are secondary.

The problem with the NAPLAN tests isn’t their “unintended consequences”. The problem with the NAPLAN tests is the tests. They’re shithouse.


PoSWW 5: Intelligence is not a Factor

The following PoSWW comes courtesy of Franz, who states that “when it comes to ‘stupid curricula, stupid texts and really monumentally stupid exams’ no Western country, with the possible exception of the US, is worse than Germany.” We take that as a challenge, and we’re waiting for Franz to back up his crazy-brave claim.

Franz’s PoSWW, however, has nothing to do with Germany. This PoSWW follows on from two of our previous posts, on idiotic questions appearing in New Zealand exams. Franz wrote to us, noting that the same style of question appears in the Oxford Year 8 text My Maths. Indeed, a number of versions of this ludicrous question appear in My Maths, all inventively awful in their own way. The two examples below are enough to give the flavour: