Leading By Example

What a month. It’s raining mendacity.

Today, the ridiculous AMSI-AAMT-MERGA statement received further press coverage, this time in a report from education stenographer, Suzan Delibasic (paywalled, Murdoch):

“Leading experts are calling for a maths curriculum overhaul, with a major review set to focus on fixing declining academic results.”

Once the stage has been set with straight-faced paraphrasing of AMSI-AAMT-MERGA nonsense, Delibasic’s report consists of quotes from three of these “leading experts”, beginning with AAMT’s CEO, Allan Dougan:

“The whole idea of a maths class where the teacher teaches the content and the students practise it 300 times, that’s what we’re moving away from.”

300 times? If a kid is assigned 30 exercises as practice, the school will call Child Services. 3 times is much closer to the current mark, particularly in primary school, where the real damage is being done.

We have no idea where Dougan dredged up his Dickensian dream, but of course it has nothing to do with reality. The reality is that decades of “leading experts” killing the teaching of technique and denigrating proper practice are a huge part of why Australian mathematics education is currently a disaster. Dougan apparently imagines the cure is even less practice than the trivial amount that currently exists.

To illustrate the point, Dougan provides his own, striking example:

“[Dougan] said one problem-solving task could involve year 6 students taking part in an activity called It All Adds Up, where each letter of the alphabet is given a dollar value” 

“Letter A is $1 to Z being $26. You can start asking students open questions such as finding a four-letter word that costs $50 — the success of this task is how they approach it and how they think about problem solving.”

Looks like a fun game. How about VOID? Or CLOT? Do I win?
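For anyone wanting to check the arithmetic (or to strip-mine the game entirely), the letter-pricing is a one-line computation. A quick sketch in Python, with the helper name `cost` ours, not Dougan’s:

```python
# "It All Adds Up" pricing: A = $1, B = $2, ..., Z = $26.
def cost(word):
    return sum(ord(c) - ord("A") + 1 for c in word.upper())

# Both of the four-letter candidates above come to $50.
for w in ["VOID", "CLOT"]:
    print(w, cost(w))
```

Pointing a loop like this at a word list would, of course, end the “problem solving” in about a millisecond.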

Seriously, Year 6? As an add-on activity for Year 2, maybe Year 3, sure. But if you imagine it reasonable to expect Year 6 students to gain anything from such an addition game, then your sense of appropriate skill level bears no relation to reality. And even for Year 2 or Year 3 students, it’s a game, which by definition cannot be the main game. You learn addition by practising addition – the carefully structured 30 times thing – not by the occasional random sum in the middle of a game.

Our second Leading Expert is AMSI’s Director, Tim Marchant:

“Australian Mathematical Sciences Institute director Professor Tim Marchant said he was concerned by the shortage of qualified maths teachers.”

“The data shows about 50 per cent of schools have maths classes taught by teachers that aren’t qualified in maths,”

Well, it wouldn’t be AMSI if they weren’t punching down, whining about unqualified teachers. But Professor Marchant also considers classroom activities:

Prof Marchant said group activities in the classroom helped learning and made maths “fun” … He suggested hands-on learning experiences including using Rubik’s cubes to help with problem solving.

Rubik’s cubes. Not enough games, not enough “fun”, that’s the problem.

Once upon a time, we had hope that AMSI would be a genuine force for improving Australian mathematics education. Now, we’d be happy if AMSI would just shut up, stop signing ridiculous statements and go away.

Our final Leading Expert is Peter Sullivan, Emeritus Professor of Education at Monash University:

“The revised curriculum needs to be simply written so teachers can understand and comprehend it; we want the big ideas clearly articulated,”

That’s Peter the Great there, the guy who led the writing of the current Australian mathematics curriculum.

Leading Experts. The “experts” part is debatable, but the “leading” is absolutely clear. These people are leading Australia to an even deeper level of educational Hell.

RatS 11: Taibbi and The Miserableness of Documenting Censorship

We link a lot to Matt Taibbi. He is smart, he cares (mostly) about the right things, and he is beholden to no one. Which means that, with the polarising idiocy of our times, Taibbi is hated by pretty much everyone. It’s always a good bet to judge a man by his enemies.

Taibbi has an on-going series, Meet the Censored, to which we link whenever he has a new post. Now, Taibbi has written about that series, and the difficulty of bringing people on board with what the series is really about. If, for example, you were cheering when Trump was kicked off Twitter, you might want to think again. Having thought again, you’re probably still fine to cheer Trump being kicked off Twitter, and maybe that’s ok, but it is not a gimme.

Here is Taibbi’s post on the issue:

On the Miserable Necessity of Doing Censorship Stories in Pairs



Why Mathematics Education Must Change

The revisions to Australia’s mathematics curriculum will be out soon, and it appears that the fix may be in. This fix will, of course, fix nothing; our guess is that things are about to get much worse.

As reported in yesterday’s SMH, Australia’s major league Maths Ed groups have released a “Joint Statement on Proposed Maths Curriculum”. Cosigned by AMSI, AAMT and MERGA, as well as AAS and ATSIMA, the statement is titled Why Maths [sic] Must Change. The statement is a triumph of modern educational ignorance.

The statement begins by noting “the proposed revisions to the Australian maths curriculum” are forthcoming, and “the importance of getting it right”. We are then told what “getting it right” means. The statement is poorly written and vague, inappropriately and inaccurately colloquial, but the message is clear enough:

“More than ever, our society needs students who are adaptable, resilient, responsive to challenges and able to handle unfamiliar situations. It is not enough to have knowledge – they must have the skills to take that knowledge and apply it to solve unknown problems, and do it quickly.”

Yes, the cure for our maths ed ills is yet more problem-solving, yet more overhyped exploration. And, this is to be contrasted with the alternative, a focus on “knowledge”. The writers are so proud of this ridiculous straw man that they repeat it:

We need education systems and curricula that help deliver students to society who are up for such a challenge – just having knowledge is no longer enough. Instead, the abilities to problem-solve, mathematise, hypothesise, model are all skills that add worth to acquired knowledge. Mathematics learning cannot sit in silos that focus on content and procedures. Instead, it must be something that gives the knowledge purpose.

We expect no better from AAMT or MERGA, but what about AMSI? Aren’t they like mathematicians, or something? Do AMSI’s glorious leaders really believe this nonsense? Do they really believe that school mathematics is, or was ever, a purposeless “silo” of knowledge-acquisition? Do they honestly think that the problem with Australia’s mathematics education, the reason, for example, why the majority of secondary students have no proper concept of or facility with fractions, is because there has been too much focus on content and procedure? Do they really imagine that these fraction-deficient students can nonetheless boldly venture forth to “solve unknown problems”?

The idea is, of course, absurd. The whole statement is absurd, a mission statement from the very same constructivist, discipline-hating, technique-hating ignorants who have been selling this snake oil for decades, and who are one of the major reasons why Australian mathematics education is now such a disaster. And, of course, their suggested cure for the problem they very much helped create is more of the same snake oil.

There is more in the statement. There is the predictable pointing to Australia’s woeful but irrelevant PISA scores, and the predictable silence on Australia’s woeful and highly relevant TIMSS scores. The writers express the hope, indeed the promise, that PISA results will improve. Which may well be true; it is the mathematics education, and the education generally, that will suffer.

We will remark upon one more, very troubling line from the statement:

As such, the suggested revisions in the curriculum are not just welcomed …

What, exactly, are these “suggested revisions”, and how do the cosigners of this statement know the revisions are “welcomed”? There are strong indications of what ACARA intends, and that what they intend will be awful. As far as we are aware, however, ACARA has yet to make any proposed revisions available for public comment.

What this implies, assuming that the above line is not simply more poor wording, is that the drafters, and perhaps the cosigners of the statement, are privy to ACARA’s inner workings, and that they are pleased with them. As we wrote, it appears that the fix is in.

Anything that will please the cosigners of the statement Why Maths Must Change will be a disaster for Australian mathematics education, and it seems as if the cosigners have reason to be pleased. God help the rest of us.

ACARA’s Illiterature Review

A few weeks ago we wrote about ACARA’s review of the Australian Curriculum. The mathematics component of ACARA’s review appears to be at least partially in the hands of some loose cannons – the Center for Curriculum Redesign – which ACARA seems now to regret having hired. CCR is still on our to-whack list, but ACARA’s general documentation for their review is also worth scrutiny, particularly the document we consider in this post. In all, it makes a fine example of how hundreds of pages on best practising and evidence-basing and world benchmarking can amount to little more than manipulative blather.

The main webpage for ACARA’s curriculum review consists of an overview: welcoming videos, motherhood declarations, the terms of reference, a timeline and the like. The substantive basis for the review then appears on a separate page, Program of Research. It is the documents on this PoR page that we’ll be analysing.

The PoR provides links to six documents. Four of these documents are long “comparative studies” of the current Australian Curriculum with other curricula: in turn, British Columbia, Finland, Singapore and New Zealand. A fifth document then attempts to summarise these four comparative studies, indicating the “key findings”. We intend to write about the Key Findings and the Australia-Singapore Comparative Review in future posts.

The subject of this post is the sixth and final PoR document, a “Literature Review”. Listed first and subtitled “contemporary approaches to comparative education research”, the Literature Review purports to give the theoretical grounding for the application of the substantive curriculum comparisons that follow. The Literature Review begins:

“This paper explores developments in the field of comparative education research, including references to methodological approaches that may inform the design and focus of ACARA’s program of research and international comparison (2017-2020)”

And, the Literature Review closes with a final, one-sentence paragraph:

“A critical consideration is the fact that curriculum is only one part of the educational equation.”

Ignoring the questionable grammar, how did the Literature Review get from A to Z, and what does it mean? The Review is dense with jargon and name-dropping, as literature reviews tend to be, and we’ll attempt to give some sense of the Review below. But already ACARA’s main, double-barrelled message is pretty clear:

Regardless of what other countries are doing, ACARA has licence to do what they want and, whatever subsequently happens, it is not their fault.

That is bad enough, but things are much, much worse.

The Literature Review begins by noting that international comparisons are all the rage, in education and everything. In particular there are major international tests – the ridiculous PISA, and the not-ridiculous TIMSS, and PIRLS – which invite such comparisons, and which tend to be the focus of media reports, and of subsequent social and political reaction.

The Literature Review continues by discussing this trend, meandering from authority to authority. There are few endorsed conclusions but there is plenty of gaming, with the Review hovering around two implied concerns. Firstly, and presumably the central purpose of the Review, it is suggested that more general national differences make educational comparisons fundamentally difficult:

This area of research has become increasingly contested, however, insofar as there are perceptions of a focus on systemic improvement without a concomitant appreciation of socio-cultural (and other) context, philosophy of education and capacity to effect change. The risks posed by inadequate consideration of local issues are raised in discussions of the ‘rationality and irrationality of international comparative studies’ (Keitel & Kilpatrick, 1999).

Secondly, the focus upon international tests may result in an excessive focus on “literacy and numeracy”, narrowing the comparison of curricula:

… some researchers [claim] that ‘international comparison bolsters an evaluation mandate that promotes a superficial global awareness while stifling originality by displacing the core objectives of education’ (Hebert, 2012, p. 18). This reflects a view that comparative research must move beyond mere comparison of scores (e.g. PISA), and that more studies are needed in areas such as creativity, talent, ethical sensibilities and also in relation to values and attitudes more relevant to the needs of 21st century students (Hebert, 2012).

This leads to a consequent concern, that a “league table” focus on international comparison can result in pressure to “teach to the test”, thus narrowing the curriculum itself:

“In rejecting evaluation mandates, Hebert (2012) observes that literacy and numeracy often overshadow other education objectives (e.g. creativity, ethics, knowledge of history, etc.) central to educational systems as a consequence of ‘unbalanced policy-making’.”

Although containing a kernel of truth, there is plenty to criticise in ACARA’s statements, not least the disingenuous “some people say” framing. What, indeed, is the purpose here of a “literature review”? The only value for such preliminary documents is to determine the basis of the curricula comparisons to come, and a sequence of unsubstantiated claims from unendorsed authorities cannot possibly provide a proper basis. If ACARA has determined the basis of their curriculum review, which of course they must, then they are obligated to take their stand and to state it clearly. The implausible deniability inherent in ACARA’s literature review is ridiculous and cowardly.

As for ACARA’s concerns, well, yes, and no. Sure, there are good reasons for rural Peru to not compare themselves too critically to South Korea; it is much less clear, however, why Australia, with about the same GDP as South Korea and with half the population, should flinch from such a comparison. And true, the league tables don’t necessarily make even a superficial comparison easy; if South Korea is ranked third on some test with a score of 607 and Australia is tenth with a 517, that of itself tells us nothing. If, however, only 61% of Year 8 Australian students can figure out the fourth, very easy angle of a quadrilateral, while 86% of Korean students can do the same (p 181), that suggests something. And, if the same relative failure occurs question after question, that suggests a lot.

On ACARA’s concerns about the narrowing of the curriculum and of curriculum comparisons, one can only wish it were so. ACARA’s “values and attitudes more relevant to the needs of 21st century students” is undefined and undefinable; it is meaningless twaddle. And, whatever the place of “creativity” and “ethical values” and so forth in a curriculum, it can only be meaningful coming on top of a solid foundation of reading and writing and mathematical sense. The deep and proper teaching of the three Rs, and establishing the necessary classroom culture in which to do it, is the critical basis of any coherent curriculum, as it has always been.

Ironically, while ACARA’s conscious undermining of international comparisons is strained and weak, ACARA also fails to raise other, much more substantial concerns. To begin, ACARA doesn’t even consider the possibility that international tests can intrinsically, on their merits, be awful; this is regrettable, since the only rational response to the question of how to use PISA scores for international comparison, or anything, is “Don’t”. Further, even if the test is not awful, it is not automatic that using the test results for comparison is necessary or particularly enlightening. What, for example, if less than half of Year 8 Australian students can give the prime factorisation of 36 in answer to a multiple choice question (p 6)? What if less than half of the same cohort can rewrite \boldsymbol{\frac{4}{14}} as \boldsymbol{\frac{\Box}{21}}, again in a multiple choice question (p 14)? Does one really need to look to how Korean kids are doing to recognise that something is seriously screwed up in Australia?

It is arguably worse than that. International comparisons, even those based upon intrinsically good tests, may be not just unnecessary but also misleadingly optimistic. What if, as there is reason to believe, the tests are getting dumber? What if, as there is reason to believe, the entire World is getting dumber? Australia coming a constant tenth in a dumbing world is not a constant; it is a decline, and possibly a steep decline.

ACARA’s simultaneous failure to grasp the clear benefits and the genuine flaws of international educational comparison is entirely predictable. It stems from ACARA’s inability to contemplate, let alone declare, a simple, objective basis for a coherent and productive school curriculum. If ACARA had any such ability then they would realise that, rather than current Australia being compared to other places, it should be compared to other times, to other centuries. If one wallows in a nonsense-swamp of 21st century idolatry, it is impossible to contemplate that anything might have been done better, and much better, in an earlier time. Such is ACARA’s blinkered thinking and such is Australia’s fate, and the fate of the World.

We could go on. ACARA’s Literature Review contains much more, and almost nothing. We could point out further, monumental flaws in the Review, but there is probably no need. We’ll simply note that nothing in the Review could assist in making useful international comparisons of education. And, much more importantly, there is nothing in the Review that could assist in the creation of a simple, coherent and productive school curriculum.

ACARA’s review of the Australian Curriculum is destined to be a disaster. The review will undoubtedly leave Australia with the same bloated, baseless, aimless idiocy that it has now. ACARA, and the educational authorities with which they consort and upon which they rely, are congenitally incapable of anything else.

The Coronavirus Vaccine and Australia’s Dangerous Clot

Most people will be aware that Australia’s rollout of coronavirus vaccines is being threatened by a dangerous clot. But it’s not just Greg Hunt. As well as the Health Minister there are a number of other problems, including a second, worrying clot.

Of course, ScoMoFo and his team of incompetent goons have screwed up Australia’s vaccination program, and of course they’re too busy image-managing and blame-shifting to work to fix it up. But there is also a serious question about the AstraZeneca vaccine and the prevalence of dangerous blood-clotting.

In brief, is the AstraZeneca vaccine sufficiently safe to warrant its use? How likely is the vaccine to protect a person, and protect them from what? What are the dangers of the vaccine, how likely are the dangers to eventuate, and how dangerous are the dangers?

We’re open-minded on the question, we haven’t looked hard for trustworthy answers, and we’d appreciate it if anyone can point us to reasonable and reliable evidence. We’ll happily work to digest any good-faith analyses, and will look to write about it in future posts. It would not surprise us if we ended up being convinced that the AstraZeneca vaccine, although not without significant risks, is worth the risks. But, as it stands, we’re not convinced.

One thing that does absolutely nothing to convince us that the AstraZeneca vaccine is sufficiently safe is pronouncements from ScoMoFo and Greg Hunt that the vaccine is sufficiently safe. Declarations from these self-interested con men, on anything, are worthless. We are also not at all comforted by government apparatchiks chanting “no proven link”, as if some formal proof of a link rather than statistical evidence is the critical issue right now. All these people may be telling the truth, but they are so heavily invested in manipulation and crowd management that it is impossible to tell.

We look forward to reading, and writing upon, whatever non-crazy material is thrown our way.




The Saber-Tooth Curriculum

A few months ago, frequent commenter Red Five noted a pretty much forgotten book, The Saber-Tooth Curriculum. Written in 1939 by the mythical J. Abner Peddiwell – the creation of education professor H. R. W. Benjamin – the book is a series of drunken lectures on the nature of education during the paleolithic era. That education supposedly included lessons such as saber-tooth-tiger-scaring-with-fire, long after saber-tooth tigers had disappeared, and so on.

The book is crazily satirical, happily takes shots at everybody, and it holds up well. Maybe not well enough to bother hunting out – it is difficult to sustain such a parody for 100+ pages – but The Saber-Tooth Curriculum is clever and pretty funny. Surprisingly so, since humour, particularly topical humour, tends to date quickly.

Below is our favourite passage from the book, concerned with the establishment of university courses for teachers, and the introduction of professors of paleolithic education.


The crude, naive work of the education professors was regarded with contempt by the subject-matter specialists. It was inevitable that a man who had devoted a lifetime of productive scholarship or systematic speculation to such a problem as The Mystical Element in Sputtering Firebrands as Applied to Tiger-Whiskers or Variations in Thumb-Holds for Grabbing Fish Headed Outward from the Grabber at an Angle of Forty-Five Degrees Plus or Minus Three should be contemptuous of pseudo scholars who were merely trying to show students how to teach.

The academic contempt for pedagogy had a good effect on the education professors. Stung by justified references to their low cultural status, they resolved to make their discipline respectable. With a magnificent display of energy and self-denial, they achieved this goal. First, they organized their subject systematically, breaking it down into respectably small units, erecting barriers to keep professors conventionally isolated from ideas outside their restricted areas, and demanding specialization and more specialization in order to achieve the narrow knowledge and broad ignorance which the paleolithic university demanded of its most truly distinguished faculty members.

Second, they required all members of their group to engage in scientific research in education by counting and measuring quantitatively everything related to education which could be counted and measured. It was here that the professors of education showed the greatest courage and ingenuity. They confronted almost insuperable obstacles in the fact that education dealt with the changing of human minds, a most complex phenomenon. The task of measuring a learning situation involving an unknown number of factors continually modifying each other at unknown rates of speed and with unknown effects was a tremendous one, but the professors did not hesitate to attack it.

Finally, the professors of education worked for academic respectability by making their subject hard to learn. This, too, was a difficult task, but they succeeded admirably by imitating the procedures of their academic colleagues. They organized their subject logically. This necessarily resulted in their giving the abstract and philosophical courses in education first, delaying all practical work in the subject until the student was thoroughly familiar with the accustomed verbalizations of the craft and, thereby, immunized against infection from new ideas. They adopted the lecture method almost exclusively and labored with success to make it an even duller instrument of instruction than it was in the fields of ichthyology, equinology and defense engineering. They developed a special terminology for their lectures until they were as difficult to understand as any in the strictly cultural fields.

Thus the subject of education became respectable. It had as great a variety of specialists as any field. Some of its professors tried to cover the whole area of the psychology of learning, it is true, but most of them confined their efforts to some more manageable topic like the psychology of learning the preliminary water approach in fish-grabbing. Its research workers were so completely scientific that they could take a large error in the measurement of what they thought maybe was learning in a particular situation and refine it statistically until it seemed to be almost smaller and certainly more respectable than before. Its professors could lecture on modern activity methods of instruction with a scholarly dullness unequalled even by professors of equicephalic anatomy. Their cultured colleagues who had once treated them with contempt were now forced to regard them with suspicious but respectful envy. They had arrived academically.


What Does “Technology” Mean?

To be more precise, what does “digital technology” mean and, precisely as possible, how is Digital Technology X used in Year Y of schooling?

It is now impossible, of course, to write a document on education without genuflecting to the God of Technology. The repetitious chanting of “technology”, like a wired Tibetan monk, is the way people with no sense of the past or the present indicate how hip they are with the future. But, what do they mean? What technology are they talking about? It is a serious question, to which we only vaguely know the answer. We want help.

Of course by “technology”, the Education Experts are never intending to refer to something like blackboards and chalk. They would not even recognise such primitive devices as products of technology, although of course they are. No, what the EE mean by “technology” is electronic devices, mostly computers and computer programs, and preferably devices that are internetted. So, calculators and electronic whiteboards and Mathletics and Reading Eggs and iPads, and so forth.

The question is, precisely how are these devices used in specific classrooms? For example, are calculators used in Year 5 to perform arithmetic calculations, or to check calculations that have been done by hand? Is Mathletics used in Year 7 to teach ideas or to test knowledge and/or skills?

The same question applies to all subjects. Are word processors used in Year 6 to check and/or teach spelling and grammar? Are iPads used in Year 8 to check the definitions of words?

We want to know as much as possible, and as specifically as possible, what electronic gizmos are being used, and with whom and how.

RatS 10: Adam Curtis’s CGYOoMH

Well, to be more accurate, this is a WatS, since it’s a BBC series.

Adam Curtis is a unique, brilliant filmmaker, exploring the psychology and politics of modern society like no one else. Two previous series, The Power of Nightmares and The Century of the Self, are musts. Curtis now has a new series, Can’t Get You Out of My Head: An Emotional History of the Modern World. It is viewable on BBC iPlayer (with VPN trickery) and, at least for now, here. It is great.

MitPY 13: Trigonometry and Wolfram Alpha

This MitPY comes from frequent commenter, John Friend:

Dear Colleagues,

I gave a CAS-FREE question to my Specialist students whose first part was to solve (exactly) the equation \boldsymbol{\cot 2x = \sec x}. I solved it two different ways and got two different answers that are equivalent. I’ve attached my calculations.

I checked my answers using Mathematica, which led to my question: Mathematica gives a third different but equivalent answer (scroll down to real solutions). How has Mathematica got this answer?

It may be that Mathematica ‘used’ my Method 2, got my tan answer and then for some arcane reason ‘manipulated’ this answer into the one it finally gives. If so, I can ascribe the answer to a Mathematica quirk. But it may be that Mathematica is using a method unclear to me that leads to its answer. If so, I’m curious.

Any thoughts are appreciated.

Click to access Calculations.pdf
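For readers without the PDF to hand, here is one standard route to a solution, presumably close in spirit to one of the two methods mentioned:

```latex
\begin{align*}
\cot 2x = \sec x
\;&\Longrightarrow\; \cos 2x \cos x = \sin 2x = 2\sin x\cos x \\
&\Longrightarrow\; \cos x\,(\cos 2x - 2\sin x) = 0.
\end{align*}
```

Since \boldsymbol{\sec x} demands \boldsymbol{\cos x \neq 0}, we need \boldsymbol{\cos 2x = 2\sin x}, that is \boldsymbol{1 - 2\sin^2 x = 2\sin x}. The quadratic gives \boldsymbol{\sin x = \tfrac{-1 \pm \sqrt{3}}{2}}, of which only \boldsymbol{\sin x = \tfrac{\sqrt{3}-1}{2}} is possible.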

The VCAA Draft and its Third Rail

We’ve looked a little more closely at VCAA’s Draft for the new mathematics VCE subjects. Yes, the time for feedback has ended, unless it hasn’t: the MAV are offering a Zoom session TODAY (Thursday 25/3) for members. God knows how or why. But in any case, it’ll be a while before VCAA cements the thing in place: plenty of time to ignore everyone’s suggestions.

The following are our thoughts on the Draft and Overview. They will be brief and disorganised, since there is no point in doing more; as we wrote, the content doesn’t matter as much as the fact that, whatever the content, VCAA will undoubtedly screw it up. Still, there are some clear and depressing points to be made. We haven’t paid much specific attention to what is new nonsense, and what is the same old nonsense; nonsense is nonsense.


  • The draft looks like a primary school book report. Someone at VCAA really should learn \LaTeX.
  • “Computational Thinking” is meaningless buzzery, and will be endemic, insidious and idiotic. It will poison everything. Every step of Methods and Specialist is subject to the scrutiny of Outcome 3:

“On completion of this unit the student should be able to apply computational thinking and use numerical, graphical, symbolic and statistical functionalities of technology to develop mathematical ideas, produce results and carry out analysis in practical situations requiring problem-solving, modelling or investigative techniques or approaches.”

“Statistical functionalities of technology”. And, there’s way more:

“key elements of algorithm design: sequencing, decision-making, repetition, and representation including the use of pseudocode.”

“use computational thinking, algorithms, models and simulations to solve problems related to a given context”

“the role of developing algorithms and expressing these through pseudocode to help determine and understand mathematical ideas and results”

“the purpose and effect of sequencing, decision-making and repetition statements on relevant functionalities of technology, and their role in the design of algorithms and simulations”

“design and implement simulations and algorithms using appropriate functionalities of technology”

This will all be the same aimless, pseudo-exploratory, CAS-drenched garbage that currently screws VCE, but much, much worse. Anybody who signs off on this idiocy should hang their head in shame.

  • CAS shit will now be worse than ever.
  • There should be no CAS exam, at all.
  • There should be no bound notes permitted in any exam.
  • Don’t write “technology”. It is pompous and meaningless. If you mean “CAS” then write “CAS”.
  • SACs have always been shit and will always be shit. The increased weight on them is insane.
  • The statistics is the same pointless bullshit it always was.
  • The presence of “proof” as a topic in Specialist highlights the anti-mathematical insanity of VCAA and ACARA curricula: proof has zero existence elsewhere. Much of what appears in the proof topic could naturally and engagingly and productively be taught at much lower levels. But of course, that would get in the way of VCAA’s constructivist fantasy, now with New and Improved Computational Thinking.



  • Not including integration by substitution is still and will always be the most stupid aspect of Methods.
  • Dilations must be understood as expressed both “parallel to an axis” and “from an axis”? But not in terms of the direction the damn points are moving? Cute.
  • The definition of independent events is wrong.
  • The demand that, for the composition \boldsymbol{f\circ g}, the range of \boldsymbol{g} must be a subset of the domain of \boldsymbol{f} is as pedantic and as pointless as ever.
  • “literal equations” is the kind of blather that only a maths ed clown could think has value.
  • The derivative of the inverse is still not in the syllabus, and everyone will still cheat and use it anyway.
  • “trapezium rule” is gauche but, more importantly, what is the purpose of teaching such integral approximation here? Yes, one can imagine a reasonable purpose, but we’ll lay odds there is no such purpose here.



  • The killing of mechanics is a crime.
  • The inclusion of logic and proof and the discrete topics could be good. But it won’t be. It will be shallow and formulaic and algorithmised, and graded in a painfully pedantic manner. Just imagine, for example, how mathematical induction will be assessed on exams: “Students often wrote \boldsymbol{n} instead of \boldsymbol{k}. Students should be aware of the proper use of these variables.”
  • There is no value here in “proof by contrapositive”, and it is confusing. Proof by contradiction suffices.
  • They’re really including integration by parts? Incredible.
  • The inclusion of cross products and plane equations makes some sense.