This adventure game comes courtesy of Victoria’s Department of Education and Training. Click on the graphic, or go here. (Don’t try to do it all. You will die.) If you can be bothered, you can also complete a survey here.
A few days ago we received an email from Aaron, a primary school teacher in South Australia. Apparently motivated by some of our posts, and our recent thumping of PISA in particular, Aaron wrote of his confusion about what type of mathematics teaching is valuable, and he asked for our opinion. Though we are less familiar with primary teaching, of course we intend to respond to Aaron. (As readers of this blog should know by now, we’re happy to give our opinion on any topic, at any time, whether or not there has been a request to do so, and whether or not we have a clue about the topic. We’re generous that way.) It seemed to us, however, that some of the commenters on this blog may be better placed to respond, and also that any resulting discussion may be of general interest.
With Aaron’s permission, we have reprinted his email, below, and readers are invited to comment. Note that Aaron’s query is on primary school teaching, and commenters may wish to keep that in mind, but the issues are clearly broader and all relevant discussion is welcome.
Good afternoon, my name is Aaron and I am a primary teacher based in South Australia. I have both suffered at the hands of terrible maths teachers in my life and had to line manage awful maths teachers in the past. I have returned to the classroom and am now responsible for turning students who loathe maths and have big challenges with it, into stimulated, curious and adventure seeking mathematicians.
Since I began following your blog some time ago, I have become increasingly concerned that I may not know what it is students need to do in maths after all!
I am a believer that desperately seeking to make maths “contextual and relevant” is a waste, and that learning maths for the sake of advancing intellectual curiosity and a capacity to analyse and solve problems should be reason enough to do maths. I had not recognised the dumbing-down effect of renaming maths as numeracy, and its attendant repurposing of school as a job-skills training ground (similarly with STEM!), until I started reading your work. Your recent post on PISA crap, highlighting how the questions were only testing low-level mathematics but disguising that with lots of words, was also really important in terms of helping me assess my readiness to teach. I have to admit I thought having students uncover the maths in word problems was important, and have done a lot of work around that in the past.
I would like to know which practices you believe constitute great teaching in the primary classroom. I get the sense it involves not much word-problem work, but rather operating from the gradual release of responsibility (I do – we do – you do) explicit teaching model.
I would really value your thoughts around this.
Below are two “units” (scenarios) used in the PISA 2012 testing of mathematics. The units appeared in this collection of test questions and sample questions, and appear to be the most recent questions publicly available. Our intention is that the units be read in conjunction with this post (see also here), but of course readers are free to comment here as well. The two units below are, in our estimation, the most difficult or conceptually involved of the publicly available PISA 2012 units; most questions in most other units are significantly more straightforward.
Here’s an interesting tidbit: PISA’s mathematics testing doesn’t test mathematics. Weird, huh? Who knew?
Well, we kinda knew. Trustworthy colleagues had suggested to us that PISA was slanted, but finding out the extent of that slant, like lying-on-the-ground slant, was genuinely surprising. (We’re clearly just too optimistic about the world of education.) Not that we had any excuse for being surprised; there were clues of mathematical crime in plain sight, and it was easy enough to locate the bodies.
The first clues are on PISA’s summary page on “Mathematics Performance”. The title is already a concern; qualifications and elaborations of “mathematics” usually indicate some kind of dilution, and “performance” sounds like a pretty weird elaboration. Perhaps “mathematics performance” might be dismissed as an eccentricity, but what follows cannot be so dismissed. Here is PISA’s summary of the meaning of “mathematical performance”:
Mathematical performance, for PISA, measures the mathematical literacy of a 15 year-old to formulate, employ and interpret mathematics in a variety of contexts to describe, predict and explain phenomena, recognising the role that mathematics plays in the world. The mean score is the measure. A mathematically literate student recognises the role that mathematics plays in the world in order to make well-founded judgments and decisions needed by constructive, engaged and reflective citizens.
The alarms are set off by “mathematical literacy”, a pompous expression that promises more than, while signalling we’ll be getting much less than, straight mathematics. All doubt is then ended with the phrase “the role that mathematics plays in the world”, which is so fundamental that it is repeated verbatim.
What this sums to, of course, is numeracy, the noxious weed that inevitably chokes everything whenever there’s an opportunity to discuss the teaching of mathematics. What this promises is that, akin to NAPLAN, PISA’s test of “mathematical performance” will centre on shallow and contrived scenarios, presented with triple the required words, and demanding little more than simple arithmetic. Before investigating PISA’s profound new world, however, there’s another aspect of PISA that really could do with a whack.
We have been told that the worldly mathematics that PISA tests is needed by “constructive, engaged and reflective citizens”. Well, there’s nothing like irrelevant and garishly manipulative salesmanship to undermine what you’re selling. The puffing up of PISA’s “world” mathematics has no place in what should be a clear and dispassionate description of the nature of the testing. Moreover, even on its own terms, the puffery is silly. The whole point of mathematics is that it is abstract and transferable, that the formulas and techniques illustrated with one setting can be applied in countless others. Whatever the benefits of PISA’s real world mathematics for constructive, engaged and reflective citizens, there will be the exact same benefits for destructive, disengaged psychopaths. PISA imagines Florence Nightingale calculating drip rates? We imagine a CIA torturer calculating drip rates.
PISA’s flamboyant self-promotion seems part and parcel of its reporting. Insights and Interpretations, PISA’s summary of the 2018 test results, comes served with many flavours of Kool-Aid. It includes endless fussing about “the digital world” which, we’re told, “is becoming a sizeable part of the real world”. Reading has changed, since it is apparently “no longer mainly about extracting information”. And teaching has changed, because there’s “the race with technology”. The document wallows in the growth mindset swamp, and on and on. But not to fear, because PISA, marvellous PISA, is on top of it, and has “evolved to better capture these demands”. More accurately, PISA has evolved to better market itself clothed in modern educational fetishism.
Now, to the promised crimes. The PISA test is administered to 15-year-old students (typically Year 9 or, more often, Year 10 in Australia). What mathematics, then, does PISA consider worth asking these fifteen-year-olds? PISA’s test questions page directs to a document containing questions from the PISA 2012 test, as well as sample questions and questions from earlier PISAs; these appear to be the most recent questions made publicly available, and are presumably representative of PISA 2018. In total, the document provides eleven scenarios or “units” from the PISA 2012 test, comprising twenty-six questions.
To illustrate what is offered in those twenty-six questions from PISA 2012, we have posted two of the units here, and a third unit here. It is not difficult, however, to indicate the general nature of the questions. First, as evidenced by the posted units, and the reason we posted them elsewhere, the questions are long and boring; the main challenge of these units is to suppress the gag reflex long enough to digest them. As for the mathematical content, as we flagged, there is very little; indeed, there is less mathematics than there appears, since students are permitted to use a calculator. Predictably, every unit is a “context” scenario, without a single straight mathematics question. Then, for about half of the twenty-six questions, we would categorise the mathematics required as somewhere between easy and trivial, involving a very simple arithmetic step (with calculator) or a simple geometric idea, or less. About a quarter of the questions are computationally longer, involving a number of arithmetic steps (with calculator), but contain no greater conceptual depth. The remaining questions are in some sense more conceptual, though that “more” should be thought of as “not much more”. None of the questions could be considered deep, or remotely interesting. Shallowness aside, the breadth of mathematics covered is remarkably small. These are fifteen-year-old students being tested, but no geometry is required beyond the area of a rectangle, Pythagoras’s theorem and very simple fractions of a circle; there is no trigonometry or similarity; there is no probability; there are no primes or powers or factorisation; there are no explicit functions, and the only implicit functional behaviour is linear.
The only algebra to appear in the twenty-six questions comes in a single unit, which hands students the formula

D = dv/(60n).
(The meaning of the variables and the formula needn’t concern us here, although we’ll note that it takes a special type of clown to employ an upper case D and a lower case d in the same formula.)
There are two questions on this equation, the first asking for the change in D if n is doubled. (There is some WitCHlike idiocy in the suggested grading for the question, but we’ll leave that as a puzzle for the reader.) For the second question (labelled “Question 3” for God knows what reason), students are given specific, simple values of D, d and n, and they are required to calculate v (with a calculator). That’s it. That is the sum total of the algebra in the twenty-six questions, and that is disgraceful.
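For concreteness, the two questions can be sketched in a few lines. The variable meanings below follow our reading of the unit (D in drops per minute, d in drops per mL, v in mL, n in hours), and the specific numbers are invented for illustration; they are not the test’s values.

```python
# PISA's drip-rate formula, D = d*v/(60*n).
# Our reading of the unit's variables: D is the drip rate (drops per
# minute), d the drop factor (drops per mL), v the volume (mL) and n
# the number of hours. The numbers below are invented for illustration.

def drip_rate(d, v, n):
    """D = dv/(60n)."""
    return d * v / (60 * n)

def volume(D, d, n):
    """The second question, rearranged: v = 60nD/d."""
    return 60 * n * D / d

d, v, n = 25, 240, 2                      # invented values

# First question: doubling n halves D, all else fixed.
assert drip_rate(d, v, 2 * n) == drip_rate(d, v, n) / 2

# Second question: given D, d and n, recover v.
D = drip_rate(d, v, n)
assert volume(D, d, n) == v
```

That, arithmetic with a rearranged linear formula, is the entirety of it.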
Algebra is everything in mathematics. Algebra is how we name the quantity we’re after, setting the stage for its capture. Algebra is how we signify pattern, allowing us to hunt for deeper pattern. Algebra is how we indicate the relationship between quantities. Algebra is how Descartes captured geometry, and how Newton and Leibniz captured calculus.
It is not difficult to guess why PISA sidelines algebra, since it is standard, particularly from numeracy fanatics, to stereotype algebra as abstract, as something only within mathematics. But of course, even from PISA’s blinkered numeracy perspective, this is nonsense. You want to think about mathematics in the world? Then the discovery and the analysis of patterns, and the analysis of relationships, of functions is the heart of it. And what makes the heart beat is algebra.
Does PISA offer anything of value? Well, yeah, a little. It is a non-trivial and worthwhile skill to be able to extract intrinsically simple mathematics from a busy and wordy scenario. But it’s not that important, and it’s hardly the profound “higher order” thinking that some claim PISA offers. It is a shrivelled pea of an offering, which completely ignores vast fields of mathematics and mathematical thought.
PISA’s disregard of algebra is ridiculous and shameful, the final stake in PISA’s thoroughly nailed coffin. It demonstrates that PISA isn’t “higher” or “real”, it is just other, and it is an other we would all be much better off without.
The VCAA is reportedly planning to introduce Foundation Mathematics, a new, lower-level year 12 mathematics subject. According to Age reporter Madeleine Heffernan, “It is hoped that the new subject will attract students who would not otherwise choose a maths subject for year 12 …”. Which is good, why?
Predictably, the VCAA is hell-bent on not solving the wrong problem. It simply doesn’t matter that not more students continue with mathematics in Year 12. What matters is that so many students learn bugger all mathematics in the previous twelve years. And why should anyone believe that, at that final stage of schooling, one more year of Maths-Lite will make any significant difference?
The problem with Year 12 that the VCAA should be attempting to solve is that so few students are choosing the more advanced mathematics subjects. Heffernan appears to have interviewed AMSI Director Tim Brown, who noted the obvious, that introducing the new subject “would not arrest the worrying decline of students studying higher level maths – specialist maths – in year 12.” (Tim could have added that Year 12 Specialist Mathematics is also a second rate subject, but one can expect only so much from AMSI.)
It is not clear that anybody other than the VCAA sees any wisdom in their plan. Professor Brown’s extended response to Heffernan is one of quiet exasperation. The comments that follow Heffernan’s report are less quiet and are appropriately scathing. So who, if anyone, did the VCAA find to endorse this distracting silliness?
But, is it worse than silly? VCAA’s new subject won’t offer significant improvement, but could it make matters worse? According to Heffernan, there’s nothing to worry about:
“The new subject will be carefully designed to discourage students from downgrading their maths study.”
Maybe. We doubt it.
Ms. Heffernan appears to be a younger reporter, so we’ll be so forward as to offer her a word of advice: if you’re going to transcribe tendentious and self-serving claims provided by the primary source for and the subject of your report, it is accurate, and prudent, to avoid reporting those claims as if they were established fact.
A few days ago the Sydney Morning Herald published yet another opinion piece on Australia’s terrific PISA results. The piece was by Richard Holden, a professor of economics at UNSW, and Adrian Piccoli, formerly a state Minister for Education and now director of the Gonski Institute at UNSW. Holden and Piccoli’s piece was titled
‘Back to basics’ is not our education cure – it’s where we’ve gone wrong
Oh, really? And what’s the evidence for that? The piece begins,
A “back to basics” response to the latest PISA results is wrong and ignores the other data Australia has spent more than 10 years obsessing about – NAPLAN. The National Assessment Program – Literacy and Numeracy is all about going back to basics ...
The piece goes on, arguing that the years of emphasis on NAPLAN demonstrate that Australia has concentrated upon and is doing fine with “the basics”, and at the expense of the “broader, higher-order skills tested by PISA”.
So, here’s our message:
Dear Professors Holden and Piccoli, if you are so ignorant as to believe NAPLAN and numeracy are about “the basics”, and if you can exhibit no awareness that the Australian Curriculum has continued the trashing of “the basics”, and if you are so stuck in the higher-order clouds as to be unaware of the lack of, and the critical need for, properly solid lower-order foundations, and if you can write an entire piece on PISA without a single use of the words “arithmetic” and “mathematics”, then please, please just shut the hell up and go away.
The NAPLAN Numeracy Test Test is intended for education academics and education reporters. The test consists of three questions:
Q1. Are you aware that “numeracy”, to the extent that it is anything, is different from arithmetic and much less than solid school mathematics?
Q2. Do you regard it important to note and to clarify these distinctions?
Q3. Are you aware of the poverty in NAPLAN testing numeracy rather than mathematics?
The test is simple, and the test is routinely failed. NAPLAN is routinely represented as testing the “basics”, which is simply false. As a consequence, the interminable conflict between “inquiry” and “basics” has been distorted beyond sense. (A related and similarly distorting falsity is the representation of current school mathematics texts as “traditional”.) This framing of NAPLAN leaves no room for the plague-on-both-houses disdain which, we’d argue, is the only reasonable position.
Most recently this test was failed, and dismally so, by the writers of the Interim Report on NAPLAN, which was prepared for the NSW state government and was released last week. The Interim Report is short, its purpose being to lay the foundations for the final report to come: to “set out the major concerns about NAPLAN that we have heard or already knew about from our own work and [to] offer some preliminary thinking”. The writers may have set out to do this, but either they haven’t been hearing or they haven’t been listening.
The Interim Report considers a number of familiar and contentious aspects of NAPLAN: delays in reporting, teaching to the test, misuse of test results, and so on. These are mostly reasonable concerns, but what about the tests themselves, what about concerns over what the tests are testing? Surely the tests’ content is central? On this, however, apart from one limited correction, the Report implies that there are no concerns whatsoever.
The main section of the Report is titled Current concerns about NAPLAN, which begins with a subsection titled Deficiencies in tests. This subsection contains just two paragraphs. The first paragraph raises the issue that a test such as NAPLAN “will” contain questions that are so easy or so difficult that little information is gained by including them. However, “Prior experimental work by ACARA [the implementers of NAPLAN] showed that this should not be so.” In other words, the writers are saying “If you think ACARA got it wrong then you’re wrong, because ACARA told us they got it right”. That’s just the way one wishes a review to begin, with a bunch of yes men parroting the organisation whose work they are supposed to be reviewing. But, let’s not dwell on it; the second paragraph is worse.
The second “deficiencies” paragraph is concerned with the writing tests. Except it isn’t; it is merely concerned with the effect of moving NAPLAN online on the analysis of students’ tests. There’s not a word on the content of the tests. True, in a later “Initial thinking” section the writers have an extended discussion of issues with the writing tests. But why are these issues not front and centre? Still, it is not our area and so we’ll leave it, comfortable in our belief that ACARA is mucking up literacy testing and will continue to do so.
And that’s it for “deficiencies in tests”, without a single word about suggested or actual deficiencies of the numeracy tests. Anywhere. Moreover, the term “arithmetic” never appears in the Report, and the word “mathematics” appears just once, as a semi-synonym for numeracy: the writers echo a suggested deficiency of NAPLAN, that one effect of the tests may be to “reduce the curriculum, particularly in primary schools, to a focus on literacy/English and numeracy/mathematics …”. One can only wish it were true.
How did this happen? The writers boast of having held about thirty meetings in a four-day period and having met with about sixty individuals. Could it possibly be the case that not one of those sixty individuals raised the issue that numeracy might be an educational fraud? Not a single person?
The short answer is “yes”. It is possible that the Report writers were warned that “numeracy” is snake oil and that testing it is a foolish distraction, with the writers then, consciously or unconsciously, simply filtering out that opinion. But it is also entirely possible that the writers heard no dissenting voice. Who did the writers choose to meet? How were those people chosen? Was the selection dominated by the predictable maths ed clowns and government hacks? Was there consultation with a single competent and attuned mathematician? It is not difficult to guess the answers.
The writers have failed the test, and the result of that failure is clear. The Interim Report is nonsense, setting the stage for a woefully misguided review that in all probability will leave the ridiculous NAPLAN numeracy tests still firmly in place and still just as ridiculous.
The PISA results were released on Tuesday, and Australians have been losing their minds over them. Which is admirably consistent: the country has worked so hard at losing minds over the last 20+ years, it seems entirely reasonable to keep on going.
We’ve never paid much attention to PISA. We’ve always had the sense that the tests were tainted in a NAPLANesque manner, and in any case we can’t imagine the results would ever indicate anything about Australian maths education that isn’t already blindingly obvious. As Bob Dylan (almost) sang, you don’t need a weatherman to know which way the wind is blowing.
And so it is with PISA 2018. Australia’s mathematical decline is undeniable, astonishing and entirely predictable. Indeed, for the NAPLANesque reasons suggested above, the decline in mathematics standards is probably significantly greater than is suggested by PISA. Greg Ashman raises the issue in this post.
So, how did this happen, and what are we to do? Unsurprisingly, there has been no reluctance from our glorious educational leaders to proffer warnings and solutions. AMSI, of course, is worrying their bone, whining for about the thirtieth time about unqualified teachers. The Lord of ACER thinks that Australia is focusing too much on “the basics”, at the expense of “deep understandings”. If only the dear Lord’s understanding were a little deeper.
Others suggest we should “focus systematically on student and teacher wellbeing”, whatever that means. Or, we should reduce teachers’ “audit anxiety”. Or, the problem is that “teachers [tend] to focus on content rather than student learning”. Or, the problem is a “behaviour crisis”. Or, we should have “increased scrutiny of university education degrees” and “support [students’] schooling at home”. And, we could introduce “master teachers”. But apparently “more testing is not the answer”. In any case, “The time for talk is over”, according to a speech by Minister Tehan.
Some of these suggestions are, of course, simply ludicrous. Others, and others we haven’t mentioned, have at least a kernel of truth, and a couple we can strongly endorse.
No institution we can see, however, no person we have read, seems ready to face up to the systemic corruption, to see the PISA results in the light of the fundamental perversion of mathematics education in Australia. Not a word we could see questioning the role of calculators and the fetishisation of their progeny. Not a note of doubt about the effect of computers. Not a single suggestion that STEM may not be an antidote but, rather, a poison. Barely a word on the “inquiry” swampland that most primary schools have become. And, barely a word on the loss of discipline, on the valuable and essential meanings of that word. What possible hope is there, then, for meaningful change?
We await PISA 2021 with unbated breath.
NAPLAN has been much in the news of late, with moves for the tests to go online while simultaneously there have been loud calls to scrap the tests entirely. And, the 2018 NAPLAN tests have just come and gone. We plan to write about all this in the near future, and in particular we’re curious to see if the 2018 tests can top 2017’s clanger. For now, we offer a little, telling tidbit about ACARA.
In 2014, we submitted FOI applications to ACARA for the 2012-2014 NAPLAN Numeracy tests. This followed a long and bizarre but ultimately successful battle to formally obtain the 2008-2011 tests, now available here: some, though far from all, of the ludicrous details of that battle are documented here. Our requests for the 2012-2014 papers were denied by ACARA, then denied again after ACARA’s internal “review”. They were denied once more by the Office of the Australian Information Commissioner. We won’t go into OAIC’s decision here, except to state that we regard it as industry-capture idiocy. We lacked the energy and the lawyers, however, to pursue the matter further.
Here, we shall highlight one hilarious component of ACARA’s reasoning. As part of their review of our FOI applications, ACARA was obliged under the FOI Act to consider the public interest arguments for or against disclosure. In summary, ACARA’s FOI officer evaluated the arguments for disclosure as follows:
- Promoting the objects of the FOI Act — 1/10
- Informing a debate on a matter of public importance — 1/10
- Promoting effective oversight of public expenditure — 0/10
Yes, the scoring is farcical and self-serving, but let’s ignore that.
ACARA’s FOI officer went on to “total” the public interest arguments in favour of disclosure. They obtained a “total” of 2/10.
We then requested an internal review, pointing out, along with much other nonsense, ACARA’s FOI officer’s dodgy scoring and dodgier arithmetic. The internal “review” was undertaken by ACARA’s CEO. His “revised” scoring was as follows:
- Promoting the objects of the FOI Act — 1/10
- Informing a debate on a matter of public importance — 1/10
- Promoting effective oversight of public expenditure — 0/10
And his revised total? Once again, 2/10.
These are the clowns in charge of testing Australian students’ numeracy.
The Victorian Minister for Education has announced that the state’s senior school curriculum will undergo a review. The stated focus of the review is to consider whether “there should be a more explicit requirement for students to meet minimum standards of literacy and numeracy …”. The review appears to be strongly supported by industry, with a representative of the Australian Industry Group noting that “many companies complained school leavers made mistakes in spelling and grammar, and could not do basic maths”.
Dumb and dumber.
First, let’s note that Victorian schools have 12 years (plus prep) to teach the 3 Rs. That works out to 4 years (plus prep/3) per R, yet somehow it’s not working. Somehow the standards are sufficiently low that senior students can scale an exhausting mountain of assignments and exams, and still too many students come out lacking basic skills.
Secondly, the Minister has determined that the review will be conducted by the VCAA, the body already responsible for Victorian education.
If the definition of insanity is doing the same thing over and over and expecting different results, then the definition of insane governance is expecting the arrogant clown factory responsible for years of educational idiocy to have any willingness or ability to fix it.