Our suspicion is that, at least in Victoria, the answer is “no”. We haven’t thought hard about it, however. So, while we (try to find time to) think some more, we’d be interested in what others have to say.
Very quickly, here are the arguments we see for opening schools in Victoria:
- Federal health officials suggest schools are safe.
- The NSW study (not yet peer-reviewed) suggests schools are safe.
- Year 12 students are getting seriously dicked around.
Here are the arguments we see for keeping schools in Victoria closed:
- ScoMoFo is an idiot.
- Dumbo Dan makes ScoMoFo look smart.
- Daniel Andrews and his Chief Medical Officer are not idiots.
- No one has a real sense of what will happen when restrictions ease.
- The kids (P-11) miss a term of school? Big deal.
Dan Tehan, the Federal minister for screwing up education, has announced a rescue package for Australia’s universities. This was clearly necessary, since the universities are no longer in a position to fleece international students. The package guarantees funding for the universities, and introduces a range of cheap six-month courses in “areas considered national priorities”.
The government’s package is “unashamedly focused on domestic students”. That was inevitable since:
a) the government, and Tehan in particular, doesn’t give a stuff about international students;*
And, what of these “priority” courses? According to the ABC,
The Government said prices would be slashed for six-month, remotely delivered diplomas and graduate certificates in nursing, teaching, health, IT and science provided by universities and private tertiary educators.
OK, so ignoring all the other nonsense, we have a few questions about those six-month online teaching diplomas:
- Will such a diploma entitle the bearer to teach?
- If not, then what is it good for?
- If so, then what is a school to do with the mix of 6-month diploma-qualified applicants and the standard 24-month Masters-qualified applicants?
- And, if so, what does that tell us of the intrinsic worth of those standard 24-month Masters?
To be clear, we have no doubt that six months is plenty sufficient for the initial training of a teacher, and indeed is at least five months too many. We also have no doubt that a diploma-trained teacher has the same chance to be a good teacher as someone who has suffered a Masters. They have a better chance, in fact, since there will have been less time to pervert natural instincts and feelings and techniques with poisonous edu-babble.
But, good or bad, who is going to give these diploma teachers a shot? Then, if the teachers should be and are given a shot, who is going to address the contradiction, the expensive and idiotic orthodoxy of demanding two year post-grad teaching degrees?
*) Or anyone, but international students are near the bottom.
I’ve been busy the last couple of days, and will be for the foreseeable future, since my girlfriend and I have taken our two young children out of school.
I have informed my parent friends of this decision, but I am not advocating that they, or anyone, follow our lead. My girlfriend and I are lucky in that we are financially secure (for now), and are currently freer of work than we might otherwise be.* It is easy for us to bring the kids home, and we could see no good argument against it. Other parents are much less fortunate, and may have a very difficult decision ahead, very soon. I really feel for them, and for everyone dealing with this mess.
This brings up a general and hugely important question: should schools stay open? Honestly, I have no idea. It is an aspect of Australian discussion that I have been trying, and failing, to get my head around. It seems that the main argument for keeping schools open is simply as a childminding service, so that the oldies don’t do the minding and the doctors and the nurses can get on with running themselves ragged. Is that a sufficient argument? I’m sceptical, but I don’t feel confident to say “no”.
Here are two links to articles discussing the matter (in Australia), neither of which I either vouch for or reject:
Why Australia is not shutting schools (The Guardian)
No, Australia is not putting teachers in the coronavirus firing line (The Conversation)
I’m open to people’s thoughts. But, I’ll just add one thing. Prime Minister Scott Morrison has threatened private schools that might close, and he has expressed his confidence in the decision to keep kids at school:
I’m telling you that, as a father, I’m happy for my kids to go to school. There’s only one reason your kids shouldn’t be going to school and that is if they are unwell.
I wonder if all of Morrison’s Liberal colleagues agree.
Below are two “units” (scenarios) used in the PISA 2012 testing of mathematics. The units appeared in this collection of test questions and sample questions, and appear to be the most recent questions publicly available. Our intention is for the units to be read in conjunction with this post, and see also here, but of course readers are free to comment here as well. The two units below are, in our estimation, the most difficult or conceptually involved of the PISA 2012 units publicly available; most questions in most other units are significantly more straightforward.
Here’s an interesting tidbit: PISA’s mathematics testing doesn’t test mathematics. Weird, huh? Who knew?
Well, we kinda knew. Trustworthy colleagues had suggested to us that PISA was slanted, but finding out the extent of that slant, like lying-on-the-ground slant, was genuinely surprising. (We’re clearly just too optimistic about the world of education.) Not that we had any excuse for being surprised; there were clues of mathematical crime in plain sight, and it was easy enough to locate the bodies.
The first clues are on PISA’s summary page on “Mathematics Performance”. The title is already a concern; qualifications and elaborations of “mathematics” usually indicate some kind of dilution, and “performance” sounds like a pretty weird elaboration. Perhaps “mathematics performance” might be dismissed as an eccentricity, but what follows cannot be so dismissed. Here is PISA’s summary of the meaning of “mathematical performance”:
Mathematical performance, for PISA, measures the mathematical literacy of a 15 year-old to formulate, employ and interpret mathematics in a variety of contexts to describe, predict and explain phenomena, recognising the role that mathematics plays in the world. The mean score is the measure. A mathematically literate student recognises the role that mathematics plays in the world in order to make well-founded judgments and decisions needed by constructive, engaged and reflective citizens.
The alarms are set off by “mathematical literacy”, a pompous expression that promises more than, while signalling we’ll be getting much less than, straight mathematics. All doubt is then ended with the phrase “the role that mathematics plays in the world”, which is so fundamental that it is repeated verbatim.
What this sums to, of course, is numeracy, the noxious weed that inevitably chokes everything whenever there’s an opportunity to discuss the teaching of mathematics. What this promises is that, akin to NAPLAN, PISA’s test of “mathematical performance” will centre on shallow and contrived scenarios, presented with triple the required words, and demanding little more than simple arithmetic. Before investigating PISA’s profound new world, however, there’s another aspect of PISA that really could do with a whack.
We have been told that the worldly mathematics that PISA tests is needed by “constructive, engaged and reflective citizens”. Well, there’s nothing like irrelevant and garishly manipulative salesmanship to undermine what you’re selling. The puffing up of PISA’s “world” mathematics has no place in what should be a clear and dispassionate description of the nature of the testing. Moreover, even on its own terms, the puffery is silly. The whole point of mathematics is that it is abstract and transferable, that the formulas and techniques illustrated with one setting can be applied in countless others. Whatever the benefits of PISA’s real world mathematics for constructive, engaged and reflective citizens, there will be the exact same benefits for destructive, disengaged psychopaths. PISA imagines Florence Nightingale calculating drip rates? We imagine a CIA torturer calculating drip rates.
PISA’s flamboyant self-promotion seems part and parcel of its reporting. Insights and Interpretations, PISA’s summary of the 2018 test results, comes served with many flavours of Kool-Aid. It includes endless fussing about “the digital world” which, we’re told, “is becoming a sizeable part of the real world”. Reading has changed, since it is apparently “no longer mainly about extracting information”. And teaching has changed, because there’s “the race with technology”. The document wallows in the growth mindset swamp, and on and on. But not to fear, because PISA, marvellous PISA, is on top of it, and has “evolved to better capture these demands”. More accurately, PISA has evolved to better market itself clothed in modern educational fetishism.
Now, to the promised crimes. The PISA test is administered to 15-year-old students (typically Year 9 or, more often, Year 10 in Australia). What mathematics, then, does PISA consider worth asking these fifteen-year-olds? PISA’s test questions page directs to a document containing questions from the PISA 2012 test, as well as sample questions and questions from earlier PISAs; these appear to be the most recent questions made publicly available, and are presumably representative of PISA 2018. In total, the document provides eleven scenarios or “units” from the PISA 2012 test, comprising twenty-six questions.
To illustrate what is offered in those twenty-six questions from PISA 2012, we have posted two of the units here, and a third unit here. It is also not difficult, however, to indicate the general nature of the questions. First, as evidenced by the posted units, and the reason for posting them elsewhere, the questions are long and boring; the main challenge of these units is to suppress the gag reflex long enough to digest them. As for the mathematical content, as we flagged, there is very little; indeed, there is less mathematics than there appears, since students are permitted to use a calculator. Predictably, every unit is a “context” scenario, without a single straight mathematics question. Then, for about half of the twenty-six questions, we would categorise the mathematics required to be somewhere between easy and trivial, involving a very simple arithmetic step (with calculator) or simple geometric idea, or less. About a quarter of the questions are computationally longer, involving a number of arithmetic steps (with calculator), but contain no greater conceptual depth. The remaining questions are in some sense more conceptual, though that “more” should be thought of as “not much more”. None of the questions could be considered deep, or remotely interesting. Shallowness aside, the breadth of mathematics covered is remarkably small. These are fifteen-year-old students being tested, but no geometry is required beyond the area of a rectangle, Pythagoras’s theorem and very simple fractions of a circle; there is no trigonometry or similarity; there is no probability; there are no primes or powers or factorisation; there are no explicit functions, and the only implicit functional behaviour is linear.
D = dv/(60n).
(The meaning of the variables and the formula needn’t concern us here, although we’ll note that it takes a special type of clown to employ an upper case D and a lower case d in the same formula.)
There are two questions on this equation, the first asking for the change in D if n is doubled. (There is some WitCHlike idiocy in the suggested grading for the question, but we’ll leave that as a puzzle for the reader.) For the second question (labelled “Question 3” for God knows what reason), students are given specific, simple values of D, d and n, and they are required to calculate v (with a calculator). That’s it. That is the sum total of the algebra on the twenty-six questions, and that is disgraceful.
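For what it’s worth, the entirety of the algebra demanded by the two questions can be written in a couple of lines (using the formula’s own symbols; the specific values given in the second question are not reproduced here):

```latex
% First question: what happens to D if n is doubled?
D = \frac{dv}{60n}
\qquad\Longrightarrow\qquad
\frac{dv}{60(2n)} \;=\; \frac{1}{2}\cdot\frac{dv}{60n} \;=\; \frac{D}{2}.

% "Question 3": rearrange to find v from the given values of D, d and n.
D = \frac{dv}{60n}
\qquad\Longrightarrow\qquad
v = \frac{60nD}{d}.
```

That is, one substitution and one rearrangement, both of a single linear formula.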
Algebra is everything in mathematics. Algebra is how we name the quantity we’re after, setting the stage for its capture. Algebra is how we signify pattern, allowing us to hunt for deeper pattern. Algebra is how we indicate the relationship between quantities. Algebra is how Descartes captured geometry, and how Newton and Leibniz captured calculus.
It is not difficult to guess why PISA sidelines algebra, since it is standard, particularly from numeracy fanatics, to stereotype algebra as abstract, as something only within mathematics. But of course, even from PISA’s blinkered numeracy perspective, this is nonsense. You want to think about mathematics in the world? Then the discovery and the analysis of patterns, and the analysis of relationships, of functions is the heart of it. And what makes the heart beat is algebra.
Does PISA offer anything of value? Well, yeah, a little. It is a non-trivial and worthwhile skill to be able to extract intrinsically simple mathematics from a busy and wordy scenario. But it’s not that important, and it’s hardly the profound “higher order” thinking that some claim PISA offers. It is a shrivelled pea of an offering, which completely ignores vast fields of mathematics and mathematical thought.
PISA’s disregard of algebra is ridiculous and shameful, the final stake in PISA’s thoroughly nailed coffin. It demonstrates that PISA isn’t “higher” or “real”, it is just other, and it is an other we would all be much better off without.
The VCAA is reportedly planning to introduce Foundation Mathematics, a new, lower-level year 12 mathematics subject. According to Age reporter Madeleine Heffernan, “It is hoped that the new subject will attract students who would not otherwise choose a maths subject for year 12 …”. Which is good, why?
Predictably, the VCAA is hell-bent on solving the wrong problem. It simply doesn’t matter whether more students continue with mathematics in Year 12. What matters is that so many students learn bugger all mathematics in the previous twelve years. And why should anyone believe that, at that final stage of schooling, one more year of Maths-Lite will make any significant difference?
The problem with Year 12 that the VCAA should be attempting to solve is that so few students are choosing the more advanced mathematics subjects. Heffernan appears to have interviewed AMSI Director Tim Brown, who noted the obvious, that introducing the new subject “would not arrest the worrying decline of students studying higher level maths – specialist maths – in year 12.” (Tim could have added that Year 12 Specialist Mathematics is also a second rate subject, but one can expect only so much from AMSI.)
It is not clear that anybody other than the VCAA sees any wisdom in their plan. Professor Brown’s extended response to Heffernan is one of quiet exasperation. The comments that follow Heffernan’s report are less quiet and are appropriately scathing. So who, if anyone, did the VCAA find to endorse this distracting silliness?
But, is it worse than silly? VCAA’s new subject won’t offer significant improvement, but could it make matters worse? According to Heffernan, there’s nothing to worry about:
“The new subject will be carefully designed to discourage students from downgrading their maths study.”
Maybe. We doubt it.
Ms. Heffernan appears to be a younger reporter, so we’ll be so forward as to offer her a word of advice: if you’re going to transcribe tendentious and self-serving claims provided by the primary source for and the subject of your report, it is accurate, and prudent, to avoid reporting those claims as if they were established fact.
The NAPLAN Numeracy Test Test is intended for education academics and education reporters. The test consists of three questions:
Q1. Are you aware that “numeracy”, to the extent that it is anything, is different from arithmetic and much less than solid school mathematics?
Q2. Do you regard it important to note and to clarify these distinctions?
Q3. Are you aware of the poverty in NAPLAN testing numeracy rather than mathematics?
The test is simple, and the test is routinely failed. NAPLAN is routinely represented as testing the “basics”, which is simply false. As a consequence, the interminable conflict between “inquiry” and “basics” has been distorted beyond sense. (A related and similarly distorting falsity is the representation of current school mathematics texts as “traditional”.) This framing of NAPLAN leaves no room for the plague-on-both-houses disdain which, we’d argue, is the only reasonable position.
Most recently this test was failed, and dismally so, by the writers of the Interim Report on NAPLAN, which was prepared for the NSW state government and was released last week. The Interim Report is short, its purpose being to prepare the foundations for the final report to come, to “set out the major concerns about NAPLAN that we have heard or already knew about from our own work and [to] offer some preliminary thinking”. The writers may have set out to do this, but either they haven’t been hearing or they haven’t been listening.
The Interim Report considers a number of familiar and contentious aspects of NAPLAN: delays in reporting, teaching to the test, misuse of test results, and so on. Mostly reasonable concerns, but what about the tests themselves, what about concerns over what the tests are testing? Surely the tests’ content is central? On this, however, at least before limited correction, the Report implies that there are no concerns whatsoever.
The main section of the Report is titled Current concerns about NAPLAN, which begins with a subsection titled Deficiencies in tests. This subsection contains just two paragraphs. The first paragraph raises the issue that a test such as NAPLAN “will” contain questions that are so easy or so difficult that little information is gained by including them. However, “Prior experimental work by ACARA [the implementers of NAPLAN] showed that this should not be so.” In other words, the writers are saying “If you think ACARA got it wrong then you’re wrong, because ACARA told us they got it right”. That’s just the way one wishes a review to begin, with a bunch of yes men parroting the organisation whose work they are supposed to be reviewing. But, let’s not dwell on it; the second paragraph is worse.
The second “deficiencies” paragraph is concerned with the writing tests. Except it isn’t; it is merely concerned with the effect of moving NAPLAN online on the analysis of students’ tests. There’s not a word on the content of the tests. True, in a later, “Initial thinking” section the writers have an extended discussion about issues with the writing tests. But why are these issues not front and centre? Still, it is not our area and so we’ll leave it, comfortable in our belief that ACARA is mucking up literacy testing and will continue to do so.
And that’s it for “deficiencies in tests”, without a single word about suggested or actual deficiencies of the numeracy tests. Anywhere. Moreover, the term “arithmetic” never appears in the Report, and the word “mathematics” appears just once, as a semi-synonym for numeracy: the writers echo a suggested deficiency of NAPLAN, that one effect of the tests may be to “reduce the curriculum, particularly in primary schools, to a focus on literacy/English and numeracy/mathematics …”. One can only wish it were true.
How did this happen? The writers boast of having held about thirty meetings in a four-day period and having met with about sixty individuals. Could it possibly be the case that not one of those sixty individuals raised the issue that numeracy might be an educational fraud? Not a single person?
The short answer is “yes”. It is possible that the Report writers were warned that “numeracy” is snake oil and that testing it is a foolish distraction, with the writers then, consciously or unconsciously, simply filtering out that opinion. But it is also entirely possible that the writers heard no dissenting voice. Who did the writers choose to meet? How were those people chosen? Was the selection dominated by the predictable maths ed clowns and government hacks? Was there consultation with a single competent and attuned mathematician? It is not difficult to guess the answers.
The writers have failed the test, and the result of that failure is clear. The Interim Report is nonsense, setting the stage for a woefully misguided review that in all probability will leave the ridiculous NAPLAN numeracy tests still firmly in place and still just as ridiculous.
We’re not particularly looking to blog about censorship. In general, we think the problem (in, e.g., Australia and the US) is overhyped. The much greater problem is self-censorship, where the media and the society at large can’t think or write about what they fail to see; so, for example, a major country can have a military coup, but no one seems to notice. Sometimes, however, the issue is close enough to home and the censorship is sufficiently blatant, that it seems worth noting.
Greg Ashman, who we had cause to mention recently, has been censored in a needless and heavy-handed manner by Sasha Petrova, the education editor of The Conversation. The details are discussed by Ashman here, but it is easy to give the story in brief.
Kate Noble of the Mitchell Institute wrote an article for The Conversation, titled Children learn through play – it shouldn’t stop at pre-school. As the title suggests, Noble was arguing for more play-based learning in the early years of primary school. Ashman then added a (polite and referenced and carefully worded) comment, noting Noble’s failure to distinguish between knowledge that is more susceptible or less susceptible to play-based learning, and directly querying one of Noble’s examples, the possible learning benefits (or lack thereof) of playing with water. Ashman’s comment, along with the replies to his comment, was then deleted. When Ashman emailed Petrova, querying this, Petrova replied:
“Sure. I deleted [Ashman’s comment] as it is off topic. The article doesn’t call for less explicit instruction, nor is there any mention of it. It calls for more integration of play-based learning in early years of school to ease the transition to formal instruction – not that formal instruction (and even here it doesn’t specify that formal means “explicit”) must be abolished.”
Subsequently, it appears that Petrova has also deleted the puzzled commentary on the original deletion. And, who knows what else she has deleted? Such is the nature of censorship.
In general we have a lot of sympathy for editors, such as Petrova, of public fora. It is very easy to err one way or the other, and then to be hammered by Team A or Team B. Indeed, and somewhat ironically, Ashman had a post just a week ago that was in part critical of The Conversation’s new policy towards climate denialist loons; in that instance we thought Ashman was being a little tendentious and our sympathies were much more with The Conversation’s editors.
But, here, Petrova has unquestionably screwed up. Ashman was adding important, directly relevant and explicitly linked qualification to Noble’s article, and in a properly thoughtful and collegial manner. Ashman wasn’t grandstanding, he was contributing in good faith. He was conversing. Moreover, Petrova’s stated reason for censoring Ashman is premised on a ludicrously narrow definition of “topic”, which even on its own terms fails here, and in any case has no place in academic discourse or public discourse.
Petrova, and The Conversation, owes Ashman an apology.