We *still* haven’t gotten to ACARA’s sparkly new curriculum. We do have David de Carvalho, however, to tell us all about its wonderfulness, and the wonderfulness of ACARA’s processes.

A couple of weeks ago, De Carvalho was interviewed by Geraldine Doogue on Radio National’s *Saturday Extra*. Of course, Doogue asked De Carvalho all the tough questions. You can listen yourself, at the link below. Doogue introduces the discussion and De Carvalho as follows:

*… primarily, the terms of reference for the new curriculum were to reduce the amount of content, the common ground being that all involved in education thought the previous curriculum was, quote, “a mile wide and an inch deep”. Well, my next guest argues that we now have a more teachable curriculum, that focusses on conceptual understanding … *

Below are some of De Carvalho’s compelling answers to Doogue’s intense probing. We could annotate it, and hyperlink it, but there’s probably no point: De Carvalho’s tenuous hold on reality is pretty well established. This’ll do.

*… we do go through a pretty exhaustive consultative process, with all the jurisdictional curriculum authorities and our colleagues in states and territories, as well as many more people, in terms of subject associations, academics and a public consultation process, which we went through last year. *

…

*People were very engaged … we were very encouraged by the response. Now some people would have thought that we were dismayed by some of the controversy that was caused, but we fully expected that some controversy would ensue. … It was a really interesting and sometimes difficult and stressful time, as we tried to get compromise and consensus, and listen to the experts. But we got there in the end, and ministers were very happy to approve the version that we put to them in April. *

…

*Decluttering has two aspects to it. Firstly there is just the cutting back of content. But then, as you referred to as the Marie Kondo treatment, there’s the tidying up of what’s left, to ensure that it’s more coherent, that there are better connections, that teachers can find what they want more easily. This is all part of making it a more teachable curriculum … *

*In your introduction you said the analogy of a mile wide and inch deep. And [teachers] felt under enormous pressure to cover content, to get through the content and move on to the next thing. But often that means that they could leave children behind, before they’ve really grasped the key concepts. And so the idea of the decluttering was to give teachers more time, to teach in greater depth, so that children can move on, into the next topic once they’ve fully understood the topic that they’ve just been taught. *

…

*Particularly mathematics was the venue, I guess, for the debates around [problem-based learning]. What we are trying to do in the Australian Curriculum is set out what should be taught, and we have to be careful about being too specific, if you like, about the way teachers teach that material. And there is an on-going debate, some of it is a little over-engineered in terms of a false dichotomy, between explicit teaching on the one hand, which is very important, particularly in the early years, and what you’re referring to as enquiry-based learning or problem-based learning. *

*These different pedagogical approaches are actually choices that teachers face, depending upon where their students are up to. But what people are getting at when they are talking about enquiry-based learning as opposed to, say, explicit teaching, … instead of explicit teaching, where the teacher will come in and show the students what they need to do to solve a problem, and then ask the students to solve that problem with the clear guidance of the teacher. What the enquiry-based approach is, and I’m summarising crudely here, is to say “well, look, students, here’s a problem, have a crack at solving it, and then we’ll come back and reflect on how you went”. So it’s a kind of a coming at the same issue from a different angle. … Sometimes that’s appropriate and sometimes it’s less appropriate, but it’s an important part of the pedagogical toolkit, the teaching toolkit that teachers have to have at their disposal. But we have to be careful in constructing the Australian Curriculum that we’re not privileging one over the other. These are decisions that teachers have to make. *

*Now it becomes particularly important when we look at the PISA results. PISA is a test that is administered every three years by the OECD, and Australia’s results have been going down consistently since it started in 2020 [sic], particularly in maths. … One of the things that PISA specifically tests is the ability to transfer your knowledge from one particular real-life context to another. This is what they talk about in terms of real-world problem solving. *

Send this to ABC/RMIT FactCheck and see what they make of it…

My prediction: “SPIN”.

“[…] false dichotomy, between explicit teaching on the one hand, which is very important, particularly in the early years, and what you’re referring to as enquiry-based learning.”

1. “Particularly in the early years”??

As one gets older, and the amount of stuff one can draw on increases (and as the inter-connections get more complex), does “explicit teaching” become more or less central?

2. It *is* indeed a “false dichotomy” – because the two approaches are not alternatives. I have tried “enquiry-based” approaches at all levels – from primary to undergraduate. I love them. But they don’t deliver the essential core.

So enquiry-based approaches are more like toffee: very nice from time to time – *alongside* a main diet of fresh vegetables, carbs and (for me) meat. “Explicit teaching” and “enquiry-learning” may live in the same classroom; but they belong to different categories.

Tony, let me ask a question of you – or anyone listening. The school in which I teach is a normal government secondary school in Australia, dealing with years 7-10. The school is making special efforts to encourage the students who are more able in mathematics. We do this by various means and I am involved in this effort. (I often turn to your books for inspiration.)

How do we measure the impacts of our efforts?

All suggestions welcome.

I should also add that the school makes special efforts to assist students who struggle with mathematics.

Right there is the question that policy makers and school leaders seem to think has an easy answer, but no-one has yet presented one of any great merit.

Possibly because, as you hinted at, it really depends on the school and the students within it.

Learners are diverse. Needs change.

How do we measure success? As Tony says later, you often judge “by feel” unless someone higher up demands otherwise.

I would think that if you are making all this effort, you would have thought more about how to measure impact. I’m broadly in agreement with Gardiner, but puzzled that you pose this as if starting from scratch. There are many metrics to consider (performance of your top 10% on standard tests, acceleration, competitions, numbers reaching special awards/thresholds, etc.).

Separate issue, but I disagree slightly with Gardiner on tracking movement into STEM. I mean who cares if an engineer got an English award or a doctor got a math award. Some people are studs and get awards in multiple areas. All that said, if you have fun topics and impressive teachers there will be some natural movement of kids into those topics (I majored in a topic where the high school teacher was impressive). But tracking it feels a little silly. And I don’t even really agree with the whole “push more kids into STEM” idea. Supply and demand will take care of that.

Thanks, Tony. De Carvalho has no idea what he is talking about, and he doesn’t care that he has no idea what he’s talking about. And Doogue doesn’t care. No one cares. It’s all unanchored fantasy.

Why do so many commentators focus on results from PISA and ignore TIMSS?

Because: (a) PISA is winning the propaganda wars for international testing supremacy; (b) Australia’s PISA results have been steadily going down; (c) commentators and politicians falsely believe that Australia’s TIMSS results are good and are going up; (d) many commentators are, in equal parts, ignorant and stupid.

(e) it is easier than doing research and/or calculations for yourself.

@TerryMills: The fact that you ask “How do we measure the impact (of efforts to encourage able students)?” reflects the fact that this is not a standard activity, and that there may be no “straight-off-the-shelf” answers. So forgive this extended comment.

1. Encouraging able students includes getting them to appreciate that, in *routine* tasks/exams, one can (and mostly should) aim for 100%. In other words: one can get things completely right (modulo silly questions, weird mark schemes, and flaky marking), and should celebrate this fact by doing so, for starters.

Up to Year 10, you may only have homeworks, classwork, and internal exams. But the same applies.

2. Another aspect is that, if one makes a special effort with selected students, it seems wasted if they all go off and want to become medics.

So, it is worth keeping track of what they go on to do in Years 11/12 and beyond. (Find the data to show what the students went on to do *before* you started the current efforts; and set yourselves what seem like reasonable goals in your context – adjusting them if they prove optimistic/pessimistic.) For example, it is worth aiming for *more* of each cohort

(i) to choose the harder maths modules in Years 11 and 12 (and maybe harder schools/colleges in Years 11/12),

(ii) to study harder maths-based courses at uni, and

(iii) to study maths itself at uni.

The target numbers for (iii) may be lower than for (ii), and (ii) may be lower than (i). And all three are statistical rather than deterministic (there are always oddballs who like maths – and benefit from additional provision; but who then go to study theology, or whatever).

Your early targets may prove either optimistic (if the other pressures are stronger than you thought), or pessimistic (if your provision is more effective than anticipated). But it still helps to set goals, and to think collectively about the outcomes.

3. But all this is long-term, and requires patience. So it is good to have some kind of short term goals.

Not knowing the local scene, I may be missing something. But the most obvious short-term goal is to make more systematic use of AMT competitions.

Enrichment is all very well; but it can lack a focus. It should certainly not degenerate into “practising past papers” in the hope of getting a few more certificates than last year. But if you devise a structure that suits local conditions, it should aim to convey the fact that “not knowing for sure” how to attack a problem is mathematically normal, and can be enjoyed as a challenge – and this should have a pay-off when students face AMT multiple choice problems. (You may do this through a maths club; or through occasional class digressions/problems; or through an in-school project competition/task – maybe with prizes for presentation as well as for mathematical quality; or ….)

Students may welcome the idea that one spin-off of your programme is to help them *do a bit better* – as an informal goal. They may then enjoy doing sample problems from time to time, and a couple of preparatory papers beforehand – and take the challenge seriously rather than lightly. It would be nice to think that, if you make an effort, and pitch things right, then this will show (statistically) in the school results. (Of course some individuals may be disappointed in their own outcomes; but they should learn that this is *not* deterministic – and then see if they do better the next year.)

This offers a crude annual indicator as to whether you are having an impact.

4. It also raises a further possible goal/measure – which some may prefer to put up front: namely, getting one or two students through to whatever rounds, or follow-up activities, arise for high scorers in the multiple choice papers. This will depend on the school. (E.g. you might use the materials from whatever has superseded what used to be the “Maths Challenge for Young Australians” as the focus for a programme of work – not just for those who would qualify naturally, but adapting the materials for a wider, possibly mixed-age, group.) A larger group can still enjoy the maths, even if they won’t all be equally successful in qualifying officially. And this offers a short-term indication of what impact one may be having on “the few”.

Whatever you do, or however you try to measure its impact, use (and learn to trust) your own collective judgement (even if that means finding my suggestions hopeless and going off to devise your own).

Thanks Tony; very helpful

“What we are trying to do in the Australian Curriculum is set out what should be taught”

except that they don’t.

AC9M9M03 says ‘solve spatial problems, applying angle properties, scale, similarity, Pythagoras’ theorem and trigonometry in right-angled triangles’.

So, do they need to use trig to find angles or sides or both?

I could just teach finding angles, couldn’t I?

Year 10 is no better: “solve practical problems applying Pythagoras’ theorem and trigonometry of right-angled triangles, including problems involving direction and angles of elevation and depression” (AC9M10M03). Again, I could just ask them to find angles.
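To make the ambiguity concrete, here is a minimal illustration (my own numbers, not from the curriculum documents) of the two quite different task types the same descriptor could cover:

```latex
% Task A: use trigonometry to find a side
% (given an angle and the hypotenuse)
\theta = 30^\circ,\ \text{hyp} = 10
\quad\Rightarrow\quad
\text{opp} = 10\sin 30^\circ = 5

% Task B: use trigonometry to find an angle
% (given two sides)
\text{opp} = 3,\ \text{adj} = 4
\quad\Rightarrow\quad
\theta = \arctan\tfrac{3}{4} \approx 36.9^\circ
```

A descriptor that says only “trigonometry in right-angled triangles” doesn’t specify whether students must handle both directions, which is exactly the gap a teacher could exploit.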

This will be less of a problem in those states that publish their own interpretation of the Australian Curriculum: Mathematics which, hopefully, will be more specific. But in South Australia, the Government website for the R-10 curriculum just dumps you onto the ACARA webpage. Click on Mathematics on this page https://www.education.sa.gov.au/schools-and-educators/curriculum-and-teaching/curriculum-south-australia-early-years-year-12 and you end up at https://www.australiancurriculum.edu.au/f-10-curriculum/mathematics/?layout=1

I’m dreading the new curriculum and have not heard anything about when we will need to teach it, maybe next year?

As I said, De Carvalho’s hold on reality is tenuous. I was probably over-generous.

One of the treasures of Bad Maths was Marty’s reference in September last year to the work of Tony Gardiner in the UK, and in particular to his Open Book Publishers 2016 title Teaching Mathematics at Secondary Level (available free to download).

And now we have Bad Maths posted contributions from Tony himself!

As highlighted earlier by Marty, he has a great deal to say about problem-solving, in a way that articulates matters so much better than anything from ACARA (or what we’ve seen from our Maths Education professionals).

Two paragraphs from his Introduction to Teaching Mathematics:

“This extended essay started out as a modest attempt to offer some supporting structure for teachers struggling to implement a rather unhelpful National Curriculum. It then grew into a Mathematical manifesto that offers a broad view of secondary mathematics, which should interest both seasoned practitioners and those at the start of their teaching careers.”

“We leave others to draft recipes for translating the official curriculum into a scheme of work with the minimum of thought or reflection. This study is aimed at anyone who would like to think more deeply about the discipline of “elementary mathematics”, so that whatever decisions they may take will be more soundly based.”

Problems for Grade 5 students:

https://link.springer.com/book/10.1007/978-3-030-52946-8