Does the Draft Mathematics Curriculum Contain Any Problem-Solving?

We’ve written about this before, and the point is obvious. But, it’s apparently not sufficiently obvious for some wilfully blind mathematicians. So, let’s go again. Plus, there’s a prize for the best comment.*

ACARA is playing people with a cute syllogism.

  • Problem-solving is good.
  • The draft curriculum contains lots of problem-solving.
  • Therefore the draft curriculum is good.

Yep, the syllogism is flawed from the get-go. But in this post we want to focus on the second line, and we ask:

Does the draft mathematics curriculum contain any problem-solving?

Certainly the draft curriculum contains a hell of a lot of something. As we’ve noted, the draft refers to “investigating” or some variation of the word 298 times. And, students get to “explore” and the like 236 times, and they “model” or whatever 264 times. That’s a baker’s ton of inquiring and real-worlding, which some people, including some really clueless mathematicians, regard as a good thing. Ignoring such cluelessness, what about genuine mathematical problem-solving?

The draft curriculum refers to “problem(s)” to “solve” 154 times. But what do they mean? When, if ever, is the draft referring to a clearly defined mathematical problem that has a clearly defined answer, and which is to be solved with a choice of clearly defined mathematical techniques? To the extent that there are any such “problems”, do they rise above the level of a trivial exercise or computation? In the case of such trivial “problems”, is the label “problem-solving” more than a veil-thin disguise for the mandating of inquiry-learning?

In brief, is there more than a token amount of the draft’s “problem-solving” that is not either real-world “exploring/modelling/investigating” or routine exercises/skills to be taught in a ridiculously inappropriate inquiry manner?

Perhaps genuine mathematical problem-solving is there, and we are honestly curious to see what people have found or can find. But, we’ve found essentially nothing.

And so, to the competition. Find the best example of genuine, mathematical problem-solving in the draft curriculum. Answer in the comments below. The most convincing example will win a signed copy of the number one best-selling** A Dingo Ate My Math Book.

 

*) Yes, yes, we have those other competitions we still haven’t finalised. We will soon, we promise. As soon as we’re out of this ACARA swamp, we’ll be taking significant time out to catch up on our massive tidying backlog.

**) In Polster and Ross households.

 

Update (29/07/21)

We’ve finally ended this. The winner is, hilariously, Glen. See here for details.

 

 

41 Replies to “Does the Draft Mathematics Curriculum Contain Any Problem-Solving?”

  1. What they mean by “problem solving” is the dishonest creation of a simple calculation based on some word problem.

    I doubt any actual problem solving is in the draft curriculum.

    But I do like myself a prize, so I’ll take another look….

    1. You’re ignoring the forest for the trees. The Daft curriculum as a whole is the mathematical problem to be solved.

        1. I’ve made that joke. But I’m asking a serious question. If a big shot mathematician suggests that the draft’s emphasis on problem solving is “welcome”, what specifically might they be referring to? I think they are being stupid, but I want to home in on the tiny amount of sense, if it exists. At the moment I doubt even that tiny amount of sense exists.

        1. 1. The difference is, I wasn’t joking.

          2. “If a big shot mathematician suggests that the draft’s emphasis on problem solving is “welcome”, what specifically might they be referring to?”

          Picking up from Glen, I don’t think this question can be answered until the phrase “problem solving” is defined by the ‘big shot’ (who has probably big shot off his/her big shot mouth before reading the daft curriculum). Such ‘big shots’ are usually more politician than mathematician.

          3. I’ll use the definition implied in the 4th and 3rd last paragraphs of Marty’s blog. When I get a chance, I’ll take a look for the pot of gold at the end of the achromatic (although the value of the book – signed as it may be – is less than the value of the person-hours I estimate this fruitless task requires).

          1. Hi, John

            1. Neither was I.

            2. That is the question, and I honestly don’t know the answer. However,

            3. I’ve tried to indicate the kind of “problem solving” that I believe could be valuable in a curriculum. My pseudo-definition is looser than is standard, and blends in with “exercise”, rather than being restricted to the standard “the approach to take is not immediately clear” type of problem. But that is conscious, to make it more likely to capture any decent example of “problem solving” in the draft.

            What I’m ruling out is real-world exploration/modelling/investigation, and inquiry discovery of the basics. The question is, is there any mathematical problem-solving in the draft?

  2. If I may re-word the question as follows:

    “Does the draft curriculum seek to increase the quantity and/or quality of problem solving being taught in high school Mathematics classes?”

    Then the answer is “I don’t know. The motives of the authors of this document remain a mystery to me.”

    If you remove the phrase “seek to” from my re-wording, the answer must be “No. It hasn’t been tested yet so it cannot have achieved anything (yet).”

    Which, in a very roundabout way, leads me to this re-wording:

    “Does the proposed curriculum establish a framework in which problem solving is going to be emphasized in Mathematics classes?”

    The answer to this is, unfortunately, “We may never know.”

    Why? Short answer: because the curriculum guide is not an assessment guide.

    Teachers have the VCE curriculum guide, but (in my experience) are more likely to look at past examination papers to decide what to emphasize in the classroom. If VCAA decides to examine the chain rule for derivatives in 70% of exam papers then a lot of emphasis will be placed on teaching students how to use the chain rule with the types of functions favored by VCAA examiners.

    Likewise, just because something is mentioned in a curriculum document (such as different notations for the derivative, think D sub-x) it doesn’t mean it gets taught because it is NOT ASSESSED. EVER.

    So… will more problem solving be taught in schools? Maybe. If schools choose to somehow assess problem solving skills (which can be a difficult task in itself, separating assessing the process from assessing the outcome) then yes, it will be taught. Will the new draft curriculum have an impact on this? Maybe, it depends how individual teachers and schools interpret it.

    Curriculum drafts come and go pretty regularly. Teachers will probably keep doing what they’ve always done.

    1. RF, you may reword the question if you offer your own prizes. Your question is interesting, and entirely off the current point.

      The current point is the draft curriculum, and the question of whether the purported emphasis on “problem-solving” includes an iota of improvement over the current curriculum. Whatever the final new curriculum does or does not contain, there will be plenty of time later to debate what it means.

      1. OK. Your prize, your rules.

        My answer remains the same: No.

        Longer answer: No, the new draft does not include any improvement, emphasis or no emphasis.

  3. Re: What does ACARA mean by ‘Problem solving’?

    This might shed some light (copied from https://www.australiancurriculum.edu.au/resources/mathematics-proficiencies/portfolios/problem-solving/):

    “Problem-Solving

    In F–2, students solve problems when they use mathematics to represent unfamiliar or meaningful situations.

    In Years 3–6, students solve problems when they use mathematics to represent unfamiliar or meaningful situations and plan their approaches.

    In Years 7–8, students formulate and solve problems when they use mathematics to represent unfamiliar or meaningful situations, plan their approaches, when they apply their existing strategies to seek solutions, and when they verify that their answers are reasonable.

    In Years 9–10, students formulate and solve problems when they use mathematics to represent unfamiliar or meaningful situations, when they design investigations and plan their approaches, when they apply their existing strategies to seek solutions, and when they verify that their answers are reasonable. Students develop the ability to make choices, interpret, formulate, model and investigate problem situations, and communicate solutions effectively.”

    1. Thanks, JF. The outline you’ve quoted is pretty silly, but there is more detail at the link you’ve provided. I took a quick look and I was less than thrilled, but it deserves a closer look. Then the question is, to the debatable extent that this amounts to decent problem-solving, how does the draft curriculum’s problem-solving differ in nature, placement, quantity and quality?

    2. OK, JF, I’ve looked at the sample problems at the link you sent. Again, the general framework you quoted is plain silly, and this whole thing is off the point of the current draft (and the current competition). But still it was interesting to see (and the website sucked).

      These are my thoughts on each of the seven problems, with my summary below.

      a) Year 1 problem, which amounts to finding (natural number) solutions to x – y = 5.

      Well, yeah, ok. It’s hardly a “problem”, but it’s a decent enough, bog-standard exploration of difference, as long as it comes after at least some solid sense of counting or addition.

      b) Year 4 measurement (?) problem, laying out 36 equal square tiles.

      This seems entirely vacuous and pointless to me.

      c) Year 6 number and algebra problem (sic), painting 60% of a fence.

      Contrived and ridiculous, and a typical derailing of the possibility of a true problem/exploration of numbers into the consideration of a fucking fence.

      d) Year 6 measurement problem, determining possible dimensions of a box given a 1.2m ribbon has to go around it.

      John, why did you make me look at this crap?

      e) Year 8 geometry problem, asking when a kite and trapezium would have the same area.

      This is better, if contrived. Although some of the open-endedness in the problem is good, it is mostly bad. I would prefer the problem to be much more structured, and I’m guessing that in practice a teacher would have to make it much more structured. As given on the website it is problem-solving, and it hints at good problem-solving, but it is not good problem-solving.

      f) Year 9 number and algebra problem (sic), finding a target point equidistant from three given archers (who then fire their arrows?)

      It ain’t number and algebra, and the scenario is absurd, but it is a good something. Is it a problem? In line with what I wrote about Singapore, I would regard it as a very hard, and good but limited, exercise, but the “problem” element seems low.

      g) Year 10 statistics and probability problem, deciding from given box plots whether Franklin or Lloyd should win an award.

      Lord spare me.

      OK, my summary.

      1) After the Year 1 counting, not a single problem is remotely focussed on number or algebra. Of course.

      2) In every instance, the real-world aspect was ridiculous and added nothing.

      3) Three of the problems were intrinsically awful, and one of the remaining problems was extrinsically poor.

      John, you owe me.

      1. At this very moment, a signed copy of my book is in the mail …

        Would you believe a signed copy of one of my papers …?

        Would you believe a signed copy of one of my tests …?

        How about a signed passport-sized photo?

  4. I haven’t solved your problem, Marty, so this comment is not an attempt to snatch the prize.

    Inspired by RF’s related question, I thought I’d contribute something NSW-specific. NESA (NSW Education Standards Authority) list this as a summary of the pinnacle of K-10 achievement (the so-called A10 level, basically an A+):

    *A student performing at this grade uses and interprets formal definitions and generalisations when explaining solutions; generalises mathematical ideas and techniques and selects and uses efficient strategies consistently and accurately to solve unfamiliar multi-step problems; uses deductive reasoning in presenting clear and concise mathematical arguments and formal proofs; synthesises mathematical techniques, results and ideas across the course.*

    The keen reader will note a few *naughty* words there. Now NESA have managed to get this in there with the current curriculum as it is, and ostensibly, all teachers should be guiding students along a path toward this A10 level. You might argue that in practice, this doesn’t always work out, and I agree with that — but importantly, this is what the official documentation is saying.

    My question is: **how compatible, if at all, is the draft curriculum with the existing NESA practice?** I suggest not at all compatible.

    Although I strongly dislike the current curriculum, this new one is an order of magnitude worse, and so when the force of inertia is challenged by the force of change, I’ll be supporting inertia.

    1. Glen, I would have thought that NESA should fall in line with the National Curriculum – that’s the point of a national curriculum.

      1. A lot more has to happen before “falling in line” is required. I believe (essential) inertia still has a fighting chance.

      2. Terry, NESA should do what is best for NSW students, and if that means telling ACARA to get stuffed then that’s what they should do. My guess is that NESA will do just that.

  5. Wow. I’d be happy if most Unit 3/4 Maths Methods and Specialist Maths students were mostly doing this, let alone the best Yr 10 students!!

  6. The question as to what constitutes “problem solving” has been asked in various contexts. For example, in chess, there is a distinction between a “chess problem” and a “chess puzzle”. I am not particularly interested in such distinctions: it suffices to say that what is a problem for one person is not necessarily a problem for another person.

    I wrote this elsewhere.

    “We experience mathematics through solving mathematical problems. This way, we consolidate what we know, and we learn new things. Mathematical problems can be theoretical or applied; they might be posed by the teacher or by pupils; they may be exercises from a text book or large projects; they may be tackled by an individual or by a group; they might require a calculator or perhaps pencil and paper will suffice. In solving mathematical problems, we learn to develop strategies, pay attention to detail, “learn from our mistakes”~\footnote{Popper, K. (1962). \textit{Conjectures and refutations}. New York: NY: Basic Books (p.~vii).}, think laterally, and experience that exhilarating ah-ha! moment when we see the answer. In mathematics, solving problems is the link between experience and learning.”

    1. Terry, it does not remotely suffice to say that people have different notions of the word, and then to use that word to make extravagant and tendentious claims.

      We do not just “experience mathematics” (what the hell does that even mean) through solving problems. And solving problems is not the only link between “experience” (whatever the hell it means) and learning. Unless you have a crazy-broad notion of “problem”.

    1. I’d say that it is not only possible, but far preferable, to learn mathematics without any of ACARA’s “problem solving” (which is what we are talking about here).

      1. Glen, you’ve hit the nail on the head. I don’t think anyone here disputes that problem solving is an important part of learning mathematics. What’s being argued is that the sort of problem solving that pervades the ACARA daft curriculum is vague, ill-conceived and inappropriate. ACARA clearly has no understanding of problem solving in mathematics.

        ACARA is the mathematics education equivalent of a leper without their bell. It should not be let anywhere near curriculum development in mathematics and should change its name to AARA.

        1. John, just two further aspects to be noted on ACARA’s “problem-solving”. It is not just that their problem-solving is laid on by the ton, and that it is often vague to the point of meaninglessness. It is also predominantly real-world “problems”, which feeds back poorly to mathematics.

          Second, and much worse, ACARA’s “problems” appear in the almost-immediate teaching of and enforcement of basic concepts and skills. As such, in this context, “problem-solving” is not a skill to be learned. It is much more an imposed method of learning. ACARA can deny it all they want, but they are attempting to mandate inquiry learning.

  7. OK, thank you for all your comments. A very interesting discussion.

    But, there is a competition to be won and, as with (almost) the other competition, I haven’t seen any actual entries yet.

    Currently, Glen is winning the other competition with a truly ridiculous, and sole, entry. Similarly, submit *any* crap “problem-solving” from the draft, and you’ll be winning this one. Remember, the prize goes to the *best* example. It needn’t be a good example.

    1. OK.

      Year 8 Number – investigating the use of pronumerals to represent recurring decimals as their equivalent fractions, for example, let x = 0.7̅, then x = 0.77777… and 10x = 7.77777…, therefore 10x – x = 7 and 9x = 7, so x = 7/9.

      (Ignoring flippant manipulations of infinite series).
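For what it’s worth, the pronumeral trick generalises directly to any purely repeating block. A minimal sketch (the function name is mine, not ACARA’s):

```python
from fractions import Fraction

def repeating_to_fraction(block: str) -> Fraction:
    # Same trick as the draft's example: if x = 0.(block) repeating,
    # with k digits in the block, then (10**k)x - x = int(block),
    # so x = block / (10**k - 1).
    k = len(block)
    return Fraction(int(block), 10**k - 1)

print(repeating_to_fraction("7"))   # 7/9, the draft's example
print(repeating_to_fraction("37"))  # 37/99
```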

      Year 10 Algebra – applying a bisection algorithm to determine the approximate location of the horizontal axis intercepts of the graph of a quadratic function such as f(x) = 2x^2 - 3x - 7.
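A sketch of such a bisection for the draft’s example function (the tolerance and the sign-change check are my additions; the draft specifies neither):

```python
def bisect(f, lo, hi, tol=1e-6):
    # Repeatedly halve [lo, hi], keeping the half where f changes sign.
    # Drawback: it needs f(lo) and f(hi) of opposite sign, so it misses
    # roots where the graph touches the axis without crossing it.
    if f(lo) * f(hi) > 0:
        raise ValueError("need a sign change on [lo, hi]")
    while hi - lo > tol:
        mid = (lo + hi) / 2
        if f(lo) * f(mid) <= 0:
            hi = mid
        else:
            lo = mid
    return (lo + hi) / 2

f = lambda x: 2 * x**2 - 3 * x - 7
print(bisect(f, 0, 5))  # near (3 + 65**0.5)/4, about 2.766
```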

      Year 10 Extension Probability – using n! + 1 to prove that there are infinitely many prime numbers.
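The idea behind the n! + 1 proof can even be checked numerically: no number from 2 to n divides n! + 1, so its smallest prime factor always exceeds n. A sketch (not how the draft presents it):

```python
from math import factorial

def smallest_prime_factor(m):
    # Trial division; the first divisor found is necessarily prime.
    d = 2
    while d * d <= m:
        if m % d == 0:
            return d
        d += 1
    return m  # m itself is prime

for n in range(2, 9):
    p = smallest_prime_factor(factorial(n) + 1)
    print(n, p)  # p is always a prime bigger than n
```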

      Representing decimal numbers in other bases such as base 3 and showing that square numbers in the ternary number system always end in 1.

      (OK, I’m kidding about this last one. I just wanted to create a little bit of excitement for a moment)

      1. Thanks, John. Whether these are good or not, how are any of them problem-solving? Nonetheless, you’re currently winning.

      2. At the risk of shooting myself in the foot (which is quite possibly in my mouth), I doubt I’m thinking of it as problem solving in the same way that ACARA is. (I’d actually call them exercises, which are a form of problem solving in my book). Nevertheless, they are problems to be solved:

        1) Use n! + 1 to prove that there are infinitely many prime numbers.
        (It would be a fairly bright Yr 10 that would solve this problem).

        2) Apply the bisection algorithm to determine the approximate solution to 2x^2 – 3x – 7 = 0.

        3) (By implication of the example) Represent 0.37 recurring as an ‘equivalent fraction’.

        I was pleasantly surprised to discover 1). I find it quite funny that such a problem is classified as Probability. I wonder how many teachers will think that infinitely many prime numbers is something to do with probability. (What’s even funnier is that there is a connection with prime numbers. But I don’t think ACARA had that in mind. I think it’s a case of factorials = probability …!) The pity is that some teachers might see that it’s irrelevant to probability and therefore choose to ignore it.

        1. JF, it’s not a question of whether you’re using problem-solving in the ACARA sense, but in the Marty sense. I’m still not convinced by your examples.

          On 1), I was also pleased to see it, but no way is this intended to be a problem to be solved, and I very much doubt it would work as such. It is a beautiful proof to be seen (and hopefully learned). Is it really in the probability section?

          On 2), I don’t see any way this is other than an exercise with a good message (which is undermined by the definite article).

          On 3), I am again very glad to see it. But it’s just an exercise applying the learned trick. I guess if they have only been shown 0.555… etc, then 0.373737… is kind of a problem. But only kind of.

        2. OK, I should have looked at the draft first. John, your (3) of course is invalid. Making up your own hard exercise or problem to follow on from the method taught in the draft does not remotely equate to that exercise/problem being in the draft. So, the only example of yours which, very very weakly, qualifies is (1).

          1. I think 2) clearly applies as well.

            Yep, 3) is ‘weak’ but is strongly suggested as a problem in the Daft. Rather than vaguely hand-waved at.

            1. How is (2) a problem? I don’t see how your version of (3) is more than vaguely suggested. I’m also not arguing the merits of these exercises. Just their problem nature.

              1. Well, the exact wording (previously quoted) is

                “applying a bisection algorithm to determine the approximate location of the horizontal axis intercepts of the graph of a quadratic function such as f(x) = 2x^2 – 3x – 7.”

                I’m applying artistic license in changing the word “applying” to ‘apply’ and deleting “such as”…
                I think it can be reasonably argued that this is a specific problem to be solved that is given in the ACARA Daft curriculum.

                What should be added to it is:
                a) what level of accuracy is required,
                b) how would you determine when that accuracy has been achieved, and
                c) what drawbacks does the method have (give an equation that illustrates this drawback).

                Unfortunately the Daft curriculum does not mention these matters, and this is another defect. Even with something as fool-proof as a basic algorithm, the full story is not given. I wonder whether the writers of the Daft curriculum have assumed that textbooks will fill in the blanks …? If so, that’s another indicator of a poor curriculum document.

                1. John, it’s not a major point. As you suggest, with decent wording and framing, it could be a reasonable and good exercise/problem. But I’d still argue it’s either weak or silly in its problem aspect. Either you guide the kids enough that the activity is pretty routine, or you don’t guide them and then I don’t see how you frame it other than throwing them in the deep end of the pool.

  8. Alright, alright. I went looking to see if I could find something like the cool stuff that was in the problems book you posted about before, Marty (like 16 x 16).

    I didn’t find much. I’ll submit this little ditty, badly worded (but what can you expect from a quote of the actual draft curriculum):

    “…drawing up a track game to resemble a running race, taking it in turns to roll two dice, where runner 1 moves a square if the difference between the two dice is 0, 1 or 2 and runner 2 moves a square if the difference is 3, 4 or 5; responding to questions ‘Is this game fair?’, ‘Are some differences more likely to come up than others?’, ‘How can you work that out?’ (AC9M5P01_E3)”

    The problem that I’m claiming exists here is properly answering the questions.

    BTW, if I win either of these, I’d like to at least pay for the book!

    1. Pay for your own prize? What, you think this is some kind of Liberal RoboQuiz, where everybody gets screwed? Don’t be ridiculous.

      Anyway, by default, you are currently winning.

      Glen, your suggestion is an illustration of exactly how I was hoping this competition would work. You guys would scavenge the meatiest, problemiest problems, and then we could figure out what worth, if any, there is in these problems. And, I honestly wasn’t, and still am not, clear on what we might find. I was/am sceptical that we’d find much, but I was/am open-minded.

      As to the elaboration you chose, it is a magnificent example of an activity that contains a decent problem element and, at the same time, is thoroughly ridiculous. It will take some time to sort out this nonsense, and of course others are free to comment. I’ll write more later today.

      1. It’s no more ridiculous than some other board games I’ve seen. I think ACARA is assuming that a “little ditty” like this will get written up properly in a textbook or a Teacher’s Manual, or will appear as an on-line resource. On top of all its other faults, it’s a curriculum full of thought bubbles that are being handballed to others to make sense of.

    2. That’s a really good get, Glen. With proper scaffolding and direction, it would be a great problem for students to work through. It’s certainly not something you’d want to give students to explore and construct their own learning from. But it would be really nice to give once the basic skills have been taught and consolidated. What year level is it from? Year 9?

      1. It is in Year 5, Probability. Depending on what you do with it, it could be useful. But it’s certainly not a good way to learn a concept; it’s an after-dinner mint at best.

        1. And that’s the problem. The students will not even have the sense of die probabilities, because the self-same idiots moved that to Year 6. In practice, what is being suggested is aimless exploration, or maybe just trials. God only knows. Plus, the game-framing is gratuitous, pointless and distracting. Plus, asking essentially the same question three times doesn’t make the activity deeper, just more irritating.

          As John suggests, at the right level, and properly framed, it could be a very good activity/problem/something. It’s a nice example where you can readily calculate the probabilities, but you can also, as a forethought or afterthought, argue much more easily with symmetry.

          So, there is something there. But, in the context of the draft curriculum, it sucks awfully. And Glen is winning with it.
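For anyone wanting to check, the dice-difference probabilities behind the track game are a short enumeration (a sketch, not part of the draft; the symmetry argument mentioned above is quicker):

```python
from itertools import product

# Count each possible difference over the 36 equally likely rolls.
counts = {d: 0 for d in range(6)}
for a, b in product(range(1, 7), repeat=2):
    counts[abs(a - b)] += 1

runner1 = sum(counts[d] for d in (0, 1, 2))
runner2 = sum(counts[d] for d in (3, 4, 5))
print(runner1, runner2)  # 24 12 -- the game favours runner 1 two to one
```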
