While we’re working away on ACARA, here’s another post to keep readers occupied. Below are released “benchmark” test items from TIMSS 2019. (Further details about the benchmarking can be found in the full report, pp 35-59 and 172-198.)
The sample items give a sense of what is tested in TIMSS, and what Australian students can and cannot do. (And, feel free to compare to PISA.) The test is given near the end of the year, to students in Year 4 and Year 8.
YEAR 4 ITEMS
YEAR 8 ITEMS
The Year 8 socks question: I’m not sure which worries me more, that fewer than 70% of students could divide one quantity by another, or that this performance was somehow an improvement.
What strikes me is the level of English needed in the Year 4 examples – it seems sophisticated to me. Could someone who has taught Year 4 comment?
tom, I agree. I’m more interested in the Year 8 items, but I thought I may as well put up the Year 4 items as well.
3.13.1 – the question gives fractions in “simplest form” but the model answer does not.
Seems odd.
These problems are definitely closer to what I would call “mathematics” than what NAPLAN tests (still haven’t found a definition of “numeracy” that I’m comfortable with, but that is not relevant here).
Not sure if it is a common theme, but questions requiring more than one step of working seem to have much lower success rates. Is this correlation or causation? I don’t know. Does it matter? Yes, I think it does.
Yes, RF, this is a critical point. TIMSS tests mathematics; NAPLAN and PISA do not.
I think 3.13.1 is ok. Unsimplifying the fractions was necessary to solve the problem, but the unsimplified answer is correct. Of course the simplified answer would also count as correct.
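(To make the point concrete with a made-up fraction sum, not the actual 3.13.1 item: the working forces a common denominator, and the unsimplified result is already a correct value of the answer.)

```latex
% A made-up illustration, not the actual TIMSS item 3.13.1:
% the working forces a common denominator, and the resulting
% fraction is correct whether or not it is then simplified.
\[
\frac{1}{2} + \frac{1}{6}
  = \frac{3}{6} + \frac{1}{6}
  = \frac{4}{6}
  = \frac{2}{3}.
\]
% Both 4/6 and 2/3 are correct values of the sum; only the form differs.
```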
As for the lower scores on multistep problems, of course that is causation.
If it is causation, then this is surely a rather large whack of evidence in favour of the need to properly teach how to work through multi-step problems (ones other than solving linear equations).
I’ve had a quick re-read and cannot find this idea anywhere in the draft curriculum. I could well be wrong; I’ve been known to miss very obvious things in the past.
Yes, of course the multistep thing is critical. I’m looking to post on that as soon as I get through the latest ACARA business. As for the draft curriculum, it *pretends* to do this: they’ll claim that all the “exploration” and “investigation” is extended-thought stuff. Of course, to the extent that it means anything, it is not remotely related to the proper practice of multistep procedure that is really required.
OK, perhaps I need to clarify for the sake of ACARA (if you’re listening…)
1. A question that has part a, b, c, d etc is NOT what I mean by a problem that requires multiple steps.
2. Anything which leads to an equation that then requires multiple steps to solve is not what I mean either, because in that case the steps, and their order, are clear.
3. Any other situation where the steps (and the order in which to apply them) are obvious, and therefore don’t really need much thought, is not what I am talking about here.
So, what am I talking about? Pretty much any of the TIMSS questions after number 5 in this sample.
Of course. Having a problem with twenty-five neatly spelled out steps is teaching no one anything, except to hate (what they believe is) mathematics.
Serious question – what is/are the most effective way(s) to teach this skill? Let’s assume that students have already learned the “building blocks” (the relevant knowledge / concepts / standard techniques) required to solve a multi-step question.
Some options:
1) Provide a carefully curated and structured set of problems that gradually increase in complexity and difficulty.
2) Offer general advice (à la Polya, e.g. draw a diagram, work backwards, consider a similar but easier problem).
The first seems destined to run into problems with students who struggle to generalise what they’ve learned from simpler problems to more challenging ones. And then you either end up teaching a process with neatly spelled out steps, or else you have a frustrated student. The second often seems unhelpful because students don’t know how to apply those strategies (what diagram should be drawn? how is a student supposed to know what the last step of a problem looks like, in order to work backwards? what is a *relevantly* similar problem?).
Of course it’s important that students learn how to do this, but I think it is genuinely difficult, and genuinely difficult to teach. (My sense is that many/most teachers just feel that the “strong” kids will pick it up naturally, but others won’t, and there’s not much that can be done about it.)
This is related to creativity in research mathematics. I sometimes find that I can make some initial headway and then hit a brick wall. It helps to leave the problem and then come back with a fresh mind, uncluttered by earlier thoughts. The same with crosswords. Can this be taught?
If the student has completed a part and then is stuck, perhaps they can be advised to continue with another question and then return. Great for confidence if it works.
Hi, SRK. It’s a good question. I don’t have a good answer, but I am planning to post some writing in the next day or so from someone who does.