Signs of the Times

Our second sabbatical post concerns, well, the reader can decide what it concerns.

Last year, diagnostic quizzes were given to a large class of first year mathematics students at a Victorian tertiary institution. The majority of these students had completed Specialist Mathematics or an equivalent. On average, these would not have been the top Specialist students, nor would they have been the weakest. The results of these quizzes were, let’s say, interesting.

It was notable, for example, that around 2/5 of these students failed to simplify the likes of 81^{-3/4}. And, around 2/3 of the students failed to solve an inequality such as 2 + 4x ≥ x^2 + 5. And, around 3/5 of the students failed to correctly evaluate \int_0^{\pi} \sin 5x \,{\rm d}x, or similar. There were many such notable outcomes.
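For the record, quick working for the first and third of these (taking the questions exactly as stated):

81^{-3/4} = (3^4)^{-3/4} = 3^{-3} = 1/27

\int_0^{\pi} \sin 5x \,{\rm d}x = [-\tfrac{1}{5}\cos 5x]_0^{\pi} = \tfrac{1}{5} + \tfrac{1}{5} = \tfrac{2}{5}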

Most striking for us, however, were questions concerning lists of numbers, such as those displayed above. Students were asked to write the listed numbers in ascending order. And, though a majority of the students answered correctly, about 1/4 of the students did not.

What, then, does it tell us if a quarter of post-Specialist students cannot order a list of common numbers? Is this acceptable? If not, what or whom are we to blame? Will the outcome of the current VCAA review improve things, or will it make matters worse?

Tricky, tricky questions.

25 Replies to “Signs of the Times”

  1. List C might have them reaching for Sort[{3, Sqrt[5], 16/5, 2, Pi}] in Mathematica, which, like the 1/4 of the students, doesn’t return what you might expect. (Ascending numerical order with irrational elements requires Sort[{3, Sqrt[5], 16/5, 2, Pi}, Less], for those interested.)
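    To spell this out, a minimal sketch (the bracketed outputs are from a recent Mathematica, so treat them as indicative):

      list = {3, Sqrt[5], 16/5, 2, Pi};

      Sort[list]           (* default canonical order: the symbolic entries
                              Sqrt[5] and Pi are not placed by numerical value *)

      Sort[list, Less]     (* true ascending order: {2, Sqrt[5], 3, Pi, 16/5} *)

      N[Sort[list, Less]]  (* {2., 2.23607, 3., 3.14159, 3.2} *)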

  2. Were the students judged on the final answer, that is, was the mark for each question either 0 or 1?

    I can understand that a careless error in the ordering of the numbers would result in a score of 0 for that question. What would be interesting would be to see the calculations for the other questions. To be meaningful as a diagnostic tool, I’d prefer to see average scores for each question. For example, for the inequality question:

    1 mark (basic re-arrangement): x^2 – 4x + 3 ≤ 0.

    1 mark (basic factorisation): (x – 1)(x – 3) ≤ 0.

    1 mark (basic graphing?): Answer (full working sketched below).
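    For concreteness, the full working would run along these lines (my sketch, taking the inequality as stated in the post):

    2 + 4x ≥ x^2 + 5  ⇔  x^2 – 4x + 3 ≤ 0  ⇔  (x – 1)(x – 3) ≤ 0  ⇔  1 ≤ x ≤ 3,

    the last step read off from the parabola y = x^2 – 4x + 3 sitting below the x-axis between its roots x = 1 and x = 3.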

    Now, if most of the 2/3 of students who failed to get the correct answer nevertheless got a score of 2/3 for this question, then we know something – I would not be terribly troubled. But anything less than 2/3 (that is, allowing for only one careless mistake) and we also know something …. Knowing whether the failure to get the correct answer reflects poor mathematical skills or just plain carelessness is key for any compulsory remedial mathematics subject that this institution undoubtedly offers. (The fact that a tertiary institution even needs to conduct diagnostic quizzes and offer remedial subjects is of grave concern, for many reasons.)

    I suspect (hope!) that careless errors have skewed the landscape somewhat, not that this excuses such a large proportion of Specialist students failing to correctly answer such simple questions. Another interesting thing to know would be how much overlap there was between the cohorts of ‘fails’. Perhaps there was a hard core of students who failed all of the questions ….

    But all things considered, to say that this data is troubling would be an understatement. And I doubt it would be any less troubling if it had been Maths Methods students who were tested … I would expect the indices and inequality questions to be well done by a reasonable Year 10 maths student. And the integration should be money for jam.

    I assume your final three questions are purely rhetorical because any informed and intelligent person knows the answers. But for the record, the pre-destined outcome of the current VCAA Review is going to make matters a whole lot worse.

    But for me, the most troubling element of all this is that these students will, one day, probably be designing bridges, skyscrapers, aeroplanes, autonomous vehicles etc. …..

    I’ll be travelling the long way, thanks very much.

    1. Thanks, John. Yes, my questions were rhetorical. As to your questions about the diagnostic quiz, I don’t know much, but I think the purpose was to get a quick sense of what students did or did not know, rather than a careful, academic analysis. And yes, multipoint grading might have indicated greater than zero understanding of some students on some questions. But such grading tends to *overestimate* the level of understanding; isn’t this obviously the case with VCE grading? You can’t simply dismiss it as “careless mistakes”. Part of gaining decent mastery of a subject is appreciating the care required and being able to do routine problems in a reliably accurate manner.

      And as for the number ordering question, sure, anyone can make a careless mistake. But 25%?

      1. Two of the problems with VCE grading are:

        1. The marking scheme is kept secret, so who knows how the marks are allocated (only the assessors, who must sign confidentiality agreements).

        2. The raw % scores corresponding to the grades are kept secret (who knows what the raw score cut-off for an A is? Perhaps as low as 60% …?). Again, it’s a secret how the raw scores get ‘standardised’.

        I’d suggest it would be too embarrassing for VCAA to publicly release either of these two things.

        If a university accepts Maths Methods for a student, this generally means that the student achieved at least an unscaled Study Score of 25. But this doesn’t necessarily mean that the student got at least an average raw score of 50% on the exams. In fact, I’d bet that average raw scores of at least 35% would probably result in Study Scores of at least 25 ….

        For these two reasons I’d say it’s *how* multipoint grading is used that overestimates the level of understanding in VCE ….

        1. And if you look at the Examination Report for, say, last year’s Specialist Maths Exam 1, it is telling that half the cohort consistently failed to get the correct answer on most questions. Even on Q1(a), where a couple of forces simply had to be labelled, 40% got zero!

          You could argue a case that the average mark for Exam 1 was (low 20’s)/40, a bare pass in the ‘old thinking’. And yet the average Study Score is fixed at 30, which scaled to 41 in 2018 …! This is what truly misrepresents (overestimates) understanding. I’d be surprised if a student who got an unscaled Study Score of 25 (enough for a university to accept Specialist as a pre-requisite for that student) got more than (high teens)/40. So this is the cohort getting pre-tested ….

          One might well be surprised at how *’good’* the pre-test results were, given this context …!!

        2. John, don’t the grade distributions for each graded assessment, published by VCAA, state which raw scores receive which letter grades? 2018 data available here: https://www.vcaa.vic.edu.au/Pages/vce/statistics/2018/statssect3.aspx My understanding is that exam scores are not standardised.

          As for the marking scheme, it is a constant source of frustration that it is not released. My suspicion, based on various anecdotal evidence and attending “meet the assessors” sessions, is that the assessors like to have some flexibility in how papers are marked – in some years they are very pedantic about, say, including the dx in the integral, and in other years not. It depends on what markers are seeing in the scripts they mark.

          1. Yes, you’re right.

            And 18–22 out of 40 gives a C for Exam 1. Most people would certainly consider C a pass, and it looks like C’s would give a Study Score of around 25 ….

            Many people would probably consider D a pass grade, and a D is 10–14 out of 40 ….

            Re: Flexibility. When the Assessors use that word, it’s just a euphemism for moving the goal posts each year. Which, I guess, would be embarrassing if made available in the public domain.

          2. Back in 2017, 11/40 would have got you a C and 4/40 would have got you a D. Which is pretty shocking.

          3. Damo, it’s beyond shocking. It’s a total disgrace. Particularly when the grade C in Years 7 – 10 means that the student is “at the expected standard” …. There was a time when C meant 60 – 69% raw …..

  3. I read a NZ study quite a while ago – I think it was around 2004. The University of Auckland was looking at CAS calculators (or their predecessors, the GDCs). They separated students according to whether or not they had used the calculator in their final year of school, and gave them a question on limits. The question, from memory, was simple, something like “what is the limit as x → ∞?”. The students who had not used the calculators in school were almost all correct, and the reverse was also true. It was typical first year calculus, and the question was asked late in first year, when you would think the ill effects might have worn off.

    You probably won’t find the article referenced at any of the STEM conferences though…

  4. I believe the NZ study was actually properly conducted. They wanted to know whether or not to use the calculators in first year Mathematics courses. Despite the evidence, my understanding from further reading is that they did introduce the TI calculators…

    1. Yes, an example of the better research in maths ed: proving the obvious and then ignoring it.

  5. Educationally – for some very strange reason – there appears to be a significant negative bias in Maths students towards basic arithmetic… Not only Maths students, BTW – recall the story of Andrew Wiles setting out to solve Fermat’s Last Theorem… he quite sadly hid the fact that he was doing such research, almost as though he was ashamed of tackling the subject… Mind you, he still failed miserably to find the easy arithmetic answer to this puzzle…

    1. I think it’s human nature for a child to resist serious practice, of anything. The problem is the adults who succumb to or, worse, promote that resistance. (Wiles wasn’t ashamed of his work: he simply wanted to be left alone to concentrate.)

  6. SRK, thanks very much for the statistics link. I wasn’t aware of that. As everyone indicates, the statistics are pretty shocking. But it also seems to me that people commenting here (and teachers generally, much more so) are way too willing to accept VCAA’s ridiculous games. Yes, you gotta play the ridiculous games, but that doesn’t mean you have to accept them.

    There is nothing intrinsically wrong with having a difficult exam. My memory is that in the old pass-or-fail days of Pure and Applied it was pretty standard to have a raw score in the low 30s constitute a pass. One can argue that such a raw score was too low, but it takes an argument, and I would argue against it.

    But let’s consider the Specialist Exam 1. Why are students doing so poorly?

    1) Are the Specialist exam 1 questions difficult?

    The answer is a clear “no”. The majority of the exam is absolutely routine, and nothing is far from routine. The Reports make clear that elementary arithmetic and algebraic skills, as well as many conceptual basics, are phenomenally weak. This is the inevitable poisoning effect of CAS, which has destroyed the curriculum and has destroyed an entire generation of teachers, who now ignorantly feed this poison to their students.

    But I think the obviously perverse aspect of CAS, and the stunning idiocy of its promoters, is just one part of the problem. I think there is a separate question:

    2) Are the Specialist 1 exams difficult?

    Here, I think the answer is “yes”, but for two horribly wrong reasons.

    The first problem with the ONE HOUR exam is that it is an unforgiving 100 metre sprint, with some hurdles. The student has to race through a *lot* of stuff, and one small stumble means they can crash into a hurdle, are off balance for the next hurdle, and can very easily fall into a heap in the middle of the track. To impose a high-stakes ONE HOUR exam is simply insane and simply evil.

    The second problem is the obsessive-compulsive grading, which of course causes teachers and students to go nuts. It is very easy for a student to know very well what they are doing but to receive 1/2 or 2/4 or whatnot. In America this is referred to as being nickel-and-dimed, and if the exam structure is too bitsy, with the consequent grading too fussy and precious, it can very easily sum to a large amount. In this aspect, the VCAA exams are batshit crazy.

    So, the commenters above want more transparency and clarity in the grading of the exams, and this makes sense. But it ignores the fundamental problems. The fundamental problems, separate from the CAS poison, are that the exams are appalling and the grading is appalling. You may ask for more clarity, but more clarity, even if possible, won’t have any effect whatsoever on the intrinsic awfulness.

    1. A delayed reply, due to the always inflated workload, but some thoughts on your comments, Marty:

      1. The errors in basic skills and knowledge displayed on Exam 1 could be due to over-reliance on CAS during instruction and practice. But I think there are also a couple of other major culprits. One is the problem you mention in your second point – the lack of time students have to complete the exam. A second is the relatively short amount of time students have to learn the curriculum throughout the year. Much of instructional time is spent shovelling new content down students’ throats, and there is insufficient opportunity to consolidate skills and address deficits.

      2. The OCD grading is ridiculous. I understand the point of rewarding students who take the time and effort to write their responses to a high standard, but there are other, more authentic ways of discriminating amongst students of different strengths. At least one of my students has taken to social media to complain about my overly pedantic marking. There is no ill feeling, and my students understand that I am trying to encourage habits that will promote success on the VCAA exams. But it is a frustrating and undignified experience for teacher and student.

      1. Thanks, SRK. A few comments in reply.

        You are obviously correct that the students are too rushed in Exam 1, and that *everyone* is too rushed throughout the year. No question that both have an effect on solidifying and then demonstrating the basics. But it is more than that, and obviously so.

        CAS aside, I think the underlying theme of all this is the curriculum (both VCE and pre-VCE) being an inch deep and a mile wide. It means that nothing gets reinforced properly. Natural connections between topics, which require and motivate this reinforcement, never get taught. It also means that nothing can be examined deeply, because there is no “deep” to examine.

        As for the grading, it is partially explained by the shallow questions necessitated by the shallow curriculum, but I think it is more than that. A telltale sign of weak mathematical thought is to conflate pedantry with proof. The examiners flash such signs like lighthouse beacons.

  7. Irrespective of the VCAA exams, the fact remains that a significant minority of Specialist Maths students (and the mind boggles at what the proportion of Maths Methods students would be) are failing simple diagnostic quizzes that universities are running. This brings us back to the tricky, tricky questions you posed:

    Is this acceptable? If not, what or whom are we to blame? Will the outcome of the current VCAA review improve things, or will it make matters worse?

    My question: What can be done to change this?

      1. I wonder what would happen if universities put integrity ahead of money and re-introduced pre-requisite subjects and higher cut-offs on minimum requirements. Or maybe just set their own entrance exam (which is what the old Form 6 essentially was).

        If the value of the VCE Certificate was reduced depending on how it was used, perhaps the worth of it would be forced to improve in a practical, measurable and transparent way.
