So far, there have been about a million columns written on ChatGPT, the AI chatbot that was launched a couple of months ago. About half a million of these columns have been devoted to predictions on ChatGPT’s implications for education, both in schools and universities. Many of the columns have been fearful, but a few are bright-eyed, talking up the Brave New Possibilities that ChatGPT will offer, and will demand.
A notable entry in the latter category is a recent SMH opinion piece, with the pithy title Schools right now face a choice – fight the wave of ChatGPT, or surf it. We’re not sure how one might fight a wave, but you get the idea. The piece was written by Adam Voigt, who is a former principal of somewhere, is CEO of something called Real Schools, and is “education expert for The Project, 2GB, 3AW & The Age”. We do not know where and how Voigt developed his expertise. He smiles a lot.
At the outset, Voigt reassures the reader that “ChatGPT … is no storm” (presumably because ChatGPT is a wave), and is not “an existential threat to teachers”. But/And Voigt assures us that ChatGPT can bring great and good change:
[ChatGPT] emerges at a time when literacy and numeracy results are deteriorating in Australian schools despite increased funding and teachers working longer hours. And it could be the greatest opportunity to land at our feet in recent history if we can emerge from below the decks and adjust the sails of our schools accordingly. ChatGPT will not be ignored. It’s not a technological fad …
It would seem a little tricky to adjust the sails for surfing a wave that has landed at our feet, but let’s continue:
[ChatGPT] will fundamentally change the way we educate, and its first impost will be on the way we assess … Any assessment that a student completes independently at home is now going to be compromised.
But wasn’t this always true? Well,
Of course, it always was.
Huh. OK, so then what’s the big deal?
But finally, we may have crossed a point of no return, unable to justify trudging the tired path of delivering a lecture at school, setting a task, collecting the task, marking the task and returning a grade that has so slowed our educational progress.
It is a good time to stop, take a breath, and ask what the hell Voigt is talking about.
Nothing indicates what must now be fundamentally different for school assessment, or why the post-ChatGPT “point of no return” might be inaccessibly far from the pre-ChatGPT point. It seems not to have occurred to Voigt that the setting of too few tasks may have “slowed our educational progress”. As for the “tired path” that begins with “delivering a lecture”, it is not a school path with which we are familiar (nor one to which we would necessarily object).
What does any of this have to do with addressing “literacy and numeracy results”? Voigt never says. Voigt imagines ChatGPT will build upon current “innovative models for learning”, but there is nothing of substance beyond a reference to the Khan Academy, which apparently gave birth to “flipped learning”. Well, Voigt acknowledges there are “valid criticisms” of Khan, but he imagines that ChatGPT will result in super-duper versions of an imagined Khan-done-right. Lennon would be pleased with all this imagining.
Voigt is unwaveringly positive, and he makes clear that we have no choice:
ChatGPT won’t permit the same return to these fabled good old days.
As opposed to a different return.
As scary as it may sound, our schools should resist any urge to merely weather the ChatGPT storm and instead embrace its immense possibilities.
Ah. So ChatGPT turns out to be a storm after all, which should be embraced. Let’s review …
God knows how ChatGPT will change education, and certainly we don’t. Such programs appear to be a genuine issue for assessment in universities, but it is unclear to us how or why ChatGPT and his mates must or should change the fundamental nature of university teaching.
As for school education, will ChatGPT change things? Probably, since the educational world is forever leaping from one techno-saviour to the next. But what good might ChatGPT bring to school assessment or, more critically, school teaching? We don’t see it. Articles such as Voigt’s do nothing but suggest to us we’re in for a fresh educational hell.
I’ve seen a few people on social media report how they got ChatGPT to make lesson plans in seconds; or worksheets and assignments in seconds. It scares me a little. I’m pretty slow relative to other teachers already. It might raise the bar for what is expected in terms of teacher efficiency to something completely unachievable for me.
Quality beats speed if your boss is not a twat.
If ChatGPT can create good lesson plans or at least good drafts, then great. Maybe it’ll be a good thing. That would still have no bearing on what should be taught, or how, or how it should be assessed.
Is Adam Voigt our own Jo Boaler? (Except the maths bit, of course.) I have heard plenty of educational experts over the years, all of whom seem to “state the bleeding obvious”. I’ve not got a lot out of them. Nobody can possibly guess the influence of ChatGPT; like anything new in the educational world it will have its fierce supporters and its moaning detractors. (“This will be the end … “) My own limited experience of ChatGPT is that it can be quite fun to use. I asked it to give me a proof of Pythagoras’ theorem, which it did – lengthy and verbose, as it is purely text – but as far as I could determine it was correct. Anyway, as we know from the work done by Marty and his correspondents, the entire VCAA/ACARA proposal is a heap of rubbish – at least the various mathematics syllabi. Throwing ChatGPT into the mix can hardly make it worse.
I wonder how a curriculum created by ChatGPT would compare with the latest one created by ACARA. Now there’s a PhD in Education waiting to be claimed.
That’s a hilarious suggestion.
> My own limited experience of ChatGPT is that it can be quite fun to use.
> I asked it to give me a proof of Pythagoras’ theorem, which it did – lengthy
> and verbose, as it is purely text – but as far as I could determine it was correct.
I asked ChatGPT to answer one of the questions on one of my geometric transformations worksheets: to tell me the analytic formula of the affine transformation of the plane sending the point (-1,0) to (3,-1), the point (0,3) to (0,-3), and the point (3,2) to (1,1). Its answer was text interspersed with LaTeX code (the code was slightly incorrect since in some places it used a single backslash instead of two), but it correctly told me that the general formula for an affine transformation of the plane was T(x, y) = (ax + by + e, cx + dy + f), and that we needed to set up a linear system with the given values and solve it for a to f. However, it incorrectly set up the linear system, and then incorrectly solved the system it found. So what I assume is that my problem looks similar enough to other problems in its training set that it can tell me how to proceed (actually writing LaTeX code without being prompted was a nice touch), but it doesn’t know how to do computations.
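For the record, writing the general formula as T(x, y) = (ax + by + e, cx + dy + f) (the labels a to f are just one common convention, not necessarily the ones ChatGPT used), the three point conditions give the linear system

\[
\begin{aligned}
-a + e &= 3, & 3b + e &= 0, & 3a + 2b + e &= 1,\\
-c + f &= -1, & 3d + f &= -3, & 3c + 2d + f &= 1,
\end{aligned}
\]

which, worked by hand and so worth double-checking, solves to a = 0, b = -1, e = 3, c = 1, d = -1, f = 0. That is,

\[ T(x, y) = (3 - y,\ x - y), \]

and one can check directly that this sends (-1,0) to (3,-1), (0,3) to (0,-3) and (3,2) to (1,1).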
It’s very interesting to say the least, considering this is very strongly related to my future career, and that I’ve watched this technology get progressively (and drastically!) better over the last few years. But I genuinely don’t know what universities can do about ChatGPT. The faculty of IT at my university has been gutting exams across their subjects, and now this LLM (Large Language Model), which is extraordinarily capable of ripping through programming assessments, turns up. What a predicament… (and quite ironic).
For what it’s worth, ChatGPT is pretty lousy at mathematics (especially arithmetic), so I have no clue how these AI-progressive people intend to use the chatbot, or the underlying model (GPT-3), to help with “numeracy”. It also must be said that these weird black boxes can be misleading: their instructions seem coherent but may require double-checking, since they make up references, misinterpret instructions and so forth. I’ve only used it for assistance in generating ideas and writing code, not as a primary crutch. It’s pretty good if you’re not handing it the wheel of whatever you’re doing, but I wouldn’t ask ChatGPT to educate me on something; it’s not designed to do that.
I’d love for ChatGPT to herald a new age. A new age of oral exams, live assessment, written and performed. Let’s make it 70/30.
I once asked an expert in robotics, whether robots could replace teachers. He replied “Yes – but it won’t happen quickly.”
Was he also an expert in teaching? If not, why the hell should anyone care what he says about the application of robotics to teaching?
Was he an expert in teaching? He has been teaching in universities for many years.
In other words, no.
Re Voigt’s comment on “the tired path of delivering a lecture at school, setting a task, collecting the task, marking the task and returning a grade that has so slowed our educational progress”.
Let me take issue with the sentiments conveyed by a “tired path”.
I don’t deliver lectures at school. Presenting a successful lesson is a wonderful experience; presenting a dreadful lesson is an opportunity for me to learn. Sometimes, I will repeat the lesson in an improved version.
I particularly enjoy setting a task. Although this takes me a great deal of time, I see it as an opportunity for creativity.
Collecting the task is not a burden at all. Students hand it to me, or submit it online.
Marking the task is an opportunity for me to listen to my students. Through their submissions, my students are talking to me, and I should take the time to listen. (Of course, there is nothing to hear when their answer is only (C) – but I won’t go there again for now.)
Returning the grade is not a burden at all. I record grades (with brief comments) online so that students and their parents can see the results, and hand the marked work back to the students.
There is no aspect of my work as a teacher that is tiresome. (This is not an exaggeration.)
Yes. It’s a ridiculous line. A straw school.
My colleagues were playing around and got it to write an open-ended statistics question (with supplied data) for a General Maths SAC that they said wasn’t bad… can’t be worse than most SAC questions! I asked for a VCE Specialist Maths Complex Numbers SAC question and got ‘find the modulus and argument of 3+4i’, which is hardly helpful.

I went to an interesting (for the wrong reasons) PD once by a tech utopian who was talking about how education has hardly changed in 100 years while every other profession has. He said it like it was our fault somehow for dragging our heels, but actually I’m not sure much improves on being in a room with someone who knows something about a topic you are interested in and having them explain it to you. I could certainly swap my whiteboard marker for chalk (or slate?!) and teach 100 years ago!

My pet peeve is not Wolfram Alpha and the programs that ‘do maths’ for you, but the Maths Pathways model, which promises to find student misunderstanding and fix it so students can move on, all doing their own work at their own pace on different topics, monitored by a computer algorithm. The idea that struggling students will watch a video and learn a concept independently is not reality – a large part of teaching is motivating students and bringing them along with you, particularly junior and struggling students. My students could watch Khan Academy or Woo Tube or any number of online resources explaining the maths we are doing, but they largely don’t, and the weaker students are never going to.
Thanks, Amber. That’s interesting about the attempt at the GM SAC. As I replied to wst, if ChatGPT can easily produce workable drafts, that’s great. I wonder if (and suspect that) the drafts may be samey, and in some undesirable manner, but perhaps that’s just my glass-half-empty demeanour.
Of course, I agree entirely with your general thoughts about “tech” in the classroom. All a teacher needs is desks facing the front and a large surface upon which to write, and anything extra will almost inevitably make it worse. Maths Pathways is appalling.
Having “desks facing the front” can be a challenge. Many teachers in other disciplines, but also in mathematics, do not like this arrangement. To have desks facing the front may involve moving all the furniture at the beginning of the lesson and then putting it all back at the end of the lesson. I have noticed that in new buildings in universities there may be screens on every wall so that the students need not face the front: it is assumed that all lecturers will use PowerPoint for all classes. And I have attended many conferences held in hotels and other function centres where all the tables are round tables to promote discussion – but this means that a good fraction of the delegates cannot face the speaker.
Terry, what exactly is the “challenge” here? It is that many of your teaching colleagues, and the vast majority of university administrators, have no damn clue how to teach, or what teaching is, or what a teacher is. Screw ’em. Move the goddam tables to face the board, and if the “teachers” want them moved back, they can move them back.
Don’t conflate majority opinion with sanity.
In many institutions, it is expected that the furniture in a classroom at the end of the lesson will conform to some prescribed pattern. Similarly teachers should clean the board at the end of the lesson.
Fine. Then start a fight. But the fundamental point here is that the “challenge” is trying to teach in institutions with no interest in or understanding of teaching. The challenge is staying sane in a sea of madness.
Terry does have a genuine point. Some schools now have policies about how the furniture should be arranged in classrooms. It is normally in “groups” of 4 to 6 desks, none of which face the front.
The fact that this is a school policy means that any teacher who rearranges the desks to face the front is in breach of school policy. A fight with colleagues is one thing. A fight with school management is quite a different beast.
OK, that’s different. Terry began this by referring to what teachers with tiny little minds “like”. The fact that schools with tiny little minds have this as a policy is different, and much worse. In which case, of course, the only course of action is to tell the Principal that their school is out of its tiny little mind.
Interesting to read today that ChatGPT has been “banned” by the education department in Victoria – because that will definitely stop people using it. More bizarrely, it’s been “banned” for both students and teachers when teachers are the ones that should be playing with it to figure out what it can do and how it will impact – both negatively and positively – our lessons, planning and assessment. With more playing – it isn’t great at maths. I asked for worked examples of proof by induction and got one with the wrong formula and an algebraic error made in ‘proving’ the wrong formula. But when I told it there was an error it apologised and gave me the correct proof.
Interesting, and weird. Thanks, Amber. Do you have a link for the VicEd commandment?
I read it in The Age: https://www.theage.com.au/national/victoria/ai-tool-banned-in-victorian-schools-as-implications-examined-20230201-p5ch8h.html
Huh. Presumably there’s an announcement on VCAA’s website somewhere. I’ll try to find time to look.
OK, I read the article. I don’t think it is correct to call this a “ban” (which makes the headline idiotic). Blocking CGPT from the government servers is not nearly the same thing as forbidding its use, either by students or by teachers.
ChatGPT: Humanities and social sciences = CAS calculators: Mathematics
Nonsense. Don’t be lazy.
I think one difference is that people are accustomed to looking at the output from AI language models suspiciously. Sometimes the results are funny, and are parodied with fake-AI texts written as comically plausible outputs from chatbots or automatic translators. People expect to have to edit. On the other hand, I think calculators may have more authority and people trust their output a lot more. They can be used to stop thinking.