A few days ago Greg Ashman handballed an article to us, suggesting we might enjoy it, although clearly he meant “enjoy” it. The (paywalled) article, just published in the journal Research in Mathematics Education, is titled
Intersectional feminism to reenvision mathematical literacies & precarity
Yeah, you don’t have to read the article. We did.
We don’t even know what to do with this one. Since it is paywalled, most readers will be unable to (attempt to) read the article. And, posting a few highlight excerpts misses the point; doing so could not convey the endless swampiness of the twenty-five pages of self-indulgent nonsense. So, we’ll give up. We’ll post the abstract, make one quick comment explaining the purpose of the authors’ diagram above, and be done with it.
Here is the abstract:
Current global crises (e.g. COVID-19 pandemic and climate change) necessitate changes to mathematics curricula, especially related to using mathematics to solve real-world problems. We begin with the Programme for International Student Assessment’s (PISA) framework for mathematical literacy (FML), since it functions as a global guide for curriculum. We demonstrate its inadequacy to solve current crises and to mediate the precarity of girls and women. Then we reenvision the FML by integrating concepts of critical mathematics education with intersectional feminism. We reenvision how to think about mathematical literacies. In particular, we add practices of feeling, acting, and reimagining to the conventional construct of mathematical reasoning. We reenvision ways to think about or classify real-world problem contexts by exploring three potential themes for real-world problem contexts.
In summary, the authors have attempted to employ an anybody-but-straight-white-males brand of feminism to make PISA’s flaky twaddle even flakier and twaddlier. Given the above diagram is the authors’ rejigging of PISA’s mathematical literacy framework, it is clear that the authors have succeeded admirably.
It’s a very lucky thing that Research in Mathematics Education makes sure that “All research articles published in this journal have undergone rigorous peer review, based on initial editor screening and anonymized refereeing by at least two anonymous referees.”
Who knows what self-indulgent and nonsensical swill would otherwise get published.
“The journal welcomes high-quality research in any methodological tradition and is open to innovative and unusual approaches.”
which is clearly why this paper was accepted.
Here is my brief review:
I cannot recommend this paper too highly, nor say enough good things about it. There is no other paper against which I can adequately compare it. The authors have clearly demonstrated their complete capabilities. The amount of material the authors know will surprise you.
On a completely unrelated note, it’s a pity that the aphorism isn’t, at least on occasion, “publish and perish”.
Just when you think the educationators (a nonsense term, but a succinct one) can’t go any lower, they somehow find a way to do it. You have to admire the level that can be sunk to. It’s like we’re all aboard a mathematical bathysphere, sixteen kilometres down, only to discover there are further depths. Depths we never wanted to visit, and yet here we are.
It makes one wonder how far anybody would get putting together an “article” on the possible benefits of hypermasculinity in mathematics: that, with educational frameworks increasing men’s mathematical ability, we would save women from difficult and challenging problems, thus freeing them for caregiving roles and interpersonal relationships.
Fact check #1: “Current global crises (e.g. COVID-19 pandemic and climate change) necessitate changes to mathematics curricula, especially related to using mathematics to solve real-world problems.” Verdict: irrelevant at best, likely to be plain wrong.
Fact check #2: “…it [PISA] functions as a global guide for curriculum.” Verdict: where and how, exactly? Dubious at best.
Thank goodness PISA is only every three years (unlike NAPLAN).
PISA and curriculum. How do I measure my teaching? By comparing students’ results with some standard. How well did my class go on the same test as another class? Or compared to other years’ classes on the exam? Or by how much they have improved on the end-of-year PAT-M?
What is the standard that the country measures against? Is it PISA? If it is, then we will modify the curriculum to try to get better results on that test.
Indeed, that is what has just been done, although in such a clueless manner that it will probably make the (meaningless) results worse.
If PISA was done by every student in every school then there is a chance I might agree in part with this.
PISA is a sample, and quite a small one, of 15-year-old (or is it 16?) students, taken every three years.
Arguing that PISA measures improvement is a bit of a stretch…
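For what it’s worth, here is a minimal Python sketch of the sampling point. The sample size and the score scale are invented for illustration; they are not PISA’s actual figures or methodology. The point is only that two testing cycles drawn from a completely unchanged population will still report different means.

```python
# A back-of-the-envelope sketch of sampling noise in cycle-to-cycle
# comparisons. All figures below are illustrative assumptions, NOT
# PISA's actual sample sizes, scales or methodology.
import random

random.seed(1)

SAMPLE_SIZE = 5_000            # hypothetical number of students tested
TRUE_MEAN, TRUE_SD = 500, 100  # PISA-style scale; ability assumed unchanged

def cycle_mean():
    """Sample mean for one testing cycle, drawn from the same population."""
    scores = [random.gauss(TRUE_MEAN, TRUE_SD) for _ in range(SAMPLE_SIZE)]
    return sum(scores) / len(scores)

mean_2019 = cycle_mean()
mean_2022 = cycle_mean()
print(f"2019 sample mean: {mean_2019:.1f}")
print(f"2022 sample mean: {mean_2022:.1f}")
print(f"apparent 'improvement': {mean_2022 - mean_2019:+.1f} points (pure noise)")
```

On this toy setup, the apparent shift comes out at a couple of points either way, which is exactly the sort of wiggle that gets reported as progress or decline.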
If PISA were done …
That one always trips me up.
VCAA forgives me though, because my students somehow always remember the +c on their integrals.
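For the uninitiated, the +c is the constant of integration, as in

\[ \int x^2\,dx = \frac{x^3}{3} + c. \]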
This reminded me, in an oblique way, of something I read recently. It was in a book edited by Fried and Dreyfus, called Mathematics & Mathematics Education: Searching for Common Ground. I think it’s possible you might actually enjoy “Chapter 3: Some of My Pet-Peeves with Mathematics Education” by Theodore Eisenberg.
“And what’s the problem with us sitting inside education departments? Well, most faculty members in education departments are too far away from what our intellectual interests should be–and that is the teaching and learning of mathematics. For example, I know of a department of education in which most on staff have trained themselves to not even see tables and numbers and statistical tests in research journal papers. They simply don’t see them–their eyes jump right over them–and they pooh-pooh the use of statistics and quantitative data.” (p.39)
Anyway, this struck me because, when my attention span is short, I have the opposite problem: I skip the paragraphs and settle on anything that looks like an equation, a table, or a figure, and then read back. So (I confess) I tried to read the article and made it to the end without taking in anything. I can’t judge it; it just seems like an example of the different approaches to the world that are hard to bridge.
wst, I think your two final sentences contradict each other.
Re: Some people in education faculties “pooh-pooh the use of statistics and quantitative data”.
I am sceptical of the ways in which data are used in education circles. There is a great deal of emphasis on using data in schools to make decisions about the learning of our students. And while I agree with this in principle, it is terribly difficult to find good data on which to base a decision. Gonski (2018, p. x) recommends that our number one priority in education should be to “Deliver at least one year’s growth in learning for every student every year”. But how do we measure this growth? NAPLAN? On-demand testing?
Here is an example. This year, for the first time, NAPLAN was conducted entirely online across the nation. Yet I expect that, when the results from 2022 are made public, many will want to compare them with the results of previous years. I cannot see how this can be justified.
I agree with you when it comes to using data too much in schools. For example, when decisions about how to teach are dictated by the need to create ‘evidence’, I find that quite dreary. The chapter was about mathematics education research though, where I suspect it may be more useful to create and consider data sometimes.
I interpreted the quote as describing a general aversion to mathematics and statistics: even if a person doesn’t use or collect quantitative data in their own research, it seems reasonable that they shouldn’t skip reading the parts of articles where data are discussed.
Unless the article is written by VTAC.
What I find funny and pathetic about schools talking about the data and “evidence-based” policy etc. is that:
1) The data is often cherry-picked. People see what they want to see. “There are three kinds of lies: Lies, Damned Lies, and Statistics”.
2) The so-called evidence either doesn’t actually exist or has been misinterpreted. Typically the ‘evidence’ comes from primary school research, but this doesn’t stop secondary schools from making these “evidence-based” claims.
School leadership loves the bright shiny bullshit spewed out by these self-proclaimed educational ‘experts’ and typically tries to justify forcing the bullshit down everyone else’s throat via one or both of the above. I would argue that one way of retaining teachers would be to get rid of all the bullshit that school leadership is constantly shoving down our throats in the name of quality teaching.
And when you ask to see the evidence that bullshit X has improved learning outcomes, you’re met with a bunny-in-the-headlights response. I’ve seen bullshit come and bullshit go so many times; the number of man-hours wasted on bullshit is staggering. And it all starts with this sort of self-indulgent, unintentionally satirical bullshit, and with the stupidity of teachers in swallowing/tolerating/supporting it. Teachers are stupid and apathetic (the 63% who voted in favour of the current Victorian EBA is proof positive of this contention).
To be fair, the intersectional feminism paper suffers from none of these deficits.
Which is probably the most damning indictment of all!! To paraphrase Pauli, it’s so bad it’s not even bad.
Re: “Typically the ‘evidence’ comes from primary school research, but this doesn’t stop secondary schools from making these “evidence-based” claims.”
It is often the case that research conducted in one context is applied in different contexts. Many years ago, a manufacturer of toothpaste was boasting about the benefits of the fluoride it put into its product: “University research shows …”, the ad said. I wrote and asked for the evidence. The manufacturer sent me several published papers on the matter. It turned out that they were all based on experiments with primary school students in South Australia. Yet the findings were readily accepted as applying to a much broader population.
That article reads to me like unintended self-satire.
That hits the nail on the head, Glen! It’s the Spinal Tap of educational research.