Friday, 22 November 2024

Generative AI Pushes Outcome Over Process (And This Is Why I Hate It)

I really hate generative AI, because there are many reasons to hate it. Its abilities depend on stolen data; it uses so much electricity it's interfering with climate goals and prompted the restart of Three Mile Island; it's riddled with all the biases you'd expect; and the sales pitch from companies with billions of dollars on the line has smelled of snake oil from day one. 

I'm also mad at it for professional reasons. 

First, I am an ecological psychologist, so I have things to say about whether these systems can be intelligent. They can't: they are made only of language, and intelligence is made of much more than that. Their basic trick (unreflective extrapolation of the next most likely thing to say) isn't even how language works, let alone intelligence. 

But second, for the purposes of this post, I am mostly mad about it as an educator. My students are deluged with AI tools. Grammarly relies on it; Adobe keeps trying to summarise pdfs for them; and ChatGPT promises to help them write better essays. My students are busy and stressed: they're often working jobs to support themselves, and their courses ask a lot of them too. We know they take shortcuts, and AI is the most powerful one they've ever had access to. Universities are busy drafting and redrafting policy documents about fair use of AI, because we have no way to enforce a ban on its use, but even these documents accept the flawed premise at the heart of the promises these technologies make. 

The flawed premise is this: AI technology is built on the idea that the important part of creating things is the outcome, not the process. Can't draw? That shouldn't stop you from making a picture. Worried about your writing? Why should that stop you from handing in a coherent essay? The ads for AI all promise that you'll be able to produce things without all the tedious work of actually producing them - isn't that great? 

Well no, it's not - it's terrible. It betrays a fundamental misunderstanding of why creating things has value. It's terrible in general, but I am especially offended by this idea in the context of education, and in this post I want to lay this idea out in a little detail. 

I really encountered AI in education for the first time last year. I teach on two Masters level Cognitive Psychology classes and both have written assessments (essays, lab reports, etc). All written work is submitted to us via TurnItIn, the plagiarism detection system, and early in 2024 TurnItIn started reporting an AI score along with a plagiarism score. They use an AI system trained on work submitted to TurnItIn prior to the arrival of AI, alongside work generated by AI, so it can learn to spot the differences. Whether this technology can actually reliably detect AI-generated text remains an open question; but once we had the score, we had to address it with our students - submitting text you did not generate and claiming it as your own work is a textbook violation of academic integrity rules. I had to attend probably 30 meetings with students about this, which is an insanely high number. Mostly it was not malicious; students were over-relying on Grammarly suggestions, or not paraphrasing ChatGPT output appropriately. But it still highlighted a problem: the students were all just trying to produce a piece of work they could submit, rather than engaging with the work of producing it, and AI was letting them do this because that is what it's for. 

Part of the problem is the broader higher education context that incentivises students to value the qualification, not the education, and that needs fixing as well. But there are a lot of tools that students can use that don't accept this framing, and my colleagues and I certainly don't accept it. The reason AI offends me so much is that it fully accepts this framing, and worse, it automates it!

Take, for example, the most commonly mentioned legitimate use case for AI: writing that first draft, or generating a few ideas to get you moving. But this highlights the problem: drafting is actually a key part of the process of producing work that reflects (and improves!) your understanding of the topic. Everyone who actually creates things (scientists writing papers or designing experiments, artists making music or pictures) understands this. It's ok that it's hard! It's ok that it doesn't work the first time! We all want our students to understand this too, but the existence of AI makes that discussion so much harder to have, and I'm mad because this was already a challenge for us.

One reason this issue resonates so hard with me is that I am fully steeped in the ecological approach to answering questions, which at its heart says 'verb your nouns'. Cognition is a thing we do, not a thing we have, and learning is deeply shaped by how we get to where we are going. Process matters so much more than outcome. I'm sure there are many other pathways through educational and learning theories that lead to this idea, but the ecological approach provides deep insight here, and when it's applied to skill acquisition of any kind the primacy of process just cannot be escaped. 

So fuck you, AI. You embody a deeply flawed approach to human activity and what gives it value, and you are hurting my students. I'll keep fighting you off, but I really wish you weren't around. 

2 comments:

  1. When my students started taking exams remotely during COVID, the issue of access to the internet and to each other became important. Proctoring systems plus declarations were instigated. Realising that these measures were futile, I set exams and assignments on the basis that, whatever the rules stipulated, I had to assume the students would access the internet and each other. This required setting questions based on assignments and material they had turned in, so that each student was effectively answering a slightly different question or applying a task to different data. Not perfect, but it helped.

    Conclusion - we need to change the way we teach and assess to enable tools like AI and the internet to be part of the learning - not a shortcut for circumventing it.
