Generative AI & the taste of sweet surrender
I’ve attended a lot of AI talks recently (I mean, even if I tried to avoid them I would still have racked up a few). And here’s my hot take for education – just go for it. I don’t say this as an AI enthusiast, I find it quite boring and kind of soul sucking, but shouting loudly and hoping it will go away isn’t a viable strategy. As I argued in my last post, it has a strong inevitability factor, and lack of engagement risks doing ourselves and our students a disservice.
That’s not to say we shouldn’t fight to make it open, to avoid bias in datasets, and be critical of its output – we absolutely should. But also let’s bring it into the fold in higher ed. It’s a form of knowledge tool after all, why would we be against that?
There is a lot that could be good about it from a student perspective. AI-infused higher ed may be more democratic – everyone has access to advanced smart tutorial bots, not just rich kids (although I’m sure capitalism will find a way to make it exclusive somehow). Gone is the need for expensive textbooks that disadvantage poor students. AI tools can help students in many ways – Mike Sharples has a good table of possible educational uses in this UNESCO publication. These seem like good things for students.
We will need to teach students to critically evaluate and assess the quality of AI content, as it’s going to be everywhere when they leave higher ed. We should also teach them how to make the best use of AI – for instance, skills in prompt engineering will be valuable across all disciplines. Making the campus an AI-free zone is the same as when we used to pretend students wouldn’t access the web. Almost every career or pastime is going to be touched by it in some way, and it’d be doing the dirty on our students to pretend it doesn’t exist, like when we made them pretend they only read things in print and never online.
If your assessment can be passed by AI, then change it; what comes out may be more meaningful for students. AI assistants that can help with navigating the university system will alleviate a lot of stress. Getting students to use AI as a tool might be fun, and it shifts the power dynamic around knowledge. Rethink your curriculum and discipline from an AI-rich perspective. All of this is easy to write, but difficult and costly to do.
Then concentrate on the bits AI doesn’t do well. Focus on the social, the connections, the meta-cognitive skills, the sympathy, care and human aspects of education, which have been eroded by an over-systematisation of higher ed. Ironically, the robots may make it all more human again.
To emphasise: I don’t say engage uncritically – indeed, that criticality and holding to account is a key reason to get involved. Otherwise a version controlled and determined by antisemitic idiots who own large tech corporations will be the only option. But there are real benefits for students to be had, I believe, and we need to ensure these are at the forefront of what universities do with the tech, because otherwise we’ll get sold all sorts of faux solutions to fake problems.
After fretting about it and how it can be stopped, I have now embraced the sweet bliss of surrender. I’m just not going to, you know, do any of the actual AI stuff, because it doesn’t really interest me. But there are plenty of good people out there who are excited by it, so have at it, we need you.
What has Artificial Insemination (AI) got to do with education, especially HE?
Or am I wearing my genes back to front…?
(Answer “offered” by ChatGPT)