Image from Public Domain Review
I’m reading a few popular history of chemistry books at the moment (notably Mendeleyev’s Dream and Napoleon’s Buttons). One theme is how the history of chemistry was plagued by the completely bogus notion of alchemy. The idea that base metals could be transmuted into gold dominated any dabblings in chemistry for centuries, and kept reappearing in different cultures and at different times. “This has to be possible, right?” was the persistent motivation. The dogged pursuit of alchemy was characterised by the following:
- Greed – unlimited wealth awaits!
- Obfuscation – it persists through rumour and secret formulas, adding to its allure. The process is never made public.
- Magical lexicon – this obfuscation works not only by being secretive but by creating a language that is difficult to penetrate
- Vagueness – although the ultimate aim (Gold!) is clear, there is a lot of vagueness otherwise about benefits (immortality, spiritual awakening, general goodness)
- Occasional unexpected side benefits – almost inevitably, given the time devoted to it, there would be the odd chemical breakthrough which occurred as a side benefit of alchemy, for example, the discovery of phosphorus.
- Persistence despite results – obviously no-one ever got alchemy to work, but this complete lack of success was only seen as reason to continue. It had to be true, dammit! Hundreds of years, and some of the best historical minds (hello Isaac Newton) were involved in this fruitless pursuit.
Now, is it just because I have a particular mindset, or does this set of characteristics sound familiar? As I was reading it I kept thinking of the ed tech equivalents of alchemy. Goals that are pursued at different times, in different guises and never actually realised. I would suggest the “gold from base metal” dream of ed tech is automated, personalised tuition across all subjects – essentially the removal of the human educator. We’ve seen this with industrial systems, early AI, MOOCs, and now new improved AI. I think it matches all of the characteristics of alchemy I’ve given above. We do get breakthroughs, and automatic tuition and assessment is possible at a fairly simple level. Let’s consider the similarities with my alchemy list:
- This offers vast riches for the discoverer, who can sell the product at great cost, because it will still be cheaper for providers than employing people. The education market is estimated at $6 trillion annually. That’s pretty much turning base metal into gold.
- It is frequently obfuscated by commercial interests with black box algorithms. They only report questionable results which are difficult to verify because we don’t know the underlying transformations.
- It has its own lexicon of algorithms, learning analytics, and intelligent systems that increasingly comes to look like magic.
- There is often a vagueness around improved efficiency, retention, learner satisfaction, democratisation of learning, etc. All of these are actually worthwhile goals to pursue in their own right, but there is a magpie tendency to grab the latest concern and say “yes, we can help with that too. Plus, did we mention we can turn lead into gold?”
- Accidental side benefits – this intensive work with algorithms and data does have some benefits, for instance learning analytics that help educators improve their course design. But this isn’t the real goal.
- Persistence – Audrey Watters has talked of “zombie ideas” in ed tech that just refuse to die. Automated tuition is certainly one of these: no matter how small the gains, there is always the sense that it is ‘just about to happen’.
Just to be clear – I am NOT saying ed tech is rubbish. I love ed tech. It has provided genuinely new ways of teaching and reaching different audiences. It can solve very specific issues and offer lots of benefits for learners. My objection here is to the overblown claims, and the often unspoken alchemic tradition that persists in ed tech. The way to combat this is openness (of data, algorithms, claims, results), focusing on very specific problems to address (instead of grand revolutions) and calling bullshit when we see it.
Like alchemy, I fear we will waste time, effort, money and good minds on the pursuit of a really big, unattainable goal instead of focusing on smaller, actually achievable ones. What if we say we don’t think you can, or want to, remove the human educator from the education process? If we accept that as a premise, then what can we go on to do with ed tech? Just as with alchemy: once they stopped trying to produce gold, they went on to discover elements, invent medicines, and create all manner of new materials used in the objects we handle every day, and so on (and yes, quite a few weapons along the way, but that’s another post).
I think I’ve heard others talk about the analogy of alchemy in ed tech. Certainly Audrey Watters has mentioned it a couple of times. But I can’t find anything in detail. My apologies if I’m actually just regurgitating something I heard once and have now mistaken for an original idea (it happens).
In the comments Mark Brandon reminded me of this Blackadder scene. I’d meant to include it originally as it kept coming to mind when I was reading, but then forgot. It pretty much is the perfect analogy, thanks Mark: