
When is widening participation not widening participation?

Facepalm

The Higher Education Policy Institute released a study today that ranked universities by their widening participation stats. You’d expect Russell Group unis to do poorly on this, but you’d bet that the Open University, a provider set up specifically to address WP, would do well, right? Except they didn’t include it. I got into an exchange on this (HEPI Twitter is feisty!), where they defended their methodology. But the defence was itself revealing; they replied to my criticism of the OU’s exclusion saying:

“To be clear, there is not a valid way of including it in this study as Polar focuses on young people, the data was sourced from UCAS etc etc etc. Much dodgier to wrench an institution in just because we think it might do well. You’ll find lots about the OU on our site elsewhere.”

POLAR is a flawed measure of WP, particularly if you want to capture mature students. The TEF recognised this by including IMD data this year (IMD has its own issues, particularly in inner-city areas where a single postcode can cover widely varying incomes, hence they include both measures). The message from HEPI seems to be that it’s your fault if you don’t fit their methodology. But inherent in that methodology are assumptions that, I think, undermine the very point of the study (I should note that HEPI strongly disagree with me, saying that this was not the intention of the study).

This report focuses on traditional universities (Birkbeck is similarly noticeable by its absence) and traditional students (young, campus-based). If your aim is to argue that widening participation is an important metric (they are, in effect, promoting a WP league table), then that message is entirely undermined if your definition of WP is, ironically, too narrow. A study that showed how providers who focus on WP perform would be more powerful. This one seems designed to get headlines (it succeeded in that) rather than to make a valuable contribution to the WP agenda. If your methodology excludes institutions that everyone thinks should be included, maybe it’s worth looking at that methodology? That’s what I’d be telling a PhD student embarking on such a study. The report is titled “Benchmarking Widening Participation”, so the intention is for it to become a useful metric, and if so, the exclusion of widening participation institutions from the outset is not just annoying, it’s potentially damaging.

[UPDATE – I extended this post for an article in Wonkhe]

3 Comments

  • Sarah Lambert

    Oh dear, sounds like they’ve done the easy study, measuring that which is easy to measure. Rather than the harder study which, as you say, might actually measure WP. But having said that, maybe what they’ve done is measure the extent that unis replicate existing advantage in the way they take school leavers from higher vs lower socioeconomic schools and areas. Very roughly. And I do like that you can see who from the mainstream unis are doing some heavy lifting vs those who really are part of the problem, not part of the solution. Normally we see only national aggregates re different equity groups’ participation, and you get no sense of where progress is being made. Which feels rather disempowering. Maybe while this is imperfect, it’s still something.

    • mweller

      Hi Sarah, I wrote a longer response in Wonkhe. The problem is that many WP providers found themselves having to defend their record yesterday because their whole raison d’être is WP, yet they either fared poorly (because POLAR doesn’t capture their students) or weren’t included. For a study that wants to promote WP as significant, making life more difficult for WP providers is a poor intervention, so I would have preferred it not to be published (or to be published with a better methodology).

