What university rankings really tell us
There's a law with league tables that goes something like this: your criticism of them is inversely proportional to your placing in them. If you're in the top 10, it's a marvellous piece of research, but the further down you go, the more suspicious you become. So, when I saw that the OU was ranked 99th in the Times Higher top 100 universities under 50 years old, my critical faculties were engaged. You can probably dismiss all of this as sour grapes, but the table raised a couple of issues for me which reveal quite a lot about the state of higher education (and to be honest, I'm desperate to blog about something other than bloody MOOCs).
The first issue is that, despite all the talk to the contrary, it's still research that counts. The rankings were based on six criteria, five of which relate to research:
- Research: volume, income and reputation (30 per cent)
- Citations: research influence (30 per cent)
- International outlook: people and research (7.5 per cent)
- Industry income: innovation (2.5 per cent).
This may favour smaller institutions, where a couple of big hitters can have an impact. But more significantly, it means only one criterion relates to teaching. That tells you a lot about what's valued in higher education by people in higher education, but I'm not sure the same is true of students. The one remaining factor, then, is:
- Teaching: the learning environment (30 per cent)
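To see how little room this weighting leaves for teaching, the overall score can be sketched as a simple weighted sum. The per-criterion scores below are hypothetical illustrations, not real data for any institution; only the weights come from the list above.

```python
# Weights for the six criteria, as listed above (as fractions of 1.0).
weights = {
    "research": 0.30,
    "citations": 0.30,
    "international_outlook": 0.075,
    "industry_income": 0.025,
    "teaching": 0.30,
}

# The weights should account for the whole ranking.
assert abs(sum(weights.values()) - 1.0) < 1e-9

def overall(scores, weights):
    """Weighted overall score out of 100."""
    return sum(weights[k] * scores[k] for k in weights)

# A hypothetical institution that excels at teaching but is weak on research:
scores = {
    "research": 20.0,
    "citations": 25.0,
    "international_outlook": 60.0,
    "industry_income": 50.0,
    "teaching": 90.0,
}

print(round(overall(scores, weights), 2))  # 46.25
```

Even a teaching score of 90 can't lift the overall figure past the mid-40s, because 60 per cent of the weight sits on research and citations alone.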
The OU would fare well here, I thought; after all, we usually come in the top three for student satisfaction (yes, I'm strangely supportive of that league table). But the OU only scored 18.1% on this. How come? Well, closer examination of the methodology for rating the learning environment reveals that it isn't based on student satisfaction but is instead influenced mainly by, you've guessed it, research. So two of the six factors were:
"the ratio of PhD to bachelor’s degrees awarded by each institution. We believe that institutions with a high density of research students are more knowledge-intensive, and that the presence of an active postgraduate community is a marker of a research-led teaching environment valued by undergraduates and postgraduates alike.
The PhD-to-bachelor’s ratio is worth 3 per cent of the 100 Under 50 scores (up from 2.25 per cent).
This category also uses data on the number of PhDs awarded by an institution, scaled against its size as measured by the number of academic staff."
If you have a lot of undergraduate students, as the OU does, then no matter how good their learning experience is, you won't score highly, because the PhD-to-bachelor's ratio will be low.
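The arithmetic behind this is worth spelling out. The figures below are invented purely for illustration (they are not OU data): two institutions awarding the same number of PhDs can land an order of magnitude apart on this indicator, simply because one teaches far more undergraduates.

```python
def phd_to_bachelors_ratio(phds_awarded, bachelors_awarded):
    """The methodology's indicator: PhDs awarded per bachelor's degree awarded."""
    return phds_awarded / bachelors_awarded

# Hypothetical figures: same PhD output, very different undergraduate intakes.
large_teaching_uni = phd_to_bachelors_ratio(100, 20000)  # 0.005
small_research_uni = phd_to_bachelors_ratio(100, 1000)   # 0.1

# The undergraduate-focused institution scores 20 times lower on this
# indicator, regardless of how good its teaching actually is.
print(small_research_uni / large_teaching_uni)  # 20.0
```

The indicator penalises scale in undergraduate teaching itself, which is exactly the activity a prospective undergraduate cares about.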
I do value research, but for five of the six criteria to be based around it, and the remaining one to be heavily influenced by it, seems problematic. For potential students I wonder what use this is. I'd want to know what the teaching environment is like, as that's what I'm paying my money for. Quite often the researchers who contribute to the ranking view teaching as an inconvenience, so the ranking doesn't bear much relationship to the quality of teaching.
Choosing a university on the basis of its research is a bit like moving to an area and using the quality of schools as your sole criterion when you don't have any children. It may be a proxy for overall quality, but it could be that everyone lives outside the area and buses their kids in (Harrow, for example, has a very posh school, but not many of its pupils live in the surrounding area). For you, the quality of doctors may be a more important criterion.
I wonder what message this sends to all academics, but particularly to new ones. The message I would take away is "research is still king; do the minimum you have to in order to fulfil your teaching duty." And with students paying high fees for their education, this sets up a conflict between the 'customers' and the context within which staff operate. That can't be good for the long-term future of education (I'm not going to mention MOOCs here, I'm not!).
One other thought: the research obsession left no room for any digital presence. These rankings could have been created 20 years ago. There is nothing in here about online engagement, digital profile, use of social media, outreach via new channels, etc. This would be difficult to measure, I grant you, but given the significance of online presence for students, academics and institutions alike, it seems like an oversight.
As you rightly note, league tables are at best bizarre. I suppose this one has other wrinkles – 19 UK universities appear on the list. So on top of the incredible list of research-based criteria, it is also perhaps a historical footnote about a period of university building and development since the early 1960s. It provides an interesting global view: 19 UK universities, 13 Australian, 8 from the United States, and 5 from Taiwan.
I suppose on the bright side – at least we got into the top 100.