A quick and dirty REF

I am currently working on the REF submission for my Institute. I fully understand why we do this: in cash-strapped times in particular, research money makes a big difference. It is also almost impossible not to play the game if you want to be recognised as a research institution and attract staff.

But it is a horrible, pernicious process that seems deliberately designed to kill innovation, perpetuate existing models, waste time, focus energy on gaming the system and keep publishers happy. If you wanted to design a process with the sole aim of restricting the adoption of new methods and technology, you could do a lot worse than the REF.

Not every country has an equivalent of the REF, so we shouldn't view it as a necessary evil. But, let's say that for whatever reason, something like it is going to persist. Two questions then arise: can it be better and can it be easier?

I'm not going to answer the first of these questions. It would take too long, and I've covered it lots of times before. But in a conversation with Tony the other day we pondered how the REF might be made easier and a hell of a lot cheaper. The total (estimated) cost of the 2008 RAE in England was £56.7 million, which compares with the £1.5 billion of QR research funding. The estimated costs for the REF are expected to be similar. My guess is that this figure is low: it may not take into account all of the meeting and individual time associated with the process, and probably not opportunity costs either. A review of the 2001 RAE found that, for more teaching-oriented institutions, the "RAE is over 16 times less efficient than the norm". I don't think the REF can be argued to be more efficient, as it largely follows the same methodology.

So here is my (well Tony's mainly) proposal for a quick and dirty REF:

  • Research councils monitor where money goes and evaluate research projects
  • Publishers have databases of authors and citations

The REF then centrally data-warehouses these sources and adds in various weighting factors, creating a score for each individual, which is aggregated for the institution to give its REF evaluation (maybe by subject, or maybe across the whole institution).
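To make the proposal concrete, here is a minimal sketch of that pipeline. All of the data, field names and weighting factors are invented for illustration; the point is only that the whole "evaluation" reduces to a join over existing databases plus a weighted sum.

```python
# Hypothetical "quick and dirty REF": join funding records (from
# research councils) with citation counts (from publishers), score
# each individual with invented weights, then aggregate by institution.
from collections import defaultdict

# Source 1: research-council records (author -> funding awarded, in £)
funding = {"alice": 250_000, "bob": 40_000, "carol": 0}

# Source 2: publisher databases (author -> citation count)
citations = {"alice": 120, "bob": 300, "carol": 45}

# Affiliation lookup (author -> institution)
affiliation = {"alice": "Inst A", "bob": "Inst A", "carol": "Inst B"}

# Invented weighting factors: £100k of funding ~ 1 point,
# 100 citations ~ 1 point.
W_FUNDING = 1.0 / 100_000
W_CITES = 1.0 / 100

def individual_score(author):
    """Weighted sum of an author's funding and citation proxies."""
    return (W_FUNDING * funding.get(author, 0)
            + W_CITES * citations.get(author, 0))

def institution_scores():
    """Aggregate individual scores up to institution level."""
    totals = defaultdict(float)
    for author, inst in affiliation.items():
        totals[inst] += individual_score(author)
    return dict(totals)

print(institution_scores())
```

The weights are where the policy arguments would actually live (how much is a citation worth relative to a grant?), but tuning a handful of coefficients is a far cheaper argument to have than the current panel-based process.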

It uses proxies for research quality, and is focused on research funding and formal publications, but hey, so is the current REF. If we're going to be forced to play that game, couldn't we at least do it in an efficient manner? It would free up a lot of time to, you know, actually do research.

It would at least be worth doing a pilot based on this to see how far it differs from the very labour intensive system we have currently, don't you think?
