
The REF – a user’s guide

This is the first of two posts looking at the Research Excellence Framework (REF). Apologies to non-UK readers: this is a bit parochial. I needed to look through the REF in relation to the digital scholarship work I am doing. In the next post I will comment on it from a digital scholarship perspective, but having read it all, I thought I'd first give an overview of the key points, as a quick summary for those who can't be bothered to read the whole thing (i.e. any sane person). So in this post I'm not commenting on the REF itself, either in detail or as an exercise as a whole; I'll do that in the next post.

I read the REF consultation over at WriteToReply, and have left a number of comments there. The breakdown and formatting there make it a good way to read the document, so if you do want to go through it, I'd recommend that route. Here, then, are what I take to be the main elements of the proposal (although don't sue me if you submit according to this and I've missed something out).

Change from RAE to REF

The REF does not appear radically different from the RAE. It is an adjustment in process rather than a complete overhaul. The aim seems to be to simplify the RAE. This is made clear from the start:

"The underlying policy of allocating research funding selectively on the basis of quality remained unchanged; the intention was that the mechanisms should become simpler and less burdensome."

One substantial change is more of an emphasis on 'impact' – be it social or economic. This is probably a nod towards justifying the money spent on research and its relevance to the UK population:

"the Secretary of State emphasised that the REF should take better account of the impact research makes on the economy and society, and gave further guidance on particular activities that the REF should encourage:

'The REF should continue to incentivise research excellence, but also reflect the quality of researchers’ contribution to public policy making and to public engagement, and not create disincentives to researchers moving between academia and the private sector.’ "

Aims

So what is the overall aim of the REF? They sum it up as:

"Support and encourage innovative and curiosity-driven research, including new approaches, new fields and interdisciplinary work."

We should bear this aim in mind, as I found myself losing sight of it amongst the discussion of the process.

What is research?

Their definition of research is:

"a process of investigation leading to new insights effectively shared"

Assessing researchers

They assess units of researchers, not individuals. These units need to demonstrate:

  • "a portfolio of high-quality, original and rigorous research, including work which is world-leading in moving the discipline forward, innovative work pursuing new lines of enquiry, and activity effectively building on this to achieve impact beyond the discipline, benefiting the economy or society"
  • "Effective sharing of its research findings with a range of audiences"
  • "a range of activity leading to benefits to the economy and society, "
  • "A high-quality, forward-looking research environment"
  • "Significant contributions to the sustainability and vitality of the research base"

There are three main elements they will assess (not equally weighted – see below):

  • Output quality
  • Impact
  • Environment

The greatest weight in the assessment will be given to output quality – they say this three times, so I think they are making a point. Their current thinking is a weighting of Output = 60%, Impact = 25%, Environment = 15%.
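To illustrate what those weights mean in practice (my own back-of-the-envelope arithmetic, not an example from the document): if a unit were scored, say, 3 for outputs, 2 for impact and 3 for environment, the weighted combination would be 0.6 × 3 + 0.25 × 2 + 0.15 × 3 = 2.75. The actual exercise works with graded quality profiles rather than single scores, but the sums give a sense of how dominant the output element is.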

As with the RAE, assessment will be performed by expert panels, although in the REF some citation metrics will also be used for the sciences.

Outputs

The institution selects the staff it wishes to submit for the REF in each of the units of assessment. There is no pressure to include a minimum or maximum number of staff:

"It remains our view that the proportion of staff selected should not be a significant factor in assessing quality in the REF."

They propose 3-4 outputs per individual. It looks likely that they will come down to 3 to reduce burden on the panels.

There is some indication that it is not just traditional outputs that will be considered (although you might want to see my next post):

"All types of outputs from research that meets the Frascati principles (involving original investigation leading to new insights) will be eligible for submission. This includes ‘grey literature’ and outputs that are not in conventional published form, such as confidential reports to government or business, software, designs, performances and artefacts"

They will assess outputs "against criteria of ‘rigour, originality and significance’." Assessing significance may be difficult and they propose "to assist in assessing user significance (beyond the academic sphere), institutions will be invited to include a short statement with any output for which they believe that such significance may convincingly be asserted."

They will use a 1-4 star rating similar to the RAE's. They then provide some 'definitions' of excellence, which are self-referential:

  • Four star      Exceptional: Quality that is world-leading and meets the highest standards of excellence in terms of originality, significance and rigour
  • Three star     Excellent: Quality that is internationally excellent in terms of originality, significance and rigour but which nonetheless falls short of the highest standards of excellence
  • etc.

Metrics

A pilot study of citation metrics was conducted. They conclude that these are not robust enough to replace the expert panels, but that they can form a useful part of the review, particularly in the sciences.

The citation metrics need to relate to the selected outputs, i.e. not to an individual or a project overall.

The citation data will be limited to a small number of centrally procured databases:

"In the pilot exercise we used two databases, the Web of Science and Scopus. For the REF, we will procure one or more databases through a rigorous procurement process"

There is a strong hint that these citation metrics should act as a filter for which outputs are submitted:

"Given that a number of expert panels will make use of citation data to inform their judgements, we anticipate that institutions may wish to make use of such data to inform their selection of outputs"

In order to reduce the load on panels, they suggest some outputs may be 'double-weighted'.

Impact

This is now an explicit element, a change from the RAE. They stress that it needs to be impact arising from the research, not impact arising from other activity. They propose a wide definition of impact, "including economic, social, public policy, cultural and quality of life." They explicitly state that impact does not cover influence on academic knowledge – that is encompassed in the outputs element. This is much more about the broader social impact of research. It can relate to teaching where "it can be shown that high-quality research has informed practice, not just course content".

Impact will not be quantified, instead "expert panels will review narrative evidence supported by appropriate indicators, and produce graded impact sub-profiles for each submission".

Each unit of assessment will need to provide evidence of impact, comprising case studies and an overarching impact statement. Both need to draw on a number of indicators:

  • "of research income generated from key categories of research users
  • of the amount and extent of collaboration with the full range of research users
  • that may be particular to specific UOAs, selected from a common ‘menu’."

Research environment

This doesn't really affect individuals, as it will be a narrative produced by the unit of assessment describing how its research environment covers areas such as resourcing, management and engagement.

Panels and Units of Assessment

The workload on panels, and consistency between them, is a major concern in the REF document. They propose having fewer units of assessment, with "fewer fluid boundaries between them than in previous assessment exercises". Each panel will consist of about 20 members, with 15-20 associate members.

Interdisciplinary research gets an explicit mention: "we aim to ensure that whichever panel interdisciplinary research is submitted to, there will be effective mechanisms for ensuring it is reviewed fairly by people with appropriate expertise".

Timetable

These are the suggested dates:

  • Sep – Dec 2009     Consultation exercise. Initiate impact pilot exercise
  • Spring 2010     Announce high level consultation outcomes. Invite nominations for panels. Start developing REF data collection system
  • Autumn 2010     Conclude impact pilot exercise. Publish guidance on submissions
  • Late 2010     Appoint panels
  • 2011     Panels consult on and publish criteria. Complete REF data collection system
  • 2012     HEIs make submissions
  • 2013     Panels assess submissions
  • Dec 2013     Publish outcomes
  • Feb 2014     Determine funding outcomes

Costs

There was a review after the last RAE. The total (estimated) cost of the 2008 RAE in England was £56.7 million, which compares with the £1.5 billion of QR research funding. They estimate costs for the REF will be similar.
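For a sense of scale (my arithmetic, not a figure from the document), £56.7 million works out at a little under 4% of that £1.5 billion QR figure.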

2 Comments

  • AJ Cann

    (One of) the iniquities of RAE2008 was that different panels behaved in very different ways. In some cases they were reasonably objective, in other cases totally subjective. That’s the main reason I’ve lost faith in the REF – rather than revising the process using citation/impact data, it’s still going to be totally subjective and biased.

  • Linda Price

    I too am sceptical about the REF and I agree with the previous post that the last RAE seemed biased and subjective – and based on the perspectives of the panel members.
    I think we should suggest postponing or cancelling the next REF… that way the government could save at least £56 million! And we would all save ourselves lots of time and effort on a totally outmoded form of assessing research.
