
Top research departments fail to shine in impact pilot - Paul Jump, Times Higher Education, 20 January 2011

Friday 21 January 2011

Hefce warns against reading too much into results as departments 'experiment'. Paul Jump reports

A number of top research departments performed unexpectedly poorly in the first official attempt to measure the impact of academics’ research, new data reveal.

Some lower-rated departments also did conspicuously well in the Higher Education Funding Council for England’s pilot impact assessment exercise.

Reports summarising the reactions of the institutions and the judging panels to the exercise were released in November, but the results themselves were held back because, according to Hefce, they were "not relevant" to the general conclusion that a case study-based approach to assessing impact was "workable".

In the exercise - held to test controversial plans to include an impact rating in 2014’s inaugural research excellence framework, which could be worth up to 25 per cent of the marks - 29 universities were asked to submit a case study for every 10 scholars working in two out of the five subjects being assessed.

In social work and social policy, the London School of Economics achieved the best result, with 70 per cent of the material it submitted rated 4*, the highest grade, defined as "exceptional". The LSE was also the top performer in the subject in the 2008 research assessment exercise, according to Times Higher Education’s analysis of the results.

In English language and literature, Queen Mary, University of London (rated second for the subject in the RAE 2008) also performed strongly in the impact pilot, with 40 per cent of its submission rated 4* and 60 per cent 3*, defined as "excellent".

But the University of Manchester, judged the fourth-best department for research in English in the final RAE, saw 80 per cent of its impact submission rated only 1*, or "good".

A Manchester spokesman said the focus of its English submission was on "learning about the mechanisms of assessment... Although the results can be seen as disappointing, they have given us an opportunity to learn far more about what is expected from an impact statement."

Lancaster University’s English department, ranked in the middle of the RAE 2008, achieved a notably strong result, with 35 per cent of its impact submission rated 4* and 50 per cent 3*. But its top-rated physics department did relatively poorly, with 95 per cent of its impact submission rated only 2*, defined as "very good". A spokesman for the university said Lancaster had also approached the pilot exercise as "an opportunity to test the system".

The University of Cambridge’s physics department, rated second in the RAE, fared better, with 30 per cent of its impact submission in physics deemed 4*. The strongest performance in physics came from Liverpool John Moores University, which achieved a 40 per cent 4* and 45 per cent 3* impact profile.

David Carter, professor of observational astronomy at Liverpool John Moores, said he believed Hefce had found a sensible and proportionate approach to the agenda, but warned that it was difficult to gather impact evidence retrospectively.

"You need to consider what the impact is of your research while you are doing it and write it down : it is fairly basic stuff," he said.

In earth systems and environmental science, Brunel University achieved the highest impact score, with 50 per cent of its submission rated 4*, although the remainder went unclassified.

The results for the other subject assessed, clinical medicine, cannot be directly compared with the RAE 2008 results.

Hefce’s introduction to the impact profiles notes that a lack of evidence in some case studies had "significantly affected" scores.

For this reason, plus the deliberate experimentation in some of the submissions, the results "should NOT be read as a clear judgement about the impact of research from the submitting departments, or as a means of predicting the impact profiles departments may be expected to achieve in the real REF", the funding council says.


See online: Times Higher Education