Research metrics and indicators


Transcript of Research metrics and indicators


Research metrics and indicators
Chair: Anne Trefethen, University of Oxford
08/07/2016

Introduction
Chair: Anne Trefethen, University of Oxford

The Metric Tide
Stephen Curry, Imperial College London; Ben Johnson, HEFCE


Responsible research metrics
Ben Johnson, Research Policy Adviser, HEFCE
6 July 2016

Outline
- Talk about our work on metrics
- Refresher of the key findings and recommendations of The Metric Tide
- Discuss plans for the metrics forum (in brief)

First, some background.

About HEFCE
- We invest in the teaching, research and knowledge exchange activities of English higher education institutions (£4bn per year)
- We regulate and oversee English HEIs (quality and financial sustainability)
- We operate the UK-wide Research Excellence Framework (REF): www.ref.ac.uk

Measurement matters
If we can tell the good research from the bad:
- Researchers can build on high-quality work and pursue promising directions
- HEIs can appoint and promote good researchers and support good departments
- Funders and governments can benchmark performance and fund high-quality research

And if we can put numbers on it, we can be more objective in our decisions!

Stating the obvious!

For researchers, quality = publishing

And publishing = esteem.

350 years of this

Publishing depends on peer review: an embedded but resource-intensive activity. A qualitative activity. At the heart of academic culture.

Metrics everywhere!

That's not to say we don't measure things. Researchers, HEIs, publishers and funders are all increasingly interested in quantifying activity around publishing and research:
- Citations
- Downloads
- Usage
- Impact
- Grant income
- PhD candidates, etc.

Publishers have embraced this; there is a whole industry here.

The journal impact factor (JIF) in particular is widely trumpeted by publishers, despite its obvious flaws.

Metrics providers are also in the HEI ranking game, with some HEIs basing their whole strategy around league tables.

And governments are not immune: they need to measure and benchmark to demonstrate value for money, attract further investment and justify expenditure.

The Elsevier/BIS report bases all its performance measures on publishing metrics, showing a clear link back down to the qualitative business of publishing and peer review.

REF is another method of quantifying quality

Like many other metrics, it also depends on expert review.

But it is resource intensive as a result. And not embedded!

So...

...some argue that it is additional activity that can be replaced with metrics to save money.

Those in the business of providing metrics are particularly keen, especially publishers trying to diversify away from subscriptions in light of moves towards open access.

"I have asked HEFCE to undertake a review of the role of metrics in research assessment and management. The review will consider the robustness of metrics across different disciplines and assess their potential contribution to the development of research excellence and impact."

David Willetts, Minister for Universities & Science, Speech to Universities UK, 3 April 2014

All this was part of the motivation behind David Willetts's request that HEFCE review the role of metrics in research assessment. Can we cut down on the additional work?

Steering group

The review was chaired by James Wilsdon, Professor of Science and Democracy at the Science Policy Research Unit (SPRU), University of Sussex. He was supported by an independent steering group and a secretariat from HEFCE's Research Policy Team:

- Dr Liz Allen (Head of Evaluation, Wellcome Trust)
- Dr Eleonora Belfiore (Associate Professor of Cultural Policy, University of Warwick)
- Sir Philip Campbell (Editor-in-Chief, Nature)
- Professor Stephen Curry (Department of Life Sciences, Imperial College London)
- Dr Steven Hill (Head of Research Policy, HEFCE)
- Professor Richard Jones FRS (Pro-Vice-Chancellor for Research and Innovation, University of Sheffield), representing the Royal Society
- Professor Roger Kain FBA (Dean and Chief Executive, School of Advanced Study, University of London), representing the British Academy
- Dr Simon Kerridge (Director of Research Services, University of Kent), representing the Association of Research Managers and Administrators
- Professor Mike Thelwall (Statistical Cybermetrics Research Group, University of Wolverhampton)
- Jane Tinkler (Parliamentary Office of Science & Technology)
- Dr Ian Viney (Head of Evaluation, Medical Research Council), representing RCUK
- Professor Paul Wouters (Centre for Science & Technology Studies, University of Leiden)

Set up in 2014.

Approach and evidence sources
- Steering group: diverse expertise and extensive involvement
- Broad terms of reference: opening up, rather than closing down, questions
- Transparent: publishing minutes and evidence in real time
- Formal call for evidence, May to June 2014: 153 responses (44% HEIs, 27% individuals, 18% learned societies, 7% providers, 2% mission groups, 2% other)
- Stakeholder engagement: 30+ events, including 6 review workshops (among them on equality & diversity, and arts & humanities); we invited our fiercest critics!
- Ongoing consultation and use of social media, e.g. #hefcemetrics
- In-depth literature review
- Quantitative correlation exercise relating REF outcomes to indicators of research
- Linkage to HEFCE's evaluations of REF projects

http://www.hefce.ac.uk/rsrch/metrics/
http://www.responsiblemetrics.org

Published in July 2015.

The Metric Tide

Headline findings (a selection)

Just headlines!

Across the research community, the description, production and consumption of metrics remains contested and open to misunderstandings.

There is a cultural challenge here: researchers don't like being measured, especially by numbers-obsessed administrators and bureaucrats from HEFCE. The fears are quite real:
- Quality may be lost in reductive number crunching
- Metrics favour those who can play the game
- Metrics punish rebels
- Metrics are not suitable for arts & humanities research

Researchers are more comfortable with the term 'indicators'.

Peer review, despite its flaws and limitations, continues to command widespread support across disciplines. Metrics should support, not supplant, expert judgement.

Inappropriate indicators create perverse incentives. There is legitimate concern that some quantitative indicators can be gamed, or can lead to unintended consequences.

e.g. citation rings, salami slicing, focusing on 'sexy' or easily measurable areas, and not publishing controversial research.

A basket of indicators may help to counteract gaming? And some of these issues are already upon us.

Indicators can only meet their potential if they are underpinned by an open and interoperable data infrastructure.

The problem is that metrics are being created behind closed doors, with dodgy data and incompatible systems. This needs to be opened up.

Our correlation analysis of the REF2014 results at output-by-author level has shown that individual metrics cannot provide a like-for-like replacement for REF peer review.

Very weak correlation. Very poor coverage across some disciplines and output types. These metrics are insensitive and imprecise.
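To make the shape of that exercise concrete, here is a minimal, hypothetical sketch of an output-level correlation check between a citation-based indicator and peer-review scores, written in Python with pandas and SciPy. The column names and data are invented for illustration and do not reproduce the review's actual analysis.

```python
# Minimal illustrative sketch only; not the review's actual method or data.
import pandas as pd
from scipy.stats import spearmanr

# Hypothetical output-level records: one row per submitted output.
outputs = pd.DataFrame({
    "citation_count": [0, 3, 12, 1, 45, 7, 0, 22],  # indicator value per output
    "ref_score":      [2, 3,  3, 1,  4, 3, 2,  4],  # peer-review quality score (1-4)
})

# Spearman rank correlation is a natural choice against ordinal quality scores.
rho, p_value = spearmanr(outputs["citation_count"], outputs["ref_score"])
print(f"Spearman rho = {rho:.2f} (p = {p_value:.3f})")

# In practice, a weak rho plus poor coverage (many outputs with no citation
# data at all) is the kind of result summarised in the finding above.
```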

Within the REF, it is not currently feasible to assess the quality of research outputs using quantitative indicators alone, or to replace narrative impact case studies and templates.

...until these issues are ironed out. We recognise that they can be solved with work, but it will take years. There is an obvious trade-off with the cost of peer review, so we will keep needing to have this discussion when we run future REFs.

There is a need for more research on research. The study of research systems, sometimes called the 'science of science policy', is poorly funded in the UK.

Responsible metrics
Responsible metrics can be understood in terms of:
- Robustness: basing metrics on the best possible data in terms of accuracy and scope
- Humility: recognising that quantitative evaluation should support, but not supplant, qualitative expert assessment
- Transparency: keeping data collection and analytical processes open and transparent, so that those being evaluated can test and verify the results
- Diversity: accounting for variation by field, and using a variety of indicators to reflect and support a plurality of research and researcher career paths
- Reflexivity: recognising the potential and systemic effects of indicators and updating them in response

Not all of the written evidence we received about the REF acknowledged the diversity of its purposes, and there are clearly differences of opinion about the relative importance of those purposes. It is not surprising, therefore, that there are equally different views on how and whether quantitative indicators of research quality should feature in the assessment.

This slide captures some of the overall points and concerns gathered through the evidence-gathering process.

Some technical, some cultural.

The Metric Tide

Recommendations (a selection)

The research community should develop a more sophisticated and nuanced approach to the contribution and limitations of quantitative indicators.

At an institutional level, HEI leaders should develop a clear statement of principles on their approach to research management and assessment, including the role of indicators.

Research managers and administrators should champion these principles and the use of responsible metrics within their institutions.

HR managers and recruitment or promotion panels in HEIs should be explicit about the criteria used for academic appointment and promotion decisions.

Individual researchers should be mindful of the limitations of particular indicators in the way they present their own CVs and evaluate the work of colleagues.

Especially the JIF and the h-index!

Like HEIs, research funders should develop their own context-specific principles for the use of quantitative indicators in research assessment and management.

Data providers, analysts & producers of university rankings and league tables should strive for greater transparency and interoperability between different measurement systems.

Publishers should reduce emphasis on journal impact factors as a promotional tool, and only use them in the context of a variety of journal-based metrics that provide a richer view of performance.

"An article is only as good as the journal it is published in" = bullshit, obviously.

There is a need for greater transparency and openness in research data infrastructure. Principles should be developed to support open, trustworthy research information management.

NB: the extensive buying up of infrastructure by private companies is leading to significant concern. (Expect Cameron will cover this!)

The UK research system should take full advantage of ORCID as its preferred system of unique identifiers. ORCID iDs should be mandatory for all researchers in the next REF.

The use of digital object identifiers (DOIs) should be extended to cover all research outputs.


Further investment in research information infrastructure is required to improve the interoperability of research management systems.

HEFCE, funders, HEIs and Jisc should explore how to leverage data held in existing platforms to support the REF process, and vice versa.

BIS should identify ways of linking data gathered from research-related platforms (including Gateway to Research, Researchfish and the REF) more directly to policy processes.

In assessing impact, we recommend that HEFCE builds on the analysis of the impact case studies from REF2014 to develop clear guidelines for the use of quantitative indicators in future impact case studies.

A couple of REF-related ones.

In assessing the research environment in the REF, we recommend that the role of quantitative data is enhanced.

The community needs a mechanism to carry forward this agenda. We propose a Forum for Responsible Metrics, to bring together key players to work on data standards, openness, interoperability & transparency.

We are starting work on that now.

As a legacy initiative, we are setting up ResponsibleMetrics.org as a forum for ongoing discussion. Every year we will award a Bad Metric prize to the most egregious example of an inappropriate use of quantitative indicators.

@ResMetrics

Other developments
- HE and Research Bill
- Stern review: expected July 2016
- Extensive referencing of The Metric Tide in sector responses
- EC expert group on altmetrics

Open citation
Cameron Neylon, Curtin University

AWAITING PRESENTATION

Open infrastructures
Clifford Tatum, Leiden University

AWAITING PRESENTATION
