Benchmarking and Appropriate use of Metrics to support Research Administration

Colin Cooper
Research Facilitator

Liverpool Hope University

Contains a bit of controversy.

Benchmarking and Appropriate use of Metrics to support Research Administration

Content:
• Benchmarking
• Metrics
• Use of Metrics
• Other Sectors
• Performance Metrics
• New Metrics for RSO

Benchmarking

Definition

“Benchmarking is a tool for improving performance by learning from best practices and the processes by which it is achieved.

Benchmarking involves looking outward to examine how others achieve their performance levels. In this way benchmarking helps explain the processes behind excellent performance.”

Metrics

Definition

Metrics measure and report an organisation’s performance and capture evidence-based figures. They allow and encourage comparison and support a strategy.

Statistics

Definition

The science of producing unreliable facts from reliable figures.

If metrics are evidence-based figures, statistics are how we then elaborate on (lie about) them!

Benchmarking

End Results
• A set of meaningful performance information.
• Improve strategic planning and provide an assessment of strengths and weaknesses.
• Ability to set challenging performance goals and stimulate management of the research portfolio.
• Encourage implementation of best practices.
• Lead to increased efficiency in the use of resources, both financial and physical.
• Close the performance gap.
• Empowering of staff to seek further improvements.
• Get additional resources.

Benchmarking

Performance Benchmarking

Comparison of performance measures to decide how good the University is compared to others

Process Benchmarking

Processes are compared to others to improve your own

Strategic Benchmarking

Comparison to change strategic direction

Internal Benchmarking

Comparison made between Departments / School / Faculty in a devolved administration

Competitive Benchmarking

Comparison against like for like competition

Functional Benchmarking

Comparison of enterprise systems (e.g. Finance, HR, Students) to become the best in the use of current systems

Generic Benchmarking

Comparison against the best in the business regardless of sector, e.g. Amazon for marketing and Tesco for CRM

Benchmarking
Institutional Benchmarking

• What do you want the information for?
To set goals
To choose collaborators
To gee up your academic staff

• What / who do you measure?
Research income / funding base
Success rates
Publications, patents, licences, impact
Number of students at UG, PGR, PGT
Staff numbers, research staff numbers
Comparator / aspirational / individuals / groups / departments / faculties / research centres

• Where do you get the information?
Research Excellence Framework (UK)
HESA (Higher Education Statistics Agency, UK)
Funders’ stats
Colleagues
League tables

Benchmarking STEP 1
Sit back and think!

“Not everything that can be measured is important and not everything that is important can be measured” (Albert Einstein)

• Who’s asked for it? Top down or bottom up
• Costs (carrying it out / implementing results)
• Either way, get yourself a “champion”
• Team selection
• What’s being benchmarked (priorities v KPIs)
• Select non-friendly academic staff
• Understand your own processes
• Carry out your own data collection
• Performance metrics

Benchmarking STEP 1
Go deeper: basic functions (Process Benchmarking)

• Length of time to prepare a costing
• Length of time to negotiate a contract
• Length of time from award to account set-up
• Staff appointments / in-period changes
• Close out within x number of days
• Timeliness of invoicing
• Outstanding debtors over x days
• Ethics approval
• General response time
• FEC recovery achieved

(A sketch of turning a few of these into turnaround metrics follows.)
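A minimal sketch, assuming hypothetical award records (the dates, field names and the 21-day target below are illustrative, not taken from any real system), of how process measures like these could be turned into simple turnaround metrics:

from datetime import date
from statistics import median

# Hypothetical award records: date of award and date the account was set up
awards = [
    {"awarded": date(2017, 1, 10), "account_set_up": date(2017, 1, 31)},
    {"awarded": date(2017, 2, 3),  "account_set_up": date(2017, 2, 14)},
    {"awarded": date(2017, 3, 20), "account_set_up": date(2017, 4, 28)},
]

days_to_set_up = [(a["account_set_up"] - a["awarded"]).days for a in awards]
target_days = 21  # illustrative service standard: account set up within 21 days

print("Median days from award to account set-up:", median(days_to_set_up))
print("% of awards set up within target:",
      round(100 * sum(d <= target_days for d in days_to_set_up) / len(days_to_set_up)))

The same pattern (record two dates, report a median and a percentage against a target) covers most of the process measures listed above.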

Benchmarking STEP 2
Identify Best Practice / Select Partners
(Performance, Generic & Internal Benchmarking)

• Select appropriate comparators, internal or external
• Within the HE sector
• Outside HE
• International
• Ethics, even in benchmarking
• Age of data: HESA, or wait for REF results

Benchmarking STEP 3
Decide on Approach

How soon you need results      Benchmarking alternatives
Within a week                  Reading about research admin; surfing the web; telephone interviews
One to two weeks               Experienced research administrator; hire a consultant
Three to six weeks             E-mail questionnaire; site visit
Two or more months             Larger benchmarking / review

Benchmarking STEP 4

Implement Findings

• Correct team selection back at step 1
• One size does not fit all
• Don’t just imitate… innovate (Imperial, City, UMIST, Manchester, Liverpool, Liverpool Hope)
• Put in place quick wins
• Communicate the results to encourage use
• Decide if knock-on effects for other areas need addressing
• Feedback to participants
• Link data collection and management for the benchmark results to ongoing work

Benchmarking STEP 5

Assess & Review

• Give it some time
• How was it for you?
• If objectives were not met, what can you do?
• Did we use the right metrics?
• Can we see marked improvement?
• Make sure staff are actually implementing the changes
• Feed back the results to your participants

Benchmarking STEP 6

START AGAIN ?

• Benchmarking should not be a one-off exercise
• Aim for continuous improvement
• Select new areas to be benchmarked
• Changes to administrative structures
• Changes to funding
• Did you learn anything, or do you still believe you are the best?

Benchmarking

Definition

“Benchmarking is a tool for improving performance by learning from best practices and the processes by which it is achieved.

Benchmarking involves looking outward to examine how others achieve their performance levels. In this way benchmarking helps explain the processes behind excellent performance.”

But what about trying something like this with Metrics? Let’s be innovative, different and controversial…

[Table: illustrative departmental applications and awards (Biomolecular Science, Chemical Engineering, Chemistry, DIAS, Optometry), broken down by funder type (Research Councils, Government Departments, UK Industry, UK Charities, European Commission, Overseas, Other), with total value and total number for each department.]

[Table: illustrative monthly reporting, August to July, of number of visits, formal presentations, significant proposals, partnerships and other activity.]

[Table: illustrative log of other activity (e.g. Fuels for the Future, BBSRC Bioinformatics, SOCI, BioNow, EPICC, NW Chem Initiative, Shell, CIBA, EPSRC mock panels, FP5 Soybean, Wilton / Fraunhofer, Biotech Group, Bioindustry TT, Univ Manchester, NHS IP, Medical House), coded by month and type of engagement.]

Metrics Snowball Effect

Information Capture for Academic Metrics

h-index, Bibliometrics, Publications and Citations, Scholarly Output, Citation Count, Citations per Output, Field-Weighted Citation Impact, Outputs in Top Percentiles, i10-index, Publications in Top Journal Percentiles, Collaboration Metrics, Collaboration Impact, Academic-Corporate Collaboration, Academic-Corporate Collaboration Impact, Esteem and Socio-Economic Impact Metrics, Public Engagement, Enterprise Activities / Economic Development Metrics, Intellectual Property Volume, Intellectual Property Income, Sustainable Spin-Offs, Spin-Off-Related Finances, ResearchGate, Researchfish, Web of Knowledge
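Most of these indicators are produced by the tools listed later, but as a flavour of what sits behind them, here is a minimal sketch of the h-index calculation (the citation counts are made up for illustration):

def h_index(citations):
    # A researcher has h-index h if h of their papers have at least h citations each
    ranked = sorted(citations, reverse=True)  # most-cited paper first
    return sum(1 for rank, cites in enumerate(ranked, start=1) if cites >= rank)

print(h_index([25, 8, 5, 3, 3, 1, 0]))  # illustrative citation counts -> 3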

Altmetrics

Tweet citations, Facebook post citations, Blog citations, General web citations, Grey literature citations, Online syllabus citations, Online presentation citations, Discussion forum citations, Mainstream media citations, Mendeley, CiteULike or Zotero bookmark counts, Library holdings (of books), Citations from Google Books, LinkedIn

Metrics

Available Hardware / Software etc

Current Research Information System (CRIS), Scopus, Web of Science, Google Scholar, Elsevier’s Pure, Digital Science’s Symplectic, Thomson Reuters’ Converis, STAR METRICS, Research in View, EPrints, DSpace, Snowball Metrics Exchange, SCImago, Common European Research Information Format (CERIF), HESA, RCUK, THES tables, NSS, US Office of Management and Budget, PlumX, Impactstory, ResearchGate, Microsoft Academic Search, Socialcite, Papercritic, ReaderMeter, Crowdometer, ORCID, Excellence in Research for Australia (ERA)

Out of hand… when a report commissioned by HEFCE recommends an annual “Bad Metric” prize for the most egregious example of inappropriate use of quantitative indicators in research management (The Metric Tide).

ESRC : Centre for Doctoral Training

The CDT must consist of REF UoAs which fulfil all of the following criteria:

• a greater than or equal to 50 per cent REF output (3*+4*)

• a greater than or equal to 50 per cent REF environment (3*+4*)

• a greater than or equal to 50 per cent REF impact (3*+4*)

• a research volume equivalent to a minimum of five FTE staff with output at 3* or 4* (calculated by number of FTE staff submitted to REF2014 ‘multiplied by’ percentage of REF output at 3* or 4*).
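For example, with hypothetical figures: a UoA that submitted 12 FTE to REF2014 with 45 per cent of its outputs at 3* or 4* would have a research volume of 12 × 0.45 = 5.4 FTE and so would meet the five-FTE threshold; at 40 per cent it would fall short (12 × 0.40 = 4.8 FTE).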

Funding fixes

% of the submission meeting the standard for:

              4*     3*     2*     1*
Overall       15     38     34     13
Outputs       11.4   48.6   28.6   11.4
Impact        40.0   0.0    30.0   30.0
Environment   0.0    37.5   62.5   0.0

Liverpool Hope REF scores in UoA 25 Education

Catch 22

Metrics (Stern Review, UK)

• “no metric can currently provide a like-for-like replacement for REF peer review.”

• “a move to metrics carries risks for the reputation of REF and the quality assessments it makes.”

• “we have concluded that metrics should not replace peer review as the primary approach to the assessment in the next REF”.

• “widespread recognition that robust metrics for impact don’t yet exist, such that narrative case studies, assessed by peer review, remain the best option”.

Metrics in the REF

• We know who is going to get what funding (within 10%), even before the exercise is carried out. In the last exercise, 20 out of 130 (English) universities received 67% of the funding

• There can be no doubt whatsoever regarding the concentration of research funding in the UK

• The exercise costs the sector £246M, around 4% of research spend

• Occasionally it will throw up a surprise, so that the funding algorithm needs changing

Concentration of Research Funding

Metrics

OTHER SECTORS: Industry

• By calculating revenue, less the real costs of the products the customer is ordering, less the real cost of servicing the customer, we are left with the customer contribution. The customer contribution is the figure used to compare one customer with another.
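A minimal sketch of that calculation, using made-up customers and figures (names, revenues and costs below are purely illustrative):

# Customer contribution = revenue - real cost of products ordered - real cost of servicing
customers = {
    # name: (revenue, real cost of products ordered, real cost of servicing the customer)
    "A": (120_000, 70_000, 20_000),
    "B": (80_000, 60_000, 30_000),
    "C": (40_000, 25_000, 5_000),
}

contribution = {name: revenue - products - service
                for name, (revenue, products, service) in customers.items()}

# Rank customers from most to least profitable so one can be compared with another
for name, value in sorted(contribution.items(), key=lambda item: item[1], reverse=True):
    print(f"Customer {name}: contribution £{value:,}")

Plotting the running total of these ranked contributions gives the cumulative curve sketched below.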

[Chart: cumulative net margin plotted against products or customers ranked from most to least profitable; positive contributors build cumulative contribution above 100%, negative contributors then erode it back to the 100% contribution line.]

A Bakery
Half the customers added no profit. Rationalising the product mix and changing service levels to specific customers added £1 million to the bottom line within six months.

A Wholesale Company
Loss-making customers typically had low order values, more returns and manual systems. Many were moved to profitability by changing these characteristics. Others were ‘let go’ to competitors.

An Electricity Utility
15% of 3 million customers were unprofitable. Profiling their characteristics enabled the company to predict costly behaviours accurately and avoid them, which delivered £8 million additional profit.

A Retail Chain
Product categories with negative net margins accounted for 17% of the company’s overall net margin. Careful analysis of the causes of excessive gross margin erosion produced positive net margins for nearly all categories, averaging £0.5 million a year in each store.

Metrics
Performance Metrics (Sponsor Contribution)

• Strategic fit
• Time for proposal preparation
• Success rates
• Cost recovery
• Average value of awards
• Late payments
• Large administrative burden
• Repeat business
• Royalty payments
• Restrictive contractual terms

(improve funding base)

Metrics

Performance Metrics ( Academic Staff)

• Number of proposals per FTE
• What % of staff bring in what % of research funding
• % working as PI
• £’s per FTE
• Personal portfolio growth
• Consultancy v research income
• Outputs (impact on scientific area / outside of academe)
• Licence income per FTE
• PGR per FTE

SETTING TARGETS

(Increased Competitiveness)

Metrics
Performance Metrics (Academic Contribution)

• Unresponsive
• Time for proposal preparation
• Success rates
• Cost recovery
• Average value of awards
• Late technical reports
• Always on the phone over nothing
• One project at a time
• No outputs
• Not research but consultancy
• Own slush funds

(improve internal base)

A universal law: the Hook Curve

[Chart: cumulative contribution of internal and external stakeholders (funders and academics), ranked from positive to negative contributors; positive contributors push cumulative contribution above 100%, negative contributors then pull it back to 100%, with markers at 30% and 60%. Thanks to Brian Plowman.]

Metrics
Performance Metrics (Admin)

• Number of proposals per FTE
• Value of awards per FTE
• Number of awards per FTE
• Increase in funding over a fixed period
• Cost of RSO as a % of £’s awarded (pre & post award)
• Number of PI’s per RSO FTE
• Central or devolved administration
• Go deeper: basic functions

(A sketch of calculating a few of these follows.)
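A minimal sketch of a few of the calculations above, using hypothetical annual figures for an RSO (every number below is illustrative, not from any real office):

# Hypothetical annual figures for a Research Support Office (illustrative only)
proposals_submitted = 240
awards_made = 72
value_of_awards = 9_600_000   # £
rso_staff_fte = 6.0
rso_running_cost = 280_000    # £

print("Proposals per RSO FTE:", proposals_submitted / rso_staff_fte)
print("Awards per RSO FTE:", awards_made / rso_staff_fte)
print("Value of awards per RSO FTE (£):", value_of_awards / rso_staff_fte)
print("Cost of RSO as a % of £'s awarded:",
      round(100 * rso_running_cost / value_of_awards, 2))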

Metrics
Performance Metrics (Own Staff Contribution)

“We all dream of a team of Carraghers”

• Managing self
• Managing people
• Managing work

How can all that be achieved?
• Individual training plans
• Team training plans
• Robust appraisal system (not just Head of Office meetings)
• Team spirit / inclusive office / senior academic staff participation

Using Benchmark and Metrics Findings in a FEC Environment

Core: adds value, meets business and customer needs (e.g. research proposals). Reduce (increase) costs by changing the level of service, internally and externally, and identify risks (benefits).

Support: enables core activity to take place (e.g. support of funders and academics). Reduce costs by changing the method.

Diversionary: adds cost but no value; caused by process failure (e.g. explaining to funders why the last project was late, the report had missing sections and the invoice was wrong). Reduce costs by making an upstream process improvement, and by changing funder and academic behaviours.

Thanks to Brian Plowman

Becoming Competitive

[Chart: two pie charts. The typical starting point splits activity 51% / 30% / 19% across the three activity types; after acting on the findings this becomes overall less and rebalanced. Thanks to Brian Plowman.]

Win Win for all

New Metrics
Performance Metrics (Research Support Office)

• Reducing bureaucracy

• Removing administration burden

• Systems and processes that work for all

• Impact of the office

• Influence on Stakeholder opinion

• Influence on Stakeholder awareness

• Peer recognition

• Improved happiness metrics

New Metrics

• Improved happiness metrics

New Metrics
Useful Metrics for our Academics / Faculty

• Distance to the Library

• Success rates for various schemes

• Value of funded applications by Sponsor

• Look at travel vs equipment vs staff costs

• Publishers: who, when, how, current topics
• https://sites.insead.edu/library/rankings/journal_rankings.cfm

• Technical reports frequency

New Metrics

Useful Metrics for our Academics / Faculty

• Journal impact factors (IF) for publications

• Collaborator metrics / Screening of potential partners

• Demand Management by Funders / University

• Teaching loads / administrative loads

• Impact routes and data for their research activity

• Grade Point Averages

• Personal Portfolio values

Basic Calculations for a GPA:

Points Total = sum of (star value × number of articles at that star value)

e.g. for the researcher in the table below:

Points Total = 3 × 3 + 2 × 1 = 11 (because there are 3 articles at star value 3 and 1 article at star value 2)

GPA = Points Total / Outputs Required

For example, in the table below: GPA = 11 / 4 = 2.75 (because the Points Total is 11 and the outputs required from this researcher is 4).

FTE    Outputs Required   4*   3*   2*   1*   GPA    Points Total
1.00   4.00                    3    1         2.75   11

Basic calculations: GPA = Points Total / Outputs Required

FTE    Outputs Required   4*   3*   2*   1*   GPA    Points Total
0.60   2.00                    2              3.00   6
1.00   4.00                    3    1         2.75   11
1.00   4.00               1    3              3.25   13
1.00   4.00               1    3              3.25   13
1.00   4.00               1    3              3.25   13
1.00   4.00               1    3              3.25   13
1.00   4.00               2    2              3.50   14
1.00   4.00               1    2    1         3.00   12
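The same calculation expressed as a short sketch (the input list reproduces the first researcher in the table above, one star value per article):

def gpa(star_values, outputs_required):
    # Points Total = sum of the star value of each article; GPA = Points Total / outputs required
    points_total = sum(star_values)
    return points_total / outputs_required, points_total

# Three articles at 3* and one at 2*, with four outputs required
print(gpa([3, 3, 3, 2], 4.0))   # -> (2.75, 11)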

Metrics

Colin’s Top Five Metrics

1. No proposals = No Awards

2. ? % staff bring in ? % research funding

3. ? % Sponsors supply what ? % research funding

4. What’s the happiness metric in your office

5. No metric can replace a gut feeling or local knowledge

Metrics

TEF Teaching Excellence Framework

• Metric Based

• Opt out ?

• National Student Survey

• What’s the benefit?

• Do league tables count?

Thank You

Colin Cooper
[email protected]
+44 (0)151 291 3745
Twitter: @colincooperc1