Thomson Reuters Research Evaluation & Information Management - Research Impact In Sweden
Transcript of Thomson Reuters Research Evaluation & Information Management - Research Impact In Sweden
THOMSON REUTERS RESEARCH EVALUATION & INFORMATION MANAGEMENT
Account Manager – Emma Dennis
Global Sales Support – Rachel Mangan
[email protected] / [email protected]
Agenda
• Introduction to Research Evaluation Roadshow – Account Manager Emma Dennis
• Using Citation Data to Evaluate Research Performance
• The Data Used in Research Evaluation Tools
• The Metrics
• Research in Sweden
• Research Analytics for Research Evaluation
• Managing Research Output
Using Citation Data to Evaluate Research Performance
• Why evaluate research performance?
• Examples of governments, agencies, and ranking institutions (THE) already using citation data in research evaluation
• Types of metrics used in research evaluation
• What do institutions want from citation data?
Why Evaluate Research Performance?
• Quantitative analysis is the main tool of science
• Communicating research results is complex
• Personal knowledge is no longer sufficient for decision making
• Need to be selective over support for research projects
• Evaluation and strategic planning
– Periodic evaluation of research performance
– Institution, departmental or researcher level assessments
• Accreditation, tenure, faculty review
– Performance indicators
• Used in strategic planning
• Reporting to government bodies, boards of directors/trustees
• Research centers
– Find new staff
– Develop lines of investigation
– Compete for funds
Examples of Research Evaluation – THE World University Rankings
Examples of Research Evaluation – Leiden Ranking
The Leiden Ranking 2011/2012 is based on:
• Publications in Thomson Reuters' Web of Science database in the period 2005-2009.
• Only publications in the sciences and the social sciences are included. Publications in the arts and humanities are excluded because in these domains the bibliometric indicators of the Leiden Ranking do not have sufficient accuracy.
• Furthermore, only publications of the Web of Science document types article, letter, and review are considered in the Leiden Ranking.
THOMSON REUTERS RESEARCH ANALYTICS: USED ACROSS THE GLOBE FOR EVALUATION AND POLICY
Global Research Reports: http://researchanalytics.thomsonreuters.com/grr/
The Metrics
Types of Metrics Used in Research Evaluation
• What types of data are best for which purposes?
– There are no all-purpose indicators
– Start by identifying the question the results are supposed to answer, then collect data
– Clearly define:
• Purpose of the evaluation
• Types of data required
• How the results will be used
Types of citation metrics and what they measure
• Productivity: # papers
• Total influence: total # citations; H-index
• Efficiency: average citation rate; percent of papers cited
• Benchmarking: journal actual/expected citation rate; category actual/expected citation rate; Relative Impact; percentile in category and mean percentile; % papers in top 10% of their field; % papers in top 1% of their field; Aggregated Performance Indicator
• Specialization: Disciplinarity index; Interdisciplinarity index
These metrics can be applied to an institution, a researcher, a research group, etc.
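The H-index listed above under "total influence" has a simple definition: the largest h such that h of the papers have at least h citations each. A minimal sketch of that calculation (our own illustrative code, not a Thomson Reuters tool):

```python
def h_index(citations):
    """Largest h such that h papers have at least h citations each."""
    ranked = sorted(citations, reverse=True)
    h = 0
    for rank, cites in enumerate(ranked, start=1):
        # A paper at position `rank` contributes to the h-index only if
        # it has been cited at least `rank` times.
        if cites >= rank:
            h = rank
        else:
            break
    return h

print(h_index([10, 8, 5, 4, 3]))  # 4: four papers each cited at least 4 times
```

Note how the metric combines productivity and influence: a few very highly cited papers, or many rarely cited ones, both yield a low h.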
Building a Detailed Profile of Research Performance
A profile report offering a range of indicators based on the papers of an author:
• Number of citations per year
• Total citations
• Average cites per paper
• % cited documents
• Total papers
• Papers per collaborating author/institution/country/subject area
Building a Detailed Profile of Research Performance
A summary report offering a range of metrics based on the citation performance of papers compared to global averages (journal and category).
What do institutions want from citation data?
• What is the university's research performance?
• Are we competitive compared with our peers?
• How can the university forecast growth?
• Which are our centers of excellence?
• What is our citation ranking?
• What is the influence of our research?
• Which are our most influential papers?
• Which are our top researchers?
The Data
The Data Used in Research Evaluation
• Web of Science is the source of citation data
• Journal selection process
• Consistent indexing
• International coverage
• Completing the research picture – Book Citation Index
• Consistency is the key to validity
Global Research Community Using Web of Science
6,000 research institutions across 82 countries
Thomson Reuters Web of Science
• World's largest citation index
– 51 million papers
– 800 million citations
• Multidisciplinary
– 250 categories covering science, social sciences, arts & humanities
• 115 years of consistent coverage
– Articles: 1898-2012
– Citations: 1898-2012
• Content
– Journal publications
– Conference proceedings papers
– Books
• Updated weekly
• Exclusively hosted on the Web of Knowledge platform
Why not index all journals?
Chart: cumulative % of the database (articles and citations) against number of journals. It shows that:
• 40% of the journals account for 80% of the publications and 92% of cited papers
• 4% of the journals account for 30% of the publications and 51% of cited papers
How to decide which journals to cover
• Approx. 2,500 journals evaluated annually
– 10-12% accepted
• Thomson Reuters editors
– Information professionals
– Librarians
– Experts in the literature of their subject area
Web of Science journal 'quality': journals under evaluation are either accepted or rejected.
Thomson Reuters Journal Selection
• Publishing standards
– Peer review, editorial conventions
• Editorial content
– Addition to knowledge in a specific subject field
• Diversity
– International and regional influence of authors, editors, advisors
• Citation analysis
– Editors' and authors' prior work
http://thomsonreuters.com/products_services/science/free/essays/journal_selection_process/
Global Research Represented in Web of Science
Journals in Web of Science by region:
• Europe: 6,082 (50%)
• North America: 4,456 (37%)
• Asia-Pacific: 1,031 (9%)
• Latin America: 289 (2%)
• Middle East/Africa: 200 (1%)
Journals in Web of Science by language:
• English: 9,114 (81%)
• Other: 2,147 (19%)
Research in Social Sciences
Rationale for a Book Citation Index – Completing the Research Picture
Editorial Profile of the Book Citation Index
Book Citation Index in Web of Science
Use the Document Types options within "Refine" to get directly to book content only.
Book Citation Index Selection Policy
• Original research
• References / bibliography / footnotes
• Multi or single author
• Books in series or monographs
• Print or e-books
• Copyright year 2005 and later
http://wokinfo.com/products_tools/multidisciplinary/bookcitationindex/
Consistency Is the Key to Validity
• Authoritative data from the world's leading provider of research evaluation data
• Strict selection policy applying consistent criteria over the last 50 years
• This has created a large set of journals containing comparable papers and citations
• One consistent editorial policy (all authors, all addresses, all citations)
• A unique set of multidisciplinary, comparable data
The Metrics
The Metrics – Absolute Counts
Productivity, Total Impact, Efficiency.
How does the research impact compare to similar research? This question cannot be answered by these metrics.
The Metrics – Normalised Metrics
'The number of times that papers are cited is not in itself an informative indicator; citation counts need to be benchmarked or normalised against similar research. In particular, citations accumulate over time, so the year of publication needs to be taken into account; citation patterns differ greatly in different disciplines, so the field of research needs to be taken into account; and citations to review papers tend to be higher than for articles, and this also needs to be taken into account.'
Source: Research Excellence Framework Pilot Study, UK
Is this a highly cited paper?
This paper has been cited 56 times. How does this citation count compare to the expected citation count of other articles published in the same journal, in the same year? It is necessary to normalise for:
• Journal = Nucleic Acids Research
• Year = 2010
• Document type = article
• Category = Biochemistry & Molecular Biology
Creating a benchmark
1. Search for papers that match the criteria
2. Run the Citation Report on the results page
Creating a benchmark
Articles published in 'Nucleic Acids Research' in 2010 have been cited on average 10.26 times. This is the Expected Count.
We compare the Total Citations received by a paper to what is Expected:
56 (Journal Actual) / 10.26 (Journal Expected) = 5.46
The paper has been cited 5.46 times more than expected. We call this Journal Actual/Journal Expected.
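The benchmark calculation above can be sketched in a few lines. This is our own illustrative code, not a Thomson Reuters API; it simply reproduces the Journal Actual/Journal Expected arithmetic:

```python
def actual_over_expected(actual_cites, expected_cites):
    """Ratio of a paper's citations to the average citation count of
    comparable papers (same journal, same year, same document type)."""
    return actual_cites / expected_cites

# The Nucleic Acids Research example from the slides:
ratio = actual_over_expected(56, 10.26)
print(round(ratio, 2))  # 5.46 -> cited 5.46 times more than expected
```

The same ratio can be computed against a category baseline instead of a journal baseline, giving the Category Actual/Expected metric mentioned earlier.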
The Metrics – Normalised Metrics
• The research with Wellcome Trust has been cited 4.36 times more than expected at journal level.
• The research with Wellcome Trust has been cited 17.22 times more than expected at category level.
• Papers on average are in the top 20% of their field (based on citation performance).
By providing a suite of metrics one can build up a more accurate description of performance, in this case the performance of collaborations. Because the data is normalized, relevant comparisons can be made even when the collaborations are focusing on different subject areas and have different lengths of publication history.
Using Citation Metrics Wisely – Ten Rules in Using Publication and Citation Analysis
1. Consider whether available data can address the question.
2. Choose publication types, field definitions, and years of data.
3. Decide on whole or fractional counting.
4. Judge whether data require editing to remove "artifacts".
5. Compare like with like.
6. Use relative measures, not just absolute counts.
7. Obtain multiple measures.
8. Recognize the skewed nature of citation data.
9. Confirm that the data collected are relevant to the question.
10. Ask whether the results are reasonable.
And, above all, present the results openly and honestly.
White Paper: Using Bibliometrics in Evaluating Research. David A. Pendlebury, Research Department, Thomson Reuters, Philadelphia, PA, USA
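Rule 3 (whole vs fractional counting) is worth a concrete illustration. With whole counting, every collaborating institution gets full credit for a paper; with fractional counting, the credit is split among them. A minimal sketch with made-up institution names (our own example, not from the white paper):

```python
from collections import defaultdict

# Each paper lists the institutions appearing in its address field.
papers = [
    ["Uppsala", "Lund"],               # two institutions share one paper
    ["Uppsala"],
    ["Karolinska", "Uppsala", "Lund"],
]

whole = defaultdict(float)
fractional = defaultdict(float)
for authors in papers:
    institutions = set(authors)
    for inst in institutions:
        whole[inst] += 1.0                      # full credit to each
        fractional[inst] += 1.0 / len(institutions)  # split credit

print(whole["Uppsala"], round(fractional["Uppsala"], 2))  # 3.0 1.83
```

The two schemes can rank heavily collaborating institutions quite differently, which is why the rule says to decide on one up front and state it.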
Research in Sweden
Productivity 1981-2011
Chart: number of Web of Science records per year, 1981-2011, with at least one occurrence of Sweden in the address. Total = 596,421.
Source: Web of Science™
Collaborations
Chart: records for the top 20 collaborating countries.
Source: Web of Science™
Research in Sweden
A trended graph presenting the Relative Impact for selected Swedish institutions for the period 1981-2010 in 5-year groupings. The impact (cites per paper) for most institutions has been consistently above the global average impact for the 30-year period. Karolinska Institute has maintained an impact that is twice the global average. The relative impact of Stockholm University has increased dramatically since the period ending 2002.
Source: InCites™
Research in Sweden 1990-2010
A cumulative graph showing Impact Relative to Field: impact in Computer Science for each institution relative to the average impact in Computer Science. The red line represents the global average in Computer Science. Uppsala University has an impact in Computer Science that is more than 8 times above the global average.
Source: InCites™
Research in Sweden by Field 1990-2010
A multi-subject cumulative graph showing Impact Relative to Field. The graph presents Sweden's impact in multiple subject areas compared to the global average impact in each subject area.
Source: InCites™
Research Analytics for Research Evaluation
Citation metrics as a measure of research output and impact
A key piece of the research performance puzzle
Solutions
InCites components
Research Performance Profiles
• Bibliometrics driven
• Internal view: data for your institution's published work
• Granular: detail at the paper, author, discipline level, and more; collaboration data
• Customized: based on customer requirements; optional author-based data sets
• Current: updated quarterly
Global Comparisons
• Bibliometrics driven
• Comparative: across institutions and countries
• Top-level: detail summarized at the institution and country level for various disciplines
• Standardized: annual production, uniform data cut-off for all institutions
• Diverse: a wide range of metrics place the influence of published research into multiple perspectives for an institution
Institutional Profiles
• 360° view of the world's leading research institutions
• Academic Reputation Survey
• Institution-submitted data
• Bibliometrics
• A huge undertaking, providing a unique set of data, objective and subjective, and tools through which to examine and compare institutional resources, influence, and reputation
Key Questions in Research Evaluation
1. We want to monitor our research output and measure the performance of departments, authors, funding, collaborations...
Institutional Comparisons – compare your research output and impact to that of other institutions
1a. Monitor research output...
Productivity per institution 1981-2010: number of papers per 5-year grouping
1b. Monitor research impact...
Average impact 1981-2010 (average cites per paper) in 5-year groupings
1c. Research Performance Profiles: monitor the performance of departments
1. Select a report type
2. Select a department
3. Customise the report
1c. Measure the performance of departments. Which are our centres of excellence?
Citation performance metrics measuring the impact of papers produced by these departments against global averages
1d. Measure the performance of collaborations. Which collaborations are the most valuable?
These collaborations have an impact that is more than 3 times above the global expected impact.
1e. Measure the performance of researchers. Which authors' papers have performed the best in their field?
School of Material Science
Faculty of Life Sciences
1e. Overview of a researcher...
Author publication profile, including papers in the top 1% of their field
1f. Measure the performance of funding...
Web of Science has captured funding information since 2007.
1f. Evaluating research funded by the National Science Foundation – paper-level metrics
1g. Measure the performance of papers. Which papers (articles) are in the top 1% of their field?
1. Determine the indicators to include in the report
2. Customise the report using the delimiters
1g. Granular data – paper-level metrics
A range of metrics that provide a complete profile of the performance of this paper compared to similar papers. Percentile in Field measures the performance of a paper based on citations. Re-order the papers using the range of metrics and standard bibliometric data.
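The Percentile in Field idea can be sketched as follows. This is our own simplified illustration (lower percentile = better, so a "top 1%" paper has a percentile below 1), not the exact InCites algorithm:

```python
def citation_percentile(paper_cites, field_cites):
    """Share of papers in the field cited at least as often as this
    paper, expressed as a percentage (lower = better)."""
    at_least_as_cited = sum(1 for c in field_cites if c >= paper_cites)
    return 100.0 * at_least_as_cited / len(field_cites)

# Hypothetical citation counts for ten papers in the same field and year:
field = [0, 1, 1, 2, 3, 5, 8, 13, 21, 56]
print(citation_percentile(21, field))  # 20.0 -> top 20% of its field
```

In practice the field baseline must hold papers of the same category, publication year, and document type, for exactly the reasons given in the normalisation quote earlier.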
Global Comparisons: Institutional Profiles
• Updated annually
• Standardised
• Top-level view
• National and international comparisons
• Data common to all subscribers
• Ability to include 'World' for global averages
1. Select institutions
2. Select subject
3. Select time period
2. How does our research in a subject area perform nationally?
Impact relative to subject area in Material Science 1981-2010
3. How does our research impact (overall) compare to others internationally?
Impact relative to world, latest 10 years
4. How does our impact in 'Chemistry' compare to others internationally?
Institutional Profiles
• Institutional Profiles is a dynamic web-based resource presenting portraits of more than 550 of the world's leading research institutions.
• Through rigorous collection, vetting, aggregation and normalization of both quantitative and qualitative data, the profiles present details on a wide array of indicators such as faculty size, reputation, funding, citation measures and more.
• Citation metrics from Web of Science℠
• Profile information from the institutions themselves
• Reputational data from the Global Institutional Profiles Project. Data from this project is used by THE to inform the World University Rankings.
Research Footprint
Broad indicators to provide a complete profile of an institution's performance
5. How does our research income compare to others, and are we seeing a return on investment?
6. With whom should we collaborate?
X axis = research staff; Y axis = research income
7. How is our reputation perceived globally?
Strong research reputation in Europe for Uppsala University
Research Management
ResearcherID – Author-Level Research Management
Export papers to RID
Sign in or join RID.com
RID Profile
ResearcherID: 118,000+ users globally
Author-submitted data
RID fully integrated with Web of Science
Research In View – Site-Level Management
Research In View – Activity Types
Biographical Information
• Positions, Degrees, Certifications, Bar Admissions, Licenses, Clinical Interests, Languages
• Preferred Personal Information, Media Contact Subjects, Biographical Narrative, Focus of Work
Teaching
• Undergrad / Grad / Professional Courses Taught, CE / Extension Courses Taught
• Advising and Mentoring, Student Advising, Postdoctoral Advising
• Awards for Teaching, Approach & Goals to Teaching, Evaluation of Teaching
Published Works
• Books & Monographs, Edited Books, Book Chapters, Journal Articles
• Conference Papers & Proceedings, Technical Reports, Reference Works/Entries, Legal Proceedings, General Press
• Pre-publication Items, Internet Communications, Scholarly Presentations, Data Sets
Creative Works
• Artwork, Musical Works & Performances, Audiovisual Works
• Software & Online Multimedia, Equations and Figures, Inventions and Patents, Other Creative Works
Service
• Editorships, Professional Societies, Consultant Services, Clinical Services, Clinical Trials, Quality Indicators Service
• University Committees, Advising Student Groups, Strategic Initiatives, Awards for Service, Student Life Activities
Funding
• Funded Grants/Contracts, Pending Grants/Contracts, Training Grants
• Clinical Trials Support, Scholarly Awards, Research Funding
Civic Engagement & Partnerships
• Outreach Initiatives, Community Partners, University Partners, Partnerships & Collaborations
Facilities & Instrumentation
• Facilities, Instrumentation
A wide range of profile activities.
Faculty Profile Data Sources
Institutional data
• Personnel information (Human Resources)
• Courses & evaluations (Registrar)
• Inventions (Tech Transfer)
• Financial information (Accounts dept.)
• Document repository
Thomson Reuters data
• Web of Science (journals and proceedings)
• Medline
• Selected grant awards
• Intellectual Property Offices (patents)
• Library of Congress books
Faculty data input
• Creative works
• Narratives
• Awards/honors
• Civic engagements/partnerships
RIV – Adding Thomson Reuters Data
Adding publications by connecting to Science Wire
RIV – Inputting Faculty Data
Research In View – Connect to EndNote
Research In View – Publications
• Publications link to the source in Web of Science, Medline & the original publisher
• The recommendation engine presents suggested publications in your name
• Faculty can obscure selected items from public view
Research In View – Recommendation Engine
The recommendation engine learns from your selections.
RIV – Create a Customised Document
Select the activities to include in the document
THOMSON REUTERS RESEARCH EVALUATION & INFORMATION MANAGEMENT
Thank You
Account Manager – Emma Dennis
Global Sales Support – Rachel Mangan
[email protected] / [email protected]