Bibliometrics and responsible use of metrics in practice...
May 29, 2018
Bibliometrics and responsible use of metrics in practice for library staff
Dr. Alicia F. Gómez Sánchez
Research and Scholarly Communications Information Manager
@fagomsan
UEF Library Staff Research Evaluation Workshop
PART 1A. BIBLIOMETRICS INDICATORS
[Diagram: nested fields of metric studies — informetrics encompassing scientometrics and bibliometrics, which in turn encompass cybermetrics and webometrics. Adapted from Björneborn & Ingwersen (2004, p. 1217)]

▪ Bibliometrics: the statistical analysis of written publications, such as books or articles.
▪ Informetrics: study of the quantitative aspects of information, including the production, dissemination, and use of all forms of information, regardless of its form or origin.
▪ Cybermetrics: study of the quantitative aspects of the construction and use of information resources, structures and technologies on the whole Internet (including electronic resources), drawing on bibliometric and informetric approaches. (Björneborn, 2004)
▪ Webometrics: study of the quantitative aspects of information resources, structures and technologies on the WWW, drawing on bibliometric and informetric approaches. (Björneborn, 2004)
▪ Altmetrics: alternative metrics that cover citation counts, and also other aspects of impact such as article views, downloads, or mentions in social media and news media.
Classic bibliometrics laws and definitions
▪ Lotka, 1926: author productivity, examining the publication contributions of authors to a given discipline.
▪ Bradford, 1934: journal productivity, examining the concentration of articles in a subject area within a set of scholarly journals.
▪ Zipf, 1949: word usage, examining the frequency of occurrence of words within texts.
▪ Scientometrics, by Nalimov & Mulchenko, 1969: "the application of those quantitative methods which are dealing with the analysis of science viewed as an information process"
▪ Bibliometrics, by Pritchard, 1969: "the application of mathematical and statistical methods to books and other media of communication"
Main document types
▪ Traditionally, papers published in periodicals and serials
▪ Citable items = research articles, short communications and notes, letters, reviews, and proceedings papers
▪ Book reviews, editorials, corrections/errata, meeting abstracts and reprints are not considered original research output.
Source: Zhang et al. JASIST 2011; 62(7), 1403-1411
A, Article; L, Letter; R, Review; B, Book review; E, Editorial material; M, Meeting abstract.
Types of metrics
Journal Level metrics
• Criteria to be indexed in a database (Academic Journal Guide, Directory of Open Access Journals, Journal Citation Reports, etc.)
• Impact factor (Web of Science, JCR) or CiteScore (Scopus Journal Metrics)
Article Level Metrics
• Citations
• Downloads, views.
• Altmetrics (mentions, citations in policy documents or syllabus, etc.)
Author Level Metrics
• h-index
• h5-index
• i10-index
Source: Isidro Aguillo. Course "How do we evaluate our scientific output?: A critical seminar" ["¿Cómo evaluamos nuestra producción científica?: Seminario crítico"], ICOMEM, 12.04.2018, Madrid
Metrics by level
Macro-level
• global developments
• national R&D systems
• policies
• cross-sectional fields
Meso-level
• research and grant programs
• academic fields
• universities, research institutes, funding agencies
Micro-level
• university institutes/departments
• target/status groups
• research groups
• individuals
Types of indicators
▪ Productivity / Activity → number of publications to reflect the research output.
▪ Visibility → count of publications in recognized databases; number of articles in peer reviewed journals; IF, quartiles or deciles.
▪ Collaboration → number of co-authors or co-affiliations to reflect national and international networking
▪ Impact → citation rates (several citation indicators)
▪ Cognitive structures → co-occurrences of words, classifications, relations between citations, etc.
▪ Others → main authorship, percentage of contribution, characterization of publications and disciplines, disciplinary vs cross-disciplinary vs interdisciplinary, etc.
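Visibility indicators above often report a journal's quartile or decile within its subject category. As a minimal sketch (the function name and the rank/size figures are illustrative, not from the slides), the band can be derived from the journal's rank and the category size:

```python
import math

def quantile_position(rank: int, total: int, bands: int = 4) -> int:
    """Return the 1-based band (quartile by default, decile with bands=10)
    for a journal ranked `rank` out of `total` in its subject category."""
    if not 1 <= rank <= total:
        raise ValueError("rank must be between 1 and total")
    return math.ceil(bands * rank / total)

# A hypothetical journal ranked 40th of 209 in its category:
print(quantile_position(40, 209))      # quartile 1 (Q1)
print(quantile_position(40, 209, 10))  # decile 2 (D2)
```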
The impact factor (IF) of an academic journal is a measure reflecting the average number of citations to recent articles published in the journal.
It is frequently used as a proxy for the relative importance of a journal within its field; journals with higher impact factors are deemed more important than those with lower ones.
In any given year, the impact factor of a journal is the average number of citations received per paper published in that journal during the two preceding years.
Useful for:
▪ Measuring the journal's status in its scientific community
▪ Identifying the most influential journals and deciding where to publish
▪ Identifying which journals carry the most widely read articles

The star is the IF:

IF = (citations received in the current year to articles published in the previous 2 years) / (number of articles published by the journal in the previous 2 years)
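The two-year formula can be checked with simple arithmetic; this sketch (all figures are hypothetical) makes the division explicit:

```python
def impact_factor(citations_this_year: int, citable_items_prev_two_years: int) -> float:
    """Two-year impact factor: citations received in year Y to items the
    journal published in years Y-1 and Y-2, divided by the number of
    citable items it published in those two years."""
    return citations_this_year / citable_items_prev_two_years

# Hypothetical journal: 480 citations in 2017 to its 2015-2016 papers,
# of which there were 200 citable items:
print(impact_factor(480, 200))  # 2.4
```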
How a single article can affect the IF of a journal
Source: Isidro Aguillo. Course "How do we evaluate our scientific output?: A critical seminar" ["¿Cómo evaluamos nuestra producción científica?: Seminario crítico"], ICOMEM, 12.04.2018, Madrid
72 journals:
- JIF 0–0.999: 16
- JIF 1–1.999: 9
- JIF 2–2.999: 26
- JIF 3–3.999: 13
- JIF 4–4.999: 10
- JIF more than 5: 7
Some pros and cons of the impact factor
Problems:
• Top journals or journals with an IF >10 are very restricted
• Top journals vary highly by discipline
• Coverage is not uniform
• Some areas or disciplines may vary significantly over a period of 2 or 5 years
In addition:
• The impact factor is not the only indicator of the quality of publications
• For a broader vision we should consider other indicators
• The non-inclusion of a journal is not synonymous with poor quality
• Lack of differentiation between different types of documents
Nevertheless:
✓ The inclusion of a journal in the JCR is already an indicator that the papers published in it have passed quality filters and are cited
✓ It is a universally recognized way to measure the quality of research
h-index
“A scientist has index h if h of his or her Np papers have at least h citations each and the other (Np − h) papers have ≤ h citations each”.
Hirsch, JE. PNAS 2005;102(46): 16569-72
For instance: an h-index of 25 tells us that an author has written 25 papers which have each been cited at least 25 times.
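Hirsch's definition translates directly into a few lines of code. A minimal sketch (function names and the citation counts are illustrative; the i10-index shown alongside is Google Scholar's count of papers with at least 10 citations):

```python
def h_index(citations):
    """Largest h such that h papers have at least h citations each."""
    h = 0
    for rank, cites in enumerate(sorted(citations, reverse=True), start=1):
        if cites >= rank:
            h = rank
        else:
            break
    return h

def i10_index(citations):
    """Number of papers with at least 10 citations (Google Scholar)."""
    return sum(1 for cites in citations if cites >= 10)

paper_citations = [45, 33, 30, 12, 9, 4, 2, 0]
print(h_index(paper_citations))    # 5 (five papers with >= 5 citations each)
print(i10_index(paper_citations))  # 4
```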
Limitations
▪ Some metrics and tools are still not well established for some subject areas (Arts, Humanities and Social Sciences), or for some document types (books, commissioned reports…)
▪ Coverage of tools is not always comprehensive
▪ Metrics tend not to account for age of researcher
▪ Citation patterns vary between disciplines or types of document
▪ Self-citations can distort metrics
▪ Citations to a paper may not reflect its quality
PART 1B. INFORMATION SOURCES
Bibliometric data sources
Other bibliographic sources
Analytic tools: SciVal
https://www.scival.com
Analytic tools: Essential Science Indicators (i)
Analytic tools: Essential Science Indicators (ii)
Sources for alternative metrics
Social Impact of Research: Altmetrics (alternative metrics and tools)
Downloads and views
Publish or Perish
New tools
http://www.metrics-toolkit.org
https://www.lens.org
Some metrics using SciVal
Activities
i. Make a list of the tools you know, adding the tools you discovered in this session. On a scale from 1 to 5, how deep do you consider your knowledge of them?
ii. What kind of questions do you think you could ask?
iii. Choose a database in your working area and find out what range of information you can find, and how you can apply it in a metrics report.
PART 2. RESPONSIBLE USE OF METRICS
Differences by disciplines (WoS or Scopus):
- SJR 28.188, 3/1948: Biochemistry, Genetics, Molecular Biology
- SJR 2.382, 2/209: Library and Information Sciences
- SJR 8.021, 4/1665: Mathematics
- SJR 1.681, 5/246: Archeology
…more than journal articles and conference papers
In November 2007 the European Association of Science Editors (EASE) issued an official statement recommending "that journal impact factors are used only—and cautiously—for measuring and comparing the influence of entire journals, but not for the assessment of single papers, and certainly not for the assessment of researchers or research programs". Source: Blog Green Tea and Velociraptors
The movement for responsible metrics

▪ San Francisco Declaration on Research Assessment (DORA), 16/12/2012. http://am.ascb.org/dora/
▪ Bibliometrics: The Leiden Manifesto for research metrics. Hicks, D., et al. (2015). Nature 520, 429-31. DOI: 10.1038/520429a (22/04/2015)
▪ The Metric Tide: Role of Metrics in Research Assessment and Management. Wilsdon, J., et al. (2015). DOI: 10.13140/RG.2.1.4929.1363 (07/2015)
▪ *metrics – Measuring The Reliability and perceptions of Indicators for interactions with sCientific productS (18/05/2017). https://metrics-project.net/
▪ Metrics Toolkit: a tool that provides guidance for demonstrating and evaluating claims of research impact (30/01/2018). http://www.metrics-toolkit.org/
The Leiden Manifesto

10 principles to guide research evaluation:
1. Quantitative evaluation should support qualitative, expert assessment.
2. Indicators used to evaluate performance should relate clearly to the program goals.
3. Protect excellence in locally relevant research.
4. Keep data collection and analytical processes open, transparent and simple.
5. Allow those evaluated to verify data and analysis.
6. Account for variation by field in publication and citation practices.
7. Base assessment of individual researchers on a qualitative judgement of their portfolio.
8. Avoid misplaced concreteness and false precision.
9. Recognize the systemic effects of assessment and indicators.
10. Scrutinize indicators regularly and update them.
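Principle 6 (account for field variation) is usually operationalized by normalizing a paper's citation count against the average for its field, year, and document type. A minimal sketch with hypothetical baseline values (real baselines come from tools such as InCites or SciVal; the function name is illustrative):

```python
def normalized_citation_score(citations: float, field_baseline: float) -> float:
    """Citations divided by the average citations of comparable papers
    (same field, publication year, and document type). A score of 1.0
    means 'cited at the world average for its field'."""
    if field_baseline <= 0:
        raise ValueError("field baseline must be positive")
    return citations / field_baseline

# 12 citations is well above average in a slow-citing field (baseline 3)
# but only average in a fast-citing one (baseline 12) -- hypothetical values:
print(normalized_citation_score(12, 3))   # 4.0
print(normalized_citation_score(12, 12))  # 1.0
```

This is why raw citation counts should never be compared across, say, mathematics and molecular biology without such a baseline.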
https://twitter.com/LizzieGadd/status/995264590131982337
https://twitter.com/tanvir_h/status/995369945876844547
Where to search for the appropriate metrics
http://www.metrics-toolkit.org
Statement of responsible metrics
https://thebibliomagician.wordpress.com/resources/
PART 3. BIBLIOMETRICS IN PRACTICE
Group activity: Self-analysis of the publications context in the UEF
i. How much do you know about the publication trends in your institution?
ii. What is the % of publications affiliated with the UEF deposited in your CRIS/repository?
iii. Can you improve those numbers? How?
iv. What does your validation control look like? Do you think you can improve it?
v. Which curation procedures do you have?
vi. Are UEF authors consistent in using names and affiliations? Are they attributing funding?
vii. Are UEF authors publishing in predatory journals?
Control through alerts
Ask for corrections
Review profiles
Merge author profiles
Review documents. Option 1: add publications
If the document is already on the list, the option to ‘Add selected articles’ will not be available.
Review documents. Option 1: add publications (cont.)
Option 2: delete publications
PART 4. LIBRARY SUPPORT ON RESEARCH ASSESSMENT
How should research be evaluated?
Photo credit: Matt Lavoie https://medium.com/@MattPLavoie
How can we assess and support research evaluation?
▪ Monitoring the institutional scientific output
▪ Developing an internal workflow and setting alerts to register the institution’s scientific output
▪ Developing bibliometric reports and helping users with expert bibliometric searches
▪ Identifying new research lines and possibilities for collaboration
▪ Evaluating external candidates
▪ Preparing CVs and profiles for appraisals and funding applications
▪ Science Experts Network Curriculum Vitae (SciENcv)
▪ Unique digital identifiers: ResearcherID, ORCID, Scopus ID
▪ Assessing researchers regarding publication sources and strategies
Questions or report possibilities
▪ How much research is taking place?
▪ In what fields is research being conducted?
▪ Where is your work having the greatest impact?
▪ Does that research have impact on other research fields?
▪ Benchmarking: comparing research groups or institutions
▪ Identifying research fronts
▪ Evaluating differences in citation tendencies between different fields
▪ Citation patterns between research groups or journals
Questions for consideration regarding the use of metrics in assessing research performance
▪ What is the optimal balance between direct peer reviewing, and the use of quantitative measures based on publication records?
▪ What weighting should be applied to publication number, h-index, journal IF, citation number, primary publication vs review?
▪ Noting that the order can vary considerably from one field to another, how much credit should be given to first or last authorship, middle authorship, or corresponding authorship?
▪ What credit should be given for pre-prints and other electronic publications, peer reviewed or not? Should impact indices such as downloads, or links be taken into consideration?
▪ Should the number of publications that count towards grants or appointments be capped? For example, should only the best three publications per year be taken into consideration? Should scientists even be penalized for authorship on more than, say, 20 publications per year?
▪ What weighting should be given to other quantitative measures of research output, such as patent applications, patents granted or patents licensed?
Source: Publication practices and indices and the role of peer review in research assessment (July 2008) "International Council for Science statement". Icsu.org
Training and advice
Advice about where to publish
How to avoid predatory publishers
http://thinkchecksubmit.org
How to search for a reputable publisher
▪ The journal or proceedings needs to come from a reputable publisher (some applicable criteria: is the publisher identifiable and well known? Is it indexed in the main reliable databases? Is the journal hosted on one of INASP’s Journals Online platforms? Is the publisher linked with a reputable association? etc.).
▪ Fully OA journals should be listed in the Directory of Open Access Journals (DOAJ) or be a member of the Open Access Scholarly Publishers Association (OASPA).
Search for a top journal by category
https://journalmetrics.scopus.com/
ROUND-UP, REFLECTIONS, AND CONCLUSIONS
What did you learn?
What can we implement?
Where would you like to improve your knowledge?
Ideas for next year?
Questions
https://orcid.org/0000-0003-4898-1680
http://es.linkedin.com/in/aliciafgomez
@fagomsan