Assessing research performance: missions and metrics
Measuring Science and Research Performance, 8 – 12 September 2014
Paul Wouters
A SIMPLE idea underpins science: “trust, but verify”. Results should always be subject to challenge from experiment. That simple but powerful idea has generated a vast body of knowledge. Since its birth in the 17th century, modern science has changed the world beyond recognition, and overwhelmingly for the better. But success can breed complacency. Modern scientists are doing too much trusting and not enough verifying—to the detriment of the whole of science, and of humanity.
Too many of the findings that fill the academic ether are the result of shoddy experiments or poor analysis (see http://www.economist.com/news/briefing/21588057-scientists-think-science-self-correcting-alarming-degree-it-not-trouble). A rule of thumb among biotechnology venture-capitalists is that half of published research cannot be replicated.
Evaluation Gap
➡ discrepancy between evaluation criteria and the social and economic functions of science
➡ evaluation methods (esp. qualitative) have not adapted to the increased scale of research
➡ available quantitative measures are often not applicable at the individual level
➡ lack of recognition for new types of work that researchers need to perform
[Diagram labels: social process, knowledge production, evaluation]
Citation theories
Theories about citing behaviour
• How do researchers decide what to cite?
• What can be inferred from patterns of citation?
Scientific tradition requires that scientists, when documenting their own research, refer to earlier works that relate to the subject matter of their reported work. (Nicolaisen 2007, p. 610)
• Referencing as a scholarly practice emerged gradually from the twelfth century onwards (Grafton 1997)
• Since the creation of the SCI, recurring calls for “a citation theory”, without clear result
• Mulkay 1974: not clear how references reflect scientific influence
• 1981, three key publications:
– Cozzens: review of sociological citation theories
– Cronin: information science citation theory
– Smith: we do not know enough about citing behaviour
As Kochen (1987, p. 54) has noted, “a paper that conforms to the norms of scholarly perfection would explicitly cite every past publication to which it owes an intellectual debt.”
This ideal has long been debated by information scientists and others, with discussion centering around two fundamental questions: (1) What makes authors cite / not cite their influences? and (2) To what extent is the ideal exemplified?
Nicolaisen ARIST 2008, p. 612
Citing motivations
Popular between 1965 and 1979: developing classifications of citing motivations:
• Moravcsik and Murugesan (1975): perfunctory citations
• Brooks (1985): seven motives
• A number of surveys: only a small share of citations are negative
L. C. Smith's (1981, pp. 87-89) list of basic assumptions underlying citation analysis in general:
• Citation of a document implies use of that document by the citing author.
• Citation of a document (author, journal, etc.) reflects the merit (quality, significance, impact) of that document (author, journal, etc.).
• Citations are made to the best possible works.
• A cited document is related in content to the citing document.
• All citations are equal.
Critiques of normative citation theory
• MacRoberts and MacRoberts (1980s): "if one wants to know what influence has gone into a particular bit of research, there is only one way to proceed: head for the lab bench, stick close to the scientist as he works and interacts with colleagues, examine his lab notebooks, pay close attention to what he reads, and consider carefully his cultural milieu." (MacRoberts & MacRoberts, 1996, p. 442)
• Brooks (1985, p. 228) found that about 70 percent of citations were multiply motivated, concluding that "no longer can we naively assume that authors cite only noteworthy pieces in a positive manner. Authors are revealed to be advocates of their own points of view who utilize previous literature in a calculated attempt to self-justify."
Citing as persuasion work
Authors preparing papers will tend to cite the “important and correct” papers, may cite “erroneous” papers in order to challenge them and will avoid citing the “trivial” and “irrelevant” ones.
Indeed, respected papers may be cited in order to shine in their reflected glory even if they do not seem closely related to the substantive content of the report. (Gilbert 1977)
Indicator theories
• Van der Veer Martens (2001, online): the "holy grail" of scientometrics is the development of indicator theories rather than the development of theories of citing behaviour
• Two different approaches:
– Semantic studies: Small (1978): co-citations as concept markers
– Semiotic studies: Wouters (1998; 1999): analyze citations as sign systems; further extended by Cronin (2000)
“If one wishes to be precise, one should distinguish between the notions 'reference' and 'citation'. If paper R contains a bibliographic note using and describing paper C, then R contains a reference to C and C has a citation from R (Price 1970). Stated otherwise, a reference is the acknowledgement that one document gives to another, while a citation is the acknowledgement that one document receives from another. So, 'reference' is a backward-looking concept while 'citation' is a forward-looking one. Although most authors are not so precise in their usage of both terms, we agree with Price (1970) that using the words 'citation' and 'reference' interchangeably is a deplorable waste of a good technical term.” (Egghe & Rousseau 1990)
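The Price/Egghe & Rousseau distinction maps naturally onto a directed graph: an edge from R to C means R gives a reference, and C receives a citation. A minimal sketch in Python; the paper identifiers and data are invented purely for illustration:

```python
# Toy bibliography: each key is a paper, each value is its reference list.
# An entry "R1": ["C1", "C2"] means paper R1 contains references to C1 and C2.
refs = {
    "R1": ["C1", "C2"],  # R1's reference list (backward-looking)
    "R2": ["C1"],
    "C1": [],            # C1 cites nothing
}

def reference_count(paper):
    """References a paper gives (backward-looking)."""
    return len(refs.get(paper, []))

def citation_count(paper):
    """Citations a paper receives (forward-looking)."""
    return sum(paper in reference_list for reference_list in refs.values())

print(reference_count("R1"))  # prints 2: R1 gives two references
print(citation_count("C1"))   # prints 2: C1 is cited by R1 and R2
```

The asymmetry in the quote falls out directly: a paper's reference count is fixed the moment it is published, while its citation count can keep growing as new papers appear.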
Semiotics
Cronin (2000)
• Polysemy of signs: multiple interpretations are possible
• Acknowledgements and altmetrics
• The value of semiotics lies in its sensitivity to the variation in interpretation across evaluative contexts
• Limited to purely symbolic analysis (stays within the sign system) but we need the material context
• Puts “the meaning of the citation” central, but meaning for whom? The role of the citation
“More specifically, if an individual’s, department’s or university’s ability to amass symbolic capital of this kind were to become the critical determinant of future research funding and career advancement, then it would not be difficult to imagine distortions creeping into the system, as players devised recruitment, publication, collaboration and citation harvesting stratagems to accelerate and maximise the accrual of symbolic capital.” (Cronin 2000, p. 450)
The publication cycle
4/29/2010
The peer review cycle
The citation cycle
[Diagram labels: word, co-word]
Two interacting cycles
[Diagram label: co-word]
4/29/2010 History SCI Madrid
Implications
• Indicator: embodiment of a specific newly created link between the formal and the paradigmatic
• Not one and only but multiple indicator theories
• Building indicators is extending the representational systems
• Citation theories are performative
Types of interactions
• Indicators directly used in funding decisions
• Indicators may indirectly redefine what scientific quality means
• The maps of science may influence priorities
• Scientists may validate indicators or maps
• Scientists may help construct indicators
Citation in two contexts
• Most citation theories based on communication system of science
• This is not identical to the social institution of evaluation in science
• Explaining the social life of citation indicators should be based on the latter
• Example: the black hole in “informed peer review”
The sociology of quality
Questions
• How does the evaluation of scientific or scholarly quality affect the creation of knowledge?
– Which concept of “quality” can be used to understand this interaction?
– Which concept of “science” or “knowledge” should we use?
Knowledge as infrastructure
• Infrastructures are not constructed but evolve
• Transparent structures taken for granted
• Supported by invisible work
• They embody technical and social standards
(Edwards, A Vast Machine, 2010)
Quality
• Substantive (expert based)
• Formalized (procedural – meta method?)
• Ethnographic (actor defined)
• Sociological (power or interest based)
• Semiotic (translation)
• Proposal:
quality is not an intrinsic property at the level of the individual but an effect of infrastructures
Quality – alternative definition
• Quality is the level of “fit” between a particular work and the infrastructure to which it aspires
• Quality is multi-dimensional: more than 1 infrastructure at the same time
• Quality is distinct from the interests of the author
• New infrastructures can emerge from a lack of fit
• Innovativeness can be an aspect of quality but need not be required
• Quality can be measured but only partially
Mushroom growth of evaluation
• Relatively recent phenomenon (since mid 1970s)
• Formal evaluation protocols: performance indicators all over the place but citation indicators hardly visible
• Science policy studies tend to underestimate the proliferation and impact of indicator based evaluations
• Recent studies focus on performance based funding
• “Anecdotal evidence” shows the proliferation of especially the Hirsch Index and the JIF
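The Hirsch index mentioned above has a simple definition: the largest h such that h of an author's papers each have at least h citations. A minimal sketch of the computation, with citation counts invented for illustration and no tie to any particular database:

```python
def h_index(citations):
    """Hirsch index: the largest h such that the author has h papers
    with at least h citations each (Hirsch, 2005)."""
    h = 0
    # Rank papers from most to least cited; h is the last rank at which
    # the paper at that rank still has at least `rank` citations.
    for rank, count in enumerate(sorted(citations, reverse=True), start=1):
        if count >= rank:
            h = rank
        else:
            break
    return h

# Six papers: three papers have at least 3 citations each (25, 8, 5),
# but only three papers reach 4, so h = 3.
print(h_index([25, 8, 5, 3, 3, 1]))  # prints 3
```

Note how insensitive the measure is to the tail and the top: the uncited paper and the extra 20 citations on the most-cited paper change nothing, which is part of why such single-number indicators invite the strategic behaviour discussed later in the talk.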
New trends in assessment
• Increased bibliometric services at university level available through databases
• Increased self-assessment via “gratis bibliometrics” on the web (h-index; publish or perish; etc.)
• Emergence of altmetrics
• Increased demand for bibliometrics at the level of the individual researcher
• Societal impact measurements required
• Career advice – where to publish?
Peter Dahler-Larsen, The Evaluation Society
– “Evaluations are not something that the individual can reject”
– Evaluation as disembedded reflexive social practice
– Evaluation consists of:
• Evaluand
• Criteria
• Systematic methodology
• Purpose
Evaluation Machines
• Primary function: make stuff auditable
• Mechanization of control – degradation of work and trust? (performance paradox)
• Risks for evaluand and defensive responses
• What are their costs, direct and indirect?
• Microquality versus macroquality – lock-in
• Goal displacement & strategic behaviour
Constitutive effects
• Limitations of conventional critiques (e.g. ‘perverse or unintended effects’)
• Effects:
– Interpretative frames
– Content & priorities
– Social identities & relations (labelling)
– Spread over time and levels
• Not a deterministic process
• Democratic role of evaluations
Effects of indicators
• Intended effect: behavioural change
• Unintended effects:
– Goal displacement
– Structural changes
• The big unknown: effects on knowledge?
• Institutional rearrangements
• Does quality go up or down?
Responses of the scientific community
• Strategic behaviour
• Ambivalence
• Sophisticated understanding of indicators and citation numbers
• Responses vary by discipline, style, position (Hargens and Schuman 1990)
• “Self-interest” not a valid explanation