Advanced thoughts on critical appraisal of qualitative research

Esquire course 2015, University of Sheffield

Karin Hannes, Faculty of Psychology and Educational Sciences, KU Leuven

The aspirin case

The aspirin case transferred to critical appraisal

• The more you appraise, the lower the chance of ending up with flawed results. The main criterion is quality!

• The more you appraise, the more it stifles creativity. The main criterion is relevance!

Qualitative Scientists

Qualitative Inquirists

The aspirin case transferred to critical appraisal

Basic questions: First issue to consider

Take a good, hard look at yourself and then answer the questions!

The worldview of a review author as a determining factor

Qualitative Science / Qualitative Inquiry

Meta-ethnography

Critical Interpretive synthesis

Meta-aggregation

Thematic synthesis

Framework synthesis

Grounded theory

Meta-narrative

Ecological triangulation

The worldview of a review author as a determining factor

Subjective idealism: Meta-narrative, Critical interpretive synthesis

Objective idealism: Meta-ethnography, Grounded theory

Critical realism: Thematic synthesis, The JBI meta-aggregative approach, Framework synthesis

Scientific realism: Ecological triangulation

Based on Barnett-Page and Thomas, 2009

From idealist to realist (Qualitative Science / Qualitative Inquiry):

• There is no shared reality independent of multiple alternative human constructions (idealist end).

• There is a world of collectively shared understandings.

• Knowledge of reality is mediated by our perceptions and beliefs.

• It is possible for knowledge to approximate closely an external reality (realist end).

Spencer, 2003

Qualitative Scientists and critical appraisal

“To see means”: Masters of the window

• Clarity

• Neutrality

“What you see is what you get.” Transparency

Qualitative Scientists and critical appraisal

Quality of the study:

• Check the credibility of the findings in terms of an accurate display of people’s voices

• Check the means to correct for the impact of the researcher on the findings

• Check whether the conclusions are grounded in the data

Before using it for decision-making processes.

Qualitative Scientists and critical appraisal

Questions and answers:

Should we appraise? Yes.

Which criteria should we use? Translation of general quality concepts such as validity, generalizability, reliability, objectivity.

What approaches to quality assessment will likely be taken? Criterion-based judgment.

How do researchers generally deal with the outcome of a critical appraisal? Exclude low-quality studies.

Qualitative Inquirists and critical appraisal

“To see means”: Masters of the lantern

• To step in someone else’s shoes

• To explore the dark corners or gaps in our knowledge base

• To go beyond what has been reported in the primary studies

• To problematize existing literature

“Shed light where there has been no light before.” Illumination

Qualitative Inquirists and critical appraisal

Quality of the study: the process of systematically examining research evidence to assess its relevance and utility for the storyline to be developed, before using it to inform a decision.

Qualitative Inquirists and critical appraisal

Questions and answers:

Should we appraise? Not necessarily.

Which criteria should we use? Incisiveness, concision, coherence, generativity, illumination.

What approaches to quality assessment will likely be taken? Overall judgment approach.

How do researchers generally deal with the outcome of a critical appraisal? Use quality appraisal as a baseline measurement, rather than seeing it as a measure to include or exclude studies.

One of the major problems is that those who portray themselves as idealist researchers or qualitative inquirists adopt the rules of qualitative scientists!

Karin Hannes

Back to the aspirin case… Who is right and who is wrong?

CONFIGURATION vs. AGGREGATION

Metaphors honestly stolen from Gough and Thomas, 2012

Transfer of Critical appraisal logic

Back to the aspirin case… Who is right and who is wrong?

The moose (?) would subscribe to the argument that the impact of the researcher on the research is inherent to the way qualitative research is conducted. They may prefer a CAI evaluating issues such as ‘thick description’ and ‘the innovative nature’ or ‘value for practice’ of the findings.

Epistemological and ontological assumptions of quantitative and qualitative research are incompatible.

The elephant (?) who highly values the validity of primary studies would prefer a CAI that is sensitive to aspects of validity, including criteria such as ‘all statements should be well-grounded in the data’ and ‘the impact of the investigator on the study results should be reported’.

Validity and researcher bias should be evaluated. Some qualitative studies are more rigorous than others.

A more relevant question to ask: Who am I?

What does this potentially imply for the choices I make?

The Third Road: Pragmatism

(The mapping of synthesis approaches onto the idealist/realist spectrum shown above is repeated here.)

PRAGMATISM

Pragmatism

o The choice to use certain critical appraisal instruments (CAIs) or criteria should be based on their ‘utility’ and ‘fit for purpose’ for the studies to be included in the review.

o Reviewers should select CAIs that are suitable for the retrieved original studies.

Pragmatism: Example

If you are appraising an action research design informed by critical theory, you would use an instrument including transformative-emancipatory criteria:

• Do the authors openly reference a problem in a community of concern?

• Do the authors openly declare a theoretical lens?

• Were the research questions or purposes written with an advocacy stance?

• Did the literature review include discussions of diversity and oppression?

• Did the authors discuss appropriate labeling of the participants?

• Did data collection and outcomes benefit the community?

• Were the stakeholders involved in the research project?

• Did the results elucidate power relationships?

• Did the results facilitate social change, and were the stakeholders empowered as a result of the research process?

• Did the authors explicitly state their use of a transformative framework?

(Saini & Shlonsky, 2012, pp. 133-135; Sweetman, Badiee, & Creswell, 2010, pp. 442-443)

Pragmatism: Example

If you are appraising a mixed method design, you would use criteria that allow you to assess the added value of the mixed method component:

• Was it appropriate or adequate to opt for an MMR study?

• Was it legitimate and if so, has the rationale been provided?

• Have both strands adequately been integrated?

• Did the authors provide a clear and defensible rationale for mixing the findings of studies?

• Was there an overall benefit of triangulating designs or combining quantitative and qualitative evidence?

• Did the combination of quantitative and qualitative evidence minimize bias and, if so, has it clearly been documented?

(Heyvaert, Hannes & Onghena, 2016, in press).

Short summary

Apart from that, decide on:

1. Choosing between:

o An existing instrument or a self-compiled set of criteria?

o Generic or design-specific frameworks/instruments for quality assessment?

o Generic or design-specific criteria for quality assessment?

Taking into account fit for purpose and expertise!

2. How to use the outcome of quality assessment in your review?

1. Determine your position

1. Qualitative scientist

2. Qualitative inquirist

3. Pragmatist

4. ?

2. My prediction:

1. You will be more certain about why you do what you do (and it feels good!)

2. You will be less prone to external influences deciding for you (or at least be able to discuss your thoughts with those promoting other strategies).

1. Choosing between instruments and criteria

Motivate your choice thoroughly!

1. Choosing between instruments and criteria

Evaluate the strengths and weaknesses of available instruments

1. Choosing between instruments and criteria (study based on core criteria outlined in instruments)

Criterion JBI CASP

Screening Q

Theoretical framework NO

Appropriateness design

Data collection procedure

Data-analysis procedure

Findings

Context NO

Impact of investigator

Believability NO

Ethics

Evaluation/Outcome NO

Value/Implication Research NO

Both instruments focus on the accuracy of the audit trail, i.e. the quality of reporting.

2. Dealing with the outcome of a critical appraisal exercise

Basic strategies:

• Use it to include/exclude studies

• Use it as a baseline measure for quality without excluding anything

• Assign more weight to studies that score high on quality

Assign more weight to studies that score high on quality

A. Sensitivity analyses

B. Levels of evidence assigned to findings

C. Frequency of themes combined with quality appraisal.

A. Sensitivity analysis

• Sensitivity analysis involves testing how sensitive the review findings are to the inclusion and exclusion of studies of different quality.

• Question: What would happen to the results if all studies below a certain quality threshold were systematically excluded?

A. Sensitivity analysis

9/19 studies were judged to be inadequately reported for both reviews.

Dichotomization of studies: adequately reported (>2) vs. inadequately reported (<2)
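To make the logic concrete, here is a minimal sketch of such a threshold-based sensitivity check in Python. The studies, quality scores and themes are purely hypothetical placeholders; only the idea of re-checking each theme against the adequately reported subset (score > 2, as above) comes from the slides.

```python
# Minimal sensitivity-analysis sketch: which themes depend entirely on
# inadequately reported studies? All names, scores and themes are hypothetical.

QUALITY_THRESHOLD = 2  # dichotomization point used above: > 2 = adequately reported

studies = {
    "study_A": {"quality": 4, "themes": {"access", "trust"}},
    "study_B": {"quality": 1, "themes": {"trust", "stigma"}},
    "study_C": {"quality": 3, "themes": {"access", "stigma"}},
}

adequate = {name for name, s in studies.items() if s["quality"] > QUALITY_THRESHOLD}

# Re-check every theme against the adequately reported subset.
all_themes = {theme for s in studies.values() for theme in s["themes"]}
for theme in sorted(all_themes):
    supporting = {name for name, s in studies.items() if theme in s["themes"]}
    if supporting & adequate:
        print(f"{theme}: retained, supported by {sorted(supporting & adequate)}")
    else:
        print(f"{theme}: depends entirely on inadequately reported studies {sorted(supporting)}")
```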

A. Sensitivity analyses

Review 1: Sexual health studies

No single principal theme was completely dependent on data from inadequately reported studies! No data emerged as exclusive findings from the inadequately reported studies. With the exception of two viewpoints from lower quality studies, all instances of dissonance, richness or complexity for each theme emerged from one or more of the adequately reported studies.

Review 2:

Online learning

Overall, data derived from inadequately reported studies did little to supplement data from adequately reported studies. Some of the richness was generated from inadequately reported studies. Excluding them would have resulted in the loss of valuable data on one particular subgroup (nurses). As a consequence, differences between doctors and nurses might have been concealed.

Consistent with studies from other authors who have conducted sensitivity analyses (Noyes and Popay 2007; Thomas and Harden 2008)

B. Assign levels of evidence to findings

Author statements in primary studies included in the synthesis can be considered unequivocal, credible or unsupported, based on how well they are supported with excerpts from the data collection phase.

The author of a review chooses whether or not to include unsupported and credible findings.

B. Assign levels of evidence to findings

o Unequivocal: where the theme or metaphor is unequivocally supported by direct quotes from the research participants. There is a clear relationship between the author’s theme or metaphor and the participants’ expressed experiences.

o Credible: where the participants’ expressed experiences are less clearly related to the author’s theme or metaphor and the author has extended beyond the expressed experiences of the participants based on the participant quotes that have been used.

o Unsupported: where there is no relationship between the expressed experiences of the participants and the author’s themes or metaphors, then it is clear that the author is generating findings that are unsubstantiated by the participants.

Joanna Briggs Institute, Australia

Rationale: Excluding findings may be an interesting alternative to excluding studies as a whole.
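As a rough illustration of that rationale, the sketch below filters individual findings by their assigned level instead of dropping whole studies. The finding texts and the policy of keeping only unequivocal and credible findings are assumptions for illustration; a reviewer may choose a different cut-off.

```python
# Sketch: exclude individual findings (not whole studies) based on
# JBI-style levels of evidence. The findings below are hypothetical.

findings = [
    {"text": "Participants value continuity of care", "level": "unequivocal"},
    {"text": "Nurses feel unsupported by management", "level": "credible"},
    {"text": "Online learning reduces workload", "level": "unsupported"},
]

# Example policy: keep unequivocal and credible findings, drop unsupported ones.
INCLUDED_LEVELS = {"unequivocal", "credible"}

retained = [f for f in findings if f["level"] in INCLUDED_LEVELS]
for f in retained:
    print(f"[{f['level']}] {f['text']}")
```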

B. Assign levels of evidence to findings

C. Weighing the evidence: Frequencies

C. Weighing the evidence: Frequencies

o Counting the frequency of a theme in an included article

o Combine the frequency with the weight of quality appraisal done by expert judgement (EJ)

o Combine the frequency with the weight of quality appraisal done by checklists (CA)
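A minimal sketch of such a frequency-plus-quality weighting is given below. The studies, checklist scores and themes are hypothetical, and the normalization of the checklist score to a weight between 0 and 1 is an illustrative assumption rather than the authors' exact procedure.

```python
# Sketch: combine the frequency of a theme with a per-study quality weight.
# Studies, checklist scores (out of 10) and themes are hypothetical.

studies = [
    {"quality": 8, "themes": ["flexibility", "isolation"]},
    {"quality": 5, "themes": ["flexibility"]},
    {"quality": 9, "themes": ["isolation", "workload"]},
]

raw_frequency = {}
weighted_frequency = {}

for study in studies:
    weight = study["quality"] / 10  # assumed normalization of the checklist score
    for theme in study["themes"]:
        raw_frequency[theme] = raw_frequency.get(theme, 0) + 1
        weighted_frequency[theme] = weighted_frequency.get(theme, 0.0) + weight

for theme in sorted(raw_frequency):
    print(f"{theme}: frequency = {raw_frequency[theme]}, quality-weighted = {weighted_frequency[theme]:.1f}")
```

Weighted in this way, themes that are both frequent and drawn from higher-quality studies stand out, which is the effect described in the results below.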

Average score of 6.8; inter-rater reliability = 0.88

Average score of 6.9; inter-rater reliability = 0.94

• Lower scores for criteria related to the validity of the study, the reporting of potential bias, and contextual information needed to evaluate the transferability of the findings.

• Some articles scored low on the checklist but passed the expert judgement based on the significance of the findings!

The evidence for high-frequency themes increases, while for low-frequency themes it declines! The direction of the change is the same for EJ and CA, but CA has better differentiating ability.

C. Weighing the evidence: Frequencies

Some conclusions:

o When a topic is frequently studied in a methodologically sound way, there is strong evidence for the value of the findings.

o Not all studies are of equal methodological quality and this should be accounted for when integrating findings.

o Limitation: Working with a summary score may conceal errors that can be considered fatal.
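A small illustration of that limitation: two hypothetical studies end up with the same summary score, but only one fails a criterion a reviewer might regard as fatal. Flagging fatal criteria separately, as sketched here, is one possible safeguard, not a prescribed procedure.

```python
# Sketch: identical summary scores can hide a fatal flaw.
# Criteria, scores and the choice of 'fatal' criterion are hypothetical.

FATAL_CRITERIA = {"findings_grounded_in_data"}

studies = {
    "study_X": {"theoretical_framework": 1, "ethics": 1, "findings_grounded_in_data": 1, "context": 0},
    "study_Y": {"theoretical_framework": 1, "ethics": 1, "findings_grounded_in_data": 0, "context": 1},
}

for name, scores in studies.items():
    total = sum(scores.values())
    fatal_flaws = [c for c in FATAL_CRITERIA if scores.get(c) == 0]
    verdict = "fatal flaw in " + ", ".join(fatal_flaws) if fatal_flaws else "no fatal flaws"
    print(f"{name}: summary score = {total}/4 ({verdict})")
```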

Closing remarks ...

• Checklists may only capture what has been reported (but reviewers can dig deeper).

• To validate is to investigate, to check, to question, and to theorize. All of these activities are integral components of qualitative inquiry that ensure rigor (Morse, 2002).

• In evaluating validity at the end of a study (post hoc), rather than focusing on processes of verification during the study, we run the risk of missing serious threats to validity until it is too late to correct them. The process of inquiry is where the real verification happens.

[email protected]

Thank you! Questions?