Page 1:

Validity in Qualitative Research: a comparative analysis of 3 online appraisal instruments’ ability to establish rigor

Karin Hannes
Centre for Methodology of Educational Research, K.U.Leuven

Page 2:

The Aspirine Case

Page 3:

To appraise or not to appraise?

› The more you appraise, the more it stifles creativity. The main criterion is relevance!
› The more you appraise, the smaller the chance of ending up with flawed results. The main criterion is quality!

Page 4:

To appraise or not to appraise… That’s no longer the question…

I appraise!

The question is…

Page 5:

Which instrument am I going to use?

Which criteria am I going to use when evaluating methodological quality?

Page 6:

Which instrument?

MAKING SENSE OF THE MYRIAD OF CRITICAL APPRAISAL INSTRUMENTS

Page 7:

Which instrument?

Selection of appraisal instruments:
› Used in recently published QES (2005-2008)
› Available online and ready to use
› Broadly applicable to different qualitative research designs
› Developed and supported by an organisation/institute/consortium or a context other than individual academic interest.

Page 8:

Which criteria are used to evaluate the quality of a study?

Three instruments fit the inclusion criteria:
› Joanna Briggs Institute-Tool
http://www.joannabriggs.edu.au/cqrmg/tools_3.html
› Critical Appraisal Skills Programme-Tool
http://www.phru.nhs.uk/Doc_Links/Qualitative%20Appraisal%20Tool.pdf
› Evaluation Tool for Qualitative Studies
http://www.fhsc.salford.ac.uk/hcprdu/tools/qualitative.htm

To facilitate comparison:
› Criteria were grouped under 11 headings
› Cross-comparison of the criteria (main headings)

Page 10:

Which criteria are used to evaluate the quality of a study?

Criteria (main headings) compared across JBI, CASP and ETQS:
› Screening questions
› Details of the study
› Theoretical framework (NO)
› Appropriateness of design (NO)
› Data collection procedure
› Data-analysis procedure
› Findings
› Context (NO)
› Impact of investigator
› Believability (NO)
› Ethics
› Evaluation/Outcome (NO)
› Value/Implication of research (NO)

(NO = the criterion is missing from at least one of the three instruments.)

Page 11:

Which criteria are used to evaluate the quality of a study?


All three instruments focus on the accuracy of the audit trail, i.e. the quality of reporting.

Page 12:

Which criteria am I going to use to evaluate the quality of a study?

Let’s change the focus from ‘What has been evaluated in critical appraisal instruments?’ to ‘What should be the main focus in evaluating methodological quality in qualitative research?’

Page 13:

Which criteria am I going to use to evaluate the quality of a study?

› Validity and researcher bias should be evaluated: some qualitative studies are more rigorous than others.
› The epistemological and ontological assumptions of quantitative and qualitative research are incompatible: it is inappropriate to use such measures.

Validity in quantitative research concerns instruments and procedures.
Validity in qualitative research concerns the kinds of understanding we have of the phenomena under study (the accounts identified by researchers).

Page 14:

Validity as the main criterion

What is validity? We need to know:
› whether the set of arguments or the conclusion derived from a study necessarily follows from the premises;
› whether it is well grounded in logic or truth;
› whether it accurately reflects the concepts and ideas it is intended to measure.

Operationalisation using Maxwell’s framework.

Page 15:

Validity as the main criterion

Maxwell’s deconstruction of the concept of validity (1992):
› Descriptive validity
› Interpretive validity
› Theoretical validity
› Generalisability (external validity)
› Evaluative validity

Page 16:

Descriptive validity

The degree to which descriptive information such as events, subjects, setting, time and place is accurately reported (facts rather than interpretation).

Evaluation techniques: methods and investigator triangulation, which allow for cross-checking of observations.

Page 17:

Interpretive validity

The degree to which participants’ viewpoints, thoughts, intentions and experiences are accurately understood and reported by the researcher.

Evaluation techniques: display of citations and excerpts, use of multiple analysts (inter-rater agreement), self-reflection of the researcher, member checking.

Page 18:

Theoretical validity

The degree to which a theory or theoretical explanation informing, or developed from, a research study fits the data and is therefore credible/defensible.

Evaluation techniques: persistent observation of stable patterns, deviant or disconfirming cases, multiple working hypotheses, theory triangulation with an active search for deviant cases, pattern matching.

Page 19:

External validity (transferability)

The degree to which findings can be extended to other persons, times or settings than those directly studied.

Evaluation techniques: Demographics, contextual background information, thick description, replication logic

Page 20:

Evaluative validity

The degree to which the phenomenon under study is legitimate; the degree to which an evaluative critique is applied to the object of study.

Evaluation techniques: application of an evaluative framework, ethics, ...

Page 21:

To what extent do the different instruments establish validity?

Page 22:

To what extent do the different instruments establish validity?

The most commonly used instrument, CASP, is the least sensitive to aspects of validity (findings based on screening the main headings). It addresses neither interpretive nor theoretical validity, nor context as a criterion.

› The theoretical position and the background of a researcher have a direct impact on the interpretation of the findings.

› Statements that have no clear link to excerpts are at risk of not being grounded in the data.

› Therefore, they should be LEAD CRITERIA in a critical appraisal instrument!

Page 23:

To what extent do the different instruments establish validity?

This study is limited by:
› its focus on the main headings of the instruments. Some of the subheadings of CASP do address issues of, e.g., interpretive validity, and some issues, e.g. sampling procedures, are not addressed in the JBI tool;
› its focus on validity as an evaluation criterion. Which other aspects are important to consider in evaluating the quality of an instrument?

Page 24:

To what extent do the different instruments establish validity?

Are there fundamental differences between appraisal instruments?

Do we need to give our choice of appraisal instrument some thought?

Could it assist us in evaluating the (methodological) quality of a study?

Does it help us to establish rigor in qualitative research?

Page 25:

To what extent do the different instruments establish validity?

Checklists only capture what has been reported.

I argue for the use of verification techniques for validity as a means of obtaining rigor.

By evaluating validity at the end of a study (post hoc), rather than focusing on processes of verification during the study, we run the risk of missing serious threats to validity until it is too late to correct them.

Page 26:

To conclude...

Basic qualitative researchers:
› should be motivated to adopt techniques that improve validity;
› should be guided in how to report qualitative research, in order to facilitate critical appraisal.

Page 27:

To conclude...

› The development of a standard set of reporting criteria for qualitative research is virtually impossible!
› The development of a standard set of reporting criteria would facilitate critical appraisal.

We might need you! To participate in a Delphi study exploring the potential feasibility, appropriateness, meaningfulness and effectiveness of reporting criteria.

Page 28:

To conclude...

To validate is to investigate, to check, to question, and to theorize. All of these activities are integral components of qualitative inquiry that insure rigor (Morse, 2002).

The process of inquiry is where the real verification happens.

Page 29:

Conflicts of interest: I am a Cochranite and a Campbell knight. I might substantially have been brainwashed by the ‘risk of bias’ discourse, beyond my personal control.

[email protected]