
Recommendations for Designing and Reviewing Qualitative Research in Psychology: Promoting Methodological Integrity

Heidi M. Levitt, University of Massachusetts Boston

Sue L. Motulsky, Lesley University

Fredrick J. Wertz, Fordham University

Susan L. Morrow, University of Utah

Joseph G. Ponterotto, Fordham University – Lincoln Center

The current paper presents recommendations from the Task Force on Resources for the Publication of Qualitative Research of the Society for Qualitative Inquiry in Psychology, a section of Division 5 of the American Psychological Association. This initiative was a response to concerns by authors that reviews of qualitative research articles frequently utilize inflexible sets of procedures and provide contradictory feedback when evaluating acceptability. In response, the Task Force proposes the concept of methodological integrity and recommends its evaluation via its two composite processes: (a) fidelity to the subject matter, which is the process by which researchers develop and maintain allegiance to the phenomenon under study as it is conceived within their tradition of inquiry, and (b) utility in achieving research goals, which is the process by which researchers select procedures to generate insightful findings that usefully answer their research questions. Questions that guide the evaluation of these processes, example principles, and a flowchart are provided to help authors and reviewers in the process of both research design and review. The consideration of methodological integrity examines whether the implementation of fidelity and utility function coherently together. Researchers and reviewers also examine whether methods further the research goals, are consistent with researchers’ approaches to inquiry, and are tailored to the characteristics of the subject matter and investigators. This approach to evaluation encourages researchers and reviewers to shift from using standardized and decontextualized procedures as criteria for rigor toward assessing the underlying methodological bases for trustworthiness as they function within research projects.

Keywords: methodological integrity, qualitative methods, qualitative research, review, trustworthiness

This article was published Online First November 7, 2016.

Heidi M. Levitt, Department of Psychology, University of Massachusetts Boston; Sue L. Motulsky, Division of Counseling and Psychology, Lesley University; Fredrick J. Wertz, Department of Psychology, Fordham University; Susan L. Morrow, Department of Educational Psychology, University of Utah; Joseph G. Ponterotto, Division of Psychological & Educational Services, Fordham University – Lincoln Center.

In 2013, the Society for Qualitative Inquiry in Psychology formed a Task Force on Resources for the Publication of Qualitative Research. This paper is the product of their efforts. The task force would like to thank Lisa M. Osbeck for thoughtful comments in the review process.

Correspondence concerning this article should be addressed to Heidi M. Levitt, Department of Psychology, University of Massachusetts Boston, 100 Morrissey Boulevard, Boston, MA 02125. E-mail: [email protected]

Qualitative Psychology, 2017, Vol. 4, No. 1, 2–22. © 2016 American Psychological Association. 2326-3601/17/$12.00 http://dx.doi.org/10.1037/qup0000082

In the discipline of psychology, qualitative research refers to methodical scientific practices aimed at producing knowledge about the nature of experience and/or action, including social processes (e.g., Fine, 2013; Morrow, 2005; Parker, 2004; Wertz et al., 2011). These methods use natural language and other descriptive and interpretive forms of human expression in their data, analysis, and findings. Qualitative research tends to centralize an iterative process in which data are analyzed and meanings generated in a fruitful, recursive manner, yielding results that gradually produce original knowledge of psychological life (e.g., Osbeck, 2014; Rennie, 2012; Wertz, 1999). In view of the pluralism of qualitative research traditions that has emerged in psychology over the years, we have attempted to develop a unified framework that can be commonly employed in the design and evaluation of this kind of research by both researchers and reviewers. This report presupposes some familiarity with and expertise in qualitative research in general and at least one specific approach. Its recommendations are not the result of a qualitative analysis nor are they summarizations of guidelines and recommendations that have been previously suggested. Rather, they are an attempt to articulate a unified set of principles concerning the scientific integrity of the many contrasting approaches and varied methods of qualitative research that are currently employed.

Although it is not possible to review their history in the present paper, descriptions of the development of qualitative research methods in human science can be traced to precursors of modern psychology, such as Wundt, Vico, Dilthey, and James (Danziger, 1990; Wertz, 2014). Scholars in psychology have been active in developing, teaching, conducting, and disseminating research using these empirical methods in recent years. There is great diversity in the goals and procedures of contemporary qualitative research (Gergen, 2014), and its fuller integration in the science literature holds great promise for both advancing psychology and educating interdisciplinary and lay audiences (Gergen, Josselson, & Freeman, 2015).

This paper is intended for readers who conduct qualitative research and for those who have sufficient knowledge to serve as reviewers or action editors of submissions for publication. It is not intended to be a justification of qualitative research, a prescriptive list of its purposes, nor a primer for basic terminology in and foundations of qualitative research (see Ponterotto, 2005b for a useful primer). Instead, we describe principles meant to advance qualitative research design and evaluation, accompanied by examples, to aid researchers and reviewers in considering methods across multiple qualitative traditions.

In North American psychology journals, the publication of qualitative methods has been steadily rising (Hays, Wood, Dahl, & Kirk-Jenkins, 2016; Ponterotto, 2005a, 2005c). These methods (e.g., phenomenology, grounded theory, discourse analysis, consensual qualitative research) are coming into common practice, especially in research related to counseling, education, health, psychotherapy, and cultural studies. Still, on the whole, qualitative researchers remain a minority within psychology and face challenges in communicating and disseminating their findings, as do reviewers and editors evaluating this work.

Challenges for Reviewers and Journal Editors

The process of reviewing qualitative research for publication can entail a number of complications for both reviewers and editors of psychological journals.

Training in Qualitative Methods

Graduate-level education in qualitative methods is relatively recent and still rare within American psychology. Although psychology coursework in qualitative methods is becoming more common (Ponterotto, 2005a; Rennie, 2004), it was not available when most journal editors and article reviewers were in training. As a result, reviewers with expertise in a subject area may have in-depth topical knowledge but may not be equipped to review the qualitative approach used in a submission. Even reviewers experienced with some qualitative approaches may not be familiar with many others currently in use. Although reviewers are asked to evaluate and provide recommendations on manuscripts, their limited knowledge base may foster well-intentioned but inappropriate appraisals (e.g., asking authors to include a control group or expecting all qualitative methods to use bracketing). This situation may result either in the rejection of strong work or the acceptance of weak manuscripts that might comply with methodological rules but do not enhance knowledge.



Diverse Goals and Approaches to Inquiry

Qualitative research methods in psychology may be rooted within a number of philosophical approaches and methodological traditions of inquiry that have distinct goals, norms, ways of communicating, and procedures for establishing trustworthiness (Hunt, 2011). Goals may include concept clarification, theory development, hypothesis generation, promotion of social justice, social transformation, or practical applications, and necessitate the tailoring of methods to each project’s particular purpose. For instance, certain methods might best bolster a systems-level inquiry when applied toward effecting social change, but different methods might be required to develop an empirically based description or theory of that same topic. Depending on their philosophical assumptions, methods might be conceptualized variously and distinct sets of procedures might be valued. For instance, whereas some approaches prioritize the demonstration of reliability across investigators, others prioritize the depth of engagement of one investigator (e.g., Giorgi, 2009; Hill, 2012). Whereas some methods might describe their procedures in a list of steps laid out at the outset of projects, others view their procedures as shifting in response to developing findings (e.g., Braun & Clarke, 2006; Hoshmand, 2005).

This diversity can create difficulties in the design and review process, as authors and reviewers are faced with research based upon a complex set of considerations rather than upon adherence to a single established set of procedures. For example, authors may need to consider how to adjust their qualitative methods when conducting research with new goals, philosophical assumptions, or types of participants. Similarly, reviewers may need to adjust expectations (e.g., that a priori operationalized definitions of key concepts be presented as required for hypothesis testing) to fit the goals of specific qualitative research projects (e.g., to define concepts or generate hypotheses). Action editors with a flexible understanding of the principles underlying variations in method are essential.

Knowing What We Do Not Know

Given the variety of qualitative approaches and their underlying philosophical assumptions, it is rare for editors or reviewers, even those with considerable qualitative expertise, to be knowledgeable about them all or to be able to review them all equally well. It may be challenging to identify the level of expertise needed to conduct a review. For instance, reviewers who have used one version of a design may not realize that their view of its legitimate variants is limited, which can lead to faulty recommendations (e.g., a realist vs. constructivist grounded theory study). Reviewers also may not know what design is used in a submission until they agree to conduct the review and then may feel obliged to complete the review, even when they are not firmly grounded in its approach.

Knowing Whom to Trust

Editors often are presented with conflicts in reviews that can be traced to varying levels of expertise among reviewers. For example, one reviewer who is skilled in qualitative research design may have only minor concerns that are drowned out by stronger inaccurate criticisms from reviewers who are less familiar with these methods. Those reviewers may make recommendations (e.g., the use of a hierarchy of findings) that do not fit with the design in the manuscript being reviewed (e.g., a discourse analysis). It can be challenging for editors to know whose review to prioritize, whether additional reviews are necessary, and when to invite further clarifying conversation between reviewers and authors.

Challenges for Authors

Additional challenges to publishing qualitative research arise from the perspectives of authors preparing submissions. Journal expectations may be inconsistent or inappropriate.

Inconsistent Design and Review Expectations

When authors are engaging in research design, they may be flummoxed by the multiple sets of recommendations that are tailored toward specific methods, philosophical stances, or content areas and their intersections (e.g., Guba & Lincoln, 2005; Hill, 2012; Kidd & Kral, 2005). Although increasing numbers of journals publish qualitative research, there is little consistency in the types of information to be included in reports. Because these expectations can differ by reviewer, it may be impossible for researchers to know what to expect, even within the same venue. For example, some reviewers seek detailed information on investigators’ reflexivity (examining their own process of engagement) or ontological/epistemological framework, whereas others discourage the inclusion of this information. Authors are left uncertain and may be penalized for either inclusions or omissions.

Page Limits and Inappropriate Publishing Guidelines

Authors face a difficult challenge to adhere to a journal’s maximum page or word limit and include all the important information. Editors who object to dividing large qualitative studies into publishable sections may mistake this division as piecemeal publication—that is, the unnecessary division of research from one data set into separate articles. Qualitative research tends to require space not needed by quantitative studies, because its presentation often requires in-depth rationale for methodological choices, demonstrations of how the data analysis led to the findings, and natural language exposition of findings with quotations. Also, findings are often context-dependent and may require extensive descriptions of conditions and contexts in order to be intelligible. As a result, authors are in a position where they must choose to either explain their method clearly or present their results persuasively. This concern may be even more acute for mixed methods researchers, despite growing interest in funding multimethod projects by the US National Institutes of Health (http://sigs.nih.gov/cultural/Pages/default.aspx, November 7, 2013).

A helpful trend has begun in which some journals allow extra pages for qualitative manuscripts, ameliorating this problem to some extent. For instance, the Journal of Counseling Psychology permits 10 extra pages for qualitative manuscripts (shifting their page limit from 35 to 45 pages), which aids their authors in submitting research that reviewers can appropriately evaluate. The availability of the option for online supplements to an article is another useful development for qualitative researchers. Online supplements might include detailed methodological information such as complete interview protocols, recruitment scripts, comprehensive demographic information, and supporting pictures or recordings, as well as result descriptions that include more quoted material than might fit within a printed version.

Challenges Related to the Transition Toward Qualitative Methods

In addition to the aforementioned advances in understanding qualitative methods, there have been some attempts to support this research in psychology that have controversial implications.

Reducing Methods to Singular Variants or to Key Procedures

To aid with the problem of reviewers’ limited information about diverse qualitative methods, some journals have made available to their reviewers guidelines to enhance reviewing, including lists of key procedures associated with specific methods (Letts et al., 2007) or a list of questions (Mallinckrodt, 2010) that orient reviewers to procedures that enhance trustworthiness. Although these efforts might benefit a naïve reviewer, procedure-based evaluations of rigor tend not to invite the consideration of integrity with specific reference to investigators’ research goals, approaches to inquiry, or study characteristics, which may require creative procedural innovation. Also, versions of an approach by the same name might use different procedures and terminology (cf. Braun & Clarke, 2006; Krippendorff, 2013). For instance, what leading researchers (Bryant & Charmaz, 2010; Rennie, 2000) and originators of the method (Glaser, 1998; Strauss & Corbin, 1990) each called grounded theory differs in terms of procedures, language, and philosophical frameworks, even though there are commonalities among approaches under the rubric of grounded theory (Fassinger, 2005). A procedural approach to reviewing can result in conflicting reviews, with reviewers referencing alternate variants of a qualitative research tradition. Also, it discourages the appropriate adaptation of established designs and the development of new methods (e.g., Charmaz, 2014).



Question-Guided Versus Fixed-Procedure Research Design

Although procedurally driven descriptions of methods can be helpful primers when first learning a particular approach, the framing of qualitative methods as rigid sets of procedures can lead to the faulty assumption that a mechanical adherence to established steps is ideal. In his classic critique of psychological research practices, Bakan (1967) referred to this flaw as methodolatry, in which psychology was accused of idolizing adherence to fixed methods rather than flexibly utilizing methods suited to the research questions. He suggested that, instead of using methods to further inquiry, psychologists have restricted their inquiry to fit established methods. Defining rigor as adherence to rigid sets of procedures may lead to the inappropriate rejection of research when investigators have innovated or adjusted procedures in accordance with their unique subject matter, goals, research questions, and other important characteristics of their studies.

Valuing Qualitative Methods Only When Quantified or Supplemental

Although the publication of qualitative methods in psychology is on the rise, a recent review (Eagly & Riger, 2014) of the state of feminist psychological science found that qualitative research that is not part of a mixed method design was uncommon in the high citation journals examined, with some editorial policies having required qualitative findings to incorporate quantification (Frieze, 2013; cf. Hesse-Biber, 2016). This state of affairs is of particular concern because the value of using qualitative methods to explore the experiences of women and other marginalized people has been compellingly established in feminist and multicultural scholarship. The review concluded that psychology pays “negligible attention to epistemology” (Eagly & Riger, 2014, p. 698), which limits the qualitative research traditions understood and accepted for publication.

The costs of these problems in the field are high. Contributing to many of these challenges are conflicting ideas on how rigor in qualitative research should be understood. Qualitative researchers receive contradictory design advice and unhelpful reviews and need to resubmit articles repeatedly; reviewers face uncertainty when evaluating articles; and editors make decisions based upon inconsistent advice. In recognition of these multiple systemic challenges, the Society for Qualitative Inquiry in Psychology, a section of Division 5 (Quantitative and Qualitative Methods) of the American Psychological Association, formed the Task Force on Resources for Qualitative Research Publication to develop resources that could facilitate the publication of high-quality qualitative research in psychology. To advance the state of qualitative research, the task force sought to develop a unified approach to considering trustworthiness while maintaining the flexibility needed to accommodate the diversity of research approaches and their appropriate adaptation across studies.

A Diversity of Qualitative Research Approaches and Goals

Qualitative researchers pursue a variety of research goals and often do so within diverse, established traditions. It can be challenging for authors and reviewers to understand and assess the validity of research conducted according to such distinct goals and approaches. One line of work that has been successful in raising psychologists’ awareness of the methodological pluralism of dominant qualitative traditions typifies them in categories, such as post-positivist, constructivist-interpretive, and critical-ideological paradigms (Guba & Lincoln, 2005; Ponterotto, 2005b). Although these categories are not inclusive of all qualitative approaches, and distinctions among ontology, epistemology, and method are not always clear (Staller, 2013), the delineation of these three traditions has supported diversity in the methods and goals of qualitative research (Creswell, 2013; Morgan, 2007).

Because these three paradigms are reviewed elsewhere in detail (see Morrow, 2005; Ponterotto, 2005b), we briefly describe only the central tenets of these traditions in order to encourage an appreciation of the variety of methods and goals that may be at play when designing and evaluating qualitative research. The goal of science for post-positivist researchers is to use an objective approach to analysis in order to proffer explanations or make predictions, while working to minimize human error and biases. Constructivist-interpretive researchers seek to use dialogical exchanges with participants in order to uncover meanings that are held by sets of people or systems, while exemplifying their process of analysis in order to illustrate and make transparent their interpretive processes. For critical-ideological researchers, the purpose of the research may be to unmask and disrupt privilege, power, and oppression for the sake of liberation, transformation, and social change, using their perspectives overtly as a lens to guide the analysis of their data and report on their findings.

In addition to these three frameworks, there are other well-established and developing approaches to qualitative research worthy of recognition. The phenomenological approach has a hundred-year interdisciplinary history of developing qualitative methods for the study of lived experience that include descriptive, interpretive, and narrative variants in psychology (Churchill & Wertz, 2001; Giorgi, 2009; Wertz, 2005, 2014). The pragmatic approach, which may use multiple methods to achieve practical aims, is focused on solving problems that may be defined by multiple stakeholders in order to yield consequences that serve human interests in complex institutions from education to business to psychotherapy (Fishman, 1999; Patton, 2015). These five categories of traditions are not mutually exclusive and are not meant to be exhaustive. Research practices along each of these lines are in a continual process of fluid interchange, innovation, and change. New approaches, such as the currently growing arts-based or performative inquiry (Gergen & Gergen, 2012), may rise in influence as methods evolve over time. Nevertheless, an appreciation of the distinctiveness of various goals and traditions of qualitative projects facilitates a needed understanding of diversity in both the design and the review of research.

Task Force Aims and Procedures

To meet their charge to provide resources to support the design and evaluation of qualitative research, the task force developed this paper. Its purpose is to further thinking about qualitative methods by articulating a systematic methodological framework that can be useful for reviewers and authors as they design and evaluate research projects. These two processes are inevitably intertwined because the same sets of norms concerning good scientific process are used in both designing and reviewing research. We hope to synthesize the literature on these processes and to identify central theoretical principles that can replace a cookbook approach to these tasks. We aim to develop a framework that respects the diversity and complexities of qualitative research methods (e.g., Gergen, 2014). We do not propose to replace methods themselves, close down discussion of differences among research designs, nor hinder their development by setting in place a new set of fixed procedural rules. Rather, we propose foundational principles that can complement discussions of specific research methods, promote dialogue, and support the continued evolution of qualitative methods.

The process of writing this article was characterized by theoretical collaboration across psychologists in different specialty areas (counseling, clinical, and human development) who have used a variety of qualitative methods in research on diverse topics and cultural identities. We have taken deliberate steps to avoid imposing the values and procedures associated with any one qualitative tradition or method within this work. The task force chair initially proposed a draft based upon her prior work (Levitt, 2014, 2015a, 2016a). Then, the Task Force extended the consideration of the literature on trustworthiness, and the ideas in the draft were sharpened and their relevance broadened over a 2-year period. The group project was co-constructive in nature in that writing, reviewing, and revising followed an iterative process. Input on the final document was triangulated across the writing team, independent qualitative research experts, and discussions at various conferences (Levitt, 2015b; Levitt, Bamberg, Josselson, & Wertz, 2016; Levitt, Morrow, Wertz, Motulsky, & Ponterotto, 2014; Levitt, Motulsky, Wertz, & Josselson, 2015; Levitt, Motulsky, Wertz, Josselson, & Ponterotto, 2015) with professional audiences interested in qualitative research and then was submitted for review. As such, the authors of this report believe that its content can speak equally well across methodologies as well as to both seasoned and novice qualitative researchers and reviewers.

In the process of seeking feedback, we twice sent drafts of our paper to a group of independent qualitative researchers selected for their expertise and leadership roles across a wide range of qualitative traditions and research foci (i.e., Valerie Futch, Michelle Fine, Mark Freeman, Marco Gemignani, Kenneth Gergen, Mary Gergen, Joseph Gone, Clara Hill, Ruthellen Josselson, Linda McMullen, and Cynthia Winston) and invited comment. At both points, the consultants’ responses were considered by the task force and informed the evolving manuscript. Most of the feedback focused on challenges in articulating the ideas in the paper across a range of research traditions. The task force was careful to be attentive to this feedback, leading to revisions that made the principles more inclusive. By the second review, feedback indicated that most of the reviewers viewed the manuscript as clearer and improved in its fit with their approaches. The task force followed up in conversations when they were unsure how to address reviewers’ concerns. The coauthors made changes to address the final clarification requests and came to consensus on those changes.

Because there is no common language that crosses methods and traditions, we recognize that readers may need to translate our terms into the language of their own preferred approaches, as providing terms and examples from all epistemologies and methods would be too cumbersome. For instance, "data collection" is used to refer to varied processes such as data identification, coconstruction, or fieldwork. "Analysis" encompasses processes such as coding, categorizing, or the use of reflexive self-examination by researchers. Similarly, at points the phrasing might seem too realistic for some readers or too postmodern to others. With this caveat, however, the underlying ideas presented have been found compelling by researchers across many traditions and methods.

A Singular Framework for Methodological Integrity

In proposing principles concerning the design and evaluation of diverse features and processes of qualitative methods and traditions in psychology, the crucial question arises of whether qualitative approaches have a sufficiently shared basis for unitary norms. This question has been considered by Rennie (2000, 2012), Wertz (1983, 1999; Wertz et al., 2011), and Osbeck (2014). Osbeck developed the thesis that "the basic processes of selection of relevant facts or meaning units, extraction of similarities, discrimination, arrangement, and emphasis are common across many domains of science and that these are also the basic elements of qualitative inquiry" (2014, p. 34). She argued that inferential processes, such as inductive reasoning, explanation, and model-based reasoning, are common to scientific understanding, as is the hermeneutic circle, in which there is continual reflective movement among aspects of a text and its whole.

Wertz extended a bottom-up approach, reflecting upon and then identifying the analytic procedures within phenomenological and existential inquiry (e.g., Wertz, 1983); these practices were subsequently found throughout the diverse history of qualitative psychoanalytic research (Wertz, 1987). Most recently, the comparative study of qualitative analysis by experts in five current approaches (phenomenology, grounded theory, discourse analysis, narrative research, and intuitive inquiry) yielded a description of common attitudinal and analytic practices that were traced through the history of psychology in such inquirers as Freud, James, Flanagan, Maslow, and Kohlberg (Wertz et al., 2011). Such commonalities included open reading, empathic immersion, differentiating data units, distinguishing implicit meanings in context, identifying emergent structural patterns, modifying findings in view of counterinstances, reflexivity, and the critical evaluation of limitations. In addition to distinctive procedures developed within each tradition, multiple traditions articulated similar practices in varied terms, such as eidetic analysis by phenomenologists and the hermeneutic circle by interpretive researchers (Wertz et al., 2011).

Rennie (2012) described a cycle involving four inferential processes utilized within a number of qualitative traditions. Although his writings provide a depth of description on each process, we briefly summarize them here: (a) drawing forth meaning via the researchers' reflection on the data about what is important (eduction), (b) formulating an approximation of the inherent meaning (abduction), (c) deciding that further analysis could provide useful evidence (theorematic deduction), and (d) seeking out commonalities after adding new data to the set under consideration (induction). Depending on the approach (Rennie, 2012), qualitative researchers use these processes to move in either direction between the analysis of portions of data or holistic experiences of phenomena. Thereby, the process of induction becomes self-correcting, and cycling among these stages gradually refines the meaning generated. This cycling eventually leads to some stability in conceptualization that signals the end of the analytic process. This four-step version of methodical hermeneutics was put forward as a justification of qualitative methods (see Levitt, Lu, Pomerville, & Surace, 2015).

Across these three attempts to compare analytic processes among research traditions, it may be of interest to future methodologists that hermeneutic processes, such as cycling between parts and the whole of a dataset, have emerged as central features. For our purposes, however, these theories of commonality provide support to pursue a singular framework for generating meaning across qualitative methods.

Methodological Integrity as the Basis for Trustworthiness in Qualitative Research

Over time, criteria have been recommended for appraising rigor, or trustworthiness, in qualitative research that are congruent with particular epistemological approaches (e.g., Guba & Lincoln, 2005; Morrow, 2005). In addition to these recommendations, there are a variety of excellent guidelines for conducting and reviewing qualitative research that outline desirable features of single designs (e.g., Fassinger, 2005; Fine, 2013; Gilligan, 2015; Gilligan, Spencer, Weinberg, & Bertsch, 2003; Hill, 2012; Hoshmand, 2005; Kidd & Kral, 2005; Suzuki, Ahluwalia, Mattis, & Quizon, 2005; Wertz, 2005). A smaller number of papers describe procedures associated with rigor across qualitative methods (e.g., Elliott, Fischer, & Rennie, 1999; Letts et al., 2007; Levitt, 2014; Morrow, 2005; Parker, 2004; Stiles, 1993; Tracy, 2010; Wertz et al., 2011; Williams & Morrow, 2009).

Emerging from our consideration of this body of work, we propose an overarching concept, methodological integrity, as the methodological foundation of trustworthiness. Within this, we distinguish two constituents, fidelity and utility, at the core of methodological integrity. The overarching concept of integrity unites these two concepts and addresses the way they both are to be considered when selecting and evaluating methods and procedures within individual studies. We describe these processes conceptually, rather than operationalize them in terms of procedures, as they drive the selection and undergird the value of specific methods and procedures (see Levitt, Neimeyer, & Williams, 2005, on the function of principles).

These three concepts—integrity and its constituent components of fidelity and utility—concern all aspects of research, including the delineation of the topic of research; the critical literature review; the research goals; the philosophical and methodological tradition employed; the formulation of questions; procedures such as researcher reflexivity, participant selection, data collection, and analytic steps; and the articulation of study implications, the audience, and report presentation. Here, however, we focus on the research constituents of prime concern to evaluation and design—data collection and analysis.

Methodological integrity. Trustworthiness is a term that has been used across qualitative traditions and epistemologies to indicate the evaluation of the worthiness of research and whether the claims made are warranted, whereas other terms such as credibility and validity have been associated with specific perspectives (e.g., Denzin & Lincoln, 2005). Whereas the term trustworthiness describes the degree to which researchers and readers are convinced that a research study has captured a significant experience or process related to their topic (e.g., Denzin & Lincoln, 2005; Morrow, 2005), we use the term integrity to specify the methodological basis of that confidence. It is distinct from elements of trustworthiness that are not based upon method (e.g., reputation of authors, aesthetic elements of presentation, convergence of findings with readers' prior experiences and expectations).

Integrity is the aim of making decisions that best support the application of methods, as evaluated in relation to the following qualities of each study. Integrity is established when research designs and procedures (e.g., autoethnography, discursive analysis) support the research goals (i.e., the research problems/questions); respect the researcher's approaches to inquiry (i.e., research traditions sometimes described as world views, paradigms, or philosophical/epistemological assumptions); and are tailored for fundamental characteristics of the subject matter and the investigators. Relevant characteristics related to the subject matter include both its qualities and the qualities of the research participants or data sources that influence the communication about the subject and engagement in the study (e.g., the complexity of the subject matter, verbal ability or insightfulness of data sources/participants, participants' commitment or ability to participate in research). Relevant characteristics related to the investigators include their identities, statuses, and lived experiences (i.e., whether similar or different from the topic studied) and the resources they bring to support the research and its dissemination. We propose that integrity be understood as the establishment of fidelity and utility as a functional synergy among these features of a study and with each other. This context-driven approach to design and evaluation contrasts with approaches in which methods are expected to adhere to fixed procedures. We now turn to defining the concepts of fidelity and utility and to providing examples of their function in achieving integrity.

Fidelity to the subject matter. We describe fidelity as an intimate connection that researchers can obtain with the phenomenon under study. Many qualitative researchers in psychology structure their data collection to capture the Erlebnis—the lived experience of the participants or phenomenon—reaching verisimilitude through thick description (Geertz, 1973; Ponterotto, 2006). Our recommendation is that researchers select methods to enhance fidelity, regardless of whether they view the phenomena under study as social constructions, existential givens, unmediated experiences, embodied practices, or any other kind of subject matter that may be reflected in data and analyses. That is, fidelity to the research phenomenon is not tied to any one epistemological perspective or world view.

Data may be procured by inviting participants to interview or describe their experiences, with great attention to gaining access to the often covert and internal experiences of participants that may be challenging to observe. Also common in qualitative research is the collection of discursive or observational data as a basis for the analysis of social and linguistic practices. Other times, data may be selected from existing texts or dialogical exchanges, again with attention to selecting data that have the potential to demonstrate an aspect of experience or a process. In ethnographic research, fidelity of data is enhanced through immersion in a system or culture. Multiple expressions or portrayals of phenomena with equally high fidelity can exist because there are many ways in which the researcher can achieve authentic closeness to and intimacy with the phenomenon under study.

Utility in achieving goals. The second core process, utility, refers to the effectiveness of the research design and methods, and their synergistic relationship, in achieving study goals—answering questions and/or resolving problems. Our framing of utility as a study's success in meeting its goals specifies the functional referent of the utility assessment (i.e., method as useful toward what end?) rather than identifying decontextualized procedures at specific phases of research activity. Like fidelity, the consideration of utility is at play throughout the research endeavor.

Decisions about utility are best understood within the parameters of a specific study. Researchers might determine their questions or analytic tools to enhance the ability of their findings to meet their goals. These goals can include varied aims, such as galvanizing social action, deepening understandings and descriptions, or developing hypotheses. The question is whether the research decisions enable a project to make contributions that fulfill its stated goals.

Guidelines for Considering Central Processes: A Framework for Understanding Integrity

Having identified core evaluative processes, guidelines can be put forward on how best to consider these processes in data collection and analysis. The following section outlines four features of each process that can assist researchers and reviewers in considerations of research design and demonstrates how each feature can be considered in light of a study's methods, goals, inquiry traditions, and characteristics to enhance integrity (see Figure 1). We present a principle relevant to each feature in the context of a common research dilemma. These principles describe the methodological norms that underpin the design and review process and explicate the type of thinking that can aid in both qualitative design and review. Then, reflections upon how these principles can be applied in the design and review processes follow.

Fidelity to the Subject Matter: Guidelines and Principles

Qualitative researchers are concerned with gathering and developing findings from data that provide a clear and vivid portrayal or excerpt of the phenomenon as it is understood within the traditions or perspectives in use (e.g., as real, interpreted, or constructed). Data may be compiled in many ways, including interviews, texts, documentation of events (e.g., media, diaries), arts-based videos/photos, participant observations, archival materials, or researchers' reflections. In all, considering fidelity will help researchers and reviewers hold in mind the complexities and variety of the phenomena under study as well as the expressiveness of the data. Guidelines for achieving and evaluating fidelity follow, with the first two focused on data collection and the second two on the analytic process.

Adequate data. The principle we suggest is: Fidelity is improved when data are collected from diverse sources that can shed light upon variations in the phenomenon as they are relevant to the study goals. With this principle, we wish to stress that, across traditions and qualitative research designs, adequacy of data refers not to a simple "magic number" of interviews or participants (Morrow, 2005). Rather, it asks researchers to consider how well they gain access to the comprehensiveness of and variations in the subject matter. Within the scope of the research question, they consider the kinds and sources of data that will allow them to meet their goals (Levitt, 2015a; Morrow, 2005). Following from this understanding, differences among sources of data (e.g., participants, texts) are seen as a strength in qualitative designs as researchers seek to develop results that are rich and encompassing. This latter consideration is particularly relevant for research goals that would be furthered by representing perspectives that might be marginalized if not deliberately integrated (Mertens, 2012).

In contrast to those quantitative studies that seek larger samples to create representative findings, qualitative studies tend to require smaller numbers of participants (with some approaches, such as autoethnography, psychobiography, or case studies, focusing only on one person or case); thus, adequacy of data depends not on numbers of participants, but on the quality and sufficiency of information as it provides close access to the richness of the subject matter. In other words, trustworthiness of qualitative research may not come from conducting a comprehensive mapping of variation within the population, but rather from selecting experiences that map the variation within a phenomenon (Levitt, 2016c).

Figure 1. Flowchart to exemplify considerations of methodological integrity in research design and evaluation. [The flowchart crosses two columns (Concepts Re: Fidelity to Subject Matter; Concepts Re: Utility in Achieving Goals) with two rows (Data Collection; Data Analysis). Each box poses a guiding question and states a principle:

Adequate Data (fidelity, data collection): Are the data adequate? Principle: Fidelity is improved when data are collected from diverse sources that can shed light upon variations in the phenomenon as they are relevant to the study goals.

Perspective Management in Data Collection (fidelity, data collection): Is the researcher perspective in data collection managed to enhance fidelity? Principle: Fidelity is improved when investigators recognize and are transparent about the influence of their perspectives upon data collection and appropriately limit that influence to obtain clearer representations of their phenomenon—regardless of the researchers' direct experience with or standpoint in relation to that phenomenon.

Perspective Management in Analysis (fidelity, data analysis): Is the researcher perspective in data analysis managed to enhance fidelity? Principle: Fidelity is increased when researchers consider how their perspectives influenced or guided their analysis (e.g., suspending, countermanding, or consciously using their perspectives, depending on their approach to inquiry) in order to enhance their perceptiveness in their analysis.

Groundedness (fidelity, data analysis): Are the findings grounded within the data that support their understanding? Principle: Fidelity is enhanced when findings are based within data that support understanding.

Contextualization (utility, data collection): Are data contextualized and limits clear? Principle: Considering findings within their appropriate context (e.g., location, culture, historical epoch) improves their utility.

Catalyst for Insight (utility, data collection): Can the data lead to insights relevant to the project goals when analyzed using the method under study? Principle: Collecting data that are unconstrained and provide rich grounds for insightful analyses will maximize the utility of the research.

Coherence (utility, data analysis): Are the meanings of findings coherent with one another? Principle: Delving into differences within findings and explaining how they relate to one another will enhance the coherence within the findings and their utility.

Meaningful Contributions (utility, data analysis): Are the findings meaningful contributions toward the project goal? Principle: Using methods that enable a meaningful contribution in relation to the study goals increases utility.]

Applying this principle in research design. Because trustworthy qualitative analyses provide analogical accounts with enough context, description, and flexibility for readers to make judgments and apply them to a wide range of variations (Osbeck, 2014), researchers need to strategically decide which forms of variations to seek in their data collection. By consulting the research and theoretical literature, conducting initial analyses, immersing themselves in the context or culture of interest, or analyzing their direct experience with the phenomenon (e.g., via ethnography), researchers can become familiar with the characteristics of the subject matter and how to represent diversity effectively. Diversity may be incorporated in many ways, including through the collection of data from multiple sources (e.g., texts from multiple religions), sources who have a reflective and longstanding engagement with a topic across contexts (e.g., extensive lived experiences), or sources who hold multiple viewpoints (e.g., recruiting perpetrators and victims), and may accumulate across studies (e.g., with case studies building upon prior work).

Researchers are not expected to collect data from participants with every form of diversity; instead, forms of diversity should be considered in relation to the study goal. For instance, in research on psychotherapy, researchers might centralize diversity in psychotherapy orientation; however, in a study on sexuality, researchers might prioritize diversity in sexual orientation. In other work, the most relevant variety of diversity might be across forms of a phenomenon (e.g., different contexts or types of an experience) rather than a demographic characteristic of participants. Although considering how cultural factors may influence a phenomenon is strongly advised, simply expecting that all qualitative studies include data sources across all demographic variables may be neither realistic nor beneficial. Instead, developing a rationale for the types of diversity sought in relation to the research question can help researchers meet their project goals.

Applying this principle in research review. The proposed principle can assist in the evaluation of adequacy of data. Reviewers will want to keep in mind that the number of participants by itself is not the criterion of adequacy for most qualitative analyses (Morrow, 2005) and instead to consider the sources of data in relation to the most important forms of diversity and the purposes and claims made. For instance, if in a paper on the psychological experience of recovery from hysterectomies the researchers have interviewed only patients who have had this surgery because of a cancer diagnosis, rather than for other reasons (e.g., sex reassignment surgery), the reviewer might suggest that the authors either broaden their data collection or narrow the scope of their paper, because their data collection would be inadequate to address the experiences of hysterectomy broadly. In reviewing a study with smaller numbers of participants (or a single participant), as described in the previous section for authors, a reviewer would consider if the study has captured diversity within the experience of the phenomenon so that the understanding has fidelity in relation to the question posed. Even if the diversity within a study is extremely limited, as in a case study, research can have adequate fidelity by adding a new perspective to the literature. For instance, a paper on African American men's experiences of a disability may select sources solely from that one identity group but may contribute a new understanding to the literature, improving its fidelity.

Perspective management in data collection. The proposed principle is: Fidelity is improved when investigators recognize and are transparent about the influence of their perspectives upon data collection and appropriately limit that influence to obtain clearer representations of their phenomenon—regardless of the researchers' direct experience with or standpoint in relation to that phenomenon. This principle recognizes that investigators have perspectives and life experiences that can influence their research process and which may be similar (e.g., as a member of the group that she or he is researching) or dissimilar to the participants or viewpoints they investigate. In either case, qualitative researchers who recognize and evaluate their impact upon data collection can maximize fidelity and assist readers in understanding the influence of the researchers' perspectives upon the data. Although the strategies of inquiry (e.g., types of participants selected, topic of inquiry) may be influenced by their values, questions, and methods, researchers should not engage in data collection seeking only to confirm their own perspectives but instead strive to be open to all responses—even when they plan to use a certain perspective (e.g., critical analysis) to guide their analysis.

Applying this principle in research design. Researchers often use reflective strategies that structure the examination and limit the effects of their perspectives upon the data collected. Procedures developed in qualitative traditions include the use of bracketing (Giorgi, 2009; Wertz, 2005), in which researchers set aside ideas that might interfere with or inappropriately guide data collection. In addition, reflexive journaling or memoing can help researchers identify their assumptions and the ways they might influence the data, even when researchers do not believe that they can completely eliminate the effects of these assumptions (i.e., "fallible bracketing"; Rennie, 2000). Interview-related strategies to manage researchers' perspectives include seeking a wide range of data, using nonleading language when asking questions, using open-ended questions, and closely following the interviewee (Josselson, 2013); asking participants to consider what has not been asked (Levitt, 2015a); considering how the relational dynamics between the interviewer and participant impact the quality of data constructed (Gilligan, 2015; Josselson, 2013; Polkinghorne, 2005; Rogers, 2000); or strategically using leading questions to check the reliability of answers or verify interviewers' interpretations (Kvale & Brinkmann, 2009).

Applying this principle in research review. In examining a manuscript, a reviewer would seek evidence that authors considered how their own values and experiences might have influenced the data collection process. A challenge in reviewing articles is that, although the disclosure of researchers' perspectives rhetorically strengthens qualitative research, it is not yet mainstream practice. As a result, authors often are unsure how much detail is desired by particular journals. Reviewers should expect to provide guidance to authors in this respect. Reflexivity and the use of reflective strategies in reports of data collection strengthen fidelity by allowing readers to see how authors obtained a picture of their phenomenon while restricting the influence of their own perspectives.

Perspective management in data analysis. The principle is: Fidelity is increased when researchers consider how their perspectives influenced or guided their analysis (e.g., suspending, countermanding, or consciously using their perspectives, depending on their approach to inquiry) to enhance their perceptiveness in their analysis. Within the process of qualitative analysis (as opposed to data collection), researchers tend to use two main strategies to manage their own perspectives and preserve fidelity. In the first approach, researchers act to limit the effects of their prior knowledge and theories upon the analysis by developing self-awareness and acting to suspend or challenge these prior conceptions, such as in phenomenological, grounded theory, and participatory action research investigations (Giorgi, 2009; Glaser & Strauss, 1967; Kidd & Kral, 2005). These approaches orient the researcher to better draw forth understandings that are presented in the data and might be obstructed by the researchers' perspectives. A second approach is for investigators to use theoretical frameworks as the vehicle for their analysis—such as when using feminist, multicultural, and critical lenses to analyze data (Fine, 2013; Gilligan, 2015). These approaches permit researchers to observe dynamics that are marginalized, inaccessible to participants, or masked within dominant narratives. In either case, interrogating the ways in which researchers' perspectives influence their analyses and taking steps to sharpen their perspicacity increases fidelity (Rennie, 1995).

Applying this principle in research design. Strategies for managing researchers' perspectives exist within many inquiry traditions and may include independent coders, self-reflective journaling, dialogue with participants, or application of a critical perspective. Other strategies include engaging multiple investigators, participants, or third parties in cogenerating research findings (e.g., Fine, 2013), using consensus methods (e.g., Hill, 2012), or seeking feedback on findings via participant checks (e.g., Morrow, 2005). For instance, a critical researcher would use dialogue with participants through the process of shaping the results to sharpen their sensitivity to how racism is unfolding within an institution. A grounded theory researcher would use memoing to become aware of and to limit how the researcher's perspectives might narrow the analytic lens. A consensual qualitative researcher would use one or more auditors to provide feedback on preliminary results. A qualitative researcher using the voice-centered relational method would read for and write reflections on the relational dynamics within the interview (e.g., Motulsky, 2010). All these processes are tools to augment and deepen the understandings that are forthcoming from an analysis.

Applying this principle in research review. To provide an example of how reviewers can use this principle in evaluating papers, if researchers collected feedback from research participants or stakeholders, reviewers can consider how differing opinions were weighed in relation to both the acuity and vantage points of the parties. Typically, researcher interpretations have priority over participants' or third parties' feedback on findings, as researchers' perspectives are based upon the analysis of all the research data (Rennie, 1995; Wertz et al., 2011). Although participants have greater expertise on their own experience, the researcher can consider data that reach across experiences. Reviewers can consider how feedback was used in relation to these perspectives in order to deepen understanding (Levitt, 2016c). This privileging can vary in relation to the goals of a project, however. For example, if participants in a participatory action project do not find the findings compelling, an ensuing project may fail.

Groundedness. For this feature of fidelity, the principle is: Fidelity is enhanced when findings are based within data that support understanding. Groundedness refers to the degree to which the meanings identified in the analysis are rooted in data of good quality. To demonstrate this quality, qualitative researchers tend to explicate the process of deriving their results, often supported by rich exemplars from the data (e.g., quotes, images, text) to an extent that allows the reader to judge the fidelity of the analysis. Evocative, creative, and aesthetically compelling writing can aid in this process (Freeman, 2014).

Applying this principle in research design. Strategies for establishing groundedness may differ across inquiry approaches, methods, research goals, and study characteristics. Although researchers may adopt analytic strategies in keeping with their traditions, the results will clearly show, in a balance of interpretive commentary and supporting evidence, the links across data, analysis, and results. A few examples from many possible strategies include a constructivist investigator making extensive use of self-reflection to ensure that the results are grounded in the data, a team that is influenced by post-positivist ideals using interrater reliability calculations to enhance the reliability of their analysis, or critical researchers engaging in a process of coanalysis with participants for the coconstruction of meanings that retain fidelity to their life contexts.

Applying this principle in research review. In assessing groundedness in a paper, reviewers should be convinced both that the findings are based upon the data and that the data are rich enough to support the findings. The thick description provided should go beyond superficial facts or confirmation of the finding (Ponterotto, 2006). For instance, the quote "Chocolate relieves my stress" would be a poor quote for the theme "Indulgence relieves stress," even though it replicates the finding, as it does not suggest that the finding was rooted in a deep understanding. A better quote would be,

Because the medical treatment felt like an undeserved punishment, I felt a need to indulge myself afterward with chocolate and do something to correct what was happening to my body. It was a way of protesting the treatment and valuing my own needs.

Reviewers should feel that the quoted material not only supports the finding but also vivifies the emotional/relational experience and deepens the contextual or historical understanding.

Utility in Achieving Goals: Guidelines and Principles

We propose that the appropriateness of the data collection and analytic procedures selected can be evaluated by whether they usefully allow a study to meet its aims. Utility is maximized by selecting a process of analysis that organizes data so that some aspects become more central in response to the research question. Many qualitative methods provide guidance to structure this process and suggest procedures such as changing description into psychological language (Giorgi, 2009); employing a coding scheme to develop a conceptual model or a theory (Bryant & Charmaz, 2010; Glaser & Strauss, 1967); or applying an interpretative framework, such as analytic interpretation (Wertz, 1987) or a voice-centered relational model (Gilligan, 2015; Motulsky, 2010). The following guidelines can be used to evaluate a study's utility in achieving goals, with the first two focused more within data collection and the latter two within the analytic process.

Contextualization of data. The principle for this feature of utility is: Considering findings within their appropriate context (e.g., location, culture, historical epoch) improves their utility. Researchers must convey sufficient information regarding the history, the setting, the participants, and the researchers themselves so that the reader can understand features of the context that might influence the findings (Morrow & Smith, 2000; Rogers, 2000).

Applying this principle in research design. Considering information that contextualizes data throughout the analysis can allow researchers to be attuned to variations in the settings of a finding and lead to more useful findings. Strategies used in this process may include a historical account of the phenomenon or community under study, the consideration of demographic data, details about the participants' or researchers' experiences with the phenomenon, or the use of research or clinical measures to situate the participants in relation to a characteristic of interest. Identifying patterns that appear to be context-bound will enhance the utility of the findings.

Applying this principle in research review. Within a manuscript, the reviewer should expect to find information that frames the findings within the context of the specific study and makes sense of variations in findings. This contextual information will situate the data collected so that findings can be usefully evaluated and applied in a context-sensitive manner. For example, a finding may be found to be robust within a certain location (e.g., a dominant discourse was supported by a dominant group and in its contexts) but to differ in a second location in relation to a characteristic of a second group (e.g., the discourse was problematized by marginalized groups and in their contexts). Clear statements about setting, culture, and time period in relation to variations in the findings permit the appropriate transferability of findings across contexts and enable the understanding of how findings might answer related questions.

Catalyst for insight. We have generated the following principle that considers the potential for the data to support perceptive analysis: Collecting data that provide rich grounds for insightful analyses will maximize the utility of the research. Throughout the data collection (e.g., selecting archives or data excerpts, identifying participants, interviewing), methods should be selected and implemented to enhance the potential for insight to be derived from the data.

Applying this principle in research design. As they begin the design process, researchers will benefit from considering how best to identify or generate data that can support insightful analysis given their perspectives, skills, and positions. For instance, it may be that certain study personnel with specific training or understanding (e.g., knowledge, interview skills, shared experiences), access to data (e.g., proximity to data source), or interpersonal relationships (e.g., in-group membership) are better able to collect data. To provide an example, interviewers' status or perceived privilege may negatively influence their ability to elicit generative responses and may reduce participants' willingness to disclose insightful data. Or, the ways that investigators prepare for and present within the interview context may be varied to support confidence (e.g., emphasizing professional credentials, making shared values overt) or enhance relational interviewing skills (Josselson, 2013). Interviewers will want to ask questions that demonstrate sensitivity, clarify issues of uncertainty, and lead to innovative responses. Qualities that support collection in one study might impair it within another, so this feature should be considered in relation to each study's attributes.

Applying this principle in research review. Reviewers will consider whether the data collected appear to contain insight into the phenomenon under study. If the data presented do not appear to support insightful findings, reviewers can consider whether the data sources were unable to provide insight (e.g., studies that ask about experiences that participants have not had) or were constrained by the interpersonal dynamic (e.g., responding under duress). Typically, there are two ways that qualitative data lead to innovation. Data can enable deep understandings that evocatively draw out aspects of a phenomenon that have not been considered previously. Alternately, the data collected may bring a wider systemic analysis to bear upon the understanding of a topic and place into play new considerations of the interactions of social and personal processes. These functions can be combined to enhance the utility of the findings.

Meaningful contributions. Complementing the focus on insight within data collection, researchers work to draw forth and shape the insights afforded by the data so that findings will be meaningful in addressing the analytic goals. The principle for this feature is: Using methods that enable a meaningful contribution in relation to the study goals increases utility. Meaningful contributions can take many forms (e.g., new theories, social change). For example, replication studies may have utility when there are questions about earlier findings, when converging methods help establish emerging findings, or when a new context is under exploration.

Applying this principle in research design. There are many ways in which researchers can enhance the meaningfulness of their studies, including forming questions that augment or challenge current representations of a phenomenon in the literature, selecting methods that can best expand prevalent understandings, and demonstrating the ability of findings to solve problems posed in their research (e.g., ability to prompt institutional change). Researchers may perform checks on their analyses (e.g., seeking feedback on findings) to see whether the meanings generated have shed a new light on a phenomenon for readers, stakeholders, or the participants. Checks on the meanings generated in the findings, if used, should be coherent with the goals of research, the approach to inquiry, and research study characteristics. For instance, requiring feedback from participants may not facilitate the research goals when the researchers are using a theoretical lens in their analysis that the participants do not share (e.g., a study on racist sentiments embedded within the speeches of anti-immigration politicians) or when working with transient participants who are unavailable for comment (e.g., homeless youth).

Applying this principle in research review. In the review process, meaningful contributions should be evaluated with respect to the research goals as well; meanings generated can serve many functions such as theory development, deepening understandings, generating questions, clinical guidance, or social change. If results within a paper appear shallow and do not further the dominant understandings, reviewers could ask authors to clarify the contributions or reject a paper. If a study's goals are met, reviewers may still need to determine whether those goals are relevant to a journal's audience or mission, not as an issue of methodological integrity but as an issue of fit. For example, a paper might produce contributions of import about its subject matter but not hold relevance for a journal that is focused upon methodological contributions.

Coherence among findings. This principle can aid considerations of inconsistent findings: Delving into differences within findings and explaining how they relate to one another will enhance the coherence within the findings and their utility. The findings developed in the analysis should make sense in relation to one another. When contradictions exist in the data, these should be explained so that the readers can understand their basis and function.

Applying this principle in research design. Strategies used by researchers to increase internal consistency among findings include the use of models or diagrams to show how findings relate to one another. Narratives or artistic representations also may be used to create prototypical stories, poems, or voices that convey the complexity of a phenomenon (Gilligan, 2015; Hoshmand, 2005; Motulsky, 2010). During the analysis, researchers may return to the field for additional data or reanalyze existing data to develop coherence. Highlighting contradictions and portraying them in context or seeking out alternate or discrepant meanings also enhances coherence (Morrow, 2005). For instance, researchers might identify clinical decisional points and provide principles that guide therapists to follow different routes (e.g., Levitt & Williams, 2010).

Applying this principle in research review. In the process of review, a reviewer can request that the author assist the reader in making sense of discrepant findings and how to use them. If the contradictions within central findings remain unaddressed, this would pose a serious limitation to the study's utility. Reviewers may suggest that researchers articulate qualifications of findings that might reconcile them (e.g., indicating when certain findings do or do not hold, whether one finding is dominant with specific exceptions, or if there are underlying factors that can be brought to bear upon findings). By providing this advice, reviewers can serve a mentoring function that supports authors in improving the utility of their work.

Methodological integrity revisited. In summary, the Task Force recommends that integrity be understood as the methodological foundation of trustworthiness. It may be assessed by considering the criteria of both fidelity and utility in relation to a study's research methods, goals, approaches to inquiry, and the characteristics of the subject matter and investigators. Neither a study that represents its subject well but fails to usefully address its research goals nor a study that contributes a possible solution but misrepresents its subject can be considered trustworthy. The concepts of fidelity to subject and utility in achieving goals are intertwined (e.g., findings are less likely to be useful if they do not demonstrate fidelity); however, they are not redundant (e.g., research can have fidelity but not be useful in answering a question). The principles presented exemplify design considerations and appraisals of how well a given study demonstrates these qualities.

These recommendations do not specify procedural definitions or cut-off points for each appraisal, but rather the conceptual questions that would need to be satisfactorily met in the eyes of the author or evaluator. Like trustworthiness, methodological integrity remains a matter of interpretation, and we remain wary of framing our recommendations procedurally for the reasons already detailed. Although compromises may sometimes be necessary, an evaluation would look for adequacy across these features in relation to the problem at hand (see the questions posed in Figure 1). Paramount in the conceptualization of methodological integrity is that methods are synergistic: for instance, the data collection method should work well with the characteristics of the participants to enhance the fidelity and utility of a study (e.g., with children, observing play and art work may provide revelatory data that generate new insights). To be clear, we are not arguing that every researcher's goals should be seen as a fit for every journal, but that methodological integrity should be assessed in relation to the goals and features of each study.

Although they often are intermingled, explicating the functions of fidelity, utility, and integrity within the study design can help researchers to design and report their studies and reviewers to differentiate their thinking when evaluating studies. These recommendations are intended to augment the value of learning the distinctive procedures of various methods (e.g., theme analysis, conversational analysis, grounded theory) by considering the logic of study design when adapting them for use within specific individual studies or when reviewing qualitative research.

As described previously, this paper builds from the existing corpus of psychological writing on guidelines for qualitative research (e.g., Elliott et al., 1999; Stiles, 1993). Although overlap exists in the concepts being proposed, the current paper organizes and condenses the various recommendations into a framework that emphasizes their conceptual underpinnings. Importantly, this conceptual frame can replace fixed procedure-bound checklist evaluations by providing a flexible approach for grounding the assessment of trustworthiness across qualitative methods within the logic underpinning research design in these approaches. It provides a relatively straightforward schema for understanding the concepts driving design and review.

Recommendations for Journal Editors and Editorial Boards

Until now, recommendations have been proposed to inform researchers and reviewers. In this section, editors are presented with suggestions to best support their implementation:

1. Editors are encouraged to communicate to reviewers that applying evaluative criteria rooted in a philosophical tradition different from the research in question is inappropriate, unless the journal is explicitly committed to that tradition. For example, although a reviewer might request interrater reliability ratings in reviewing a post-positivist content analysis, it would not make sense for a phenomenological analysis. Within the former method, researchers tend to prioritize agreement from multiple perspectives in seeking an objective description of a phenomenon, but the latter approach tends to prioritize the in-depth understanding developed from intensive analysis. Understanding how the approaches to inquiry view both the nature of the phenomenon they study and their methods will enhance the appropriateness of review recommendations (e.g., Morrow, 2005).

2. It is impossible for any editor to have a depth of knowledge in all research traditions. Qualitative action editors with expertise across qualitative methods, however, can be effective in selecting reviewers and differentiating between good and poor reviews. These action editors may be better able to determine appropriate reviewers, weigh conflicting reviews, and make suggestions that are aligned with the research design in use.

3. Editors are encouraged to extend page limits so that qualitative researchers can describe their methods as well as present and contextualize findings adequately. An extension of 10 pages or more would be ideal, but this determination should be made in reference to the journal's style, existing page limits, and desire to have the methodological details and results descriptions that would support the paper's appraisal by both reviewers and readers. If an extension for the print version is not possible, editors could request, postacceptance, that authors place detailed method or results sections in online supplements or online versions of an article and then provide guidance on what information to delete from the print version. Otherwise, reviewers may expect submissions to conform to guidelines that were meant to support the reporting of quantitative articles but are inadequate for reporting qualitative studies.

4. Given that it may be impossible for authors to predict the level at which methodological details are desired, we encourage editors to invite authors to respond to reviews seeking greater methodological detail, especially when reviews are mixed. Qualitative researchers may be glad to provide further detail and may have withheld information in an effort to reduce their page count.

5. Within their instructions to authors and reviewers, editors can promote considerations of methodological integrity as a basis of evaluation. This recommendation can discourage reviewers from using checklists to evaluate methods, from inflexibly applying procedural rules from one approach to qualitative research to another, and from discouraging innovation and adaptation of methods to support rigorous study. Instead, it prompts reviewers to conduct a conceptually driven review and to tailor that review to the properties of the specific study under consideration. Editors can routinely include in their invitations to reviewers of qualitative manuscripts a link to this paper. In addition, they can include the link to an APA video on reviewing qualitative research that is based, in large part, upon the current paper and that reviewers can access without charge or the requirement of APA membership (Levitt, 2016b; http://www.apa.org/pubs/authors/review-manuscript-ce-video.aspx).

Although there is much variation in how journals and reviewers encourage researchers to present qualitative research, it is hoped that these recommendations will foster greater consensus and a higher caliber of qualitative research.

Conclusion

Our task force advocates for a way to design and review qualitative research such that two processes, fidelity and utility, are used to guide both design and evaluation in conjunction with the concept of methodological integrity. A list of principles is provided to illustrate the process of thinking through integrity, and a flowchart streamlines these ideas. Although these recommendations have been developed within the rhetoric of qualitative research, overlap with quantitative research exists, as some principles of good science apply broadly (Osbeck, 2014).

Instead of institutionalizing rules for authors that locate trustworthiness and rigor solely within set procedures, this approach is intended to promote a process of research design and evaluation that enhances the appreciation of diversity and complexity in qualitative research as well as supports ethical standards of research (Haverkamp, 2005; Shaw, 2008). In the evaluation of fidelity to subject and utility in achieving goals, we recommend that researchers and reviewers consider the interrelation among the goals of the researchers, the approach to inquiry, the study characteristics, and the methods of analysis. Future writings on integrity can elaborate on the workings of the relationships among these concepts and within research tasks, designs, and traditions. Above all, we encourage authors, reviewers, and editors to engage in discussions that support the continued development of qualitative methods, design, and review.

References

Bakan, D. (1967). On method: Toward a reconstruction of psychological investigation. San Francisco, CA: Jossey-Bass.

Braun, V., & Clarke, V. (2006). Using thematic analysis in psychology. Qualitative Research in Psychology, 3, 77–101. http://dx.doi.org/10.1191/1478088706qp063oa

Bryant, A., & Charmaz, K. (Eds.). (2010). The SAGE handbook of grounded theory. Thousand Oaks, CA: Sage.

Charmaz, K. (2014). Grounded theory in global perspective: Reviews by international researchers. Qualitative Inquiry, 20, 1074–1084. http://dx.doi.org/10.1177/1077800414545235

Churchill, S. D., & Wertz, F. J. (2001). An introduction to phenomenological research in psychology: Historical, conceptual and methodological contributions. In K. J. Schneider, J. F. T. Bugental, & J. F. Pierson (Eds.), The handbook of humanistic psychology: Leading edges in theory, research, and practice (pp. 248–262). Thousand Oaks, CA: Sage. http://dx.doi.org/10.4135/9781412976268.n19

Creswell, J. W. (2013). Research design: Qualitative, quantitative, and mixed methods approaches. Thousand Oaks, CA: Sage.

Danziger, K. (1990). Constructing the subject: Historical origins of psychological research. New York, NY: Cambridge University Press. http://dx.doi.org/10.1017/CBO9780511524059

Denzin, N. K., & Lincoln, Y. S. (Eds.). (2005). Handbook of qualitative research (3rd ed.). Thousand Oaks, CA: Sage.

Eagly, A. H., & Riger, S. (2014). Feminism and psychology: Critiques of methods and epistemology. American Psychologist, 69, 685–702. http://dx.doi.org/10.1037/a0037372

Elliott, R., Fischer, C. T., & Rennie, D. L. (1999). Evolving guidelines for publication of qualitative research studies in psychology and related fields. British Journal of Clinical Psychology, 38, 215–229. http://dx.doi.org/10.1348/014466599162782

Fassinger, R. E. (2005). Paradigms, praxis, problems, and promise: Grounded theory in counseling psychology research. Journal of Counseling Psychology, 52, 156–166. http://dx.doi.org/10.1037/0022-0167.52.2.156

Fine, M. (2013). Echoes of Bedford: A 20-year social psychology memoir on participatory action research hatched behind bars. American Psychologist, 68, 687–698. http://dx.doi.org/10.1037/a0034359

Fishman, D. (1999). The case for pragmatic psychology. New York, NY: NYU Press.

Freeman, M. (2014). Qualitative inquiry and the self-realization of psychological science. Qualitative Inquiry, 20, 119–126. http://dx.doi.org/10.1177/1077800413510270

Frieze, I. H. (2013). Guidelines for qualitative research being published. Sex Roles, 69, 1–2. http://dx.doi.org/10.1007/s11199-013-0286-z

Geertz, C. (1973). The interpretation of cultures. New York, NY: Basic Books.

Gergen, K. J. (2014). Pursuing excellence in qualitative inquiry. Qualitative Psychology, 1, 49–60. http://dx.doi.org/10.1037/qup0000002

Gergen, K. J., Josselson, R., & Freeman, M. (2015). The promises of qualitative inquiry. American Psychologist, 70, 1–9. http://dx.doi.org/10.1037/a0038597

Gergen, M. M., & Gergen, K. J. (2012). Playing with purpose: Adventures in performative social science. Walnut Creek, CA: Left Coast Press.

Gilligan, C. (2015). The Listening Guide method of psychological inquiry. Qualitative Psychology, 2, 69–77. http://dx.doi.org/10.1037/qup0000023

Gilligan, C., Spencer, R., Weinberg, M. K., & Bertsch, T. (2003). On the Listening Guide: A voice-centered relational method. In P. M. Camic, J. E. Rhodes, & L. Yardley (Eds.), Qualitative research in psychology: Expanding perspectives in methodology and design (pp. 157–172). Washington, DC: American Psychological Association. http://dx.doi.org/10.1037/10595-009

Giorgi, A. (2009). The descriptive phenomenological method in psychology: A modified Husserlian approach. Pittsburgh, PA: Duquesne University Press.

Glaser, B. G. (1998). Doing grounded theory: Issues and discussions. Mill Valley, CA: Sociology Press.

Glaser, B. G., & Strauss, A. L. (1967). The discovery of grounded theory: Strategies for qualitative research. London, UK: Wiedenfeld and Nicholson.

Guba, E. G., & Lincoln, Y. S. (2005). Paradigmatic controversies, contradictions, and emerging confluences. In N. K. Denzin & Y. S. Lincoln (Eds.), Handbook of qualitative research (pp. 191–215). Thousand Oaks, CA: Sage.


Haverkamp, B. E. (2005). Ethical perspectives on qualitative research in applied psychology. Journal of Counseling Psychology, 52, 146–155. http://dx.doi.org/10.1037/0022-0167.52.2.146

Hays, D. G., Wood, C., Dahl, H., & Kirk-Jenkins, A. (2016). Methodological rigor in Journal of Counseling and Development qualitative research articles: A 15 year review. Journal of Counseling & Development, 94, 172–183. http://dx.doi.org/10.1002/jcad.12074

Hesse-Biber, S. (2016). Qualitative or mixed methods research inquiry approaches: Some loose guidelines for publishing in Sex Roles. Sex Roles, 74, 6–9. http://dx.doi.org/10.1007/s11199-015-0568-8

Hill, C. E. (2012). Consensual qualitative research: A practical resource for investigating social science phenomena. Washington, DC: American Psychological Association.

Hoshmand, L. T. (2005). Narratology, cultural psychology, and counseling research. Journal of Counseling Psychology, 52, 178–186. http://dx.doi.org/10.1037/0022-0167.52.2.178

Hunt, B. (2011). Publishing qualitative research in counseling journals. Journal of Counseling & Development, 89, 296–300. http://dx.doi.org/10.1002/j.1556-6678.2011.tb00092.x

Josselson, R. (2013). Interviewing for qualitative inquiry: A relational approach. New York, NY: Guilford Press.

Kidd, S. A., & Kral, M. J. (2005). Practicing participatory action research. Journal of Counseling Psychology, 52, 187–195. http://dx.doi.org/10.1037/0022-0167.52.2.187

Krippendorff, K. (2013). Content analysis: An introduction to its methodology (3rd ed.). Thousand Oaks, CA: Sage.

Kvale, S., & Brinkmann, S. (2009). InterViews: Learning the craft of qualitative research interviewing (2nd ed.). Thousand Oaks, CA: Sage.

Letts, L., Wilkins, S., Law, M., Stewart, D., Bosch, J., & Westmorland, M. (2007). Guidelines for critical review form: Qualitative studies. Retrieved from http://www.canchild.ca/en/canchildresources/resources/qualguide.pdf

Levitt, H. M. (2014). Qualitative psychotherapy research: The journey so far and future directions. Psychotherapy: Theory, Research, & Practice, 52, 31–37. http://dx.doi.org/10.1037/a0037076

Levitt, H. M. (2015a). Interpretation-driven guidelines for designing and evaluating grounded theory research: A pragmatic-social justice approach. In O. Gelo, A. Pritz, & B. Rieken (Eds.), Psychotherapy research: Foundations, outcome and process (pp. 455–483). New York, NY: Springer. http://dx.doi.org/10.1007/978-3-7091-1382-0_22

Levitt, H. M. (2015b). Recommendations for designing and reviewing qualitative research: The role of methodological integrity. In A. Bohart (Chair), Evolving understandings on trustworthiness or rigor in qualitative research. Presented at the Society for Psychotherapy Research, Philadelphia, PA.

Levitt, H. M. (2016a). Qualitative approaches. In J. C. Norcross, G. R. VandenBos, D. K. Freedheim, & B. O. Olatunji (Eds.), APA handbook of clinical psychology: Volume II. Clinical psychology: Theory and research (pp. 335–348). Washington, DC: American Psychological Association.

Levitt, H. M. (2016b). Recommendations for reviewing qualitative research. Film in the American Psychological Association's Continuing Education Division. Washington, DC: American Psychological Association. Retrieved from http://www.apa.org/pubs/authors/review-manuscript-ce-video.aspx

Levitt, H. M. (2016c). Considering researcher consensus and generalization in qualitative research. Manuscript in preparation. University of Massachusetts Boston, Boston, MA.

Levitt, H. M., Bamberg, M., Josselson, R., & Wertz, F. (2016). Considering methodological integrity in the review process. In H. M. Levitt (Chair), Developing standards for qualitative research in psychology. Paper presented at the Society for Qualitative Inquiry in Psychology Conference, Ramapo, NJ.

Levitt, H. M., Lu, E. C., Pomerville, A., & Surace, F. I. (2015). Pursuing the question of reflexivity in psychotherapy and qualitative methods: The contributions of David L. Rennie. Counselling & Psychotherapy Research, 15, 3–11.

Levitt, H. M., Morrow, S. L., Wertz, F. J., Motulsky, S. L., & Ponterotto, J. G. (2014). Inviting comments on the evolving recommendations for reviewing qualitative research in psychology. Paper presented at the 122nd American Psychological Association Annual Convention, Washington, DC.

Levitt, H. M., Motulsky, S. L., Wertz, F. J., & Josselson, R. (2015). Recommendations for designing and reviewing qualitative research from the SQIP task force. Conversation hour held at the 123rd American Psychological Association Annual Convention, Toronto, Ontario, Canada.

Levitt, H. M., Motulsky, S. L., Wertz, F. J., Josselson, R., & Ponterotto, J. G. (2015). Recommendations for designing and reviewing qualitative research from the SQIP task force. Annual meeting of the Society of Qualitative Inquiry in Psychology, New York, NY.

Levitt, H. M., Neimeyer, R. A., & Williams, D. C. (2005). Rules versus principles in psychotherapy: Implications of the quest for universal guidelines in the movement for empirically supported treatments. Journal of Contemporary Psychotherapy, 35, 117–129. http://dx.doi.org/10.1007/s10879-005-0807-3

Levitt, H. M., & Williams, D. C. (2010). Facilitatingclient change: Principles based upon the experi-ence of eminent psychotherapists. PsychotherapyResearch, 20, 337–352. http://dx.doi.org/10.1080/10503300903476708

Mallinckrodt, B. (2010). Guidelines for reviewingmanuscripts for the Journal of Counseling Psy-chology. Retrieved from http://www.apa.org/pubs/journals/features/cou-reviewer-guidelines.pdf

Mertens, D. M. (2012). Ethics and social justice inethnocultural qualitative research. In D. K. Nagata,L. Kohn-Wood, & L. A. Suzuki (Eds.), Qualitativestrategies for ethnocultural research (pp. 61–84).Washington, DC: American Psychological Associ-ation. http://dx.doi.org/10.1037/13742-004

Morgan, D. L. (2007). Paradigms lost and pragma-tism regained: Methodological implications ofcombining qualitative and quantitative methods.Journal of Mixed Methods Research, 1, 48–76.http://dx.doi.org/10.1177/2345678906292462

Morrow, S. L. (2005). Quality and trustworthiness inqualitative research in counseling psychology.Journal of Counseling Psychology, 52, 250–260.http://dx.doi.org/10.1037/0022-0167.52.2.250

Morrow, S. L., & Smith, M. L. (2000). Qualitative research for counseling psychology. In S. D. Brown & R. W. Lent (Eds.), Handbook of counseling psychology (3rd ed., pp. 199–230). Hoboken, NJ: Wiley.

Motulsky, S. L. (2010). Relational processes in career transition: Extending theory, research, and practice. The Counseling Psychologist, 38, 1078–1114. http://dx.doi.org/10.1177/0011000010376415

Osbeck, L. M. (2014). Scientific reasoning as sense making: Implications for qualitative inquiry. Qualitative Psychology, 1, 34–46. http://dx.doi.org/10.1037/qup0000004

Parker, I. (2004). Criteria for qualitative research in psychology. Qualitative Research in Psychology, 1, 95–106. http://dx.doi.org/10.1191/1478088704qp010oa

Patton, M. Q. (2015). Qualitative research and evaluation methods: Integrating theory and practice (4th ed.). Thousand Oaks, CA: Sage.

Polkinghorne, D. E. (2005). Language and meaning: Data collection in qualitative research. Journal of Counseling Psychology, 52, 137–145. http://dx.doi.org/10.1037/0022-0167.52.2.137

Ponterotto, J. G. (2005a). Integrating qualitative research requirements into professional psychology training programs in North America: Rationale and curriculum model. Qualitative Research in Psychology, 2, 97–116. http://dx.doi.org/10.1191/1478088705qp035oa

Ponterotto, J. G. (2005b). Qualitative research in counseling psychology: A primer on research paradigms and philosophy of science. Journal of Counseling Psychology, 52, 126–136. http://dx.doi.org/10.1037/0022-0167.52.2.126

Ponterotto, J. G. (2005c). Qualitative research training in counseling psychology: A survey of directors of training. Teaching of Psychology, 32, 60–62.

Ponterotto, J. G. (2006). Brief note on the origins, evolution, and meaning of the qualitative research concept “thick description.” Qualitative Report, 11, 538–549.

Rennie, D. L. (1995). On the rhetorics of social science: Let’s not conflate natural science and human science. The Humanistic Psychologist, 23, 321–332. http://dx.doi.org/10.1080/08873267.1995.9986833

Rennie, D. L. (2000). Grounded theory methodology as methodical hermeneutics: Reconciling realism and relativism. Theory & Psychology, 10, 481–502. http://dx.doi.org/10.1177/0959354300104003

Rennie, D. L. (2004). Anglo-North American qualitative counseling and psychotherapy research. Psychotherapy Research, 14, 37–55. http://dx.doi.org/10.1093/ptr/kph003

Rennie, D. L. (2012). Qualitative research as methodical hermeneutics. Psychological Methods, 17, 385–398. http://dx.doi.org/10.1037/a0029250

Rogers, A. G. (2000). When methods matter: Qualitative research issues in psychology. In B. M. Brizuela, J. P. Stewart, R. G. Carrillo, & J. G. Berger (Eds.), Acts of inquiry in qualitative research (pp. 51–60). Cambridge, MA: Harvard Educational Review.

Shaw, I. (2008). Ethics and the practice of qualitative research. Qualitative Social Work: Research and Practice, 7, 400–414. http://dx.doi.org/10.1177/1473325008097137

Staller, K. M. (2013). Epistemological boot camp: The politics of science and what every qualitative researcher needs to know to survive in the academy. Qualitative Social Work: Research and Practice, 12, 395–413. http://dx.doi.org/10.1177/1473325012450483

Stiles, W. B. (1993). Quality control in qualitative research. Clinical Psychology Review, 13, 593–618. http://dx.doi.org/10.1016/0272-7358(93)90048-Q

Strauss, A. L., & Corbin, J. (1990). Basics of qualitative research: Grounded theory procedures and techniques. Thousand Oaks, CA: Sage.

Suzuki, L. A., Ahluwalia, M. K., Mattis, J. S., & Quizon, C. A. (2005). Ethnography in counseling psychology research: Possibilities for application. Journal of Counseling Psychology, 52, 206–214. http://dx.doi.org/10.1037/0022-0167.52.2.206


Tracy, S. J. (2010). Qualitative quality: Eight “big-tent” criteria for excellent qualitative research. Qualitative Inquiry, 16, 837–851. http://dx.doi.org/10.1177/1077800410383121

Wertz, F. J. (1983). From everyday to psychological description: Analyzing the moments of a qualitative data analysis. Journal of Phenomenological Psychology, 14, 197–241. http://dx.doi.org/10.1163/156916283X00108

Wertz, F. J. (1987). Common methodological fundaments of the analytic procedures in phenomenological and psychoanalytic research. Psychoanalysis & Contemporary Thought, 9, 563–603. Retrieved from http://fordham.bepress.com/psych_facultypubs/25

Wertz, F. J. (1999). Multiple methods in psychology: Epistemological grounding and the possibility of unity. Journal of Theoretical and Philosophical Psychology, 19, 131–166. http://dx.doi.org/10.1037/h0091173

Wertz, F. J. (2005). Phenomenological research methods for counseling psychology. Journal of Counseling Psychology, 52, 167–177. http://dx.doi.org/10.1037/0022-0167.52.2.167

Wertz, F. J. (2014). Qualitative inquiry in the history of psychology. Qualitative Psychology, 1, 4–16. http://dx.doi.org/10.1037/qup0000007

Wertz, F. J., Charmaz, K., McMullen, L., Josselson, R., Anderson, R., & McSpadden, E. (2011). Five ways of doing qualitative analysis: Phenomenological psychology, grounded theory, discourse analysis, narrative research, and intuitive inquiry. New York, NY: Guilford Press.

Williams, E. N., & Morrow, S. L. (2009). Achieving trustworthiness in qualitative research: A pan-paradigmatic perspective. Psychotherapy Research, 19, 576–582. http://dx.doi.org/10.1080/10503300802702113

Received January 21, 2016
Revision received September 8, 2016
Accepted September 26, 2016
