Appraising Qualitative Research Reports: A Developmental Approach
Cooper, Robin, The Qualitative Report
There is a rich and helpful body of academic literature related to writing and appraising qualitative research. This literature includes discussion of what constitutes quality in qualitative research, as well as what contributes to a quality research report. In addition, several appraisal tools are available to help those seeking to assess the quality of qualitative research reports. All of the Editors of The Qualitative Report think about how to evaluate qualitative research reports on a regular basis. We consider this issue from both a process perspective and a quality control perspective. In this paper, I describe how we as Editors of TQR came to the decision to develop our own appraisal tool for papers submitted to the journal, introduce the tool we developed, the TQR Rubric (Chenail, Cooper, Patron, & TQR Associates, 2011), and outline the developmental approach that guides our editorial processes and products.
Quality in Qualitative Research
Flick (2007) points out that the question of quality in qualitative research can be addressed on four different levels: researchers interested in learning how well they have conducted their research; funding bodies seeking to determine which studies should be funded, or evaluating research already funded; journal editors deciding which research reports to publish; and readers hoping to learn what research they can rely on in their own work. In this paper we address quality in qualitative research from the perspective of journal editors. As Flick notes, "Here, the quality issue is in some way doubled. Consideration of rigour and criteria in the research is seen as essential if the research is to be published. The research in its presentation has to be linked back to existing literature, for example--which is a criterion at the level of presentation" (p. 5).
In the peer review of manuscripts for journals, "a growing number of guidelines for assessing research papers (articles, proposals) are developed, used and published in different fields of application" (Flick, 2007, p. 22). For example, Kitto, Chesters, and Grbich (2008) describe how they assess the quality of submissions to the Medical Journal of Australia, where their focus is on the rigour of research and the transferability of findings. In addition to journals having their own preferences for manuscript quality, other groups have developed their own tools to appraise completed articles as part of systematic reviews of previously published work. The recent work by Hannes, Lockwood, and Pearson (2010) is one attempt to provide a comparison of three such appraisal instruments available online free of charge: the Joanna Briggs Institute (JBI) tool, the Critical Appraisal Skills Programme (CASP) tool, and the Evaluation Tool for Qualitative Studies (ETQS). In addition to these qualitative-research-specific instruments, many journal editors use the Publication Manual of the American Psychological Association (2010) as a guide to the quality of a manuscript. However, as Polkinghorne (2010) observes, the APA author guidelines were originally developed based on quantitative research reports, which can sometimes present a challenge to authors of qualitative research reports.
In this environment of transparency regarding the articulation of what constitutes quality in qualitative research reporting, it is apparent that the context of the appraising body is an important factor in what the reviewer demarcates as quality and in how these preferences are communicated to the general public as well as to potential and actual authors. In this spirit of localization and transparency, the Editors of The Qualitative Report embarked on a process of self-reflection on what constitutes quality in qualitative research writing from our local perspective and history. The goal of such an endeavor was first to make the TQR preferences overt to the journal's internal community and second to produce a communication device through which these practices could be made more transparent. …