Magazine article National Association of School Psychologists. Communique

The Implications of "Estimating the Reproducibility of Psychological Science" for School Psychology Research

Article excerpt

In August 2015, in an article entitled "Estimating the Reproducibility of Psychological Science," the journal Science reported the findings of a large-scale collaboration of researchers, collectively known as the Open Science Collaboration (OSC), who replicated 100 experiments published in three well-regarded peer-reviewed psychology journals (OSC, 2015). The opening sentence of the abstract was telling: "Reproducibility is a defining feature of science, but the extent to which it characterizes current research is unknown" (p. 943). The OSC reported that while 97% of the original studies yielded significant findings, only 36% of the replications did so.

The most prominent visual in the article was a scatterplot depicting the low correlation between original and replication effect sizes. Replication effect sizes were almost always lower than those of the original studies, often to a large degree: the OSC reported that the average effect size of the replications (r = .20) was half that of the original studies (r = .40). However, larger original effect sizes and greater replication power were associated with greater replication success. These findings likely reflect several outstanding issues in psychological research at large: a lack of enduring lines of replicated findings; overreliance on significance tests, coupled with a passive assumption that family-wise error has been accurately controlled and reported; the complexity of procedures and analyses; unexamined faith in the truthful reporting of experimental methods; and confirmation bias. These issues are not localized to psychology; similar concerns have been raised in other scientific fields, including biology and medicine (e.g., Errington et al., 2014; Ioannidis, 2005).
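The family-wise error concern mentioned above can be made concrete with a brief sketch (illustrative only; these numbers are not from the OSC report): when a study runs several significance tests, each at α = .05, the probability of at least one false positive grows quickly with the number of tests.

```python
# Family-wise error rate (FWER): probability of at least one false
# positive across m independent significance tests at alpha = .05.
# Illustrative arithmetic only, not an analysis from OSC (2015).
alpha = 0.05
for m in (1, 5, 20):
    fwer = 1 - (1 - alpha) ** m
    print(f"{m:>2} tests: FWER = {fwer:.2f}")
```

With 20 uncorrected tests, the chance of at least one spurious "significant" result is roughly 64%, which is why unreported multiple comparisons can inflate the apparent success rate of original studies.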

The psychological community's response appeared fairly muted, the general reception being that the results came as a surprise to no one. Perhaps this is because prominent leaders in the field had long warned that such a pattern would emerge. The esteemed developmental researcher Jerome Kagan had previously expressed concern about overconfidence in the external validity of psychological research, a brief attention span for research lines, and experimenter biases infecting studies (Winerman, 2012). Shadish, Cook, and Campbell (2002) noted, "Any single scientific study is an exercise in trust ... The ratio of trust to skepticism in any given study is more like 99% trust to 1% skepticism than the opposite" (p. 29). John, Loewenstein, and Prelec (2012) surveyed anonymous psychological researchers about the truthfulness of their published findings and found that a majority admitted to integrity violations, ranging from rearranging hypotheses to fit collected data to excluding undesirable data.

The concern for school psychology is how valid the practice recommendations generated from its research base are. Although the OSC (2015) sample appears to have excluded any study directly related to school psychology practice, the cross-disciplinary trend is clear: all branches of psychology likely contain a large proportion of nonreplicable findings in their published literature, which raises questions about the validity of the original research. Peer review is one way scientific communities weed out poor research, but this study shows it to be an imperfect filter. A number of scholars have contributed rubrics for evaluating scientific claims. For example, Lilienfeld, Ammirati, and David (2012) described common warning signs of questionable psychological practices, as well as analytic tools that consumers of research can apply to evaluate extant practices objectively. Below I discuss a basic set of such tools and their potential implications for school psychologists as consumers of research.

Replicate, replicate, replicate. …
