Academic journal article Canadian Psychology

Matching the Limits of Clinical Inference to the Limits of Quantitative Methods: A Formal Appeal to Practice What We Consistently Preach

Article excerpt

Abstract

Valid clinical inferences may be said to emanate directly from both substantive and quantitative considerations. To the extent that either component houses deficiencies, clinical inferences will be undermined. The quantitative literature has repeatedly stressed the need to strive for homogeneity in the "objective-analysis-inference" chain. However, in spite of its importance to valid clinical inference, this reminder has been overlooked with alarming frequency. This paper outlines three of the most common cases in which quantitative errors have undermined data-based inferences: precomputational data aggregation, data residualization of one form or another, and post hoc significance testing. In each instance, the errors may be corrected so as to restore homogeneity among the quantitative research components. Solutions in each case are discussed and illustrated with reference to published and unpublished examples. It is suggested that deployment of these solutions will enhance the quality of quantitative information and, thus, the quality of clinical inferences.

In an effort to enhance the treatment of clients, and/or to improve institutional practices, the clinician is continually required to engage in the process of decision-making (see Chassan, 1979; Hoch, 1982; Kazdin, 1980; Neufeld, 1977; Shakow, 1982). The integrity of decision-making is inherently related to the integrity of the information upon which it is based. In the clinical setting, the results of empirical investigations should, in principle, constitute a primary source of information. In using empirical data to arrive at meaningful decisions, a clinician must be able to interpret extant research findings validly. In the absence of such findings, the clinician may additionally find it beneficial to conduct problem-relevant research. The ability to engage usefully in either of these latter two activities depends crucially on a particular integrative skill: the ability to integrate and apply knowledge from both the substantive and quantitative domains.

The notion that valid inferences depend on appropriate marriages between substantive and quantitative considerations is not new (e.g., see Shapiro, 1957). Historically, methodology textbooks have meticulously reminded us that research designs should emanate from such integrations. In addition, they have reminded us that there should always be coherence among the primary components of a research problem. That is, they have suggested that the objectives of a study should be consistent with the chosen observational units, statistical analyses, and levels of inference. Thus, if it is a researcher's objective to work with aggregated data, the statistical analyses should employ aggregated scores as the primary computational units. In addition, final inferences should be restricted to the aggregate level of analysis so as to maintain homogeneity among the research components. In the previous example, analyzing non-aggregated scores would disrupt this desired state of coherence. Making inferences about non-aggregated versus aggregated computational units would constitute a similar type of error. In the present paper, coherence among the components of a research problem will be referred to as consistency in the "objective-analysis-inference" chain. It will be contended that aiming for such consistency will act to mitigate certain errors frequently made within the quantitative domain.
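To make the aggregation point concrete, the following is a minimal numerical sketch (not drawn from the article's own examples; the clinics, variables, and effect sizes are hypothetical). It shows how a relation estimated from aggregated scores (clinic means) can differ in sign from the relation that holds at the level of individual clients, so that inferences must be stated at the level of analysis actually used.

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical scenario: 10 clinics, 30 clients each. Within every clinic,
# higher dosage goes with LOWER symptom severity, but clinics with higher
# average dosage serve more severe caseloads, so the clinic means are
# POSITIVELY related.
n_clinics, n_clients = 10, 30
clinic_mu = rng.uniform(2, 8, n_clinics)  # clinic-level dosage means

dosage, severity, centered_d, centered_s = [], [], [], []
for mu in clinic_mu:
    d = rng.normal(mu, 1.0, n_clients)
    # Within-clinic slope of -0.8: more dosage, less severity.
    s = mu - 0.8 * (d - mu) + rng.normal(0, 0.5, n_clients)
    dosage.append(d)
    severity.append(s)
    centered_d.append(d - d.mean())  # remove the clinic mean
    centered_s.append(s - s.mean())

# Aggregate-level analysis: correlation computed on clinic means.
mean_d = [d.mean() for d in dosage]
mean_s = [s.mean() for s in severity]
r_aggregate = np.corrcoef(mean_d, mean_s)[0, 1]

# Individual-level analysis: correlation of clinic-centered scores.
r_within = np.corrcoef(np.concatenate(centered_d),
                       np.concatenate(centered_s))[0, 1]

print(f"aggregate-level r (clinic means): {r_aggregate:+.2f}")  # ~ +1.0
print(f"within-clinic r (individuals):    {r_within:+.2f}")     # ~ -0.8
```

Under these deliberately constructed assumptions, the aggregate-level correlation is strongly positive while the within-clinic correlation is strongly negative; an analysis of clinic means therefore licenses only aggregate-level inferences, and extending it to individual clients would reverse the sign of the conclusion.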

The literature is replete with examples in which the consistency of the "objective-analysis-inference" chain is disrupted. Such disruptions can complicate, and even prevent, psychologically meaningful inferences, and can thus undermine the integrity of decisional processes. In this paper, lack of coherence will be discussed in relation to three common quantitative procedures: precomputational data aggregation, methods of data residualization, and post hoc significance testing. …
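As a further illustration, the sketch below shows what one common form of data residualization involves, again with purely hypothetical variables and effect sizes rather than the article's own cases: a post-treatment score is regressed on a baseline covariate, and the residuals are carried into a subsequent comparison. The quantity actually tested is the residualized composite, so the eventual inference must be phrased in those terms rather than in terms of the raw scores.

```python
import numpy as np

rng = np.random.default_rng(2)

# Hypothetical data: post-treatment anxiety depends on baseline anxiety
# and on a binary treatment indicator.
n = 200
baseline = rng.normal(50, 10, n)
treatment = rng.binomial(1, 0.5, n)
post = 0.6 * baseline - 5.0 * treatment + rng.normal(0, 5, n)

# Residualize: least-squares fit of post on baseline, keep the residuals.
slope, intercept = np.polyfit(baseline, post, 1)
residual_post = post - (intercept + slope * baseline)

# Any subsequent comparison (e.g., treated vs. untreated means) is now a
# comparison of residualized scores, not of raw post-treatment anxiety,
# and the inference must be stated at that level.
diff_raw = post[treatment == 1].mean() - post[treatment == 0].mean()
diff_res = (residual_post[treatment == 1].mean()
            - residual_post[treatment == 0].mean())
print(f"raw post-treatment difference:         {diff_raw:+.2f}")
print(f"residualized (baseline-adjusted) diff: {diff_res:+.2f}")
```

The residuals are a new variable, defined jointly by the original measure and the fitted regression; keeping the objective, the analysis, and the inference aligned means acknowledging that shift explicitly.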
