Academic journal article Exceptional Children

Curriculum-Based Assessment and Direct Instruction: Critical Reflections on Fundamental Assumptions


ABSTRACT: This article critiques the fundamental assumptions about what counts as knowledge and how knowledge is claimed that underlie Curriculum-Based Assessment (CBA) and Direct Instruction (DI). The central conclusion is that CBA/DI reflects not a model of assessment and instruction for human learning, but an isolated set of measurement and control procedures that are superimposed on, but unrelated to, the human phenomena they claim to assess. Holistic understandings of assessment that emerge directly from the characteristically human aspects of learning and teaching and from contemporary understandings of the construction of knowledge are advocated instead.

For four wicked centuries the world has dreamed this foolish dream of efficiency, and the end is not yet.

George Bernard Shaw

in Courtney (1982, p. 68)

A perusal of the special education literature of the last 5 years readily shows the popularity of curriculum-based assessment (CBA) and direct instruction (DI). CBA offers a set of measurement and control procedures for both assessment and instruction: Assessment outcomes are immediate indicators of what needs to be taught next. As Tucker (1985) stated: "CBA is the ultimate in 'teaching the test'" (p. 199). DI "teaches the test" with even greater exactness by adding precise sequences of specific content components; explicit, scripted instructional steps to teach each component; and precise feedback procedures to deal with student errors.

The CBA literature typically has prescribed the following steps:

* List skills presented in the curriculum materials in logical order.

* Write objectives for each skill in behavioral terms so that correct responses to controlled tasks can be counted.

* Construct test items.

* Set level of mastery desired.

* Assess before instruction for level of mastery.

* Conduct instructional intervention.

* Count correct responses on the specified objectives (advice on how often to do so varies from frequently, to at least twice a week, to daily if possible).

* Analyze performance trends to judge the adequacy of the intervention, as sketched after this list (for specific listings of steps see, e.g., Blankenship, 1985; Deno, 1987; Fuchs, Hamlett, Fuchs, Stecker, & Ferguson, 1988; Idol & Ritter, 1987).
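To make concrete the kind of measurement-and-control loop these steps prescribe, the following minimal sketch shows how counted correct responses on repeated probes might be turned into a trend and compared with the growth needed to reach a preset mastery criterion. The sketch is not drawn from the CBA literature itself; the function names, data, and decision rule are hypothetical, chosen only for illustration.

```python
# Illustrative sketch only: a minimal rendering of the CBA monitoring loop,
# in which correct responses on repeated curriculum probes are counted and
# the trend in those counts is used to judge the adequacy of intervention.
# All names, data, and the decision rule are hypothetical.

def trend_slope(scores):
    """Least-squares slope of correct-response counts across sessions."""
    n = len(scores)
    xs = range(n)
    mean_x = sum(xs) / n
    mean_y = sum(scores) / n
    numerator = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, scores))
    denominator = sum((x - mean_x) ** 2 for x in xs)
    return numerator / denominator

def judge_intervention(scores, mastery_criterion, sessions_remaining):
    """Compare observed growth with the growth needed to reach mastery."""
    needed = (mastery_criterion - scores[-1]) / max(sessions_remaining, 1)
    observed = trend_slope(scores)
    return "continue intervention" if observed >= needed else "modify intervention"

# Twice-weekly counts of correct responses on one behavioral objective.
probe_scores = [12, 14, 13, 17, 18, 21]
print(trend_slope(probe_scores))                 # observed gain per session
print(judge_intervention(probe_scores, 40, 10))  # decision based on the trend
```

The point of the sketch is only to show how thoroughly the procedure reduces learning to counted responses and trend lines; it is precisely this reduction, and the assumptions beneath it, that the remainder of this article questions.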

Other labels have been used to refer to essentially the same constructs and procedures. Bursuck and Lessen (1987), for example, used the terms curriculum-based assessment and instructional design. Idol and Ritter (1987) and Jones and Krouse (1988) used the term data-based instruction (DBI). Deno (1987), Fuchs et al. (1988), and Wesson, Fuchs, Tindal, Mirkin, and Deno (1986) used curriculum-based measurement. Fuchs, Deno, and Mirkin (1984) used the label data-based program modification, and Deno and Fuchs (1987) used the term curriculum-based progress monitoring.

At the level of methodology, CBA and DI are compatible, but differ in detail. DI can be seen as complementing CBA by adding actual prescriptive content. At the level of fundamental assumptions, however, they share the same ontological and epistemological beliefs about what can count as valid knowledge (and by implication what cannot) and how one is (or is not) allowed to claim such knowledge. The purpose of this article is to discuss these fundamental assumptions and to render a critical analysis of them.

SCOPE OF CBA/DI

The measurement and control procedures of CBA/DI are wide ranging. First, these techniques offer a framework for teacher preparation. Idol and Ritter (1987) concluded their description of a teacher training program on DBI as follows: "A majority [of teachers] may consider DBI to be the single most important skill learned in the course of their preparation" (p. 69). A survey of the methodological content of teacher training programs in learning disabilities (Pugach & Whitten, 1987) showed that up to 69% of the programs surveyed fell into the related categories of CBA, DI, and DBI. …
