A Factor Analytic Model of Eighth-Grade Art Learning: Secondary Analysis of NAEP Arts Data

Investigators undertook a secondary analysis of 1997 National Assessment of Educational Progress (NAEP) Arts data to clarify the structure of relationships associated with visual arts achievement among eighth-grade students. Items from three NAEP surveys, answered by students and school personnel as part of the visual arts assessment, were treated as sets of indicators for underlying constructs. Confirmatory factor analysis was used to test a structural model comprising three categories: demographics (background characteristics), resources, and opportunities to learn. Factors within the three categories were found to be significantly related to the dependent variables for arts achievement (responding and creating). The secondary analysis reported here embraces the rich complexity of the NAEP arts assessment, postulating that coordinated study of its vision, framework, procedures, and data has enormous potential to inform art educators and policy makers, particularly those interested in America's middle schools.
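To make the modeling approach concrete, the following is a minimal, hypothetical sketch in Python of how a measurement-plus-structural model of this general shape might be specified and fitted with the semopy package. It uses simulated data and invented indicator names (bg1, res1, otl1, etc.), not the NAEP items or the authors' actual specification; it is offered only as an illustration of confirmatory factor analysis with three latent predictor categories and two observed achievement outcomes.

```python
# Hypothetical sketch: CFA/structural model with three latent categories
# (Background, Resources, Opportunity) predicting two achievement outcomes
# (responding, creating). Variable names and data are simulated, not NAEP.
import numpy as np
import pandas as pd
import semopy

rng = np.random.default_rng(0)
n = 500

# Simulate three correlated latent predictors.
latent = rng.multivariate_normal(
    mean=[0.0, 0.0, 0.0],
    cov=[[1.0, 0.3, 0.3],
         [0.3, 1.0, 0.4],
         [0.3, 0.4, 1.0]],
    size=n,
)
background, resources, opportunity = latent.T

def indicators(factor, prefix, k=3):
    """Generate k noisy observed indicators for one latent factor."""
    return {f"{prefix}{i + 1}": factor + rng.normal(scale=0.7, size=n)
            for i in range(k)}

data = pd.DataFrame({
    **indicators(background, "bg"),
    **indicators(resources, "res"),
    **indicators(opportunity, "otl"),
    # Achievement outcomes as weighted combinations of the latents plus noise.
    "responding": 0.4 * background + 0.3 * resources + 0.5 * opportunity
                  + rng.normal(scale=0.8, size=n),
    "creating":   0.3 * background + 0.4 * resources + 0.4 * opportunity
                  + rng.normal(scale=0.8, size=n),
})

# lavaan-style syntax: =~ defines measurement models, ~ defines regressions.
model_desc = """
Background  =~ bg1 + bg2 + bg3
Resources   =~ res1 + res2 + res3
Opportunity =~ otl1 + otl2 + otl3

responding ~ Background + Resources + Opportunity
creating   ~ Background + Resources + Opportunity
"""

model = semopy.Model(model_desc)
model.fit(data)
print(model.inspect())           # parameter estimates
print(semopy.calc_stats(model))  # fit indices (CFI, RMSEA, etc.)
```

In a secondary analysis of this kind, the simulated indicators would be replaced by the actual survey items, and model fit statistics would guide retention or revision of the hypothesized factor sets.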

Preamble

A visual arts consortium formed in 1999 to study statistical data from the 1997 NAEP project (Diket, Burton, McCollister, & Sabol, 2000), responding to an open invitation, issued at the National Art Education Association conference in Washington, to apply for funding under a secondary analysis grant. Three investigative plans emerged from collaborative planning efforts. The lead investigator would test the consortium's structural model (see Figure 1), which represented responses and sets of responses from NAEP survey instruments. The primary question was: What factor sets, drawn from eighth-grade students' general and art backgrounds and from data provided at the school level, affect arts achievement? Other investigators would use confirmed variable sets from the model in examinations of regional and quartile variation (see Sabol & Burton, this issue). In 2000, group members received a secondary analysis grant from the National Center for Education Statistics (NCES) and the Department of Education. Findings from the first year of investigation were reported at the NAEA annual conference in New York and at the AERA annual meeting in Seattle. The grant team also connected with Richard Siegesmund (see this issue), who had recently completed a school-level NAEP replication in California, enlarging the loose consortium of investigators in the visual arts. The investigators continue to work interactively, sharing findings and consulting on decision paths as study of the NAEP data proceeds.

Introduction

The National Assessment of Educational Progress (NAEP) in the Arts illuminated several concerns in the field: (1) Does arts achievement for students taking art in eighth grade depend on teacher preparation, instructional delivery, and collaborative activity with community arts agencies? (2) How important is the family environment to arts learning? (3) What does student motivation in art contribute to art achievement?

A related issue concerned the development of national standards (which include expectations for writing components and experience with a variety of media), a process that coincided with the development of the national assessment project. The respective committees intended a smooth articulation between standards and assessment (Arts Education Consensus Project Team, 1994). Constructed answers and artistic products are not typical fare in NAEP educational assessments, which stress objective items; constructed tasks, however, have long been considered standard, even exemplary, practice in arts assessment (Armstrong, 1994; Beattie, 1997). Because the national standards were released between the piloting and the full implementation of the NAEP arts assessment, the 1997 assessment provides carefully articulated benchmarks against which to measure the subsequent infusion of standards and exemplary practices into arts education, as reflected in student performance in American schools. …