Beyond the Bubble in History/Social Studies Assessments: To Prepare Students for Assessments Tied to the Common Core, Teachers Need Tools and Tests That Help Students Analyze Primary and Secondary Sources and Develop Written Historical Arguments

The wait is over. The Common Core State Standards have arrived in public schools. Like a long-awaited Hollywood blockbuster, the Common Core has been the subject of intense anticipation, speculation, and scrutiny. Teachers and administrators hurried to get ready. A mini-industry of how-to guides, curriculum maps, and professional development workshops has sprouted. Yet, despite all this effort and the welcome focus on literacy, teachers of history/social studies still lack adequate resources to implement these standards. The biggest trouble spot is assessment.

The Common Core introduces ambitious goals for student learning. In history/social studies, students are expected to analyze primary and secondary sources, cite textual evidence to support arguments, consider the influence of an author's perspective, corroborate different sources, and develop written historical arguments, all crucial skills if students are to succeed in college and beyond. These skills also represent a radical turn from what was emphasized during a decade of relentless standardized testing. But if students are to master these skills, teachers need tools to monitor growth, identify where students are having trouble, and figure out how best to help them. What tools do teachers have to do this?

Multiple-choice tests continue to dominate assessment across all subjects, but especially in history (Martin, Maldonado, Schneider, & Smith, 2011). It's easy to understand the affinity for multiple-choice tests: They're quick and inexpensive, and the number-right score provides a seductive (if false) sense of precision. But expecting multiple-choice tests to measure sophisticated cognitive capacities is like using a pocketknife to do surgery. Multiple-choice questions are perhaps suited to measuring aspects of factual recall, but they are ineffective for gauging the higher-order thinking demanded by the Common Core.

But this doesn't stop state departments of education from trying to use them, often with absurd results. Consider this standard from California's History/Social Science Framework. It asks students to "interpret past events and issues within the context that an event unfolded rather than solely in terms of present day norms and values" (California State Department of Education, 1998, p. 41). Historians call this the ability to overcome presentism (Hunt, 2002): to see beyond our own brief lifetime into the expanse of human history and to grasp how people in the past conceived of their world.

Now, consider an item used to measure this understanding on California's year-end state test:

Which was one outcome of World War II?

A. England and France increased their overseas possessions.

B. The communists gained control over most of Western Europe.

C. Japan and Germany became dominant military powers in their regions.

D. The Soviet Union emerged as an international superpower. (California State Department of Education, 2009, p. 23)

Strong students will readily identify D as the correct answer, but what happened to interpretation? Or placing events in context? What happened, in short, to thinking? If we want students to develop the skills laid out in the Common Core, it makes little sense to ask them to pick facts from a bounded list of dubious distracters.

But what are the alternatives? In history/social studies, the most highly touted one is the document-based question made famous by the College Board's Advanced Placement Program. Widely known by its acronym, the DBQ asks students to read 10 to 12 documents, formulate a thesis based on them, plan an argumentative essay, compose that essay, and then proofread it for clarity, coherence, and correctness, all in one hour. To its credit, the DBQ calls on many of the literacy skills identified by the Common Core: the ability to read multiple sources, evaluate claims, and mount arguments using evidence.

Still, given all of these moving parts, it is unclear what, exactly, the DBQ measures. …