Surveying Science Literacy among Undergraduates: Insights from Open-Ended Responses
Jessie Antonellis, Sanlyn Buxner, Chris Impey, and Hannah Sugarman, Journal of College Science Teaching
Instructors who take constructivist, learner-centered approaches to teaching know that students come to the classroom with their own histories of learning that influence the way they respond to and process new information. It is important to acknowledge and engage these learning histories so that students can connect their prior knowledge with the new knowledge presented to them. Often, the topics covered in science courses are not entirely new to students; they have had nearly 20 years to experience the world and construct their own notions of how it works. The question is: What do students think about these topics when they come into the classroom? Are their ideas similar to the ideas presented in class, or are they radically different from the understandings the instructor hopes to engender?
This paper presents the qualitative analysis of data from a large, long-term project that aims to analyze the science knowledge, and attitudes toward science, of undergraduate students enrolled in introductory astronomy courses at the University of Arizona (Impey, Buxner, Antonellis, Johnson, & King, 2011). The students were predominantly freshman and sophomore nonscience majors taking a science class as part of a general education requirement. The data described in this paper were collected via a written survey from nearly 10,000 students in the first week of introductory astronomy courses at the University of Arizona over the course of 20 years, from 1989 to 2009. The survey was adapted from the science literacy questions analyzed by the National Science Foundation as part of its biennial report to Congress (National Science Board, 1988, 2010). The qualitative data are derived from four open-ended questions designed to delve deeper into respondents' understandings of concepts than the forced-response questions.
In addition to the student data, in 2009 we collected data from 170 University of Arizona science faculty members, postdocs, and graduate students; three questions from this online survey inquired into the scientists' criteria for assessing students' responses to three of the questions from the student survey. The first of these questions posed to students is the quintessential question for assessing scientific literacy: What does it mean to study something scientifically? We also inquired into students' knowledge and scientists' assessments for two content-knowledge questions: (a) What is DNA? and (b) Briefly, define computer software. Students were also asked: What is radiation? Although this question was not included on the scientists' survey, we had a substantial literature base on the topic from which to draw.
Our analysis of trends in students' responses allowed us to create a picture of student thinking relative to these four topics. Thus, we can assess both the range and the incidence of these different ideas, which provides a wealth of information about the conceptions students may have constructed prior to entering the classroom and allows us to assess those conceptions against the scientists' criteria for success.
For the purposes of understanding how this group of students conceptualized the subjects represented by the four questions, we developed methods that went beyond classifying responses as more or less "correct" to provide a more fine-grained picture of students' thinking. Given the large number of responses (ranging from 5,700 for the radiation item to 7,800 for the DNA item), the method both captured the complexity of the variation in students' responses and systematically distilled the data into a manageable form for analysis. Each coding scheme (a) documented the frequency of common themes, (b) brought to the forefront unusual ideas that nevertheless elucidated patterns in student thinking, and (c) noted the scarcity of ideas that were less prevalent than anticipated or hoped for. The rich and varied landscape of responses to the open-ended questions, and the challenge of comparing students' and scientists' responses, led us to decide early on to develop a coding scheme, driven by the data, so that we could draw inferences quantitatively. …
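The quantitative step described above, tallying how often each code in the scheme appears across responses, can be sketched as follows. This is a minimal illustration only: the code labels and data are invented for the example and are not taken from the study, which applied its own coding scheme to thousands of responses.

```python
from collections import Counter

# Hypothetical coded responses: each response has been assigned one or
# more codes from a scheme (code names here are invented examples).
coded_responses = [
    ["genetic_material", "heredity"],
    ["genetic_material"],
    ["molecule"],
    ["genetic_material", "molecule"],
]

# Tally how often each code appears across all responses.
counts = Counter(code for codes in coded_responses for code in codes)

# Express each code's incidence as the fraction of responses mentioning it.
n = len(coded_responses)
incidence = {code: count / n for code, count in counts.items()}

# Report codes from most to least frequent, exposing both common themes
# and the scarcity of ideas that appear rarely.
for code, frac in sorted(incidence.items(), key=lambda kv: -kv[1]):
    print(f"{code}: {frac:.0%}")
```

A tabulation like this supports exactly the kind of inference the paragraph describes: common themes surface at the top of the list, while codes with very low incidence flag ideas that were less prevalent than anticipated.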