A Comparison of Case Study and Traditional Teaching Methods for Improvement of Oral Communication and Critical-Thinking Skills
Lynnette Noblitt, Diane E. Vance, and Michelle L. DePoy Smith, Journal of College Science Teaching
Scientists must be able to communicate scientific and technical issues to a variety of professional and lay audiences in both oral and written form. In today's collaborative workplace, scientists are frequently called on to work in teams that include a variety of nonscientific participants. Scientific researchers are continually required to justify research and development needs to funding entities. Promotion within corporate, academic, and other entities inevitably requires the ability to communicate outside the scientific community. Furthermore, scientists are increasingly called on to explain scientific research and its broader implications to various nonscientific communities to inform the general public and even influence the government.
Undergraduate science curricula often focus primarily on problem solving, computation, and scientific concepts. Such a skill set, however, is not broad enough to encompass all the tasks and challenges science students will encounter in their careers. Science curricula that give students significant opportunities to improve their communication skills therefore perform a valuable service. Development of such communication skills must be specifically focused on conveying scientific concepts to both scientific and nonscientific audiences. Students must understand the importance and relevance of communicating scientific concepts to a nonscientific audience and should be encouraged to identify possible hurdles that such an audience will encounter.
Purpose and background of study
The purpose of this study is to compare two methods of teaching oral communication and critical-thinking skills: a traditional paper presentation method and a case study method. The course is part of the curriculum of the Forensic Science Program at Eastern Kentucky University. This program was established in 1974 and is one of only 16 undergraduate forensic science programs in the United States accredited by the Forensic Science Education Programs Accreditation Commission. The forensic science program has a strong chemistry and molecular biology focus, with additional skills specific to forensic science added to the curriculum. The program has always recognized the need for forensic scientists to be both scientifically competent and able to effectively communicate difficult technical concepts to the lay persons who make up juries. Thus, a course entitled Expert Witness Testimony has been a part of the program since its inception.
Students take the Expert Witness Testimony course in the second semester of the junior year. This semester is a "communication intensive" semester in the program. Students give technical oral presentations in two forensic science classes to an audience of their peers, as well as three speaking sessions during the Expert Witness Testimony course.
The Expert Witness Testimony course is intended to familiarize students with courtroom procedure and the role of the expert witness. A more important goal, however, is to have the students assimilate and apply knowledge gained from previous forensic science courses and to communicate results, concepts, and conclusions at a level that would be understandable to a jury. The goals of this course are directly related to the Eastern Kentucky University Quality Enhancement Plan, which strives to "develop informed, critical, and creative thinkers who communicate effectively."
The case study instructional method is a well-established instructional tool with extensive literature (Camill 2006; Gallucci 2006; Herreid 2005, 2006). In the case study approach, students are provided with true or realistic "stories" that form the basis for study. This approach has been used most often in law, medicine, and business schools, but has become more common in college-level science courses in recent years. The use of the case study approach is an excellent fit for the Expert Witness Testimony course, which combines law and science.
Although there is a great deal of literature about the case study approach and examples of case studies, there is much less written about the assessment of the effectiveness of case study teaching (Lundeberg and Yadav 2006a, 2006b). A national survey of faculty perceptions provided evidence that case-based instruction is perceived by faculty to be effective in improving student critical thinking and ability to grasp the "big picture" (Yadav et al. 2007). A meta-analysis of the effects of problem-based learning on student performance indicated that the case study method had little effect on accumulated knowledge but improved students' application of knowledge and higher-order thinking (Dochy et al. 2003).
There is very little published work on the use of case studies to improve students' oral communication skills. Previous studies have noted that faculty perceive that the case study method does strengthen student communication skills. However, these observations are intertwined with faculty perception of the overall level of student class participation, not specific components of basic oral communication skills. This study attempts to clarify and confirm these findings through focusing on defined student learning outcomes related to oral communication that are measured using appropriately designed instruments.
The study compares the oral communication performance of students in the course taught using two distinct formats: case study and noncase study. In both course formats, students were given virtually identical lectures on legal concepts relating to expert witness testimony and oral presentation skills. In the noncase format, students selected a research paper from a group of about 20 published, peer-reviewed papers that instructors had selected so that they would be appropriate for the students. The papers represented a variety of topics in forensic science that students had been exposed to in course work. Instructors then asked students questions about the research and calculations described in the paper and basic background scientific concepts. Students were required to answer so that a layperson could understand their conclusions. No further context was provided to the students. The students had about three weeks to prepare for their first testimony session.
In the case study format, instructors assigned students an expert role to play in a mock trial simulation/case study. The mock trial case study was largely based on a case the American Mock Trial Association (AMTA) used during the 2006-2007 competition year (used in this course with permission from AMTA). Instructors gave all students an identical case packet that contained materials such as court filings and witness affidavits that related to the case. Whereas the original AMTA case focused only minimally on scientific information, instructors adapted the case packet to include raw data related to the students' scientific background. Five different data sets were prepared (DNA, bullet comparison, solid-dosage drug identification, blood alcohol determination, and gunshot residue analysis). Students selected the topic of their choice. To prepare for class, students had to perform calculations on the given raw data and integrate their findings with other facts in the case. The given raw data was designed to force students to make decisions regarding the types of calculations to perform and to come to scientifically sound, defensible conclusions based on the data. Two sessions on different days were needed to accommodate the testimonies of all students.
Students also watched videotaped performances of other students playing roles of attorneys and other witnesses, including a series of eyewitnesses and a forensic science supervisor. Students then "testified" as expert witnesses who worked for the forensic science supervisor. Instructors, acting as attorneys, asked the students a series of questions related to the scientific theory behind the data analysis, the approach used to analyze the data, and ultimate conclusions in the case. Once again students were required to answer the questions so that a layperson could understand their conclusions.
The two student groups included in this study shared many similarities, and great care was taken to control the study populations. All students involved in the study were second-semester juniors in the Eastern Kentucky University Forensic Science Program. As required in the program, the students had completed a core of approximately 40 credit hours of natural science courses, along with 10-15 hours of upper-division forensic science courses. All students had also completed a course in oral communication as part of the Eastern Kentucky University General Education Program.
The treatment of the students in each class format was also carefully controlled. Both groups of students were presented with the same series of lectures and in-class activities on legal concepts, cases, and procedures. The students were given the same information on communication goals and skills. The textbook and other reading assignments were identical. Exams and written assignments for both student groups were very similar and focused on identical substantive information and higher-order thinking skills.
The course instructors knew all of the students, so it was not possible to rate the students in a completely blind manner (i.e., without any knowledge of whether or not the students were in the case or noncase class). The instructors attempted to rate the students as objectively as possible, but this is a potential source of bias in the study. A total of 56 students were included in the study.
Rubric instrument
To assess student performance of oral communication and critical-thinking skills in the Expert Witness Testimony course, instructors had to design an appropriate evaluation tool. The instructors chose to develop a rubric that would encompass the many oral communication and critical-thinking skills that students were expected to demonstrate during oral performances. The validity and ultimate reliability of the rubric were carefully monitored throughout the design process.
The rubric was an analytic, task-specific scoring rubric with 10 factors for evaluation. Each of the 10 factors had four performance levels for the oral presentations. The factors were based on the oral communication competencies the National Communication Association developed for college students (Morreale, Rubin, and Jones 1998). Instructors adapted these competencies to encompass additional substantive goals for the course, including fulfilling legal requirements for expert witnesses and appropriately demonstrating mastery of scientific and technical knowledge.
Six of the 10 factors in the rubric were selected for inclusion in this study. These factors were identified as observable on videotaped performances, reliably measurable by instructors, and indicative of the oral communication and related critical-thinking goals of the course (Table 1). The other four factors were not evaluated, either because data were not available or because the factors were not relevant or observable in the noncase study format.
The first factor included in this study was the students' ability to effectively research information for message production. Students were expected not only to review data that was within their area of technical expertise, but also to conduct sufficient background research to explain the data and related concepts to a layperson and to answer specific instructor questions. Students were considered to have conducted effective research if they could answer all questions on the given data correctly, reach scientifically defensible conclusions, and further explain concepts using appropriate vocabulary and imagery so that laypersons could understand the discussion.
The second factor studied was the students' ability to effectively organize information for message production. Although instructors asked questions in an order of their own choosing, students were responsible for effectively organizing their responses so that laypeople could understand the content and consequences of the testimony. Students were considered to have effectively organized information if their oral responses and related visual aids enabled laypeople to follow the testimony and reach appropriate conclusions based on the testimony.
The third factor studied was the students' ability to effectively integrate information required for message production. Students were responsible for reaching conclusions about the given data and for integrating reasons for the conclusions along with suggestions for further action. Students were encouraged to focus on "storytelling" and other techniques to simplify and clarify the overall message. Students were considered to have effectively integrated the material and arguments if they could successfully convey to a layperson the interpretation of the data and the conclusions.
The fourth factor studied was the students' ability to adapt their communication style to the appropriate context. In both teaching methods, students were asked to explain scientific and technical information to a lay audience and lead them to a specific conclusion. Students were considered to have effectively adapted their communication style to the lay audience if they used appropriate nonscientific vocabulary or carefully explained all scientific terms, described scientific concepts in a concrete and accessible fashion, and incorporated appropriate visual aids and nonverbal cues.
The fifth factor studied was the students' ability to implement appropriate verbal delivery skills. The students were required to use correct grammar, pronunciation, and word choice. Students were further encouraged to use inflection, repetition, and varied speaking speed to engage the lay audience and effectively convey the message. Students were considered to have effectively implemented such verbal skills if they used correct grammar and pronunciation while using other voice modulation and speaking skills in a manner that assisted in message production while not being distracting.
The final factor studied was the students' ability to implement appropriate nonverbal delivery skills. Students were required to use nonverbal communication such as hand gestures, movement, and facial expressions to assist in message production. Students were considered to have effectively implemented such skills if their nonverbal delivery resulted in highlighting, emphasizing, or otherwise assisting in message production while not being distracting or detrimental to the overall presentation.
Reliability of rubric scoring was established using a variety of methods discussed in the assessment literature (Moskal and Leydens 2000; Moskal 2003; Zimmaro 2004). Prior to scoring, instructors chose individual performances to serve as "anchor" performances that would guide subsequent study scoring. Anchor performances were chosen for each of the four performance levels of each factor assessed. The evaluators carefully discussed each anchor performance and identified the general characteristics that defined that score for the specific factor. Each student performance had been videotaped in class. Two evaluators reviewed the videotapes and rated each student performance separately on the basis of the anchor performances. After scores were assigned, the evaluators compared scores to confirm interrater reliability. More than 95% of the scores were identical or only a single scoring level apart. Consequently, the total of the two evaluators' direct and deposition average scores was used for the statistical analyses (yielding possible scores from 2 to 8). This approach simplified the model by removing the dependency between deposition and direct scores for each student and produced a response variable that was more continuous in nature.
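As a toy illustration of the two ideas above — the "identical or one level apart" agreement check and the construction of a combined two-rater score on a 2-8 scale — the following sketch uses entirely hypothetical rater scores (none of these numbers come from the study, and it simplifies by assuming a single set of ratings rather than separate direct and deposition averages):

```python
# Hypothetical scores from two evaluators on the 1-4 rubric scale
# for six student performances; values are invented for illustration.
rater_a = [4, 3, 3, 2, 4, 3]
rater_b = [4, 3, 2, 2, 4, 4]

# Interrater agreement: fraction of score pairs that are identical
# or only a single scoring level apart.
within_one = sum(abs(a - b) <= 1 for a, b in zip(rater_a, rater_b)) / len(rater_a)

# Combined score per performance (sum of the two raters' scores),
# yielding the 2-8 scale used as the response variable in the analysis.
combined = [a + b for a, b in zip(rater_a, rater_b)]
```

Summing the two raters' scores both removes the within-student dependency between the separate ratings and produces a finer-grained (more nearly continuous) response variable.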
The data analysis for this paper was generated using SAS software, Version 9.1 of the SAS System for Windows (SAS Institute 2002-2003). The SAS MIXED procedure was used to analyze the following model:

y_ijk = μ + α_i + τ_j + (ατ)_ij + e_ijk

where y_ijk is the score of the kth student in the ith teaching-method group on the jth rubric factor, α_i is the main effect of the teaching-method group, τ_j is the main effect of the jth rubric factor, (ατ)_ij is the Group × Rubric interaction, and the e_ijk are the residual errors, assumed uncorrelated for observations on different subjects but allowed some variance-covariance structure within subjects. Using criteria such as Akaike's information criterion and Schwarz's Bayesian criterion, the compound symmetry variance-covariance structure was chosen for the analysis. The Satterthwaite approximation for the denominator degrees of freedom was used for the F and t statistics.
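The original analysis was run in SAS PROC MIXED. As an illustrative sketch only, a roughly analogous model — a random intercept per student, which induces a compound-symmetry covariance within subjects, with fixed effects for group, rubric factor, and their interaction — can be fit in Python with statsmodels. The data below are synthetic, generated merely to mimic the design (56 students, two groups, six rubric factors, combined scores near the 2-8 range); none of the values come from the study.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

# Synthetic data loosely mimicking the study design: 28 students per
# teaching-method group, each scored on 6 rubric factors. Group means
# are invented for illustration, not taken from the study.
rng = np.random.default_rng(0)
rows = []
for group, base in [("case", 5.2), ("noncase", 3.7)]:
    for s in range(28):
        student_effect = rng.normal(0, 0.4)  # per-student random intercept
        for r in range(6):
            rows.append({
                "student": f"{group}_{s}",
                "group": group,
                "rubric": f"f{r + 1}",
                "score": base + student_effect + rng.normal(0, 0.5),
            })
df = pd.DataFrame(rows)

# Mixed model: fixed effects for group, rubric, and their interaction;
# a random intercept per student gives a compound-symmetry-like
# within-subject covariance structure.
model = smf.mixedlm("score ~ C(group) * C(rubric)", df, groups=df["student"])
result = model.fit()

case_mean = df.loc[df["group"] == "case", "score"].mean()
noncase_mean = df.loc[df["group"] == "noncase", "score"].mean()
```

With these synthetic data the fitted coefficient for the noncase group is negative, mirroring the study's finding that case-method scores exceeded noncase scores; the sketch is not a reproduction of the published analysis.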
The overall model, both main effect terms, and the interaction term were all statistically significant (P < .0001). Given the statistically significant interaction term, the two teaching methods for each rubric factor were compared. These results are presented in Figure 1. For each factor, the average case teaching method score was significantly higher than the average noncase teaching method score.
The rubric scores clearly indicate that students' critical-thinking and communication skills improved greatly when using the case study method rather than the paper presentation method. Although not all rubric factors improved to the same extent, students demonstrated improvement in all areas. Such improvement is particularly impressive in light of the greater difficulty that was presented to students in the case method.
The first rubric factor studied, the students' ability to effectively research information for message production, showed modest improvement when implementing the case method in lieu of the traditional paper presentation. Although this factor was the least improved of all those studied, this is perhaps unsurprising, because the research students were required to conduct for the case study method was broader and more complex than for the traditional paper presentation. In the paper presentation, students were given data for which calculations had already been performed and peer reviewed. The students only needed to understand what decisions were made when performing calculations and why. In the case method, however, students had to decide what calculations to perform, formulate their own conclusions, and gather resources to support those conclusions. Despite this more challenging task, students were more successful in identifying and conducting the required research.
The second rubric factor, the students' ability to effectively organize information for message production, showed some improvement. Although the paper presentation assignment provided students with a clear outline of the material that was to be covered, students struggled to organize material from the paper into an oral presentation that was accessible to a lay audience. Although students were taught how to design visual aids in the paper presentation assignment, they often merely copied figures from the paper and did not use these figures in a manner that effectively organized the material. In the case method, students were required to answer questions about their own calculations that were based on their own research and problem-solving process. It appears that such deeper thought and understanding of the material enabled students to better master the material and organize it in a manner that was accessible to laypersons.
The third factor, the students' ability to effectively integrate information required for message production, was greatly improved using the case study method. Although both groups of students were taught a series of techniques to integrate information into a coherent accessible message, only those students in the case method were able to successfully implement these techniques. Students using the case study method were able to reach conclusions about the given data and integrate reasons for the conclusions with suggestions for further action that were understandable to a lay audience. Students in the paper presentation method struggled to suggest anything not specifically mentioned in the paper. It appears that the paper may have stifled the students' ability to think critically and create additional conclusions. The case study, however, seems to have supported such student efforts.
The fourth factor, the students' ability to adapt their communication to the appropriate context, was also greatly improved with the case study method. Although in both methods, students were given scientific information in a form accessible to a scientifically educated audience, only students in the case method were successful in adapting this information to a lay audience. It appears that students in the paper presentation felt bound to the text of the paper and were unable to adapt to the new audience. Such adaptation requires students to think deeply and critically about the material and recast the information for the new audience. The case study seems to have provided the support and encouragement students need for adaptation.
The fifth factor, the students' verbal delivery of the material, also improved slightly using the case study method. Although students in both the paper presentation and case study methods struggled with word choice, grammar, and pronunciation, the students in the case study method showed some improvement over the paper presentation method. The reasons for such improvement are not immediately clear. Instructors believe that the case study may have emphasized the importance of appropriate grammar and pronunciation to students through the formal atmosphere provided by the courtroom setting. The improved word choice may also be a direct result of the improved adaptation that students demonstrated in the case method format.
The final factor, the students' nonverbal delivery of the material, also improved using the case method. Although both groups of students were taught the same material on the importance of nonverbal communication such as gesture and eye contact, students in the case study method were better able to implement appropriate nonverbal communication. Once again the reasons for this improvement are not immediately obvious. Instructors believe that this improvement may be largely due to many students choosing to use visual aids when participating in the case study method. When using these visual aids, students often chose to stand and move around the classroom when testifying. This allowed them to engage in more natural nonverbal communication than those who remained seated throughout the presentation. Although students in the paper presentation method were also encouraged to use visual aids and to leave their seats when appropriate, very few chose to do so.
In summary, the case study method promoted improved critical-thinking and communication skills for all rubric factors investigated. Particular improvement was noted in the students' ability to adapt scientific information to a lay audience and to integrate the scientific information into a coherent message. Instructors found such improved performance particularly noteworthy in light of the greater difficulty of the case study assignment. This study suggests that students think more deeply and critically about scientific and technical information when provided with real-life context surrounding the material. This additional context can also promote improved communication skills, as well as other professional skills that were not directly studied in this research.
Camill, P. 2006. Case studies add value to a diverse teaching portfolio in science courses. Journal of College Science Teaching 36 (2): 31-37.
Dochy, F., M. Segers, P. van den Bossche, and D. Gijbels. 2003. Effects of problem-based learning: A meta-analysis. Learning and Instruction 13 (5): 533-568.
Gallucci, K. 2006. Learning concepts with cases. Journal of College Science Teaching 36 (2): 16-20.
Herreid, C.F. 2005. Using case studies to teach science. www.actionbioscience.org/education/herreid.html.
Herreid, C.F. 2006. Start with a story: The case study method of teaching college science. Washington, DC: NSTA Press.
Lundeberg, M.A., and A. Yadav. 2006a. Assessment of case study teaching: Where do we go from here? Part I. Journal of College Science Teaching 35 (5): 10-13.
Lundeberg, M.A., and A. Yadav. 2006b. Assessment of case study teaching: Where do we go from here? Part II. Journal of College Science Teaching 35 (6): 8-13.
Morreale, S., B. Rubin, and E. Jones. 1998. Speaking and listening competencies for college students. Washington, DC: National Communication Association. www.natcom.org/nca/files/ccLibraryFiles/FILENAME/000000000085/College%20Competencies.pdf.
Moskal, B.M. 2003. Developing classroom performance assessments and scoring rubrics: Part II. ERIC Digests (ED481715). www.eric.ed.gov/ERICDocs/data/ericdocs2sql/content_storage_01/0000019b/80/1b/7d/ee.pdf.
Moskal, B.M., and J.A. Leydens. 2000. Scoring rubric development: Validity and reliability. Practical Assessment, Research, and Evaluation 7 (10). http://pareonline.net/getvn.asp?v=7&n=10.
SAS Institute. 2002-2003. Version 9.1 of the SAS System for Windows. Cary, NC: SAS Institute.
Yadav, A., M. Lundeberg, M. DeSchryver, K. Kirdin, N. A. Schiller, K. Maier, and C.F. Herreid. 2007. Teaching science with case studies: A national survey of faculty perceptions of the benefits and challenges of using cases. Journal of College Science Teaching 37 (1): 34-38.
Zimmaro, D.M. 2004. Developing grading rubrics. Austin: University of Texas at Austin, Instructional Innovation and Assessment. www.utexas.edu/academic/mec/research/pdf/rubricshandout.pdf.
Lynnette Noblitt is an associate professor in the Department of Government, Diane E. Vance (firstname.lastname@example.org) is an associate professor in the Department of Chemistry, and Michelle L. DePoy Smith is an assistant professor in the Department of Mathematics and Statistics, all at Eastern Kentucky University in Richmond.
TABLE 1. Scoring rubric for communication assignment. Performance levels: 4 = Accomplished (exceeds course objectives); 3 = Competent (meets course expectations); 2 = Incomplete (incomplete in meeting course objectives); 1 = Beginning (inadequate in meeting course objectives).

Critical thinking: effectively researches information
4: Message is fully supported and challenges listener(s).
3: Message is supported with relevant information.
2: Message is supported by information but may at times be inaccurate.
1: Message is not supported with accurate, relevant, or recent information.

Critical thinking: effectively organizes information
4: Message is logical, easy to follow, and compelling.
3: Message is logical and easy to follow.
2: Message structure is somewhat logical, but listener(s) may struggle to follow.
1: Message is not well organized; is difficult to follow.

Critical thinking: effectively integrates information
4: Orally cites research that adds insights in the message.
3: Orally cites research in the message.
2: Uses research to support message sometimes.
1: Fails to use "outside" support and/or fails to acknowledge supporting information.

Critical thinking: adapts oral communication styles to contexts
4: Strategically uses communication for context.
3: Tailors language and nonverbal cues to the listener(s).
2: Understands needs of audience but does not consistently adapt.
1: Uses inappropriate communication so that listener(s) are "distanced."

Oral communication: successfully implements verbal delivery
4: Language is free of serious errors, appropriate, and unusually interesting.
3: Language is appropriate and free of serious errors.
2: Language is usually appropriate to context; may contain some errors.
1: Inappropriate, unethical, and/or potentially offensive language is used.

Oral communication: successfully implements nonverbal delivery
4: Nonverbal cues are strategically used to emphasize the message.
3: Nonverbal cues support and do not distract from message.
2: Nonverbal cues are sometimes incongruent or distracting to message.
1: Speaker is largely unaware of the use or importance of nonverbal cues.

FIGURE 1. Comparison of mean scores for case and noncase groups for each rubric factor (**p < .01; ***p < .001). Note: Table made from bar graph.

Rubric factor    Case study    Noncase study
1                5.3           4.5
2                5.4           3.8
3                5.4           3.2
4                5.4           3.0
5                5.2           3.9
6                4.8           3.6
Publication information: "A Comparison of Case Study and Traditional Teaching Methods for Improvement of Oral Communication and Critical-Thinking Skills," by Lynnette Noblitt, Diane E. Vance, and Michelle L. DePoy Smith. Journal of College Science Teaching 39 (5), May-June 2010, p. 26. © 2009 National Science Teachers Association.