Academic journal article Journal of STEM Education : Innovations and Research

Preliminary Validation of a Critical Thinking Rubric for Fluid Mechanics and Heat Transfer

Article excerpt

Introduction

One key task in developing or evaluating an educational intervention is identifying or developing assessment tools that can measure the improvements in student learning the intervention is intended to produce. To this end, the authors have developed a rubric to measure critical thinking in the specific realm of chemical engineering fluid mechanics and heat transfer design and problem solving. We are certain that the rubric measures qualities that we, the educators who developed it, value when assigning grades and assessing competence. However, given that the purpose of undergraduate engineering education is to prepare students for their future careers, it is important to determine whether the skills and thinking characterized by the rubric are similarly of interest to our students' potential future industrial employers.

Specific need for validating our rubric

Our rubric is based on a critical thinking rubric (CTR) developed by an assessment group at Washington State University (WSU). The original rubric is broad and multidisciplinary, has been used throughout WSU (Brown, 2004), and has been a major assessment tool for an NSF grant at another institution (Damron & High, 2008). This original rubric is based on formal analysis, which is rooted in classical rhetoric, and has concurrent validity with multiple definitions of critical thinking (Facione, 1990, 2000; Paul & Elder, 2005; Toulmin, 1979). Though we have modified the CTR in a way that maintains a logical connection to the construct validity of the original, and have performed the modification in a way that we believe builds content and expert validity for assessing critical thinking in our particular course, we are only meeting a portion of the broader need: we have not ensured that the measurement matters to the future employers of the students. Specifically, this is the sub-category of content validity known as content relevance (Chatterji, 2003); some frameworks also refer to this as criterion-related validity (Moskal & Leydens, 2000).

In addition to relevance, we need to examine issues that relate to the usefulness of the rubric, both as a teaching tool (Wiggins, 1998; Wiggins & McTighe, 2005) and as a tool that can be distributed and used beyond the pool of individuals who helped with its development. Are there systematic biases between different groups of raters? Is the measurement credible to all stakeholders? Is it both honest and fair?

To begin addressing these questions, the authors undertook a study using our CTR as a major assessment component in a project-based, junior-level chemical engineering course on fluid mechanics and heat transfer. The rubric was used for instructor assessments of student work at various milestones during the semester. To help students understand the use and meaning of the rubric, they also used it to assess a student project from the previous semester. The final project reports were rated by the instructors for grades, by the students as their final assignment, and by a group of alumni who were willing to take part.

In this article we begin to examine the content relevance of our rubric by comparing how the different groups rated the final project reports. These results, together with supporting survey responses, provide some insight into probable biases between groups, the credibility of the rubric to different groups, and the portability of the rubric to new users.

Background

Why Critical Thinking?

For at least the past 20 years academics have been claiming that engineers need more training in what are frequently referred to as "soft skills" (Adams et al., 2011; Dickson & Grant, 2007; Felder, Brent, & Prince, 2011; Litzinger, Lattuca, Hadgraft, & Newstetter, 2011; Varma, 2003). Engineering educators generally think of soft skills as the non-technical items in ABET criterion 3 (ABET, 2003), which include items such as "an understanding of professional and ethical responsibility" and "the broad education necessary to understand the impact of engineering solutions in a global, economic, environmental and societal context." …
