During the last decade, business school students have seen information technology integrated into all aspects of their curricula at an ever-accelerating pace. Virtually all business schools now ask that their students meet certain computer literacy requirements before graduation, and these requirements often serve as prerequisites for advanced courses in students' majors, such as accounting, marketing, management science, and of course, information systems. There appears to be a strong belief, shared among instructional faculty and prospective employers, that a higher level of computer literacy can lead to enhanced student academic performance, increased employment opportunities, and perhaps future success on the job (Jaderstrom, 1995; Tanyel, Mitchell, & McAlum, 1999; Trauth, Farwell, & Lee, 1993; Zhao, Ray, Dye, & David, 1998). Anecdotal evidence also seems to support such conventional wisdom: better computer skills should lead to more productive use of the technology, which in turn should lead to improved academic and job performance.
Surprisingly, there has been little formal research aimed at evaluating the effectiveness of the computer literacy requirement within academic settings. We do not know, for example, whether students who have satisfied the requirement necessarily perform better than those who have not yet met it. Students may also question the validity of the requirement as a prerequisite for other courses. The problem appears to be twofold. First, there is no universal definition of what constitutes computer literacy (Jones & Pearson, 1996). As a result, we design our evaluative criteria based largely on individual judgments and group consensus. Second, we do not fully understand the process by which students' technology skills influence their academic performance and, ultimately, their job performance. Consequently, it is difficult to determine which specific performance indicators are most closely linked to an individual student's level of technology skills.
In this study, we empirically examine the correlation between students' level of computer literacy and their performance in an introductory information systems course. We see this research as a first step in a series of studies designed to explore the predictive validity of the computer literacy requirement.
COMPUTER LITERACY AND PERFORMANCE
"Computer literacy" is a commonly used term in the business world, but it is not precisely defined. Computer literacy, in general, is being knowledgeable about the computer and its applications (Rochester & Rochester, 1991). Such knowledge appears to have two dimensions: conceptual, and operational (Winter, Chudoba, & Gutek,1997). The conceptual dimension includes an understanding of the inner workings of a computer or general computer terminology. Without such knowledge a user would find it difficult to figure out any system problems, or to learn to adapt quickly to new systems or software. The operational dimension refers to the necessary skills a user acquires, through training and practice, in order to operate specific systems to complete specific tasks.
While prior research has not evaluated the performance impact of computer literacy empirically, there is evidence that such an impact is likely to be task-dependent (Goodhue & Thompson, 1995; Longstreet & Sorant, 1985; Rhodes, 1985; Thompson, Higgins, & Howell, 1994). For example, if we considered a student to be highly computer literate because s/he demonstrated a high level of proficiency in using a word processor or a spreadsheet program, we would also expect the student to perform well on tasks involving those programs. We could not predict, however, how the student would perform on tasks involving a database program if s/he had not received training in database software. …