Pedagogical Changes in the Delivery of the First-Course in Computer Science: Problem Solving, Then Programming
Deek, Fadi P., Kimmel, Howard, McHugh, James A., Journal of Engineering Education
A teaching reform initiative, started in the spring semester of 1993 at the New Jersey Institute of Technology (NJIT), is described. The program seeks to increase student success in a freshman computer science course, and ultimately in the entire NJIT curriculum. The traditional teaching method, in which the teacher presided over a lecture session supplying facts and figures, providing ideas, and presenting problems and their solutions, has been altered. The new learning environment described in this paper aims to create an all-inclusive setting that invites students to make the transformation from passive learners to active participants. Rather than merely listening to lectures, students formulate problems and devise their own approaches to answering questions and finding solutions. Such a teaching/learning methodology requires instructional redesign and role redefinition: the presentation of class material is reordered, and teacher and students cross each other's traditional boundaries to become a more cohesive entity.
I. INTRODUCTION

Several studies indicate that the number of students in the United States majoring in mathematics, science, and engineering drops by 40% between the freshman and senior years. The US has the lowest proportion of university degrees in science and engineering among market economy countries, and the number of such degrees awarded continues to decline. A large number of students make many of the decisions that affect their educational careers during their first year. The decision to reject mathematics, science, or engineering study is often based on the complexity of the curriculum and the lack of stimulation.
Students can no longer be expected to learn with little regard for the world in which the subjects are actually applied. A modern curriculum and innovative pedagogy are needed that foster intellectual independence (through group learning and problem solving) as opposed to memorization of lecture and textbook materials. Textbooks must be correspondingly revised to reflect these necessary changes in curriculum and pedagogy. Students must master disciplines by a deep and lasting internalization of the subject matter.3,4
Our premise is that the meaning of a subject cannot simply be conveyed passively to the student by the teacher. Meaning must be constructed by the learner from experience. Learning is accomplished not merely by "scanning" information but by "processing" it, and by participation in the learning process.5
There is much truth in the old adage that "teachers teach the way they were taught."6 In fact, post-secondary teaching is a traditional profession in which instructors copy the behavior of their own professors. Unfortunately, traditional methods of teaching engineering, science, mathematics, and computer science, especially at the post-secondary level, remove instruction in these disciplines from the natural conditions of both practice and learning. Engineering, science, mathematics, and computer science are often taught as static bodies of knowledge; practice of these disciplines, on the other hand, reveals their fluidity and dynamism. There is a critical need to restructure the methodology for all subject areas. The subjects are taught as if composed solely of abstract and rarefied conceptualizations, rather than from their historical context, applications, or other concrete frameworks.
What is clear is that the method of teaching engineering, scientific, and mathematical disciplines must be fundamentally altered to provide students with a good understanding and appreciation of these subjects. Numerous studies, for example, suggest that content and pedagogy must be integrated for effective instruction and learning,7-9 and that teaching styles should be compatible with learning styles.10 Further, potentially effective teaching methodologies have been studied, including problem solving,11 the use of analogies and metaphors,12 anchored instruction,13 integration of technology,14,15 and others.16
A major concern in education is the relationship of achievement and persistence in different subject areas at all grade levels. In engineering and science, the relationship between prior knowledge, formal reasoning ability, and achievement has been a research topic of interest for some time.17 The introduction of effective strategies and methodologies can foster intellectual independence through student-centered and group learning and problem solving, while students master disciplines by a deep and lasting internalization of the subject matter.
Another concern of the higher education community is retention. Here, retention has two connotations. We talk about students staying in school to earn their degrees. There is also the concern that students retain the knowledge and skills they acquire so that these can be applied in later courses and in the real world. For example, improvement in problem-solving skills at the freshman level should improve students' grades in their current courses and prepare them to succeed in future courses as they complete the work towards their degrees. Instruments for measuring how well students retain the subject matter, by application of content in upper-level courses, are being sought.

II. EDUCATIONAL REFORM

There has been widespread agreement that undergraduate education, especially in engineering, science, mathematics, and computer science, is in need of major change. However, no uniform solution has yet been developed and agreed upon. Expert diagnoses and prescriptions for a necessary cure vary. Some experts assert that new teaching methods are needed to make these disciplines more challenging and interesting for students with an assortment of pre-college preparation levels.
Perhaps the first formal effort to study and reform the way undergraduates learn took place in mathematics. In 1978, the Mathematics Workshop Project (MWP) for calculus18 was created at the University of California, Berkeley, in response to concerns about the low achievement in mathematics among black undergraduates. The MWP was based upon the results of studies by Treisman, which suggested that study groups could provide an efficient vehicle for mastering the challenges of calculus. The basis for the success of this model is the intense discussion and debate among students around difficult problems that occurs in study groups. Using a workshop format, students first worked alone and then in groups of four or five, all of whom had been working on the same problems. The group work was meant to encourage students to communicate with each other about their efforts to develop solutions. The success of the Treisman model has been demonstrated by the significantly improved achievement of MWP students as compared to non-MWP students, at other institutions as well as at Berkeley.19 Further, participation in the workshops has also been associated with high retention rates and graduation rates. Other efforts have been documented in physics,20 chemistry,21 computer science,22,23 and engineering.24
III. CHANGES IN THE FIRST COURSE IN COMPUTER SCIENCE

The first course in computer science commonly covers programming and problem solving, and is taught in a lecture/recitation-laboratory format. In the lecture, language syntax and simple examples are presented to illustrate programming concepts. In the recitation-laboratory, the instructor presents examples, demonstrates the algorithmic solution, and applies the syntax presented in the lecture to the algorithm. This method and sequence follow the typical textbook approach to the subject; students and instructors do interact somewhat during class, but the instructor essentially presents the material. Students often have difficulties with syntax concepts, and thus their participation in the problem-solving process using the new material is hindered.
The alternative method is to introduce the problem in the lecture, engage the students in defining the statement of the problem, and allow the students to seek possible solutions independent of the programming language. Once the problem is solved, the language features necessary to implement the solution are presented. Finally, equipped with both the algorithmic solution, which the students develop, and the language syntax, the complete solution is translated into code and is then tested.

A. The Alternative Method
A sequence of well-defined activities makes up a complete class session. A comparison of the traditional and the alternative methods23 reveals two differences: (1) the "ordering" of the class activities, and (2) the fact that the class session is "problem driven."
The lecture begins where it would be expected to end: with the examination of a problem to be solved (programming textbooks do this last). With the introduction of the problem as the first activity of the lecture session, we allow the students to engage in defining and analyzing the question of the problem and to seek, discuss, and consider possible solutions with total independence of, and without any concern about, the programming language syntax. The students concentrate on formulating the problem, planning and designing the solution, and outlining the testing strategies. Once the algorithmic solution for the problem is constructed, the language syntax (i.e., control structures, data representation) necessary to carry the solution to its final stages is then presented. This is facilitated by what we refer to as the algorithmic walkthrough. Each line in the algorithm is visually scanned, by the students and the instructor, to identify new language constructs that are required for the solution but have not yet been introduced, and that need to be explained in order to have a complete mapping of the algorithm into code.
Next, the syntax presentation takes place with examples (programming textbooks do this first). Using both the algorithmic solution, which the students develop in collaboration with the instructor, and the language's syntax, now more meaningful and appreciated, the complete solution is implemented and tested.
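To make the reordering concrete, the following sketch illustrates how a class-built algorithm might be walked through and mapped to code. It is our illustration, not taken from the paper: the problem, the function name, and the choice of Python are assumptions (the original course predates Python's widespread use in CS1), but the sequence of activities mirrors the one described above.

```python
# Hypothetical illustration of the "algorithm first, syntax second" sequence.
# Problem (step 1): given a list of exam scores, find the average and count
# how many scores fall above it.
#
# Algorithmic solution (steps 2-4), as the class might record it:
#   1. Read the scores.                       -> needs: a list of values
#   2. Sum the scores; divide by their count. -> needs: loop, accumulator, division
#   3. Compare each score to the average;     -> needs: loop, selection (if)
#      count those above.
#   4. Report both results.                   -> needs: output statement
#
# The algorithmic walkthrough (step 5) marks the constructs on the right as
# the syntax to present (step 6); implementation (step 7) then maps each
# algorithm line to code.

def above_average_count(scores):
    total = 0
    for s in scores:          # repetition construct identified in the walkthrough
        total += s
    average = total / len(scores)
    count = 0
    for s in scores:
        if s > average:       # selection construct identified in the walkthrough
            count += 1
    return average, count

# Module-level testing (step 8): check the function with hand-verifiable
# data before the whole program is tested as a unit.
if __name__ == "__main__":
    avg, n = above_average_count([70, 80, 90, 100])
    print(avg, n)   # average is 85.0; two scores (90, 100) lie above it
```

Each line of the algorithm survives as a recognizable line of code, which is what the walkthrough is meant to guarantee: by the time syntax is presented, every construct already has a job to do in a solution the students own.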
This change in lecture mode makes it possible for the students to get involved in class discussions from the beginning and throughout the duration of the class session. Student-teacher and student-student interaction is promoted by postponing the introduction of new and often abstract material, such as language syntax, until the implementation phase, a later stage in the software life cycle. The activities of a typical class session using the alternative method are described below:
1) Presentation of the Problem: The instructor presents a problem that is designed specifically to take into consideration the use of new material as defined in the syllabus for that specific class session.
2) Problem Formulation: Students, with instructor acting as facilitator, formulate the problem. This requires the understanding of the question as well as the meaning of the problem's terminology, and the identification of its facts: goal, givens, unknowns, conditions, and constraints. Problem understanding requires the processing of information. The techniques of verbalization and visualization are used to create an initial understanding of the problem. For example, making a drawing, talking, or answering questions about the problem aid the task of problem understanding. Problem formulation evolves with the transformation of the given problem statement into a precisely formulated model. Developing a precise model of the problem is completed by the elicitation and organization of all relevant information and the elimination of irrelevant information.
3) Solution Planning: Students, with instructor acting as facilitator, plan the solution beginning with the development of an appropriate solution strategy. The students consider various alternatives to determine the course of action suited to achieve the goal of the problem, subdivide the goal into subgoals, and identify the tasks needed to accomplish each subgoal. The relevant information identified in the previous activity is related to the various subgoals and its role and meaning are defined. This enables the students to begin the process of carrying out the strategy to progress toward meeting each subgoal of the problem and eventually producing the complete solution.
4) Solution Design: Students, with instructor acting as facilitator, organize and refine the components of the solution strategy, and define specifications to be translated into code. There are two levels of design. The first is a high-level design, where a framework structure for a solution to the problem is produced, typically in visual or outline form. This involves the organization and sequencing of subgoals, the determination of whether the subgoals require further refinement, the establishment of relationships among the various solution components, and the association between data and subgoals. Subsequently, detailed design transforms subgoals into corresponding algorithmic specifications, and the solution logic is readied to be translated into programming language code.

5) Algorithmic Walkthrough: Students, with instructor acting as facilitator, prepare to map the algorithmic solution into programming language syntax by going over each line in the algorithm and pointing out the exact language construct required. Any construct that appears unclear, or that has not yet been used or previously covered, is marked as a subject of the next activity. A list of such references is maintained.
6) Presentation of Syntax: The only "instructor dominant" activity. This activity brings the class back to the beginning of the textbook chapter on the subject at hand. This part is driven by two objectives: to meet the stated goal for the lesson, and to take the solution to its final stages (writing the complete program).
7) Implementation: Students, with instructor acting as facilitator, proceed to fill in the gaps and produce a complete program that will run on the computer. This is done by translating the detailed design into programming language code that will be tested and can be executed to produce the result.
8) Testing: Students, with instructor acting as facilitator, perform some testing by hand in the planning and design phases to verify that reasonable logic is used in the algorithm. After the implementation phase, each module developed is tested with data for that module. Once we are satisfied that each module does what it is supposed to do, the whole program is tested as a unit. The implementation and testing tasks are evaluated in terms of solution quality and correctness.

IV. DISCUSSION

The instructional staff for the course consisted of a full-time lecturer and two teaching assistants (TAs). All sections received the same lecture coverage, testing, and programming assignments. It is important to note that the TAs who graded the students' work in all sections were not told of the details of the changes in course structure. Likewise, the instructor who prepared and delivered the class material, quizzes, assignments, and exams was not involved in evaluating student performance.
In the trial semester, the instructor taught four sections of the first-semester computer science course. One section received the alternative methodology, while a second section received the traditional approach and was used as the control section. Comparisons were made also with an honors section that received the alternative methodology.
The methodology was tested with six topics:

1. Program Development Cycle
2. Control Structures: Selection and Repetition
3. Modular Design
4. Lists
5. Searching and Sorting
6. Records and Files

A ten-question instrument was developed to survey the students for their beliefs and perceptions. A sample instrument for one of the topics is included in table 1. Due to the small sampling of students, no statistical significance could be attached to the findings. But the results do provide some implications for appropriate classroom learning environments and indications for future directions.
This brief discussion will only look at the first six questions of the instrument. These questions were selected because they represent current reform efforts in science, mathematics, and engineering across the K-16 grade levels. The basis for these efforts is that students learn and develop better understanding of concepts, resulting in long-term knowledge retention, in a student-centered learning environment. Three of the questions focus on students' beliefs about their understanding of the concept and the process, and on their confidence in their performance; in other words, the students' self-efficacy.25 The other three consider students' attitudes towards participation in the learning process. The six questions should provide some measure of how well the alternative methodology provides the appropriate learning environment. The remaining questions attempted to compare this course with the math or science courses that students were taking; the number of different courses made it difficult to obtain useful comparisons.
Figures 1 and 2 provide a comparison for the different classes for Questions 2 and 3, respectively. Question 2 dealt with student understanding of the basic concepts. The results indicate that students receiving the alternative methodology have a greater belief that they understand the concepts than those receiving the traditional mode. The differences between the two populations range from 0.04 (topic 2) to about 0.20 for four of the other topics. Student confidence on how well they will do on the corresponding quiz (Question 3) showed similar results, although perhaps not as pronounced for a couple of topics. For the topics on control structures and modular design, there seemed to be no discernible difference, whereas differences range from 0.15 for topic 6 to 0.33 for topic 2. More striking is the comparison of results for Questions 2 and 3, where for all classes, within each topic, students' beliefs in their understanding of the concepts were always higher than confidence in their performance on the corresponding quiz. For example, the difference between beliefs in understanding and confidence in performance was more than double (0.75 to 0.36) in the traditional classes for topic 1 and a difference of 93% (0.93 to 0.53) in the alternative classroom for the same topic. These results can be explained by recognizing that the students' responses on understanding reflect a different decision process than the expression of confidence, and would be consistent with the concept of self-efficacy of the students.25 Generally, students feel comfortable about expressing their understanding (i.e., decisions about current events), whereas, generally they feel uncomfortable about trying to predict their grade on an upcoming quiz (i.e., decisions about upcoming events). The present is perceived as a relative certainty whereas the future might be viewed, to varying degrees, as a gamble. 
Question 6 reflected on how much students believed that they were able to follow the analysis of the problem and design of the solution. Our hypothesis was that there should be a correspondence of the results for this question with the results for Question 2 on students' belief of their understanding of the concept (figure 3). The results do not necessarily validate the hypothesis, as the comparison of the students receiving the alternative methodology with those receiving the traditional approach shows inconclusive results. This could be due to an incorrect assumption on our part that Question 6 should reflect how well the students believe they understand the concept. However, we can view these results also from the perspective of the students' self-efficacy for learning,25 where the students can judge their capacity for learning to understand the concept, rather than the belief that they can successfully complete the task of solving the problem. In this situation, the ability to follow the analysis of the problem and design of the solution does not necessarily demonstrate an understanding of the concept. The real demonstration of understanding is application and retention.26 Thus being able to follow the analysis of the problem and design of the solution is not indicative of being able to retain and apply acquired knowledge.
The other three questions focused on student participation in class. The questions were designed to ascertain the perceived effects of greater student participation, student contribution to the implementation of the programming phase, and student confidence about participation in class. The hypothesis was that there would be a relationship between student participation and student performance. Thus the responses for questions 1 and 4 should be higher for the class receiving the alternative methodology than for the class receiving the traditional approach, and lower for the alternative approach on question 5. A comparison of the topics across the three questions shows mixed results. For example, for topic 1, the alternative approach class was 37% lower than the class receiving the traditional approach for question 5, but 18% lower than the traditional approach for question 1. There was no significant difference between the two groups for question 4 for this topic. The only correlation among the three questions occurred for topic 3, where the alternative approach was 24% lower for question 5, and 28% higher for question 4, than the class receiving the traditional approach. There was no difference in response for the two groups to question 1. Looking at the three questions (#1, #4, and #5) individually, question 1 shows a reverse of the expected trend in comparing the two groups, except for topic 3, where there is no difference. For question 4, topics 2, 3, 4, and 5 showed the expected higher responses for the alternative approach compared to the traditional approach. But for topic 1, there was no difference, and for topic 6, the responses for the alternative approach were 23% lower than the traditional approach. While the group receiving the alternative approach was expected to have lower numbers than the class receiving the traditional approach for question 5, the results were mixed.
It would appear that this instrument needs modification, since the students' responses were not consistent within the context of the three questions used in this study.
Students' overall performance on the major exams, the midterm and the final, is reflected in the final grades for the course for the students receiving the alternative methodology, as seen in figure 4. It is important to mention here that students receiving the traditional treatment and those receiving the alternative treatment were all given the same assignments, quizzes, and exams. High performance on major exams is reflective of retention of acquired knowledge, and of the ability to apply that knowledge through understanding of the fundamental concepts. Also, it would appear that self-efficacy is greater for those students receiving the alternative methodology, since it is recognized that self-efficacy, as well as ability, knowledge, and skill, influences achievement behavior. Overall, the results of this study were encouraging and suggest future directions. As figure 4 indicates, the distribution of final grades for the class receiving the alternative methodology is skewed towards the higher grades (71% of the students received a grade of "A," "B+," or "B"), whereas the class of students using the traditional approach received grades skewed towards the lower end of the scale (approximately 56% received a grade of "D," "F," an Incomplete, or a Withdrawal).

V. CONCLUDING REMARKS
This work was the precursor of an extensive research study of beginning students learning problem solving and programming, and of the tools available to help them. The research led to the development of an integrated environment for problem solving and program development, and a yearlong classroom evaluation involving a large student population.27 Following is a brief description of the research and initial results.
A prototype has been designed for an integrated environment supporting the problem solving and program development approach, starting with the initial activity of understanding the problem and continuing through program implementation and testing. This model, which combines the methodology and the supporting tools, takes into consideration the cognitive skills that must be gained by students and the tasks performed in problem solving and program development. The process and essential facilities to assist the student in learning these skills and accomplishing these tasks are provided.
This environment, which we call SOLVEIT (Specification Oriented Language in Visual Environment for Instruction Translation), encapsulates the functionality of the traditional programming environment with a workbench facility, providing the student with a host of utilities that can be used to aid in performing problem solving and program development tasks.
The system was integrated and evaluated in the classroom. Hypotheses and research questions were designed to assess whether the tools within the SOLVEIT environment aid the students in their search for solutions, producing better results, enhancing their perception, attitude, and motivation, and helping in the development of the skills and knowledge necessary for problem solving and program development. The hypotheses were designed to test the relationship between the various tools of the system and students' performance. The major assumption of the hypotheses was that students using SOLVEIT would perform better on problem solving and program development tasks than students not using the system. The research questions were designed to examine unforeseen effects not directly related to the system. The evaluation took place in the fall 1996 and spring 1997 semesters. Results from the fall 1996 semester suggest that students in the experimental group acquired a higher level of competence in both problem solving and program development skills than the control group. While the experimental group's scores on quizzes and programming assignments were statistically similar to the control group's, the experimental group's midterm and final exam scores showed statistically significant improvements; indeed, some of the differences were dramatic. Initial results from the spring 1997 semester are comparable to the fall's, but with more significantly positive outcomes for the subjective research questions dealing with perception, attitude, and motivation.
1. National Science Board, Science and Engineering Indicators, US Government Printing Office, Washington, DC, 1991.
2. National Science Foundation, Science and Engineering Degrees, by Race/Ethnicity of Recipients: 1977-90, US Government Printing Office, Washington, DC, 1992.
3. Halloun, I.A., and D. Hestenes, "The Initial Knowledge State of College Physics Students," American Journal of Physics, vol. 53, 1985, pp. 1043-1055.
4. McDermott, L.C., "How We Teach and How Students Learn - A Mismatch?" American Journal of Physics, vol. 61, 1993, pp. 295-298.
5. Mayer, R.E., "Learners as Information Processors: Legacies and Limitations of Educational Psychology's Second Metaphor," Educational Psychologist, vol. 31, 1996, pp. 151-161.
6. Collins, A., and J. Hastings, "Teaching Teachers: Practice What You Teach," Science and Children, vol. 27, 1990, pp. 38-39.
7. Doyle, W., "Content Representations in Teachers' Definitions of Academic Work," Journal of Curriculum Studies, vol. 18, 1986, pp. 365-379.
8. Shulman, L.S., "Those Who Understand: Knowledge Growth in Teaching," Educational Researcher, vol. 15, 1986, pp. 4-14.
9. Grossman, P., "The Research Agenda," in N. Smith (Chair), "Pedagogical Content Knowledge: Usefully Wrong?" Annual Meeting of the American Educational Research Association, San Francisco, CA, 1992.
10. Felder, R.M., and L.K. Silverman, "Learning and Teaching Styles in Engineering Education," Engineering Education, vol. 78, 1988, pp. 674-681.
11. Polya, G., How to Solve It (2nd edition), Princeton University Press, Princeton, NJ, 1957.
12. Clement, J., and D. Brown, "Fostering Conceptual Change in Mechanics," Annual Meeting of the American Educational Research Association, Chicago, IL, 1991.
13. Bransford, J., R. Sherwood, N.J. Vye, and J. Rieser, "Teaching Thinking and Problem Solving: Suggestions From Research," American Psychologist, vol. 41, 1986, pp. 1078-1089.
14. Hawkins, J., and R.D. Pea, "Tools For Bridging the Cultures of Everyday and Scientific Thinking," Journal of Research in Science Teaching, vol. 24, 1987, pp. 291-307.
15. Schoenfeld, A., "Mathematics, Technology, and Higher-Order Thinking," in R. Nickerson (ed.), Technology in Education in 2020: Thinking About the Not Too Distant Future, Erlbaum, Hillsdale, NJ, 1988.
16. Brown, S., and M. Walter, The Art of Problem Posing, Erlbaum, Hillsdale, NJ, 1983.
17. Baird, W.E., and G.D. Borich, "Validity Considerations for Research on Integrated-Science Process Skills and Formal Reasoning Ability," Science Education, vol. 71, 1987, pp. 259-269.
18. Treisman, U., "Studying Students Studying Calculus: A Look at the Lives of Minority Mathematics Students in College," The College Mathematics Journal, vol. 23, 1992, pp. 362-372.
19. Bonsangue, M.V., and D.E. Drew, "Long-term Effectiveness of the Calculus Workshop Model," Chautauqua Short Course: Increasing Minority Participation in Math-Based Disciplines, Long Beach, NJ, 1992.
20. van Heuvelen, A., "Learning to Think Like a Physicist: A Review of Research-Based Instructional Strategies," American Journal of Physics, vol. 59, 1991, pp. 891-897.
21. Bodner, G.M., "Why Changing the Curriculum May Not Be Enough," Journal of Chemical Education, vol. 69, 1992, pp. 186-190.
22. Deek, F.P., and H. Kimmel, "Changing the Students' Role: From Passive Listeners to Active Participants," Proceedings, 1993 Frontiers in Education Conference, IEEE/ASEE, 1993, pp. 321-325.
23. Kimmel, H., and F.P. Deek, "Teaching for Understanding: Redesigning Introductory Courses to Focus on the Learner," Proceedings, 1994 Frontiers in Education Conference, IEEE/ASEE, 1994, pp. 336-340.
24. Catalano, G.D., "Some Ideas on the Teaching of Engineering Science: A Student Centered Approach," Journal of Engineering Education, vol. 84, no. 1, 1995, pp. 21-23.
25. Schunk, D.H., "Self-Efficacy for Learning and Performance," Annual Meeting of the American Educational Research Association, New York, NY, 1991.
26. Ishida, J., "The Teaching of General Solution Methods to Pattern Finding Problems through Focusing on an Evaluation and Improvement Process," School Science and Mathematics, vol. 97, 1997, pp. 155-162.
27. Deek, F.P., An Integrated Problem Solving and Program Development Environment, Ph.D. Dissertation, New Jersey Institute of Technology, Newark, NJ, 1997.
FADI P. DEEK Department of Computer and Information Science New Jersey Institute of Technology
HOWARD KIMMEL Department of Chemistry and Chemical Engineering New Jersey Institute of Technology
JAMES A. MCHUGH Department of Computer and Information Science New Jersey Institute of Technology
Publication information: Journal of Engineering Education, vol. 87, no. 3, July 1998, pp. 313+.