Academic journal article Journal of Information Technology Education

Interaction and Feedback in Automatically Assessed Algorithm Simulation Exercises

Article excerpt


What I hear, I forget.

What I see, I remember.

What I do, I understand.


In this paper we discuss different learning styles in the context of studying in virtual learning environments. The key idea of such environments is that learners construct their own mental models of knowledge (Norman, 1983) by studying course material, by interacting with the teacher, with other learners, or with the system alone, and by solving exercises. An essential aid for carrying out this construction process successfully is receiving feedback while the mental model is applied to solve problems. Based on the feedback, the learner can verify the correctness of his or her mental model or tune it to better match the observations.

Feedback is a concept with a wide range of interpretations, ranging from simple confirmation of successful commands to critical comments and explanations given by an expert human tutor. In this paper, however, our focus lies on giving automatic feedback on non-trivial assignments. In many institutions, basic computing courses have hundreds of students, and providing feedback on exercises that support the mental model construction is a very laborious process. Therefore, many automatic assessment systems have been developed over the last ten years to aid in assessing exercises in large courses. Areas of interest include checking programming exercises (Vihtonen & Ageenko, 2002; Benford, Burke, Foxley, Gutteridge, & Zin, 1993; Jackson & Usher, 1997; Saikkonen, Malmi, & Korhonen, 2001), assessing algorithm simulation exercises (Bridgeman, Goodrich, Kobourov, & Tamassia, 2000; Hyvonen & Malmi, 1993; Korhonen & Malmi, 2000), and analyzing object-oriented designs and flowcharts (Higgins, Symeonidis, & Tsintsifas, 2002). All of these example systems can provide non-trivial, specialized assignments. General-purpose virtual learning environments and online training systems that can only provide structurally simple exercises, such as fill-in forms or multiple-choice questions, are therefore left outside the scope of this study.

Compared to human instructors, however, automatic assessment systems can provide only very limited feedback. In its simplest form, the feedback may consist of textual descriptions only. For example, program analysis systems such as Ceilidh (Benford et al., 1993) and SchemeRobo (Saikkonen et al., 2001) test submitted programs against test data and report whether the program functions correctly. An evaluation of the program structure may follow, again in textual form. Another example is the TRAKLA system (Hyvonen & Malmi, 1993; Korhonen & Malmi, 2000), which is used for solving algorithm simulation exercises by showing how a given algorithm changes a given data structure. The system compares the solution to the correct model solution and reports how many points the student earned from the exercise. Moreover, even though the students are able to simulate the algorithms in graphical form, most of the feedback they receive, including the model solutions, is still textual.
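As an illustration only, the comparison-based scoring described above can be sketched as follows. This is a hypothetical Python sketch, not the actual TRAKLA implementation: it assumes a simulation is recorded as a sequence of data-structure states and that points are awarded in proportion to the number of steps that match the model solution.

```python
def assess_simulation(student_trace, model_trace, max_points=10):
    """Compare two sequences of data-structure states and award points.

    Matching stops at the first diverging state, since later steps
    build on an already incorrect structure.
    """
    matching = 0
    for student_state, model_state in zip(student_trace, model_trace):
        if student_state != model_state:
            break
        matching += 1
    return round(max_points * matching / len(model_trace))

# Example: simulating insertions into a binary search tree, with each
# state recorded as a sorted tuple of keys present after the step.
model = [(5,), (3, 5), (3, 5, 8), (1, 3, 5, 8)]
student = [(5,), (3, 5), (3, 5, 8), (3, 5, 8, 9)]  # last insertion wrong
print(assess_simulation(student, model))  # 3 of 4 steps match -> 8 points
```

A real system would, of course, need richer state representations and tolerance for alternative correct orderings; this sketch only conveys the principle of scoring by comparison against a model solution.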

As pointed out by Felder and Silverman (1988), most people grasp information better in graphical form than in textual form. Therefore, on many occasions the graphical feedback supported by more advanced systems is more useful than textual feedback. For example, the VIOPE system (Vihtonen & Ageenko, 2002) analyzes C/C++/Java programs and gives graphical hints about problematic areas in the target program. PILOT (Bridgeman et al., 2000) allows the user to simulate graph algorithms by clicking nodes and edges on the screen; incorrect selections are highlighted in a different color. TRAKLA2, the successor of TRAKLA, can also provide model solutions in graphical form: the model solution for each simulation exercise can be viewed as an algorithm animation. …
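The PILOT-style highlighting of incorrect selections can be sketched in a hypothetical form (the function and data names below are illustrative assumptions, not PILOT's actual API): the learner's clicked graph elements are compared against the correct set, and the mismatches are flagged so the interface can render them in a different color.

```python
def classify_selections(selected, correct):
    """Split the learner's selections into correct, incorrect, and missed.

    The caller (e.g., a graphical front end) can then highlight the
    incorrect picks in a warning color and optionally hint at missed ones.
    """
    selected, correct = set(selected), set(correct)
    return {
        "correct": selected & correct,    # render normally
        "incorrect": selected - correct,  # highlight in a different color
        "missed": correct - selected,     # optionally hint at these
    }

# Example: the correct answer is the node set {A, B, C}; the learner
# clicked A, B, and D.
result = classify_selections(["A", "B", "D"], ["A", "B", "C"])
print(result["incorrect"])  # {'D'}
```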
