Educational Technology & Society

Affective Behavior and Nonverbal Interaction in Collaborative Virtual Environments

Article excerpt

Introduction

Emotions influence our behavior (Kantor, 1921a; Kantor, 1921b). As such, during the last two decades, interest in the affective component of computing has grown under the research area of affective computing, aimed at studying the relation between emotions and computers (Picard, 1997, p. 50). For learning, a special effort has been made to automate an important capability of human tutors: taking the emotions of their students into account in the teaching process (Heyward, 2010; Moridis & Economides, 2008), that is, affective learning. However, because emotions are difficult to measure, their automated sensing remains an open issue.

According to Picard & Daily (2005), the methods to evaluate affect in computer users include classic questionnaires. This kind of self-report on emotions presents inconveniences such as interrupting the user, and the fact that a person's emotional state can change from moment to moment. The evaluation of affective behavior also includes body measures, based either on physiological signals captured through sensors such as body-worn devices, EEG (electroencephalogram) or ECG (electrocardiogram) (e.g., Agrafioti, Hatzinakos, & Anderson, 2012; Bamidis, Papadelis, Kourtidou-Papadeli, Pappas, & Vivas, 2004), or on body activity such as facial expressions, posture, hand tension, gestures, or vocal expressions (e.g., Ammar, Neji, Alimi, & Gouarderesc, 2010; see Picard & Daily, 2005). Typically, this type of evaluation requires special equipment and makes the user highly aware of being measured. A third approach is based on task measures which, as indicated by Picard & Daily (2005), constitute an indirect evaluation that assumes our affective state influences our behavior on subsequent tasks (e.g., Lerner, Small, & Loewenstein, 2004; Liu, Lieberman, & Selker, 2003). Although this type of measure is not intrusive, because task measures are indirect their results are usually applicable to populations rather than to individuals (Picard & Daily, 2005).

On the other hand, what a person does is not an expression of internal or innate entities, but a direct effect of what is happening in the environment (Kantor, 1921a; Kantor, 1921b). Almost a hundred years ago, Kantor (1921a; 1921b) was already concerned about the recurrent causal connection drawn between mental and physiological states in the study of emotions. In this regard, Boehner, DePaula, Dourish & Sengers (2007) recommended that affective computing incorporate an interactionist approach, treating emotions as constructed through interaction and expression rather than as objective natural facts. Boehner et al.'s (2007) proposal is mainly intended for human-computer interaction (HCI), not particularly for computer-mediated human interactions.

For Kantor (1929), the elements to consider in the analysis of affective behavior are: the stimulus or events that the individual faces, the individual's behavioral repertoire, the reaction speed, the physiological conditions of the person, the familiarity with the stimulus object, the context and the interaction circumstances, and the presence of certain persons in the situation to be analyzed. These last elements turn out to be particularly important when the aim is to analyze the affective component of individuals' behavior within a group while they solve a collaborative task.

People express themselves through their interaction with others, and in computer-mediated interactions every user intervention can be recorded and analyzed over time. However, the automatic understanding of unstructured dialogue has not yet been fully accomplished (Jermann, Soller, & Lesgold, 2004), or it comes at a high computational cost. Nevertheless, in Collaborative Virtual Environments (CVEs) users interact through their graphical representation, their avatar, which includes the capability to display nonverbal interaction. Our nonverbal behavior comprises most of what we do apart from the meaning of words, including patterns of verbal interchange such as gaps and pauses (Heldner & Edlund, 2010), the handling of objects when they are part of the task at hand, or proxemic behavior, to name some among other nonverbal cues. …
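As a simple illustration of how one such cue could be extracted from recorded interventions, the following minimal Python sketch computes the gaps and pauses between consecutive timestamped contributions in a session log. The log format, the field names, and the 10-second threshold are illustrative assumptions and are not taken from the study described here.

```python
# Minimal sketch (assumed log format): measure gaps/pauses between
# consecutive timestamped user interventions in a collaborative session.
from datetime import datetime

# Hypothetical log of interventions: (user, ISO 8601 timestamp).
log = [
    ("ana",  "2016-05-04T10:00:02"),
    ("luis", "2016-05-04T10:00:09"),
    ("ana",  "2016-05-04T10:00:31"),
    ("luis", "2016-05-04T10:01:15"),
]

PAUSE_THRESHOLD_S = 10.0  # assumed cutoff between a short gap and a pause

def gaps(entries):
    """Yield (previous_user, next_user, seconds_elapsed) for consecutive interventions."""
    times = [(user, datetime.fromisoformat(stamp)) for user, stamp in entries]
    for (u1, t1), (u2, t2) in zip(times, times[1:]):
        yield u1, u2, (t2 - t1).total_seconds()

for prev_user, next_user, seconds in gaps(log):
    kind = "pause" if seconds >= PAUSE_THRESHOLD_S else "gap"
    print(f"{prev_user} -> {next_user}: {seconds:.0f}s ({kind})")
```

Richer analyses, such as per-user pause distributions or turn-taking patterns, could be built on the same timestamped records.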
