Creating Computers That Lie like Humans: Virtual-Reality Interrogation Training

Article Reviewed:

McKenzie, F., Scerbo, M., Catanzaro, J., & Phillips, M. (2003). Nonverbal indicators of malicious intent: Affective components for interrogative virtual reality training. International Journal of Human-Computer Studies, 59(1-2), 237-244.

In light of the current climate in North America and around the world, it has become increasingly important for security personnel to correctly identify suspicious behaviour. Whether working in airport security, as a border guard, or as a peacekeeper in a foreign country, the ability to correctly recognize and distinguish between various levels of anger, anxiety, and deception could mean the difference between life and death. As a result, programs aimed at generating more realistic and effective training methods have attracted increasing interest from researchers and various security agencies, including the U.S. Armed Forces and the Department of Defense.

In their article, McKenzie, Scerbo, Catanzaro, and Phillips (2003) described ongoing research at Old Dominion University (ODU) aimed at advancing virtual-reality technology by recreating various affective component behaviours related to nervousness, anger, and deception in virtual environments. The authors stated that the addition of these components to training programs that utilize virtual reality could one day result in programs designed to prepare and instruct trainees in social interaction scenarios. Although the training program described by the authors involved military checkpoint duty in a foreign country, the authors suggested that such technology could one day be applied to various forms of police and security work (e.g., police interrogation, airport security, and border crossings).

Virtual-reality training programs are being used more widely for various forms of instruction. They are believed to offer a more flexible, safer, and potentially less expensive way to train individuals for a variety of tasks (e.g., battlefield weapons use, police negotiations, flight simulation, and hazardous-material emergencies). Supporters of virtual-reality training have argued that these programs present instructors with a virtually infinite variety of possible scenarios, allowing them to better customize a training program for the specific needs of the trainees. Further, it has been argued that virtual-reality training environments are much safer than "real-life" training environments because more aspects of the physical environment can be controlled (e.g., weather, ammunition misfires, or mechanical failures).

The virtual-reality training environment described in the article places the trainee on duty at a checkpoint station in a "typical third world urban area". The system uses motion-tracking technology (the Ascension Flock of Birds magnetic tracking system) and speech-recognition software to track, capture, and digitally replicate the trainee's movements and communications, allowing for a high level of interaction between the human trainee and the virtual agents (computer-generated characters) within the scenario. The primary goal of the program being developed at ODU is to improve training in the recognition of behavioural cues related to possible aggressive, hostile, and deceptive actions. Specifically, the authors' current research involves generating in the virtual agents the many subtle cues associated with deception. In other words, their intention is to advance the realism of the training program by digitally replicating the affective (or emotional) behaviours associated with anger, nervousness, and deception.

In the article, the authors reported that past research on the detection of deception has shown that the majority of the information communicated by suspicious behaviour is non-verbal. Non-verbal behaviours such as facial expressions are often the most readily available source of information; however, individuals can train themselves to control and disguise such cues of deception. …