Human-computer interaction (HCI) often relates purely to user input/output at the keyboard, mouse and screen level. Advances in multimedia have allowed further sensory channels to be brought into the interaction process, such as advanced 3D graphics, sound, and tactile and force-feedback I/O devices. A relatively new area of HCI research, with proposed potential benefits, is the detection of the position and movement of the entire human body and the processing of this information to provide meaningful interaction with the computer.
A necessary starting point for HCI that utilises the human body as the data input device is a detection device that will accurately measure the limb configuration of the entire body. The system should capture the motion of each limb and the torso in real time and pass the data to the computer for analysis. Several systems are available for studying body motion; however, most of them simply video-record the motion of illuminated markers placed on the subject and rely on an expert to analyse the predominantly 2D motion patterns by eye. For computer interaction a more direct approach is required, since computer analysis of 3D position must be performed in real time. A small group of companies have developed full-body motion capture systems which allow the researcher to place sensors at points on the body such as the hands, wrists, elbows, shoulders, hips, knees, ankles and so on (Coco 1997). These systems can sense and provide to the computer the exact position of each sensor in three dimensions. However, two problems