American Academic & Scholarly Research Journal

Fingertip Interaction with Large Screen Display System Using Virtual Touch Screen

Article excerpt

ABSTRACT: Existing large-scale display systems generally adopt an indirect approach to user interaction, because they rely on standard desktop-oriented devices, such as a mouse on a desk, to control the large wall-sized display. Using an infrared laser pointer and an infrared tracking device, more direct interaction with the large display can be achieved, reducing the user's cognitive load and improving their mobility. However, the computer mouse remains the most common interaction tool for such displays. This paper introduces a novel approach to direct interaction with large display systems: fingertip interaction with the large display through a virtual touchscreen. By taking into account the location of the user and the interaction area available, we estimate an interaction surface, a virtual touchscreen, between the display and the user. Users can use their pointing finger to interact with the display as if it were brought forward and presented directly in front of them, while preserving the viewing angle. An interaction model based on the head-hand line method is presented to describe interaction with the virtual touchscreen.

Keywords: Large Screen, Face Detection, Hand Gesture Recognition, Virtual Touchscreen, Hand Pointing, Pointing Accuracy, Computer Vision, Fingertip Interaction, Human-Computer Interaction (HCI), Bare-hand Control, Hand-posture Recognition.

1 INTRODUCTION

Large displays are now more widely used than ever before, yet in most cases user interaction still relies on the computer mouse, a device that constrains users' mobility. One alternative is to use the hand itself as the input device, making pointing at the display as easy as pointing at real-world objects. Computer vision can be used to detect and track the user's pointing gesture and is an active area of research. Computer-vision-based systems have the advantage of being a non-invasive input technique and of not requiring a physically touchable surface, which makes them highly suitable for interaction at a distance, such as in public spaces.

To use a mouse, the user must place it on a desk or another flat surface. This constrains the user to stay within arm's reach of the table and thus reduces their mobility. To interact with the system, the user moves the mouse horizontally across the table, while the cursor on the display moves vertically. The user also needs to spend a small fraction of time considering how their mouse movements will be mapped onto the large display, although this effect is reduced with practice. Yet another problem is the need to turn around every time to see where the pointer is on the large display. The mouse is therefore not optimal for interacting with large displays.

A better approach is a system that allows direct interaction between the user and the objects seen on the large display, for example pointing at the display with a finger or a laser pointer, or rotating objects by twisting, pushing, or turning the hands. In general, we need a technique that lets the user interact directly with the display, without an intermediary device. Such systems are more natural and easier to use.

The computer mouse currently in use is a pointing device: an intermediary that provides a means for humans to interact with the computer. It is a tool for mapping hand movements onto an on-screen cursor so that on-screen objects can be manipulated. However, this mapping provides only indirect interaction, because the output space is not the same as the input space. The output space is the display (in most cases a monitor sitting on a desk in front of the user), while the input space is the horizontal desk surface that the mouse lies on. This indirectness reduces both freedom and efficiency. Eye tracking devices follow the movement of the pupils and can in principle be used for cursor control. …
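As a rough illustration of the direct-pointing idea, and of the head-hand line method mentioned in the abstract, the sketch below intersects the line from the user's head through the fingertip with the display plane to obtain the pointed-at location. It is a minimal geometric sketch, not the authors' implementation; the coordinate frame, the plane parameters, and the function name head_hand_line_target are assumptions made for illustration.

import numpy as np

def head_hand_line_target(head, fingertip, plane_point, plane_normal):
    """Intersect the head-to-fingertip ray with the display plane.

    head, fingertip : 3D points (e.g. metres) from a tracker or stereo camera
    plane_point     : any point on the display plane
    plane_normal    : unit normal of the display plane
    Returns the 3D intersection point, or None if the ray is parallel
    to the plane or points away from it.
    """
    head = np.asarray(head, dtype=float)
    direction = np.asarray(fingertip, dtype=float) - head
    denom = np.dot(plane_normal, direction)
    if abs(denom) < 1e-9:          # ray parallel to the display plane
        return None
    t = np.dot(plane_normal, np.asarray(plane_point, dtype=float) - head) / denom
    if t < 0:                      # pointing away from the display
        return None
    return head + t * direction

# Hypothetical example: display plane at z = 0, user standing about 2 m away.
head      = [0.0, 1.7, 2.0]        # approximate eye/head position
fingertip = [0.1, 1.4, 1.5]        # tracked fingertip position
hit = head_hand_line_target(head, fingertip,
                            plane_point=[0.0, 0.0, 0.0],
                            plane_normal=np.array([0.0, 0.0, 1.0]))
print(hit)                          # 3D point on the display being pointed at

In the same spirit, the virtual touchscreen can be thought of as a smaller plane placed between the user and the display, with fingertip positions on that plane scaled onto display coordinates; the routine above only demonstrates the underlying line-plane intersection.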
