Human-Computer Interaction: Ergonomics and User Interfaces - Vol. 1

By Hans-Jörg Bullinger; Jürgen Ziegler

A Distributed System for Device Diagnostics
Utilizing Augmented Reality, 3D Audio, and
Speech Recognition

Venkataraman Sundareswaran, Reinhold Behringer, Steven Chen, and Kenneth Wang
Rockwell Science Center, Thousand Oaks, CA 91360, USA


1 Motivation

Human-computer interaction can be greatly enhanced if information from the computer is projected directly into the user's field of view as he goes about performing tasks in the real world. This is the goal of Augmented Reality (AR): to display computer-generated information so that it appears embedded within the view of the real world.

Consider this future application scenario: a maintenance worker, wearing a lightweight see-through head-mounted display (HMD), approaches a complex machine; a CAD model of the machine appears in the HMD, visually superimposed on the machine. As the maintenance worker moves around the machine, the position and orientation of the displayed model are dynamically updated so that the model appears to cling to the real machine. Using speech commands, the maintenance worker asks for the machine's error and diagnostic information to be displayed. The error and diagnostics appear at 3D locations seemingly tethered to the problem spot, and 3D audio played over headphones directs the user's attention to an adjacent machine that is part of the problem.

At Rockwell Science Center (RSC), we have developed a system that serves as a conceptual prototype for HCI-rich AR applications. The novelty of our conceptual prototype is the integration of a new video-based AR technique, speech recognition, and 3D audio in a networked PC environment. The system demonstration is geared toward the needs of maintenance and training: to provide information not only just in time, but also just in place.
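As a rough illustration of the networked-PC integration described above, the sketch below shows one hypothetical way that head-pose, speech-command, and device-fault messages could be routed between the rendering and audio modules over a LAN. The host names, port numbers, and message types are assumptions made for illustration only and are not taken from the RSC implementation.

    # Hypothetical sketch (not the authors' implementation): a minimal UDP
    # message hub that forwards module messages by type across a LAN of PCs.
    import json
    import socket

    DISPATCH_ADDR = ("0.0.0.0", 9100)      # assumed port for the message hub

    # Routing table: which module consumes which message type.
    # All host names and ports are placeholders.
    ROUTES = {
        "pose":   ("renderer-pc", 9200),   # head pose -> overlay registration
        "speech": ("renderer-pc", 9200),   # "show diagnostics" -> overlay content
        "alert":  ("audio-pc",    9300),   # device fault -> 3D audio cue
    }

    def run_hub() -> None:
        """Receive JSON messages from any module and forward them by type."""
        hub = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
        hub.bind(DISPATCH_ADDR)
        out = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
        while True:
            data, _src = hub.recvfrom(4096)
            msg = json.loads(data)
            dest = ROUTES.get(msg.get("type"))
            if dest is not None:
                out.sendto(data, dest)

    if __name__ == "__main__":
        run_hub()

A speech-recognition module would then send a datagram such as {"type": "speech", "command": "show diagnostics"} to the hub, which forwards it to the rendering PC; a fault report of type "alert" would instead be forwarded to the 3D audio PC to cue the user's attention.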
