Magazine article National Defense

Body Language Takes on New Meaning in PC World


With the advent of touch-screen technologies, navigating the digital world is no longer limited to typing on a keyboard and clicking a mouse. Today, we drag our fingers across handheld devices and tabletop displays to interact with electronic data. In the near future, we may command such systems without touching them at all.

The same researchers who developed the touch capability for Microsoft Surface are exploring ways to interface with computers and other digital gadgets using hand gestures.

"We're tracking specific kinds of motions," says Hrvoje Benko, a researcher on the Redmond, Wash.-based Microsoft Research team, during a recent road show.

Inside a small makeshift planetarium, he stands behind an omni-directional projector that is producing a 360-degree image of the nighttime sky. He places his hands into the light beam so that they cast a shadow on the dome above--the better to demonstrate his motions in the darkened space.


"We can literally reach into the image and move the sky," he says as he touches his index fingers to his thumbs in a pinching motion and moves his hands apart to "pull" the edges of the sky outward. The overhead image expands, zooming in on a cluster of distant stars. He reverses the motion, pushing his hands towards each other. This time, the sky contracts and zooms out beyond the Milky Way galaxy.

The secret lies inside the projector where a wide-angle lens focuses the data onto the 360-degree environment. The team has added an infrared camera that shares the projector's lens to detect a person's gestures in mid-air. Anywhere the images are projected, the camera can also "see." Algorithms allow the system to correlate the holes made by a person's hands to computer functions.
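The core interaction described above, hands moving apart or together to zoom the projected sky, can be illustrated with a minimal sketch. Microsoft's actual detection pipeline is not published in this article, so the function name, the pixel-distance input, and the sensitivity parameter below are all illustrative assumptions, not the team's code.

```python
# Hypothetical sketch: turn the change in distance between two pinching
# hands (as measured in the infrared camera image) into a zoom factor.
# Hands moving apart zoom in; hands moving together zoom out.

def zoom_factor(prev_distance: float, curr_distance: float,
                sensitivity: float = 1.0) -> float:
    """Multiplicative zoom factor from frame-to-frame hand separation."""
    if prev_distance <= 0:
        return 1.0  # no valid previous measurement; leave scale unchanged
    return (curr_distance / prev_distance) ** sensitivity

# Usage: apply successive factors to the current scale of the sky image.
scale = 1.0
for prev, curr in [(100.0, 120.0), (120.0, 150.0)]:  # hands moving apart
    scale *= zoom_factor(prev, curr)
```

A ratio-based factor like this keeps the gesture scale-independent: the same relative hand movement produces the same zoom whether the viewer stands near or far from the camera.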

Touching the thumbs and index fingers together to form a large teardrop-shaped hole signals to the computer to listen for a vocal command. "Crab Nebula," Benko says into a headset microphone. The planetarium fills with the requested telescope imagery. Using the same method, he switches to a replica of his team's office building, pulls up real-time video footage of the planetarium's exterior and immerses viewers in a snow globe simulation.
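The gesture-then-speech handoff described here can be sketched as a small state machine: the teardrop-shaped hole puts the system into listening mode, and the next recognized phrase selects what to project. Everything below, the class, the shape label, and the scene table, is an assumption for illustration, not Microsoft's implementation.

```python
# Illustrative sketch of the "gesture opens the microphone" interaction.
# The IR camera's shape classifier is assumed to report a string label
# for the hole a viewer's hands form in the projected light.

SCENES = {
    "crab nebula": "telescope imagery of the Crab Nebula",
    "office building": "replica of the team's office building",
    "snow globe": "snow globe simulation",
}

class GestureVoiceController:
    def __init__(self) -> None:
        self.listening = False
        self.current_scene = None

    def on_gesture(self, shape: str) -> None:
        # The teardrop hole is the cue to start listening for speech.
        if shape == "teardrop":
            self.listening = True

    def on_speech(self, phrase: str) -> None:
        # Speech is only acted on while the gesture has opened the mic.
        if self.listening and phrase.lower() in SCENES:
            self.current_scene = SCENES[phrase.lower()]
            self.listening = False

ctrl = GestureVoiceController()
ctrl.on_gesture("teardrop")
ctrl.on_speech("Crab Nebula")
```

Gating speech recognition behind an explicit gesture avoids the classic open-microphone problem: casual conversation near the planetarium cannot accidentally switch the display.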

"More and more, we're seeing the emergence of other kinds of data that might be more suitable for other form factors and displays," he says. …
