OPERATORS OF GROUND ROBOTS typically have relied upon laptop computers or game controller devices to navigate their unmanned vehicles and direct sensor movements. But several companies have developed technologies that untether troops from immobile controllers and give them the ability to hold their weapons and multitask while commanding their robots.
Think-A-Move Ltd., based in Beachwood, Ohio, has created a human-machine interface system that allows operators to control a robot through vocal commands.
When a person speaks or moves the tongue, sound waves travel through the ear canal.
"Our technology picks up those signals that come through the ear canal through an ear piece, and then we process those signals to eliminate ambient noise and use them for voice control or communications," says Jonathan Brown, vice president of sales and marketing.
An earpiece, similar to an iPod earbud, connects to a Sony Vaio computer the size of a paperback book. Speaking commands such as "forward," "left," and "right," operators can guide a robot's movements while keeping their hands on a weapon and their heads up.
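The article does not describe Think-A-Move's software, but the idea of translating a small vocabulary of spoken words into drive commands can be sketched in a few lines. The command names come from the article; the velocity values and function names below are illustrative assumptions, not the company's actual interface.

```python
# Hypothetical mapping from a recognized spoken word to a drive command,
# expressed as (linear velocity in m/s, angular velocity in rad/s).
# The speed values are made up for illustration.
DRIVE_COMMANDS = {
    "forward": (0.5, 0.0),
    "back":    (-0.5, 0.0),
    "left":    (0.0, 0.5),
    "right":   (0.0, -0.5),
    "stop":    (0.0, 0.0),
}

def dispatch(recognized_word: str) -> tuple[float, float]:
    """Return the (linear, angular) velocity for a recognized word.

    Any unrecognized word maps to a full stop, a conservative default
    for a vehicle that may be operating near troops.
    """
    return DRIVE_COMMANDS.get(recognized_word.strip().lower(), (0.0, 0.0))
```

A real system would sit behind a speech recognizer and add confirmation or dead-man logic, but the stop-on-unknown default illustrates the kind of fail-safe such an interface needs.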
"Not only do they have their hands free to do something else, but they're not looking at a screen as they're trying to control the robot. It's better from a situational awareness standpoint, and also from a multi-tasking standpoint," says Jim Harris, president of the company.
Commands can be spoken at different volumes, an important feature depending on the mission, they say. If troops are operating in a situation where radio silence is required, the technology gives them the ability to give subvocal commands using their tongues.
"Just as they might communicate with other members of their squad using hand signals silently, this enables them to communicate with the UGV silently," says Brown.
The device also works accurately in noisy environments, up to 80 or 90 decibels. It can distinguish between the operator's commands and those given by someone else nearby. Developers also have produced an audio feedback capability by adding a speaker. If a robot is equipped with a microphone, an operator can listen to what the robot hears through the same earpiece.
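One plausible way to reject bystanders' speech, consistent with the article's description of picking up the operator's signal inside the ear canal, is to accept a command only when the body-conducted in-ear signal is markedly louder than an ambient reference. The function names, margin, and approach below are assumptions for illustration; the article does not disclose Think-A-Move's actual method.

```python
import math

def rms_db(samples: list[float]) -> float:
    """RMS level of a signal in dB relative to full scale (dBFS)."""
    rms = math.sqrt(sum(s * s for s in samples) / len(samples))
    return 20.0 * math.log10(max(rms, 1e-12))  # floor avoids log10(0)

def is_operator_speech(in_ear: list[float],
                       ambient: list[float],
                       margin_db: float = 10.0) -> bool:
    """Hypothetical gate: treat audio as the operator's own speech only
    when the in-ear (body-conducted) level exceeds the ambient reference
    by a margin, so speech arriving mainly through the air is rejected."""
    return rms_db(in_ear) > rms_db(ambient) + margin_db
```

In practice such a level gate would be combined with spectral features and the recognizer's own confidence score, but it conveys how an in-ear pickup gives the system a physical basis for telling the operator apart from nearby talkers.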
Additionally, the mobile PC allows users to view images and telemetry information via the robot's cameras, says Harris.
The technology has been integrated onto an iRobot PackBot. It has been demonstrated to scientists at the Army's Tank Automotive Research, Development and Engineering Center, which has approved the company to move forward with a field-deployable version. Harris says a prototype will be ready for deployment next year.
As unmanned ground vehicle technologies improve, the military expects to incorporate more robots into its forces for use by smaller units, such as platoons and squads. Though these robots are seen as force multipliers, their bulky and complex controller interfaces generally demand the full attention of operators, a situation that can be deadly on patrol or during covert operations.
"Marines and soldiers are going to need simple, intuitive ways to control these assets," says Jack Vice, president and chief technology officer for AnthroTronix Inc. in Silver Spring, Md.
The company has developed several human-machine interface technologies that are designed for use by dismounted troops in battlefield conditions.
The visually integrated sensors unit, which gives troops control of multiple unmanned vehicles, comes in a camcorder format that is held up to the eye like binoculars. By pressing buttons on top of the unit, operators can navigate through menus to use various functions, including target designation, mapping and robot operation.
For example, if a team is on patrol outside a base and stumbles upon a suspicious parked car, an operator can pick up the VIS unit and use its laser rangefinder to designate that car as a potential threat. …