Indoor mobile robots are becoming reliable enough in navigation tasks to consider working with teams of robots. Using SRI International's open-agent architecture (OAA) and SAPHIRA robot-control system, we configured three physical robots and a set of software agents on the internet to plan and act in coordination. Users communicate with the robots using a variety of multimodal inputs: pen, voice, and keyboard. The robust capabilities of the OAA and SAPHIRA enabled us to design and implement a winning team in the six weeks before the Fifth Annual AAAI Mobile Robot Competition and Exhibition.
At the SRI International Artificial Intelligence Lab, we have a long history of building autonomous robots, from the original SHAKEY (remember the STRIPS planner?) through FLAKEY (Congdon et al. 1993) and, more recently, the Pioneer class of small robots. Our current research focuses on real-time vision for robots and multirobot planning using an agent-oriented architecture.
For the Fifth Annual AAAI Mobile Robot Competition and Exhibition, held as part of the Thirteenth National Conference on Artificial Intelligence (AAAI-96), we wanted to showcase our research, especially the ability to control multiple robots using a distributed set of software agents on the internet. The agent technology, called the open-agent architecture (OAA), was developed at SRI as a way of accessing many different types of information available in computers at different locations.
In the Office Navigation event, a robot starts from the director's office, determines which of two conference rooms is empty, notifies two professors where and when the meeting will be held, and then returns to tell the director. Points are awarded for accomplishing the different parts of the task, communicating effectively about its goals, and finishing the task quickly. Our strategy was simple: use as many robots as we could to cut down on the time to find the rooms and notify the professors. We decided that three robots was an optimal choice: enough to search for the rooms efficiently but not so many that they would get in each other's way or strain our resources. Two robots would search for the rooms and professors, and one would remain behind in the director's office to tell him or her when the meeting would be. We were concerned that leaving one robot behind as a mobile telephone was stretching things a bit, so we cleared our strategy with the judges well before the competition.
The two search robots are Pioneer-class robots, portable robots first developed by SRI for classroom use and now manufactured by Real World Interfaces, Inc. (RWI). They run SRI's SAPHIRA control software, which navigates the robots around an office environment, keeping track of where they are using perceptual cues from the robot sensors. Each Pioneer robot has seven sonar sensors, a fast-track vision system from Newton Labs, and a portable computer on top with a radio ethernet for communication to a base station (figure 1). The fast-track system is an interesting device: it consists of a small color video camera and a low-power processor. It can be trained to recognize particular colors and will indicate the position of any object of that color. We decided to use the vision system to find people in the conference rooms and trained it to recognize red. If the judges wore red shorts, the vision system would easily pick them out.
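The core of this kind of color tracking is simple: classify each pixel as matching the trained color or not, then report the position of the matching blob. The sketch below is purely illustrative (it is not the fast-track firmware or its API), and the red thresholds are arbitrary assumptions chosen for the example.

```python
# Illustrative color-blob tracking: threshold pixels that look "red"
# and report the blob's position in the image as a centroid.
# Threshold values are assumptions, not the trained fast-track parameters.

def is_red(pixel, r_min=150, g_max=100, b_max=100):
    """Crude 'red' test on an (R, G, B) tuple; thresholds are illustrative."""
    r, g, b = pixel
    return r >= r_min and g <= g_max and b <= b_max

def find_blob_centroid(image):
    """Return the (x, y) centroid of red pixels, or None if no red is seen.

    `image` is a list of rows, each row a list of (R, G, B) tuples.
    """
    xs, ys = [], []
    for y, row in enumerate(image):
        for x, pixel in enumerate(row):
            if is_red(pixel):
                xs.append(x)
                ys.append(y)
    if not xs:
        return None
    return (sum(xs) / len(xs), sum(ys) / len(ys))

# A tiny 3x3 test frame with one red pixel at column 2, row 1.
frame = [
    [(0, 0, 0), (0, 0, 0), (0, 0, 0)],
    [(0, 0, 0), (0, 0, 0), (200, 30, 30)],
    [(0, 0, 0), (0, 0, 0), (0, 0, 0)],
]
print(find_blob_centroid(frame))  # -> (2.0, 1.0)
```

A robot can steer toward the reported centroid, or treat any non-None result as "person detected" when checking whether a conference room is occupied.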
The robot in the director's office didn't have to move; it only had to relay information to the director. We used a Koala robot, a small, six-wheeled vehicle under development at the Swiss Federal Institute of Technology (figure 2). The Koala has infrared sensors, which enable it to avoid obstacles but make it difficult to map the environment because they do not give a reliable range estimate. We simply kept the Koala stationary, with a portable PC on top to communicate with the other robots and talk to the director.
Each robot, by virtue of the radio ethernet, is a node on the internet and an agent in the OAA. …