The Seventh International Conference on Intelligent User Interfaces (IUI-2003)

Intelligent user interfaces (IUIs) mediate between user and system to increase the ease and effectiveness of user-system interactions. Research in IUIs draws on areas such as AI and human-computer interaction to study methods for supporting varied users across a wide range of tasks, task environments, platforms, and interaction paradigms. The Intelligent User Interfaces Conference is the annual meeting of the IUI community and the principal international forum for reporting research and development in this area. IUI-2003, the seventh IUI conference, was held in Miami Beach, Florida, from 12 to 15 January 2003.

A special theme at this year's conference was interfaces whose focus goes beyond rational "intelligence" to address psychological concerns such as emotion, personality, and motivation. The conference received an all-time record number of submissions, covering a wide range of areas and approaches. Oral presentation sessions were organized into seven major topics: (1) adaptive and collaborative interfaces, (2) affective interfaces, (3) agent-based interfaces, (4) knowledge acquisition and visualization, (5) model-based interface design, (6) multimodal input, and (7) natural language interfaces.

The conference program included three invited talks, each reflecting a different direction for developing the intelligent interfaces of the future. In "What Users Want," Daniel Weld of the University of Washington discussed the tension between interfaces that adapt automatically and their users' need for stability to maintain an accurate mental model, predict system behavior, and feel in control. Weld presented principles for addressing this tension, described algorithms for mining user action traces, and proposed roles for those algorithms in dynamically transforming interfaces.

Hiroshi Ishii of the Massachusetts Institute of Technology Media Lab presented "Tangible Bits: Designing the Seamless Interface between People, Bits, and Atoms," a vision of how to give physical form to digital information in order to take advantage of multimodal human senses and interaction skills. His presentation was illustrated with examples of projects aimed at realizing this vision by enabling interactions with graspable objects, augmented surfaces, and "ambient media" involving light and sound.

Allen Gorin of AT&T Research presented "Semantic Information Processing of Spoken Language in How May I Help You?" His talk described a new generation of voice-based user interfaces. Unlike menu-driven systems, in which the user must conform to the menu and respond to narrow questions, the deployed "How May I Help You?" system lets users drive the interaction by saying what they want; the system understands and adapts to what the user says. …