AI Magazine article
Transparency and trust play an important role in the acceptance of information systems. Systems able to explain their decisions, concepts, and information sources to users can demonstrate their trustworthiness and support users' understanding. In addition, systems able to generate explanations for their own internal use and to reason about their explanations may be able to increase their robustness in dealing with unexpected situations, as well as to improve their future performance, by using explanations to refine their internal models and reasoning processes. All of these benefits depend on the ability of the systems to generate high-quality explanations and to exploit explanations in their processing.
Advancing the capability of systems to generate and use explanations depends on advances in the models, methods, and tools available for managing explanation-relevant information, as well as on developing effective methods for retrieving explanation-relevant information and integrating explanation and application knowledge. Beyond technical considerations, the design of effective explanation systems must reflect fundamental insights about the nature and use of explanations, as illuminated, for example, in philosophical and psychological investigations, as well as by social perspectives on the context and use of information technology applications.
Disciplines such as artificial intelligence, cognitive science, linguistics, philosophy of science, and education have all considered various aspects of explanation. Each has insights to share, but few opportunities exist for interaction and clarification of their different views. This AAAI workshop, ExaCt 2007, provided such a forum. The two-day workshop, a sequel to the AAAI Fall Symposium ExaCt 2005, brought together researchers and practitioners from diverse groups to study explanation issues and explore the requirements for effective explanation in information technology applications. …