Formalizations of Commonsense Psychology
Gordon, Andrew S., Hobbs, Jerry R., AI Magazine
Among the more challenging problems in the field of artificial intelligence are those that require computers to engage in commonsense reasoning. In representational areas where robust content theories exist, a whole suite of applications becomes possible. For example, given a commonsense ontology of time (as in Hobbs 2002), we can construct automated reasoning systems to tackle real-world problems associated with transportation logistics, event planning, and factory process scheduling that are robust in the face of real-world concerns like time zones, daylight savings time, and international calendar variations.
Given the importance of an ontology of time across so many different commonsense reasoning tasks, it is appropriate to devote effort and special attention to this representational area so as to develop an inferential basis that is logically sound. The same is true of many of the other content theories that have traditionally defined the scope of knowledge representation research, especially ontologies of events (Shanahan 1995), space (Cohn and Hazarika 2001), and physical entities (Davis 1993).
Despite the progress that has been made in engineering automated reasoning systems and expressive logical languages, the bottleneck continues to be the lack of large-scale content theories across the full breadth of commonsense representational areas. There have been significant efforts in the last few years in developing large-scale commonsense resources. Two such efforts are OpenCyc (www.opencyc.org) and the Suggested Upper Merged Ontology (Niles and Pease 2001). For the most part, however, these efforts have lacked a coherent empirical methodology for determining what content they should cover, and in part as a result of this, they are weak in areas such as commonsense psychology, an area that is critical for many aspects of strategic planning.
Indeed, when surveying the field of knowledge representation as a whole, one gets the sense that most knowledge representation researchers are more comfortable with concepts related to the natural sciences (for example, physics) than the social sciences (for example, psychology). Considering that tremendous progress has been made in commonsense reasoning in specialized topics such as thermodynamics in physical systems (Collins and Forbus 1989), it is surprising that our best content theories of people are still struggling to get past simple notions of belief and intentionality (van der Hoek and Wooldridge 2003). However, systems that can successfully reason about people are likely to be substantially more valuable than those that reason about thermodynamics in most future applications.
Content theories for reasoning about people are best characterized collectively as a theory of commonsense psychology, in contrast to those that are associated with commonsense (naive) physics. The scope of commonsense physics, best outlined in Patrick Hayes's first and second "Naive Physics Manifestos" (Hayes 1979, 1984), includes content theories of time, space, physical entities, and their dynamics. Commonsense psychology, in contrast, concerns all of the aspects of the way that people think they think. It should include notions of plans and goals, opportunities and threats, decisions and preferences, emotions and memories, along with all of the other mental states and processes that people attribute to themselves and others (Clark 1987).
Our contemporary understanding of commonsense psychology has been informed less by AI than by cognitive psychology, where reasoning about the mental states of other people has been studied as theory of mind abilities. Developmental psychologists have noted that these abilities are strongly age-dependent (Wellman and Lagattuta 2000; Happe, Brownell, and Winner 1998) and have argued that they are central in explaining cognitive deficiencies associated with autism (Baron-Cohen 2000) and schizophrenia (Corcoran 2001). …