Ontology is a discipline of philosophy whose name dates back to 1613 and whose practice dates back to Aristotle. It is the science of what is: the kinds and structures of objects, properties, events, processes, and relations in every area of reality. Ontology is, put simply, about existence. Like so many terms, it was borrowed by computer science and is rapidly becoming an industry buzzword, tossed about by salesfolk as if it were something everyone knew about. As it turns out, very few people who use the word actually know what it means, and as a result its meaning has changed, and is changing, over time.
All computer scientists who claim allegiance to this field are constantly peppered with the same question: "What is an ontology?" Well-meaning people argue the answer back and forth in an effort to dispel confusion, but the argument often creates more confusion than it eliminates. As with many things, one must actually do ontology to understand what it is. I have found, however, that a little history and some discussion can be informative, though not definitive.
Probably the most commonly cited definition in ontology is Gruber's, who in 1993 offered, "An ontology is an explicit specification of a conceptualization." This definition has led the way in causing more confusion than it has eliminated. Others point to Gruber's article as the start of ontology research in computer science; however, the term was already in widespread use by that time, having been used first by John McCarthy in 1980 and subsequently by Hayes in 1984, Sowa in 1984, and Alexander et al. in 1986. The article by Alexander et al. appears to be the first published departure from the philosophical meaning of ontology and the start of a new, computer science sense of the word. Mentions of ontology in the AI literature then increase steadily after 1986.
In fact, the field of ontology research attempts to capture a notion common to a number of disciplines: software engineering, databases, and AI, to name but a few. In each of these areas, developers face the problem of building an artifact that represents some portion of the world in a fashion that can be processed by a machine. Since at least 1976, the database community has recognized the key role this process plays in the design of information systems, naming it conceptual modeling. In software engineering, the introduction of object-oriented systems led to the same realization sometime in the early to mid-1980s, where it was named domain modeling. It is far more difficult to pin down when this realization was made in AI; certainly scientists were modeling the world in logical form from the field's first days in the late 1950s, but these models tended to be examples used merely to test systems and theories. It was not, it seems, until the era of expert systems that knowledge engineering emerged as a specific area of study.
In each of these areas, however, it was not until the mid-1990s that a common understanding emerged: information systems built on sound engineering principles should be able to interoperate, yet could not. Each of these fields encountered the same problem and reached a similar diagnosis: the meaning of what has been expressed in some formal system is embedded in operational semantics that cannot easily be divulged by inspection. Asking what a symbol such as author means in a library system is like asking what the red button on the dashboard does in one of James Bond's supercars: to find out, you have to push it. Moreover, you have to push it again and again, in a variety of circumstances, until you feel you have gained an understanding.
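The point about operational semantics can be made concrete with a small, entirely hypothetical sketch: two systems that each export a symbol named author, but whose meanings diverge in ways that inspection of the name alone cannot reveal. All class names, data, and identifiers below are invented for illustration.

```python
# Hypothetical sketch: the same symbol, "author", with two different
# operational meanings. Nothing here corresponds to a real system.

class LibrarySystem:
    """Here 'author' means a cataloguing string of the form 'Last, First'."""
    def __init__(self):
        self.records = {"0465026567": "Hofstadter, Douglas"}

    def author(self, isbn):
        return self.records[isbn]


class PublisherSystem:
    """Here 'author' means a list of opaque contributor IDs, editors included."""
    def __init__(self):
        self.records = {"0465026567": ["contrib-17", "contrib-42"]}

    def author(self, isbn):
        return self.records[isbn]


# The only way to discover what 'author' means is to "push the button"
# and observe the behavior -- repeatedly, in different circumstances.
lib, pub = LibrarySystem(), PublisherSystem()
print(lib.author("0465026567"))  # a human-readable name string
print(pub.author("0465026567"))  # a list of opaque identifiers
```

Neither system is wrong; each is internally consistent. But the meaning of author lives only in behavior, which is exactly why inspection of the interface cannot settle what the symbol denotes.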
Computer science ontology is, therefore, about meaning. Even more, it is about making meaning as clear as possible. This is a crucial point--certainly a departure from philosophy--yet it is still fairly vague. …