Cognitive science faces a philosophical challenge
John R. Searle sees gaping cracks in the edifice of the mind constructed by cognitive scientists. Searle, a philosopher at the University of California, Berkeley, peruses the mental rules and representations and computer programs that buttress the cognitive citadel with the eye of a skeptical contractor. Watch out for falling bricks, he warns; the structure lacks the mortar of consciousness to hold it together.
"More than anything else, it is the neglect of consciousness that accounts for so much barrenness and sterility in psychology, the philosophy of mind, and cognitive science," Searle asserts.
Although Searle's remark will win him no popularity contests among scientists of the mind, it nevertheless reflects the recently renewed interest in deciphering the nature of consciousness. From a variety of perspectives, scientists are now trying to define more clearly what they mean when they refer to "conscious" and "unconscious" mental activity.
Searle first rankled cognitive scientists in 1980 when he published his widely cited "Chinese Room" argument, an attack on the notion, promoted by advocates of "strong artificial intelligence," that the mind corresponds to a computer program implemented in the hardware of the brain.
Searle compared the computers favored by artificial intelligence enthusiasts to a person who does not speak Chinese but sits in a room with Chinese dictionaries and a filing system. If an outsider slips questions written in Chinese under the door, the person uses the reference works to compose answers in Chinese. Responses emerging from the room might prove indistinguishable from those of a native Chinese speaker, Searle contended, even though the person toiling in the Chinese Room understands neither the questions nor the answers.
The moral of this exercise: A system such as a computer can successfully employ a set of logical rules without knowing the meaning of any of the symbols it manipulates using those rules.
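The point can be made concrete with a toy sketch, not Searle's own formulation: a program that answers questions by pure pattern lookup, never representing what any symbol means. The rule book and phrases below are invented solely for illustration.

```python
# A toy "Chinese Room": replies are produced by mechanical lookup
# in a rule book. The program manipulates the symbols without any
# representation of their meaning. (Entries are invented examples.)
RULE_BOOK = {
    "你好吗？": "我很好，谢谢。",      # "How are you?" -> "I'm fine, thanks."
    "你会说中文吗？": "当然会。",      # "Do you speak Chinese?" -> "Of course."
}

def chinese_room(question: str) -> str:
    """Return a reply by matching the question against the rule book.

    The function never interprets the symbols it shuffles; an
    unmatched input just triggers a stock fallback phrase.
    """
    return RULE_BOOK.get(question, "请再说一遍。")  # "Please say that again."

print(chinese_room("你好吗？"))  # prints: 我很好，谢谢。
```

To an outside observer feeding in matching questions, the replies look fluent; inside, there is only string comparison, which is exactly the distinction Searle's thought experiment trades on.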
Supporters of strong artificial intelligence view the Chinese Room as a flimsy sanctuary from the argument that a properly programmed computer possesses a "mind." Philosopher Daniel C. Dennett of Tufts University in Medford, Mass., calls Searle's analogy simplistic and irrelevant. A computer program that could hold its own in a conversation would contain layers of complex knowledge about the world, its own responses, likely responses of a questioner, and much more, Dennett contends. Indeed, computers have displayed a growing conversational prowess in the last several years. Their increasingly deft dialogue stems from the interactions among various strands of information, each of which comprehends nothing on its own, Dennett maintains.
Put another way, proper programming transforms a bunch of unreflective parts into a thinking system, whether they reside in a mainframe or a human skull.
Searle has now left the Chinese Room debate behind and aims a new assault at what he calls the "much deeper" mistake of cognitive scientists: their neglect of conscious experience. He describes his views in the December 1990 BEHAVIORAL AND BRAIN SCIENCES (with sometimes heated responses from more than 30 cognitive scientists) and in his book The Rediscovery of the Mind (1992, MIT Press).
Cognitive science tends to regard the mind as a collection of relatively independent faculties, or modules, that contain unconscious rules for language, perception, and other domains of thought, Searle argues. Consciousness, in the sense that one can pay attention to, reason about, or describe these rules, rarely gets attention in such theories. Other facets of the unconscious, such as memories and repressed conflicts, sometimes enter awareness but more often influence thought and behavior in surreptitious ways, according to cognitive researchers.
Searle spurns this approach, with its reliance on what he calls a "deep unconscious" unable to pierce the surface of awareness. …