Back Propagation: Theory, Architectures, and Applications

Synopsis

Composed of three sections, this book presents the most popular training algorithm for neural networks: backpropagation. The first section presents the theory and principles behind backpropagation as seen from different perspectives such as statistics, machine learning, and dynamical systems. The second presents a number of network architectures that may be designed to match the general concepts of Parallel Distributed Processing with backpropagation learning. Finally, the third section shows how these principles can be applied to a number of different fields related to the cognitive sciences, including control, speech recognition, robotics, image processing, and cognitive psychology. The volume is designed to provide both a solid theoretical foundation and a set of examples that show the versatility of the concepts. Useful to experts in the field, it should also be most helpful to students seeking to understand the basic principles of connectionist learning and to engineers wanting to add neural networks in general -- and backpropagation in particular -- to their set of problem-solving methods.
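To give a concrete sense of the algorithm the book is devoted to, here is a minimal sketch of backpropagation: gradient descent applied through the chain rule to a one-hidden-layer sigmoid network learning XOR. All names, layer sizes, and hyperparameters below are illustrative choices of ours, not drawn from the book.

```python
import numpy as np

# Minimal backpropagation sketch on a one-hidden-layer sigmoid network.
# Every architectural choice here (8 hidden units, learning rate 1.0,
# 5000 full-batch steps, mean-squared error) is illustrative only.

rng = np.random.default_rng(0)

X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)  # XOR inputs
y = np.array([[0], [1], [1], [0]], dtype=float)              # XOR targets

W1 = rng.normal(size=(2, 8)); b1 = np.zeros(8)   # input -> hidden weights
W2 = rng.normal(size=(8, 1)); b2 = np.zeros(1)   # hidden -> output weights

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def forward(X):
    h = sigmoid(X @ W1 + b1)          # hidden activations
    return h, sigmoid(h @ W2 + b2)    # hidden activations, network output

_, out0 = forward(X)
initial_loss = float(np.mean((out0 - y) ** 2))

lr = 1.0
for _ in range(5000):
    h, out = forward(X)                      # forward pass
    d_out = (out - y) * out * (1 - out)      # error signal at the output layer
    d_h = (d_out @ W2.T) * h * (1 - h)       # error propagated back to hidden layer
    W2 -= lr * h.T @ d_out;  b2 -= lr * d_out.sum(axis=0)   # gradient-descent updates
    W1 -= lr * X.T @ d_h;    b1 -= lr * d_h.sum(axis=0)

_, out = forward(X)
final_loss = float(np.mean((out - y) ** 2))
print(initial_loss, final_loss)
```

The two lines computing `d_out` and `d_h` are the "back propagation" proper: the output error is multiplied by each layer's local derivative and passed backward through the weights, exactly the credit-assignment scheme the chapters in the first section analyze from statistical and dynamical-systems perspectives.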

Excerpt

Almost ten years have passed since the publication of the now classic volumes Parallel Distributed Processing: Explorations in the Microstructure of Cognition. These volumes marked a renewal in the study of brain-inspired computations as models of human cognition. Since the publication of these two volumes, thousands of scientists and engineers have joined the study of Artificial Neural Networks (or Parallel Distributed Processing) to attempt to answer three fundamental questions: (1) how does the brain work? (2) how does the mind work? (3) how could we design machines with capabilities equal to or greater than those of biological (including human) brains?

Progress in the last 10 years has given us a better grasp of the complexity of these three problems. Although connectionist neural networks have shed a feeble light on the first question, it has become clear that biological neurons and computations are more complex than their metaphorical connectionist equivalents by several orders of magnitude. Connectionist models of various brain areas, such as the hippocampus, the cerebellum, the olfactory bulb, or the visual and auditory cortices, have certainly helped our understanding of their functions and internal mechanisms. But by and large, the biological metaphor has remained a metaphor. And neurons and synapses still remain much more mysterious than hidden units and weights.

Artificial neural networks have inspired not only biologists but also psychologists, perhaps more directly interested in the second question. Although the need for brain-inspired computations as models of the workings of the mind is still controversial, PDP models have been successfully used to model a number of behavioral observations in cognitive, and more rarely, clinical or social psychology. Most of the results are based on models of perception, language, memory, learning, categorization, and control. These results, however, cannot pretend to represent the beginning of a general understanding of the human psyche. First, only a small fraction of the large quantity of data amassed by experimental psychologists has been examined by neural network researchers. Second, some higher levels of human cognition, such as problem solving, judgment, reasoning, or decision making, have rarely been addressed by the connectionist community. Third, most models of experimental data remain qualitative and limited in scope: no general connectionist theory has been proposed to link the various aspects of cognitive processes into a general computational framework. Overall, the . . .
