Backpropagation: Theory, Architectures, and Applications

Synopsis

Composed of three sections, this book presents the most popular training algorithm for neural networks: backpropagation. The first section presents the theory and principles behind backpropagation as seen from different perspectives such as statistics, machine learning, and dynamical systems. The second presents a number of network architectures that may be designed to match the general concepts of Parallel Distributed Processing with backpropagation learning. Finally, the third section shows how these principles can be applied to a number of different fields related to the cognitive sciences, including control, speech recognition, robotics, image processing, and cognitive psychology. The volume is designed to provide both a solid theoretical foundation and a set of examples that show the versatility of the concepts. Useful to experts in the field, it should also be most helpful to students seeking to understand the basic principles of connectionist learning and to engineers wanting to add neural networks in general -- and backpropagation in particular -- to their set of problem-solving methods.
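To make the subject of the synopsis concrete, here is a minimal sketch of backpropagation learning in plain Python: a one-hidden-layer sigmoid network trained by gradient descent on the classic XOR task. The network size, learning rate, and epoch count are illustrative choices, not taken from the book.

```python
import math
import random

def sigmoid(z):
    """Logistic activation, the unit used in classic backpropagation demos."""
    return 1.0 / (1.0 + math.exp(-z))

def train_xor(n_hidden=4, epochs=10000, lr=0.5, seed=0):
    """Train a 2 -> n_hidden -> 1 network on XOR by online gradient descent.

    Returns (initial squared error, final squared error, forward function).
    """
    rng = random.Random(seed)
    # Hidden-layer weights and biases, output-layer weights and bias.
    w_h = [[rng.uniform(-1, 1), rng.uniform(-1, 1)] for _ in range(n_hidden)]
    b_h = [0.0] * n_hidden
    w_o = [rng.uniform(-1, 1) for _ in range(n_hidden)]
    b_o = 0.0
    data = [((0, 0), 0), ((0, 1), 1), ((1, 0), 1), ((1, 1), 0)]

    def forward(x):
        h = [sigmoid(w[0] * x[0] + w[1] * x[1] + b) for w, b in zip(w_h, b_h)]
        y = sigmoid(sum(wo * hj for wo, hj in zip(w_o, h)) + b_o)
        return h, y

    def total_error():
        return sum((forward(x)[1] - t) ** 2 for x, t in data)

    err_before = total_error()
    for _ in range(epochs):
        for x, t in data:
            h, y = forward(x)
            # Output delta: error signal times the sigmoid derivative y(1 - y).
            d_o = (y - t) * y * (1 - y)
            for j in range(n_hidden):
                # Hidden delta: output delta propagated back through w_o[j].
                d_h = d_o * w_o[j] * h[j] * (1 - h[j])
                w_o[j] -= lr * d_o * h[j]
                w_h[j][0] -= lr * d_h * x[0]
                w_h[j][1] -= lr * d_h * x[1]
                b_h[j] -= lr * d_h
            b_o -= lr * d_o
    return err_before, total_error(), forward
```

The delta terms implement the chain rule: the error derivative at the output is pushed backward through each weight to obtain the hidden-layer gradients, which is the core idea the book's theory section develops from several perspectives.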

Additional information

Includes content by:
  • Richard Durbin
  • Richard Golden
  • Yves Chauvin
  • Alexander Waibel
  • Charles Schley
Place of publication:
  • Hillsdale, NJ
Publication year:
  • 1995
