Prentice Hall, 1999. — 864 p.
Neural Networks, or artificial neural networks to be more precise, represent a technology that is rooted in many disciplines: neurosciences, mathematics, statistics, physics, computer science, and engineering. Neural networks find applications in such diverse fields as modeling, time series analysis, pattern recognition, signal processing, and control by virtue of an important property: the ability to learn from input data with or without a teacher.
This book provides a comprehensive foundation of neural networks, recognizing the multidisciplinary nature of the subject. The material presented in the book is supported with examples, computer-oriented experiments, end-of-chapter problems, and a bibliography.
The book concludes with an epilogue that briefly describes the role of neural networks in the construction of intelligent machines for pattern recognition, control, and signal processing.
The organization of the book offers a great deal of flexibility for use in graduate courses on neural networks. The final selection of topics can only be determined by the interests of the instructors using the book. To help in this selection process, a study guide is included in the accompanying manual.
Learning Processes
Single Layer Perceptrons
Multilayer Perceptrons
Radial-Basis Function Networks
Support Vector Machines
Committee Machines
Principal Components Analysis
Self-Organizing Maps
Information-Theoretic Models
Stochastic Machines and Their Approximates Rooted in Statistical Mechanics
Neurodynamic Programming
Temporal Processing Using Feedforward Networks
Neurodynamics
Dynamically Driven Recurrent Networks