Springer, 2004. — 244 p.
Since the outstanding and pioneering research work of Hopfield on recurrent neural networks (RNNs) in the early 1980s, neural networks have rekindled strong interest among scientists and researchers. Recent years have recorded remarkable advances in research and development work on RNNs, in both theory and practical applications. The field of RNNs is now maturing into a complete and independent subject. From theory to application, from software to hardware, new and exciting results are emerging day after day, reflecting the keen interest RNNs have instilled in everyone, from researchers to practitioners.
RNNs contain feedback connections among the neurons, a feature which has led rather naturally to RNNs being regarded as dynamical systems. RNNs can be described by continuous time differential systems, discrete time systems, or functional differential systems, and more generally, in terms of nonlinear systems. Thus, RNNs have at their disposal a huge set of mathematical tools from dynamical systems theory, which has turned out to be very useful in enabling a rigorous analysis of RNNs.
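For concreteness, a representative continuous-time model of the Hopfield type can be written (in generic notation chosen here for illustration, not necessarily that of the book) as

\dot{x}_i(t) = -c_i x_i(t) + \sum_{j=1}^{n} a_{ij} g_j(x_j(t)) + I_i, \qquad i = 1, \ldots, n,

where x_i is the state of neuron i, c_i > 0 is a self-decay rate, the weights a_{ij} encode the feedback connections among the neurons, g_j are the activation functions, and I_i are external inputs. It is the feedback terms a_{ij} g_j(x_j(t)) that make the network recurrent and hence a genuine dynamical system.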
RNNs have found many applications, including associative memory, image processing, pattern recognition, etc. Their dynamical properties play a crucial role in enabling these successful practical applications. From the engineering point of view, neural network models which possess simple structures and well understood dynamical properties are of particular interest for practical applications. It is essential to understand these dynamical properties in order to be able to choose and design effective control parameters.
Among the dynamical properties, convergence is one of the most important issues of concern in a practical application. In fact, the viability of many applications of RNNs depends strongly on their convergence properties. Without a proper understanding of the convergence properties of RNNs, many of these applications would not be possible. Furthermore, convergence conditions are essential for the actual design of RNNs.
Essentially, the study of convergence of RNNs broadly seeks to derive conditions which can guarantee that a RNN will be convergent. Since these conditions are used in the design of RNNs, simple and relaxed convergence conditions which can be easily verified are the most amenable to practical use. Although many RNNs are described in terms of nonlinear systems, few of the existing results in nonlinear stability theory can meet these requirements. In fact, certain problems have arisen in which RNNs pose new challenges to mathematicians.
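To illustrate the flavor of such easily verified conditions, consider the Hopfield-type model above with activations g_j that are Lipschitz with constants L_j. A classical textbook-style sufficient condition for global convergence to a unique equilibrium (stated here only as an illustration, not as a result quoted from this book) is

c_j > L_j \sum_{i=1}^{n} |a_{ij}|, \qquad j = 1, \ldots, n,

a diagonal-dominance-type inequality that can be checked directly from the network weights, which is exactly the kind of criterion a designer can apply in practice.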
This book is focused mainly on the study of the convergence of RNNs. The RNN models explored in the book include Hopfield RNNs, cellular RNNs, Lotka-Volterra RNNs, RNNs with unsaturating activation functions, and discrete time RNNs. RNNs with time delays will be of particular interest, as it is well known that delays are important and commonly occurring parameters in RNNs. In biological neural networks, real neurons have integrative time delays due to capacitive effects. In artificial neural networks, time delays are inherently present due to hardware characteristics such as switching delays, parameter variability, parasitic capacitance, etc. RNNs with delays have found applications in the processing and compression of images, among others. Besides the direct applications, it is also useful to analyze networks with delays since delays can drastically change the dynamics of a RNN: a stable RNN may become unstable when a small time delay is added. The convergence analysis of RNNs with delays aims to develop conditions for convergence which are robust to the destabilising effects of delays.
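A commonly studied delayed variant, again written in generic illustrative notation, augments the model with delayed feedback terms:

\dot{x}_i(t) = -c_i x_i(t) + \sum_{j=1}^{n} a_{ij} g_j(x_j(t)) + \sum_{j=1}^{n} b_{ij} g_j(x_j(t - \tau_{ij})) + I_i,

where b_{ij} are the delayed connection weights and \tau_{ij} \ge 0 are the transmission delays. Conditions that guarantee convergence independently of the size of the \tau_{ij}, or for all delays below an explicit bound, are the typical goals of such an analysis.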
Two kinds of convergence of RNNs will be studied in this book: monostability and multistability. Both of these convergence characteristics have direct implications for the applications of RNNs. Three methods will be used in the study: the Lyapunov function method, the energy function method, and the inequality analysis method. The analysis focuses on the nonlinear characteristics of RNNs, and rigorous analysis will be carried out where possible for both kinds of convergence.
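As an example of the energy function method, the classical Hopfield construction (sketched here under the standard assumptions of symmetric weights a_{ij} = a_{ji} and strictly increasing activations; it illustrates the technique and is not a formula taken from this book) uses

E(x) = -\frac{1}{2} \sum_{i=1}^{n} \sum_{j=1}^{n} a_{ij} g_i(x_i) g_j(x_j) - \sum_{i=1}^{n} I_i g_i(x_i) + \sum_{i=1}^{n} c_i \int_{0}^{g_i(x_i)} g_i^{-1}(s) \, ds.

Along trajectories of the network E is nonincreasing, so bounded trajectories approach the set of equilibria; arguments of this type underlie complete convergence and multistability results.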
Hopfield Recurrent Neural Networks
Cellular Neural Networks
Recurrent Neural Networks with Unsaturating Piecewise Linear Activation Functions
Lotka-Volterra Recurrent Neural Networks with Delays
Delayed Recurrent Neural Networks with Global Lipschitz Activation Functions
Other Models of Continuous Time Recurrent Neural Networks
Discrete Recurrent Neural Networks