Wiley-IEEE Press, 2007, 681 p., ISBN: 047005669X The Handbook of Neural Engineering provides theoretical foundations in computational neural science and engineering and current applications in wearable and implantable neural sensors/probes. Inside, leading experts from diverse disciplinary groups representing academia, industry, and private and government organizations present...
Neural Computation 22, 1–32 (2010). Neurons receive thousands of presynaptic input spike trains while emitting a single output spike train. This drastic dimensionality reduction suggests considering a neuron as a bottleneck for information transmission. Extending recent results, we propose a simple learning rule for the weights of spiking neurons derived from the information...
Paper. — Appl. Sci., 2023. — Vol. 13, № 4475. — P. 1–33. Several machine learning (ML) methodologies are gaining popularity as artificial intelligence (AI) becomes increasingly prevalent. An artificial neural network (ANN) may be used as a “black-box” modeling strategy without the need for a detailed physical model of the system. It is more reasonable to solely use the input and...
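To make the “black-box” idea concrete, here is a minimal sketch (not taken from the cited paper) in which a small ANN is fitted purely to input/output samples of an unknown system; scikit-learn's MLPRegressor, the network size, and the synthetic data-generating function are illustrative assumptions.

```python
# Minimal sketch of "black-box" ANN modeling: fit a neural network to
# input/output measurements only, with no physical model of the system.
# (Illustrative only; not the setup used in the cited paper.)
import numpy as np
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(0)

# Pretend these are measured input/output pairs of an unknown system.
X = rng.uniform(-1.0, 1.0, size=(500, 2))             # inputs (2 features)
y = np.sin(3 * X[:, 0]) + 0.5 * X[:, 1] ** 2           # outputs (law unknown to the model)
y += 0.05 * rng.normal(size=y.shape)                   # measurement noise

# The ANN sees only (X, y); it never sees the generating equation above.
model = MLPRegressor(hidden_layer_sizes=(32, 32), max_iter=2000, random_state=0)
model.fit(X, y)

X_new = rng.uniform(-1.0, 1.0, size=(5, 2))
print(model.predict(X_new))                            # black-box predictions
```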
This paper proposes a new neocognitron that accepts incremental learning without severely damaging old memories or reducing learning speed. The new neocognitron uses competitive learning, and the learning of all stages of the hierarchical network progresses simultaneously. To increase the learning speed, conventional neocognitrons of recent versions sacrificed the...
The author previously proposed a neural network model, the neocognitron, for robust visual pattern recognition. This paper proposes an improved version of the neocognitron and demonstrates its ability using a large database of handwritten digits (ETL1). To improve the recognition rate of the neocognitron, several modifications have been applied, such as the inhibitory surround in...
Rijeka: InTech, 2011. — 586 p. — ISBN: 978-953-307-188-6. This book covers 27 articles on the applications of artificial neural networks (ANN) in various disciplines, including business, chemical technology, computing, engineering, environmental science, science and nanotechnology. The authors modeled the ANN with verification in different areas. They demonstrated that the ANN...
NVIDIA, 2019. Publication details not specified. Abstract: We propose an alternative generator architecture for generative adversarial networks, borrowing from style transfer literature. The new architecture leads to an automatically learned, unsupervised separation of high-level attributes (e.g., pose and identity when trained on human faces) and stochastic variation in the...
Neural Computation 21, 911–959 (2009). Independent component analysis (or blind source separation) is assumed to be an essential component of sensory processing in the brain and could provide a less redundant representation of the external world. Another powerful processing strategy is the optimization of internal representations according to the information bottleneck...
Proceedings of the 1988 Connectionist Models Summer School, pages 21-28, CMU, Pittsburgh, PA, 1988. Abstract: Among all the supervised learning algorithms, back propagation (BP) is probably the most widely used. Although numerous experimental works have demonstrated its capabilities, a deeper theoretical understanding of the algorithm is definitely needed. We present a...
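For readers who want the mechanics alongside the theory, a textbook-style sketch of back propagation for a one-hidden-layer network follows; the network size, learning rate, loss, and synthetic data are arbitrary illustrative choices, not taken from the cited analysis.

```python
# Textbook back-propagation for a one-hidden-layer network (NumPy sketch);
# illustrative only, not the analysis presented in the cited paper.
import numpy as np

rng = np.random.default_rng(1)
X = rng.normal(size=(200, 3))                   # inputs
t = (X.sum(axis=1, keepdims=True) > 0) * 1.0    # binary targets

W1 = 0.1 * rng.normal(size=(3, 8)); b1 = np.zeros(8)
W2 = 0.1 * rng.normal(size=(8, 1)); b2 = np.zeros(1)
sigmoid = lambda z: 1.0 / (1.0 + np.exp(-z))

lr = 0.5
for epoch in range(500):
    # forward pass
    h = sigmoid(X @ W1 + b1)
    y = sigmoid(h @ W2 + b2)
    # backward pass: gradients of the mean squared error
    dy = (y - t) * y * (1 - y) / len(X)
    dW2 = h.T @ dy;  db2 = dy.sum(axis=0)
    dh = dy @ W2.T * h * (1 - h)
    dW1 = X.T @ dh;  db1 = dh.sum(axis=0)
    # gradient-descent update
    W2 -= lr * dW2; b2 -= lr * db2
    W1 -= lr * dW1; b1 -= lr * db1

print("training error:", float(np.mean((y - t) ** 2)))
```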
Neural Computation, 9 (1997). We show that networks of relatively realistic mathematical models for biological neurons can in principle simulate arbitrary feedforward sigmoidal neural nets in a way that has previously not been considered. This new approach is based on temporal coding by single spikes (respectively, by the timing of synchronous firing in pools of neurons) rather...
Neural Computation 8, 1-40 (1996) We investigate the computational power of a formal model for networks of spiking neurons. It is shown that simple operations on phase differences between spike-trains provide a very powerful computational tool that can in principle be used to carry out highly complex computations on a small network of spiking neurons. We construct networks of...
Network: Comput. Neural Syst. 8 (1997) 355–371. A theoretical model for analogue computation in networks of spiking neurons with temporal coding is introduced and tested through simulations in GENESIS. It turns out that the use of multiple synapses yields very noise robust mechanisms for analogue computations via the timing of single spikes in networks of detailed compartmental...
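As a rough illustration of the coding scheme discussed here, the following sketch encodes an analogue value in the timing of a single spike and shows how reading that timing through several jittered synapses and averaging improves robustness; the 10 ms window, the noise level, and the linear time code are assumed values, and this is not the GENESIS model of the paper.

```python
# Sketch of temporal coding of an analogue value: x in [0, 1] is encoded as the
# time of a single spike; reading the timing through several noisy synapses and
# averaging makes the decoded value more robust to jitter.
import numpy as np

T = 10.0e-3                       # coding window: 10 ms
encode = lambda x: T * (1.0 - x)  # larger value -> earlier spike
decode = lambda t: 1.0 - t / T

rng = np.random.default_rng(2)
x = 0.7
t_spike = encode(x)

jitter = 0.5e-3                                   # 0.5 ms timing noise per synapse
for n_syn in (1, 4, 16):
    arrival = t_spike + rng.normal(0.0, jitter, size=n_syn)
    x_hat = decode(arrival.mean())                # average over the synapses
    print(f"{n_syn:2d} synapses -> decoded x = {x_hat:.3f} (true {x})")
```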
Neural Computation 13, 2477–2494 (2001) Experimental data have shown that synapses are heterogeneous: different synapses respond with different sequences of amplitudes of postsynaptic responses to the same spike train. Neither the role of synaptic dynamics itself nor the role of the heterogeneity of synaptic dynamics for computations in neural circuits is well understood. We...
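One common phenomenological model of a dynamic synapse (a Markram/Tsodyks-style description with utilization U, recovery time D, and facilitation time F) makes the heterogeneity easy to see: different parameter settings yield different amplitude sequences for the same spike train. The sketch below uses generic placeholder parameter values, not those analyzed in the paper.

```python
# Dynamic synapse sketch: the response amplitude to each spike depends on the
# whole preceding spike train, so synapses with different (U, D, F) produce
# different amplitude sequences for the same input.  Illustrative only.
import numpy as np

def psp_amplitudes(spike_times, U, D, F, A=1.0):
    """Postsynaptic response amplitude for each spike in `spike_times` (seconds)."""
    u, R = U, 1.0
    amps = [A * u * R]
    for dt in np.diff(spike_times):
        u = u * np.exp(-dt / F) + U * (1.0 - u * np.exp(-dt / F))    # facilitation
        R = R * (1.0 - u) * np.exp(-dt / D) + 1.0 - np.exp(-dt / D)  # depression/recovery
        amps.append(A * u * R)
    return np.array(amps)

spikes = np.arange(5) * 0.020                          # regular 50 Hz train, 5 spikes
print(psp_amplitudes(spikes, U=0.5, D=1.1, F=0.05))    # depressing synapse
print(psp_amplitudes(spikes, U=0.1, D=0.1, F=1.0))     # facilitating synapse
```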
Hamburg Institute of International Economics, 2018. — 17 p. Artificial neural networks have become increasingly popular for statistical model fitting in recent years, mainly owing to increasing computational power. In this paper, an introduction to the use of artificial neural network (ANN) regression models is given. The problem of predicting the GDP growth rate of 15...
Article. — Bulletin of Mathematical Biology Vol. 52, No. 1/2. - 1990. - P. 99-115. Because of the “all-or-none” character of nervous activity, neural events and the relations among them can be treated by means of propositional logic. It is found that the behavior of every net can be described in these terms, with the addition of more complicated logical means for nets...
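The “all-or-none” treatment can be illustrated with threshold units whose firing realizes propositional connectives directly; the following minimal sketch is an informal illustration, not the formal calculus of the paper.

```python
# McCulloch-Pitts-style "all-or-none" units: a neuron fires (1) iff the weighted
# sum of its binary inputs reaches a threshold, so basic propositional
# connectives can be read off directly.
def mp_neuron(inputs, weights, threshold):
    return int(sum(w * x for w, x in zip(weights, inputs)) >= threshold)

AND = lambda a, b: mp_neuron((a, b), (1, 1), 2)
OR  = lambda a, b: mp_neuron((a, b), (1, 1), 1)
NOT = lambda a:    mp_neuron((a,),   (-1,),  0)

for a in (0, 1):
    for b in (0, 1):
        print(a, b, "AND:", AND(a, b), "OR:", OR(a, b), "NOT a:", NOT(a))
```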
IEEE Transactions on Neural Networks. — Volume 1, No. 1, March 1990. — Pages 4-27. In the literature, a large variety of neural nets has been proposed, all capable of modeling the dynamic behavior of a system. In this paper, a neural net is used to build a predictor for such a dynamical system. This neural net is then used in a model-based predictive control...
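A minimal sketch of the predictor-plus-optimizer structure: a one-step predictor (here a hand-written stand-in for a trained neural net) is rolled forward over a short horizon, and the input with the lowest predicted tracking cost is applied at each step. The plant, horizon, and candidate grid are assumptions for illustration, not the scheme of the cited paper.

```python
# Model-based predictive control with a learned one-step predictor
# y[k+1] = f(y[k], u[k]); brute-force search over candidate inputs.
import numpy as np

def predictor(y, u):               # stand-in for a trained neural-net model
    return 0.8 * y + 0.4 * np.tanh(u)

def mpc_step(y, reference, horizon=5, candidates=np.linspace(-2, 2, 41)):
    best_u, best_cost = 0.0, np.inf
    for u in candidates:           # try each constant input over the horizon
        y_pred, cost = y, 0.0
        for _ in range(horizon):
            y_pred = predictor(y_pred, u)
            cost += (y_pred - reference) ** 2
        if cost < best_cost:
            best_u, best_cost = u, cost
    return best_u

y, reference = 0.0, 1.0
for k in range(15):                # receding-horizon loop on the "real" plant
    u = mpc_step(y, reference)
    y = 0.8 * y + 0.4 * np.tanh(u) + 0.01 * np.random.randn()
    print(f"k={k:2d}  u={u:+.2f}  y={y:.3f}")
```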
Theoretical Computer Science 287 (2002). — P. 251–265. We discuss in this short survey article some current mathematical models from neurophysiology for the computational units of biological neural systems: neurons and synapses. These models are contrasted with the computational units of common artificial neural network models, which reflect the state of knowledge in...
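As one example of the neurophysiological unit models such surveys discuss, a leaky integrate-and-fire neuron can be simulated in a few lines; the parameters below are generic placeholders and the sketch is only meant to contrast with the memoryless weighted-sum units of artificial nets.

```python
# Leaky integrate-and-fire neuron, simple Euler-integration sketch.
dt, tau = 1e-4, 0.02                         # time step (s), membrane time constant (s)
v_rest, v_thresh, v_reset = -0.065, -0.050, -0.065
r_m, i_input = 1e7, 2.0e-9                   # membrane resistance (ohm), input current (A)

v, spike_times = v_rest, []
for step in range(int(0.2 / dt)):            # simulate 200 ms
    v += dt * (-(v - v_rest) + r_m * i_input) / tau
    if v >= v_thresh:                        # threshold crossing -> emit a spike
        spike_times.append(step * dt)
        v = v_reset
print(f"{len(spike_times)} spikes, first at {spike_times[0] * 1e3:.1f} ms")
```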
Article. — Acta Mechanica et Automatica. — 2008. — No. 4. — P. 81-85. This paper presents an adaptive neural network approach to the control of mechatronic objects. This approach is applied in adaptive control of a DC motor in a SISO system and of 3-DOF robot arm actuators in a MIMO system. Results of computer simulation and a comparison with other control techniques are presented.
Institute for Theoretical Computer Science, Graz University of Technology The principles by which spiking neurons contribute to the astounding computational power of generic cortical microcircuits, and how spike-timing-dependent plasticity (STDP) of synaptic weights could generate and maintain this computational function, are unknown. We show here that STDP, in conjunction with...
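For orientation, the standard pair-based STDP window looks as follows in code: potentiation when the presynaptic spike precedes the postsynaptic one, depression otherwise, with exponentially decaying magnitude in the time difference. The amplitudes and time constants are placeholder values, not the specific plasticity model analyzed in this work.

```python
# Pair-based STDP window sketch.
import numpy as np

A_PLUS, A_MINUS = 0.01, 0.012        # learning-rate amplitudes
TAU_PLUS, TAU_MINUS = 0.020, 0.020   # time constants (20 ms)

def stdp_dw(t_pre, t_post):
    dt = t_post - t_pre
    if dt >= 0:                      # pre before post -> potentiation
        return A_PLUS * np.exp(-dt / TAU_PLUS)
    return -A_MINUS * np.exp(dt / TAU_MINUS)   # post before pre -> depression

w = 0.5
for t_pre, t_post in [(0.010, 0.015), (0.050, 0.048), (0.100, 0.101)]:
    w += stdp_dw(t_pre, t_post)
    print(f"pre={t_pre * 1e3:.0f} ms  post={t_post * 1e3:.0f} ms  w={w:.4f}")
```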
Building effective classification systems is a central task in data mining and machine learning. Usually, a classification algorithm builds a model from a given set of data records in which the labels are known, and later the learned model is used to assign labels to new data points. Applications of such a classification setting abound in many fields, for instance in text...
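A bare-bones version of this setting, with a nearest-centroid rule standing in for whatever classifier a real system would use; the synthetic data and the rule itself are illustrative assumptions.

```python
# Learn a model from labelled records, then use it to label new points.
import numpy as np

rng = np.random.default_rng(3)
X_train = np.vstack([rng.normal(0, 1, (50, 2)), rng.normal(4, 1, (50, 2))])
y_train = np.array([0] * 50 + [1] * 50)          # known labels

# "Training": one centroid per class.
centroids = np.stack([X_train[y_train == c].mean(axis=0) for c in (0, 1)])

def predict(X):
    dists = np.linalg.norm(X[:, None, :] - centroids[None, :, :], axis=2)
    return dists.argmin(axis=1)                  # label of the nearest centroid

X_new = np.array([[0.5, 0.2], [3.8, 4.1]])
print(predict(X_new))                            # labels for new data points
```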
Neural Networks for Signal Processing IV: Proceedings of the 1994 IEEE Workshop. The paper presents a contribution to the analysis of the use of wavelet transfer functions in neural network systems and a discussion of some possible learning algorithms for such structures. Wavelets' local properties in both the time and frequency domains are stated first, giving motivation for...
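A rough sketch of a wavelet-network unit: a dilated and translated mother wavelet replaces the usual sigmoid, so the unit is localised in both time and frequency. The real Morlet-style wavelet below is one common choice, and the translations/dilations are fixed here rather than learned; none of this is taken from the cited paper.

```python
# One "wavelon": output = weight * psi((x - translation) / dilation).
import numpy as np

def morlet(t):
    return np.cos(5.0 * t) * np.exp(-0.5 * t ** 2)   # localised oscillation

def wavelon(x, translation, dilation, weight=1.0):
    return weight * morlet((x - translation) / dilation)

x = np.linspace(-4, 4, 9)
print(wavelon(x, translation=0.0, dilation=1.0))      # response of one unit
print(wavelon(x, translation=1.0, dilation=0.5))      # shifted, narrower unit
```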
Technical Report IDSIA-03-14 / arXiv:1404.7828 v4 [cs.NE] (88 p., 888 references). The Swiss AI Lab IDSIA. Istituto Dalle Molle di Studi sull’Intelligenza Artificiale. University of Lugano & SUPSI. Galleria 2, 6928 Manno-Lugano. Switzerland. 8 October 2014. Abstract. In recent years, deep artificial neural networks (including recurrent ones) have won numerous contests in...
Neural Computation 21, 2502–2523 (2009). From a theoretical point of view, statistical inference is an attractive model of brain operation. However, it is unclear how to implement these inferential processes in neuronal networks. We offer a solution to this problem by showing in detailed simulations how the belief propagation algorithm on a factor graph can be embedded in a...
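To recall what belief propagation computes, here is a tiny sum-product example on a chain factor graph, checked against brute-force marginalization; the factor values are arbitrary, and the sketch says nothing about the neuronal embedding proposed in the paper.

```python
# Sum-product BP on the chain x1 -- f12 -- x2 -- f23 -- x3 (binary variables):
# the marginal of x2 from message passing matches brute-force summation.
import numpy as np

phi1 = np.array([0.6, 0.4])                  # unary factors (unnormalised)
phi2 = np.array([0.5, 0.5])
phi3 = np.array([0.2, 0.8])
psi12 = np.array([[0.9, 0.1], [0.2, 0.8]])   # pairwise factor psi12[x1, x2]
psi23 = np.array([[0.7, 0.3], [0.4, 0.6]])   # pairwise factor psi23[x2, x3]

# Messages into x2 from its two neighbouring factors.
m_f12_to_x2 = (phi1[:, None] * psi12).sum(axis=0)   # sum over x1
m_f23_to_x2 = (psi23 * phi3[None, :]).sum(axis=1)   # sum over x3

belief_x2 = phi2 * m_f12_to_x2 * m_f23_to_x2
belief_x2 /= belief_x2.sum()

# Brute-force check of the same marginal.
joint = (phi1[:, None, None] * phi2[None, :, None] * phi3[None, None, :]
         * psi12[:, :, None] * psi23[None, :, :])
marginal_x2 = joint.sum(axis=(0, 2))
marginal_x2 /= marginal_x2.sum()

print(belief_x2, marginal_x2)                # the two vectors agree
```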
Google Research, Brain Team, Mountain View, CA. Preprint, to appear in ICML 2019. Abstract. Convolutional Neural Networks (ConvNets) are commonly developed at a fixed resource budget, and then scaled up for better accuracy if more resources are available. In this paper, we systematically study model scaling and identify that carefully balancing network depth, width, and...
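The compound scaling rule mentioned here can be written down in a few lines: depth, width, and input resolution are scaled jointly by a single coefficient phi, with the constraint that FLOPs grow roughly as 2^phi. The alpha/beta/gamma values below are those reported for EfficientNet; the baseline depth, width, and resolution are placeholders for some baseline network.

```python
# Compound scaling: depth ~ alpha**phi, width ~ beta**phi, resolution ~ gamma**phi,
# with alpha * beta**2 * gamma**2 ~= 2 so that FLOPs grow roughly as 2**phi.
ALPHA, BETA, GAMMA = 1.2, 1.1, 1.15

def compound_scale(phi, base_depth=18, base_width=1.0, base_resolution=224):
    depth = round(base_depth * ALPHA ** phi)            # number of layers
    width = base_width * BETA ** phi                     # channel multiplier
    resolution = round(base_resolution * GAMMA ** phi)   # input image size
    return depth, width, resolution

for phi in range(0, 5):
    d, w, r = compound_scale(phi)
    print(f"phi={phi}: depth={d}, width x{w:.2f}, resolution={r}")
```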
Springer, 2010. — 921 p. This book is a part of the Proceedings of the Seventh International Symposium on Neural Networks (ISNN 2010), held on June 6-9, 2010 in Shanghai, China. Over the past few years, ISNN has matured into a well-established premier international symposium on neural networks and related fields, with a successful sequence of ISNN series in Dalian (2004),...