Cisek P., Drew T., Kalaska J.F. (eds.) Computational Neuroscience. Theoretical Insights into Brain Function

Elsevier, 2007. — 571 p.
In recent years, computational approaches have become an increasingly prominent and influential part of neuroscience research. From the cellular mechanisms of synaptic transmission and the generation of action potentials, to interactions among networks of neurons, to the high-level processes of perception and memory, computational models provide new sources of insight into the complex machinery which underlies our behaviour. These models are not merely mathematical surrogates for experimental data. More importantly, they help us to clarify our understanding of a particular nervous system process or function, and to guide the design of our experiments by obliging us to express our hypotheses in a language of mathematical formalisms. A mathematical model is an explicit hypothesis, in which we must incorporate all of our beliefs and assumptions in a rigorous and coherent conceptual framework that is subject to falsification and modification. Furthermore, a successful computational model is a rich source of predictions for future experiments. Even a simplified computational model can offer insights that unify phenomena across different levels of analysis, linking cells to networks and networks to behaviour. Over the last few decades, more and more experimental data have been interpreted from computational perspectives, new courses and graduate programs have been developed to teach computational neuroscience methods, and a multitude of interdisciplinary conferences and symposia have been organized to bring mathematical theorists and experimental neuroscientists together.
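To make concrete what a "simplified computational model" of action-potential generation might look like, here is a minimal leaky integrate-and-fire sketch. It is not taken from the book; the function name simulate_lif and all parameter values are illustrative assumptions that a real study would fit to data.

```python
# Minimal sketch (not from the book) of a leaky integrate-and-fire neuron.
# All parameters are illustrative placeholders, not values from any chapter.
import numpy as np

def simulate_lif(input_current, dt=1e-4, tau=0.02, v_rest=-0.065,
                 v_threshold=-0.050, v_reset=-0.065, resistance=1e7):
    """Integrate dV/dt = (-(V - v_rest) + R*I) / tau and record spike times."""
    v = v_rest
    spike_times = []
    for step, current in enumerate(input_current):
        v += (-(v - v_rest) + resistance * current) * dt / tau
        if v >= v_threshold:          # threshold crossing stands in for a spike
            spike_times.append(step * dt)
            v = v_reset               # reset the membrane after the spike
    return spike_times

# Example: a constant 2 nA input for 0.5 s yields a regular spike train.
spikes = simulate_lif(np.full(5000, 2e-9))
print(f"{len(spikes)} spikes, first at {spikes[0]:.4f} s" if spikes else "no spikes")
```

The final lines report the number and timing of threshold crossings, the model's stand-in for action potentials.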
This book is the result of one such symposium, held at the Université de Montréal on May 8–9, 2006. It was organized by the Groupe de Recherche sur le Système Nerveux Central (GRSNC) as one of a series of annual international symposia held on a different topic each year. This was the first symposium in that annual series that focused on computational neuroscience, and it included presentations by some of the pioneers of computational neuroscience as well as prominent experimental neuroscientists whose research is increasingly integrated with computational modeling. The symposium was a resounding success, and it made clear to us that computational models have become a major and very exciting aspect of neuroscience research. Many of the participants at that meeting have contributed chapters to this book, including symposium speakers and poster presenters. In addition, we invited a number of other well-known computational neuroscientists, who could not participate in the symposium itself, to also submit chapters.
Of course, a collection of 34 chapters cannot cover more than a fraction of the vast range of computational approaches which exist. We have done our best to include work pertaining to a variety of neural systems, at many different levels of analysis, from the cellular to the behavioural, from approaches intimately tied to neural data to more abstract algorithms of machine learning. The result is a collection which includes models of signal transduction along dendrites, circuit models of visual processing, computational analyses of vestibular processing, theories of motor control and learning, machine algorithms for pattern recognition, as well as many other topics. We asked all of our contributors to address their chapters to a broad audience of neuroscientists, psychologists, and mathematicians, and to focus on the theoretical issues which tie these fields together.
The neuronal transfer function: contributions from voltage- and time-dependent mechanisms.
A simple growth model constructs critical avalanche networks.
The dynamics of visual responses in the primary visual cortex.
A quantitative theory of immediate visual recognition.
Attention in hierarchical models of object recognition.
Towards a unified theory of neocortex: laminar cortical circuits for vision and cognition.
Real-time neural coding of memory.
Beyond timing in the auditory brainstem: intensity coding in the avian cochlear nucleus angularis.
Neural strategies for optimal processing of sensory signals.
Coordinate transformations and sensory integration in the detection of spatial orientation and self-motion: from models to experiments.
Sensorimotor optimization in higher dimensions.
How tightly tuned are network parameters? Insight from computational and experimental studies in small rhythmic motor networks.
Spatial organization and state-dependent mechanisms for respiratory rhythm and pattern generation.
Modeling a vertebrate motor system: pattern generation, steering and control of body orientation.
Modeling the mammalian locomotor CPG: insights from mistakes and perturbations.
The neuromechanical tuning hypothesis.
Threshold position control and the principle of minimal interaction in motor actions.
Modeling sensorimotor control of human upright stance.
Dimensional reduction in sensorimotor systems: a framework for understanding muscle coordination of posture.
Primitives, premotor drives, and pattern generation: a combined computational and neuroethological perspective.
A multi-level approach to understanding upper limb function.
How is somatosensory information used to adapt to changes in the mechanical environment?
Trial-by-trial motor adaptation: a window into elemental neural computation.
Towards a computational neuropsychology of action.
Motor control in a meta-network with attractor dynamics.
Computing movement geometry: a step in sensory-motor transformations.
Dynamics systems vs. optimal control — a unifying view.
The place of ‘codes’ in nonlinear neurodynamics.
From a representation of behavior to the concept of cognitive syntax: a theoretical framework.
A parallel framework for interactive behavior.
Statistical models for neural encoding, decoding, and optimal stimulus design.
Probabilistic population codes and the exponential family of distributions.
On the challenge of learning complex functions.
To recognize shapes, first learn to generate images.