Cherkassky V., Mulier F. Learning from Data. Concepts, Theory, and Methods

IEEE Press/John Wiley, 2007. — 557 p.
A learning method is an algorithm (usually implemented in software) that estimates an unknown mapping (dependency) between a system’s inputs and outputs from the available data, namely from known (input, output) samples. Once such a dependency has been accurately estimated, it can be used for prediction of future system outputs from the known input values. This book provides a unified description of principles and methods for learning dependencies from data.
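The paragraph above describes the core loop of learning from data: estimate a mapping from known (input, output) samples, then apply the estimated dependency to new inputs. As a minimal sketch (not taken from the book), the following Python snippet fits a least-squares line to noisy samples with NumPy and uses the fitted mapping for prediction; the linear model and the synthetic data are assumptions chosen only for illustration.

# Minimal illustration (not from the book): estimate an unknown
# input-output dependency from (input, output) samples, then use the
# estimated mapping to predict outputs for new inputs.
import numpy as np

rng = np.random.default_rng(0)

# Known (input, output) samples produced by an "unknown" system:
# here the true dependency is y = 2x + 1, observed with noise.
x = rng.uniform(0.0, 10.0, size=50)
y = 2.0 * x + 1.0 + rng.normal(scale=0.5, size=50)

# Learning step: estimate the dependency by least-squares (degree-1 fit).
slope, intercept = np.polyfit(x, y, deg=1)

# Prediction step: apply the estimated mapping to new input values.
x_new = np.array([2.5, 7.0])
y_pred = slope * x_new + intercept
print(y_pred)  # approximately [6.0, 15.0]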
Methods for estimating dependencies from data have been traditionally explored in diverse fields such as statistics (multivariate regression and classification), engineering (pattern recognition), and computer science (artificial intelligence, machine learning, and, more recently, data mining). Recent interest in learning from data has resulted in the development of biologically motivated methodologies, such as artificial neural networks, fuzzy systems, and wavelets.
Unfortunately, developments in each field are seldom related to other fields, despite the apparent commonality of issues and methods. The mere fact that hundreds of "new" methods are being proposed each year at various conferences and in numerous journals suggests a certain lack of understanding of the basic issues common to all such methods.
The premise of this book is that there are just a handful of important principles and issues in the field of learning dependencies from data. Any researcher or practitioner in this field needs to be aware of these issues in order to successfully apply a particular methodology, understand a method’s limitations, or develop new techniques.
This book is an attempt to present and discuss such issues and principles (common to all methods) and then describe representative popular methods originating from statistics, neural networks, and pattern recognition. Often methods developed in different fields can be related to a common conceptual framework. This approach enables better understanding of a method's properties, and it has methodological advantages over traditional "cookbook" descriptions of various learning algorithms.
Contents:
  • Problem Statement, Classical Approaches, and Adaptive Learning
  • Regularization Framework
  • Statistical Learning Theory
  • Nonlinear Optimization Strategies
  • Methods for Data Reduction and Dimensionality Reduction
  • Methods for Regression
  • Classification
  • Support Vector Machines
  • Noninductive Inference and Alternative Learning Formulations
  • Concluding Remarks
  • A: Review of Nonlinear Optimization
  • B: Eigenvalues and Singular Value Decomposition