Jordan M.I. (ed.) Learning in Graphical Models
Springer, 1998. — 623 p.
Graphical models are a marriage between probability theory and graph theory. They provide a natural tool for dealing with two problems that occur throughout applied mathematics and engineering – uncertainty and complexity – and in particular they are playing an increasingly important role in the design and analysis of machine learning algorithms. Fundamental to the idea of a graphical model is the notion of modularity – a complex system is built by combining simpler parts. Probability theory provides the glue whereby the parts are combined, ensuring that the system as a whole is consistent, and providing ways to interface models to data. The graph-theoretic side of graphical models provides both an intuitively appealing interface by which humans can model highly interacting sets of variables and a data structure that lends itself naturally to the design of efficient general-purpose algorithms.
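For illustration (a sketch not drawn from the book), the following Python snippet shows this factorization idea on a small directed graphical model: a three-node Bayesian network whose joint distribution is assembled from local conditional probability tables, with a posterior obtained by brute-force marginalization. The network structure and all probability values are assumptions made for the example.

```python
# Illustrative sketch (not from the book): a three-variable Bayesian network
# Rain -> Sprinkler, Rain -> WetGrass, Sprinkler -> WetGrass.
# The joint distribution is built "modularly" from local conditional tables,
# one per node given its parents; probability theory glues them together.

p_rain = {True: 0.2, False: 0.8}                    # P(Rain)
p_sprinkler = {True: {True: 0.01, False: 0.99},     # P(Sprinkler | Rain=True)
               False: {True: 0.40, False: 0.60}}    # P(Sprinkler | Rain=False)
p_wet = {(True, True): 0.99, (True, False): 0.90,   # P(WetGrass=True | Sprinkler, Rain)
         (False, True): 0.80, (False, False): 0.00}

def joint(rain, sprinkler, wet):
    """P(R, S, W) = P(R) * P(S | R) * P(W | S, R)."""
    p_w_true = p_wet[(sprinkler, rain)]
    p_w = p_w_true if wet else 1.0 - p_w_true
    return p_rain[rain] * p_sprinkler[rain][sprinkler] * p_w

# Inference by brute-force marginalization: P(Rain=True | WetGrass=True)
num = sum(joint(True, s, True) for s in (True, False))
den = sum(joint(r, s, True) for r in (True, False) for s in (True, False))
print(f"P(Rain | WetGrass) = {num / den:.3f}")
```

The book's inference chapters (junction trees, bucket elimination, variational and Monte Carlo methods) are about doing this kind of computation without the exhaustive enumeration used here.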
Many of the classical multivariate probabilistic systems studied in fields such as statistics, systems engineering, information theory, pattern recognition and statistical mechanics are special cases of the general graphical model formalism – examples include mixture models, factor analysis, hidden Markov models, Kalman filters and Ising models. The graphical model framework provides a way to view all of these systems as instances of a common underlying formalism. This has many advantages – in particular, specialized techniques that have been developed in one field can be transferred between research communities and exploited more widely. Moreover, the graphical model formalism provides a natural framework for the design of new systems.
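As a concrete instance of that claim (again a sketch with made-up parameters, not code from the book), a hidden Markov model is simply a chain-structured graphical model: its joint distribution factorizes into an initial-state factor, per-step transition factors, and per-step emission factors.

```python
# Illustrative sketch (parameter values are assumptions): an HMM viewed as a
# chain-structured graphical model. The joint over hidden states z and
# observations x factorizes as
#   P(z_1..z_T, x_1..x_T) = P(z_1) * prod_t P(z_t | z_{t-1}) * prod_t P(x_t | z_t).
import itertools

states = [0, 1]
pi = [0.6, 0.4]                                             # P(z_1)
trans = [[0.7, 0.3], [0.2, 0.8]]                            # P(z_t | z_{t-1})
emit = {0: {'a': 0.9, 'b': 0.1}, 1: {'a': 0.3, 'b': 0.7}}   # P(x_t | z_t)

def joint(z, x):
    p = pi[z[0]] * emit[z[0]][x[0]]
    for t in range(1, len(x)):
        p *= trans[z[t - 1]][z[t]] * emit[z[t]][x[t]]
    return p

# Likelihood of an observation sequence by summing out the hidden chain
# (the forward algorithm computes this without the exponential-size sum).
x = ['a', 'b', 'b']
likelihood = sum(joint(z, x) for z in itertools.product(states, repeat=len(x)))
print(f"P(x) = {likelihood:.4f}")
```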
This book presents an in-depth exploration of issues related to learning within the graphical model formalism. Four of the chapters are tutorial articles (those by Cowell, MacKay, Jordan et al., and Heckerman). The remaining articles cover a wide spectrum of topics of current research interest.
The book is divided into four main sections: Inference, Independence, Foundations for Learning, and Learning from Data. While the sections can be read independently of each other and the articles are to a large extent self-contained, there is also a logical flow to the material. A full appreciation of the material in later sections requires an understanding of the material in the earlier sections.
Part I: Inference
Introduction to Inference for Bayesian Networks
Advanced Inference in Bayesian Networks
Inference in Bayesian Networks using Nested Junction Trees
Bucket Elimination: A Unifying Framework for Probabilistic Inference
An Introduction to Variational Methods for Graphical Models
Improving the Mean Field Approximation via the Use of Mixture Distributions
Introduction to Monte Carlo Methods
Suppressing Random Walks in Markov Chain Monte Carlo using Ordered Overrelaxation
Part II: Independence
Chain Graphs and Symmetric Associations
The Multiinformation Function as a Tool for Measuring Stochastic Dependence
Part III: Foundations for Learning
A Tutorial on Learning with Bayesian Networks
A View of the EM Algorithm that Justifies Incremental, Sparse, and Other Variants
Part IV: Learning from Data
Latent Variable Models
Stochastic Algorithms for Exploratory Data Analysis: Data Clustering and Data Visualization
Learning Bayesian Networks with Local Structure
Asymptotic Model Selection for Directed Networks with Hidden Variables
A Hierarchical Community of Experts
An Information-Theoretic Analysis of Hard and Soft Assignment Methods for Clustering
Learning Hybrid Bayesian Networks from Data
A Mean Field Learning Algorithm for Unsupervised Neural Networks
Edge Exclusion Tests for Graphical Gaussian Models
Hepatitis B: A Case Study in MCMC
Prediction with Gaussian Processes: From Linear Regression to Linear Prediction and Beyond