Knox S.W. Machine Learning: a Concise Introduction

  • pdf file
  • size 16.13 MB
Newark: John Wiley & Sons, Incorporated, 2018. — 330 p.
Organization.
How to Use This Book.
About the Companion Website.
Introduction.
Examples from Real Life.
The Problem of Learning.
Domain.
Range.
Data.
Loss.
Risk.
The Reality of the Unknown Function.
Training and Selection of Models, and Purposes of Learning.
Regression.
General Framework.
Loss.
Estimating the Model Parameters.
Properties of Fitted Values.
Estimating the Variance.
A Normality Assumption.
Computation.
Categorical Features.
Feature Transformations, Expansions, and Interactions.
Variations in Linear Regression.
Nonparametric Regression.
Survey of Classification Techniques.
The Bayes Classifier.
Introduction to Classifiers.
A Running Example.
Likelihood Methods.
Prototype Methods.
Logistic Regression.
Neural Networks.
Classification Trees.
Support Vector Machines.
Postscript: Example Problem Revisited.
Bias-Variance Trade-off.
Squared-Error Loss.
Arbitrary Loss.
Combining Classifiers.
Ensembles.
Ensemble Design.
Bootstrap Aggregation (Bagging).
Bumping.
Random Forests.
Boosting.
Arcing.
Stacking and Mixture of Experts.
Risk Estimation and Model Selection.
Risk Estimation via Training Data.
Risk Estimation via Validation or Test Data.
Cross-Validation.
Improvements on Cross-Validation.
Out-of-Bag Risk Estimation.
Akaike's Information Criterion.
Schwartz's Bayesian Information Criterion.
Rissanen's Minimum Description Length Criterion.
R² and Adjusted R².
Stepwise Model Selection.
Occam's Razor.
Consistency.
Convergence of Sequences of Random Variables.
Consistency for Parameter Estimation.
Consistency for Prediction.
There Are Consistent and Universally Consistent Classifiers.
Convergence to Asymptopia Is Not Uniform and May Be Slow.
Clustering.
Gaussian Mixture Models.
k-Means.
Clustering by Mode-Hunting in a Density Estimate.
Using Classifiers to Cluster.
Dissimilarity.
k-Medoids.
Agglomerative Hierarchical Clustering.
Divisive Hierarchical Clustering.
How Many Clusters Are There?
Interpretation of Clustering.
An Impossibility Theorem.
Optimization.
Quasi-Newton Methods.
The Nelder-Mead Algorithm.
Simulated Annealing.
Genetic Algorithms.
Particle Swarm Optimization.
General Remarks on Optimization.
The Expectation-Maximization Algorithm.
High-Dimensional Data.
The Curse of Dimensionality.
Two Running Examples.
Reducing Dimension While Preserving Information.
Model Regularization.
Communication with Clients.