The Elements of Statistical Learning, 2nd Edition (统计学习基础-第2版)

Author: Hastie

Format: 24开

ISBN: 9787510084508

List price: CNY 119.00

Publication date: 2015-01-01

Publisher: 世界图书出版公司 (World Publishing Corporation)

The Elements of Statistical Learning, 2nd Edition: Highlights

This book is part of the Springer Series in Statistics and aims to give readers an in-depth understanding of data mining and prediction. With the rapid development of computing and information technology, vast amounts of data are being generated in medicine, biology, finance, marketing, and other fields, and processing these data and uncovering the relationships within them has become especially important for statisticians. The book explains the key ideas from these fields within a common theoretical framework, emphasizing methods and conceptual foundations rather than theoretical properties, and presenting the statistics with a focus on concepts rather than mathematics. In addition, the book's many color figures help readers better understand the concepts and theory.

Contents: introduction; overview of supervised learning; linear regression models; linear methods for classification; basis expansions and regularization; kernel methods; model assessment and selection; model inference and averaging; additive models, trees, and related methods; neural networks; support vector machines and flexible discriminants; prototype methods and nearest neighbors; unsupervised learning.

The Elements of Statistical Learning, 2nd Edition: Table of Contents

preface to the second edition
preface to the first edition
1 introduction
2 overview of supervised learning
 2.1 introduction
 2.2 variable types and terminology
 2.3 two simple approaches to prediction: least squares and nearest neighbors
  2.3.1 linear models and least squares
  2.3.2 nearest-neighbor methods
  2.3.3 from least squares to nearest neighbors
 2.4 statistical decision theory
 2.5 local methods in high dimensions
 2.6 statistical models, supervised learning and function approximation
  2.6.1 a statistical model for the joint distribution pr(x,y)
  2.6.2 supervised learning
  2.6.3 function approximation
 2.7 structured regression models
  2.7.1 difficulty of the problem
 2.8 classes of restricted estimators
  2.8.1 roughness penalty and bayesian methods
  2.8.2 kernel methods and local regression
  2.8.3 basis functions and dictionary methods
 2.9 model selection and the bias-variance tradeoff
 bibliographic notes
 exercises
3 linear methods for regression
 3.1 introduction
 3.2 linear regression models and least squares
  3.2.1 example: prostate cancer
  3.2.2 the gauss-markov theorem
  3.2.3 multiple regression from simple univariate regression
  3.2.4 multiple outputs
 3.3 subset selection
  3.3.1 best-subset selection
  3.3.2 forward- and backward-stepwise selection
  3.3.3 forward-stagewise regression
  3.3.4 prostate cancer data example (continued)
 3.4 shrinkage methods
  3.4.1 ridge regression
  3.4.2 the lasso
  3.4.3 discussion: subset selection, ridge regression and the lasso
  3.4.4 least angle regression
 3.5 methods using derived input directions
  3.5.1 principal components regression
  3.5.2 partial least squares
 3.6 discussion: a comparison of the selection and shrinkage methods
 3.7 multiple outcome shrinkage and selection
 3.8 more on the lasso and related path algorithms
  3.8.1 incremental forward stagewise regression
  3.8.2 piecewise-linear path algorithms
  3.8.3 the dantzig selector
  3.8.4 the grouped lasso
  3.8.5 further properties of the lasso
  3.8.6 pathwise coordinate optimization
 3.9 computational considerations
 bibliographic notes
 exercises
 ……
4 linear methods for classification
5 basis expansions and regularization
6 kernel smoothing methods
7 model assessment and selection
8 model inference and averaging
9 additive models, trees, and related methods
10 boosting and additive trees
11 neural networks
12 support vector machines and flexible discriminants
13 prototype methods and nearest-neighbors
14 unsupervised learning
15 random forests
16 ensemble learning
17 undirected graphical models
18 high-dimensional problems: p ≫ n
references
author index
index
