000 | 03146aam a2200193 4500 | ||
---|---|---|---|
008 | 210209b2020 ||||| |||| 00| 0 eng d | ||
020 | _a9781138484696 | ||
082 | _a006.310727 _bG6S8 | ||
100 | _aGolden, Richard M. _91521860 | ||
245 | _aStatistical machine learning :_ba unified framework | ||
260 | _aBoca Raton :_bCRC Press,_c2020 | ||
300 | _axviii, 506 p. :_bill. | ||
440 | _aChapman & Hall/CRC texts in statistical science _91159338 | ||
504 | _aIncludes bibliographical references and index | ||
505 | _a1. A statistical machine learning framework -- 2. Set theory for concept modeling -- 3. Formal machine learning algorithms -- 4. Linear algebra for machine learning -- 5. Matrix calculus for machine learning -- 6. Convergence of time-invariant dynamical systems -- 7. Batch learning algorithm convergence -- 8. Random vectors and random functions -- 9. Stochastic sequences -- 10. Probability models of data generation -- 11. Monte Carlo Markov chain algorithm convergence -- 12. Adaptive learning algorithm convergence -- 13. Statistical learning objective function design -- 14. Simulation methods for evaluating generalization -- 15. Analytic formulas for evaluating generalization -- 16. Model selection and evaluation | ||
520 | _aThe recent rapid growth in the variety and complexity of new machine learning architectures requires the development of improved methods for designing, analyzing, evaluating, and communicating machine learning technologies. Statistical Machine Learning: A Unified Framework provides students, engineers, and scientists with tools from mathematical statistics and nonlinear optimization theory to become experts in the field of machine learning. In particular, the material in this text directly supports the mathematical analysis and design of old, new, and not-yet-invented nonlinear high-dimensional machine learning algorithms. Features: a unified empirical risk minimization framework supporting rigorous mathematical analyses of widely used supervised, unsupervised, and reinforcement machine learning algorithms; matrix calculus methods for supporting machine learning analysis and design applications; explicit conditions for ensuring convergence of adaptive, batch, minibatch, MCEM, and MCMC learning algorithms that minimize both unimodal and multimodal objective functions; explicit conditions for characterizing asymptotic properties of M-estimators and model selection criteria such as AIC and BIC in the presence of possible model misspecification. This advanced text is suitable for graduate students or highly motivated undergraduate students in statistics, computer science, electrical engineering, and applied mathematics. The text is self-contained and only assumes knowledge of lower-division linear algebra and upper-division probability theory. Students, professional engineers, and multidisciplinary scientists possessing these minimal prerequisites will find this text challenging yet accessible. https://www.routledge.com/Statistical-Machine-Learning-A-Unified-Framework/Golden/p/book/9781138484696 | ||
650 | _aMachine learning--Statistical methods _92509794 | ||
650 | _aComputer algorithms _92509795 | ||
942 | _2ddc _cBK | ||
999 | _c809948 _d809948 | ||