Introduction to Statistical Machine Learning
By Masashi Sugiyama

Product Description

Promotional Information

Bridges the gap between theory and practice by providing a concise, general introduction to machine learning that covers a wide range of topics.

Table of Contents

Part I: Introduction to Statistics and Probability
1. Random variables and probability distributions
2. Examples of discrete probability distributions
3. Examples of continuous probability distributions
4. Multi-dimensional probability distributions
5. Examples of multi-dimensional probability distributions
6. Random sample generation from arbitrary probability distributions
7. Probability distributions of the sum of independent random variables
8. Probability inequalities
9. Statistical inference
10. Hypothesis testing

Part II: Generative Approach to Statistical Pattern Recognition
11. Fundamentals of statistical pattern recognition
12. Criteria for developing classifiers
13. Maximum likelihood estimation
14. Theoretical properties of maximum likelihood estimation
15. Linear discriminant analysis
16. Model selection for maximum likelihood estimation
17. Maximum likelihood estimation for Gaussian mixture model
18. Bayesian inference
19. Numerical computation in Bayesian inference
20. Model selection in Bayesian inference
21. Kernel density estimation
22. Nearest neighbor density estimation

Part III: Discriminative Approach to Statistical Machine Learning
23. Fundamentals of statistical machine learning
24. Learning Models
25. Least-squares regression
26. Constrained least-squares regression
27. Sparse regression
28. Robust regression
29. Least-squares classification
30. Support vector classification
31. Ensemble classification
32. Probabilistic classification
33. Structured classification

Part IV: Further Topics
34. Outlier detection
35. Unsupervised dimensionality reduction
36. Clustering
37. Online learning
38. Semi-supervised learning
39. Supervised dimensionality reduction
40. Transfer learning
41. Multi-task learning

About the Author

Masashi Sugiyama received the degrees of Bachelor of Engineering, Master of Engineering, and Doctor of Engineering in Computer Science from the Tokyo Institute of Technology, Japan, in 1997, 1999, and 2001, respectively. In 2001, he was appointed Assistant Professor at the same institute and was promoted to Associate Professor in 2003. He moved to the University of Tokyo as Professor in 2014. From 2003 to 2004 he held an Alexander von Humboldt Foundation Research Fellowship and conducted research at the Fraunhofer Institute in Berlin, Germany. In 2006, he received a European Commission Program Erasmus Mundus Scholarship and conducted research at the University of Edinburgh, UK. He received the Faculty Award from IBM in 2007 for his contribution to machine learning under non-stationarity, the Nagao Special Researcher Award from the Information Processing Society of Japan in 2011, and the Young Scientists' Prize from the Commendation for Science and Technology by the Minister of Education, Culture, Sports, Science and Technology of Japan for his contribution to the density-ratio paradigm of machine learning. His research interests include theories and algorithms of machine learning and data mining, and a wide range of applications such as signal processing, image processing, and robot control.

Reviews

"The probabilistic and statistical background is well presented, providing the reader with a complete coverage of the generative approach to statistical pattern recognition and the discriminative approach to statistical machine learning." --Zentralblatt MATH
