Conformal prediction is a valuable new method of machine learning. Conformal predictors are among the most accurate machine learning methods and, unlike other state-of-the-art methods, they provide information about their own accuracy and reliability.
This new monograph integrates mathematical theory and revealing experimental work. It demonstrates mathematically the validity of the reliability claimed by conformal predictors when they are applied to independent and identically distributed data, and it confirms experimentally that the accuracy is sufficient for many practical problems. Later chapters generalize these results to models called repetitive structures, which originate in the algorithmic theory of randomness and statistical physics. The approach is flexible enough to incorporate most existing methods of machine learning, including newer methods such as boosting and support vector machines and older methods such as nearest neighbors and the bootstrap.
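As a rough illustration of how an underlying machine-learning method can be plugged into this framework, the following is a minimal sketch of split (inductive) conformal regression with a 1-nearest-neighbour predictor and absolute residuals as the nonconformity measure; the function name, synthetic data, and parameter choices are illustrative assumptions, not material taken from the book.

```python
# A minimal sketch (not from the book) of split conformal regression,
# using a 1-nearest-neighbour point predictor as the underlying method.
import numpy as np

def split_conformal_interval(X_train, y_train, X_cal, y_cal, x_new, alpha=0.1):
    """Return a prediction interval for x_new with coverage about 1 - alpha,
    assuming the data are exchangeable (e.g. i.i.d.)."""
    def nn_predict(x):
        # 1-nearest-neighbour prediction on the proper training set
        i = np.argmin(np.linalg.norm(X_train - x, axis=1))
        return y_train[i]

    # Nonconformity scores on the calibration set: absolute residuals
    scores = np.array([abs(y - nn_predict(x)) for x, y in zip(X_cal, y_cal)])

    # Conformal quantile: the ceil((n+1)(1-alpha))-th smallest score
    n = len(scores)
    k = int(np.ceil((n + 1) * (1 - alpha)))
    q = np.sort(scores)[min(k, n) - 1]

    y_hat = nn_predict(x_new)
    return y_hat - q, y_hat + q

# Example usage with synthetic data
rng = np.random.default_rng(0)
X = rng.normal(size=(200, 2))
y = X[:, 0] + 0.1 * rng.normal(size=200)
lo, hi = split_conformal_interval(X[:100], y[:100], X[100:], y[100:],
                                  x_new=np.array([0.5, -0.2]))
print(f"90% prediction interval: [{lo:.2f}, {hi:.2f}]")
```

Any point predictor (boosting, support vector machines, the bootstrap, and so on) could replace the nearest-neighbour rule here; the validity guarantee comes from the conformal calibration step, not from the underlying method.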
Topics and Features:
* Describes how conformal predictors yield accurate and reliable predictions, complemented with quantitative measures of their accuracy and reliability
* Handles both classification and regression problems
* Explains how to apply the new algorithms to real-world data sets
* Demonstrates the infeasibility of some standard prediction tasks
* Explains connections with Kolmogorov's algorithmic randomness, recent work in machine learning, and older work in statistics
* Develops new methods of probability forecasting and shows how to use them for prediction in causal networks
Researchers in computer science, statistics, and artificial intelligence will find the book an authoritative and rigorous treatment of some of the most promising new developments in machine learning. Practitioners and students in all areas of research that use quantitative prediction or machine learning will learn about important new methods.