29,99 €
incl. VAT
Free shipping*
Ships in 6-10 days
  • Paperback

Product description
Ensemble methods are based on the idea of combining the predictions of several classifiers to achieve better generalization and to compensate for the possible weaknesses of individual predictors. We distinguish two families of methods: parallel methods (bagging, random forests), in which several predictions are averaged in the hope of a better result through the reduction of the variance of the averaged estimator, and sequential methods (boosting), in which the parameters are iteratively adapted to produce a better mixture. In this work we argue that when the members of an ensemble make different errors, the number of misclassified examples can be reduced compared to a single predictor. The resulting performance is compared using criteria such as classification rate, sensitivity, specificity, recall, etc.
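The book itself does not reproduce code here; as a rough illustration of the comparison it describes, the following scikit-learn sketch (dataset, model settings, and metric choices are our own assumptions, not taken from the work) contrasts a single decision tree with a parallel ensemble (bagging, random forest) and a sequential ensemble (AdaBoost), reporting the classification rate, sensitivity, and specificity mentioned above.

```python
# Illustrative sketch only: compare a single decision tree with parallel
# (bagging, random forest) and sequential (AdaBoost) ensembles on a synthetic
# binary classification task, then report classification rate, sensitivity
# and specificity for each model.
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeClassifier
from sklearn.ensemble import BaggingClassifier, RandomForestClassifier, AdaBoostClassifier
from sklearn.metrics import accuracy_score, confusion_matrix

# Synthetic binary dataset (placeholder for whatever data the study uses).
X, y = make_classification(n_samples=2000, n_features=20, n_informative=8, random_state=0)
X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.3, random_state=0)

models = {
    "single tree":   DecisionTreeClassifier(random_state=0),
    "bagging":       BaggingClassifier(n_estimators=100, random_state=0),        # parallel
    "random forest": RandomForestClassifier(n_estimators=100, random_state=0),   # parallel
    "adaboost":      AdaBoostClassifier(n_estimators=100, random_state=0),       # sequential
}

for name, model in models.items():
    y_pred = model.fit(X_train, y_train).predict(X_test)
    tn, fp, fn, tp = confusion_matrix(y_test, y_pred).ravel()
    accuracy = accuracy_score(y_test, y_pred)   # classification rate
    sensitivity = tp / (tp + fn)                # recall on the positive class
    specificity = tn / (tn + fp)                # recall on the negative class
    print(f"{name:13s}  acc={accuracy:.3f}  sens={sensitivity:.3f}  spec={specificity:.3f}")
```

On data like this, the ensembles typically misclassify fewer test examples than the single tree, which is the effect the description attributes to members making different errors; the exact numbers depend on the dataset and settings assumed here.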
About the author
Marcel KATULUMBA MBIYA NGANDU holds a degree in Computer Engineering from the University of Mbujimayi. Since 2018 he has been an assistant in the Department of Computer Science at the University of Mbujimayi. He is a researcher in software engineering and program construction, information systems, and artificial intelligence.