Nov 4, 2012

AdaBoost

Boosting is a technique for improving the performance of any given learning algorithm. It generally works by sequentially learning classifiers¹ with respect to a distribution over the training data and adding them to an ensemble. When a classifier is added to the ensemble, it is typically weighted in a way that reflects its accuracy. After a classifier is added, the data is also reweighted: examples that were misclassified gain weight and examples that were classified correctly lose weight, forcing the next classifier to focus on the previously hard-to-classify data points. This basic idea has been surprisingly successful, with performance comparable to more complex methods such as Support Vector Machines. In fact, a particular boosting algorithm, AdaBoost, when used with trees as the component classifiers, has been referred to as the "best off-the-shelf classifier in the world" [3].
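To make the weighting and reweighting loop concrete, here is a minimal sketch of AdaBoost using decision stumps as the component classifiers. The stump learner, the function names, and the use of numpy are my own illustration (the post does not give code); labels are assumed to be in {-1, +1}.

```python
import numpy as np

def train_stump(X, y, w):
    """Find the single-feature threshold stump minimizing weighted error."""
    n, d = X.shape
    best = (np.inf, 0, 0.0, 1)  # (error, feature, threshold, polarity)
    for j in range(d):
        for thresh in np.unique(X[:, j]):
            for polarity in (1, -1):
                pred = np.where(polarity * (X[:, j] - thresh) >= 0, 1, -1)
                err = np.sum(w[pred != y])
                if err < best[0]:
                    best = (err, j, thresh, polarity)
    return best

def adaboost(X, y, rounds=20):
    n = len(y)
    w = np.full(n, 1.0 / n)            # start with uniform example weights
    ensemble = []
    for _ in range(rounds):
        err, j, thresh, pol = train_stump(X, y, w)
        err = max(err, 1e-10)          # guard against division by zero
        alpha = 0.5 * np.log((1 - err) / err)   # classifier weight: higher accuracy, higher alpha
        pred = np.where(pol * (X[:, j] - thresh) >= 0, 1, -1)
        # Reweight the data: misclassified examples gain weight,
        # correctly classified examples lose weight.
        w *= np.exp(-alpha * y * pred)
        w /= w.sum()
        ensemble.append((alpha, j, thresh, pol))
    return ensemble

def predict(ensemble, X):
    """Classify by the sign of the weighted vote of all stumps."""
    score = np.zeros(len(X))
    for alpha, j, thresh, pol in ensemble:
        score += alpha * np.where(pol * (X[:, j] - thresh) >= 0, 1, -1)
    return np.sign(score)
```

A quick toy run on a synthetic linearly separable problem:

```python
rng = np.random.default_rng(0)
X = rng.uniform(-1, 1, size=(200, 2))
y = np.where(X[:, 0] + 0.3 * X[:, 1] > 0, 1, -1)
model = adaboost(X, y, rounds=10)
print(np.mean(predict(model, X) == y))  # training accuracy
```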
