Revision



Definition

AdaBoost is a boosting algorithm for classification which iteratively forces the weak learners to focus on the samples that the previous learners predicted wrongly.

It does this by assigning a weight to each sample. At initialisation the weights of all samples are equal, but after each iteration the weights of the wrongly predicted samples are increased relative to those of the correctly predicted samples.

Here are the steps of AdaBoost (for binary labels \(y_i \in \{0, 1\}\)):

1. Initialise the sample weights to \(d_i = \frac{1}{n}\) for each of the \(n\) training samples.
2. For \(k = 1, \dots, n_{trees}\):
   - Fit a weak learner \(t_k\) on the training samples weighted by the \(d_i\).
   - Compute its weighted error \(\epsilon_k = \sum_i d_i \left[t_k(x_i) \neq y_i\right]\).
   - Compute its weight \(w_k = \frac{1}{2}\ln\frac{1-\epsilon_k}{\epsilon_k}\).
   - Update the sample weights \(d_i \leftarrow d_i \, e^{w_k \left[t_k(x_i) \neq y_i\right]}\), then renormalise so that \(\sum_i d_i = 1\).

Where:

- \(t_k\) is the \(k\)-th weak learner and \(w_k\) its weight in the final vote,
- \(\epsilon_k\) is the weighted training error of \(t_k\),
- \(d_i\) is the weight of sample \(i\) and \(\left[\cdot\right]\) denotes the indicator function.
The final prediction for an unseen sample \(x\) is:

\[T_A(x)=\left[\frac{1}{\sum_{k=1}^{n_{trees}} w_k}\sum_{k=1}^{n_{trees}} w_k \, t_k(x) \geq 0.5\right]\]
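The training loop and the final weighted vote can be sketched as follows. This is a minimal sketch, assuming binary 0/1 labels and scikit-learn decision stumps as weak learners; the function names `adaboost_fit` and `adaboost_predict` are illustrative, not a library API:

```python
import numpy as np
from sklearn.tree import DecisionTreeClassifier

def adaboost_fit(X, y, n_trees=10):
    """Fit AdaBoost with decision stumps; y must contain 0/1 labels."""
    n = len(X)
    d = np.full(n, 1.0 / n)                  # sample weights, initially equal
    trees, tree_weights = [], []
    for _ in range(n_trees):
        stump = DecisionTreeClassifier(max_depth=1)
        stump.fit(X, y, sample_weight=d)
        miss = stump.predict(X) != y         # indicator [t_k(x_i) != y_i]
        eps = d[miss].sum()                  # weighted error of this learner
        if eps <= 0:                         # perfect learner: large weight, stop early
            trees.append(stump)
            tree_weights.append(10.0)
            break
        w = 0.5 * np.log((1 - eps) / eps)    # learner weight w_k
        trees.append(stump)
        tree_weights.append(w)
        d *= np.exp(w * miss)                # boost weights of misclassified samples
        d /= d.sum()                         # renormalise so the weights sum to 1
    return trees, np.array(tree_weights)

def adaboost_predict(trees, tree_weights, X):
    """Weighted majority vote over the weak learners' 0/1 predictions."""
    votes = sum(w * t.predict(X) for t, w in zip(trees, tree_weights))
    return (votes / tree_weights.sum() >= 0.5).astype(int)
```

On a toy separable dataset the first stump may already be perfect, in which case the loop stops early; on harder data the weight updates force later stumps to concentrate on the samples the earlier ones got wrong.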


AdaBoost as a Gradient Boosting algorithm

Using the exponential loss \(L(h(X), Y)=e^{-h(X)Y}\) (with labels in \(\{-1, 1\}\)) and a learning rate \(\lambda=1\), it can be shown that Gradient Boosting is almost equivalent to AdaBoost.
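One way to see the connection: the pseudo-residuals of Gradient Boosting are the negative gradients of the loss with respect to the current model \(h\), and for the exponential loss

\[-\frac{\partial L}{\partial h(x_i)} = y_i \, e^{-y_i h(x_i)}\]

so each pseudo-residual is the label \(y_i \in \{-1, 1\}\) scaled by \(e^{-y_i h(x_i)}\), which plays the role of the sample weight AdaBoost maintains (up to normalisation): samples the current model gets wrong, i.e. with \(y_i h(x_i) < 0\), receive exponentially larger weight.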

See:


Resources

See: