AdaBoost:

  • First really successful boosting algorithm, developed for binary classification.
  • Modern boosting methods build on AdaBoost, most notably stochastic gradient boosting machines.

AdaBoost creates stumps (decision trees with a single split) and uses each stump's accuracy to assign a weight to its vote in the final prediction. Training examples that are hard to predict are given more weight, and models are created sequentially, each one focusing on the examples its predecessors got wrong.

Because so much attention is put on correcting the algorithm's mistakes, it is important to use clean data with outliers removed; a noisy or mislabeled example keeps getting upweighted and can distort the whole ensemble.
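As a rough sketch of that cleaning step, one common heuristic is to drop rows with extreme z-scores before fitting the booster. The threshold of 3 standard deviations and the synthetic data are assumptions for illustration, not part of AdaBoost itself:

```python
# Illustrative outlier filter: keep rows whose features all lie within
# 3 standard deviations of the column mean (threshold is a common
# heuristic, not a fixed rule).
import numpy as np

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 3))
X[0] = [10.0, -12.0, 9.0]  # inject one obvious outlier row

z = np.abs((X - X.mean(axis=0)) / X.std(axis=0))
mask = (z < 3).all(axis=1)  # True for rows with no extreme feature
X_clean = X[mask]

print(X.shape[0] - X_clean.shape[0])  # number of rows removed
```

The filtered `X_clean` (with the matching rows of the labels) is what you would then pass to the boosting model.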