XGBoost (Extreme Gradient Boosting) is an advanced implementation of gradient boosting. It differs from plain gradient boosting in that it applies regularization internally, inside its training objective, which is why XGBoost is often referred to as a regularized boosting technique.
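To make "regularized" concrete: for a fixed tree structure, XGBoost sets each leaf weight from the summed gradients G and Hessians H of the loss, with an L2 penalty lambda in the denominator, w* = -G / (H + lambda). A minimal pure-Python sketch of that formula (not the library itself):

```python
def leaf_weight(grad_sum, hess_sum, lam):
    """Optimal leaf weight in XGBoost's regularized objective:
    w* = -G / (H + lambda).

    The L2 penalty `lam` (reg_lambda in the library) shrinks the
    weight toward zero -- this is the built-in regularization that
    distinguishes XGBoost from plain gradient boosting.
    """
    return -grad_sum / (hess_sum + lam)

# Without regularization the leaf weight is larger in magnitude:
w_plain = leaf_weight(grad_sum=6.0, hess_sum=3.0, lam=0.0)  # -2.0
w_reg = leaf_weight(grad_sum=6.0, hess_sum=3.0, lam=1.0)    # -1.5
```

Larger lambda values pull leaf weights toward zero, trading a little training fit for smoother, less overfit trees.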

Pros

  1. It is considerably faster than a standard gradient boosting implementation.

  2. XGBoost allows users to define custom optimization objectives and evaluation criteria.

  3. XGBoost has built-in techniques to handle missing values.
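As an illustration of point 2, XGBoost's Python API accepts a custom objective as a function that returns the per-row gradient and Hessian of the loss with respect to the predictions. A hedged sketch of squared error written in that form (the exact callback signature varies between the native and scikit-learn APIs; this shows only the grad/hess shape):

```python
import numpy as np

def squared_error_objective(preds, labels):
    """Custom objective in the grad/hess form XGBoost expects.

    For L = 0.5 * (pred - label)^2:
      gradient = pred - label
      hessian  = 1 (constant second derivative)
    """
    grad = preds - labels
    hess = np.ones_like(preds)
    return grad, hess

# Derivatives at two illustrative prediction/label pairs:
g, h = squared_error_objective(np.array([2.0, 0.5]), np.array([1.0, 1.0]))
```

Any twice-differentiable loss can be plugged in the same way, which is what makes the custom-objective feature practical.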

Cons

  1. The learned ensemble is difficult to interpret compared to a single decision tree.

  2. It can overfit, especially on small or noisy datasets, if not tuned carefully.

  3. It exposes many hyperparameters, which makes it harder to tune.
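The overfitting and tuning concerns are usually managed through XGBoost's regularization-related hyperparameters. The parameter names below match the library's documented options, but the values are illustrative starting points, not tuned recommendations:

```python
# Illustrative starting configuration; the numbers are assumptions
# to be tuned per dataset (e.g. via cross-validation).
params = {
    "max_depth": 4,           # shallower trees resist overfitting
    "learning_rate": 0.1,     # (eta) smaller steps, more boosting rounds
    "subsample": 0.8,         # sample rows each boosting round
    "colsample_bytree": 0.8,  # sample features per tree
    "reg_lambda": 1.0,        # L2 penalty on leaf weights
    "reg_alpha": 0.0,         # L1 penalty on leaf weights
    "n_estimators": 200,      # number of boosting rounds
}
```

Tuning typically proceeds by searching over a few of these at a time (depth and learning rate first, then the sampling and penalty terms).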