XGBoost is an advanced implementation of gradient boosting. It differs from plain gradient boosting in that it applies regularization to the objective internally, which is why XGBoost is often described as a regularized boosting technique.
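To make the regularization concrete, here is a minimal sketch (not XGBoost's actual code) of the regularized optimal leaf weight from the XGBoost objective: `w* = -G / (H + lambda)`, where `G` and `H` are the sums of the first- and second-order gradients of the loss over the instances in a leaf, and `lambda` is the L2 penalty. The function name and inputs are illustrative assumptions.

```python
import numpy as np

def optimal_leaf_weight(g, h, lam=1.0):
    # w* = -G / (H + lambda), with G = sum(g), H = sum(h).
    # lam is the L2 regularization term; lam = 0 recovers the
    # unregularized gradient-boosting leaf weight.
    return -np.sum(g) / (np.sum(h) + lam)

g = np.array([0.5, -0.2, 0.3])  # first-order gradients in one leaf
h = np.ones(3)                  # squared error: hessian is 1 per instance

w = optimal_leaf_weight(g, h, lam=1.0)         # -0.6 / 4  = -0.15
w_strong = optimal_leaf_weight(g, h, lam=10.0) # shrunk toward zero
```

Increasing `lam` shrinks leaf weights toward zero, which is the internal regularization the paragraph above refers to.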
Pros
- It is much faster than classic gradient boosting implementations.
- XGBoost allows users to define custom optimization objectives and evaluation criteria.
- XGBoost has built-in handling of missing values.
Cons
- Harder to interpret than a single decision tree.
- Prone to overfitting, especially on small or noisy datasets.
- Harder to tune because of its many hyperparameters.