Categories: Boosting, Classification, Decision Tree, Ensemble Technique, Machine Learning, Regression

January 14, 2021

Why is XGBoost so popular?

The following advantages of XGBoost contribute to its popularity:

  1. An exception to the bias-variance trade-off rule, i.e. bias and variance can be reduced at the same time
  2. Computationally cost-effective, as it supports multithreading and parallel processing
  3. Allows hardware optimizations
  4. Allows early stopping (see the first sketch after this list)
  5. Built-in regularization (L1 and L2 penalties on the leaf weights)
  6. Unaffected by multicollinearity. This is especially relevant given the vulnerability of Decision Trees to multicollinearity.
  7. Robust to outliers
  8. Robust to missing values, for which it learns a default split direction (demonstrated in the first sketch after this list)
  9. Adept at handling sparse datasets. Feature engineering steps like one-hot encoding make data sparse, and XGBoost uses a sparsity-aware split finding algorithm that exploits the sparsity patterns in the data (see the second sketch after this list)
  10. Handles noise adeptly in classification problems
  11. Works for both regression and classification problems
  12. Works with both categorical and continuous variables.
  13. The package is just seven years old, and new features are added every day.
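
Points 4, 5, and 8 are easy to see in code. Below is a minimal sketch using the xgboost Python package's scikit-learn wrapper (assuming xgboost >= 1.6, where `early_stopping_rounds` is a constructor argument); the synthetic dataset and hyperparameter values are arbitrary illustrations, not recommendations:

```python
import numpy as np
import xgboost as xgb
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split

# Toy dataset; shapes and seeds are illustrative only
X, y = make_classification(n_samples=1000, n_features=20, random_state=42)

# Inject missing values -- XGBoost learns a default split direction
# for NaNs instead of requiring imputation
rng = np.random.default_rng(0)
X[rng.random(X.shape) < 0.05] = np.nan

X_train, X_valid, y_train, y_valid = train_test_split(
    X, y, test_size=0.2, random_state=42
)

model = xgb.XGBClassifier(
    n_estimators=500,          # upper bound; early stopping may use fewer trees
    learning_rate=0.1,
    reg_alpha=0.1,             # built-in L1 penalty on leaf weights
    reg_lambda=1.0,            # built-in L2 penalty on leaf weights
    n_jobs=-1,                 # multithreading across all available cores
    early_stopping_rounds=20,  # stop once validation loss stops improving
    eval_metric="logloss",
)
model.fit(X_train, y_train, eval_set=[(X_valid, y_valid)], verbose=False)
print("Best iteration:", model.best_iteration)
```

Early stopping here acts as a safeguard: `n_estimators` is set generously, and training halts as soon as 20 consecutive rounds fail to improve the validation log-loss.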
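The sparsity point (9) can be sketched just as briefly: XGBoost's `DMatrix` accepts SciPy CSR matrices directly, so a mostly-zero matrix such as the output of one-hot encoding never has to be densified. The matrix dimensions and parameters below are made up for illustration:

```python
import numpy as np
import scipy.sparse as sp
import xgboost as xgb

rng = np.random.default_rng(0)

# A mostly-zero CSR matrix, as one-hot encoding typically produces
X = sp.random(1000, 500, density=0.02, format="csr", random_state=0)
y = rng.integers(0, 2, size=1000)

# DMatrix consumes the CSR matrix as-is; the sparsity-aware split
# finder only visits the stored (non-default) entries
dtrain = xgb.DMatrix(X, label=y)
params = {"objective": "binary:logistic"}
booster = xgb.train(params, dtrain, num_boost_round=50)
```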

By Monis Khan
