
# XGBoostLSS - An extension of XGBoost to probabilistic modelling

We introduce a comprehensive framework that models and predicts the full conditional distribution of univariate and multivariate targets as a function of covariates. Choosing from a wide range of continuous, discrete, and mixed discrete-continuous distributions, modelling and predicting the entire conditional distribution greatly enhances the flexibility of XGBoost: it yields probabilistic forecasts from which prediction intervals and quantiles of interest can be derived.
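As a quick illustration, the following is a minimal sketch of training and prediction with a Gaussian response, modelled on the repository's example notebooks. Names such as `Gaussian`, `pred_type`, and `quantiles` follow those examples, but please verify them against the documentation for your installed version.

```python
import numpy as np
import xgboost as xgb
from xgboostlss.model import XGBoostLSS
from xgboostlss.distributions.Gaussian import Gaussian

# Toy data: y depends on x in both mean and variance.
rng = np.random.default_rng(123)
X = rng.uniform(size=(1000, 1))
y = rng.normal(loc=10 * X[:, 0], scale=1 + 5 * X[:, 0])

dtrain = xgb.DMatrix(X, label=y)

# Specify the distributional assumption: all Gaussian parameters
# (loc and scale) are modelled as functions of the covariates.
xgblss = XGBoostLSS(Gaussian())

# Train with plain XGBoost parameters.
xgblss.train({"eta": 0.1, "max_depth": 3}, dtrain, num_boost_round=100)

# Predict quantiles of the conditional distribution for new data.
dtest = xgb.DMatrix(X[:5])
pred_quantiles = xgblss.predict(
    dtest, pred_type="quantiles", n_samples=1000, quantiles=[0.05, 0.95]
)
print(pred_quantiles.head())
```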

## Features

✅ Estimation of all distributional parameters.
✅ Normalizing Flows allow modelling of complex and multi-modal distributions.
✅ Mixture-Densities can model a diverse range of data characteristics.
✅ Multi-target regression allows modelling of multivariate responses and their dependencies.
✅ Zero-Adjusted and Zero-Inflated Distributions for modelling an excess of zeros in the data.
✅ Automatic derivation of Gradients and Hessians of all distributional parameters using PyTorch.
✅ Automated hyper-parameter search, including pruning, is done via Optuna (see the sketch after this list).
✅ The output of XGBoostLSS is explained using SHapley Additive exPlanations.
✅ XGBoostLSS provides full compatibility with all the features and functionality of XGBoost.
✅ XGBoostLSS is available in Python.
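As a hedged illustration of the Optuna-based search mentioned above, the sketch below follows the `hyper_opt` call and search-space format used in the project's example notebooks; argument names such as `n_trials`, `max_minutes`, and the returned `opt_rounds` entry are taken from those examples and may differ between versions.

```python
# Continues the quick-start sketch above (xgblss and dtrain already defined).
# Each entry maps a parameter to its type and search range, following the
# format used in the example notebooks.
param_dict = {
    "eta":       ["float", {"low": 1e-5, "high": 1.0, "log": True}],
    "max_depth": ["int",   {"low": 1,    "high": 10,  "log": False}],
}

# Runs an Optuna study with cross-validation and pruning of weak trials.
opt_param = xgblss.hyper_opt(
    param_dict,
    dtrain,
    num_boost_round=100,       # max boosting rounds per trial
    nfold=5,                   # CV folds
    early_stopping_rounds=20,
    max_minutes=10,            # hard time budget for the study
    n_trials=30,               # number of Optuna trials
    silence=True,
)

# Train the final model with the best parameters found.
n_rounds = opt_param.pop("opt_rounds")
xgblss.train(opt_param, dtrain, num_boost_round=n_rounds)
```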

## News

💥 [2024-01-19] Release of XGBoostLSS to PyPI.
💥 [2023-08-25] Release of v0.4.0 introduces Mixture-Densities. See the release notes for an overview.
💥 [2023-07-19] Release of v0.3.0 introduces Normalizing Flows. See the release notes for an overview.
💥 [2023-06-22] Release of v0.2.2. See the release notes for an overview.
💥 [2023-06-21] XGBoostLSS now supports multi-target regression.
💥 [2023-06-07] XGBoostLSS now supports Zero-Inflated and Zero-Adjusted Distributions.
💥 [2023-05-26] Release of v0.2.1. See the release notes for an overview.
💥 [2023-05-18] Release of v0.2.0. See the release notes for an overview.
💥 [2021-12-22] XGBoostLSS now supports estimating the full predictive distribution via Expectile Regression.
💥 [2021-12-20] XGBoostLSS is initialized with suitable starting values to improve convergence of estimation.
💥 [2021-12-04] XGBoostLSS now supports automatic derivation of Gradients and Hessians.
💥 [2021-12-02] XGBoostLSS now supports pruning during hyperparameter optimization.
💥 [2021-11-14] XGBoostLSS v0.1.0 is released!

## Installation

To install the development version, please use

```bash
pip install git+https://github.com/StatMixedML/XGBoostLSS.git
```

For the PyPI version, please use

```bash
pip install xgboostlss
```

## Available Distributions

Our framework is built upon PyTorch and Pyro, enabling users to harness a diverse set of distributional families. Please see the documentation for the full list of currently supported distributions.
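To give a flavour of how a family is selected, the sketch below swaps the response distribution by passing a different distribution object to `XGBoostLSS`. The module paths follow the naming pattern of the repository's `xgboostlss.distributions` package; treat the class names and constructor arguments here as assumptions and check the documentation for the exact API.

```python
# Assumed module paths, following the package's <DistributionName> pattern;
# verify against the documentation before use.
from xgboostlss.model import XGBoostLSS
from xgboostlss.distributions.StudentT import StudentT      # heavy-tailed response
from xgboostlss.distributions.ZIPoisson import ZIPoisson    # zero-inflated counts
from xgboostlss.distributions.SplineFlow import SplineFlow  # normalizing flow

# Heavy-tailed continuous response.
model_t = XGBoostLSS(StudentT())

# Count data with an excess of zeros.
model_zip = XGBoostLSS(ZIPoisson())

# Flexible, potentially multi-modal response via a spline-based flow.
model_nf = XGBoostLSS(SplineFlow(target_support="real", count_bins=8))
```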

## How to Use

Please visit the example section for guidance on how to use the framework.

## Documentation

For more information and context, please visit the documentation.

## Feedback

We encourage you to provide feedback on how to enhance XGBoostLSS or request the implementation of additional distributions by opening a new discussion.

## How to Cite

If you use XGBoostLSS in your research, please cite it as:

```bibtex
@misc{Maerz2023,
  author = {Alexander M\"arz},
  title = {{XGBoostLSS: An Extension of XGBoost to Probabilistic Modelling}},
  year = {2023},
  note = {GitHub repository, Version 0.4.0},
  howpublished = {\url{https://github.com/StatMixedML/XGBoostLSS}}
}
```

## Reference Paper

März, Alexander (2022): *Multi-Target XGBoostLSS Regression*. [arXiv:2210.06831](https://arxiv.org/abs/2210.06831).
März, Alexander and Kneib, Thomas (2022): *Distributional Gradient Boosting Machines*. [arXiv:2204.00778](https://arxiv.org/abs/2204.00778).
März, Alexander (2019): *XGBoostLSS: An extension of XGBoost to probabilistic forecasting*. [arXiv:1907.03178](https://arxiv.org/abs/1907.03178).
