pgbm

Probabilistic Gradient Boosting Machines

Probabilistic Gradient Boosting Machines (PGBM) is a probabilistic gradient boosting framework in Python based on PyTorch/Numba, developed by Airlab in Amsterdam. It provides the following advantages over existing frameworks:

  • Probabilistic regression estimates instead of only point estimates. (example)
  • Auto-differentiation of custom loss functions; see the sketch below this list. (example, example)
  • Native GPU-acceleration. (example)
  • Distributed training for CPU and GPU, across multiple nodes. (examples)
  • Ability to optimize probabilistic estimates after training for a set of common distributions, without retraining the model. (example)
  • Full integration with scikit-learn through a fork of HistGradientBoostingRegressor. (examples)
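
To illustrate the auto-differentiation point above, here is a minimal sketch using the Torch backend. It assumes pgbm.torch.PGBM's train/predict_dist API and a 'derivatives': 'approx' option that switches gradient and hessian computation to autograd; consult the linked examples for the exact, current API.

from pgbm.torch import PGBM
import torch
from sklearn.datasets import fetch_california_housing
from sklearn.model_selection import train_test_split

def huber_objective(yhat, y, sample_weight=None):
    # Return only the loss: with derivatives='approx' (assumed option name),
    # PGBM obtains the gradient and hessian via torch autograd rather than
    # requiring analytic expressions.
    d = yhat - y
    delta = 1.0
    return torch.where(d.abs() <= delta, 0.5 * d ** 2, delta * (d.abs() - 0.5 * delta))

def rmse_metric(yhat, y, sample_weight=None):
    return (yhat - y).pow(2).mean().sqrt()

X, y = fetch_california_housing(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.1)

model = PGBM()
model.train((X_train, y_train), objective=huber_objective, metric=rmse_metric,
            params={'derivatives': 'approx'})
yhat_dist = model.predict_dist(X_test)  # probabilistic estimates per test point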

It is aimed at users interested in solving large-scale tabular probabilistic regression problems, such as probabilistic time series forecasting.

For more details, read the docs or our paper, or check out the examples.

Below is a simple example that generates 1000 estimates for each of our test points:

from pgbm.sklearn import HistGradientBoostingRegressor
from sklearn.model_selection import train_test_split
from sklearn.datasets import fetch_california_housing

X, y = fetch_california_housing(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.1)
# Fit exactly as with scikit-learn's HistGradientBoostingRegressor
model = HistGradientBoostingRegressor().fit(X_train, y_train)
# Point estimate plus a standard deviation for each test point
yhat_test, yhat_test_std = model.predict(X_test, return_std=True)
# Draw 1000 samples per test point from the fitted distribution
yhat_dist = model.sample(yhat_test, yhat_test_std, n_estimates=1000)
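
The sampled estimates can then be summarized into prediction intervals with plain NumPy. A minimal sketch, assuming yhat_dist holds one row per estimate (shape (n_estimates, n_test_points)):

import numpy as np

# 80% prediction interval per test point; transpose yhat_dist first if your
# version returns estimates along the other axis.
lower, upper = np.quantile(yhat_dist, [0.1, 0.9], axis=0)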

See also this example, in which we compare PGBM to standard gradient boosting quantile regression methods and demonstrate comparable or better probabilistic performance whilst training only a single model.

Installation

See the Installation section in our docs.

Support

In general, PGBM works similarly to existing gradient boosting packages such as LightGBM or XGBoost, and it should be possible to use it more or less as a drop-in replacement.
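
For instance, because pgbm.sklearn is a fork of scikit-learn's HistGradientBoostingRegressor, the familiar estimator parameters should carry over. A minimal sketch (parameter support assumed from the upstream scikit-learn estimator):

from pgbm.sklearn import HistGradientBoostingRegressor

# Constructor parameters mirror scikit-learn's HistGradientBoostingRegressor
# (assumed to carry over unchanged in the fork).
model = HistGradientBoostingRegressor(
    max_iter=200,        # number of boosting iterations
    learning_rate=0.05,
    max_leaf_nodes=31,
    early_stopping=True,
)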

If you require further support, open an issue.

Reference

Olivier Sprangers, Sebastian Schelter, Maarten de Rijke. Probabilistic Gradient Boosting Machines for Large-Scale Probabilistic Regression. Proceedings of the 27th ACM SIGKDD Conference on Knowledge Discovery and Data Mining (KDD '21), August 14–18, 2021, Virtual Event, Singapore.

The experiments from our paper can be replicated by running the scripts in the experiments folder. Datasets are downloaded when needed by the experiments, except for Higgs and M5, which should be pre-downloaded and saved to the datasets folder (Higgs) and to datasets/m5 (M5).

License

This project is licensed under the terms of the Apache 2.0 license.

Acknowledgements

This project was developed by Airlab Amsterdam.
