
Package leaves is a pure Go implementation of the prediction part of GBRT (Gradient Boosting Regression Trees) models from popular frameworks.

All loaded models exhibit the same interface from the `Ensemble` struct. The `Name` method returns a string representation of the model's origin; possible values include "lightgbm.gbdt", "lightgbm.rf", "xgboost.gbtree" and "xgboost.gblinear". The package documentation includes a binary classification example (build_breast_cancer_model.py / predict_breast_cancer_model.go) and a multiclass classification example (build_iris_model.py / predict_iris_model.go).

Please note that one must not pass nEstimators = 0 when predicting with DART models from XGBoost; see XGBoost's documentation for details. Models trained with the 'boosting_type': 'dart' option can be loaded with `leaves.LGEnsembleFromFile`, but the name of the model (given by the `Name()` method) will be 'lightgbm.gbdt', because the LightGBM model format doesn't distinguish 'gbdt' and 'dart' models.



leaves


Introduction

leaves is a library implementing prediction code for GBRT (Gradient Boosting Regression Trees) models in pure Go. The goal of the project is to make it possible to use models from popular GBRT frameworks in Go programs without C API bindings.

NOTE: Before the 1.0.0 release the API is subject to change.

Features

  • General Features:
    • support parallel predictions for batches
    • support sigmoid, softmax transformation functions
    • support getting leaf indices of decision trees
  • Support LightGBM (repo) models:
    • read models from text format and from JSON format
    • support gbdt, rf (random forest) and dart models
    • support multiclass predictions
    • additional optimizations for categorical features (for example, one hot decision rule)
    • additional optimizations exploiting prediction-only usage
  • Support XGBoost (repo) models:
    • read models from binary format
    • support gbtree, gblinear, dart models
    • support multiclass predictions
    • support missing values (nan)
  • Support scikit-learn (repo) tree models (experimental support):
    • read models from pickle format (protocol 0)
    • support sklearn.ensemble.GradientBoostingClassifier
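The parallel batch prediction listed above comes down to partitioning rows across workers. The sketch below illustrates the idea with a toy per-row function; it is a conceptual stand-in, not the library's implementation (leaves exposes this through the thread-count argument of its batch prediction methods).

```go
package main

import (
	"fmt"
	"sync"
)

// predictRow is a stand-in for a per-row model prediction (here: a sum).
func predictRow(fvals []float64) float64 {
	s := 0.0
	for _, v := range fvals {
		s += v
	}
	return s
}

// predictBatch fans rows out over nThreads goroutines, each worker writing
// to a disjoint slice of the output — the same idea behind parallel batch
// predictions (illustrative sketch, not library code).
func predictBatch(rows [][]float64, nThreads int) []float64 {
	out := make([]float64, len(rows))
	chunk := (len(rows) + nThreads - 1) / nThreads
	var wg sync.WaitGroup
	for start := 0; start < len(rows); start += chunk {
		end := start + chunk
		if end > len(rows) {
			end = len(rows)
		}
		wg.Add(1)
		go func(lo, hi int) {
			defer wg.Done()
			for i := lo; i < hi; i++ {
				out[i] = predictRow(rows[i])
			}
		}(start, end)
	}
	wg.Wait()
	return out
}

func main() {
	rows := [][]float64{{1, 2}, {3, 4}, {5, 6}}
	fmt.Println(predictBatch(rows, 2)) // [3 7 11]
}
```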

Usage examples

In order to start, go get this repository:

go get github.com/dmitryikh/leaves

Minimal example:

package main

import (
	"fmt"

	"github.com/dmitryikh/leaves"
)

func main() {
	// 1. Read model
	useTransformation := true
	model, err := leaves.LGEnsembleFromFile("lightgbm_model.txt", useTransformation)
	if err != nil {
		panic(err)
	}

	// 2. Do predictions!
	fvals := []float64{1.0, 2.0, 3.0}
	p := model.PredictSingle(fvals, 0)
	fmt.Printf("Prediction for %v: %f\n", fvals, p)
}

In order to use an XGBoost model, just change leaves.LGEnsembleFromFile to leaves.XGEnsembleFromFile.

Documentation

Documentation is hosted on godoc (link). It contains complex usage examples and a full API reference. Some additional information about usage examples can be found in leaves_test.go.

Compatibility

Most leaves features are tested to be compatible with old and upcoming versions of GBRT libraries. In compatibility.md one can find a detailed report on leaves' correctness against different versions of external GBRT libraries.

Some additional information on new features and backward compatibility can be found in NOTES.md.

Benchmark

Below are comparisons of prediction speed on batches (~1000 objects per API call). Hardware: MacBook Pro (15-inch, 2017), 2.9 GHz Intel Core i7, 16 GB 2133 MHz LPDDR3. C API implementations were called from Python bindings, but the large batch size should make the overhead of the Python bindings negligible. leaves benchmarks were run with the Go test framework: go test -bench. See benchmark for more details on measurements. See testdata/README.md for data preparation pipelines.

Single thread:

| Test Case | Features | Trees | Batch size | C API | leaves |
|---|---|---|---|---|---|
| LightGBM MS LTR | 137 | 500 | 1000 | 49ms | 51ms |
| LightGBM Higgs | 28 | 500 | 1000 | 50ms | 50ms |
| LightGBM KDD Cup 99* | 41 | 1200 | 1000 | 70ms | 85ms |
| XGBoost Higgs | 28 | 500 | 1000 | 44ms | 50ms |

4 threads:

| Test Case | Features | Trees | Batch size | C API | leaves |
|---|---|---|---|---|---|
| LightGBM MS LTR | 137 | 500 | 1000 | 14ms | 14ms |
| LightGBM Higgs | 28 | 500 | 1000 | 14ms | 14ms |
| LightGBM KDD Cup 99* | 41 | 1200 | 1000 | 19ms | 24ms |
| XGBoost Higgs | 28 | 500 | 1000 | ? | 14ms |

(?) - currently I'm unable to utilize multithreading for XGBoost predictions by means of the Python bindings

(*) - KDD Cup 99 problem involves continuous and categorical features simultaneously
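The numbers above come from go test -bench runs. For readers unfamiliar with Go's benchmark machinery, the self-contained sketch below drives the same machinery via testing.Benchmark, with a dummy prediction function standing in for a real model call (illustrative only; the project's actual benchmarks live in its benchmark directory).

```go
package main

import (
	"fmt"
	"testing"
)

// dummyPredict stands in for the model prediction call being measured.
func dummyPredict(fvals []float64) float64 {
	s := 0.0
	for _, v := range fvals {
		s += v
	}
	return s
}

func main() {
	// go test -bench normally drives benchmark functions; testing.Benchmark
	// lets us invoke the same harness directly. It calibrates b.N so the
	// timed loop runs long enough for a stable measurement.
	res := testing.Benchmark(func(b *testing.B) {
		fvals := []float64{1, 2, 3}
		for i := 0; i < b.N; i++ {
			dummyPredict(fvals)
		}
	})
	fmt.Printf("%d iterations in %v\n", res.N, res.T)
}
```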

Limitations

  • LightGBM models:
    • limited support of transformation functions (only sigmoid and softmax are supported)
  • XGBoost models:
    • limited support of transformation functions (only sigmoid and softmax are supported)
    • predictions can diverge slightly from the C API because of floating point conversions and comparison tolerances
  • scikit-learn tree models:
    • no support for transformation functions; output scores are raw scores (as from GradientBoostingClassifier.decision_function)
    • only pickle protocol 0 is supported
    • predictions can diverge slightly from sklearn because of floating point conversions and comparison tolerances

Contacts

If you are interested in the project or have questions, please contact me by email: khdmitryi at gmail.com
