unique-uncertainty

UNIQUE is a Python package for benchmarking uncertainty estimation and quantification methods for Machine Learning model predictions.

UNIQUE (UNcertaInty QUantification bEnchmark): a Python library for benchmarking uncertainty estimation and quantification methods for Machine Learning model predictions.


Introduction

UNIQUE provides methods for quantifying and evaluating the uncertainty of Machine Learning (ML) model predictions. The library lets users combine and benchmark multiple uncertainty quantification (UQ) methods simultaneously, generates intuitive visualizations, evaluates the quality of the UQ methods against established metrics, and, more generally, gives users a comprehensive overview of their ML model's performance from an uncertainty quantification perspective.

UNIQUE is a model-agnostic tool: it does not depend on any specific ML model-building platform, nor does it provide any model-training functionality. It is also lightweight, as it only requires the user to supply their model's inputs and predictions.
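To illustrate the kind of tabular input this implies, the snippet below builds a toy table of model predictions, true labels, and an auxiliary per-sample quantity that a UQ method might consume. All column names here are hypothetical, chosen for the example only; see the docs for the actual expected format.

```python
import numpy as np
import pandas as pd

rng = np.random.default_rng(42)
n = 100

# Toy dataset: model predictions alongside the true labels and any
# per-sample quantity one might feed to a UQ method. Column names
# are placeholders, not UNIQUE's required schema.
data = pd.DataFrame({
    "predictions": rng.normal(loc=0.0, scale=1.0, size=n),
    "labels": rng.normal(loc=0.0, scale=1.0, size=n),
    "ensemble_variance": rng.uniform(0.01, 0.5, size=n),  # hypothetical UQ input
})

print(data.shape)
```

Because UNIQUE is model-agnostic, any model (scikit-learn, PyTorch, a remote API) can produce the "predictions" column; the library never needs the model itself.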

[Figure: high-level schema of UNIQUE's components.]

Installation


UNIQUE is currently compatible with Python 3.8 through 3.12.1. To install the latest release and use the package as is, run the following in a compatible environment of choice:

pip install unique-uncertainty

or:

conda install -c conda-forge unique-uncertainty
# mamba install -c conda-forge unique-uncertainty

Check out the docs for more installation instructions.

Getting Started

Check out the docs for a complete set of instructions on how to prepare your data and the possible configurations offered by UNIQUE.
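For orientation only, a configuration file consumed by the pipeline might look roughly like the sketch below. Every key shown here is a hypothetical placeholder (only display_outputs is mentioned elsewhere in this README); refer to the docs for the actual schema.

```yaml
# Purely illustrative sketch -- all keys are hypothetical placeholders.
data_path: /path/to/inputs_and_predictions.csv   # model inputs + predictions
output_path: /path/to/outputs                    # where results are written
uq_methods:                                      # UQ methods to benchmark
  - EnsembleVariance
  - DistanceToTrainingSet
display_outputs: true                            # show tables/plots in Jupyter
```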

Usage

Finally, once the data and configuration files have been prepared, you can run UNIQUE as follows:

from unique import Pipeline

# Prepare UNIQUE pipeline
pipeline = Pipeline.from_config("/path/to/config.yaml")

# Run UNIQUE pipeline
uq_methods_outputs, uq_evaluation_outputs = pipeline.fit()
# Returns: (Dict[str, np.ndarray], Dict[str, pd.DataFrame])

Fitting the Pipeline will return two dictionaries:

  • uq_methods_outputs: maps each UQ method's name (in the form "UQ_Method_Name[Input_Name(s)]") to its computed UQ values.
  • uq_evaluation_outputs: contains, for each evaluation type (ranking-based, proper scoring rules, and calibration-based), the evaluation metrics for all corresponding UQ methods, organized as pd.DataFrame objects.
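Since running the real pipeline requires a prepared dataset and config, the sketch below mocks the two returned dictionaries with the documented types (the method and metric names are hypothetical) to show how one might inspect them:

```python
import numpy as np
import pandas as pd

# Mocked stand-ins matching the documented return types:
# Dict[str, np.ndarray] and Dict[str, pd.DataFrame].
# Method and metric names are hypothetical placeholders.
uq_methods_outputs = {
    "EnsembleVariance[features]": np.array([0.12, 0.40, 0.05]),
}
uq_evaluation_outputs = {
    "ranking_based": pd.DataFrame(
        {"spearman_corr": [0.71]},
        index=["EnsembleVariance[features]"],
    ),
}

# Inspect each UQ method's computed values...
for name, values in uq_methods_outputs.items():
    print(name, values.mean())

# ...and each evaluation type's metrics table.
for eval_type, df in uq_evaluation_outputs.items():
    print(eval_type)
    print(df)
```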

Additionally, UNIQUE generates graphical outputs in the form of tables and evaluation plots (if display_outputs is enabled and the code is running in a Jupyter Notebook cell).

Examples

For more hands-on examples and detailed usage, check out some of the examples in the docs.

Deep Dive

Check out the docs for an in-depth overview of UNIQUE's concepts, functionalities, outputs, and references.

Contributing

Any and all contributions and suggestions from the community are more than welcome and highly appreciated. If you wish to help us out in making UNIQUE even better, please check out our contributing guidelines.

Please note that we have a Code of Conduct in place to ensure a positive and inclusive community environment. By participating in this project, you agree to abide by its terms.

License

UNIQUE is licensed under the BSD 3-Clause License. See the LICENSE file.

Cite Us

If you find UNIQUE helpful for your work and/or research, please consider citing our work:

@misc{lanini2024unique,
  title={UNIQUE: A Framework for Uncertainty Quantification Benchmarking},
  author={Lanini, Jessica and Huynh, Minh Tam Davide and Scebba, Gaetano and Schneider, Nadine and Rodr{\'\i}guez-P{\'e}rez, Raquel},
  year={2024},
  doi={https://doi.org/10.26434/chemrxiv-2024-fmbgk},
}

Contacts & Acknowledgements

For any questions or further details about the project, please get in touch with any of the following contacts:

