Fairness Indicators TensorBoard Plugin



Evaluating Models with the Fairness Indicators Dashboard [Beta]


Fairness Indicators for TensorBoard enables easy computation of commonly identified fairness metrics for binary and multiclass classifiers. With the plugin, you can visualize fairness evaluations for your runs and easily compare performance across groups.

In particular, Fairness Indicators for TensorBoard allows you to evaluate and visualize model performance, sliced across defined groups of users. Feel confident about your results with confidence intervals and evaluations at multiple thresholds.

Many existing tools for evaluating fairness concerns don't work well on large-scale datasets and models. At Google, it is important for us to have tools that can work on billion-user systems. Fairness Indicators lets you evaluate use cases of any size, in the TensorBoard environment or in Colab.

Requirements

To install Fairness Indicators for TensorBoard, run:

python3 -m virtualenv ~/tensorboard_demo
source ~/tensorboard_demo/bin/activate
pip install --upgrade pip
pip install fairness_indicators
pip install tensorboard-plugin-fairness-indicators

Nightly Packages

Nightly packages for the TensorBoard plugin are hosted at https://pypi-nightly.tensorflow.org on Google Cloud. To install the latest nightly package, use the following command:

pip install --extra-index-url https://pypi-nightly.tensorflow.org/simple tensorboard-plugin-fairness-indicators

This also installs the nightly packages for the plugin's major dependencies, such as TensorFlow Model Analysis (TFMA).

Demo Colab

Fairness_Indicators_TensorBoard_Plugin_Example_Colab.ipynb contains an end-to-end demo to train and evaluate a model and visualize fairness evaluation results in TensorBoard.

Usage

To use the Fairness Indicators with your own data and evaluations:

  1. Train a new model and evaluate it using the tensorflow_model_analysis.run_model_analysis or tensorflow_model_analysis.ExtractEvaluateAndWriteResults API in model_eval_lib. For code snippets, see the Fairness Indicators example colab, or the hedged sketch after this list.

  2. Write a summary data file using demo.py, which will be read by TensorBoard to render the Fairness Indicators dashboard (See the TensorBoard tutorial for more information on summary data files).

    Flags to be used with the demo.py utility:

    • --logdir: Directory where TensorBoard will write the summary
    • --eval_result_output_dir: Directory containing evaluation results evaluated by TFMA
    python demo.py --logdir=<logdir> --eval_result_output_dir=<eval_result_dir>
    

    Alternatively, you can use the tensorboard_plugin_fairness_indicators.summary_v2 API to write the summary file.

    import tensorflow as tf
    from tensorboard_plugin_fairness_indicators import summary_v2

    # <logdir> and <eval_result_dir> are placeholders for your own paths.
    writer = tf.summary.create_file_writer(<logdir>)
    with writer.as_default():
        summary_v2.FairnessIndicators(<eval_result_dir>, step=1)
    writer.close()
    
    
  3. Run TensorBoard

    Note: This starts a local TensorBoard instance. Once it is running, a link will be displayed in the terminal; open the link in your browser to view the Fairness Indicators dashboard.

    • tensorboard --logdir=<logdir>
    • Select the new evaluation run using the drop-down on the left side of the dashboard to visualize results.
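
As a minimal sketch of step 1 above, the following example shows one way to configure a TFMA evaluation that computes the FairnessIndicators metric. The paths, the label key, and the slicing feature ("language") are hypothetical placeholders rather than values from this README, and the exact API surface may vary across TFMA versions:

import tensorflow_model_analysis as tfma
from google.protobuf import text_format

# Hypothetical EvalConfig: computes FairnessIndicators at several
# decision thresholds, both overall and sliced by a "language" feature.
eval_config = text_format.Parse("""
  model_specs { label_key: "label" }
  metrics_specs {
    metrics {
      class_name: "FairnessIndicators"
      config: '{"thresholds": [0.25, 0.5, 0.75]}'
    }
  }
  slicing_specs {}
  slicing_specs { feature_keys: "language" }
""", tfma.EvalConfig())

eval_result = tfma.run_model_analysis(
    eval_shared_model=tfma.default_eval_shared_model(
        eval_saved_model_path="/tmp/saved_model",  # hypothetical path
        eval_config=eval_config),
    eval_config=eval_config,
    data_location="/tmp/eval_data.tfrecord",  # hypothetical path
    output_path="/tmp/eval_result")

The output_path used here is the directory you would then pass to demo.py as --eval_result_output_dir in step 2.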

Compatible versions

The following table shows the package versions that are compatible with each other. This is determined by our testing framework, but other untested combinations may also work.

tensorboard-plugin   tensorflow      tensorflow-model-analysis
GitHub master        nightly (2.x)   0.46.0
v0.46.0              2.15.0          0.46.0
v0.44.0              2.12.0          0.44.0
v0.43.0              2.11.0          0.43.0
v0.42.0              2.10.0          0.42.0
v0.41.0              2.9.0           0.41.0
v0.40.0              2.9.0           0.40.0
v0.39.0              2.8.0           0.39.0
v0.38.0              2.8.0           0.38.0
v0.37.0              2.7.0           0.37.0
v0.36.0              2.7.0           0.36.0
v0.35.0              2.6.0           0.35.0
v0.34.0              2.6.0           0.34.0
v0.33.0              2.5.0           0.33.0
v0.30.0              2.4.0           0.30.0
v0.29.0              2.4.0           0.29.0
v0.28.0              2.4.0           0.28.0
v0.27.0              2.4.0           0.27.0
v0.26.0              2.3.0           0.26.0
v0.25.0              2.3.0           0.25.0
v0.24.0              2.3.0           0.24.0
v0.23.0              2.3.0           0.23.0
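
For example, to install the tested v0.46.0 combination from the table above (assuming a pip-based environment), you can pin all three packages explicitly:

pip install tensorboard-plugin-fairness-indicators==0.46.0 tensorflow==2.15.0 tensorflow-model-analysis==0.46.0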
