gradient_metrics

Neural Network Gradient Metrics with PyTorch


This package implements utilities for computing gradient metrics for measuring uncertainty in neural networks, based on the paper "Classification Uncertainty of Deep Neural Networks Based on Gradient Information" (Oberdiek et al., 2018). An application of this approach can also be found in "On the Importance of Gradients for Detecting Distributional Shifts in the Wild" (Huang et al., 2021).
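
Conceptually, each metric reduces the per-sample parameter gradients to a scalar statistic. The following is a minimal sketch in plain PyTorch, independent of this package's API; the toy linear layer and the per-sample losses are purely illustrative:

import torch
import torch.nn as nn

# Toy setup, for illustration only
net = nn.Linear(4, 2)
x = torch.randn(3, 4)  # batch of 3 samples
sample_losses = net(x).logsumexp(dim=1)  # arbitrary per-sample losses

stats = []
for loss in sample_losses:
    net.zero_grad()
    loss.backward(retain_graph=True)  # keep the graph for the next sample
    grads = torch.cat([p.grad.flatten() for p in net.parameters()])
    # Reduce the gradient entries to scalar statistics, e.g. max and L1-norm
    stats.append((grads.max().item(), grads.abs().sum().item()))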

Documentation and examples can be found on GitHub Pages.

Installation

pip install gradient-metrics

Usage

Example of computing the maximum, minimum, mean, and standard deviation of gradient entries, as in "Classification Uncertainty of Deep Neural Networks Based on Gradient Information":

from gradient_metrics import GradientMetricCollector
from gradient_metrics.metrics import Max, Min, MeanStd
import torch.nn.functional as tfunc

# Initialize a network (MyNeuralNetwork is a placeholder for your own model)
mynet = MyNeuralNetwork()

# Initialize the GradientMetricCollector
mcollector = GradientMetricCollector(
    [
        Max(mynet),
        Min(mynet),
        MeanStd(mynet),
    ]
)

# Predict your data (x is a placeholder for your input batch)
out = mynet(x)

# Construct pseudo labels
y_pred = out.argmax(1).clone().detach()

# Construct the sample-wise loss for backpropagation
sample_loss = tfunc.cross_entropy(out, y_pred, reduction="none")

# Compute the gradient metrics
metrics = mcollector(sample_loss)
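
For a quick end-to-end check, here is a self-contained variant of the example above; the two-layer classifier and the random batch are illustrative stand-ins for MyNeuralNetwork and x:

import torch
import torch.nn as nn
import torch.nn.functional as tfunc

from gradient_metrics import GradientMetricCollector
from gradient_metrics.metrics import Max, Min, MeanStd

# Illustrative stand-in for MyNeuralNetwork: 10 features in, 3 classes out
mynet = nn.Sequential(nn.Linear(10, 32), nn.ReLU(), nn.Linear(32, 3))
mcollector = GradientMetricCollector([Max(mynet), Min(mynet), MeanStd(mynet)])

x = torch.randn(8, 10)  # illustrative batch of 8 samples
out = mynet(x)
y_pred = out.argmax(1)
sample_loss = tfunc.cross_entropy(out, y_pred, reduction="none")

metrics = mcollector(sample_loss)
print(metrics.shape)  # one row per sample, one column per collected metric value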

Example of computing the L1-norm of gradient entries, as in "On the Importance of Gradients for Detecting Distributional Shifts in the Wild":

from gradient_metrics import GradientMetricCollector
from gradient_metrics.metrics import PNorm
import torch.nn.functional as tfunc

# Initialize a network
mynet = MyNeuralNetwork()

# Initialize the GradientMetricCollector
mcollector = GradientMetricCollector(PNorm(mynet))

# Predict your data
out = mynet(x)

# Construct the sample-wise loss for backpropagation
# (log_softmax is the numerically stable equivalent of log(softmax))
sample_loss = tfunc.log_softmax(out, dim=1).mean(1).neg()

# Compute the gradient metrics
metrics = mcollector(sample_loss)
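
As a usage note, Huang et al. use this score for distributional-shift detection, where in-distribution samples tend to produce larger gradient norms. A minimal sketch of that final step, assuming a single PNorm metric so that metrics holds one value per sample; the threshold is a placeholder to be calibrated on held-out in-distribution data:

# One L1-norm score per sample
scores = metrics.squeeze(1)

threshold = 1.0  # placeholder; calibrate on held-out in-distribution data
is_in_distribution = scores > threshold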

Contributing

Contributions in the form of PRs or issues are welcome. To install the development environment, run

make setup

Before you open your pull request, make sure that all tests pass in your local copy by running make test.

Citing

@inproceedings{OberdiekRG18,
  author    = {Philipp Oberdiek and
               Matthias Rottmann and
               Hanno Gottschalk},
  editor    = {Luca Pancioni and
               Friedhelm Schwenker and
               Edmondo Trentin},
  title     = {Classification Uncertainty of Deep Neural Networks Based on Gradient
               Information},
  booktitle = {Artificial Neural Networks in Pattern Recognition - 8th {IAPR} {TC3}
               Workshop, {ANNPR} 2018, Siena, Italy, September 19-21, 2018, Proceedings},
  series    = {Lecture Notes in Computer Science},
  volume    = {11081},
  pages     = {113--125},
  publisher = {Springer},
  year      = {2018},
  url       = {https://doi.org/10.1007/978-3-319-99978-4_9},
  doi       = {10.1007/978-3-319-99978-4_9},
}
