
The Simple Pytorch-Like Auto Differentiation Toolkit is an automatic differentiation package for calculating gradients of a function in forward and reverse mode.
With the rapid development of deep learning, auto differentiation has become an indispensable part of optimization algorithms like gradient descent. While numerical methods such as Newton's method and the finite-difference method are useful in some situations, we want analytical derivatives obtained by applying the chain rule, which is exactly what our automatic differentiation SPLADTool (Simple Pytorch-Like Auto Differentiation Toolkit) does: it is faster and more accurate than numerical approximation.
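As a minimal illustration of the idea (a sketch, not part of spladtool), forward-mode AD can be built on dual numbers: each value carries its own derivative, and every operation applies the chain rule exactly instead of approximating it with a step size:

class Dual:
    """A value paired with its derivative (illustrative forward-mode AD)."""
    def __init__(self, val, grad=0.0):
        self.val, self.grad = val, grad

    def __mul__(self, other):
        # product rule: (u * v)' = u' * v + u * v'
        return Dual(self.val * other.val,
                    self.grad * other.val + self.val * other.grad)

x = Dual(3.0, 1.0)    # seed with dx/dx = 1
y = x * x             # y = x^2
print(y.val, y.grad)  # 9.0 6.0 -- dy/dx = 2x = 6, exact, no step-size error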
Create a virtual environment (Conda):
conda create --name spladtool_env python
Activate the environment:
conda activate spladtool_env
Deactivate the environment after use:
conda deactivate
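If you prefer not to use Conda, the standard library's venv module works the same way (equivalent Unix commands, not from the original instructions):

python -m venv spladtool_env
source spladtool_env/bin/activate
# ... work with spladtool ...
deactivate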
Install spladtool:
pip install spladtool
Try out an example from test.py on arithmetic functions:
import spladtool.spladtool_forward as st
x = st.tensor([[1., 2.], [3., 4.]])
# Define output functions y(x), z(x), and w(x)
y = 2 * x + 1
z = - y / (x ** 3)
w = st.cos((st.exp(z) + st.exp(-z)) / 2)
# Print out the values calculated by our forward mode automatic differentiation SPLADTool
print('x : ', x)
print('y : ', y)
print('y.grad : ', y.grad)
print('z: ', z)
print('z.grad: ', z.grad)
print('w: ', w)
print('w.grad: ', w.grad)
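To see why the forward-mode results are trustworthy, you can cross-check them against a central finite difference in plain numpy (independent of spladtool); assuming w.grad above holds the elementwise derivative of w with respect to x, the two should agree to several decimal places. The helper w_fn below is illustrative, not part of the package:

import numpy as np

def w_fn(x):
    # same composition as the spladtool example above, in plain numpy
    y = 2 * x + 1
    z = -y / x ** 3
    return np.cos((np.exp(z) + np.exp(-z)) / 2)

x = np.array([[1., 2.], [3., 4.]])
h = 1e-6
# central difference: (w(x + h) - w(x - h)) / (2h), elementwise
print((w_fn(x + h) - w_fn(x - h)) / (2 * h))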
Try out an example that trains a linear classifier on a synthetic dataset:
import spladtool.spladtool_reverse as sr  # aliased to avoid shadowing the built-in `str`
from spladtool.utils import SGD
import numpy as np

# We choose a simple classification task with decision boundary x1 - 3 * x2 > 0
x = np.random.randn(200, 2)
y = ((x[:, 0] - 3 * x[:, 1]) > 0).astype(float)

# define a linear model
np.random.seed(42)

class MyModel(sr.Module):
    def __init__(self):
        super().__init__()
        self.register_param(w1=sr.tensor(np.random.randn()))
        self.register_param(w2=sr.tensor(np.random.randn()))
        self.register_param(b=sr.tensor(np.random.randn()))

    def forward(self, x):
        w1 = self.params['w1'].repeat(x.shape[0])
        w2 = self.params['w2'].repeat(x.shape[0])
        b = self.params['b'].repeat(x.shape[0])
        y = w1 * sr.tensor(x[:, 0]) + w2 * sr.tensor(x[:, 1]) + b
        return y

# define the loss function and optimizer
model = MyModel()
criterion = sr.BCELoss()
opt = SGD(model.parameters(), lr=0.1, momentum=0.9)

# training loop: forward pass, backpropagate, update parameters
for epoch in range(100):
    outputs = model(x)
    targets = sr.tensor(y)
    loss = criterion(targets, outputs)
    opt.zero_grad()
    loss.backward()
    opt.step()
    print(loss.data)
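After training, a quick way to inspect the fit is to threshold the model's outputs and compare against the labels. This sketch is hypothetical: it assumes tensor values are exposed through .data as a numpy array (as loss.data above suggests) and that the model emits raw scores, so the decision threshold is 0:

# hypothetical evaluation: .data assumed to be a numpy array of raw scores
preds = (model(x).data > 0).astype(float)
print('training accuracy:', (preds == y).mean())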