Adas is an adaptive optimizer for scheduling the learning rate when training Convolutional Neural Networks (CNNs).
This repository contains a PyTorch implementation of the Adas learning rate scheduler algorithm, as well as the Knowledge Gain and Mapping Condition metrics.
Visit the paper branch to see the paper-related code. You can use that code to replicate experiments from the paper.
Adas is released under the MIT License (refer to the LICENSE file for more information).

Permissions | Conditions | Limitations
---|---|---
Commercial use | License and Copyright Notice | Liability
Distribution | | Warranty
Modification | |
Private Use | |
@article{hosseini2020adas,
title={Adas: Adaptive Scheduling of Stochastic Gradients},
author={Hosseini, Mahdi S and Plataniotis, Konstantinos N},
journal={arXiv preprint arXiv:2006.06587},
year={2020}
}
Figure 1: Training performance using different optimizers across three datasets and two CNNs
Table 1: Image classification performance (test accuracy) with a fixed epoch budget for ResNet34 training
Please refer to QC on the Wiki for more information on the two metrics, Knowledge Gain and Mapping Condition, used for monitoring the training quality of CNNs.
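Both metrics are derived from the singular values of each layer's unfolded weight tensor. As a rough illustration of the general idea (not the paper's exact formulas; refer to the paper and the Wiki for those), a condition-number-style quantity can be computed from a conv kernel like this:

```python
import numpy as np

def mapping_condition(weight, rank=None):
    """Illustrative sketch of a mapping-condition-style metric.

    NOTE: this is a hypothetical simplification, not the AdaS
    implementation. It unfolds a (out_ch, in_ch, kh, kw) conv kernel
    into a 2D matrix and returns the ratio of the largest to the
    smallest retained singular value.
    """
    mat = weight.reshape(weight.shape[0], -1)      # unfold 4D kernel to 2D
    s = np.linalg.svd(mat, compute_uv=False)       # singular values, descending
    if rank is not None:
        s = s[:rank]                               # keep only the top-rank values
    return s[0] / s[-1]

# A perfectly conditioned (orthogonal) kernel has condition number 1
w = np.eye(4).reshape(4, 4, 1, 1)
print(mapping_condition(w))
```

A well-conditioned layer yields a ratio near 1, while a nearly rank-deficient layer yields a very large ratio; the paper uses such low-rank structure to drive per-layer learning-rate scheduling.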
We use Python 3.7. Please refer to Requirements on the Wiki for the complete guideline.
Adas introduces minimal overhead compared to other adaptive optimizers: mSGD+StepLR, mSGD+Adas, and AdaM all consume 40–43 sec/epoch when training ResNet34 on CIFAR10 using the same PC/GPU platform.
You can copy the files under src/adas into your local code base and use them directly. Note that you will probably need to modify the imports to be consistent with how you perform imports in your codebase. All source code can be found in src/adas. For more information, also refer to Installation on the Wiki.
To use Adas, simply import the Adas class (a subclass of torch.optim.Optimizer) and use it as follows:
```python
from adas import Adas

optimizer = Adas(params=list(model.parameters()),
                 lr=0.03,          # required; illustrative value, pick your own
                 beta=0.8,
                 step_size=None,
                 gamma=1,
                 momentum=0,
                 dampening=0,
                 weight_decay=0,
                 nesterov=False)
...
for epoch in epochs:
    for batch in train_dataset:
        ...
        loss.backward()
        optimizer.step()
    optimizer.epoch_step(epoch)
```
Note that optimizer.epoch_step(epoch) is only to be called at the end of each epoch.
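The two-level update pattern above (a per-batch step() and a per-epoch epoch_step()) can be illustrated with a minimal stand-in scheduler. This is a hypothetical toy, not the Adas algorithm: it merely shows where each call sits in the loop, using a simple geometric decay in place of Adas's adaptive rule:

```python
class ToyEpochScheduler:
    """Toy stand-in illustrating the step()/epoch_step() split.

    NOTE: hypothetical example, not the Adas implementation.
    step() is called once per batch (after loss.backward());
    epoch_step() is called once per epoch and adjusts the lr
    (here, a simple geometric decay).
    """
    def __init__(self, lr, gamma=0.5):
        self.lr = lr
        self.gamma = gamma
        self.batch_steps = 0

    def step(self):
        # per-batch: where the parameter update would happen
        self.batch_steps += 1

    def epoch_step(self, epoch):
        # per-epoch: where the learning rate is rescheduled
        self.lr *= self.gamma

sched = ToyEpochScheduler(lr=0.1)
for epoch in range(3):
    for batch in range(5):
        sched.step()
    sched.epoch_step(epoch)

print(sched.batch_steps, sched.lr)  # 15 batch steps; lr decayed 3 times
```

Keeping the epoch-level call outside the inner loop is the key point: calling epoch_step() per batch would apply the schedule far too aggressively.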
Note the following: the tests use /tmp, so if you don't have a /tmp folder (i.e., you're on Windows), correct this if you wish to run the tests yourself.