
litewave-ml-models-signature
A comprehensive Python package for signature verification and classification using deep learning models. This module supports both ResNet Siamese networks for verification and Vision Transformers (ViT) with ArcFace for classification.
pip install torch torchvision torchaudio
pip install timm scikit-image scikit-learn pandas pillow boto3 pydantic scipy
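With the dependencies in place, install the package itself (assuming the distribution name matches the project name above; the code examples import it as signature):
pip install litewave-ml-models-signature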
from signature import SignatureConfig, set_config, load_model, classify_signatures

# Configure the module
config = SignatureConfig(
    model_type="vit",
    num_classes=100,
    model_s3_path="s3://my-bucket/models/signature_vit.pth"
)
set_config(config)  # apply the configuration before loading the model

# Load a trained model
model = load_model()

# Classify signatures
results = classify_signatures(
    model=model,
    reference_path="path/to/reference/signatures",
    detected_path="path/to/detected/signatures"
)
print(f"Accepted signatures: {results['accepted_signatures']}")
from signature import train_model, create_dataset_csv
# Create a dataset CSV from image directory
create_dataset_csv(
    image_directory="path/to/signature/images",
    output_csv="dataset.csv",
    dataset_type="single"  # or "pairs" for ResNet
)

# Train a model
model, metrics = train_model(
    model_type="vit",
    dataset_path="dataset.csv",
    save_path="s3://my-bucket/models/trained_model.pth",
    num_classes=100,
    epochs=20
)
print(f"Test accuracy: {metrics['test_accuracy']:.4f}")
The module can be configured through environment variables or programmatically:
# Model configuration
export SIGNATURE_MODEL_TYPE=vit
export SIGNATURE_NUM_CLASSES=100
export SIGNATURE_MODEL_S3_PATH=s3://my-bucket/models/model.pth
# AWS configuration
export AWS_ACCESS_KEY_ID=your_access_key
export AWS_SECRET_ACCESS_KEY=your_secret_key
export AWS_DEFAULT_REGION=us-east-2
# Training configuration
export SIGNATURE_BATCH_SIZE=32
export SIGNATURE_LEARNING_RATE=5e-5
export SIGNATURE_EPOCHS=15
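With these variables exported, the active settings can be inspected. A minimal sketch, assuming get_config() (imported in the custom training example further down) returns the current SignatureConfig populated from the environment:
from signature import get_config

config = get_config()
# Attribute names mirror the SignatureConfig fields used elsewhere in this README
print(config.model_type, config.num_classes, config.batch_size)
Alternatively, the same settings can be applied programmatically: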
from signature import SignatureConfig, set_config
config = SignatureConfig(
    model_type="vit",
    num_classes=100,
    batch_size=32,
    learning_rate=5e-5,
    aws_access_key_id="your_access_key",
    aws_secret_access_key="your_secret_key"
)
set_config(config)
Best for signature verification (determining if two signatures are from the same person):
from signature import ResNet50Siamese, train_resnet
# Create model
model = ResNet50Siamese(embedding_dim=512)
# Train model
model, metrics = train_resnet(
    dataset_path="pairs_dataset.csv",
    epochs=25,
    margin=17.5
)
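Once trained, two signatures can be compared by embedding distance. The following is a minimal sketch, assuming the Siamese forward pass returns a pair of embeddings (as in the custom training loop later in this README); the preprocessing and the decision threshold are illustrative assumptions rather than values defined by the package:
import torch
import torch.nn.functional as F
from PIL import Image
from torchvision import transforms

# Illustrative preprocessing; the package's own transforms may differ.
preprocess = transforms.Compose([
    transforms.Grayscale(num_output_channels=3),
    transforms.Resize((224, 224)),
    transforms.ToTensor(),
])

def verify_pair(model, path_a, path_b, threshold=17.5):
    """Return True if the two signatures appear to come from the same person."""
    img_a = preprocess(Image.open(path_a)).unsqueeze(0)
    img_b = preprocess(Image.open(path_b)).unsqueeze(0)
    model.eval()
    with torch.no_grad():
        emb_a, emb_b = model(img_a, img_b)
    distance = F.pairwise_distance(emb_a, emb_b).item()
    return distance < threshold  # threshold chosen to mirror the contrastive margin

same_person = verify_pair(model, "ref_signature.png", "questioned_signature.png")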
Best for signature classification (identifying which person signed):
from signature import SignatureViT, train_vit
# Create model
model = SignatureViT(num_classes=100, embedding_dim=512)
# Train model
model, metrics = train_vit(
    num_classes=100,
    dataset_path="single_image_dataset.csv",
    epochs=15
)
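For context, ArcFace adds an angular margin to the target-class logit before a scaled softmax. The sketch below is a generic, textbook formulation for illustration; the package's own ArcFace class may differ in its defaults and details:
import torch
import torch.nn as nn
import torch.nn.functional as F

class ArcMarginHead(nn.Module):
    """Generic ArcFace-style angular margin head (illustrative, not the package's class)."""
    def __init__(self, embedding_dim, num_classes, scale=30.0, margin=0.5):
        super().__init__()
        self.weight = nn.Parameter(torch.empty(num_classes, embedding_dim))
        nn.init.xavier_uniform_(self.weight)
        self.scale, self.margin = scale, margin

    def forward(self, embeddings, labels):
        # Cosine similarity between L2-normalised embeddings and class centres
        cosine = F.linear(F.normalize(embeddings), F.normalize(self.weight))
        theta = torch.acos(cosine.clamp(-1 + 1e-7, 1 - 1e-7))
        # Add the angular margin only to the target-class logit
        target = F.one_hot(labels, cosine.size(1)).bool()
        logits = torch.where(target, torch.cos(theta + self.margin), cosine)
        return F.cross_entropy(self.scale * logits, labels)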
Organize your signature images in the following structure:
signatures/
├── person1/
│   ├── signature1.png
│   ├── signature2.png
│   └── signature3.png
├── person2/
│   ├── signature1.png
│   └── signature2.png
└── person3/
    ├── signature1.png
    ├── signature2.png
    └── signature4.png
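Each sub-directory corresponds to one writer, which create_dataset_csv presumably treats as the class label. A quick, package-independent sanity check of the layout before building the CSV:
from pathlib import Path

root = Path("signatures")
for person_dir in sorted(p for p in root.iterdir() if p.is_dir()):
    image_count = len(list(person_dir.glob("*.png")))
    print(f"{person_dir.name}: {image_count} signature images")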
from signature import create_dataset_csv
# For ViT (single image classification)
create_dataset_csv(
    image_directory="signatures/",
    output_csv="vit_dataset.csv",
    dataset_type="single"
)

# For ResNet (image pairs)
create_dataset_csv(
    image_directory="signatures/",
    output_csv="resnet_dataset.csv",
    dataset_type="pairs",
    pairs_per_person=50
)
The module works seamlessly with S3 for storing models, datasets, and reference signatures:
from signature import S3Manager
# Create S3 manager
s3_manager = S3Manager()
# Download model weights
s3_manager.download_file(
    "s3://my-bucket/models/model.pth",
    "local_model.pth"
)

# Upload trained model
s3_manager.upload_file(
    "local_model.pth",
    "s3://my-bucket/models/new_model.pth"
)
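The same calls can also be combined with a locally constructed model when you want explicit control over where weights come from. A sketch assuming the checkpoint stores a plain state_dict (load_model() with model_s3_path configured remains the higher-level route):
import torch
from signature import S3Manager, SignatureViT

s3_manager = S3Manager()
s3_manager.download_file("s3://my-bucket/models/signature_vit.pth", "signature_vit.pth")

model = SignatureViT(num_classes=100, embedding_dim=512)
state_dict = torch.load("signature_vit.pth", map_location="cpu")
model.load_state_dict(state_dict)  # assumes the file holds a plain state_dict
model.eval()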
from signature import get_fused_features, extract_deep_features
# Extract fused features (deep + HOG)
features = get_fused_features(model, "signature.png")
# Extract only deep features
deep_features = extract_deep_features(model, "signature.png")
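"Fused" here means combining the learned embedding with a hand-crafted descriptor. The sketch below shows one common scheme, concatenating a deep embedding with a HOG descriptor computed via scikit-image; the HOG parameters are assumptions and may not match what get_fused_features does internally:
import numpy as np
from PIL import Image
from skimage.feature import hog
from skimage.transform import resize

def hog_descriptor(image_path, size=(128, 256)):
    """Plain HOG descriptor of a grayscale signature image (illustrative parameters)."""
    gray = np.asarray(Image.open(image_path).convert("L"), dtype=np.float32) / 255.0
    gray = resize(gray, size, anti_aliasing=True)
    return hog(gray, orientations=9, pixels_per_cell=(16, 16), cells_per_block=(2, 2))

# Concatenate the deep embedding with the HOG descriptor into one feature vector
# (assumes deep_features is a CPU array or tensor without gradients)
fused_manual = np.concatenate([
    np.asarray(deep_features).ravel(),
    hog_descriptor("signature.png"),
])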
from signature import (
    create_model, ContrastiveLoss, create_dataloader, get_config
)
import torch
config = get_config()
# Create model and loss
model = create_model(model_type="resnet")
criterion = ContrastiveLoss(margin=17.5)
optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)
# Create data loaders
train_loader, val_loader, test_loader = create_dataloader(
    "dataset.csv",
    dataset_type="pairs",
    batch_size=32
)
# Custom training loop
for epoch in range(10):
    for img1, img2, labels in train_loader:
        optimizer.zero_grad()
        emb1, emb2 = model(img1, img2)
        loss = criterion(emb1, emb2, labels)
        loss.backward()
        optimizer.step()
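An evaluation pass can reuse the same pair interface. The sketch below scores validation pairs by embedding distance and applies a threshold; it assumes a label of 1 marks a genuine (same-writer) pair, and the threshold value is illustrative:
model.eval()
correct, total = 0, 0
with torch.no_grad():
    for img1, img2, labels in val_loader:
        emb1, emb2 = model(img1, img2)
        distances = torch.nn.functional.pairwise_distance(emb1, emb2)
        predictions = (distances < 17.5).long()  # illustrative threshold
        correct += (predictions == labels.long()).sum().item()
        total += labels.size(0)
print(f"Validation pair accuracy: {correct / total:.4f}")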
load_model(): Load a trained model with optional S3 weights
classify_signatures(): Classify detected signatures against references
train_model(): Train a model with automatic type detection
create_dataset_csv(): Create dataset CSV from image directory
ResNet50Siamese: Siamese ResNet50 for signature verification
SignatureViT: Vision Transformer with ArcFace for classification
ContrastiveLoss: Contrastive loss for Siamese networks
ArcFace: Angular margin loss implementation
SignatureDataset: Dataset for single signature images
SignaturePairDataset: Dataset for signature image pairs
create_dataloader(): Create train/val/test data loaders
SignatureConfig: Configuration management
S3Manager: S3 operations manager
Set SIGNATURE_DEVICE=cuda for faster training and inference.
Enable detailed logging:
from signature.utils import setup_logging
setup_logging("DEBUG")
See the examples/ directory for complete examples:
train_vit_example.py: Training a ViT model
train_resnet_example.py: Training a ResNet Siamese model
inference_example.py: Running inference on new signatures
s3_integration_example.py: Working with S3 storage
This module is part of the LiteWave ML Models repository.