A simple gradient tool.
growingnn is a cutting-edge Python package that introduces a dynamic neural network architecture learning algorithm. This innovative approach allows the neural network to adapt its structure during training, optimizing both weights and architecture. Leveraging a Stochastic Gradient Descent-based optimizer and guided Monte Carlo tree search, the package provides a powerful tool for enhancing model performance. Now with PyTorch support for enhanced performance and GPU acceleration!
Influence Estimation for Gradient-Boosted Decision Trees
A package compatible with scikit-learn.
High throughput asynchronous reinforcement learning framework
llama-index llms gradient integration
CLI for printing text in gradient color using the rich-gradient library.
A plug-and-use python library to monitor learning of PyTorch neural networks
Method PROTES (PRobabilistic Optimizer with TEnsor Sampling) for derivative-free optimization of multidimensional arrays and discretized multivariate functions, based on the tensor train (TT) format
Field and gradient calculations for magnetic coils
Python Image Displacement Identification.
Instance-Based Uncertainty Estimation for Gradient-Boosted Regression Trees
PyTorch optimizer based on nonlinear conjugate gradient method
feature and feature interaction analyzer for gradient boosting
Calculations
A Python package for determining the structures of materials with gradient-based methods.
A Python package for gradient colors
Neural Network Gradient Metrics with PyTorch
A library that predicts the Y value (output) for given test data points using the gradient descent algorithm.
Gradient descent implementation for ML
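Several entries above implement gradient descent. As a minimal illustrative sketch of the technique itself (the function and parameter names here are hypothetical, not any listed package's API), fitting a one-parameter model y = w * x by minimizing mean squared error:

```python
# Minimal gradient-descent sketch for fitting y = w * x.
# Names are illustrative, not tied to any package listed here.
def gradient_descent(xs, ys, lr=0.01, steps=500):
    w = 0.0
    n = len(xs)
    for _ in range(steps):
        # Gradient of mean squared error with respect to w.
        grad = sum(2 * (w * x - y) * x for x, y in zip(xs, ys)) / n
        w -= lr * grad
    return w

xs = [1.0, 2.0, 3.0, 4.0]
ys = [2.0, 4.0, 6.0, 8.0]  # true slope is 2
w = gradient_descent(xs, ys)  # converges close to 2.0
```

Each step moves w against the gradient of the loss; the learning rate `lr` trades off speed against stability.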
GPU memory-efficient training with gradient compression for PyTorch
Makes learning gradient descent easy
A CLI for maxludden/maxgradient
API for gradient descent.
Ingradient - A labeling and dataset management tool
Hybrid time series forecasting library combining Prophet with gradient boosting models
BIRT is an implementation of Beta3-IRT using gradient descent.
Complete Python SDK for DigitalOcean's Gradient AI Platform - serverless inference, knowledge bases, agents, and more
CEEMDAN-LSTM-GradientBoosting model for state-of-the-art time series forecasting
Gradient Boosting libraries integrated with PyTorch
A Python package containing implementations of two widely used minimization algorithms: the gradient descent algorithm and the normal equation method.
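The normal equation mentioned above solves least-squares problems in closed form, with no iteration. A small sketch for simple linear regression y = a + b*x (the function name is illustrative, not this package's API):

```python
# Normal-equation sketch for simple linear regression (y = a + b*x).
# Closed-form least squares; no learning rate or iteration needed.
def normal_equation_fit(xs, ys):
    n = len(xs)
    mean_x = sum(xs) / n
    mean_y = sum(ys) / n
    # Slope b = cov(x, y) / var(x); intercept a = mean_y - b * mean_x.
    b = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys)) \
        / sum((x - mean_x) ** 2 for x in xs)
    a = mean_y - b * mean_x
    return a, b

a, b = normal_equation_fit([0.0, 1.0, 2.0, 3.0], [1.0, 3.0, 5.0, 7.0])
# a = 1.0, b = 2.0 (exact fit of y = 1 + 2x)
```

Unlike gradient descent, this gives the exact least-squares solution in one pass, but it only applies to linear models.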
Sionna - A hardware-accelerated differentiable open-source library for research on communication systems
Learning Environment for Agent Reasoning Networks — Unified framework for specifying, composing, optimizing, and evaluating LLM-based systems
Advanced numerical function interpolation and differentiation with universal API, multivariate calculus, window functions, and stochastic extensions
Signal-Adaptive Residual Boosting — two-phase gradient boosting with per-tree OOB step optimization
Gradient Boosting and Probabilistic Regression with categorical structure
SVG Frame-by-Frame Animation Generator - Create optimized FBF SVG animations from SVG sequences
CSS gradient generator
Where you find all the state-of-the-art cooking utensils (salt, pepper, gradient descent... the usual).
An implementation of Mesh Adaptive Direct Search (MADS) for gradient-free optimization.
A Python ML library to use and visualize gradient descent for linear and logistic regression optimization.
Sobel Gradient Image Deduplication
Fully automated LC gradient optimization for optimal compound separation in nontargeted metabolomics
Medical Deep Learning Framework
A flet component for gradient borders
A package for applying differential privacy to model training using gradient shuffling and membership inference attack detection.
A small machine learning library for doing gradient boosting
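Several entries above offer gradient boosting. The core idea, sketched here on 1-D data with squared error (all names are illustrative, not any listed library's API): each round fits a depth-1 stump to the current residuals and adds a damped copy of it to the ensemble.

```python
# Tiny gradient-boosting sketch: for squared error, the negative
# gradient is just the residual, so each round fits a stump to it.
def fit_stump(xs, residuals):
    best = None
    for split in sorted(set(xs)):
        left = [r for x, r in zip(xs, residuals) if x <= split]
        right = [r for x, r in zip(xs, residuals) if x > split]
        lmean = sum(left) / len(left) if left else 0.0
        rmean = sum(right) / len(right) if right else 0.0
        err = (sum((r - lmean) ** 2 for r in left)
               + sum((r - rmean) ** 2 for r in right))
        if best is None or err < best[0]:
            best = (err, split, lmean, rmean)
    _, split, lmean, rmean = best
    return lambda x: lmean if x <= split else rmean

def boost(xs, ys, rounds=20, lr=0.5):
    pred = [0.0] * len(xs)
    stumps = []
    for _ in range(rounds):
        residuals = [y - p for y, p in zip(ys, pred)]
        stump = fit_stump(xs, residuals)
        stumps.append(stump)
        pred = [p + lr * stump(x) for p, x in zip(pred, xs)]
    # The ensemble prediction is the learning-rate-weighted sum of stumps.
    return lambda x: sum(lr * s(x) for s in stumps)
```

With a shrinkage factor `lr < 1`, the residuals decay geometrically, which is why libraries pair many shallow trees with a small learning rate.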
A modular approach to topology optimization
AI governance layer. Local-first. Open source. Three layers: hard constraints, gradient decisions, self-audit.