llama-index llms gradient integration
Calculations
High throughput asynchronous reinforcement learning framework
Python package + CLI to generate stylistic wordclouds, including gradients and icon shapes!
Automatic-Class-Balanced MSE Loss for PyTorch (ACB-MSE) to combat class imbalanced datasets and stabilise fluctuating loss gradients.
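The idea behind a class-balanced MSE can be sketched in a few lines of plain Python (this is an illustrative sketch of the weighting principle, not the ACB-MSE package's API): compute the MSE separately per class, then average the per-class means so every class contributes equally, however many samples it has.

```python
def class_balanced_mse(preds, targets, labels):
    """preds/targets: per-sample floats; labels: class label per sample."""
    per_class = {}
    for p, t, c in zip(preds, targets, labels):
        per_class.setdefault(c, []).append((p - t) ** 2)
    # Mean of per-class means: rare classes are not drowned out by common ones.
    return sum(sum(e) / len(e) for e in per_class.values()) / len(per_class)
```

For preds `[1, 1, 0]`, targets `[0, 0, 0]`, labels `[0, 0, 1]`, the balanced loss is 0.5 (each class's mean error weighted equally), whereas a plain MSE would give 2/3, dominated by the larger class.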
A package for applying differential privacy to model training using gradient shuffling and membership inference attack detection.
ASDL: Automatic Second-order Differentiation (for Fisher, Gradient covariance, Hessian, Jacobian, and Kernel) Library
Polynomial approximations
A package for performing stochastic gradient descent (arXiv:1710.04626) to layout graphs
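The core loop of a stress-based SGD layout can be sketched in plain Python (a hedged sketch of the general technique, not this package's API): repeatedly pick a node pair, compare its Euclidean distance to the target graph distance, and nudge both endpoints to shrink the error, with a step size that decays over time.

```python
import math
import random

def sgd_layout(dists, n_iter=200, seed=0):
    """dists: {(i, j): target graph distance}. Returns 2-D node positions."""
    rng = random.Random(seed)
    nodes = sorted({v for pair in dists for v in pair})
    pos = {v: [rng.random(), rng.random()] for v in nodes}
    pairs = list(dists.items())
    for t in range(n_iter):
        lr = 1.0 / (1.0 + 0.05 * t)          # decaying learning rate
        rng.shuffle(pairs)                   # stochastic pair order
        for (i, j), d in pairs:
            dx = pos[i][0] - pos[j][0]
            dy = pos[i][1] - pos[j][1]
            dist = math.hypot(dx, dy) or 1e-9
            mu = min(lr / d ** 2, 1.0)       # weight short distances more
            step = mu * (dist - d) / (2.0 * dist)
            pos[i][0] -= step * dx; pos[i][1] -= step * dy
            pos[j][0] += step * dx; pos[j][1] += step * dy
    return pos
```

With `mu = 1` a single update satisfies its pair exactly, so early iterations make large corrections and the decaying rate then anneals the layout.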
A library that predicts the Y value (output) for given test data points using the gradient descent algorithm.
A neural network architecture for building fully explainable neural networks for arithmetic and gradient logic expression approximation.
Proximal gradient methods for PyTorch
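A proximal gradient iteration alternates a gradient step on the smooth part of an objective with a proximal step on the non-smooth part. A minimal plain-Python sketch (not this package's PyTorch API), using the L1 soft-thresholding prox:

```python
def soft_threshold(v, t):
    # prox of t * |.| : shrink each coordinate toward zero by t
    return [max(abs(x) - t, 0.0) * (1.0 if x > 0 else -1.0) for x in v]

def proximal_gradient(grad_f, prox, x0, step=0.5, n_iter=200):
    x = list(x0)
    for _ in range(n_iter):
        g = grad_f(x)
        # gradient step on the smooth part, then proximal step on the rest
        x = prox([xi - step * gi for xi, gi in zip(x, g)], step)
    return x

# Example: minimize 0.5 * (x - 3)^2 + |x|; the minimizer is x = 2.
x_star = proximal_gradient(
    grad_f=lambda x: [x[0] - 3.0],
    prox=lambda v, s: soft_threshold(v, s),  # threshold = step * lambda, lambda = 1
    x0=[0.0],
)
```

The L1 term pulls the solution from the smooth minimum at 3 down to 2, the hallmark shrinkage effect of proximal methods.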
Gradient boosting libraries integrated with PyTorch
ADAO: A module for Data Assimilation and Optimization
A wrapper of scipy minimize with automatic gradient and hessian computation.
StructureBoost is a Python package for gradient boosting using categorical structure. See documentation at: https://structureboost.readthedocs.io/
Python Image Displacement Identification.
A library that predicts the Y value (output) for given test data points using the gradient descent algorithm.
A JAX-based L-BFGS optimizer
A package for evaluating dimensionality reduction algorithms
Makes learning gradient descent easy
Neural Network Gradient Metrics with PyTorch
A Python package containing implementations of two widely used algorithms for minimising a function: the gradient descent algorithm and the normal equation method.
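The two approaches can be contrasted on one-variable least squares, fitting y ≈ w·x (a hedged generic sketch, not this package's API): gradient descent iterates toward the optimum, while the normal equation reaches it in closed form.

```python
def gradient_descent(xs, ys, lr=0.01, n_iter=2000):
    w, n = 0.0, len(xs)
    for _ in range(n_iter):
        # gradient of mean squared error with respect to w
        grad = sum(2.0 * (w * x - y) * x for x, y in zip(xs, ys)) / n
        w -= lr * grad
    return w

def normal_equation(xs, ys):
    # scalar case of the closed form w = (X^T X)^{-1} X^T y
    return sum(x * y for x, y in zip(xs, ys)) / sum(x * x for x in xs)
```

On data generated by y = 2x, both recover w = 2; the trade-off is iterations and a learning rate versus a single (matrix-inversion) solve.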
Gradient boosted decision tree palindrome predictor, used to locate regions for further investigation through http://palindromes.ibp.cz/
Official implementation of Conflict-free Inverse Gradients method
growingnn is a Python package that introduces a dynamic neural network architecture learning algorithm: the network adapts its structure during training, optimizing both its weights and its architecture. Combining a stochastic-gradient-descent-based optimizer with a guided Monte Carlo tree search, the package provides a powerful tool for improving model performance.
Medical Deep Learning Framework
Method PROTES (PRobabilistic Optimizer with TEnsor Sampling) for derivative-free optimization of multidimensional arrays and discretized multivariate functions, based on the tensor train (TT) format
Medical Deep Learning Framework
A package compatible with scikit-learn.
GrafoRVFL: A Gradient-Free Optimization Framework for Boosting Random Vector Functional Link Network
PulPy: Pulses in Python. A package for MRI RF and gradient pulse design.
A python machine learning library to use and visualize gradient descent for linear regression and logistic regression optimization.
RNN and general weights, gradients, & activations visualization in Keras & TensorFlow
Medical Deep Learning Framework.
An ensemble xgboost model made by bright-rookie.
PyTorch optimizer based on nonlinear conjugate gradient method
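The Fletcher–Reeves update rule at the heart of nonlinear conjugate gradient methods can be sketched in plain Python (a generic sketch, not this package's PyTorch API); the line search is supplied by the caller.

```python
def dot(u, v):
    return sum(a * b for a, b in zip(u, v))

def cg_fletcher_reeves(grad_f, line_search, x0, n_iter=20):
    x = list(x0)
    g = grad_f(x)
    d = [-gi for gi in g]                      # first direction: steepest descent
    for _ in range(n_iter):
        if dot(g, g) < 1e-18:                  # gradient vanished: converged
            break
        a = line_search(x, d)                  # step length along d
        x = [xi + a * di for xi, di in zip(x, d)]
        g_new = grad_f(x)
        beta = dot(g_new, g_new) / dot(g, g)   # Fletcher-Reeves coefficient
        d = [-gn + beta * di for gn, di in zip(g_new, d)]
        g = g_new
    return x

# Example on f(x, y) = x^2 + 10*y^2, with an exact line search for this quadratic:
grad = lambda p: [2.0 * p[0], 20.0 * p[1]]
def exact_ls(p, d):
    hd = [2.0 * d[0], 20.0 * d[1]]             # Hessian-vector product for this f
    return -dot(grad(p), d) / dot(d, hd)
x_min = cg_fletcher_reeves(grad, exact_ls, [3.0, 1.0])
```

On a quadratic with an exact line search this reduces to linear CG, so the 2-D example converges in two iterations; plain gradient descent on the same ill-conditioned bowl would zig-zag for many more.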
WAsserstein Global Gradient-free OptimisatioN (WAGGON) methods library.
Magnetic Field Coil Generator for Python.
Stochastic optimization routines for Theano
Field and gradient calculations for magnetic coils
A python package for the Cyclic Gradient Boosting Machine algorithm
Calculate gravity anomalies from multi-interface models and estimate double-interface models from gravity anomalies, given prior information about constraints between the density interfaces using the Crust1.0 model, where Vertical Gravity Gradient data are used as inputs
Python Sensitivity Analysis - Gradient DataFrames and Hex-Bin Plots
FUNGI: Features from UNsupervised GradIents
Fast (and cheeky) differentially private gradient-based optimisation in PyTorch
Functions and classes for gradient-based robot motion planning, written in Ivy.