gradient-descent
gradient-descent is a package that contains different gradient-based optimization algorithms, commonly used to train neural networks and other machine learning models. The package contains the following algorithms:
- Gradient Descent
- Momentum
- RMSprop
- Nesterov accelerated gradient
- Adam
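To build intuition for what these optimizers do, here is a minimal NumPy sketch of the first two update rules (plain gradient descent and momentum) applied to a simple quadratic. This is an illustrative example only, not the package's actual API; the function and parameter names are hypothetical.

```python
import numpy as np

# Toy objective: f(x) = (x - 3)^2, whose minimum is at x = 3.
def grad(x):
    # Analytical derivative df/dx of f(x) = (x - 3)^2
    return 2.0 * (x - 3.0)

def gradient_descent(x0, lr=0.1, steps=200):
    # Plain gradient descent: step against the gradient.
    x = x0
    for _ in range(steps):
        x -= lr * grad(x)
    return x

def momentum(x0, lr=0.1, beta=0.9, steps=200):
    # Momentum: accumulate a velocity term that smooths the updates.
    x, v = x0, 0.0
    for _ in range(steps):
        v = beta * v + lr * grad(x)
        x -= v
    return x

print(gradient_descent(0.0))  # converges near 3.0
print(momentum(0.0))          # converges near 3.0
```

RMSprop, Nesterov, and Adam follow the same pattern with extra bookkeeping (per-parameter scaling and bias correction); the package wraps these loops so the user only supplies the gradient.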
The package's purpose is to simplify the use of optimization algorithms and to give users better intuition about how these black-box algorithms work.
This is an open-source project; feedback, improvement ideas, and contributions are welcome.
Installation
Dependencies
- Python (>= 3.6)
- NumPy (>= 1.13.3)
- Matplotlib (>=3.2.1)
User installation
pip install gradient-descent
Development
Contributors of all levels are welcome to help in any way possible.
Source Code
git clone https://github.com/DanielDaCosta/gradient-descent.git
Tests
pytest tests
TO DO
The package is still in its early days and there are many improvements to make:
Acknowledgements
First of all, I would like to thank Hammad Shaikh for his well-documented and clearly explained GitHub repository, Math of Machine Learning Course by Siraj.
I would also like to acknowledge the following articles and resources that helped in the development of this package: