.. image:: https://travis-ci.org/lmjohns3/downhill.svg

.. image:: https://coveralls.io/repos/lmjohns3/downhill/badge.svg
   :target: https://coveralls.io/r/lmjohns3/downhill

.. image:: http://depsy.org/api/package/pypi/downhill/badge.svg
   :target: http://depsy.org/package/python/downhill
DOWNHILL
========
The ``downhill`` package provides algorithms for minimizing scalar loss functions that are defined using Theano_.
Several optimization algorithms are included:

- ADADELTA_
- ADAGRAD_
- Adam_
- `Equilibrated SGD`_
- `Nesterov's Accelerated Gradient`_
- RMSProp_
- `Resilient Backpropagation`_
- `Stochastic Gradient Descent`_

All algorithms permit the use of regular or Nesterov-style momentum as well.
.. _Theano: http://deeplearning.net/software/theano/

.. _Stochastic Gradient Descent: http://downhill.readthedocs.org/en/stable/generated/downhill.first_order.SGD.html
.. _Nesterov's Accelerated Gradient: http://downhill.readthedocs.org/en/stable/generated/downhill.first_order.NAG.html
.. _Resilient Backpropagation: http://downhill.readthedocs.org/en/stable/generated/downhill.adaptive.RProp.html
.. _ADAGRAD: http://downhill.readthedocs.org/en/stable/generated/downhill.adaptive.ADAGRAD.html
.. _RMSProp: http://downhill.readthedocs.org/en/stable/generated/downhill.adaptive.RMSProp.html
.. _ADADELTA: http://downhill.readthedocs.org/en/stable/generated/downhill.adaptive.ADADELTA.html
.. _Adam: http://downhill.readthedocs.org/en/stable/generated/downhill.adaptive.Adam.html
.. _Equilibrated SGD: http://downhill.readthedocs.org/en/stable/generated/downhill.adaptive.ESGD.html
Let's say you have 100 samples of 1000-dimensional data, and you want to represent your data as 100 coefficients in a 10-dimensional basis. This is pretty straightforward to model using Theano: you can use a matrix multiplication as the data model, a squared-error term for optimization, and a sparse regularizer to encourage small coefficient values.
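Written out, the objective that the code below constructs is the mean squared reconstruction error plus an L1 penalty on the coefficients ``u`` (shape A×K) and an L2 penalty on the basis ``v`` (shape K×B); this is a sketch matching the code that follows:

.. math::

   \mathcal{L}(u, v) = \frac{1}{AB}\sum_{i,j}\bigl(z_{ij} - (uv)_{ij}\bigr)^2
                     + \frac{1}{AK}\sum_{i,k}|u_{ik}|
                     + \frac{1}{KB}\sum_{k,j}v_{kj}^2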
Once you have constructed an expression for the loss, you can optimize it with a single call to ``downhill.minimize``:
.. code:: python

    import downhill
    import numpy as np
    import theano
    import theano.tensor as TT

    FLOAT = 'df'[theano.config.floatX == 'float32']

    def rand(a, b):
        return np.random.randn(a, b).astype(FLOAT)

    A, B, K = 20, 5, 3

    # Set up a matrix factorization problem to optimize.
    u = theano.shared(rand(A, K), name='u')
    v = theano.shared(rand(K, B), name='v')
    z = TT.matrix()
    err = TT.sqr(z - TT.dot(u, v))
    loss = err.mean() + abs(u).mean() + (v * v).mean()

    # Synthesize a low-rank data matrix plus noise.
    y = np.dot(rand(A, K), rand(K, B)) + rand(A, B)

    # Monitor the reconstruction error and the sparsity of u and v.
    monitors = (('err', err.mean()),
                ('|u|<0.1', (abs(u) < 0.1).mean()),
                ('|v|<0.1', (abs(v) < 0.1).mean()))

    downhill.minimize(
        loss=loss,
        train=[y],
        patience=0,
        batch_size=A,         # Process y as a single batch.
        max_gradient_norm=1,  # Prevent gradient explosion!
        learning_rate=0.1,
        monitors=monitors,
        monitor_gradients=True)

    print('u =', u.get_value())
    print('v =', v.get_value())
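Any of the algorithms listed above can be selected by name through the ``algo`` argument. Here is a hedged sketch reusing ``loss`` and ``y`` from the example above, assuming that ``momentum`` and ``nesterov`` are accepted as hyperparameters (the claim that all algorithms permit regular or Nesterov-style momentum suggests this shape; check the documentation for the exact names):

.. code:: python

    # Same problem as above, optimized with plain SGD plus momentum.
    downhill.minimize(
        loss=loss,
        train=[y],
        algo='sgd',      # Select the algorithm by name.
        momentum=0.9,    # Regular momentum...
        nesterov=True,   # ...or Nesterov-style (assumed flag).
        batch_size=A,
        learning_rate=0.1)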
If you prefer to maintain more control over your model during optimization, ``downhill`` provides an iterative optimization interface:
.. code:: python

    opt = downhill.build(algo='rmsprop',
                         loss=loss,
                         monitors=monitors,
                         monitor_gradients=True)

    for metrics, _ in opt.iterate(train=[[y]],
                                  patience=0,
                                  batch_size=A,
                                  max_gradient_norm=1,
                                  learning_rate=0.1):
        print(metrics)
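The iterative interface also exposes validation-based early stopping: passing a validation dataset lets ``patience`` govern how long to keep training without improvement. A hedged sketch reusing ``opt`` from above, with a hypothetical held-out matrix ``y_valid`` shaped like ``y``, and assuming ``iterate`` accepts a ``valid`` dataset as described in the documentation:

.. code:: python

    # Hypothetical held-out data for validation-based early stopping.
    y_valid = np.dot(rand(A, K), rand(K, B)) + rand(A, B)

    for metrics, _ in opt.iterate(train=[[y]],
                                  valid=[[y_valid]],
                                  patience=5,  # Stop after 5 validations without improvement.
                                  batch_size=A,
                                  learning_rate=0.1):
        print(metrics)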
If that's still not enough, you can simply ask ``downhill`` for the updates to your model variables and handle everything else yourself:
.. code:: python

    updates = downhill.build('rmsprop', loss).get_updates(
        batch_size=A, max_gradient_norm=1, learning_rate=0.1)
    func = theano.function([z], loss, updates=list(updates))
    for _ in range(100):
        print(func(y))  # Evaluate func and apply variable updates.
- Source: http://github.com/lmjohns3/downhill
- Documentation: http://downhill.readthedocs.org
- Mailing list: https://groups.google.com/forum/#!forum/downhill-users