HGDL

HGDL is an API for distributed, constrained function optimization on HPC systems.
At its core, the algorithm combines local and global optimization
with bump-function-based deflation to provide a growing list of unique optima of a differentiable function.
This tackles the common problem of non-uniqueness of optimization problems, which is especially prevalent in machine learning.
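To make the deflation idea concrete, here is a minimal 1-D sketch (an illustration under simplified assumptions, not HGDL's actual implementation; `bump` and `deflated_newton` are hypothetical names): dividing the gradient by 1 - b(x), where b is a bump function equal to 1 at an already-found optimum and 0 outside a radius r around it, makes the deflated gradient blow up near that optimum, so a restarted Newton run is repelled toward a new one.

```python
import numpy as np

def bump(x, x_found, r=1.0):
    # Smooth bump: equals 1 at x_found and 0 outside radius r around it.
    d2 = (x - x_found) ** 2 / r ** 2
    if d2 >= 1.0:
        return 0.0
    return np.e * np.exp(-1.0 / (1.0 - d2))

def deflated_newton(grad, hess, x, found, r=1.0, steps=200, tol=1e-8):
    # Newton iteration on a gradient deflated by all previously found optima.
    # |hess| is used so only minima act as attractors (a common safeguard;
    # HGDL's dNewton may differ in detail).
    for _ in range(steps):
        defl = np.prod([1.0 - bump(x, xf, r) for xf in found]) if found else 1.0
        g = grad(x) / max(defl, 1e-12)  # deflated gradient blows up near found optima
        if abs(g) < tol:
            break
        x = x - g / abs(hess(x))
    return x

# f(x) = (x^2 - 1)^2 has two minima, at x = -1 and x = +1.
grad = lambda x: 4.0 * x**3 - 4.0 * x
hess = lambda x: 12.0 * x**2 - 4.0

x1 = deflated_newton(grad, hess, 2.0, [])    # converges to the minimum at x = 1
x2 = deflated_newton(grad, hess, 2.0, [x1])  # deflation repels from 1; finds x = -1
```

Both runs start from the same point, yet the second one finds a different minimum because the first has been deflated away; HGDL applies this idea with distributed local and global optimization as described above.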
Usage
The following demonstrates basic usage of the HGDL API.
import numpy as np
from hgdl.hgdl import HGDL as hgdl
from hgdl.support_functions import *
import dask.distributed as distributed

# Search domain: one [lower, upper] pair per dimension.
bounds = np.array([[-500, 500], [-500, 500]])

# Set up the optimizer with the Schwefel test function and its gradient.
a = hgdl(schwefel, schwefel_gradient, bounds,
         global_optimizer="genetic",
         local_optimizer="dNewton",
         number_of_optima=30000,
         num_epochs=100)

# Random starting positions, one row per starting point.
x0 = np.random.uniform(low=bounds[:, 0], high=bounds[:, 1], size=(20, 2))
a.optimize(x0=x0)

# Retrieve the list of optima found so far.
a.get_latest()

# Shut down the associated dask client.
a.kill_client()
Credits
Main Developers: Marcus Noack (MarcusNoack@lbl.gov) and David Perryman.
Several people from across the DOE national labs have contributed insights
that shaped the code in its current form.
See AUTHORS for more details.
HGDL is based on the HGDN algorithm by Noack and Funke.