networkqit.algorithms.optimize.Adam
class Adam(G: autograd.numpy.numpy_wrapper.array, x0: autograd.numpy.numpy_wrapper.array, model: networkqit.graphtheory.models.GraphModel.GraphModel, **kwargs)

    Implements the ADAM stochastic gradient descent.

    Adam: A Method for Stochastic Optimization
    Diederik P. Kingma, Jimmy Ba
    https://arxiv.org/abs/1412.6980

    However, here quasi-hyperbolic Adam is used by default, with parameters nu1, nu2:
    https://arxiv.org/pdf/1810.06801.pdf
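    For reference, a single quasi-hyperbolic Adam step can be sketched as below. This is an illustration of the update rule from the quasi-hyperbolic Adam paper referenced above, not the networkqit implementation: lr, beta1, beta2 and eps are generic names assumed for the sketch, while nu1 and nu2 are the parameters mentioned in this docstring.

    import numpy as np

    def qhadam_step(x, grad, m, v, t, lr=1e-3, beta1=0.9, beta2=0.999,
                    nu1=0.7, nu2=1.0, eps=1e-8):
        """One quasi-hyperbolic Adam update; t is the 1-based step counter."""
        m = beta1 * m + (1.0 - beta1) * grad        # EMA of gradients (first moment)
        v = beta2 * v + (1.0 - beta2) * grad ** 2   # EMA of squared gradients (second moment)
        m_hat = m / (1.0 - beta1 ** t)              # bias corrections, as in Adam
        v_hat = v / (1.0 - beta2 ** t)
        # Quasi-hyperbolic interpolation between the raw gradient and the Adam direction
        step = ((1.0 - nu1) * grad + nu1 * m_hat) / (np.sqrt((1.0 - nu2) * grad ** 2 + nu2 * v_hat) + eps)
        return x - lr * step, m, v

    With nu1 = nu2 = 1 this reduces to the standard Adam update.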
__init__(G: autograd.numpy.numpy_wrapper.array, x0: autograd.numpy.numpy_wrapper.array, model: networkqit.graphtheory.models.GraphModel.GraphModel, **kwargs)

    Initialize the stochastic optimizer with the observed graph, an initial guess and the model to optimize.
    Args:
        G (numpy.array): the empirical network to study; an N x N adjacency matrix as numpy.array.
        x0 (numpy.array): the k-element array of initial parameter estimates, typically set at random.
        model (nq.GraphModel): the graph model whose likelihood is to be optimized.
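    A minimal construction sketch, assuming a model class named nq.UBCM with the constructor shown (both are assumptions for illustration; any networkqit GraphModel subclass plays the same role):

    import numpy as np
    import networkqit as nq
    from networkqit.algorithms.optimize import Adam

    N = 32
    A = (np.random.rand(N, N) > 0.8).astype(float)   # hypothetical observed network
    A = np.triu(A, 1)
    A = A + A.T                                      # make it symmetric with zero diagonal

    model = nq.UBCM(N=N)           # assumed model name and signature, illustration only
    x0 = np.random.rand(N)         # k-element initial parameter guess (here k = N)
    opt = Adam(G=A, x0=x0, model=model)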
Methods

__init__(G, x0, model, **kwargs)
    Initialize the stochastic optimizer with the observed graph, an initial guess and the model to optimize.

gradient(x, rho, beta[, batch_size])

run(beta[, rho, maxiter, learning_rate, ...])
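Continuing the construction sketch above, fitting reduces to a call to run with the documented keyword arguments. The specific values and the way the result is inspected are assumptions; this page does not document the return structure.

    sol = opt.run(beta=0.1, maxiter=1000, learning_rate=1e-3)   # illustrative values
    print(sol)                                                  # assumed to be a solution object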