copt.minimize_frank_wolfe
copt.minimize_frank_wolfe(fun, x0, lmo, jac='2-point', step='backtracking', lipschitz=None, args=(), max_iter=400, tol=1e-12, callback=None, verbose=0, eps=1e-08)
Frank-Wolfe algorithm.
Implements the Frank-Wolfe algorithm; see Frank-Wolfe for a more detailed description.
- Parameters
fun – callable. The objective function to be minimized:
fun(x, *args) -> float
where x is a 1-D array with shape (n,) and args is a tuple of the fixed parameters needed to completely specify the function.
x0 – array-like. Initial guess for the solution.
lmo – callable. Takes as input a vector u of the same size as x0 and returns both the update direction and the maximum admissible step-size.
jac – {callable, ‘2-point’, bool}, optional. Method for computing the gradient vector. If it is a callable, it should be a function that returns the gradient vector:
jac(x, *args) -> array_like, shape (n,)
where x is an array with shape (n,) and args is a tuple with the fixed parameters. Alternatively, ‘2-point’ selects a finite-difference scheme for numerical estimation of the gradient. If jac is a Boolean and is True, fun is assumed to return the gradient along with the objective function. If False, the gradient will be estimated using ‘2-point’ finite-difference estimation.
step – str or callable, optional. Step-size strategy to use. Should be one of:
”backtracking”: uses the backtracking line search from [PANJ2020].
”DR”: uses the Demyanov-Rubinov step-size. This step-size minimizes a quadratic upper bound on the objective using the gradient’s Lipschitz constant, passed in the keyword argument lipschitz. [P2018]
”sublinear”: uses a decreasing step-size of the form 2/(k+2). [J2013]
callable: if step is a callable function, the step-size returned by step(locals()) will be used.
lipschitz – None or float, optional Estimate for the Lipschitz constant of the gradient. Required when step=”DR”.
max_iter – integer, optional Maximum number of iterations.
tol – float, optional Tolerance of the stopping criterion. The algorithm will stop whenever the Frank-Wolfe gap is below tol or the maximum number of iterations is exceeded.
callback – callable, optional. Callback to execute at each iteration. If the callable returns False then the algorithm will immediately return.
eps – float or ndarray, optional. Step size used for the finite-difference scheme when the gradient is numerically approximated.
verbose – int, optional Verbosity level.
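To make the lmo contract concrete, here is a hand-written oracle for the L1 ball. The function name l1_ball_lmo and the (u, x) calling convention are illustrative assumptions for this sketch, not copt's own API:

```python
import numpy as np

def l1_ball_lmo(u, x, alpha=1.0):
    """Linear minimization oracle for the L1 ball {s : ||s||_1 <= alpha}.

    Given a vector u (typically the negative gradient), returns the
    update direction s - x, where s is the vertex of the ball that
    maximizes <u, s>, together with the maximum admissible step-size.
    """
    idx = np.argmax(np.abs(u))        # coordinate with the largest |u_i|
    s = np.zeros_like(x)
    s[idx] = alpha * np.sign(u[idx])  # signed vertex of the L1 ball
    return s - x, 1.0
```

For a polytope such as the L1 ball the maximum admissible step-size is simply 1, since any convex combination of a feasible point and a vertex remains feasible.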
- Returns
scipy.optimize.OptimizeResult
The optimization result represented as a scipy.optimize.OptimizeResult object. Important attributes are: x, the solution array; success, a Boolean flag indicating if the optimizer exited successfully; and message, which describes the cause of the termination. See scipy.optimize.OptimizeResult for a description of other attributes.
References
- J2013
Jaggi, Martin. “Revisiting Frank-Wolfe: Projection-Free Sparse Convex Optimization.” ICML 2013.
- P2018
Pedregosa, Fabian. “Notes on the Frank-Wolfe Algorithm”, 2018.
- PANJ2020
Pedregosa, Fabian, Armin Askari, Geoffrey Negiar, and Martin Jaggi. “Step-Size Adaptivity in Projection-Free Optimization.” arXiv:1806.05123 (2020).
Examples
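As an illustration of what the solver computes, the following is a pure-NumPy sketch of the “sublinear” (2/(k+2)) step-size variant on a least-squares objective over the L1 ball. The problem data, the helper l1_ball_lmo, and the plain loop are illustrative assumptions, not copt's actual implementation:

```python
import numpy as np

# Least-squares objective f(x) = 0.5 * ||A x - b||^2, minimized over
# the unit L1 ball with a hand-rolled Frank-Wolfe loop.
rng = np.random.RandomState(0)
A = rng.randn(20, 10)
b = rng.randn(20)

def fun(x):
    return 0.5 * np.linalg.norm(A @ x - b) ** 2

def grad(x):
    return A.T @ (A @ x - b)

def l1_ball_lmo(u, x, alpha=1.0):
    # Vertex of the L1 ball maximizing <u, s>, returned as a direction.
    idx = np.argmax(np.abs(u))
    s = np.zeros_like(x)
    s[idx] = alpha * np.sign(u[idx])
    return s - x, 1.0

x = np.zeros(10)
for k in range(2000):
    # The oracle is queried at the negative gradient.
    d, max_step = l1_ball_lmo(-grad(x), x)
    step = min(2.0 / (k + 2), max_step)  # decreasing "sublinear" step-size
    x += step * d
```

With copt installed, the same problem would instead be passed directly as copt.minimize_frank_wolfe(fun, x0, lmo, jac=grad) with an L1-ball oracle; the loop above only illustrates the mechanics of the lmo / step-size interplay.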