copt.minimize_frank_wolfe

copt.minimize_frank_wolfe(fun, x0, lmo, jac='2-point', step='backtracking', lipschitz=None, args=(), max_iter=400, tol=1e-12, callback=None, verbose=0, eps=1e-08)

Frank-Wolfe algorithm.

Implements the Frank-Wolfe algorithm; see Frank-Wolfe for a more detailed description.
 Parameters
fun –
callable The objective function to be minimized.
fun(x, *args) -> float
where x is a 1D array with shape (n,) and args is a tuple of the fixed parameters needed to completely specify the function.
x0 – array-like Initial guess for solution.
lmo – callable Takes as input a vector u of the same size as x0 and returns both the update direction and the maximum admissible stepsize.
jac –
{callable, ‘2-point’, bool}, optional Method for computing the gradient vector. If it is a callable, it should be a function that returns the gradient vector:
jac(x, *args) -> array_like, shape (n,)
where x is an array with shape (n,) and args is a tuple with the fixed parameters. Alternatively, ‘2-point’ selects a finite difference scheme for numerical estimation of the gradient. If jac is a Boolean and is True, fun is assumed to return the gradient along with the objective function. If False, the gradient will be estimated using ‘2-point’ finite difference estimation.
step –
str or callable, optional Stepsize strategy to use. Should be one of
"backtracking", will use the backtracking line-search from [PANJ2020]
"DR", will use the Demyanov-Rubinov stepsize. This stepsize minimizes a quadratic upper bound of the objective using the gradient's Lipschitz constant, passed in the keyword argument lipschitz. [P2018]
”sublinear”, will use a decreasing stepsize of the form 2/(k+2). [J2013]
callable, if step is a callable function, it will use the stepsize returned by step(locals).
lipschitz – None or float, optional Estimate for the Lipschitz constant of the gradient. Required when step=”DR”.
max_iter – integer, optional Maximum number of iterations.
tol – float, optional Tolerance of the stopping criterion. The algorithm will stop whenever the Frank-Wolfe gap is below tol or the maximum number of iterations is exceeded.
callback – callable, optional Callback to execute at each iteration. If the callable returns False then the algorithm will immediately return.
eps – float or ndarray If jac is approximated, use this value for the step size.
verbose – int, optional Verbosity level.
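The lmo contract described above can be sketched for a concrete constraint set. The following is a minimal illustration for the l1 ball; the radius argument, the function name, and the convention that u is the ascent direction (e.g. the negative gradient) are assumptions for this sketch, not part of the documented API, and the library's own oracle may additionally receive the current iterate in order to return s - x as the direction:

```python
import numpy as np

def l1_ball_lmo(u, alpha=1.0):
    """Sketch of a linear minimization oracle for the l1 ball of radius alpha.

    Returns the vertex s of the ball maximizing <u, s>, together with the
    maximum admissible stepsize (1.0 for a fixed constraint set).
    """
    i = np.argmax(np.abs(u))       # coordinate with the largest |u_i|
    s = np.zeros_like(u)
    s[i] = alpha * np.sign(u[i])   # vertex of the l1 ball along that axis
    return s, 1.0
```

Because the l1 ball is the convex hull of the signed coordinate vectors, the linear subproblem is solved exactly by picking the single coordinate where |u| is largest, which is what makes Frank-Wolfe iterates sparse on this set.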
 Returns
 scipy.optimize.OptimizeResult
The optimization result represented as a scipy.optimize.OptimizeResult object. Important attributes are: x, the solution array; success, a Boolean flag indicating if the optimizer exited successfully; and message, which describes the cause of the termination. See scipy.optimize.OptimizeResult for a description of other attributes.
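To illustrate how these attributes are accessed, the snippet below constructs a scipy.optimize.OptimizeResult by hand; the field values are placeholders, not real solver output:

```python
from scipy.optimize import OptimizeResult

# OptimizeResult is a dict subclass with attribute access; the solver
# fills these fields in on return. Values here are illustrative only.
res = OptimizeResult(x=[0.0, 1.0], success=True,
                     message="Converged: Frank-Wolfe gap below tolerance.")

print(res.x)        # the solution array
print(res.success)  # True if the optimizer exited successfully
print(res.message)  # cause of the termination
```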
References
 J2013
Jaggi, Martin. "Revisiting Frank-Wolfe: Projection-Free Sparse Convex Optimization." ICML 2013.
 P2018
Pedregosa, Fabian. "Notes on the Frank-Wolfe Algorithm", 2018.
 PANJ2020
Pedregosa, Fabian, Armin Askari, Geoffrey Negiar, and Martin Jaggi. "Step-Size Adaptivity in Projection-Free Optimization." arXiv:1806.05123 (2020).
Examples
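The example block is empty in this extraction. As a stand-in, the following self-contained NumPy sketch shows the iteration this function implements — the linear minimization oracle, the Frank-Wolfe gap stopping criterion, and the "sublinear" 2/(k+2) stepsize — on a least-squares problem over the l1 ball. It does not call copt itself; the problem data and loop are illustrative:

```python
import numpy as np

# Minimize f(x) = 0.5 * ||A x - b||^2 over the l1 ball of radius 1,
# using Frank-Wolfe with the "sublinear" 2/(k+2) stepsize.
rng = np.random.RandomState(0)
A = rng.randn(20, 5)
b = A @ np.array([0.5, -0.3, 0.0, 0.0, 0.0])  # true solution inside the ball

def grad(x):
    """Gradient of f(x) = 0.5 * ||A x - b||^2."""
    return A.T @ (A @ x - b)

x = np.zeros(5)  # feasible starting point
for k in range(400):
    g = grad(x)
    # Linear minimization oracle: vertex of the l1 ball minimizing <g, s>.
    i = np.argmax(np.abs(g))
    s = np.zeros_like(x)
    s[i] = -np.sign(g[i])
    gap = g @ (x - s)  # Frank-Wolfe gap, used as the stopping criterion
    if gap < 1e-8:
        break
    x = x + 2.0 / (k + 2) * (s - x)  # "sublinear" stepsize from [J2013]
```

Every iterate is a convex combination of the starting point and l1-ball vertices, so the method stays feasible without any projection step, which is the main appeal of Frank-Wolfe over projected gradient methods.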