2.1. copt.minimize_TOS

copt.minimize_TOS(f_grad, x0, prox_1=None, prox_2=None, tol=1e-06, max_iter=1000, verbose=0, callback=None, backtracking=True, step_size=None, max_iter_backtracking=100, backtracking_factor=0.7, h_Lipschitz=None)

Davis-Yin three operator splitting method.
This algorithm can solve problems of the form

    minimize_x  f(x) + g(x) + h(x)

where f is a smooth function and g and h are (possibly nonsmooth) functions for which the proximal operator is known.
Parameters:  f_grad (callable) – Returns the function value and gradient of the objective function. With return_gradient=False, returns only the function value.
 prox_1 (callable or None) – prox_1(x, alpha, *args) returns the proximal operator of g at x with parameter alpha. Extra arguments can be passed by prox_1_args.
 prox_2 (callable or None) – prox_2(x, alpha, *args) returns the proximal operator of h at x with parameter alpha.
 x0 (array-like) – Initial guess.
 backtracking (boolean) – Whether to perform backtracking (i.e., line search) to estimate the step size.
 max_iter (int) – Maximum number of iterations.
 verbose (int) – Verbosity level, from 0 (no output) to 2 (output on each iteration).
 step_size (float) – Starting value for the line-search procedure.
 callback (callable) – Optional callback function, called on each iteration.
Returns: res – The optimization result, represented as a scipy.optimize.OptimizeResult object. Important attributes are: x, the solution array; success, a Boolean flag indicating whether the optimizer exited successfully; and message, which describes the cause of the termination. See scipy.optimize.OptimizeResult for a description of other attributes.

Return type: OptimizeResult
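To illustrate the method this function implements, the following is a minimal NumPy sketch of the Davis-Yin iteration with a fixed step size (not the library's implementation, which also supports backtracking): at each step it applies prox of g, a gradient step on f, prox of h, and a correction of the auxiliary variable. The example problem, soft-thresholded nonnegative least squares with f(x) = ½‖x − a‖², g = λ‖x‖₁ and h the indicator of x ≥ 0, is chosen here for illustration because its solution has the closed form max(a − λ, 0).

```python
import numpy as np

def prox_l1(x, alpha):
    # soft-thresholding: proximal operator of alpha * ||x||_1
    return np.sign(x) * np.maximum(np.abs(x) - alpha, 0.0)

def prox_nonneg(x, alpha):
    # projection onto the nonnegative orthant (prox of its indicator)
    return np.maximum(x, 0.0)

def davis_yin(f_grad, prox_1, prox_2, x0, step_size, max_iter=500, tol=1e-10):
    """Sketch of Davis-Yin three operator splitting with a fixed step size."""
    z = x0.copy()
    for _ in range(max_iter):
        x = prox_1(z, step_size)                  # x = prox_{gamma*g}(z)
        grad = f_grad(x)                          # gradient of the smooth term
        w = prox_2(2 * x - z - step_size * grad, step_size)  # prox_{gamma*h}
        z_new = z + w - x                         # correction step
        if np.linalg.norm(z_new - z) < tol:
            z = z_new
            break
        z = z_new
    return prox_1(z, step_size)

# minimize 1/2 ||x - a||^2 + lam * ||x||_1 + indicator(x >= 0)
a = np.array([2.0, -1.0, 0.3])
lam = 0.5
sol = davis_yin(lambda x: x - a,
                lambda x, t: prox_l1(x, lam * t),
                prox_nonneg,
                np.zeros(3), step_size=1.0)
# closed-form solution is max(a - lam, 0) = [1.5, 0, 0]
```

Convergence of this scheme is guaranteed for step sizes below 2/L, where L is the Lipschitz constant of the gradient of f (here L = 1).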
References
 Davis, Damek, and Wotao Yin. "A three-operator splitting scheme and its optimization applications." Set-Valued and Variational Analysis, 2017.
 Pedregosa, Fabian, and Gauthier Gidel. "Adaptive Three Operator Splitting." Proceedings of the 35th International Conference on Machine Learning, 2018.