copt.minimize_primal_dual

copt.minimize_primal_dual(f_grad, x0, prox_1=None, prox_2=None, L=None, tol=1e-12, max_iter=1000, callback=None, step_size=1.0, step_size2=None, line_search=True, max_iter_ls=20, verbose=0)

Primal-dual hybrid gradient splitting method.

This method solves optimization problems of the form

minimize_x f(x) + g(x) + h(L x)

where f is a smooth function, and g and h are (possibly non-smooth) functions for which the proximal operators are known.
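As a sketch of the underlying idea (not the library's implementation), a fixed-step primal-dual iteration for this problem can be written in a few lines of NumPy. The helper names and the toy instance below are illustrative only:

```python
import numpy as np

def soft_threshold(v, t):
    # proximal operator of t * ||.||_1
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def pdhg(f_grad, prox_g, prox_h, L, x0, tau, sigma, max_iter=1000):
    # fixed-step primal-dual iteration for min_x f(x) + g(x) + h(L x)
    x = x0.copy()
    y = np.zeros(L.shape[0])
    for _ in range(max_iter):
        _, grad = f_grad(x)
        x_new = prox_g(x - tau * (grad + L.T @ y), tau)
        # dual step: prox of the conjugate h* obtained via the Moreau identity
        z = y + sigma * (L @ (2 * x_new - x))
        y = z - sigma * prox_h(z / sigma, 1.0 / sigma)
        x = x_new
    return x

# toy instance: f(x) = 0.5 ||x - b||^2, g = 0, h = ||.||_1, L = identity,
# whose exact solution is the soft-thresholding of b
b = np.array([3.0, -0.2, 0.7, -2.5])
f_grad = lambda x: (0.5 * np.sum((x - b) ** 2), x - b)
prox_g = lambda x, t: x                      # g = 0, so its prox is the identity
prox_h = lambda z, t: soft_threshold(z, t)   # prox of the l1 norm
# tau, sigma satisfy tau * (L_f / 2 + sigma * ||L||^2) <= 1 (here 0.5)
x = pdhg(f_grad, prox_g, prox_h, np.eye(4), np.zeros(4), tau=0.5, sigma=0.5)
# x is approximately soft_threshold(b, 1.0) = [2.0, 0.0, 0.0, -1.5]
```

The actual solver adds a line search on the step sizes (Algorithm 4 of Malitsky and Pock, referenced below), which this fixed-step sketch omits.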

Parameters
  • f_grad – callable. Returns the function value and gradient of the smooth term f. It should accept the optional keyword argument return_gradient; when it is False, only the function value is returned.

  • prox_1 – callable. prox_1(x, alpha, *args) returns the proximal operator of g at x with parameter alpha.

  • prox_2 – callable or None. prox_2(y, alpha, *args) returns the proximal operator of h at y with parameter alpha.

  • x0 – array-like Initial guess of solution.

  • L

    array-like or linear operator Linear operator inside the h term. It may be any of the following types:

    • ndarray

    • matrix

    • sparse matrix (e.g. csr_matrix, lil_matrix, etc.)

    • LinearOperator

    • An object with .shape and .matvec attributes

  • max_iter – int Maximum number of iterations.

  • verbose – int Verbosity level, from 0 (no output) to 2 (output on each iteration).

  • callback – callable, optional. Called after each iteration with a single argument (x), the current iterate. The algorithm exits early if the callback returns False.
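The callable interfaces above can be illustrated with a small sketch; the objective, the names (b, StopAfter), and the values are hypothetical, chosen only to match the documented contracts:

```python
import numpy as np

# f(x) = 0.5 * ||x - b||^2 ; f_grad follows the documented contract:
# it accepts return_gradient and returns only the value when it is False.
b = np.array([1.0, -2.0, 0.5])

def f_grad(x, return_gradient=True):
    fx = 0.5 * np.sum((x - b) ** 2)
    if not return_gradient:
        return fx
    return fx, x - b

def prox_1(x, alpha):
    # proximal operator of g(x) = ||x||_1 with parameter alpha (soft-thresholding)
    return np.sign(x) * np.maximum(np.abs(x) - alpha, 0.0)

class StopAfter:
    # a callback can stop the solver early by returning False,
    # here after a fixed number of iterations
    def __init__(self, n):
        self.n, self.calls = n, 0
    def __call__(self, x):
        self.calls += 1
        return self.calls < self.n
```

These could then be passed as, e.g., copt.minimize_primal_dual(f_grad, np.zeros(3), prox_1=prox_1, callback=StopAfter(10)).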

Returns

res – The optimization result represented as a scipy.optimize.OptimizeResult object. Important attributes are: x, the solution array; success, a Boolean flag indicating whether the optimizer exited successfully; and message, which describes the cause of the termination. See scipy.optimize.OptimizeResult for a description of other attributes.

Return type

scipy.optimize.OptimizeResult

References

The implemented algorithm corresponds to Algorithm 4 in: Malitsky, Yura, and Thomas Pock. “A first-order primal-dual algorithm with linesearch.” SIAM Journal on Optimization (2018).

Condat, Laurent. “A primal-dual splitting method for convex optimization involving Lipschitzian, proximable and linear composite terms.” Journal of Optimization Theory and Applications (2013).

Chambolle, Antonin, and Thomas Pock. “On the ergodic convergence rates of a first-order primal-dual algorithm.” Mathematical Programming (2015).