# Iterative Algorithms
The implementation of the operators is separated from the implementation of the iterative algorithm itself; the operators are passed to the algorithm as function pointers. A similar strategy has been described by Murphy et al.
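To illustrate this separation, the following is a minimal sketch (illustrative only, not the actual BART API; all names are hypothetical) of a solver that sees the normal operator \(A^\top A\) only through a function pointer:

```c
#include <stdlib.h>

/* hypothetical operator type: applies A^T A to 'src' and writes the result to 'dst' */
typedef void (*normal_op_t)(void* data, float* dst, const float* src);

/* hypothetical gradient-descent loop for 0.5 * ||Ax - y||_2^2,
 * given A^T A (as 'op' + 'data') and A^T y (as 'adj');
 * the algorithm never needs to know how the operator is implemented */
static void gradient_descent(int maxiter, float step, long N,
		normal_op_t op, void* data, float* x, const float* adj)
{
	float* tmp = malloc(sizeof(float) * N);

	for (int i = 0; i < maxiter; i++) {

		op(data, tmp, x);	/* tmp = A^T A x */

		for (long j = 0; j < N; j++)
			x[j] -= step * (tmp[j] - adj[j]);	/* x -= step * (A^T A x - A^T y) */
	}

	free(tmp);
}
```

Any operator implementation that matches the function-pointer signature can be plugged into such a loop without modifying the solver.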
## iter interfaces
There are three iter interfaces that allow simplified access to the algorithms: iter, iter2, and iter3. The iter interfaces use the linop and operatorp interfaces.
The iter interface considers the problem \(\min_x \frac{1}{2} \| Ax - y \|_2^2 + g(x)\). It requires \(A^\top A\), \(A^\top y\), and \(\text{prox}_g\) as inputs and outputs \(x\). The iter interface supports cg, ist, fista, and admm. \(\text{prox}_g\) has to be NULL for cg.
The iter2 interface considers the problem \(\min_x \frac{1}{2} \| Ax - y \|_2^2 + \sum_i f_i (G_i x)\). It requires \(A^\top A\), \(A^\top y\), \(\text{prox}_{f_i}\), and the \(G_i\) linops as inputs and outputs \(x\). The iter2 interface supports cg, ist, fista, and admm. \(\text{prox}_{f_i}\) has to be NULL for cg. For ist and fista, the number of \(\text{prox}_{f_i}\) has to be one.
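As a concrete illustration (an assumed example, not part of the interface description above), a single \(\ell_1\) regularizer applied to a linear transform \(G_1\) fits the iter2 form, and its proximal operator is the elementwise soft threshold; with \(G_1 = I\) and \(f_1 = g\), the problem reduces to the iter form:

\[
\min_x \frac{1}{2} \| Ax - y \|_2^2 + \lambda \| G_1 x \|_1,
\qquad
\mathrm{prox}_{\tau \lambda \|\cdot\|_1}(v)_j = \mathrm{sign}(v_j)\, \max( |v_j| - \tau \lambda,\, 0 ).
\]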
The iter3 interface contains the rest of the algorithms without a unifying interface.
## cg
cg implements the conjugate gradient method. It solves:

\(\min_x \frac{1}{2} \| Ax - y \|_2^2 + \frac{\lambda}{2} \| x \|_2^2,\)

which is equivalent to the regularized normal equations \((A^\top A + \lambda I)\, x = A^\top y\).
Parameters:
Regularization \(\lambda\) and number of iterations
Input:
\(A^\top A\), and \(A^\top y\)
Output:
\(x\)
Iteration steps:
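For reference, one standard form of the conjugate gradient recurrence applied to \((A^\top A + \lambda I)\, x = A^\top y\), abbreviating \(M = A^\top A + \lambda I\), is:

\[
\begin{aligned}
r_0 &= A^\top y - M x_0, \qquad p_0 = r_0,\\
\alpha_k &= \frac{r_k^\top r_k}{p_k^\top M p_k},\\
x_{k+1} &= x_k + \alpha_k p_k,\\
r_{k+1} &= r_k - \alpha_k M p_k,\\
\beta_k &= \frac{r_{k+1}^\top r_{k+1}}{r_k^\top r_k},\\
p_{k+1} &= r_{k+1} + \beta_k p_k.
\end{aligned}
\]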
## ist
ist implements the iterative soft-thresholding method, also known as the proximal gradient method. It solves:

\(\min_x \frac{1}{2} \| Ax - y \|_2^2 + g(x),\)

where \(g(x)\) is simple, that is, the proximal operator of \(g(x)\) can be computed easily.
Parameters:
Step-size \(\tau\) and number of iterations
Input:
\(A^\top A\), \(A^\top y\), and \(\text{prox}_g\)
Output:
\(x\)
Iteration steps:
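For reference, one standard form of the proximal gradient update with step size \(\tau\), written in terms of the inputs \(A^\top A\) and \(A^\top y\), is:

\[
x^{(k+1)} = \mathrm{prox}_{\tau g}\!\left( x^{(k)} - \tau \left( A^\top A\, x^{(k)} - A^\top y \right) \right).
\]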
## fista
fista implements the fast iterative soft-thresholding method, also known as the accelerated proximal gradient method. It solves:

\(\min_x \frac{1}{2} \| Ax - y \|_2^2 + g(x),\)

where \(g(x)\) is simple, that is, the proximal operator of \(g(x)\) can be computed easily.
Parameters:
Step-size \(\tau\) and number of iterations
Input:
\(A^\top A\), \(A^\top y\), and \(\text{prox}_g\)
Output:
\(x\)
Iteration steps:
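For reference, one standard form of the accelerated update with step size \(\tau\), momentum sequence \(t_k\) (with \(t_1 = 1\)), and auxiliary point \(z^{(k)}\) (with \(z^{(1)} = x^{(0)}\)), is:

\[
\begin{aligned}
x^{(k)} &= \mathrm{prox}_{\tau g}\!\left( z^{(k)} - \tau \left( A^\top A\, z^{(k)} - A^\top y \right) \right),\\
t_{k+1} &= \frac{1 + \sqrt{1 + 4 t_k^2}}{2},\\
z^{(k+1)} &= x^{(k)} + \frac{t_k - 1}{t_{k+1}} \left( x^{(k)} - x^{(k-1)} \right).
\end{aligned}
\]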
## admm
admm implements the alternating direction method of multipliers. It solves:

\(\min_x \frac{1}{2} \| Ax - y \|_2^2 + \sum_i f_i (G_i x),\)

where the \(f_i\) are simple, that is, the proximal operator of each \(f_i\) can be computed easily.
Parameters:
Convergence parameter \(\rho\) and number of iterations
Input:
\(A^\top A\), \(A^\top y\), \(\text{prox}_{f_i}\), \(G_i\), \(G_i^\top\), \(G_i^\top G_i\), and \(b_i\)
Output:
\(x\)
Iteration steps:
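As a sketch for the case of a single regularizer \(f(Gx)\), the standard scaled-form ADMM updates with parameter \(\rho\) are given below; the \(x\)-update is itself a linear system in \(A^\top A + \rho\, G^\top G\), which is where \(G^\top\) and \(G^\top G\) enter and which can be solved with cg:

\[
\begin{aligned}
x^{(k+1)} &= \left( A^\top A + \rho\, G^\top G \right)^{-1} \left( A^\top y + \rho\, G^\top \left( z^{(k)} - u^{(k)} \right) \right),\\
z^{(k+1)} &= \mathrm{prox}_{f/\rho}\!\left( G x^{(k+1)} + u^{(k)} \right),\\
u^{(k+1)} &= u^{(k)} + G x^{(k+1)} - z^{(k+1)}.
\end{aligned}
\]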