Update PDHG docstring. #2220
manchester-jhellier wants to merge 2 commits into TomographicImaging:master
Conversation
MargaretDuff left a comment:
Thanks @manchester-jhellier for starting this conversation! I made some comments below.
> The general problem considered in the PDHG algorithm is the generic saddle-point problem
>
> .. math:: \min_{x\in X}\max_{y\in Y} \langle Kx, y \rangle + g(x) - f^{*}(x)
Your above comment led me to spot this...
Suggested change:
- .. math:: \min_{x\in X}\max_{y\in Y} \langle Kx, y \rangle + g(x) - f^{*}(x)
+ .. math:: \min_{x\in X}\max_{y\in Y} \langle Kx, y \rangle + g(x) - f^{*}(y)
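For reference, and not part of the diff: the conjugate has to take the dual variable because maximising over :math:`y` collapses the saddle-point problem back to the primal one. This is the standard convex-analysis identity, stated here only as context for the fix:

```latex
% Fenchel-Moreau identity (f proper, convex, lower semi-continuous):
% maximising the dual variable out of the saddle-point problem recovers
% f applied to Kx, so the saddle-point form is equivalent to the primal problem.
\max_{y\in Y}\bigl\{\langle Kx, y \rangle - f^{*}(y)\bigr\} = f^{**}(Kx) = f(Kx)
\qquad\Longrightarrow\qquad
\min_{x\in X}\; f(Kx) + g(x)
```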
> A Linear Operator.
> sigma : positive :obj:`float`, or `np.ndarray`, `DataContainer`, `BlockDataContainer`, optional, default is 1.0/norm(K) or 1.0/(tau*norm(K)**2) if tau is provided
- Step size for the dual problem.
+ Step size for the dual problem. Needs to obey constraints with tau and operator norm to be valid, see below for details.
Suggested change:
- Step size for the dual problem. Needs to obey constraints with tau and operator norm to be valid, see below for details.
+ Step size for the dual problem. Needs to obey constraints with tau and operator norm to satisfy convergence guarantees, see below for details.
Potentially? "Valid" is interesting, as you can get step sizes that don't meet the convergence guarantees but still lead to convergence... and this can vastly speed things up!
> tau : positive :obj:`float`, or `np.ndarray`, `DataContainer`, `BlockDataContainer`, optional, default is 1.0/norm(K) or 1.0/(sigma*norm(K)**2) if sigma is provided
- Step size for the primal problem.
+ Step size for the primal problem. Needs to obey constraints with sigma and operator's norm to be valid, see below for details.
Suggested change:
- Step size for the primal problem. Needs to obey constraints with sigma and operator's norm to be valid, see below for details.
+ Step size for the primal problem. Needs to obey constraints with sigma and operator's norm to satisfy convergence guarantees, see below for details.
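To make "constraints with tau and operator norm" concrete: the classical PDHG guarantee requires :math:`\sigma \tau \|K\|^{2} \leq 1`. A minimal sketch of choosing and checking step sizes, assuming `f`, `g` and the operator `K` are already defined and that your CIL version's `PDHG` accepts `sigma`/`tau` keywords and operators expose `norm()` (verify against your installation):

```python
# Minimal sketch, assuming f, g and a CIL Operator K are already defined,
# and that PDHG accepts sigma/tau keywords (check your CIL version).
from cil.optimisation.algorithms import PDHG

norm_K = K.norm()      # power-method estimate of the operator norm ||K||
tau = 1.0 / norm_K     # primal step size (the documented default)
sigma = 1.0 / norm_K   # dual step size (the documented default)

# Classical convergence guarantee: sigma * tau * ||K||**2 <= 1.
# Step sizes breaking this can still converge in practice, often faster,
# but the theoretical guarantee no longer applies.
assert sigma * tau * norm_K ** 2 <= 1.0 + 1e-10

pdhg = PDHG(f=f, g=g, operator=K, sigma=sigma, tau=tau)
pdhg.run(100)
```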
> ----------
> f : Function
- A convex function with a "simple" proximal method of its conjugate.
+ A convex function with a "simple" proximal method of its conjugate. This function must map from the operator range to the Reals, as :math:`f(Kx)` or :math:`f^{*}(x)` will be the contribution to the total objective. See below for details.
Suggested change:
- A convex function with a "simple" proximal method of its conjugate. This function must map from the operator range to the Reals, as :math:`f(Kx)` or :math:`f^{*}(x)` will be the contribution to the total objective. See below for details.
+ A convex function with a "simple" proximal method of its conjugate. This function must map from the operator range to the Reals, as :math:`f(Kx)` will be the contribution to the total objective. See below for details.
I think leave the convex conjugate out for now?
> class PDHG(Algorithm):
>
>     r"""Primal Dual Hybrid Gradient (PDHG) algorithm, see :cite:`CP2011`, :cite:`EZXC2010`.
Maybe if we make the objective clear at the top it will also help?
| r"""Primal Dual Hybrid Gradient (PDHG) algorithm, see :cite:`CP2011`, :cite:`EZXC2010`. | |
| PDHG minimises objectives of the form: | |
| .. math:: \min_{x\in X} f(Kx) + g(x), | |
| where :math:`f` and the regulariser :math:`g` need to be proper, convex and lower semi-continuous. The function :math:`f` and the convex conjugate of :math:`g` must also have calculable proximal methods. |
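As a concrete illustration of that split, here is a hedged sketch of TV-regularised denoising written as :math:`f(Kx) + g(x)`. The class names follow CIL's optimisation module and `data` is assumed to be an already-loaded `DataContainer`; treat it as a sketch rather than canonical usage:

```python
# Hedged sketch: TV denoising in the f(Kx) + g(x) split described above.
# GradientOperator, MixedL21Norm and L2NormSquared are CIL names; `data`
# is an assumed, already-loaded DataContainer.
from cil.optimisation.algorithms import PDHG
from cil.optimisation.operators import GradientOperator
from cil.optimisation.functions import MixedL21Norm, L2NormSquared

alpha = 0.1                            # regularisation weight (problem-dependent)
K = GradientOperator(data.geometry)    # K : x -> grad(x), range is the gradient space
f = alpha * MixedL21Norm()             # f(Kx) = alpha * TV(x)
g = 0.5 * L2NormSquared(b=data)        # g(x) = 0.5 * ||x - data||_2^2

pdhg = PDHG(f=f, g=g, operator=K)      # defaults: sigma = tau = 1/norm(K)
pdhg.run(200)
recon = pdhg.solution
```

Note how `f` acts on the operator range (gradient images) while `g` acts on `x` itself, which is exactly the mapping requirement discussed in the comment on the `f` parameter above.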
Hi @manchester-jhellier, thanks for opening this PR. Are you happy for us to commit Margaret's suggested changes? 😄
Description
Updated the docstring for PDHG to hopefully make it a bit clearer. Feel free to change. Related issue is #2219.
Contribution Notes
❤️ Thanks for your contribution!