deepinv.optim
This module provides optimization utilities for constructing reconstruction models based on optimization algorithms. Please refer to the user guide for more details.
Base Class
User Guide: refer to Optimization for more information.
- Helper function for building an instance of the base optimization class below.
- Class for optimization algorithms that consist in iterating a fixed-point operator.
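A minimal usage sketch of the builder workflow. The argument names below (iteration, data_fidelity, prior, params_algo, max_iter) follow the deepinv documentation, but the physics setup and parameter values are illustrative and may need adjusting to your installed version.

```python
# Sketch: build a proximal gradient reconstruction model and run it.
import torch
import deepinv as dinv

# Toy denoising problem (illustrative physics setup).
physics = dinv.physics.Denoising(noise_model=dinv.physics.GaussianNoise(sigma=0.1))
x = torch.rand(1, 1, 32, 32)   # ground-truth image
y = physics(x)                 # noisy measurements

model = dinv.optim.optim_builder(
    iteration="PGD",                              # proximal gradient descent
    data_fidelity=dinv.optim.L2(),                # f(x) = d(A(x), y)
    prior=dinv.optim.L1Prior(),                   # g(x) = ||x||_1
    params_algo={"stepsize": 1.0, "lambda": 0.1},
    max_iter=50,
)
x_hat = model(y, physics)  # iterate the fixed-point operator
```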
Potentials
User Guide: refer to Potentials for more information.
- Base class for a potential \(h : \xset \to \mathbb{R}\) to be used in an optimization problem.
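In practice, subclasses of this potential typically expose its value \(h(x)\), its gradient \(\nabla h(x)\), and its proximity operator, defined as

\[\operatorname{prox}_{\gamma h}(x) = \underset{u \in \xset}{\arg\min} \; \frac{1}{2}\|u - x\|_2^2 + \gamma h(u),\]

which are the operations the iterators listed further below rely on.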
Data Fidelity
User Guide: refer to Data Fidelity for more information.
- Base class for the data fidelity term \(\distance{A(x)}{y}\), where \(A\) is the forward operator, \(x\in\xset\) is a variable, \(y\in\yset\) is the data, and \(d\) is a distance function from the Distance classes listed below.
- \(\ell_1\) data fidelity term.
- Data fidelity term given by the normalized \(\ell_2\) norm, \(\distance{A(x)}{y} = \frac{1}{2}\|A(x)-y\|_2^2\).
- Data fidelity term given by the indicator of the \(\ell_2\) ball with radius \(r\).
- Poisson negative log-likelihood.
- Log-Poisson negative log-likelihood.
- Amplitude loss data fidelity term, used for phase retrieval problems.
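A short sketch of the data fidelity interface. The method names fn, grad, and prox and their (x, y, physics) calling convention are taken from the deepinv documentation; the gamma keyword and the physics setup are assumptions.

```python
# Sketch: evaluate, differentiate and proximally map an L2 data fidelity term.
import torch
import deepinv as dinv

physics = dinv.physics.Denoising(noise_model=dinv.physics.GaussianNoise(sigma=0.1))
x = torch.rand(1, 1, 32, 32)
y = physics(x)

fidelity = dinv.optim.L2()
value = fidelity.fn(x, y, physics)                # f(x) = d(A(x), y)
grad = fidelity.grad(x, y, physics)               # gradient of f at x
x_prox = fidelity.prox(x, y, physics, gamma=0.1)  # proximal step on f
```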
Priors
User Guide: refer to Priors for more information.
- Prior term \(\reg{x}\).
- Plug-and-play prior, \(\operatorname{prox}_{\gamma \regname}(x) = \operatorname{D}_{\sigma}(x)\).
- Regularization-by-Denoising (RED) prior, \(\nabla \reg{x} = x - \operatorname{D}_{\sigma}(x)\).
- Score prior via an MMSE denoiser, \(\nabla \reg{x} = \left(x-\operatorname{D}_{\sigma}(x)\right)/\sigma^2\).
- Tikhonov regularizer, \(\reg{x} = \frac{1}{2}\| x \|_2^2\).
- \(\ell_1\) prior, \(\reg{x} = \| x \|_1\).
- Wavelet prior, \(\reg{x} = \|\Psi x\|_{p}\).
- Total variation (TV) prior, \(\reg{x} = \| D x \|_{1,2}\).
- Patch prior, \(\reg{x} = \sum_i h(P_i x)\) for some prior \(h\) on the space of patches.
- Patch prior via normalizing flows.
- \(\ell_{1,2}\) prior, \(\reg{x} = \sum_i \| x_i \|_2\).
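As a sketch, an explicit prior such as the \(\ell_1\) prior can be evaluated and proximally mapped directly, while a plug-and-play prior wraps a denoiser; the class and method names below follow the deepinv documentation, but the gamma keyword and the DRUNet arguments are assumptions.

```python
# Sketch: explicit prior vs. plug-and-play prior.
import torch
import deepinv as dinv

x = torch.rand(1, 1, 32, 32)

l1 = dinv.optim.L1Prior()
val = l1.fn(x)                 # g(x) = ||x||_1
x_thr = l1.prox(x, gamma=0.1)  # soft-thresholding step (gamma keyword assumed)

# Plug-and-play prior: the proximal operator is replaced by a denoiser D_sigma.
pnp = dinv.optim.PnP(denoiser=dinv.models.DRUNet(pretrained="download"))
```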
Predefined models
User Guide: refer to Predefined Iterative Algorithms for more information.
- Deep Plug-and-Play (DPIR) algorithm for image restoration.
- Expected Patch Log Likelihood (EPLL) reconstruction method.
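A hedged sketch of running a predefined model; the DPIR constructor argument sigma (the noise level driving the denoiser schedule) is taken from the deepinv examples, while the rest of the setup is illustrative.

```python
# Sketch: restore an image with the predefined DPIR model.
import torch
import deepinv as dinv

physics = dinv.physics.Denoising(noise_model=dinv.physics.GaussianNoise(sigma=0.05))
y = physics(torch.rand(1, 3, 64, 64))

model = dinv.optim.DPIR(sigma=0.05)  # plug-and-play HQS with a pretrained denoiser
x_hat = model(y, physics)
```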
Bregman
User Guide: refer to Bregman for more information.
- Module for the Bregman framework with convex Bregman potential \(\phi\).
- Module using the \(\ell_2\) norm as Bregman potential, \(\phi(x) = \frac{1}{2} \|x\|_2^2\).
- Module using Burg's entropy as Bregman potential, \(\phi(x) = - \sum_i \log x_i\).
- Module using the negative entropy as Bregman potential, \(\phi(x) = \sum_i x_i \log x_i\).
- Module using a deep input-convex neural network (ICNN) as Bregman potential.
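For reference, the Bregman divergence induced by a convex potential \(\phi\), which underlies, e.g., the mirror descent iterator listed further below, is

\[D_\phi(x, y) = \phi(x) - \phi(y) - \langle \nabla \phi(y),\, x - y \rangle.\]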
Distance
User Guide: refer to Potentials for more information.
- Distance term \(\distance{x}{y}\).
- Implementation of \(\distancename\) as the normalized \(\ell_2\) norm, \(\distance{x}{y} = \frac{1}{2}\|x-y\|_2^2\).
- Indicator of the \(\ell_2\) ball with radius \(r\).
- Negative log-likelihood of the Poisson distribution.
- \(\ell_1\) distance, \(\distance{x}{y} = \|x-y\|_1\).
- Amplitude loss, used for phase retrieval problems.
- Log-Poisson negative log-likelihood.
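These distances are the building blocks of the data fidelity terms above: a data fidelity term is obtained by composing a distance with the forward operator,

\[f(x) = \distance{A(x)}{y}.\]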
Iterators
User Guide: refer to Iterators for more information.
- Fixed-point iterations module.
- Base class for all optimization iterators.
- Module for a single iteration step on the data fidelity term \(f\).
- Module for a single iteration step on the prior term \(\lambda \regname\).
- Iterator for gradient descent.
- Iterator for proximal gradient descent.
- Iterator for the fast iterative shrinkage-thresholding algorithm (FISTA).
- Iterator for the Chambolle-Pock algorithm.
- Iterator for the alternating direction method of multipliers (ADMM).
- Iterator for Douglas-Rachford splitting.
- Iterator for half-quadratic splitting.
- Iterator for mirror descent.
- Iterator for spectral methods, used for phase retrieval problems.
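To make the role of the iterators concrete, a proximal gradient iteration alternates a gradient step on the data fidelity term with a proximal step on the prior. A minimal hand-rolled sketch is shown below; the built-in iterators handle stepsizes, relaxation and cost bookkeeping internally, and the gamma keyword is an assumption.

```python
# Sketch: hand-rolled proximal gradient descent using the fidelity/prior interfaces.
import torch
import deepinv as dinv

physics = dinv.physics.Denoising(noise_model=dinv.physics.GaussianNoise(sigma=0.1))
y = physics(torch.rand(1, 1, 32, 32))

fidelity = dinv.optim.L2()
prior = dinv.optim.L1Prior()
stepsize, lam = 1.0, 0.1

x = y.clone()
for _ in range(50):
    z = x - stepsize * fidelity.grad(x, y, physics)  # gradient step on f
    x = prior.prox(z, gamma=stepsize * lam)          # proximal step on lambda * g
```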
Utils
User Guide: refer to Utils for more information.
- Standard conjugate gradient algorithm.
- Standard gradient descent algorithm.
- Gaussian mixture model, including parameter estimation.
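A hedged sketch of the conjugate gradient utility, assuming it accepts a callable symmetric positive-definite operator A and a right-hand-side tensor b; the exact keyword names (max_iter, tol) may differ in your version.

```python
# Sketch: solve A x = b with the conjugate gradient utility (signature assumed).
import torch
from deepinv.optim.utils import conjugate_gradient

M = torch.randn(16, 16)
M = M @ M.T + 16 * torch.eye(16)  # symmetric positive-definite matrix

def A(v):
    # Linear operator applied to a batched vector.
    return v @ M.T

b = torch.randn(1, 16)
x = conjugate_gradient(A, b, max_iter=100, tol=1e-6)
```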