PoissonLikelihoodDistance
- class deepinv.optim.PoissonLikelihoodDistance(gain=1.0, bkg=0, denormalize=False)
Bases: Distance
(Negative) Log-likelihood of the Poisson distribution.
\[\distance{y}{x} = \sum_i y_i \log(y_i / x_i) + x_i - y_i\]

Note
The function is not Lipschitz smooth w.r.t. \(x\) in the absence of background (\(\beta=0\)).
- Parameters:
gain (float) – gain of the measurement \(y\). Default: 1.0.
bkg (float) – background level \(\beta\). Default: 0.
denormalize (bool) – if True, the measurement is divided by the gain. By default, in the class physics.noise.PoissonNoise, the measurements are multiplied by the gain after being sampled from the Poisson distribution. Default: False.
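A minimal usage sketch, not taken from the official documentation: simulate a Poisson-corrupted measurement and evaluate the distance. The constructor and fn follow the signatures documented on this page; the measurement model y = gain * Poisson(x / gain) mirrors the description of physics.noise.PoissonNoise above, and the shape of the returned value (one entry per batch element) is an assumption.

```python
import torch
from deepinv.optim import PoissonLikelihoodDistance

gain = 0.5
x = torch.rand(1, 1, 8, 8) + 0.1    # candidate signal, strictly positive
y = gain * torch.poisson(x / gain)  # toy measurement: gain * Poisson(x / gain)

# small background keeps the likelihood Lipschitz smooth (see the note above)
distance = PoissonLikelihoodDistance(gain=gain, bkg=1e-3)
value = distance.fn(x, y)           # assumed to return one value per batch element
print(value)
```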
- fn(x, y, *args, **kwargs)
Computes the Kullback-Leibler divergence.
- Parameters:
x (torch.Tensor) – Variable \(x\) at which the distance is computed.
y (torch.Tensor) – Observation \(y\).
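For illustration, the value returned by fn can be compared against the displayed formula in the simplest setting gain=1 and bkg=0. Any internal normalization of the class is not documented here, so treat this comparison as indicative rather than exact.

```python
import torch
from deepinv.optim import PoissonLikelihoodDistance

x = torch.rand(2, 1, 4, 4) + 0.1
y = torch.poisson(x)

distance = PoissonLikelihoodDistance(gain=1.0, bkg=0)
by_class = distance.fn(x, y)

# hand-written formula: sum_i y_i log(y_i / x_i) + x_i - y_i,
# with torch.xlogy handling the y_i = 0 entries (0 * log 0 = 0)
by_formula = (torch.xlogy(y, y / x) + x - y).flatten(1).sum(dim=1)
print(by_class, by_formula)
```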
- grad(x, y, *args, **kwargs)
Gradient of the Kullback-Leibler divergence.
- Parameters:
x (torch.Tensor) – signal \(x\) at which the function is computed.
y (torch.Tensor) – measurement \(y\).
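A quick sanity check, assuming fn is built from differentiable torch operations, is to compare the closed-form gradient with the one obtained by automatic differentiation of fn; the tolerance below is arbitrary.

```python
import torch
from deepinv.optim import PoissonLikelihoodDistance

distance = PoissonLikelihoodDistance(gain=1.0, bkg=1e-2)

x = (torch.rand(1, 1, 4, 4) + 0.1).requires_grad_(True)
y = torch.poisson(x.detach())

g_closed_form = distance.grad(x, y)
g_autograd = torch.autograd.grad(distance.fn(x, y).sum(), x)[0]
print(torch.allclose(g_closed_form, g_autograd, atol=1e-5))
```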
- prox(x, y, *args, gamma=1.0, **kwargs)
Proximal operator of the Kullback-Leibler divergence.
- Parameters:
x (torch.Tensor) – signal \(x\) at which the function is computed.
y (torch.Tensor) – measurement \(y\).
gamma (float) – proximity operator step size.
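As a usage sketch, a single proximal step can be checked against the first-order optimality condition of the proximal problem, \(p - x + \gamma \nabla_p \distance{p}{y} = 0\), which should hold approximately when \(\beta > 0\) (so that the distance is smooth). The step size and tensor shapes below are arbitrary choices for illustration.

```python
import torch
from deepinv.optim import PoissonLikelihoodDistance

distance = PoissonLikelihoodDistance(gain=1.0, bkg=0.1)
gamma = 0.5

x = torch.rand(1, 1, 8, 8) + 0.1
y = torch.poisson(x)

p = distance.prox(x, y, gamma=gamma)
# optimality condition: p - x + gamma * grad(p, y) should be close to 0
residual = p - x + gamma * distance.grad(p, y)
print(residual.abs().max())
```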