User Guide#

Imaging inverse problems are described by the equation \(y = \noise{\forw{x}}\), where \(x\) is an unknown signal (image, volume, etc.) that we want to recover, \(y\) are the observed measurements, \(A\) is a deterministic (linear or non-linear) operator capturing the physics of the acquisition, and \(N\) characterizes the noise affecting the measurements.

Operators#

The library provides a large variety of imaging operators, denoted physics below, which model \(\noise{\forw{\cdot}}\) and can simulate the observation process:

x = load_image()  # ground-truth signal x
y = physics(x)    # simulate measurements y = N(A(x))
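
As a more concrete illustration, the sketch below builds an inpainting operator with additive Gaussian noise and simulates measurements from a random image. It assumes the library is the DeepInverse (deepinv) package; the class and argument names used here (Inpainting, tensor_size, noise_model, GaussianNoise) are assumptions that may differ slightly between versions.

import torch
import deepinv as dinv

# random image standing in for load_image(); shape (batch, channels, height, width)
x = torch.rand(1, 3, 64, 64)

# inpainting operator keeping ~50% of the pixels, with additive Gaussian noise
# (tensor_size and noise_model argument names are assumptions)
physics = dinv.physics.Inpainting(
    tensor_size=(3, 64, 64),
    mask=0.5,
    noise_model=dinv.physics.GaussianNoise(sigma=0.1),
)

y = physics(x)                # measurements y = N(A(x))
x_adj = physics.A_adjoint(y)  # adjoint A^T y, available for linear operators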

- Introduction: Introduction to the physics package.
- Operators: Forward operators and noise distributions.
- Defining your operator: How to define your own forward operator, if the existing ones are not enough.
- Functional: Various utilities for forward operators.

Reconstruction#

In order to recover an image from its measurements, the library provides many reconstruction methods \(\hat{x}=R(y, A)\), which often leverage knowledge of the acquisition physics. Given a restoration model, here denoted model, the reconstruction is obtained as:

x_hat = model(y, physics) # reconstruct signal
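
As an example of such a reconstruction model, a pretrained denoiser can be wrapped into a network \(R(y, A)\) that first backprojects the measurements and then removes the remaining artifacts. The sketch below reuses y and physics from the example above and assumes the deepinv package; the DnCNN and ArtifactRemoval names and the pretrained argument are assumptions that may differ between versions.

import deepinv as dinv

# backbone denoiser with downloaded pretrained weights (argument name assumed)
denoiser = dinv.models.DnCNN(pretrained="download")

# artifact-removal wrapper: roughly computes denoiser(A^T y)
model = dinv.models.ArtifactRemoval(denoiser)

x_hat = model(y, physics)  # reconstruct the signal from the measurements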

- Introduction: Introduction and summary of reconstruction algorithms.
- Denoisers: Classical and deep denoisers with pretrained weights.
- Artifact Removal: Reconstruction networks built from denoisers and other image-to-image networks.
- Optimization: Priors, data-fidelity functions, and optimization algorithms.
- Unfolded Algorithms: Unfolded architectures.
- Iterative Reconstruction: Plug-and-play, RED and variational methods.
- Adversarial Reconstruction: Conditional and unconditional GANs and deep image prior.
- Sampling: Diffusion and MCMC algorithms.

Training, Testing and Utilities#

All the tools in the library, from measurement operators to restoration methods, are implemented as torch.nn.Module and therefore natively support backpropagation. A reconstruction network model can be trained on datasets to improve its performance:

trainer = Trainer(model, loss, optimizer, metric, train_dataset, ...)  # configure training
trainer.train()             # train the reconstruction model
trainer.test(test_dataset)  # evaluate on a held-out test set
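
A slightly more complete sketch is given below, assuming the deepinv package and reusing model, physics, train_dataset and test_dataset from the snippets above; the SupLoss class and the Trainer keyword names (physics, losses, train_dataloader, epochs) are assumptions that may differ between versions.

import torch
import deepinv as dinv
from torch.utils.data import DataLoader

loss = dinv.loss.SupLoss()                                 # supervised training loss (name assumed)
optimizer = torch.optim.Adam(model.parameters(), lr=1e-4)  # standard torch optimizer

# keyword argument names below are assumptions
trainer = dinv.Trainer(
    model,
    physics=physics,
    optimizer=optimizer,
    losses=loss,
    train_dataloader=DataLoader(train_dataset, batch_size=4),
    epochs=10,
)

model = trainer.train()                               # train the reconstruction network
trainer.test(DataLoader(test_dataset, batch_size=4))  # evaluate on held-out data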

- Training: Training and testing reconstruction models.
- Datasets: Utilities to generate and load datasets for training and testing.
- Loss: Supervised and self-supervised losses to train the models.
- Metrics: Distortion and perceptual metrics to evaluate reconstructions.
- Transforms: Transforms for data augmentation and self-supervised learning.
- Utils: Plotting and other utilities.