PanNet#

class deepinv.models.PanNet(backbone_net: Module | None = None, hrms_shape: tuple = (4, 900, 900), scale_factor: int = 4, highpass_kernel_size: int = 5, device='cpu', **kwargs)[source]#

Bases: Module

PanNet architecture for pan-sharpening.

PanNet neural network from Yang et al. PanNet: A Deep Network Architecture for Pan-Sharpening, ICCV 2017.

Takes input measurements as a deepinv.utils.TensorList with elements (MS, PAN), where MS is the low-resolution multispectral image of shape (B, C, H, W) and PAN is the high-resolution panchromatic image of shape (B, 1, H*r, W*r) where r is the pan-sharpening factor.

Parameters:
  • backbone_net (nn.Module) – Backbone neural network, e.g. ResNet. If None, defaults to a simple ResNet.

  • hrms_shape (tuple[int]) – shape of high-resolution multispectral images (C,H,W), defaults to (4,900,900)

  • scale_factor (int) – pansharpening downsampling ratio HR/LR, defaults to 4

  • highpass_kernel_size (int) – square kernel size for extracting high-frequency features, defaults to 5

  • device (str) – torch device, defaults to “cpu”
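
A minimal instantiation sketch; the shapes below are chosen for illustration and the default ResNet backbone is used:

import deepinv as dinv

# 4-band multispectral target of size 256x256 with a pan-sharpening factor of 4,
# so the low-resolution MS input would be 64x64 (illustrative shapes).
model = dinv.models.PanNet(
    hrms_shape=(4, 256, 256),   # (C, H, W) of the high-resolution MS image
    scale_factor=4,             # HR/LR ratio r
    highpass_kernel_size=5,     # kernel size of the high-pass feature extractor
    device="cpu",
)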

create_sampler(direction: str, hr_shape: tuple, noise_gain: float = 0.0) Physics[source]#

Helper function for downsampling/upsampling images (useful for reduced-resolution training with Wald’s protocol).

Parameters:
  • direction (str) – sampling direction, either "down" or "up"

  • hr_shape (tuple) – HRMS input shape (C,H,W)

  • noise_gain (float) – noise gain applied during downsampling only, defaults to 0.

Returns:

deepinv sampler

Return type:

deepinv.physics.Physics
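
A sketch of reduced-resolution degradation for Wald's protocol using the returned sampler; it assumes the generic deepinv.physics.Physics interface, where A applies the noiseless forward operator, and uses illustrative shapes:

import torch
import deepinv as dinv

model = dinv.models.PanNet(hrms_shape=(4, 256, 256), scale_factor=4)

# Downsampling physics; a nonzero noise_gain would add noise during downsampling only.
down = model.create_sampler("down", hr_shape=(4, 256, 256), noise_gain=0.0)

x_hr = torch.rand(1, 4, 256, 256)  # full-resolution MS batch (illustrative)
x_lr = down.A(x_hr)                # reduced-resolution image for Wald's protocol training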

forward(y: TensorList, physics: Pansharpen, *args, **kwargs)[source]#

Define the computation performed at every call.

Should be overridden by all subclasses.

Note

Although the forward pass must be defined within this function, the Module instance itself should be called instead of forward(), since calling the instance runs the registered hooks while calling forward() directly silently ignores them.
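
A minimal end-to-end sketch, assuming deepinv.physics.Pansharpen accepts img_size and factor arguments (check the Pansharpen documentation) and using illustrative shapes:

import torch
import deepinv as dinv

device = "cpu"

# Pan-sharpening physics producing a TensorList of (MS, PAN) measurements
# (img_size/factor arguments assumed; verify against deepinv.physics.Pansharpen).
physics = dinv.physics.Pansharpen(img_size=(4, 256, 256), factor=4, device=device)

model = dinv.models.PanNet(hrms_shape=(4, 256, 256), scale_factor=4, device=device)

x = torch.rand(1, 4, 256, 256, device=device)  # ground-truth HRMS (illustrative)
y = physics(x)                                 # TensorList (MS, PAN)

# Call the module instance (not forward()) so registered hooks run.
x_hat = model(y, physics)                      # estimated HRMS image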

Examples using PanNet:#

Remote sensing with satellite images