.. DO NOT EDIT.
.. THIS FILE WAS AUTOMATICALLY GENERATED BY SPHINX-GALLERY.
.. TO MAKE CHANGES, EDIT THE SOURCE PYTHON FILE:
.. "auto_examples/basics/demo_remote_sensing.py"
.. LINE NUMBERS ARE GIVEN BELOW.

.. rst-class:: sphx-glr-example-title

.. _sphx_glr_auto_examples_basics_demo_remote_sensing.py:


Remote sensing with satellite images
====================================

In this example we demonstrate remote sensing inverse problems for multispectral satellite imaging.
These have important applications in environmental monitoring, urban planning, disaster recovery, etc.

We demonstrate pan-sharpening, i.e., recovering high-resolution multispectral images from measurement pairs of
low-resolution multispectral images and high-resolution panchromatic (single-band) images, using the forward operator
:class:`deepinv.physics.Pansharpen`. We also demonstrate other inverse problems, including compressive spectral imaging
and hyperspectral unmixing.

We provide a convenient satellite image dataset for pan-sharpening, :class:`deepinv.datasets.NBUDataset`, introduced in the paper
`A Large-Scale Benchmark Data Set for Evaluating Pansharpening Performance `_, which includes data from several
satellites such as the WorldView satellites.

.. tip::

    For remote sensing experiments, DeepInverse provides the following classes:

    - :class:`Pan-sharpening <deepinv.physics.Pansharpen>`
    - Compressive spectral imaging
    - Hyperspectral unmixing
    - :class:`Super resolution <deepinv.physics.Downsampling>`
    - :class:`Satellite imagery dataset <deepinv.datasets.NBUDataset>`
    - Metrics for multispectral data: :class:`QNR <deepinv.metric.QNR>`, :class:`SpectralAngleMapper <deepinv.metric.distortion.SpectralAngleMapper>`, :class:`ERGAS <deepinv.metric.distortion.ERGAS>`

.. GENERATED FROM PYTHON SOURCE LINES 31-35

.. code-block:: Python

    import deepinv as dinv
    import torch

.. GENERATED FROM PYTHON SOURCE LINES 36-53

Load raw pan-sharpening measurements
------------------------------------

The dataset includes raw pan-sharpening measurements containing ``(MS, PAN)``, where ``MS`` are the low-resolution (4-band)
multispectral images and ``PAN`` are the high-resolution panchromatic images. Note that there are no ground-truth images!

.. note::

    The pan-sharpening measurements are provided as a :class:`deepinv.utils.TensorList`, since the pan-sharpening physics
    :class:`deepinv.physics.Pansharpen` is a stacked physics combining :class:`deepinv.physics.Downsampling` and
    :class:`deepinv.physics.Decolorize`. See the User Guide :ref:`physics_combining` for more information.

Note that, for plotting purposes, we only plot the first 3 bands (RGB). Note also that the linear adjoint must make an
assumption about the unknown spectral response function (SRF).

.. GENERATED FROM PYTHON SOURCE LINES 53-80

.. code-block:: Python

    DATA_DIR = dinv.utils.get_data_home()
    dataset = dinv.datasets.NBUDataset(DATA_DIR, return_pan=True, download=True)

    y = dataset[0].unsqueeze(0)  # MS (1,4,256,256), PAN (1,1,1024,1024)

    physics = dinv.physics.Pansharpen((4, 1024, 1024), factor=4)

    # Pansharpen with classical Brovey method
    x_hat = physics.A_dagger(y)  # shape (1,4,1024,1024)

    dinv.utils.plot(
        [
            y[0][:, :3],
            y[1],  # Note this will be interpolated to match high-res image size
            x_hat[:, :3],
            physics.A_adjoint(y)[:, :3],
        ],
        titles=[
            "Input MS",
            "Input PAN",
            "Pseudo-inverse using Brovey method",
            "Linear adjoint",
        ],
        dpi=1200,
    )

.. image-sg:: /auto_examples/basics/images/sphx_glr_demo_remote_sensing_001.png
   :alt: Input MS, Input PAN, Pseudo-inverse using Brovey method, Linear adjoint
   :srcset: /auto_examples/basics/images/sphx_glr_demo_remote_sensing_001.png
   :class: sphx-glr-single-img

.. rst-class:: sphx-glr-script-out

 .. code-block:: none

    Downloading datasets/nbu/gaofen-1.zip
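As a quick check of the stacked measurement structure described above, we can re-apply the forward operator to the
Brovey estimate and compare it against the raw measurements. This is a minimal sketch reusing only the objects defined
in the previous cell (``physics``, ``y``, ``x_hat``); the shapes in the comments are what we expect from the dataset
description, and the residual computation is an illustration rather than part of the original example.

.. code-block:: Python

    # The stacked physics returns a TensorList with one entry per sub-operator:
    # [0] the downsampled multispectral image, [1] the panchromatic image.
    y_resim = physics.A(x_hat)

    print(y_resim[0].shape)  # expected (1, 4, 256, 256), matching the MS measurement
    print(y_resim[1].shape)  # expected (1, 1, 1024, 1024), matching the PAN measurement

    # Measurement consistency of the Brovey pseudo-inverse on each component.
    # The residuals are generally nonzero: A_dagger is only an approximate inverse
    # and the true spectral response function is unknown.
    mse_ms = torch.mean((y_resim[0] - y[0]) ** 2)
    mse_pan = torch.mean((y_resim[1] - y[1]) ** 2)
    print(mse_ms.item(), mse_pan.item())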
Pan-sharpening with neural networks
-----------------------------------

We can solve the pan-sharpening problem using a neural network, here the :class:`PanNet <deepinv.models.PanNet>` model.
This model can be trained with supervised learning using :class:`deepinv.loss.SupLoss`, or with self-supervised learning
using the Equivariant Imaging loss :class:`deepinv.loss.EILoss`, which was applied to pan-sharpening in
`Wang et al., Perspective-Equivariant Imaging: an Unsupervised Framework for Multispectral Pansharpening `_.

For evaluation, we use the standard full-reference metrics (ERGAS, SAM) and the no-reference metric (QNR).

.. note::

    This is a tiny example using 5 images. We demonstrate training for 1 epoch for speed, but you can train from
    scratch using 50 epochs.

.. GENERATED FROM PYTHON SOURCE LINES 152-156

.. code-block:: Python

    model = dinv.models.PanNet(hrms_shape=(4, 256, 256))

    x_net = model(y, physics)

.. GENERATED FROM PYTHON SOURCE LINES 157-161

As an example training loss, we use measurement consistency on the multispectral images and Stein's Unbiased Risk
Estimate (SURE) on the panchromatic images.

For metrics, we use standard full-reference and no-reference multispectral pan-sharpening metrics, since the ground
truth ``x`` is now available.

.. GENERATED FROM PYTHON SOURCE LINES 161-171

.. code-block:: Python

    loss = dinv.loss.StackedPhysicsLoss(
        [dinv.loss.MCLoss(), dinv.loss.SureGaussianLoss(0.05)]
    )

    sam = dinv.metric.distortion.SpectralAngleMapper()
    ergas = dinv.metric.distortion.ERGAS(factor=4)
    qnr = dinv.metric.QNR()
    print(sam(x_hat, x), ergas(x_hat, x), qnr(x_hat, x=None, y=y, physics=physics))

.. rst-class:: sphx-glr-script-out

 .. code-block:: none

    tensor([0.0545]) tensor([4.1428]) tensor([0.4736])

.. GENERATED FROM PYTHON SOURCE LINES 172-174

For training, we first load the optimizer and pretrained model, then train using :class:`deepinv.Trainer`.

.. GENERATED FROM PYTHON SOURCE LINES 174-207

.. code-block:: Python

    optimizer = torch.optim.Adam(model.parameters())

    from deepinv.models.utils import get_weights_url

    file_name = "demo_nbu_pansharpen.pth"
    url = get_weights_url(model_name="demo", file_name=file_name)
    ckpt = torch.hub.load_state_dict_from_url(
        url, map_location=lambda storage, loc: storage, file_name=file_name
    )

    model.load_state_dict(ckpt["state_dict"])
    optimizer.load_state_dict(ckpt["optimizer"])

    from torch.utils.data import DataLoader

    trainer = dinv.Trainer(
        model=model,
        physics=physics,
        optimizer=optimizer,
        losses=loss,
        metrics=[sam, ergas],
        train_dataloader=DataLoader(dataset),
        epochs=1,
        online_measurements=True,
        plot_images=False,
        compare_no_learning=True,
        no_learning_method="A_dagger",
        show_progress_bar=False,
    )

    trainer.train()

    trainer.test(DataLoader(dataset))

.. rst-class:: sphx-glr-script-out

 .. code-block:: none

    Downloading: "https://huggingface.co/deepinv/demo/resolve/main/demo_nbu_pansharpen.pth?download=true" to /home/runner/.cache/torch/hub/checkpoints/demo_nbu_pansharpen.pth
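After training, we can run inference on a measurement pair and score the reconstruction with the no-reference QNR
metric, exactly as in the metric call above. This is a minimal sketch reusing the ``model``, ``physics``, ``y``,
``x_hat`` and ``qnr`` objects defined earlier; the side-by-side comparison with the Brovey pseudo-inverse is an
illustration rather than part of the original benchmark.

.. code-block:: Python

    model.eval()
    with torch.no_grad():
        # PanNet takes the stacked measurements and the physics, and returns a
        # high-resolution multispectral estimate with the shape given by hrms_shape.
        x_net = model(y, physics)

    # No-reference evaluation (no ground truth required); higher QNR is better, with a maximum of 1.
    print("PanNet QNR:", qnr(x_net, x=None, y=y, physics=physics))
    print("Brovey QNR:", qnr(x_hat, x=None, y=y, physics=physics))

    # Visual comparison using the first 3 bands (RGB) only.
    dinv.utils.plot(
        [x_net[:, :3], x_hat[:, :3]],
        titles=["PanNet reconstruction", "Brovey pseudo-inverse"],
    )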