Contributing to DeepInverse#
DeepInverse is a community-driven project and welcomes contributions of all forms. We are building a comprehensive library for solving inverse problems with deep learning, and we need your help to get there!
View our active list of contributors here.
Learn more about our code philosophy in the paper: DeepInverse: A Python package for solving imaging inverse problems with deep learning.
How to Contribute#
To contribute, you must install deepinv in editable mode so that all your changes are visible when you run workflows. Make sure that you install all the required dependencies for testing by running, in the root directory of the repository:

```shell
pip install -e .[test,dataset,denoisers,doc,lint,training]
# or using `uv` for faster installation:
uv pip install -e .[test,dataset,denoisers,doc,lint,training]
```
We acknowledge all contributors in the documentation and in the source code. Significant contributions will also be taken into account when deciding on the authorship of future publications.
Please contribute to deepinv by forking the main repository on GitHub, then submitting a "Pull Request" (PR). When preparing the PR, please make sure to check the following points:
- **Code quality**: your code complies with PEP 8, the black style, and a partial ruff check. This can be done easily by installing the `black` and `ruff` libraries and running `black .` and `ruff check --fix` in the root directory of the repository after making the desired changes.
- **Typing**: your docstrings and code are adequately typed. Typing rules such as PEP 585 are checked automatically using ruff.
- **Tests**: write tests in `deepinv/tests` to test your code's intended functionality, including unit tests (e.g. checking each method's return values) and integration tests (i.e. end-to-end behaviour), following a test-driven development methodology. We use `pytest` and `unittest.mock` to write our tests. All existing tests should pass on your local machine: this can be done by installing `pytest` and running `python -m pytest deepinv/tests` in the root directory of the repository after making the desired changes. Learn more here. Your code coverage will be checked automatically using codecov.
- **Docs**: the documentation and docstrings are updated if necessary. Our documentation is written in reST and built with sphinx. Please refer to the docstring guidelines below. Your documentation should be added to: a) docstrings, b) the API reference, c) the User Guide, and d) Examples (optional). After making the desired changes, check the documentation by installing `sphinx` and building the docs with one of the commands in the table below, run in the `docs` directory. Note that if the build process fails, additional libraries may need to be installed manually (e.g. `sphinx-gallery`): please follow the instructions in the log.
Tip
Once the GitHub tests have been approved by a maintainer (only required for first-time contributors) and the Build Docs GitHub action has run successfully, you can download the documentation as a zip file from the Actions page. Look for the workflow run corresponding to your pull request.
| Command | Description |
|---|---|
|  | Generates all the documentation |
|  | Generates documentation faster, but without running the examples |
|  | Generates documentation only for files matching a given pattern |
|  | Cleans the documentation files |
|  | Cleans the documentation files (Windows OS) |
Finding Help#
If you are not familiar with the GitHub contribution workflow, you can open an issue on the issue tracker or ask any question in our Discord server. We will then try to address the issue as soon as possible. You can also send an email to any of the maintainers with your questions or ideas.
Docstring Guidelines#
For class and function docstrings, we use the reStructuredText (reST) syntax. See the Sphinx documentation for more details.
Please follow these guidelines:
Each parameter and return value must be properly described, along with a type annotation for each `:param` field, as shown below:

```
:param <type> <name>: Description of the parameter.
:return: Description of the return value.
```
- Docstrings can be split into multiple sections using the horizontal separator `|sep|`, with section titles introduced by `:Title:`.
- To provide usage examples, include an `:Example:` section. Code in this section will be executed during documentation generation.
- Use `:math:` for inline LaTeX-style mathematics, and `.. math::` for block equations.
- To include remarks, warnings, or tips, use the `.. note::` directive.
- To cite a paper:
  - Add the BibTeX entry to the `refs.bib` file.
  - Use :footcite:t:`<key>` to cite in the format Author et al. [1].
  - Use :footcite:p:`<key>` to cite with only the reference number [1].
For details on citing references with Sphinx, see the sphinx-bibtex documentation.
All references will be compiled and listed automatically in the generated documentation.
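For instance, a hypothetical `refs.bib` entry (all fields here are invented for illustration) matching the `my_paper` key used in the example below:

```bibtex
@article{my_paper,
  author  = {Doe, Jane and Smith, John},
  title   = {A Hypothetical Denoising Method},
  journal = {Journal of Imaginary Imaging},
  year    = {2024},
}
```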
Below is a minimal working example of a typical docstring that includes all these features:
```python
class MyDenoiser:
    r"""
    Denoiser from the paper :footcite:t:`my_paper`.

    .. math::
        y = \D_\sigma(x + \sigma \omega)

    .. note::
        This is a note.

    |sep|

    :Example:

    >>> import torch
    >>> import deepinv as dinv
    >>> model = dinv.models.DRUNet()
    >>> x = torch.ones((1, 1, 8, 8))
    >>> y = model(x)

    :param int in_channels: number of input channels.
    :param int out_channels: number of output channels.
    :param str pretrained: path to pretrained weights or 'download'.
    """

    def __init__(self, in_channels: int, out_channels: int, pretrained: str = None):
        pass
```
Contributing new datasets#
In order to contribute a new dataset, you must provide tests alongside it to check that it functions as expected. The DeepInverse code base is regularly tested on automatic continuous integration (CI) servers to ensure that the code works the way it is supposed to. Unfortunately, the CI servers have limited resources and generally cannot host the datasets.
We get around this by mocking datasets in the tests. First, write the tests and the implementation, and make sure that the tests pass locally on the real data. Then, write mocking code: code that intercepts calls to input/output (IO) related functions, e.g. `os.listdir`, and makes them return a hard-coded value, so that execution proceeds as if the data were there. For more details and examples, see this pull request.
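As a minimal sketch of this pattern (the dataset class and file names below are hypothetical, not part of deepinv), a mocked test might look like:

```python
import os
from unittest import mock


class MyImageDataset:
    """Hypothetical dataset that lists the image files found in a root folder."""

    def __init__(self, root):
        # IO call that would fail on the CI servers, where the data is absent.
        self.files = sorted(os.listdir(root))

    def __len__(self):
        return len(self.files)


def test_dataset_without_real_data():
    fake_files = ["img_001.png", "img_000.png"]
    # Intercept os.listdir so execution proceeds as if the data were there.
    with mock.patch("os.listdir", return_value=fake_files):
        dataset = MyImageDataset("/data/that/does/not/exist")
    # Unit-test the intended behaviour on the hard-coded fake listing.
    assert dataset.files == ["img_000.png", "img_001.png"]
    assert len(dataset) == 2
```

Run locally with `python -m pytest` as described above; the same test passes on CI because no real files are ever read.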
Once the implementation, the tests, and the mocking code are written, and they pass both locally and on the CI servers, the maintainers will be able to review the code and merge it into the main branch if everything goes well. Bear in mind, though, that the maintainers won't have the time to make sure the tests pass on the real data, so they will have to trust that you did things correctly.