Code for the paper "GroupCDL: Interpretable Denoising and Compressed Sensing MRI via Learned Group-Sparsity and Circulant Attention" (IEEE Transactions on Computational Imaging, 2025; preprint available), which makes use of CirculantAttention.jl.
This repo additionally implements:
- CDLNet: Noise-Adaptive Convolutional Dictionary Learning Network for Blind Denoising and Demosaicing, in IEEE OJSP 2022, (code).
- Gabor is Enough: Interpretable Deep Denoising with a Gabor Synthesis Dictionary Prior, in IEEE IVMSP 2022, (preprint available), (code).
- If you're on an HPC, set your Julia depot path to somewhere you can install files, e.g. scratch. Throw this in your `.bashrc` and source it:

```bash
export JULIA_DEPOT_PATH="/scratch/$USER/.julia"
export JULIAUP_DEPOT_PATH="/scratch/$USER/.julia"
```
- Install Julia via the juliaup manager:

```bash
curl -fsSL https://install.julialang.org | sh
```
- Install the project environment. In the GroupCDL directory, start a Julia instance and instantiate the project environment:

```bash
julia --project -t auto
```

```julia
julia> using Pkg; Pkg.instantiate()
```
- Multi-GPU (once the project environment is instantiated):

```bash
julia --project -t auto -e "import MPI; MPI.install_mpiexecjl()"
```
The following assumes you have a Julia REPL open for the project, e.g. `julia +1.10 --project`. All network and training details are given through a config file. Example config files can be seen in `trained_nets`. Config files are stitched together from `config` automatically in the first training example below.
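For orientation, the stitching combines one file per concern (network, closure/loss, data). The sketch below is a hypothetical illustration only — its field names are guesses, not the repo's actual schema; consult the files in `config/` and the examples in `trained_nets` for the real keys:

```yaml
# Hypothetical sketch only: the real field names live in config/*.yaml
network:        # from config/groupcdl.yaml
  name: GroupCDL
closure:        # from config/synthawgn_closure.yaml
  noise_level: 25
data:           # from config/image_data.yaml
  trainset: path/to/train
```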
Edit the configuration files in `config/` to choose network architecture, training, logging, and dataset details. Then, in the Julia REPL, run:
```julia
julia> net, ps, st, ot = main(; network_config="config/groupcdl.yaml", closure_config="config/synthawgn_closure.yaml", data_config="config/image_data.yaml", warmup=true, train=true, verbose=true)
```
To train with multiple GPUs, we use MPI and call our main script as follows:

```bash
mpiexecjl -n <num_gpus> --project=. julia --project -t <num_cpus> main.jl --seed <seed> --train --warmup --verbose --mpi --config <path/to/config.yaml>
```
Run `julia --project main.jl --help` for additional details on training from the command line.
To evaluate a pretrained model, run:

```julia
julia> net, ps, st, ot = main(; config="trained_nets/GroupCDL-S25/config.yaml", eval=true, verbose=true)
```
Optionally, you can provide alternate config files for the data and closure, e.g.:

```julia
julia> net, ps, st, ot = main(; config="path/to/pretrained_model/config.yaml", eval=true, eval_closure_config="config/synthawgn_closure.yaml", eval_data_config="config/image_data.yaml", verbose=true)
```
Note that the path to the checkpoint is specified in the config file.
See `media/sliding_window.mp4` for an animation of how the block-circulant-with-circulant-blocks (BCCB) sparsity pattern of GroupCDL's adjacency matrix arises from sliding-window nonlocal self-similarity.
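The BCCB structure can be illustrated with a minimal NumPy sketch (independent of the GroupCDL code): with circular boundary handling, a separable sliding window gives an adjacency mask equal to the Kronecker product of two circulant band matrices, which is exactly a BCCB matrix.

```python
import numpy as np

def windowed_adjacency(n: int, w: int) -> np.ndarray:
    """Binary sparsity mask of a sliding-window adjacency on an n x n image
    (row-major pixel indexing) with circular (periodic) boundary handling.
    A[i, j] = 1 iff pixel j lies in the w x w window centered at pixel i."""
    idx = np.arange(n)
    # circular distance between 1-D coordinates
    d = np.abs(idx[:, None] - idx[None, :])
    d = np.minimum(d, n - d)
    band = (d <= w // 2).astype(int)  # n x n circulant band matrix
    # separable 2-D window -> Kronecker product of two circulant bands,
    # which is by construction block-circulant with circulant blocks (BCCB)
    return np.kron(band, band)

# each pixel of an 8x8 image attends to its 3x3 circular neighborhood
A = windowed_adjacency(8, 3)
```

Here the Kronecker-of-circulants construction makes the BCCB structure explicit: every row of `A` has exactly `w * w` nonzeros, shifted circularly pixel by pixel.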
If you find this code/work useful, please cite us:
```bibtex
@article{janjusevicGroupCDL2025,
  author={Janjušević, Nikola and Khalilian-Gourtani, Amirhossein and Flinker, Adeen and Feng, Li and Wang, Yao},
  journal={IEEE Transactions on Computational Imaging},
  title={{GroupCDL}: Interpretable Denoising and Compressed Sensing MRI via Learned Group-Sparsity and Circulant Attention},
  year={2025},
  volume={11},
  pages={201--212},
  doi={10.1109/TCI.2025.3539021}
}

@article{janjusevicCDLNet2022,
  author={Janjušević, Nikola and Khalilian-Gourtani, Amirhossein and Wang, Yao},
  journal={IEEE Open Journal of Signal Processing},
  title={{CDLNet}: Noise-Adaptive Convolutional Dictionary Learning Network for Blind Denoising and Demosaicing},
  year={2022},
  volume={3},
  pages={196--211},
  doi={10.1109/OJSP.2022.3172842}
}

@inproceedings{janjusevicGDLNet2022,
  author={Janjušević, Nikola and Khalilian-Gourtani, Amirhossein and Wang, Yao},
  booktitle={2022 IEEE 14th Image, Video, and Multidimensional Signal Processing Workshop (IVMSP)},
  title={Gabor is Enough: Interpretable Deep Denoising with a Gabor Synthesis Dictionary Prior},
  year={2022},
  pages={1--5},
  doi={10.1109/IVMSP54334.2022.9816313}
}
```