Robust Compressed Sensing MRI with Deep Generative Priors

Ajil Jalal, Marius Arvinte, Giannis Daras, Eric Price, Alexandros G. Dimakis, Jonathan I. Tamir, NeurIPS 2021

System Model: Multi-coil Magnetic Resonance Imaging


MRI is a medical imaging modality that makes measurements using an array of radio-frequency coils placed around the body. Each coil is spatially sensitive to a local region, and measurements are acquired directly in the spatial-frequency, or k-space, domain. To decrease scan time, reduce operating costs, and improve patient comfort, a reduced number of k-space measurements is acquired in clinical use and reconstructed by incorporating explicit or implicit knowledge of the spatial sensitivity maps [Sodickson and Manning, Pruessmann et al., Griswold et al.]. Formally, the vector of measurements \(y_i \in \mathbb{C}^L\) acquired by the \(i\)-th coil can be characterized by the forward model:

\[ y_i = P F S_i x + w_i,\quad i=1,…,N_c,\]

where \(x \in \mathbb{C}^N\) is the image containing \(N\) pixels, \(S_i\) is an operator representing point-wise multiplication by the \(i\)-th coil sensitivity map, \(F\) is the spatial Fourier transform operator, \(P\) represents the k-space sampling operator, and we assume \(w_i \sim \mathcal{N}_c\left(0, \sigma^2 I\right)\) for simplicity. Importantly, note that the same under-sampling operator is applied to all \(N_c\) coils. The undersampling ratio, or acceleration, is defined as \( R = N / L \).
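As a concrete illustration, the forward model above can be sketched in NumPy. This is our own minimal sketch, not the paper's code: `mri_forward` and its argument names are ours, a 2D orthonormal FFT stands in for \(F\), a binary mask array stands in for \(P\), and the complex Gaussian noise \(w_i\) is drawn per coil.

```python
import numpy as np

def mri_forward(x, maps, mask, sigma=0.0, rng=None):
    """Sketch of the multi-coil forward model y_i = P F S_i x + w_i.

    x:    (H, W) complex image.
    maps: (Nc, H, W) coil sensitivity maps (the operators S_i).
    mask: (H, W) binary k-space sampling mask (the operator P).
    Returns (Nc, H, W) masked k-space; entries outside the mask are zero.
    """
    rng = rng or np.random.default_rng(0)
    coil_images = maps * x  # S_i x: point-wise multiplication per coil
    # F: centered 2D orthonormal Fourier transform of each coil image
    kspace = np.fft.fftshift(
        np.fft.fft2(np.fft.ifftshift(coil_images, axes=(-2, -1)), norm="ortho"),
        axes=(-2, -1))
    # w_i: circularly symmetric complex Gaussian noise with variance sigma^2
    noise = sigma * (rng.standard_normal(kspace.shape)
                     + 1j * rng.standard_normal(kspace.shape)) / np.sqrt(2)
    return mask * (kspace + noise)  # P: keep only the sampled locations
```

Note that the same `mask` multiplies every coil, matching the shared under-sampling operator in the equation above.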

Our Algorithm: Posterior Sampling

The algorithm we consider is posterior sampling [Song and Ermon, Jalal et al.]. That is, given an observation of the form \(y = Ax + w\), where \(y \in \mathbb{C}^M\), \(A \in \mathbb{C}^{M \times N}\), \(w \sim \mathcal{N}_c(0,\sigma^2 I)\), and \(x \sim \mu\), the posterior sampling recovery algorithm outputs \(\widehat{x}\) drawn from the posterior distribution \(\mu(\cdot \mid y)\). To implement posterior sampling, we use annealed Langevin dynamics with a score-based generative model [Song and Ermon].
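The annealed Langevin iteration can be sketched as follows. This is an illustrative, real-valued toy sketch under our own assumptions, not the paper's implementation: `score_fn` stands in for a trained score-based generative model (here a hand-written score of a smoothed Gaussian prior in the test), the noise schedule `sigmas` and the step-size rule \(\alpha_t = \epsilon\,\sigma_t^2/\sigma_L^2\) follow the annealed Langevin scheme of Song and Ermon, and the data-fidelity gradient comes from the Gaussian likelihood of \(y = Ax + w\).

```python
import numpy as np

def annealed_langevin_posterior(y, A, sigma_noise, score_fn, sigmas,
                                steps_per_level=30, eps=2e-5, rng=None):
    """Sketch of annealed Langevin dynamics for posterior sampling.

    At each noise level sigma_t, take Langevin steps along the gradient of the
    log-posterior: score_fn(x, sigma_t) approximates the prior score, and the
    data term is the gradient of the Gaussian log-likelihood of y = A x + w.
    """
    rng = rng or np.random.default_rng(0)
    x = rng.standard_normal(A.shape[1])  # random initialization
    for sigma_t in sigmas:  # anneal from the largest to the smallest noise level
        alpha = eps * (sigma_t / sigmas[-1]) ** 2  # step size for this level
        for _ in range(steps_per_level):
            grad_prior = score_fn(x, sigma_t)
            grad_data = A.T @ (y - A @ x) / (sigma_noise**2 + sigma_t**2)
            x = (x + alpha * (grad_prior + grad_data)
                 + np.sqrt(2 * alpha) * rng.standard_normal(x.shape))
    return x
```

As the annealing noise shrinks, the iterates concentrate on samples from the posterior rather than on a single point estimate, which is what distinguishes posterior sampling from MAP-style reconstruction.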


Qualitative Results

Qualitative results showing our algorithm (last column) compared to state-of-the-art deep learning methods (second and third columns). The first column shows the "gold standard" reference image. All methods were trained on brain images at a fixed acceleration \(R\) (also called the undersampling ratio). Our algorithm is competitive with the baselines when the acceleration is unchanged (top row), and is more robust when we change the acceleration (second row) or the target anatomy (last row).


Recovering Fine Details

Additionally, when all methods are trained on brains and tested on knees, fine details like meniscus tears (annotated in the zoomed inset) are preserved better by our algorithm.


Quantitative Results

In this PSNR plot, our algorithm achieves the highest PSNR in most settings. All methods were trained on brain images at a particular acceleration and tested on varying accelerations, k-space masks, and anatomies.


Paper and Video

This is the arXiv link to our paper.

This is the Github repo with code and models.

This is the link to our NeurIPS video.


1. Sodickson, Daniel K., and Warren J. Manning. “Simultaneous acquisition of spatial harmonics (SMASH): fast imaging with radiofrequency coil arrays.” Magnetic resonance in medicine 38.4 (1997): 591-603.

2. Pruessmann, Klaas P., et al. “SENSE: sensitivity encoding for fast MRI.” Magnetic Resonance in Medicine: An Official Journal of the International Society for Magnetic Resonance in Medicine 42.5 (1999): 952-962.

3. Griswold, Mark A., et al. “Generalized autocalibrating partially parallel acquisitions (GRAPPA).” Magnetic Resonance in Medicine: An Official Journal of the International Society for Magnetic Resonance in Medicine 47.6 (2002): 1202-1210.

4. Song, Yang, and Stefano Ermon. “Generative Modeling by Estimating Gradients of the Data Distribution.” Proceedings of the 33rd Annual Conference on Neural Information Processing Systems. 2019.

5. Jalal, Ajil, et al. “Instance-Optimal Compressed Sensing via Posterior Sampling.” International Conference on Machine Learning (ICML). 2021.