Diffusion Prior-Based Amortized Variational Inference for Noisy Inverse Problems

Recent studies on inverse problems have proposed posterior samplers that leverage pre-trained diffusion models as powerful priors. These attempts have paved the way for using diffusion models in a wide range of inverse problems. However, existing methods entail computationally demanding iterative sampling procedures and optimize a separate solution for each measurement, which limits scalability and generalization to unseen samples. To address these limitations, we propose a novel approach, Diffusion prior-based Amortized Variational Inference (DAVI), that solves inverse problems with a diffusion prior from an amortized variational inference perspective. Specifically, instead of separate measurement-wise optimization, our amortized inference learns a function that directly maps measurements to the implicit posterior distributions of the corresponding clean data, enabling single-step posterior sampling even for unseen measurements. Extensive experiments on image restoration tasks, e.g., Gaussian deblurring, 4$\times$ super-resolution, and box inpainting, on two benchmark datasets demonstrate our approach's superior performance over strong baselines. Code is available at https://github.com/mlvlab/DAVI.
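The key interface the abstract describes, a learned map from a measurement (plus noise) to a posterior sample in one forward pass, rather than per-measurement iterative optimization, can be sketched as follows. This is a minimal illustration, not the DAVI architecture: the names `amortized_sample`, `W_y`, and `W_z` are hypothetical, and a random linear map stands in for the trained network.

```python
import numpy as np

rng = np.random.default_rng(0)

# Dimensions of the measurement y, clean data x, and noise code z (illustrative).
d_y, d_x, d_z = 16, 32, 8

# Placeholder "learned" weights; in practice these would come from training
# the amortized inference network against the diffusion prior.
W_y = rng.standard_normal((d_x, d_y)) * 0.1
W_z = rng.standard_normal((d_x, d_z)) * 0.1

def amortized_sample(y, z):
    """Single-step posterior sampling: one forward pass maps the
    measurement y and noise z to a sample from an implicit posterior.
    No per-measurement optimization or iterative refinement is needed."""
    return W_y @ y + W_z @ z

# An unseen measurement: the same trained map handles it directly.
y = rng.standard_normal(d_y)

# Drawing several z's yields distinct posterior samples for the same y,
# reflecting that the map defines a distribution, not a point estimate.
samples = np.stack([amortized_sample(y, rng.standard_normal(d_z))
                    for _ in range(4)])
```

The contrast with prior diffusion-based solvers is that the cost of inference here is a single function evaluation per sample; all iterative work is moved into training the map.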