An Inner-loop Free Solution to Inverse Problems using Deep Neural Networks

Published

Conference Paper

© 2017 Neural Information Processing Systems Foundation. All rights reserved.

We propose a new method that uses deep learning techniques to accelerate the popular alternating direction method of multipliers (ADMM) solution for inverse problems. The ADMM updates consist of a proximity operator, a least-squares regression that involves a large-scale matrix inversion, and an explicit solution for updating the dual variables. Typically, inner loops are required to solve the first two sub-minimization problems because of the intractability of the prior and of the matrix inversion. To avoid these limitations, we propose an inner-loop-free update rule built on two pre-trained deep convolutional architectures. More specifically, in the first sub-minimization problem we learn a conditional denoising auto-encoder that imposes an implicit, data-dependent prior/regularization on the ground truth. This design follows an empirical Bayesian strategy, leading to so-called amortized inference. For the matrix inversion in the second sub-problem, we learn a convolutional neural network to approximate the matrix inversion, i.e., the inverse mapping is learned by feeding the input through the learned forward network. Notably, training this neural network requires neither ground truth nor measurements, i.e., it is data-independent. Extensive experiments on both synthetic data and real datasets demonstrate the efficiency and accuracy of the proposed method compared with the conventional ADMM solution that uses inner loops to solve inverse problems.
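To make the three ADMM updates concrete, the following is a minimal sketch of the *classical* ADMM baseline the abstract describes, for the illustrative problem min_x ½‖Ax − y‖² + λ‖x‖₁ (the function names, step sizes, and the ℓ₁ prior are assumptions for illustration, not the paper's learned components). The explicit matrix inversion and the proximity operator marked in the comments are the two steps the paper replaces with pre-trained networks.

```python
import numpy as np

def soft_threshold(v, kappa):
    # Proximity operator of the l1 norm. In the paper this step is
    # replaced by a learned conditional denoising auto-encoder.
    return np.sign(v) * np.maximum(np.abs(v) - kappa, 0.0)

def admm_inverse(A, y, lam=0.1, rho=1.0, n_iter=50):
    """Classical ADMM for min_x 0.5*||Ax - y||^2 + lam*||x||_1."""
    m, n = A.shape
    x = np.zeros(n)
    z = np.zeros(n)
    u = np.zeros(n)  # scaled dual variable
    # Explicit large-scale matrix inversion: the costly step that the
    # paper approximates with a pre-trained CNN.
    M = np.linalg.inv(A.T @ A + rho * np.eye(n))
    Aty = A.T @ y
    for _ in range(n_iter):
        x = M @ (Aty + rho * (z - u))         # least-squares sub-problem
        z = soft_threshold(x + u, lam / rho)  # proximity-operator sub-problem
        u = u + x - z                         # explicit dual update
    return z

# Small demo: recover a sparse vector from noisy linear measurements.
rng = np.random.default_rng(0)
A = rng.standard_normal((40, 20))
x_true = np.zeros(20)
x_true[[2, 7, 15]] = [1.5, -2.0, 1.0]
y = A @ x_true + 0.01 * rng.standard_normal(40)
x_hat = admm_inverse(A, y)
```

In larger inverse problems (e.g., image deconvolution), forming and inverting AᵀA + ρI directly is infeasible, which is what motivates the paper's learned inner-loop-free approximations.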

Duke Authors

Cited Authors

  • Fan, K; Wei, Q; Carin, L; Heller, K

Published Date

  • January 1, 2017

Published In

  • Advances in Neural Information Processing Systems
Volume / Issue

  • 2017-December /

Start / End Page

  • 2371 - 2381

International Standard Serial Number (ISSN)

  • 1049-5258

Citation Source

  • Scopus