
NETT: Solving Inverse Problems with Deep Neural Networks

Authors: Johannes Schwab, Housen Li, Stephan Antholzer, Markus Haltmeier
Publication Year: 2018

Abstract

Recovering a function or high-dimensional parameter vector from indirect measurements is a central task in various scientific areas. Several methods for solving such inverse problems are well developed and well understood. Recently, novel algorithms using deep learning and neural networks for inverse problems have appeared. While still in their infancy, these techniques show astonishing performance for applications like low-dose CT or various sparse data problems. However, there are few theoretical results for deep learning in inverse problems. In this paper, we establish a complete convergence analysis for the proposed NETT (Network Tikhonov) approach to inverse problems. NETT considers nearly data-consistent solutions with a small value of a regularizer defined by a trained neural network. We derive well-posedness results and quantitative error estimates, and propose a possible strategy for training the regularizer. Our theoretical results and framework differ from any previous work using neural networks for solving inverse problems. A possible data-driven regularizer is proposed. Numerical results are presented for a tomographic sparse data problem and demonstrate good performance of NETT even for unknowns of a type different from the training data. To derive the convergence and convergence rates results, we introduce a new framework based on the absolute Bregman distance, which generalizes the standard Bregman distance from the convex to the non-convex case.
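The abstract describes NETT as seeking nearly data-consistent solutions with a small value of a regularizer given by a trained neural network. The following Python sketch only illustrates that kind of variational reconstruction; it is not the authors' implementation. The random linear forward operator, the small placeholder network standing in for the trained regularizer, and the plain gradient-based minimization are all assumptions made for the example.

import torch

torch.manual_seed(0)

n, m = 64, 32                      # unknown dimension, measurement dimension
A = torch.randn(m, n) / m ** 0.5   # toy linear forward operator (assumed for illustration)
x_true = torch.randn(n)
y = A @ x_true                     # indirect measurements

# Placeholder for a trained network regularizer R_theta; in NETT it would be trained
# beforehand so that it takes small values on artifact-free reconstructions.
regularizer = torch.nn.Sequential(
    torch.nn.Linear(n, 128), torch.nn.ReLU(), torch.nn.Linear(128, 1)
)
regularizer.requires_grad_(False)  # regularizer is held fixed during reconstruction

alpha = 1e-2                       # regularization parameter
x = torch.zeros(n, requires_grad=True)
opt = torch.optim.Adam([x], lr=1e-2)

for step in range(500):
    opt.zero_grad()
    data_fit = torch.sum((A @ x - y) ** 2)    # data-consistency term
    penalty = regularizer(x).squeeze().abs()  # learned penalty, kept nonnegative
    loss = data_fit + alpha * penalty         # NETT-type regularized functional
    loss.backward()
    opt.step()

print(f"final objective: {loss.item():.4f}")

In the paper the regularizer is trained on data (e.g., to penalize reconstruction artifacts) and the convergence analysis is carried out for the resulting non-convex functional; the gradient loop above merely stands in for whichever minimization scheme is used in practice.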

Details

Language: English
Database: OpenAIRE
Accession number: edsair.doi.dedup.....f8129b461407f9db04fc926b31a817e6