
Deep null space learning for inverse problems: convergence analysis and rates.

Authors :
Johannes Schwab
Stephan Antholzer
Markus Haltmeier
Source :
Inverse Problems. Feb 2019, Vol. 35 Issue 2, p1-1. 1p.
Publication Year :
2019

Abstract

Recently, deep learning based methods have appeared as a new paradigm for solving inverse problems. These methods empirically show excellent performance but lack theoretical justification; in particular, no results on their regularization properties are available. This is the case, in particular, for two-step deep learning approaches, where a classical reconstruction method is applied to the data in a first step and a trained deep neural network is applied to improve the result in a second step. In this paper, we close the gap between practice and theory for a particular network structure in a two-step approach. For that purpose, we propose using so-called null space networks and introduce the concept of -regularization. Combined with a standard regularization method as reconstruction layer, the proposed deep null space learning approach is shown to be a -regularization method; convergence rates are also derived. The proposed null space network structure naturally preserves data consistency, which is considered a key property of neural networks for solving inverse problems. [ABSTRACT FROM AUTHOR]
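The null space network structure described in the abstract can be sketched as x maps to x + P(U(x)), where P projects onto the null space of the forward operator A and U is a trained network, so that applying A to the output reproduces the same data as the input. Below is a minimal NumPy sketch of this two-step scheme, assuming a toy random forward operator and a tanh placeholder in place of the trained network; the operator, the placeholder, and all names are illustrative choices, not the authors' implementation.

    import numpy as np

    # Toy underdetermined forward operator A : R^6 -> R^3 (non-trivial null space).
    rng = np.random.default_rng(0)
    A = rng.standard_normal((3, 6))
    A_pinv = np.linalg.pinv(A)           # Moore-Penrose pseudoinverse A^+
    P_null = np.eye(6) - A_pinv @ A      # orthogonal projection onto null(A)

    def trained_network(x):
        # Placeholder for a trained network (illustrative assumption).
        return np.tanh(x)

    def null_space_network(x):
        # Null space network: only the null-space component of the network
        # output is added, so the measured data A x is left unchanged.
        return x + P_null @ trained_network(x)

    # Two-step reconstruction: classical step (pseudoinverse), then network step.
    x_true = rng.standard_normal(6)
    y = A @ x_true                        # exact data
    x_init = A_pinv @ y                   # step 1: classical reconstruction
    x_rec = null_space_network(x_init)    # step 2: null space network refinement

    # Data consistency: A applied to the refined reconstruction reproduces y.
    print(np.allclose(A @ x_rec, y))      # True up to numerical precision

In the paper's setting, the first step is a classical (regularized) reconstruction and the second step is the trained null space network; the projection guarantees that the network can only alter components invisible to the forward operator, which is why the combined scheme lends itself to a regularization analysis.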

Details

Language :
English
ISSN :
02665611
Volume :
35
Issue :
2
Database :
Academic Search Index
Journal :
Inverse Problems
Publication Type :
Academic Journal
Accession number :
134109057
Full Text :
https://doi.org/10.1088/1361-6420/aaf14a