
Universal Approximation Property of Neural Ordinary Differential Equations

Authors :
Teshima, Takeshi
Tojo, Koichi
Ikeda, Masahiro
Ishikawa, Isao
Oono, Kenta
Publication Year :
2020

Abstract

Neural ordinary differential equations (NODEs) are an invertible neural network architecture that is promising for its free-form Jacobian and the availability of a tractable Jacobian determinant estimator. Recently, the representation power of NODEs has been partly uncovered: they form an $L^p$-universal approximator for continuous maps under certain conditions. However, $L^p$-universality may fail to guarantee approximation over the entire input domain, since it can hold even when the approximator differs greatly from the target function on a small region of the input space. To further uncover the potential of NODEs, we show a stronger approximation property, namely $\sup$-universality for approximating a large class of diffeomorphisms. The result is proved by leveraging a structure theorem of the diffeomorphism group, and it complements the existing literature by establishing a fairly large class of mappings that NODEs can approximate with a stronger guarantee.

Comment: 10 pages, 1 table. Accepted at the NeurIPS 2020 Workshop on Differential Geometry meets Deep Learning
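To illustrate the invertibility the abstract refers to, below is a minimal sketch of a NODE as a flow map: the forward map integrates $\dot{x} = f(x, t)$ from $t=0$ to $t=1$, and the inverse simply integrates the same field backwards in time. The vector field, weights, and fixed-step RK4 solver here are arbitrary illustrative choices, not the construction analyzed in the paper.

```python
# Illustrative sketch of a neural ODE (NODE) as an invertible map,
# using plain NumPy and a fixed-step RK4 integrator. All weights and
# hyperparameters below are arbitrary assumptions for demonstration.
import numpy as np

rng = np.random.default_rng(0)
W1, b1 = 0.5 * rng.normal(size=(16, 2)), np.zeros(16)
W2, b2 = 0.5 * rng.normal(size=(2, 16)), np.zeros(2)

def vector_field(x, t):
    """Small time-independent MLP defining dx/dt = f(x)."""
    return W2 @ np.tanh(W1 @ x + b1) + b2

def integrate(x0, t0, t1, steps=200):
    """Fixed-step RK4 integration of the NODE flow from t0 to t1."""
    h = (t1 - t0) / steps
    x, t = x0.copy(), t0
    for _ in range(steps):
        k1 = vector_field(x, t)
        k2 = vector_field(x + 0.5 * h * k1, t + 0.5 * h)
        k3 = vector_field(x + 0.5 * h * k2, t + 0.5 * h)
        k4 = vector_field(x + h * k3, t + h)
        x = x + (h / 6.0) * (k1 + 2 * k2 + 2 * k3 + k4)
        t += h
    return x

x = np.array([1.0, -0.5])
y = integrate(x, 0.0, 1.0)       # forward map: x -> y
x_rec = integrate(y, 1.0, 0.0)   # inverse map: integrate backwards in time
print(np.allclose(x, x_rec, atol=1e-6))  # recovers x up to solver error
```

The universality results discussed in the abstract concern how rich the set of such flow maps is: $L^p$-universality controls the approximation error only in an integral sense, whereas the $\sup$-universality established in the paper bounds the error uniformly over the input domain.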

Details

Database :
arXiv
Publication Type :
Report
Accession number :
edsarx.2012.02414
Document Type :
Working Paper