
Learning via nonlinear conjugate gradients and depth-varying neural ODEs

Authors :
Baravdish, George
Eilertsen, Gabriel
Jaroudi, Rym
Johansson, B. Tomas
Malý, Lukáš
Unger, Jonas
Publication Year :
2022

Abstract

The inverse problem of supervised reconstruction of depth-variable (time-dependent) parameters in a neural ordinary differential equation (NODE) is considered, that is, finding the weights of a residual network with time-continuous layers. The NODE is treated as an isolated entity describing the full network, as opposed to earlier research, which embedded it between pre- and post-appended layers trained by conventional methods. The proposed parameter reconstruction is done for a general first-order differential equation by minimizing a cost functional covering a variety of loss functions and penalty terms. A nonlinear conjugate gradient (NCG) method is derived for the minimization. Mathematical properties are stated for the differential equation and the cost functional. The required adjoint problem is derived, together with a sensitivity problem. The sensitivity problem can estimate changes in the network output under perturbation of the trained parameters. To preserve smoothness during the iterations, the Sobolev gradient is calculated and incorporated. As a proof-of-concept, numerical results are included for a NODE and two synthetic datasets, and compared with standard gradient approaches (not based on NODEs). The results show that the proposed method works well for deep learning with an infinite number of layers, and has built-in stability and smoothness.

Comment: 26 pages, 3 figures
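The approach the abstract summarizes (depth-varying weights in a NODE, fitted by minimizing a cost functional with a nonlinear conjugate gradient method) can be illustrated with a minimal sketch. This is not the authors' formulation: it uses a toy scalar dynamics `dz/dt = tanh(theta(t) * z)` with piecewise-constant weights, forward Euler integration in place of a proper ODE solver, a finite-difference gradient in place of the adjoint-based (Sobolev) gradient, a fixed step in place of a line search, and the Polak-Ribiere variant of NCG. All function names are hypothetical.

```python
import numpy as np

def node_forward(z0, theta, T=1.0):
    """Euler-integrate dz/dt = tanh(theta(t) * z) over [0, T], with the
    depth-varying (time-dependent) weight given as a piecewise-constant
    array theta -- a toy stand-in for a NODE with time-continuous layers."""
    dt = T / len(theta)
    z = np.array(z0, dtype=float)
    for th in theta:
        z = z + dt * np.tanh(th * z)
    return z

def loss(theta, z0, target):
    """Squared-error misfit: a simple instance of the cost functional."""
    return 0.5 * np.sum((node_forward(z0, theta) - target) ** 2)

def grad_fd(theta, z0, target, eps=1e-6):
    """Central finite-difference gradient; in the paper's setting this role
    is played by the adjoint problem and the Sobolev gradient."""
    g = np.zeros_like(theta)
    for i in range(len(theta)):
        tp, tm = theta.copy(), theta.copy()
        tp[i] += eps
        tm[i] -= eps
        g[i] = (loss(tp, z0, target) - loss(tm, z0, target)) / (2 * eps)
    return g

# Nonlinear conjugate gradients (Polak-Ribiere+) on the weights.
z0, target = np.array([0.5]), np.array([1.0])
theta_init = np.zeros(20)          # 20 "layers" in depth/time
theta = theta_init.copy()
g = grad_fd(theta, z0, target)
d = -g                             # initial search direction
for _ in range(50):
    theta = theta + 0.5 * d        # fixed step in place of a line search
    g_new = grad_fd(theta, z0, target)
    beta = max(0.0, g_new @ (g_new - g) / (g @ g + 1e-12))
    d = -g_new + beta * d          # conjugate direction update
    g = g_new

print(loss(theta, z0, target))     # should be below the initial misfit
```

A perturbation of the trained `theta` fed back through `node_forward` also gives a crude numerical analogue of the sensitivity problem mentioned in the abstract: it measures how the network output moves when the reconstructed parameters are disturbed.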

Details

Database :
arXiv
Publication Type :
Report
Accession number :
edsarx.2202.05766
Document Type :
Working Paper