
TJU-DNN: A trajectory-unified framework for training deep neural networks and its applications.

Authors :
Lv, Xian-Long
Chiang, Hsiao-Dong
Wang, Bin
Zhang, Yong-Feng
Source :
Neurocomputing. Feb 2023, Vol. 520, p103-114. 12p.
Publication Year :
2023

Abstract

The training of deep neural networks mainly relies on gradient descent (GD) methods. These methods, however, are very sensitive to initialization and hyperparameters. In this paper, an enhanced gradient descent method guided by a trajectory-based method for training deep neural networks, termed the Trajectory Unified Framework (TJU) method, is presented. From a theoretical viewpoint, the robustness of the TJU-based method is supported by an analytical basis presented in the paper. From a computational viewpoint, a TJU methodology consisting of a Block-Diagonal-Pseudo-Transient-Continuation method and a gradient descent method, termed the TJU-GD method, is developed for training deep neural networks to obtain high-quality results. Furthermore, to address the issue of imbalanced classification, a TJU-Focal-GD method is developed and evaluated. Experimental evaluation of the proposed TJU-GD on various public datasets reveals that the proposed method achieves substantial improvements over baseline methods. The proposed TJU-Focal-GD also possesses several advantages over other methods on a class of imbalanced datasets from the homemade power line inspection dataset (PLID). [ABSTRACT FROM AUTHOR]
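The abstract's TJU-Focal-GD method is not specified here, but its name suggests it builds on the standard focal loss (Lin et al., 2017), which handles class imbalance by down-weighting easy, well-classified examples so gradient descent focuses on hard, minority-class samples. A minimal NumPy sketch of that focal-loss component (an illustration of the general technique, not the authors' implementation; the function name and defaults are assumptions):

```python
import numpy as np

def focal_loss(probs, labels, gamma=2.0, alpha=0.25, eps=1e-12):
    """Binary focal loss: -alpha_t * (1 - p_t)^gamma * log(p_t).

    The (1 - p_t)^gamma factor shrinks the loss of confident correct
    predictions, so rare-class errors dominate the gradient signal.
    Illustrative sketch only -- not the paper's TJU-Focal-GD method.
    """
    p_t = np.where(labels == 1, probs, 1.0 - probs)        # prob of the true class
    alpha_t = np.where(labels == 1, alpha, 1.0 - alpha)    # class-balance weight
    return -alpha_t * (1.0 - p_t) ** gamma * np.log(p_t + eps)

# An easy (confident, correct) example contributes far less loss
# than a hard (misclassified) one, unlike plain cross-entropy.
easy = focal_loss(np.array([0.95]), np.array([1]))
hard = focal_loss(np.array([0.10]), np.array([1]))
```

With `gamma=0` and `alpha=0.5` the expression reduces to (scaled) binary cross-entropy, which is why the focusing parameter `gamma` is the key knob for imbalanced data.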

Details

Language :
English
ISSN :
0925-2312
Volume :
520
Database :
Academic Search Index
Journal :
Neurocomputing
Publication Type :
Academic Journal
Accession number :
160939315
Full Text :
https://doi.org/10.1016/j.neucom.2022.11.052