
Gradient Correction beyond Gradient Descent

Authors :
Li, Zefan
Ni, Bingbing
Li, Teng
Zhang, WenJun
Gao, Wen
Publication Year :
2022

Abstract

The great success of neural networks is inseparable from the application of gradient-descent (GD) algorithms, and many variant algorithms have emerged to improve the GD optimization process. The gradient used in back-propagation is arguably the most crucial quantity in training a neural network, and its quality can be degraded by several factors, e.g., noisy data, numerical error, and algorithmic limitations. To reveal gradient information beyond gradient descent, we introduce a framework (\textbf{GCGD}) to perform gradient correction. GCGD consists of two plug-in modules: 1) inspired by the idea of gradient prediction, we propose a \textbf{GC-W} module for weight gradient correction; 2) based on Neural ODE, we propose a \textbf{GC-ODE} module for hidden-state gradient correction. Experimental results show that our gradient correction framework can effectively improve the gradient quality, reducing training epochs by $\sim$ 20\% and also improving network performance.

Comment: There are errors in the descriptions of the GC-W and GC-ODE modules in Section 3.2 and Section 3.3 that may mislead readers, e.g., 1) the structure of the GC-W module is not described correctly, and 2) the GC-ODE module is not described clearly. We therefore wish to withdraw this paper for a thorough correction.
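To make the high-level idea concrete, the following is a minimal, hypothetical sketch of gradient correction as a plug-in step around plain SGD: each raw weight gradient is blended with a prediction (here, a simple exponential moving average of past gradients) before the parameter update. The EMA predictor, the mixing rule, and all names (EMACorrectedSGD, beta, mix) are illustrative assumptions only; this is not the paper's GC-W or GC-ODE module.

# Hypothetical sketch of gradient correction around plain SGD (PyTorch).
# NOT the paper's GC-W/GC-ODE modules; predictor and mixing rule are assumptions.
import torch

class EMACorrectedSGD:
    def __init__(self, params, lr=0.1, beta=0.9, mix=0.5):
        self.params = list(params)
        self.lr = lr          # learning rate
        self.beta = beta      # EMA decay for the gradient predictor
        self.mix = mix        # weight of the predicted gradient in the blend
        self.pred = [torch.zeros_like(p) for p in self.params]

    @torch.no_grad()
    def step(self):
        for p, g_pred in zip(self.params, self.pred):
            if p.grad is None:
                continue
            # update the predictor with the raw gradient
            g_pred.mul_(self.beta).add_(p.grad, alpha=1.0 - self.beta)
            # corrected gradient = blend of raw and predicted gradients
            g_corr = (1.0 - self.mix) * p.grad + self.mix * g_pred
            p.add_(g_corr, alpha=-self.lr)

    def zero_grad(self):
        for p in self.params:
            if p.grad is not None:
                p.grad.detach_()
                p.grad.zero_()

# Usage sketch: wrap any model's parameters and train as usual.
model = torch.nn.Linear(4, 1)
opt = EMACorrectedSGD(model.parameters(), lr=0.05)
x, y = torch.randn(8, 4), torch.randn(8, 1)
for _ in range(10):
    opt.zero_grad()
    loss = torch.nn.functional.mse_loss(model(x), y)
    loss.backward()
    opt.step()

Because the correction is applied before the update step, any base optimizer could in principle replace the plain SGD update used in this sketch, which is consistent with the abstract's description of GC-W and GC-ODE as plug-in modules.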

Details

Database :
arXiv
Publication Type :
Report
Accession number :
edsarx.2203.08345
Document Type :
Working Paper