A Learning Rate Method for Full-Batch Gradient Descent

Authors:
Vogel Manfred
Asadi Soodabeh
Source:
Műszaki Tudományos Közlemények. 13:174-177
Publication Year:
2020
Publisher:
Műszaki Tudományos Közlemények, 2020.

Abstract

In this paper, we present a learning rate method for gradient descent that uses only first-order information and requires no manual tuning of the learning rate. We applied the method to a linear neural network built from scratch, trained with full-batch gradient descent, in which the gradients are computed over the whole dataset for each parameter update. We tested the method on a moderate-sized dataset of housing information and compared the results with those of the Adam optimizer applied to a sequential neural network model from Keras. The comparison shows that our method reaches the minimum in far fewer epochs than Adam.
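The abstract does not spell out the learning rate rule itself, but the training setup it describes can be sketched as follows: full-batch gradient descent on a linear model, with the step size chosen automatically from first-order information. As a stand-in for the paper's unspecified rule, this sketch uses the Barzilai-Borwein step size, a well-known first-order formula that needs no manual tuning; it is an illustrative assumption, not necessarily the authors' method.

```python
import numpy as np

def full_batch_gd(X, y, epochs=100, lr=1e-3):
    """Full-batch gradient descent on a linear least-squares model.

    Each update uses the gradient computed over the WHOLE dataset,
    as described in the abstract. The step size is adapted with the
    Barzilai-Borwein rule (an illustrative choice, not necessarily
    the paper's method): lr = |s.g| / (g.g), where s is the change
    in parameters and g the change in gradients.
    """
    n, d = X.shape
    w = np.zeros(d)
    grad = X.T @ (X @ w - y) / n          # full-batch gradient of 0.5*MSE
    for _ in range(epochs):
        w_new = w - lr * grad
        grad_new = X.T @ (X @ w_new - y) / n
        s = w_new - w                     # parameter difference
        g = grad_new - grad               # gradient difference
        denom = g @ g
        if denom > 0:                     # first-order step-size update
            lr = abs(s @ g) / denom
        w, grad = w_new, grad_new
    return w
```

On a simple synthetic problem such as `y = 2x`, this recovers the true weight in a handful of epochs, since for a quadratic loss the Barzilai-Borbein step approximates the inverse curvature.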

Details

ISSN:
2601-5773
Volume:
13
Database:
OpenAIRE
Journal:
Műszaki Tudományos Közlemények
Accession number:
edsair.doi...........3b892581497537420af919493d3cc977