
Logistic Regression with Variable Fractional Gradient Descent Method

Authors:
Xinrui Xu
Wei Nai
Yuan Sun
Dan Li
Zan Yang
Yating Wang
Qi Jia
Source:
2020 IEEE 9th Joint International Information Technology and Artificial Intelligence Conference (ITAIC)
Publication Year:
2020
Publisher:
IEEE, 2020.

Abstract

Logistic regression is a classic classification method in machine learning. Classical logistic regression solves for the optimal parameters of the loss function with an ordinary gradient descent method, which easily becomes trapped in local extrema. A fixed-order fractional gradient descent method, in turn, cannot converge to the exact extreme point and always leaves a residual deviation, so a variable fractional gradient descent method is adopted to construct a new type of iteration that ensures the global convergence of the optimization algorithm. In this paper, the new variable fractional gradient descent method is used to solve for the optimal parameters of the loss function, overcoming the limitation that the gradient function places on the step size. Logistic regression combined with the variable fractional gradient descent method is then applied to a data dimensionality reduction problem to verify the effectiveness of the optimized algorithm.
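
The abstract describes the method only at a high level, so the Python sketch below illustrates one plausible reading of it: logistic regression trained with a Caputo-type fractional correction of the gradient whose order is varied toward 1 across iterations, so that the update reduces to ordinary gradient descent near the optimum. The correction formula, the order schedule, the learning rate, and the toy dataset are assumptions made for illustration, not the authors' exact formulation.

```python
import numpy as np
from scipy.special import gamma


def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))


def fit_logistic_variable_fractional(X, y, lr=0.5, n_iter=2000,
                                     alpha0=0.6, eps=1e-8):
    """Logistic regression trained with a variable-order fractional
    gradient step (illustrative sketch, not the authors' exact update).

    A Caputo-style per-coordinate correction
        |w_k - w_{k-1}|^(1 - alpha_k) / Gamma(2 - alpha_k)
    scales the ordinary gradient. With a fixed alpha the correction
    shrinks as the iterates converge and the updates stall short of the
    minimizer; letting alpha_k drift toward 1 drives the correction to 1,
    so the scheme reduces to standard gradient descent near the optimum.
    """
    n, d = X.shape
    w = np.zeros(d)
    w_prev = w.copy()
    for k in range(n_iter):
        # Ordinary gradient of the average cross-entropy loss
        grad = X.T @ (sigmoid(X @ w) - y) / n
        # Assumed schedule: fractional order drifts from alpha0 toward 1
        alpha = 1.0 - (1.0 - alpha0) / (1.0 + 0.05 * k)
        # Caputo-type correction using the previous iterate as the lower
        # terminal; eps avoids a zero factor on the very first step
        frac = (np.abs(w - w_prev) + eps) ** (1.0 - alpha) / gamma(2.0 - alpha)
        w_new = w - lr * frac * grad
        w_prev, w = w, w_new
    return w


if __name__ == "__main__":
    # Toy linearly separable data to sanity-check the fit
    rng = np.random.default_rng(0)
    X = rng.normal(size=(200, 2))
    y = (X @ np.array([2.0, -1.0]) + 0.5 > 0).astype(float)
    X = np.hstack([X, np.ones((200, 1))])  # bias column
    w = fit_logistic_variable_fractional(X, y)
    acc = np.mean((sigmoid(X @ w) > 0.5) == y)
    print("training accuracy:", acc)
```

Running the script fits a small separable problem and prints the training accuracy. The stalling behaviour of the fixed-order variant, consistent with the residual deviation mentioned in the abstract, can be reproduced by holding alpha constant instead of using the decaying schedule.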

Details

Database:
OpenAIRE
Journal:
2020 IEEE 9th Joint International Information Technology and Artificial Intelligence Conference (ITAIC)
Accession number:
edsair.doi...........b0770624ac1bc0230aac2fb04f92a0b6