Online Stochastic DCA With Applications to Principal Component Analysis.
- Source :
- IEEE transactions on neural networks and learning systems [IEEE Trans Neural Netw Learn Syst] 2024 May; Vol. 35 (5), pp. 7035-7047. Date of Electronic Publication: 2024 May 02.
- Publication Year :
- 2024
Abstract
- Stochastic algorithms are well known for their performance in the era of big data. In this article, we study nonsmooth stochastic difference-of-convex (DC) programs, a major class of nonconvex stochastic optimization problems with applications in diverse domains, in particular machine learning. We propose new online stochastic algorithms based on the state-of-the-art DC algorithm (DCA), a powerful approach in nonconvex programming, in the online context of streaming data continuously generated by some (unknown) source distribution. The new schemes rely on the stochastic approximation (SA) principle: the deterministic quantities of the standard DCA are replaced by noisy estimators constructed from newly arriving samples. The convergence of the proposed algorithms is analyzed in depth using tools from modern convex analysis and martingale theory. Finally, we study several aspects of the proposed algorithms on an important problem in machine learning: the expected problem in principal component analysis (PCA).
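To make the SA principle concrete, the following is a minimal illustrative sketch (not the authors' algorithm; all names, the DC split, and the step-size rule are assumptions) of an online stochastic DCA-style iteration for the expected PCA problem min over the unit ball of -E[(wᵀx)²], using the DC split g(w) = (ρ/2)‖w‖² plus the ball's indicator, h(w) = (ρ/2)‖w‖² + E[(wᵀx)²]. The deterministic subgradient of h is replaced by a running average built from newly arriving samples, and each convex subproblem reduces to a projection onto the unit ball.

```python
import random
import math

# Hypothetical sketch of an online stochastic DCA step for expected PCA.
# Deterministic quantities (the subgradient of the concave part h) are
# replaced by a running stochastic estimate updated from each new sample.

def dot(u, v):
    return sum(a * b for a, b in zip(u, v))

def project_unit_ball(w):
    # Exact solution of the convex subproblem min (rho/2)||w||^2 - <y, w>
    # over the unit ball, after the rescaling w = y / rho.
    n = math.sqrt(dot(w, w))
    return [wi / n for wi in w] if n > 1.0 else w

def online_stochastic_dca(sample, dim, rho=1.0, iters=2000, seed=0):
    rng = random.Random(seed)
    w = [1.0 / math.sqrt(dim)] * dim     # feasible starting point
    v = [0.0] * dim                      # running estimate of 2*E[(x^T w) x]
    for k in range(1, iters + 1):
        x = sample(rng)                  # newly arriving sample
        alpha = 1.0 / k                  # SA averaging weight (assumed rule)
        xw = dot(x, w)
        v = [(1 - alpha) * vi + alpha * 2.0 * xw * xi
             for vi, xi in zip(v, x)]
        # Stochastic subgradient of h(w) = (rho/2)||w||^2 + E[(w^T x)^2]:
        y = [rho * wi + vi for wi, vi in zip(w, v)]
        # DCA convex step: minimize g(w) - <y, w> in closed form.
        w = project_unit_ball([yi / rho for yi in y])
    return w

# Toy stream: 2-D Gaussian with dominant variance along the first axis,
# so the leading principal direction is (±1, 0).
def sample(rng):
    return [rng.gauss(0.0, 3.0), rng.gauss(0.0, 0.5)]

w = online_stochastic_dca(sample, dim=2)
# w should align (up to sign) with the leading principal direction.
```

Because the eigengap of the toy covariance is large, the iterate aligns with the dominant direction after a modest number of samples; the 1/k averaging is one standard SA choice, and other diminishing step sizes could be substituted.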
Details
- Language :
- English
- ISSN :
- 2162-2388
- Volume :
- 35
- Issue :
- 5
- Database :
- MEDLINE
- Journal :
- IEEE transactions on neural networks and learning systems
- Publication Type :
- Academic Journal
- Accession number :
- 36315540
- Full Text :
- https://doi.org/10.1109/TNNLS.2022.3213558