Monotonic learning with hypothesis evolution.
- Source :
- Information Sciences, Nov 2023, Vol. 647
- Publication Year :
- 2023
Abstract
- A machine learning algorithm is monotonic if it returns a model with better performance when trained on a larger data set. Monotonicity is essential in scenarios where a learning algorithm works with continually collected data, as non-monotonicity may result in unstable performance and a substantial waste of resources during the learning process. However, existing learning algorithms operating in scenarios such as online learning, domain-incremental learning and reinforcement learning hardly address the monotonicity issue. In this paper, we propose an evolutionary framework that focuses on enforcing monotonicity for a learning algorithm over streaming data feeds. In each iteration, training is triggered by a new collection of incoming data, which creates a new generation of hypotheses, and only the best-performing portion of the generation is retained for the next round, based on a novel statistical hypothesis test. We carry out experiments on DNN models with continual data feeds constructed from MNIST, CIFAR-10, SST-2 and Tiny ImageNet. The results show that our approach significantly increases the probability of locally monotonic updates on the learning curves of the trained models and outperforms the state-of-the-art methods for that purpose. [ABSTRACT FROM AUTHOR]
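The abstract's evolutionary loop — each incoming batch triggers a new generation of hypotheses, and only the best-performing portion survives — can be sketched as a toy. Everything below is a hypothetical stand-in, not the authors' method: the "model" is a 1-D decision threshold, the update rule is an invented heuristic, and plain elitist selection on a fixed held-out set replaces the paper's statistical hypothesis test.

```python
import random

random.seed(0)
TRUE_THRESHOLD = 0.6   # hypothetical ground truth the hypotheses try to recover
POP_SIZE = 8

def accuracy(h, data):
    # fraction of (x, label) pairs that the threshold h classifies correctly
    return sum((x >= h) == (y == 1) for x, y in data) / len(data)

def make_batch(n):
    xs = [random.random() for _ in range(n)]
    return [(x, int(x >= TRUE_THRESHOLD)) for x in xs]

def propose(h, batch):
    # invented update rule: move the threshold toward a randomly chosen
    # point the current hypothesis misclassifies, plus Gaussian noise
    errors = [x for x, y in batch if (x >= h) != (y == 1)]
    target = random.choice(errors) if errors else h
    return 0.5 * (h + target) + random.gauss(0, 0.02)

holdout = make_batch(500)                          # fixed evaluation set
population = [random.random() for _ in range(POP_SIZE)]
best_curve = []

for _ in range(10):
    batch = make_batch(50)                         # new streaming data arrives
    # each survivor spawns offspring trained against the fresh batch
    offspring = [propose(h, batch) for h in population for _ in range(2)]
    generation = population + offspring
    # retain the best-performing portion; because survivors carry over,
    # the best held-out score can never decrease between rounds
    generation.sort(key=lambda h: accuracy(h, holdout), reverse=True)
    population = generation[:POP_SIZE]
    best_curve.append(accuracy(population[0], holdout))
```

With elitist retention, `best_curve` is non-decreasing by construction — the locally monotonic updates the paper aims for. The paper's statistical test addresses the harder streaming setting, where performance must be compared on finite, changing batches rather than a fixed held-out set.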
Details
- Language :
- English
- ISSN :
- 0020-0255
- Volume :
- 647
- Database :
- Academic Search Index
- Journal :
- Information Sciences
- Publication Type :
- Periodical
- Accession number :
- 170903598
- Full Text :
- https://doi.org/10.1016/j.ins.2023.119455