Federated probability memory recall for federated continual learning.
- Source :
- Information Sciences. Jun 2023, Vol. 629, p551-565. 15p.
- Publication Year :
- 2023
Abstract
- Federated Continual Learning (FCL) approaches suffer from two major problems: probability bias and imbalance in parameter variations. These two problems lead to catastrophic forgetting of the network in the FCL process. Therefore, this paper proposes a novel FCL framework, Federated Probability Memory Recall (FedPMR), to mitigate both the probability bias problem and the imbalance in parameter variations. First, for the probability bias problem, the paper designs the Probability Distribution Alignment (PDA) module, which consolidates the memory of old probability experience. Specifically, PDA maintains a replay buffer and uses the probability memory stored in the buffer to correct the offset probabilities of the previous tasks during two-stage training. Second, to alleviate the imbalance in parameter variations, the paper designs the Parameter Consistency Constraint (PCC) module, which constrains the magnitude of neural-weight changes for previous tasks. Concretely, PCC applies a set of adaptive weights to subsets of the regularization term that constrains parameter changes, forcing the current model to remain sufficiently close to the past model in parameter-space distance. Experiments with various levels of task similarity across clients demonstrate that the technique establishes new state-of-the-art performance compared to previous FCL approaches. [ABSTRACT FROM AUTHOR]
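The abstract describes two loss-style components: PDA, which aligns current outputs on old samples with probabilities stored in a replay buffer, and PCC, which penalizes weighted parameter-space drift from the past model. The paper's exact formulations are not given here, so the following is only a minimal illustrative sketch: the function names, the choice of KL divergence for alignment, and the per-subset weight dictionary are all assumptions, not the authors' implementation.

```python
import numpy as np

def pda_alignment_loss(stored_probs, current_probs, eps=1e-12):
    # Hypothetical PDA-style term: KL divergence pulling the current
    # model's outputs for replayed samples back toward the probability
    # memory stored in the buffer (assumed form, not from the paper).
    stored = np.clip(stored_probs, eps, 1.0)
    current = np.clip(current_probs, eps, 1.0)
    return float(np.sum(stored * np.log(stored / current)))

def pcc_penalty(current_params, past_params, adaptive_weights):
    # Hypothetical PCC-style term: weighted squared distance between
    # current and past parameters, with one adaptive weight per
    # parameter subset so important weights are allowed to drift less.
    total = 0.0
    for name, w in adaptive_weights.items():
        diff = current_params[name] - past_params[name]
        total += w * float(np.sum(diff ** 2))
    return total
```

Under this sketch, the client's training objective on a new task would add both terms to the task loss; identical distributions give a zero PDA term, and an unchanged network gives a zero PCC term.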
Details
- Language :
- English
- ISSN :
- 00200255
- Volume :
- 629
- Database :
- Academic Search Index
- Journal :
- Information Sciences
- Publication Type :
- Periodical
- Accession number :
- 162396287
- Full Text :
- https://doi.org/10.1016/j.ins.2023.02.015