A consensus-based decentralized training algorithm for deep neural networks with communication compression
- Authors
- Zhengtao Ding and Bo Liu
- Subjects
Distributed training, Consensus, Artificial neural network, Model compression, Data compression ratio, Decentralized communication topology, Convergence, Algorithm
- Abstract
Facing the challenge of distributed computing on large-scale data, this paper proposes a consensus-based decentralized training method with communication compression. First, the training method is designed over a decentralized topology to reduce the communication burden on the busiest agent and to avoid any agent revealing its locally stored data. The convergence of the decentralized training algorithm is then analyzed, showing that the decentrally trained model reaches the minimal empirical risk on the whole dataset without sharing data samples. Furthermore, model compression combined with an error-compensation method is used to reduce communication costs during the decentralized training process. Finally, the simulation study shows that the proposed decentralized training with error-compensated communication compression is applicable to both IID and non-IID datasets and performs much better than local training. Moreover, with an appropriate compression rate, the proposed algorithm achieves performance comparable to uncompressed decentralized and centralized training while substantially reducing communication costs.
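The core mechanism described above, consensus averaging over a decentralized topology combined with error-compensated message compression, can be sketched as follows. This is a minimal numpy illustration, not the authors' implementation: the top-k compressor, the ring mixing matrix, and the step size are illustrative assumptions.

```python
import numpy as np

def topk_compress(x, k):
    """Keep only the k largest-magnitude entries of x (an illustrative compressor)."""
    out = np.zeros_like(x)
    idx = np.argsort(np.abs(x))[-k:]
    out[idx] = x[idx]
    return out

def decentralized_step(params, grads, W, errors, k, lr=0.1):
    """One consensus step with error-compensated top-k compression (a sketch).

    params: (n_agents, dim) local model copies
    grads:  (n_agents, dim) local gradients
    W:      (n_agents, n_agents) doubly stochastic mixing matrix (the topology)
    errors: (n_agents, dim) residuals carried over from previous compressions
    """
    n, _ = params.shape
    # Local gradient step on each agent's private data.
    local = params - lr * grads
    msgs = np.zeros_like(local)
    for i in range(n):
        # Error compensation: add the previously uncommunicated residual
        # back in before compressing, so nothing is permanently lost.
        to_send = local[i] + errors[i]
        msgs[i] = topk_compress(to_send, k)
        errors[i] = to_send - msgs[i]
    # Consensus averaging over the compressed messages only.
    return W @ msgs, errors
```

With `k` equal to the model dimension the compressor is the identity and the residuals vanish, recovering plain (uncompressed) decentralized training; smaller `k` trades accuracy of each message for communication savings, with the residual feedback compensating over time.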
- Published
- 2021