Finite Sample Guarantees for Distributed Online Parameter Estimation with Communication Costs
- Publication Year :
- 2022
Abstract
- We study the problem of estimating an unknown parameter in a distributed and online manner. Existing work on distributed online learning typically either focuses on asymptotic analysis or provides bounds on regret. However, these results may not directly translate into bounds on the error of the learned model after a finite number of time-steps. In this paper, we propose a distributed online estimation algorithm that enables each agent in a network to improve its estimation accuracy by communicating with its neighbors. We provide non-asymptotic bounds on the estimation error, leveraging the statistical properties of the underlying model. Our analysis demonstrates a trade-off between estimation error and communication costs. Further, our analysis allows us to determine a time after which communication can be stopped (to limit communication costs) while still meeting a desired estimation accuracy. We also provide a numerical example to validate our results.
- Comment: 9 pages, 1 figure, 2022 Conference on Decision and Control (CDC)
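- The abstract describes, at a high level, a scheme in which each agent mixes its estimate with those of its neighbors and then applies a local correction, with communication halted after some point to save cost. The following Python sketch is only an illustrative instance of that general consensus-plus-local-update pattern, not the paper's algorithm; the ring topology, the linear measurement model, the step-size schedule, and the cutoff `T_comm` are all assumptions made for this example.

```python
# Illustrative sketch (assumptions, not the paper's method): agents estimate an
# unknown parameter theta_star from noisy linear observations, mixing with
# neighbors until a cutoff time T_comm, then continuing with local updates only.
import numpy as np

rng = np.random.default_rng(0)

n_agents, dim = 5, 3
theta_star = rng.normal(size=dim)          # unknown parameter to be estimated

# Assumed topology: a ring network with a symmetric, doubly stochastic mixing matrix.
W = np.zeros((n_agents, n_agents))
for i in range(n_agents):
    W[i, i] = 0.5
    W[i, (i - 1) % n_agents] = 0.25
    W[i, (i + 1) % n_agents] = 0.25

theta = np.zeros((n_agents, dim))          # each agent's running estimate
T, T_comm = 2000, 500                      # total steps; communication stops at T_comm

for t in range(1, T + 1):
    # Each agent i receives a noisy linear measurement y_i = x_i^T theta_star + noise.
    X = rng.normal(size=(n_agents, dim))
    y = X @ theta_star + 0.1 * rng.normal(size=n_agents)

    # Consensus step: average with neighbors only while communication is still on.
    mixed = W @ theta if t <= T_comm else theta

    # Local step: stochastic-gradient correction with a decaying step size.
    step = 1.0 / t
    residual = (np.einsum("ij,ij->i", X, mixed) - y)[:, None]
    theta = mixed - step * residual * X

errors = np.linalg.norm(theta - theta_star, axis=1)
print("Per-agent estimation error after T steps:", np.round(errors, 3))
```

- Running the sketch prints each agent's final estimation error, illustrating the qualitative trade-off discussed in the abstract: stopping communication earlier saves cost but leaves the agents' errors closer to what local data alone can achieve.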
Details
- Database :
- arXiv
- Publication Type :
- Report
- Accession number :
- edsarx.2209.06678
- Document Type :
- Working Paper