
Inaccuracy rates for distributed inference over random networks with applications to social learning

Authors:
Bajovic, Dragana
Publication Year:
2022

Abstract

This paper studies probabilistic rates of convergence for consensus+innovations-type algorithms in random, generic networks. For each node, we find a lower bound and a family of upper bounds on the large deviations rate function, thus enabling the computation of exponential convergence rates for events of interest on the iterates. Relevant applications include error exponents in distributed hypothesis testing, rates of convergence of beliefs in social learning, and inaccuracy rates in distributed estimation. The bounds on the rate function have a very particular form at each node: they are constructed as the convex envelope between the rate function of the hypothetical fusion center and the rate function corresponding to a certain topological mode of the node's presence. We further show tightness of the discovered bounds for several cases, such as pendant nodes and regular networks, thus establishing the first proof of the large deviations principle for consensus+innovations and social learning in random networks.

Comment: 43 pages, 3 figures
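The abstract does not spell out the recursion it analyzes. As a hedged illustration only, the following sketch shows the generic shape of a consensus+innovations iterate over a random network: each node mixes its estimate with those of its current random neighbors (consensus) and takes a decaying step toward its fresh local observation (innovation). All names, step sizes, and the Erdős–Rényi link model here are assumptions for illustration, not details taken from the paper.

```python
import numpy as np

def consensus_innovations(y, link_prob=0.5, beta=0.2, seed=0):
    """Generic consensus+innovations recursion on a random network (illustrative sketch).

    y : (steps, n) array of per-node noisy observations of a common scalar.
    At each iteration k, an independent Erdos-Renyi graph is drawn, and every
    node i updates
        x_i <- x_i + (beta/n) * sum_j A_ij (x_j - x_i) + a_k * (y_k,i - x_i),
    with decaying innovation gain a_k = 1/(k+1). Parameter choices are
    illustrative assumptions, not the paper's.
    """
    rng = np.random.default_rng(seed)
    steps, n = y.shape
    x = np.zeros(n)  # initial iterates at every node
    for k in range(steps):
        # Random network: each undirected link is present independently.
        upper = np.triu(rng.random((n, n)) < link_prob, 1)
        A = upper | upper.T
        # Consensus term: graph-Laplacian pull toward current neighbors.
        consensus = A @ x - A.sum(axis=1) * x
        # Innovation term: decaying step toward the fresh observation y[k].
        a_k = 1.0 / (k + 1)
        x = x + (beta / n) * consensus + a_k * (y[k] - x)
    return x
```

With the decaying gain, each node's iterate behaves like a network-assisted running mean of the observations, so all nodes concentrate around the true parameter; the paper's rate functions quantify exactly how fast the probability of large deviations from that limit decays.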

Details

Database:
arXiv
Publication Type:
Report
Accession number:
edsarx.2208.05236
Document Type:
Working Paper