1. On Learning Parametric Distributions from Quantized Samples
- Authors
- Septimia Sarbu and Abdellatif Zaidi
- Subjects
- Computer Science - Machine Learning (cs.LG); Computer Science - Information Theory (cs.IT); Mathematics - Statistics Theory (math.ST). Keywords: discrete mathematics, generalization, information theory, minimax, Fisher information, independent samples, parametric statistics
- Abstract
We consider the problem of learning parametric distributions from their quantized samples in a network. Specifically, $n$ agents or sensors observe independent samples of an unknown parametric distribution, and each uses $k$ bits to describe its observed sample to a central processor whose goal is to estimate the unknown distribution. First, we establish a generalization of the well-known van Trees inequality to general $L_p$-norms, with $p > 1$, in terms of Generalized Fisher information. Then, we develop minimax lower bounds on the estimation error for two losses: general $L_p$-norms and the related Wasserstein loss from optimal transport.
- Comments
- Short version accepted for publication at the IEEE International Symposium on Information Theory (ISIT) 2021; this version contains the detailed proofs with some minor corrections
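To make the setting in the abstract concrete, here is a minimal toy sketch (not from the paper) of estimation from quantized samples with $k = 1$ bit per agent: each agent observes $X_i \sim \mathcal{N}(\theta, 1)$ and reports only the sign bit $b_i = \mathbf{1}\{X_i > 0\}$; since $\mathbb{P}(X_i > 0) = \Phi(\theta)$, the central processor can invert the empirical bit frequency through the standard normal CDF. The function name and parameter values are illustrative choices, not the paper's construction.

```python
import random
import statistics

def quantized_mean_estimate(theta=0.7, n=200_000, seed=0):
    """Toy 1-bit distributed estimation sketch.

    Each of n agents observes X_i ~ N(theta, 1) and sends the single
    bit b_i = 1{X_i > 0} to a central processor (k = 1 bit per sample).
    Because P(X_i > 0) = Phi(theta), the processor estimates theta by
    pushing the empirical bit frequency through the inverse normal CDF.
    """
    rng = random.Random(seed)
    bits = [1 if rng.gauss(theta, 1) > 0 else 0 for _ in range(n)]
    p_hat = sum(bits) / n  # empirical frequency of positive samples
    return statistics.NormalDist().inv_cdf(p_hat)

est = quantized_mean_estimate()
```

The quantization costs accuracy relative to observing the raw samples: the 1-bit estimator's asymptotic variance exceeds the unquantized sample mean's by the factor $\Phi(\theta)(1-\Phi(\theta))/\varphi(\theta)^2$, which is the kind of gap the paper's minimax lower bounds quantify.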
- Published
- 2021