An Information-Theoretic Analysis of the Cost of Decentralization for Learning and Inference under Privacy Constraints.
- Source : Entropy. Apr 2022, Vol. 24, Issue 4, p485. 10p.
- Publication Year : 2022
Abstract
- In vertical federated learning (FL), the features of a data sample are distributed across multiple agents. As such, inter-agent collaboration can be beneficial not only during the learning phase, as is the case for standard horizontal FL, but also during the inference phase. A fundamental theoretical question in this setting is how to quantify the cost, or performance loss, of decentralization for learning and/or inference. In this paper, we study general supervised learning problems with any number of agents, and provide a novel information-theoretic quantification of the cost of decentralization in the presence of privacy constraints on inter-agent communication within a Bayesian framework. The cost of decentralization for learning and/or inference is shown to be quantified in terms of conditional mutual information terms involving features and label variables. [ABSTRACT FROM AUTHOR]
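The abstract states that the cost of decentralization is expressed through conditional mutual information terms involving features and labels. A minimal sketch of the kind of quantity involved is a plug-in estimate of I(X2; Y | X1) from discrete samples, where X1 and X2 are features held by two different agents and Y is the label; the function name, variable names, and toy distribution below are illustrative, not taken from the paper.

```python
# Hedged sketch (not the paper's method): a plug-in estimate of the
# conditional mutual information I(X2; Y | X1) in bits, the type of
# term the abstract says quantifies the cost of decentralization.
from collections import Counter
from math import log2

def conditional_mutual_information(samples):
    """Plug-in estimate of I(X2; Y | X1) in bits from (x1, x2, y) tuples."""
    n = len(samples)
    p_xyz = Counter(samples)                         # counts of (x1, x2, y)
    p_x1 = Counter(s[0] for s in samples)            # counts of x1
    p_x1x2 = Counter((s[0], s[1]) for s in samples)  # counts of (x1, x2)
    p_x1y = Counter((s[0], s[2]) for s in samples)   # counts of (x1, y)
    cmi = 0.0
    for (x1, x2, y), c in p_xyz.items():
        p_joint = c / n
        # I(X2; Y | X1) = sum p(x1,x2,y) * log2[ p(x1,x2,y) p(x1)
        #                                        / (p(x1,x2) p(x1,y)) ]
        cmi += p_joint * log2(
            (p_joint * (p_x1[x1] / n))
            / ((p_x1x2[(x1, x2)] / n) * (p_x1y[(x1, y)] / n))
        )
    return cmi

# Toy data: Y equals X2 exactly and X1 is independent of both,
# so X2 carries one full bit about Y even after conditioning on X1.
samples = [(x1, x2, x2) for x1 in (0, 1) for x2 in (0, 1)]
print(round(conditional_mutual_information(samples), 6))  # → 1.0
```

In this toy example, an agent holding only X1 loses exactly I(X2; Y | X1) = 1 bit of predictive information by not collaborating with the agent holding X2, which is the flavor of performance-loss quantification the abstract describes.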
- Subjects :
- *COST analysis
- *SUPERVISED learning
- *LEARNING problems
- *PRIVACY
Details
- Language : English
- ISSN : 1099-4300
- Volume : 24
- Issue : 4
- Database : Academic Search Index
- Journal : Entropy
- Publication Type : Academic Journal
- Accession number : 156531852
- Full Text : https://doi.org/10.3390/e24040485