1. Decentralized and Incentivized Federated Learning: A Blockchain-Enabled Framework Utilising Compressed Soft-Labels and Peer Consistency
- Author
Witt, Leon; Zafar, Usama; Shen, Kuoyeh; Sattler, Felix; Li, Dan; Wang, Songtao; Samek, Wojciech
- Abstract
Federated Learning (FL) has emerged as a powerful paradigm in Artificial Intelligence, facilitating the parallel training of Artificial Neural Networks on edge devices while safeguarding data privacy. Nonetheless, to encourage widespread adoption, Federated Learning Frameworks (FLFs) must tackle (i) the power imbalance between a central authority and its participants, and (ii) the challenge of equitably measuring and incentivizing contributions. Existing approaches to decentralize and incentivize FL processes are hindered by (i) computational overhead and (ii) uncertainty in contribution assessment (Witt et al. 2023), limiting FL's scalability beyond use cases where trust between participants and the server is established. This work introduces a cutting-edge, blockchain-enabled federated learning framework that incorporates Federated Knowledge Distillation (FD) with compressed 1-bit soft-labels, aggregated through a smart contract. Furthermore, we present the Peer Truth Serum for Federated Distillation (PTSFD), which cultivates an incentive-compatible ecosystem by rewarding honest participation based on an implicit yet effective comparison of worker contributions. The primary innovation stems from its lightweight architecture that simultaneously promotes decentralization and incentivization, addressing critical challenges in contemporary FL approaches.
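The abstract describes two mechanisms: 1-bit compression of soft-labels that are then aggregated via a smart contract, and a peer-consistency reward in the spirit of the Peer Truth Serum. The Python sketch below is only a minimal illustration of these ideas, not the paper's protocol or smart-contract logic; the thresholding rule in `compress_soft_labels`, the random-peer agreement rate in `peer_consistency_reward`, and all names are illustrative assumptions.

```python
# Illustrative sketch (assumptions, not the authors' implementation):
# (i) quantize per-class soft-labels to 1 bit, (ii) reward each worker by how
# often its 1-bit vote agrees with a randomly drawn peer on the same sample.
import numpy as np

rng = np.random.default_rng(0)

def compress_soft_labels(soft_labels: np.ndarray) -> np.ndarray:
    """1-bit quantization: class scores above the per-sample mean -> 1, else 0."""
    return (soft_labels > soft_labels.mean(axis=1, keepdims=True)).astype(np.uint8)

def peer_consistency_reward(votes: np.ndarray) -> np.ndarray:
    """Agreement rate of each worker with a randomly chosen peer, per sample."""
    n_workers, n_samples, _ = votes.shape
    rewards = np.zeros(n_workers)
    for w in range(n_workers):
        peers = rng.choice([p for p in range(n_workers) if p != w], size=n_samples)
        agree = (votes[w] == votes[peers, np.arange(n_samples)]).all(axis=1)
        rewards[w] = agree.mean()
    return rewards

# Toy example: 3 workers predict soft-labels for 5 public samples over 4 classes.
soft = rng.dirichlet(np.ones(4), size=(3, 5))          # shape (workers, samples, classes)
bits = np.stack([compress_soft_labels(s) for s in soft])
print(peer_consistency_reward(bits))
```

In the framework described above, an aggregation step like this would run inside a smart contract over the submitted 1-bit labels; the simple agreement-rate reward here stands in for the paper's Peer Truth Serum scoring, which compares contributions implicitly rather than against a ground truth.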
- Published
- 2024