
FedCRMW: Federated model ownership verification with compression-resistant model watermarking.

Authors :
Nie, Hewang
Lu, Songfeng
Source :
Expert Systems with Applications, Vol. 249, Part C, Sep 2024.
Publication Year :
2024

Abstract

Federated Learning is a collaborative machine learning paradigm that allows models to be trained on decentralized data while preserving data privacy. It has gained significant attention due to its potential applications in various domains. However, protecting model copyright in the Federated Learning setting has become a critical concern. In this paper, we propose a novel watermarking framework called FedCRMW (Federated Learning Compression-Resistant Model Watermark) to address the challenge of model copyright protection in Federated Learning. FedCRMW embeds unique watermarks into client-contributed models, ensuring ownership, integrity, and authenticity. The framework leverages client-specific identifiers and exclusive logos to construct trigger sets for watermark embedding, enhancing security and traceability. A key advantage of FedCRMW is its optimization for the data compression commonly applied in the Federated Learning scenario: by using compressed data inputs for copyright verification, it achieves an efficient watermark validation process and reduces communication and storage overheads. Experimental results demonstrate the effectiveness of FedCRMW in terms of watermark success rate, imperceptibility, robustness against attacks, and resistance to model compression and pruning. Compared to existing watermarking methods, FedCRMW exhibits superior performance in the Federated Learning context.

• FedCRMW: Robust watermarking for Federated Learning models.
• Novel trigger dataset construction scheme for watermarking.
• Enhanced robustness with feature-consistent training.

[ABSTRACT FROM AUTHOR]
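The abstract describes two mechanisms: building client-specific trigger sets (a client identifier plus an exclusive logo pattern embedded into inputs) and verifying ownership on compressed inputs. The sketch below illustrates that general backdoor-style trigger-set idea in plain NumPy; it is a minimal illustration under assumed details, not the paper's actual method — the function names, the logo-overlay scheme, and the use of average pooling as a stand-in for lossy compression are all assumptions.

```python
import numpy as np

def make_trigger_set(images, client_id, logo, target_label):
    """Overlay a client-specific logo onto copies of the images to form
    trigger inputs. Seeding placement with the client identifier makes
    each client's trigger set unique and traceable (illustrative choice)."""
    rng = np.random.default_rng(abs(hash(client_id)) % (2**32))
    h, w = logo.shape
    triggered = images.copy()
    for img in triggered:
        y = rng.integers(0, img.shape[0] - h + 1)
        x = rng.integers(0, img.shape[1] - w + 1)
        img[y:y + h, x:x + w] = logo
    labels = np.full(len(images), target_label)
    return triggered, labels

def compress(images, factor=2):
    """Simulate lossy compression with average pooling — a stand-in for
    the compressed inputs the paper uses during verification."""
    n, H, W = images.shape
    return images.reshape(n, H // factor, factor,
                          W // factor, factor).mean(axis=(2, 4))

def watermark_success_rate(predict, trigger_images, trigger_labels):
    """Fraction of *compressed* trigger inputs mapped to the target label."""
    preds = predict(compress(trigger_images))
    return float(np.mean(preds == trigger_labels))

# Usage sketch: a dummy classifier that fires on any nonzero (logo) pixel.
images = np.zeros((4, 8, 8))
logo = np.ones((2, 2))
trig, labels = make_trigger_set(images, "client-0", logo, target_label=7)
predict = lambda xs: np.array([7 if x.max() > 0 else 0 for x in xs])
rate = watermark_success_rate(predict, trig, labels)  # 1.0 for this toy model
```

Verification here needs only the small compressed trigger inputs and the suspect model's predictions, which is the source of the communication and storage savings the abstract claims.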

Details

Language :
English
ISSN :
09574174
Volume :
249
Database :
Academic Search Index
Journal :
Expert Systems with Applications
Publication Type :
Academic Journal
Accession number :
176785323
Full Text :
https://doi.org/10.1016/j.eswa.2024.123776