Privacy vs. utility in federated learning : an experimental analysis of noise injection techniques



Publisher

Institute of Electrical and Electronics Engineers

Abstract

Federated Learning (FL) enables decentralized model training while preserving the privacy of the underlying individual datasets. FL can therefore address intrinsically privacy-sensitive challenges in domains such as healthcare and finance. However, privacy preservation usually comes at a cost to the usefulness (i.e., utility) of the information. The research problem is how to optimize this inversely proportional trade-off between privacy and utility. This study presents an experimental comparative analysis, in a synthetic healthcare setting, of different noise types (Gaussian, Laplacian, Poisson, Uniform, and Exponential) injected on the client side at the input-feature level prior to local training to enhance privacy in FL. We explore the impact of these noise types on the privacy–utility trade-off in FL data. The findings indicate that Laplacian, Poisson, and Exponential noise provide stronger obfuscation, which often comes at the cost of utility; this confirms and amplifies the trade-off between maintaining the usefulness of the data and preserving its privacy. More importantly, the findings also show that Gaussian noise generally offers the best trade-off between privacy and utility on this task, suggesting a practical default for privacy-aware FL in healthcare-like environments.
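The abstract's core mechanism — perturbing each client's input features with one of five noise families before local training — can be illustrated with a minimal NumPy sketch. This is an illustrative reconstruction, not the authors' code; the function name, `scale` parameter, and the zero-centring of the Poisson and Exponential draws are assumptions.

```python
import numpy as np

def add_noise(features, noise_type="gaussian", scale=0.1, rng=None):
    """Hypothetical sketch: client-side input-feature noise injection.

    The five noise families mirror those compared in the paper; the
    `scale` parameter and centring choices are illustrative assumptions.
    """
    rng = np.random.default_rng() if rng is None else rng
    if noise_type == "gaussian":
        noise = rng.normal(0.0, scale, size=features.shape)
    elif noise_type == "laplacian":
        noise = rng.laplace(0.0, scale, size=features.shape)
    elif noise_type == "poisson":
        # Poisson draws are non-negative integers; subtract the mean
        # (lambda = scale) so the perturbation is roughly zero-centred.
        noise = rng.poisson(scale, size=features.shape) - scale
    elif noise_type == "uniform":
        noise = rng.uniform(-scale, scale, size=features.shape)
    elif noise_type == "exponential":
        # Exponential draws are non-negative; subtract the mean (scale).
        noise = rng.exponential(scale, size=features.shape) - scale
    else:
        raise ValueError(f"unknown noise type: {noise_type}")
    return features + noise

# Example: perturb one client's feature matrix before local training.
X = np.ones((4, 3))
X_noisy = add_noise(X, "gaussian", scale=0.05, rng=np.random.default_rng(0))
```

In an FL round, each client would apply such a perturbation to its local features before fitting its model, so raw values never leave the device; the aggregator only ever sees model updates trained on the obfuscated data.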

Keywords

Data poisoning, Differential privacy, Federated learning, Malicious data injection, Privacy-utility trade-off

Sustainable Development Goals

SDG-09: Industry, innovation and infrastructure

Citation

Leope, N.R., Eloff, J. & Dlamini, M.T. 2025, 'Privacy vs. utility in federated learning: an experimental analysis of noise injection techniques', IEEE Access, vol. 13, pp. 198623-198648. DOI: 10.1109/ACCESS.2025.36355320.