Privacy vs. utility in federated learning: an experimental analysis of noise injection techniques

dc.contributor.authorLeope, Neo R.
dc.contributor.authorEloff, Jan H.P.
dc.contributor.authorDlamini, Moses Thandokuhle
dc.contributor.emailjan.eloff@up.ac.za
dc.date.accessioned2026-03-31T06:40:44Z
dc.date.available2026-03-31T06:40:44Z
dc.date.issued2025-11-20
dc.description.abstractFederated Learning (FL) enables decentralized model training while maintaining the privacy of the underlying individual datasets. FL can therefore address intrinsically privacy-sensitive challenges in domains such as healthcare and finance. However, privacy preservation usually comes with a trade-off against the usefulness (i.e., utility) of the information. The research problem is how to optimize this inversely proportional trade-off between privacy and utility. This study presents an experimental comparative analysis, in a synthetic healthcare setting, of different noise types (i.e., Gaussian, Laplacian, Poisson, Uniform, and Exponential) injected on the client side at the input-feature level prior to local training to enhance privacy in FL. We explore the impact of these noise types on the privacy–utility trade-off in FL data. The findings indicate that Laplacian, Poisson, and Exponential noise provide stronger obfuscation, which often comes at the cost of utility. This confirms and amplifies the trade-off between maintaining the usefulness of the data and preserving its privacy. More importantly, the findings also show that Gaussian noise generally offers the best trade-off between privacy and utility on this task, suggesting a practical default for privacy-aware FL in healthcare-like environments.
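The abstract describes client-side noise injection at the input-feature level before local training. A minimal NumPy sketch of that idea follows, assuming a single per-distribution `scale` parameter and zero-centring of the non-negative distributions; the paper's exact calibration and any differential-privacy accounting are not reproduced here, and the function name `inject_noise` is illustrative.

```python
import numpy as np

def inject_noise(X, noise_type="gaussian", scale=0.1, rng=None):
    """Perturb input features X on the client before local FL training.

    noise_type: "gaussian", "laplacian", "poisson", "uniform", or
    "exponential" (the five distributions compared in the study).
    scale: illustrative magnitude parameter (assumed, not the paper's
    calibration): std dev for Gaussian, diversity for Laplacian, rate/mean
    for Poisson and Exponential, half-width for Uniform.
    """
    rng = rng or np.random.default_rng(0)
    X = np.asarray(X, dtype=float)
    if noise_type == "gaussian":
        noise = rng.normal(0.0, scale, X.shape)
    elif noise_type == "laplacian":
        noise = rng.laplace(0.0, scale, X.shape)
    elif noise_type == "poisson":
        # Poisson draws are non-negative integers; subtract the mean (lam)
        # so the perturbation is roughly zero-centred.
        noise = rng.poisson(scale, X.shape) - scale
    elif noise_type == "uniform":
        noise = rng.uniform(-scale, scale, X.shape)
    elif noise_type == "exponential":
        # Exponential draws are non-negative; centre them likewise.
        noise = rng.exponential(scale, X.shape) - scale
    else:
        raise ValueError(f"unknown noise type: {noise_type}")
    return X + noise
```

In an FL round, each client would apply this to its local features before fitting, so the raw values never influence the shared model directly.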
dc.description.departmentComputer Science
dc.description.librarianam2026
dc.description.sdgSDG-09: Industry, innovation and infrastructure
dc.description.urihttps://ieeexplore.ieee.org/xpl/RecentIssue.jsp?punumber=6287639
dc.identifier.citationLeope, N.R., Eloff, J., Dlamini, M.T. 2025, 'Privacy vs. utility in federated learning: an experimental analysis of noise injection techniques', IEEE Access, vol. 13, pp. 198623-198648. DOI: 10.1109/ACCESS.2025.3635532.
dc.identifier.issn2169-3536 (online)
dc.identifier.other10.1109/ACCESS.2025.3635532
dc.identifier.urihttp://hdl.handle.net/2263/109356
dc.language.isoen
dc.publisherInstitute of Electrical and Electronics Engineers
dc.rights© The Author(s). This article is licensed under a Creative Commons Attribution-NonCommercial-NoDerivatives 4.0 International License.
dc.subjectData poisoning
dc.subjectDifferential privacy
dc.subjectFederated learning
dc.subjectMalicious data injection
dc.subjectPrivacy-utility trade-off
dc.titlePrivacy vs. utility in federated learning: an experimental analysis of noise injection techniques
dc.typeArticle

Files

Original bundle

Name:
Leope_Privacy_2025.pdf
Size:
11.3 MB
Format:
Adobe Portable Document Format
Description:
Article

License bundle

Name:
license.txt
Size:
1.71 KB
Format:
Description:
Item-specific license agreed upon to submission