Fairness-Aware Differential Privacy: A Fairly Proportional Noise Mechanism
Summary
This research proposes a Fairly Proportional Noise Mechanism (FPNM) to address a fairness problem in differential privacy (DP, a technique that adds random noise to data to protect individual privacy while still permitting statistical analysis). Traditional DP methods add noise uniformly, without regard to fairness, so the noise can burden different groups of people unevenly, especially in decision-making and learning tasks. FPNM instead adjusts the noise according to its direction and its size relative to the underlying data values, reducing unfairness by roughly 17-19% in experiments while preserving the privacy guarantees.
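The core contrast can be sketched as follows. This is an illustrative assumption, not the paper's exact FPNM: uniform Laplace noise has the same scale for every value, so small counts suffer much larger relative distortion than large ones, while a value-proportional scale (a hypothetical stand-in for FPNM's size-aware adjustment) keeps relative error more comparable across groups.

```python
import numpy as np

rng = np.random.default_rng(0)

def laplace_mechanism(values, epsilon, sensitivity=1.0):
    """Standard DP release: uniform Laplace noise, scale = sensitivity / epsilon.
    The same absolute noise hits a count of 10 and a count of 10,000 alike."""
    scale = sensitivity / epsilon
    return values + rng.laplace(0.0, scale, size=values.shape)

def proportional_noise(values, epsilon, sensitivity=1.0, alpha=0.5):
    """Illustrative sketch (NOT the paper's exact mechanism): the noise scale
    grows with each value's magnitude, so relative error is more even across
    small and large groups. `alpha` (hypothetical) controls how strongly the
    scale tracks the data."""
    base = sensitivity / epsilon
    rel = np.abs(values) / (np.abs(values).max() + 1e-12)
    scale = base * (1.0 + alpha * rel)
    return values + rng.laplace(0.0, 1.0, size=values.shape) * scale

# Example: a small-group and a large-group count released under each scheme.
counts = np.array([12.0, 9800.0])
uniform_release = laplace_mechanism(counts, epsilon=1.0)
proportional_release = proportional_noise(counts, epsilon=1.0)
```

Note this sketch only illustrates the fairness intuition; calibrating a data-dependent noise scale so the formal DP guarantee still holds is precisely the technical contribution the paper's mechanism would have to supply.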
Classification
Original source: http://ieeexplore.ieee.org/document/11293801
First tracked: March 16, 2026 at 08:02 PM
Classified by LLM (prompt v3) · confidence: 92%