IEEE Open Journal of the Computer Society (Jan 2022)
Local Differential Privacy for Person-to-Person Interactions
Abstract
Currently, many global organizations collect personal data for marketing, recommendation system improvement, and other purposes. Some organizations collect personal data securely based on a technique known as $\epsilon$-local differential privacy (LDP). Under LDP, a privacy budget is allocated to each user in advance. Each time a user's data are collected, part of that user's privacy budget is consumed, and privacy is protected by ensuring that the remaining budget never falls below zero. Existing research and organizations assume that each individual's data are completely unrelated to other individuals' data. However, this assumption does not hold when interaction data between users are collected; in that case, each user's privacy is not sufficiently protected because the privacy budget is, in effect, overspent. In this study, we clarify this problem of local differential privacy for person-to-person interactions and propose a mechanism that satisfies LDP in the person-to-person interaction scenario. Mathematical analysis and experimental results show that, compared with existing methods, the proposed mechanism maintains high data utility while ensuring LDP.
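To make the budget-accounting issue concrete, the following is a minimal sketch, not the paper's proposed mechanism: it combines standard binary randomized response (a textbook $\epsilon$-LDP primitive) with a hypothetical `BudgetTracker` helper, and illustrates why a single report about an interaction must charge the budgets of both participants, whereas the usual independent-data assumption charges only the reporter.

```python
import math
import random


def randomized_response(bit: int, epsilon: float) -> int:
    """Classic binary randomized response: report the true bit with
    probability e^eps / (e^eps + 1), otherwise report the flipped bit.
    This mechanism satisfies eps-LDP for a single binary value."""
    p_true = math.exp(epsilon) / (math.exp(epsilon) + 1.0)
    return bit if random.random() < p_true else 1 - bit


class BudgetTracker:
    """Per-user privacy budget accounting (hypothetical helper for
    illustration only; the paper's actual mechanism differs)."""

    def __init__(self, total_budget: float):
        self.remaining = total_budget

    def spend(self, epsilon: float) -> bool:
        """Deduct epsilon if possible; refuse if it would overspend."""
        if self.remaining - epsilon < 0:
            return False
        self.remaining -= epsilon
        return True


# An interaction record involves two users, so releasing one noisy report
# consumes budget from BOTH participants. Charging only the reporter (as in
# the independent-data setting) would silently overspend the other user's
# budget, which is the problem the abstract describes.
alice, bob = BudgetTracker(1.0), BudgetTracker(1.0)
eps = 0.5
if alice.spend(eps) and bob.spend(eps):
    noisy_bit = randomized_response(1, eps)  # 1 = "Alice interacted with Bob"
    print("released report:", noisy_bit)
    print("remaining budgets:", alice.remaining, bob.remaining)
```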
Keywords