Assessing Local Differential Privacy for Compliance with the Personal Data Protection Law in Integrated Data Systems


Abstract

Organizations increasingly integrate and share person-level data across internal platforms and external partners to enable analytics, digital services, and evidence-based decision making. However, combining quasi-identifiers across systems and releases can enable re-identification via linkage attacks, creating regulatory compliance and trust risks. This paper proposes an operational methodology for (i) identifying direct identifiers and quasi-identifiers (QIs), (ii) quantifying baseline re-identification risk using uniqueness and prosecutor-style risk proxies, and (iii) applying Local Differential Privacy (LDP) to reduce linkability prior to data sharing. We implement categorical LDP using a Generalized Randomized Response (GRR) mechanism and evaluate privacy-utility trade-offs through a sensitivity analysis over the privacy budget ε. Utility is quantified using (a) distributional distortion (total variation distance) and (b) downstream task performance (job-title classification). We further address reviewer concerns by discussing repeated releases and privacy accounting as mitigations for longitudinal deployments, and by improving figure readability and updating related work with recent studies.
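The GRR mechanism named in the abstract has a standard form: with k categories and privacy budget ε, a respondent reports their true value with probability p = e^ε / (e^ε + k − 1) and any other value with probability 1 / (e^ε + k − 1); the data collector then debiases observed frequencies. The sketch below illustrates that standard construction only; function names are illustrative and the paper's actual implementation is not reproduced here.

```python
import math
import random

def grr_perturb(value, domain, epsilon, rng=random):
    """Generalized Randomized Response: keep the true category with
    probability p = e^eps / (e^eps + k - 1); otherwise report a
    uniformly random *other* category."""
    k = len(domain)
    p = math.exp(epsilon) / (math.exp(epsilon) + k - 1)
    if rng.random() < p:
        return value
    return rng.choice([v for v in domain if v != value])

def grr_estimate_frequencies(reports, domain, epsilon):
    """Unbiased frequency estimate correcting for GRR noise:
    f_hat = (observed_freq - q) / (p - q), with q the flip probability."""
    k = len(domain)
    n = len(reports)
    p = math.exp(epsilon) / (math.exp(epsilon) + k - 1)
    q = 1.0 / (math.exp(epsilon) + k - 1)
    counts = {v: 0 for v in domain}
    for r in reports:
        counts[r] += 1
    return {v: (counts[v] / n - q) / (p - q) for v in domain}

def total_variation_distance(f_true, f_est, domain):
    """TVD between two distributions over the same domain, as used
    for the abstract's distributional-distortion metric."""
    return 0.5 * sum(abs(f_true[v] - f_est[v]) for v in domain)
```

Lower ε means p approaches uniform (stronger privacy, larger TVD against the true distribution); higher ε preserves utility at the cost of weaker plausible deniability, which is the trade-off the sensitivity analysis over ε explores.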
