In this paper we study the relationship between privacy and accuracy in the context of correlated datasets. We use a model of quantitative information flow to describe the trade-off between the privacy of individuals' data and the utility of queries over that data, by modelling the effectiveness of adversaries attempting to make inferences after a data release. We show that, where correlations exist in datasets, it is not possible to implement optimal noise-adding mechanisms that give the best possible accuracy or the best possible privacy in all situations. Finally, we illustrate the trade-off between accuracy and privacy for local and oblivious differentially private mechanisms in terms of inference attacks on medium-scale datasets.
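The privacy/accuracy trade-off for local mechanisms can be illustrated with a minimal sketch of randomized response, a standard textbook local differential privacy mechanism (this is an illustrative example, not the paper's specific construction; the function names and the debiasing helper are my own). The parameter ε controls the trade-off the abstract describes: larger ε means responses closer to the truth, so better accuracy but weaker privacy.

```python
import math
import random

def randomized_response(bit, epsilon):
    """Report the true bit with probability e^eps / (e^eps + 1),
    otherwise flip it. This satisfies epsilon-local differential privacy."""
    p_truth = math.exp(epsilon) / (math.exp(epsilon) + 1)
    return bit if random.random() < p_truth else 1 - bit

def debiased_mean(reports, epsilon):
    """Unbiased estimate of the true proportion of 1s from noisy reports:
    invert the known flipping probability."""
    p = math.exp(epsilon) / (math.exp(epsilon) + 1)
    observed = sum(reports) / len(reports)
    return (observed + p - 1) / (2 * p - 1)
```

Note that the ε guarantee here is per record; as the paper argues, when records are correlated, an adversary can still draw inferences about one individual from other individuals' (noisy) reports, which is why no single mechanism is optimal for both privacy and accuracy in all situations.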
| Field | Value |
|---|---|
| Title of host publication | 31st International Conference on Concurrency Theory |
| Subtitle of host publication | CONCUR 2020, September 1–4, 2020, Vienna, Austria (Virtual Conference) |
| Editors | Igor Konnov, Laura Kovács |
| Place of publication | Saarbrücken/Wadern, Germany |
| Number of pages | 18 |
| Publication status | Published - Aug 2020 |
| Event | 31st International Conference on Concurrency Theory, CONCUR 2020 - Virtual, Vienna, Austria, 1 Sep 2020 – 4 Sep 2020 |
| Series | Leibniz International Proceedings in Informatics, LIPIcs |
Bibliographical note: Copyright the Author(s) 2020. Version archived for private and non-commercial use with the permission of the author(s) and according to publisher conditions. For further rights please contact the publisher.
- Privacy/utility trade-off
- Quantitative Information Flow
- Inference attacks