A Framework for Privacy Quantification: Measuring the Impact of Privacy Techniques through Mutual Information, Distance Mapping, and Machine Learning
Abstract
In this paper, we propose to investigate how the effects of privacy techniques can be practically assessed in the specific context of data anonymization, and present some possible tools for measuring the effects of such anonymization. We develop an approach using mutual information for measuring the information content of any dataset, including over non-Euclidean data spaces, by means of mapping non-Euclidean distances to a Euclidean space. We further evaluate the proposed approach over toy datasets composed of timestamped GPS traces, and attempt to quantify the information content loss created by three state-of-the-art anonymization approaches. The results allow for an objective quantification of the effects of the k-anonymity and differential privacy algorithms, and illustrate, on the toy data used, that such privacy techniques have very non-linear effects on the information content of the data.
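To make the pipeline described in the abstract concrete, the following minimal sketch (Python with scikit-learn) maps a non-Euclidean dissimilarity over timestamped GPS points to a Euclidean embedding via multidimensional scaling, then compares a mutual information estimate before and after a simple perturbation. The combined spatio-temporal distance, the Laplace-noise perturbation standing in for a differential privacy mechanism, and the summed per-feature MI estimate are illustrative assumptions, not the implementation used in the paper.

```python
# Illustrative sketch (not the paper's implementation): embed a non-Euclidean
# distance matrix over timestamped GPS points into a Euclidean space with MDS,
# then compare a mutual information estimate before and after a crude
# Laplace-noise perturbation standing in for a differential-privacy mechanism.
import numpy as np
from sklearn.manifold import MDS
from sklearn.metrics.pairwise import haversine_distances
from sklearn.feature_selection import mutual_info_regression

rng = np.random.default_rng(0)

# Toy timestamped GPS traces: (latitude, longitude) in degrees, timestamp in hours.
n = 200
lat = 60.17 + 0.02 * rng.standard_normal(n)
lon = 24.94 + 0.02 * rng.standard_normal(n)
t = rng.uniform(0, 24, n)

def combined_distance(lat, lon, t, alpha=1.0):
    """Non-Euclidean dissimilarity: great-circle distance (km) plus a weighted
    absolute time difference (hours). Purely illustrative choice."""
    coords = np.radians(np.column_stack([lat, lon]))
    d_space = haversine_distances(coords) * 6371.0   # great-circle distance, km
    d_time = np.abs(t[:, None] - t[None, :])         # time difference, hours
    return d_space + alpha * d_time

def euclidean_embedding(D, dim=2):
    """Map a precomputed dissimilarity matrix to a Euclidean space with MDS."""
    mds = MDS(n_components=dim, dissimilarity="precomputed", random_state=0)
    return mds.fit_transform(D)

# A target attribute whose dependence on the traces we want to quantify
# (hypothetical mobility-related quantity correlated with position and time).
y = 10.0 * lat + 0.5 * t + 0.1 * rng.standard_normal(n)

# Information content of the original data: MI between the embedding and target.
X_orig = euclidean_embedding(combined_distance(lat, lon, t))
mi_orig = mutual_info_regression(X_orig, y, random_state=0).sum()

# Crude anonymization stand-in: Laplace noise on coordinates and timestamps.
scale = 0.01
lat_a = lat + rng.laplace(0, scale, n)
lon_a = lon + rng.laplace(0, scale, n)
t_a = t + rng.laplace(0, 10 * scale, n)

X_anon = euclidean_embedding(combined_distance(lat_a, lon_a, t_a))
mi_anon = mutual_info_regression(X_anon, y, random_state=0).sum()

print(f"MI (original):   {mi_orig:.3f} nats")
print(f"MI (anonymized): {mi_anon:.3f} nats")
print(f"Information content retained: {100 * mi_anon / mi_orig:.1f}%")
```

Sweeping the noise scale in such a sketch is one way to observe the kind of non-linear relationship between perturbation strength and retained information content that the paper reports on its toy data.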
Recommended Citation
Y. Miche et al., "A Framework for Privacy Quantification: Measuring the Impact of Privacy Techniques through Mutual Information, Distance Mapping, and Machine Learning," Cognitive Computation, vol. 11, no. 2, pp. 241-261, Springer, Apr. 2019.
The definitive version is available at https://doi.org/10.1007/s12559-018-9604-7
Department(s)
Engineering Management and Systems Engineering
Keywords and Phrases
Data privacy; Distance mapping; Mutual information; Non-Euclidean data; Privacy quantification
International Standard Serial Number (ISSN)
1866-9964; 1866-9956
Document Type
Article - Journal
Document Version
Citation
File Type
text
Language(s)
English
Rights
© 2024 Springer, All rights reserved.
Publication Date
15 Apr 2019
Comments
Horizon 2020 Framework Programme, Grant 737422