[Resource Topic] 2008/347: Information Leakage in Optimal Anonymized and Diversified Data

Welcome to the resource topic for 2008/347

Title:
Information Leakage in Optimal Anonymized and Diversified Data

Authors: Chengfang Fang, Ee-Chien Chang

Abstract:

To reconcile the demands of information dissemination and privacy preservation, a popular approach generalizes the attribute values in the dataset, for example by dropping the last digit of the postal code, so that the published dataset meets certain privacy requirements, such as the notions of k-anonymity and l-diversity. On the other hand, the published dataset should remain useful and not be over-generalized. Hence it is desirable to disseminate a dataset with high “usefulness”, measured by a utility function. This leads to a generic framework whereby the optimal dataset (w.r.t. the utility function), among all the generalized datasets that meet certain privacy requirements, is chosen to be disseminated. In this paper, we observe that the fact that a generalized dataset is optimal may leak information about the original. Thus, an adversary who is aware of how the dataset is generalized may be able to derive more information than the privacy requirements are intended to allow. This observation challenges the widely adopted approach that treats the generalization process as an optimization problem. We illustrate the observation by giving counter-examples in the context of k-anonymity and l-diversity.
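As an explanation aid (not code from the paper), below is a minimal Python sketch of the generalization-as-optimization framework the abstract describes: a quasi-identifier column of postal codes is generalized by masking trailing digits, and the least-generalized (highest-utility) version that satisfies k-anonymity is the one that would be published. The column values, the utility measure (fewest masked digits), and the helper names are illustrative assumptions.

```python
from collections import Counter

def generalize(postal_codes, digits_dropped):
    """Generalize by replacing the last `digits_dropped` digits with '*'."""
    return [code[:len(code) - digits_dropped] + "*" * digits_dropped
            for code in postal_codes]

def is_k_anonymous(values, k):
    """k-anonymity on this single quasi-identifier: every generalized
    value must be shared by at least k records."""
    return all(count >= k for count in Counter(values).values())

def optimal_k_anonymous(postal_codes, k):
    """Return the least-generalized (highest-utility) k-anonymous version."""
    for digits_dropped in range(len(postal_codes[0]) + 1):
        candidate = generalize(postal_codes, digits_dropped)
        if is_k_anonymous(candidate, k):
            return candidate, digits_dropped
    return None, None

# Toy quasi-identifier column: postal codes of five records (made up).
codes = ["118402", "118417", "118423", "560103", "560121"]
published, dropped = optimal_k_anonymous(codes, k=2)
print(published, "after masking", dropped, "digit(s)")
```

The paper's observation applies on top of such a procedure: an adversary who knows the published table is the *optimal* one also learns that every less-generalized candidate failed the privacy requirement, and that extra fact can itself reveal information about the original values.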

ePrint: https://eprint.iacr.org/2008/347

See all topics related to this paper.

Feel free to post resources that are related to this paper below.

Example resources include: implementations, explanation materials, talks, slides, links to previous discussions on other websites.

For more information, see the rules for Resource Topics.