[Resource Topic] 2017/658: Privacy for Targeted Advertising

Welcome to the resource topic for 2017/658

Privacy for Targeted Advertising

Authors: Avradip Mandal, John Mitchell, Hart Montgomery, Arnab Roy


In the past two decades, targeted online advertising has led to massive data collection, aggregation, and exchange. This infrastructure raises significant privacy concerns. While several prominent theories of data privacy have been proposed over the same period, these notions have limited application to advertising ecosystems. Differential privacy, the most robust of them, is inherently inapplicable to queries about particular individuals in the dataset. We therefore formulate a new definition of privacy for accessing private information about unknown individuals identified by some random token. Unlike most current privacy definitions, ours takes probabilistic prior information into account and is intended to reflect the use of aggregated web information for targeted advertising. We explain how our theory captures the natural expectation of privacy in the advertising setting and avoids the limitations of existing alternatives. However, although we can construct artificial databases that satisfy our notion of privacy together with reasonable utility, we have no evidence that real-world databases can be sanitized while preserving reasonable utility. In fact, we offer real-world evidence that adherence to our notion of privacy almost completely destroys utility. Our results suggest that a significant theoretical advance or a change in infrastructure is needed to obtain rigorous privacy guarantees in the digital advertising ecosystem.
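The abstract's key idea, a privacy notion that accounts for an adversary's probabilistic prior about a token-identified individual, can be illustrated with a toy Bayesian sketch. This is not the paper's formalism; all numbers and the ad-serving model below are hypothetical, chosen only to show how observing a targeted ad can move an adversary's belief away from the prior, which is the kind of leakage a prior-aware definition would bound.

```python
from fractions import Fraction

# Toy illustration (not the paper's construction): an adversary holds a
# prior on a sensitive attribute of a user known only by a random token,
# observes which ad the system serves, and updates via Bayes' rule.

# Hypothetical prior: 1 in 4 users has the sensitive attribute.
prior_true = Fraction(1, 4)

# Hypothetical ad-serving behavior: ad A is shown with probability 9/10
# to users with the attribute and 3/10 to users without it.
p_ad_given_true = Fraction(9, 10)
p_ad_given_false = Fraction(3, 10)

# The adversary sees ad A served to the token; Bayes update.
p_ad = p_ad_given_true * prior_true + p_ad_given_false * (1 - prior_true)
posterior_true = p_ad_given_true * prior_true / p_ad

# Leakage here is the shift from prior to posterior belief.
leakage = posterior_true - prior_true
print(f"prior={float(prior_true):.3f} "
      f"posterior={float(posterior_true):.3f} "
      f"leakage={float(leakage):.3f}")
```

In this toy run the posterior doubles the prior (1/4 to 1/2), showing how even one targeting signal can substantially sharpen an adversary's belief about a specific token-identified individual, the setting where, as the abstract notes, differential privacy does not directly apply.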

ePrint: https://eprint.iacr.org/2017/658

See all topics related to this paper.

Feel free to post resources that are related to this paper below.

Example resources include: implementations, explanation materials, talks, slides, links to previous discussions on other websites.

For more information, see the rules for Resource Topics.