Overview

Privacy definitions provide ways of trading off the privacy of individuals in a statistical database for the utility of downstream analysis of the data. In this paper, we present Blowfish, a class of privacy definitions inspired by the Pufferfish framework that provides a rich interface for this trade-off. In particular, we allow data publishers to extend differential privacy using a policy, which specifies (a) secrets, or information that must be kept secret, and (b) constraints that may be known about the data. While the secret specification allows increased utility by lessening protection for certain individual properties, the constraint specification provides added protection against an adversary who knows correlations in the data (arising from constraints). We formalize policies and present novel algorithms that can handle general specifications of sensitive information and certain count constraints. We show that there are reasonable policies under which our privacy mechanisms for k-means clustering, histograms, and range queries introduce significantly less noise than their differentially private counterparts. We quantify the privacy-utility trade-offs for various policies analytically and empirically on real datasets.
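To make the policy idea concrete, the following is a minimal sketch (not code from the paper) of how a Blowfish-style policy could change the noise a Laplace mechanism adds to linear queries: sensitivity is computed only over pairs of values the policy declares secret, so a weaker secret specification (e.g., only adjacent values in an ordered domain) can yield much less noise for range queries. All function names, the example domain, and the query matrix below are illustrative assumptions.

```python
import numpy as np

def policy_sensitivity(F, secret_pairs):
    """L1 sensitivity of the linear query F (rows = queries, cols = domain values)
    when neighboring databases differ by changing one record's value x -> y
    for some pair (x, y) declared secret by the policy."""
    return max(np.abs(F[:, y] - F[:, x]).sum() for x, y in secret_pairs)

def laplace_mechanism(F, hist, secret_pairs, epsilon, rng=None):
    """Release F @ hist with Laplace noise scaled to the policy-specific sensitivity."""
    rng = rng or np.random.default_rng()
    sens = policy_sensitivity(F, secret_pairs)
    answer = F @ hist
    return answer + rng.laplace(scale=sens / epsilon, size=answer.shape)

# Example: prefix-sum (range) queries over an ordered domain of size 5.
domain = 5
F = np.tril(np.ones((domain, domain)))          # cumulative counts
hist = np.array([10, 7, 3, 5, 2], dtype=float)  # true histogram

all_pairs  = [(x, y) for x in range(domain) for y in range(domain) if x != y]  # every value pair secret
line_pairs = [(x, x + 1) for x in range(domain - 1)]                           # only adjacent values secret

print(policy_sensitivity(F, all_pairs))   # 4: all values mutually protected
print(policy_sensitivity(F, line_pairs))  # 1: weaker secrets -> far less noise
print(laplace_mechanism(F, hist, line_pairs, epsilon=0.5))
```

The sketch only illustrates the trade-off described above: shrinking the set of secret pairs shrinks the sensitivity, and hence the noise, for the same privacy parameter.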

Project Members at Duke

Subprojects