One-sided differential privacy
We study the problem of privacy-preserving data sharing in which only a subset of the records in a database is sensitive, possibly based on predefined privacy policies. Existing solutions, such as differential privacy (DP), are overly pessimistic, as they treat all records as sensitive. Alternatively, techniques like access control and personalized differential privacy that reveal all non-sensitive records truthfully indirectly leak whether a record is sensitive, and consequently the record's value. In this work we introduce one-sided differential privacy (OSDP), which offers provable privacy guarantees to the sensitive records. In addition, OSDP satisfies the sensitivity-masking property, which ensures that no algorithm satisfying OSDP allows an attacker to significantly decrease their uncertainty about whether a record is sensitive. We design OSDP algorithms that can truthfully release a sample of the non-sensitive records. Such algorithms can support applications that must output true data with little loss in utility, especially when working with complex data types such as images or location trajectories. Additionally, we present OSDP algorithms for answering count queries, which exploit the presence of non-sensitive records and offer up to a 6× improvement in accuracy over state-of-the-art DP solutions.
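To make the sampling idea concrete, the following is a minimal sketch of a one-sided release mechanism: every non-sensitive record is published truthfully with some probability p, while sensitive records are never published, so withholding a record only weakly signals its sensitivity status. The function name `osdp_sample`, the particular choice p = 1 − e^(−ε), and the toy records are illustrative assumptions for this sketch, not a verbatim rendering of the paper's algorithms.

```python
import math
import random


def osdp_sample(records, is_sensitive, epsilon, rng=None):
    """Sketch of a one-sided sampling release (assumed form, not the
    paper's exact algorithm).

    Each non-sensitive record is released truthfully with probability
    p = 1 - e^(-epsilon); sensitive records are never released.  With
    this choice, the probability that a record is withheld is 1 when it
    is sensitive and e^(-epsilon) when it is not, so observing that a
    record was withheld changes the odds of it being sensitive by at
    most a factor of e^epsilon.
    """
    rng = rng or random.Random()
    p = 1.0 - math.exp(-epsilon)
    return [r for r in records if not is_sensitive(r) and rng.random() < p]


# Toy usage with hypothetical records: two sensitive visits, one not.
records = [("alice", "clinic_A"), ("bob", "park"), ("carol", "clinic_B")]
sensitive = {("alice", "clinic_A"), ("carol", "clinic_B")}
sample = osdp_sample(records, lambda r: r in sensitive,
                     epsilon=1.0, rng=random.Random(7))
```

Note that, unlike a DP mechanism, the released records themselves are true tuples from the database; the randomness lies only in which non-sensitive records appear.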