Differential Privacy
Differential privacy is the de facto standard privacy notion for privacy-preserving data analysis and publishing. Roughly speaking, differential privacy means that the output distribution of an analysis remains essentially the same regardless of whether any individual joins, or refrains from joining, the dataset. This is usually achieved by adding a pre-determined amount of randomness, or “noise”, to a computation performed on the dataset. The noise makes the output insensitive to any single record, so every individual in the dataset can plausibly deny that their data was included.
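For intuition, the classic Laplace mechanism makes this concrete: to release a numeric query with ε-differential privacy, it adds noise drawn from a Laplace distribution with scale equal to the query's sensitivity divided by ε, so that for any two datasets D and D′ differing in one record, Pr[M(D) ∈ S] ≤ e^ε · Pr[M(D′) ∈ S] for every output set S. Below is a minimal sketch; the function name and the toy count query are illustrative and not taken from the publications listed here.

```python
import numpy as np

def laplace_mechanism(true_value: float, sensitivity: float, epsilon: float) -> float:
    """Release true_value with epsilon-differential privacy by adding
    Laplace noise with scale = sensitivity / epsilon."""
    scale = sensitivity / epsilon
    return true_value + np.random.laplace(loc=0.0, scale=scale)

# Example: privately release a count query. Adding or removing one
# record changes a count by at most 1, so its sensitivity is 1.
ages = [23, 35, 41, 29, 52]
true_count = sum(1 for a in ages if a >= 30)
private_count = laplace_mechanism(true_count, sensitivity=1.0, epsilon=0.5)
print(private_count)
```

Smaller ε means larger noise and stronger privacy; this is one point on the privacy-utility trade-off mentioned below.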
My research on differential privacy focuses on developing novel mechanisms with differential privacy guarantees, striking a balance among privacy, utility, and efficiency.
Related publications
- New Differential Privacy Communication Pipeline and Design Framework (Poster). In 18th Symposium on Usable Privacy and Security (SOUPS 2022), 2022.
- Differentially Private String Sanitization for Frequency-Based Mining Tasks. In IEEE International Conference on Data Mining (ICDM), 2021.