Artificial Intelligence and Cybersecurity

Theory and Applications

ISBN: 978-3-031-15030-2

Differential Privacy: An Umbrella Review

Abstract

Privacy-preserving data analysis refers to the possibility of using personal information from individuals in a completely anonymous fashion. In a statistical sense, this means that statistics and models derived and learned from data are insensitive to any individual observation. Differential privacy, as defined by Cynthia Dwork (Dwork 2006), has become a popular approach for ensuring privacy. In contrast to earlier definitions, Dwork defined differential privacy as a relative guarantee: nothing more can be learned from the data whether an individual observation is included in or excluded from the analysis. This is achieved by adding random noise whose magnitude exceeds the effect that any single participant's record can have on the result. The approach was referred to as 𝜖-differential privacy. Such an actionable definition gave practitioners more room to define how, for example, machine learning algorithms can ensure differential privacy. In this paper, we present an umbrella review of studies related to differential privacy, based on the methodology proposed by Aromataris et al. (Int J Evidence-Based Healthcare 13(3):132–140, 2015).
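The idea of adding noise larger than any single participant's effect can be sketched with the classic Laplace mechanism, which is one standard way to achieve 𝜖-differential privacy (the function name and parameters below are illustrative, not from the chapter): noise drawn from a Laplace distribution with scale equal to the query's sensitivity divided by 𝜖 is added to the true answer.

```python
import math
import random

def laplace_mechanism(true_value, sensitivity, epsilon, rng=None):
    """Return an epsilon-differentially-private estimate of true_value.

    sensitivity: the largest change in the query result that adding or
    removing one individual's record can cause (e.g. 1 for a count query).
    epsilon: the privacy budget; smaller values mean more noise.
    """
    if rng is None:
        rng = random.Random()
    scale = sensitivity / epsilon
    # Sample Laplace(0, scale) by inverse-transform sampling of a
    # uniform draw u in (-0.5, 0.5).
    u = rng.random() - 0.5
    noise = -scale * math.copysign(1.0, u) * math.log(1 - 2 * abs(u))
    return true_value + noise

# Example: privatize a count query (sensitivity 1) with epsilon = 1.
noisy_count = laplace_mechanism(100, sensitivity=1.0, epsilon=1.0)
```

Because the noise scale grows as 𝜖 shrinks, 𝜖 directly trades statistical accuracy for privacy; averaged over many independent runs, the noisy answers concentrate around the true count.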

DOI: 10.1007/978-3-031-15030-2_8