In this thesis, we focus on differential privacy and the trade-off between the privacy level and the accuracy of shared databases. Differential privacy is commonly achieved with the Laplace mechanism, in which the choice of the privacy parameter ε, also known as the privacy budget, plays a key role in the trade-off between privacy and accuracy. We aim to build a game-theoretic model for choosing this privacy budget ε while optimising the utility of the shared databases to which the differential-privacy mechanism is applied. We consider differentially private queries in the context of two models. The first is an information-theoretic query system using a discrete Laplace mechanism; its utility and leakage are quantified information-theoretically, via the min-entropy relationships between the original information, the real answer and the reported answer. The second model concerns data analysis with privacy-aware machine learning using differentially private gradient queries. We quantify the quality of the trained model by a fitness cost, a function of the differential-privacy parameters and the sizes of the distributed datasets, which captures the trade-off between privacy and utility in machine learning. Game theory is then used to analyse the utility-leakage trade-off in each of the two models.
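To make the privacy-accuracy trade-off concrete, the following is a minimal sketch of a (continuous) Laplace mechanism; the thesis itself uses a discrete variant, and the function name and parameters here are illustrative, not taken from the thesis. Noise is drawn from a Laplace distribution with scale sensitivity/ε, so a smaller privacy budget ε yields noisier, more private answers.

```python
import numpy as np

def laplace_mechanism(true_answer, sensitivity, epsilon, rng=None):
    """Illustrative (continuous) Laplace mechanism.

    Adds zero-mean Laplace noise with scale = sensitivity / epsilon.
    A smaller epsilon (privacy budget) gives a larger noise scale:
    stronger privacy, lower accuracy -- the trade-off studied here.
    """
    rng = rng or np.random.default_rng()
    scale = sensitivity / epsilon
    return true_answer + rng.laplace(loc=0.0, scale=scale)

# Example: a counting query (sensitivity 1) answered under two budgets.
rng = np.random.default_rng(0)
private_tight = laplace_mechanism(100.0, 1.0, epsilon=0.1, rng=rng)
private_loose = laplace_mechanism(100.0, 1.0, epsilon=10.0, rng=rng)
```

With ε = 0.1 the noise scale is 10, so the reported count can deviate substantially from 100; with ε = 10 the scale is 0.1 and the answer is close to the truth but offers far weaker privacy.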
History
Table of Contents
1. Introduction -- 2. Literature review -- 3. Privacy preserving federated learning -- 4. Game theory in privacy preserving machine learning -- 5. Conclusions and future research -- Appendix -- Bibliography.
Notes
Empirical thesis.
Bibliography: pages 55-62
Awarding Institution
Macquarie University
Degree Type
Thesis MRes
Degree
MRes, Macquarie University, Faculty of Science and Engineering, Department of Computing
Department, Centre or School
Department of Computing
Year of Award
2019
Principal Supervisor
Dali Kaafar
Rights
Copyright Nan Wu 2019.
Copyright disclaimer: http://mq.edu.au/library/copyright