Distributed Differentially Private Stochastic Gradient Descent: An Empirical Study

Istvan Hegedus, Mark Jelasity

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

2 Citations (Scopus)

Abstract

In fault-prone, large-scale distributed environments, stochastic gradient descent (SGD) is a popular approach to implementing machine learning algorithms. Data privacy is a key concern in such environments and is often addressed within the framework of differential privacy. However, the output quality of differentially private SGD implementations as a function of design choices has not yet been thoroughly evaluated. In this study, we examine this problem experimentally. We assume that every data record is stored by an independent node, a typical setup in networks of mobile devices or Internet of Things (IoT) applications. In this model we identify a set of possible distributed differentially private SGD implementations. In these implementations all sensitive computations are strictly local, and any public information is protected by differentially private mechanisms; personal information can therefore leak only if the corresponding node is directly compromised. We then perform a set of experiments to evaluate these implementations on several machine learning problems with both logistic regression and support vector machine (SVM) loss functions. With suitable parameter settings and algorithm choices, the differentially private variants can closely approximate the performance of the noise-free algorithm.
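The general scheme the abstract describes — a strictly local gradient computation whose published output is protected by a differentially private mechanism — can be illustrated with a minimal sketch. The clipping bound, the Laplace mechanism, and the logistic-regression gradient below are generic textbook choices for illustration; the paper's actual mechanisms and parameters may differ.

```python
import numpy as np

def dp_sgd_update(w, x, y, lr=0.1, clip=1.0, epsilon=0.5, rng=None):
    """One differentially private SGD step on a single node's record.

    The gradient is computed strictly locally; only the noisy, clipped
    update would ever leave the node. Illustrative sketch only.
    """
    rng = rng or np.random.default_rng()
    # Local logistic-loss gradient for record (x, y), with y in {-1, +1}.
    margin = y * np.dot(w, x)
    grad = -y * x / (1.0 + np.exp(margin))
    # Clip the L1 norm to bound the per-record sensitivity.
    norm = np.linalg.norm(grad, ord=1)
    if norm > clip:
        grad = grad * (clip / norm)
    # Laplace noise calibrated to L1 sensitivity 2*clip gives epsilon-DP
    # for the released update (standard Laplace-mechanism calibration).
    noise = rng.laplace(scale=2.0 * clip / epsilon, size=grad.shape)
    return w - lr * (grad + noise)

# Toy usage: one update on a single two-feature record.
w = np.zeros(2)
w = dp_sgd_update(w, x=np.array([1.0, -0.5]), y=1, epsilon=1.0)
```

Smaller values of `epsilon` mean stronger privacy but larger noise, which is exactly the quality-versus-privacy trade-off the study evaluates empirically.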

Original language: English
Title of host publication: Proceedings - 24th Euromicro International Conference on Parallel, Distributed, and Network-Based Processing, PDP 2016
Editors: Yiannis Cotronis, Masoud Daneshtalab, George Angelos Papadopoulos
Publisher: Institute of Electrical and Electronics Engineers Inc.
Pages: 566-573
Number of pages: 8
ISBN (Electronic): 9781467387750
DOI: 10.1109/PDP.2016.19
Publication status: Published - Mar 31, 2016
Event: 24th Euromicro International Conference on Parallel, Distributed, and Network-Based Processing, PDP 2016 - Heraklion, Crete, Greece
Duration: Feb 17, 2016 - Feb 19, 2016

Publication series

Name: Proceedings - 24th Euromicro International Conference on Parallel, Distributed, and Network-Based Processing, PDP 2016



Keywords

  • distributed differential privacy
  • machine learning
  • stochastic gradient descent

ASJC Scopus subject areas

  • Computer Networks and Communications
  • Hardware and Architecture
  • Software
  • Control and Optimization

Cite this

Hegedus, I., & Jelasity, M. (2016). Distributed Differentially Private Stochastic Gradient Descent: An Empirical Study. In Y. Cotronis, M. Daneshtalab, & G. A. Papadopoulos (Eds.), Proceedings - 24th Euromicro International Conference on Parallel, Distributed, and Network-Based Processing, PDP 2016 (pp. 566-573). [7445391] (Proceedings - 24th Euromicro International Conference on Parallel, Distributed, and Network-Based Processing, PDP 2016). Institute of Electrical and Electronics Engineers Inc. https://doi.org/10.1109/PDP.2016.19