Fully distributed privacy preserving mini-batch gradient descent learning

Gábor Danner, Márk Jelasity

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

7 Citations (Scopus)

Abstract

In fully distributed machine learning, privacy and security are important issues. These issues are often dealt with using secure multiparty computation (MPC). However, in our application domain, known MPC algorithms are either not scalable or not robust enough. We propose a lightweight protocol to quickly and securely compute the sum of the inputs of a subset of participants, assuming a semi-honest adversary. During the computation the participants learn no individual values. We apply this protocol to efficiently calculate the sum of gradients as part of a fully distributed mini-batch stochastic gradient descent algorithm. The protocol achieves scalability and robustness by exploiting the fact that in this application domain a “quick and dirty” sum computation is acceptable; in other words, speed and robustness take precedence over precision. We analyze the protocol both theoretically and experimentally, based on churn statistics from a real smartphone trace. We derive a sufficient condition for preventing the leakage of an individual value, and we demonstrate that the overhead of the protocol is feasible.
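The paper's own protocol is not reproduced in this record. As a rough illustration of the general idea behind a secure sum under a semi-honest adversary, the Python sketch below implements a textbook pairwise-masking scheme and uses it in a toy mini-batch update; all names here (MODULUS, masked_shares, secure_sum) are illustrative assumptions, not the authors' API, and a real system would fixed-point-encode floating-point gradients before masking.

    import random

    MODULUS = 2 ** 32  # public modulus; all shares live in Z_MODULUS


    def masked_shares(values, rng=random):
        """One masked share per participant (semi-honest model).

        Every pair (i, j), i < j, agrees on a random mask m; i adds it
        and j subtracts it, so the masks cancel in the aggregate and no
        single share reveals an individual input.
        """
        shares = [v % MODULUS for v in values]
        n = len(shares)
        for i in range(n):
            for j in range(i + 1, n):
                m = rng.randrange(MODULUS)
                shares[i] = (shares[i] + m) % MODULUS
                shares[j] = (shares[j] - m) % MODULUS
        return shares


    def secure_sum(values):
        """The sum of the masked shares equals the true sum mod MODULUS."""
        return sum(masked_shares(values)) % MODULUS


    # Toy mini-batch step: sum one gradient component across four nodes,
    # then apply an averaged update to a single model weight.
    node_grads = [3, 14, 7, 22]      # per-node gradient of one weight
    total = secure_sum(node_grads)   # no node's individual value is exposed
    assert total == sum(node_grads) % MODULUS

    weight, lr = 0.5, 0.01
    weight -= lr * total / len(node_grads)  # averaged mini-batch update
    print(weight)

Note that this naive all-pairs masking is exactly the kind of scheme the abstract calls insufficiently scalable and robust: every pair must coordinate, and one departed node breaks the cancellation; the paper's contribution is a protocol that tolerates such churn.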

Original language: English
Title of host publication: Distributed Applications and Interoperable Systems - 15th IFIP WG 6.1 International Conference, DAIS 2015 Held as Part of the 10th International Federated Conference on Distributed Computing Techniques, DisCoTec 2015, Proceedings
Editors: Alysson Bessani, Sara Bouchenak
Publisher: Springer Verlag
Pages: 30-44
Number of pages: 15
ISBN (Electronic): 9783319191287
DOIs: 10.1007/978-3-319-19129-4_3
Publication status: Published - Jan 1 2015
Event: 15th IFIP WG 6.1 International Conference on Distributed Applications and Interoperable Systems, DAIS 2015 Held as Part of the 10th International Federated Conference on Distributed Computing Techniques, DisCoTec 2015 - Grenoble, France
Duration: Jun 2 2015 - Jun 4 2015

Publication series

Name: Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics)
Volume: 9038
ISSN (Print): 0302-9743
ISSN (Electronic): 1611-3349

Other

Other: 15th IFIP WG 6.1 International Conference on Distributed Applications and Interoperable Systems, DAIS 2015 Held as Part of the 10th International Federated Conference on Distributed Computing Techniques, DisCoTec 2015
Country: France
City: Grenoble
Period: 6/2/15 - 6/4/15

Keywords

  • Fully distributed learning
  • Mini-batch stochastic gradient descent
  • P2P smartphone networks
  • Secure sum

ASJC Scopus subject areas

  • Theoretical Computer Science
  • Computer Science (all)

Cite this

Danner, G., & Jelasity, M. (2015). Fully distributed privacy preserving mini-batch gradient descent learning. In A. Bessani, & S. Bouchenak (Eds.), Distributed Applications and Interoperable Systems - 15th IFIP WG 6.1 International Conference, DAIS 2015 Held as Part of the 10th International Federated Conference on Distributed Computing Techniques, DisCoTec 2015, Proceedings (pp. 30-44). (Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics); Vol. 9038). Springer Verlag. https://doi.org/10.1007/978-3-319-19129-4_3