Robust decentralized differentially private stochastic gradient descent

István Hegedűs, Árpád Berta, M. Jelasity

Research output: Contribution to journal › Article

3 Citations (Scopus)


Stochastic gradient descent (SGD) is one of the most widely applied machine learning algorithms in unreliable, large-scale decentralized environments. In such environments, data privacy is a fundamental concern, and the most popular framework for studying it is differential privacy. However, many important implementation details, as well as the performance of differentially private SGD variants, have not yet been fully addressed. Here, we analyze a set of distributed differentially private SGD implementations in a system where every private data record is stored separately by an autonomous node. The examined SGD methods use only local computation, and all communicated information is protected in a differentially private manner. A key middleware service these implementations require is the single random walk service, which maintains exactly one random walk in the face of various failure scenarios. We first propose a robust implementation of the decentralized single random walk service, and then perform experiments to evaluate both the proposed random walk service and the private SGD implementations. Our main conclusion is that, provided the algorithm parameters are set properly, the proposed differentially private SGD implementations can approximate the performance of their original noise-free variants even in faulty decentralized environments.
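Differentially private SGD variants of the kind the abstract describes typically bound each per-record gradient's sensitivity by clipping it and then perturb it with calibrated noise before it influences anything observable outside the node. The sketch below is not the paper's algorithm; it is a minimal, hypothetical illustration of one such local update (logistic regression, L1 gradient clipping, per-coordinate Laplace noise), with all names, parameters, and default values chosen for illustration only.

```python
import numpy as np

def dp_sgd_step(w, x, y, lr=0.1, clip=1.0, eps=0.5, rng=None):
    """One illustrative differentially private SGD step (logistic regression).

    Clips the per-example gradient to L1 norm <= clip so its L1 sensitivity
    is bounded, then adds per-coordinate Laplace noise with scale clip/eps,
    the standard Laplace-mechanism calibration for eps-DP of a single
    gradient release (simplified accounting; no composition across steps).
    """
    rng = np.random.default_rng() if rng is None else rng
    # Logistic-loss gradient for one example (x, y), with y in {0, 1}.
    pred = 1.0 / (1.0 + np.exp(-w @ x))
    grad = (pred - y) * x
    # Clip the gradient's L1 norm to bound sensitivity.
    norm = np.abs(grad).sum()
    if norm > clip:
        grad = grad * (clip / norm)
    # Laplace noise calibrated to the clipped sensitivity.
    noise = rng.laplace(0.0, clip / eps, size=w.shape)
    return w - lr * (grad + noise)
```

In a decentralized setting along the lines of the paper, the model `w` would travel along the random walk and each visited node would apply such a noisy update using only its own private record; because the noise is added before the updated model leaves the node, the raw gradient is never exposed.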

Original language: English
Pages (from-to): 20-40
Number of pages: 21
Journal: Journal of Wireless Mobile Networks, Ubiquitous Computing, and Dependable Applications
Issue number: 2
Publication status: Published - Jun 1, 2016



Keywords

  • Decentralized differential privacy
  • Machine learning
  • Random walks
  • Stochastic gradient descent

ASJC Scopus subject areas

  • Computer Networks and Communications
  • Computer Science Applications
