Gossip learning as a decentralized alternative to federated learning

István Hegedűs, Gábor Danner, M. Jelasity

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

Abstract

Federated learning is a distributed machine learning approach for computing models over data collected by edge devices. Most importantly, the data itself is not collected centrally; instead, a master-worker architecture is applied in which a master node performs aggregation and the edge devices are the workers, not unlike the parameter server approach. Gossip learning also assumes that the data remains at the edge devices, but it requires no aggregation server or any other central component. In this empirical study, we present a thorough comparison of the two approaches. We examine the aggregated cost of machine learning in both cases, also considering a compression technique applicable in both approaches. We apply a real churn trace collected over mobile phones as well, and we also experiment with different distributions of the training data over the devices. Surprisingly, gossip learning actually outperforms federated learning in all the scenarios where the training data are distributed uniformly over the nodes, and it performs comparably to federated learning overall.
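To make the contrast concrete, here is a minimal sketch of the two model-propagation patterns the abstract describes. This is not the authors' implementation: representing models as flat weight lists and all the names (local_sgd, average, federated_round, gossip_step) are hypothetical placeholders.

    import random
    from typing import List

    Model = List[float]  # a model as a flat list of weights (illustrative only)

    def local_sgd(model: Model, data) -> Model:
        # Placeholder for local training on one device's data.
        return [w - 0.01 for w in model]

    def average(models: List[Model]) -> Model:
        # Coordinate-wise model averaging.
        return [sum(ws) / len(ws) for ws in zip(*models)]

    # Federated learning: a master node aggregates the workers' updates.
    def federated_round(global_model: Model, workers: list) -> Model:
        updates = [local_sgd(global_model, w["data"]) for w in workers]
        return average(updates)

    # Gossip learning: no server; a node sends its model to a random peer,
    # which merges it with its own model and then continues training locally.
    def gossip_step(node: dict, peers: list) -> None:
        peer = random.choice(peers)
        merged = average([node["model"], peer["model"]])
        peer["model"] = local_sgd(merged, peer["data"])

In both patterns the full model is what travels over the network, which is why the same compression technique can be applied in either setting, as the study does.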

Original language: English
Title of host publication: Distributed Applications and Interoperable Systems - 19th IFIP WG 6.1 International Conference, DAIS 2019, Held as Part of the 14th International Federated Conference on Distributed Computing Techniques, DisCoTec 2019, Proceedings
Editors: José Pereira, Laura Ricci
Publisher: Springer Verlag
Pages: 74-90
Number of pages: 17
ISBN (Print): 9783030224950
DOI: 10.1007/978-3-030-22496-7_5
Publication status: Published - Jan 1, 2019
Event: 19th IFIP WG 6.1 International Conference on Distributed Applications and Interoperable Systems, DAIS 2019, held as part of the 14th International Federated Conference on Distributed Computing Techniques, DisCoTec 2019 - Kongens Lyngby, Denmark
Duration: Jun 17, 2019 - Jun 21, 2019

Publication series

Name: Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics)
Volume: 11534 LNCS
ISSN (Print): 0302-9743
ISSN (Electronic): 1611-3349

Conference

Conference: 19th IFIP WG 6.1 International Conference on Distributed Applications and Interoperable Systems, DAIS 2019, held as part of the 14th International Federated Conference on Distributed Computing Techniques, DisCoTec 2019
Country: Denmark
City: Kongens Lyngby
Period: 6/17/19 - 6/21/19

ASJC Scopus subject areas

  • Theoretical Computer Science
  • Computer Science (all)

Cite this

Hegedűs, I., Danner, G., & Jelasity, M. (2019). Gossip learning as a decentralized alternative to federated learning. In J. Pereira, & L. Ricci (Eds.), Distributed Applications and Interoperable Systems - 19th IFIP WG 6.1 International Conference, DAIS 2019, Held as Part of the 14th International Federated Conference on Distributed Computing Techniques, DisCoTec 2019, Proceedings (pp. 74-90). (Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics); Vol. 11534 LNCS). Springer Verlag. https://doi.org/10.1007/978-3-030-22496-7_5
