gossipy is a Python module for simulating gossip learning and decentralized federated learning.
The documentation is available at https://makgyver.github.io/gossipy/
gossipy is available as a PyPI module and can be installed with pip:

$ pip install gossipy-dfl
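
To give a feel for what is being simulated, here is a minimal, self-contained sketch of push gossip with model averaging: each cycle, every node sends its model to a random peer, and the receiver averages it into its own. This is plain Python for illustration only; it does not use the gossipy API, and the scalar "model" stands in for real model parameters.

import random

random.seed(0)
NUM_NODES = 10
# Each node starts with its own local "model" (a scalar here for simplicity).
models = [random.uniform(0.0, 1.0) for _ in range(NUM_NODES)]

for cycle in range(20):
    for sender in range(NUM_NODES):
        # Push the sender's model to a uniformly random peer.
        receiver = random.choice([n for n in range(NUM_NODES) if n != sender])
        # The receiver merges the incoming model with its own (model averaging).
        models[receiver] = 0.5 * (models[receiver] + models[sender])

print(models)  # the values drift toward a common consensus

After a few cycles the local models concentrate around a shared value, which is the basic convergence behaviour that gossip learning protocols build on.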
Planned features and improvements:

- Models cache [Ormandi 2013][Giaretta 2019] (partially implemented; see the sketch after this list)
- Perfect matching [Ormandi 2013] (see the sketch after this list)
- More realistic online behaviour (currently a worst-case scenario is assumed)
- DFL [Liu 2022]
- Segmented GL [Hu 2019]
- CMFL [Che 2021]
- MATCHA [Wang 2019]
- Add a training stopping criterion
- GPU support (quick fix)
- Add Weights & Biases support
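
The models cache and perfect matching items above both come from [Ormandi 2013]: nodes keep a small cache of recently received models and merge them, and communication can be organized so that each cycle pairs all nodes via a perfect matching, meaning every node sends and receives exactly once. The sketch below illustrates both ideas together; it is independent of the gossipy API, and CACHE_SIZE and the plain-averaging merge rule are assumptions made for illustration, not the paper's exact update.

import random
from collections import deque

random.seed(0)
NUM_NODES = 8   # must be even so a perfect matching exists
CACHE_SIZE = 3  # assumed cache size, chosen for illustration
models = [random.uniform(0.0, 1.0) for _ in range(NUM_NODES)]
caches = [deque(maxlen=CACHE_SIZE) for _ in range(NUM_NODES)]

def random_perfect_matching(n):
    # Shuffle the nodes and pair them off: each node is in exactly one pair.
    nodes = list(range(n))
    random.shuffle(nodes)
    return list(zip(nodes[::2], nodes[1::2]))

for cycle in range(20):
    for a, b in random_perfect_matching(NUM_NODES):
        # Exchange models in both directions and cache what was received.
        caches[a].append(models[b])
        caches[b].append(models[a])
        # Merge: average the local model with the cached models.
        for node in (a, b):
            models[node] = (models[node] + sum(caches[node])) / (1 + len(caches[node]))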
References:

[Ormandi 2013] Róbert Ormándi, István Hegedűs, and Márk Jelasity. 'Gossip Learning with Linear Models on Fully Distributed Data'. Concurrency and Computation: Practice and Experience 25, no. 4 (February 2013): 556–571. https://doi.org/10.1002/cpe.2858.
[Berta 2014] Árpád Berta, István Hegedűs, and Róbert Ormándi. 'Lightning Fast Asynchronous Distributed K-Means Clustering'. 22nd European Symposium on Artificial Neural Networks (ESANN 2014), Bruges, Belgium, April 23-25, 2014.
[Danner 2018] Gábor Danner and Márk Jelasity. 'Token Account Algorithms: The Best of the Proactive and Reactive Worlds'. In 2018 IEEE 38th International Conference on Distributed Computing Systems (ICDCS), 2018, pp. 885–895. https://doi.org/10.1109/ICDCS.2018.00090.
[Giaretta 2019] Lodovico Giaretta and Sarunas Girdzijauskas. 'Gossip Learning: Off the Beaten Path'. In 2019 IEEE International Conference on Big Data (Big Data), pp. 1117–1124, Los Angeles, CA, USA: IEEE, 2019. https://doi.org/10.1109/BigData47090.2019.9006216.
[Hu 2019] Chenghao Hu, Jingyan Jiang, and Zhi Wang. 'Decentralized Federated Learning: A Segmented Gossip Approach'. arXiv preprint arXiv:1908.07782. https://arxiv.org/pdf/1908.07782.pdf
[Wang 2019] Jianyu Wang, Anit Kumar Sahu, Zhouyi Yang, Gauri Joshi, and Soummya Kar. 'MATCHA: Speeding Up Decentralized SGD via Matching Decomposition Sampling'. arXiv preprint arXiv:1905.09435. https://arxiv.org/pdf/1905.09435.pdf
[Hegedus 2020] István Hegedűs, Gábor Danner, Peggy Cellier, and Márk Jelasity. 'Decentralized Recommendation Based on Matrix Factorization: A Comparison of Gossip and Federated Learning'. In 2020 Joint European Conference on Machine Learning and Knowledge Discovery in Databases, 2020, pp. 317–332. https://doi.org/10.1007/978-3-030-43823-4_27.
[Koloskova 2020] Anastasia Koloskova, Nicolas Loizou, Sadra Boreiri, Martin Jaggi, and Sebastian U. Stich. 'A unified theory of decentralized SGD with changing topology and local updates'. In Proceedings of the 37th International Conference on Machine Learning, pp. 5381–5393, 2020.
[Hegedus 2021] István Hegedűs, Gábor Danner, and Márk Jelasity. 'Decentralized Learning Works: An Empirical Comparison of Gossip Learning and Federated Learning'. Journal of Parallel and Distributed Computing 148 (February 2021): 109–124. https://doi.org/10.1016/j.jpdc.2020.10.006.
[Onoszko 2021] Noa Onoszko, Gustav Karlsson, Olof Mogren, and Edvin Listo Zec. 'Decentralized Federated Learning of Deep Neural Networks on Non-IID Data'. International Workshop on Federated Learning for User Privacy and Data Confidentiality in Conjunction with ICML 2021 (FL-ICML'21). https://fl-icml.github.io/2021/papers/FL-ICML21_paper_3.pdf
[Che 2021] Chunjiang Che, Xiaoli Li, Chuan Chen, Xiaoyu He, and Zibin Zheng. 'A Decentralized Federated Learning Framework via Committee Mechanism with Convergence Guarantee'. arXiv preprint arXiv:2108.00365. https://arxiv.org/pdf/2108.00365.pdf
[Liu 2022] Wei Liu, Li Chen, and Wenyi Zhang. 'Decentralized Federated Learning: Balancing Communication and Computing Costs'. arXiv preprint arXiv:2107.12048. https://arxiv.org/pdf/2107.12048.pdf