TITLE:
Edge-Federated Self-Supervised Communication Optimization Framework Based on Sparsification and Quantization Compression
AUTHORS:
Yifei Ding
KEYWORDS:
Communication Optimization, Federated Self-Supervision, Sparsification, Gradient Compression, Edge Computing
JOURNAL NAME:
Journal of Computer and Communications, Vol. 12 No. 5, May 31, 2024
ABSTRACT: The federated self-supervised framework is a distributed machine learning method that combines federated learning with self-supervised learning, allowing it to handle the large-scale unlabeled data that traditional federated learning struggles to process. However, existing federated self-supervised frameworks suffer from low communication efficiency and high communication latency between clients and the central server. We therefore add edge servers to the federated self-supervised framework to relieve the pressure placed on the central server by frequent communication between the two ends. We further propose a communication compression scheme based on gradient quantization and sparsification to optimize communication across the entire framework, and we improve the algorithm of the sparse communication compression module. Experiments show that the learning-rate changes of the improved sparse communication compression module are smoother and more stable, and that our communication compression scheme effectively reduces the overall communication overhead.
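To make the gradient quantization and sparsification idea concrete, the following is a minimal, illustrative Python sketch of top-k sparsification followed by uniform quantization of a client gradient. It is not the paper's exact algorithm; the function names and parameters (k_ratio, num_bits) are assumptions introduced for illustration only.

```python
# Illustrative sketch only: top-k gradient sparsification + uniform quantization.
# Not the paper's exact scheme; k_ratio and num_bits are assumed parameters.
import numpy as np

def sparsify_topk(grad: np.ndarray, k_ratio: float = 0.01):
    """Keep only the k largest-magnitude entries of the flattened gradient."""
    flat = grad.ravel()
    k = max(1, int(k_ratio * flat.size))
    idx = np.argpartition(np.abs(flat), -k)[-k:]   # indices of the top-k entries
    return idx, flat[idx], grad.shape

def quantize_uniform(values: np.ndarray, num_bits: int = 8):
    """Uniformly quantize the surviving values to signed num_bits integers plus a scale."""
    scale = np.max(np.abs(values)) + 1e-12
    levels = 2 ** (num_bits - 1) - 1
    q = np.round(values / scale * levels).astype(np.int8)
    return q, scale

def decompress(idx, q, scale, shape, num_bits: int = 8):
    """Reconstruct a dense (approximate) gradient from the compressed representation."""
    levels = 2 ** (num_bits - 1) - 1
    flat = np.zeros(int(np.prod(shape)), dtype=np.float32)
    flat[idx] = q.astype(np.float32) / levels * scale
    return flat.reshape(shape)

# Example: compress a client gradient before uploading it to an edge server.
g = np.random.randn(1000, 100).astype(np.float32)
idx, vals, shape = sparsify_topk(g, k_ratio=0.01)
q, scale = quantize_uniform(vals, num_bits=8)
g_hat = decompress(idx, q, scale, shape)
```

Only the top-k indices, the quantized values, and a single scale factor need to be transmitted, which is the source of the communication savings described in the abstract.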