Hermes: An efficient federated learning framework for heterogeneous mobile clients

Publication, Conference
Li, A; Sun, J; Li, P; Pu, Y; Li, H; Chen, Y
Published in: Proceedings of the Annual International Conference on Mobile Computing and Networking, MOBICOM
January 1, 2021

Federated learning (FL) has become a popular method for distributed machine learning among numerous devices without sharing their data with a cloud server. FL aims to learn a shared global model with the participation of massive numbers of devices under the orchestration of a central server. However, mobile devices usually have limited communication bandwidth for transferring local updates to the central server. In addition, the data residing across devices is intrinsically statistically heterogeneous (i.e., has a non-IID distribution), and a single global model may not work well for all devices participating in FL under such data heterogeneity. Communication cost and data heterogeneity are thus two critical bottlenecks that hinder the practical application of FL. Moreover, mobile devices usually have limited computational resources, so improving the inference efficiency of the learned model is critical for deploying deep learning applications on them. In this paper, we present Hermes, a communication- and inference-efficient FL framework under data heterogeneity. To this end, each device finds a small subnetwork by applying structured pruning; only the updates of these subnetworks are communicated between the server and the devices. Instead of averaging all parameters of all devices as conventional FL frameworks do, the server averages only the parameters that overlap across the subnetworks. By applying Hermes, each device learns a personalized, structured-sparse deep neural network that runs efficiently on the device. Experimental results show the remarkable advantages of Hermes over status quo approaches: Hermes achieves up to a 32.17% increase in inference accuracy, a 3.48× reduction in communication cost, a 1.83× speedup in inference efficiency, and 1.8× savings in energy consumption.
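
The server-side aggregation described in the abstract, averaging each parameter only over the clients whose pruned subnetworks retain it, can be illustrated with a short sketch. This is a minimal illustration under assumed interfaces: each client is taken to send a binary keep-mask alongside its parameter tensor, and the function and variable names (aggregate_overlap, client_params, client_masks) are hypothetical, not drawn from the authors' implementation.

import numpy as np

def aggregate_overlap(client_params, client_masks):
    """Average each weight only over the clients that kept it after pruning.

    client_params: list of arrays, one per client (zeros where pruned)
    client_masks:  list of binary arrays of the same shape (1 = kept)
    """
    params = np.stack(client_params)              # shape (K, ...)
    masks = np.stack(client_masks).astype(float)  # shape (K, ...)
    counts = masks.sum(axis=0)                    # clients keeping each weight
    totals = (params * masks).sum(axis=0)
    # Divide only where at least one client kept the weight; others stay 0.
    return np.divide(totals, counts, out=np.zeros_like(totals), where=counts > 0)

# Toy check: two clients with overlapping subnetworks over four weights.
w1, m1 = np.array([1.0, 2.0, 0.0, 4.0]), np.array([1, 1, 0, 1])
w2, m2 = np.array([3.0, 0.0, 5.0, 6.0]), np.array([1, 0, 1, 1])
print(aggregate_overlap([w1, w2], [m1, m2]))  # -> [2. 2. 5. 5.]

A weight kept by both clients is averaged over both, while a weight kept by only one client takes that client's value, which is how each device ends up with a personalized subnetwork that still benefits from every client sharing its parameters.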

Published In

Proceedings of the Annual International Conference on Mobile Computing and Networking, MOBICOM

DOI

10.1145/3447993.3483278

Publication Date

January 1, 2021

Start / End Page

420 / 436

Citation

APA
Li, A., Sun, J., Li, P., Pu, Y., Li, H., & Chen, Y. (2021). Hermes: An efficient federated learning framework for heterogeneous mobile clients. In Proceedings of the Annual International Conference on Mobile Computing and Networking, MOBICOM (pp. 420–436). https://doi.org/10.1145/3447993.3483278

Chicago
Li, A., J. Sun, P. Li, Y. Pu, H. Li, and Y. Chen. “Hermes: An efficient federated learning framework for heterogeneous mobile clients.” In Proceedings of the Annual International Conference on Mobile Computing and Networking, MOBICOM, 420–36, 2021. https://doi.org/10.1145/3447993.3483278.

ICMJE
Li A, Sun J, Li P, Pu Y, Li H, Chen Y. Hermes: An efficient federated learning framework for heterogeneous mobile clients. In: Proceedings of the Annual International Conference on Mobile Computing and Networking, MOBICOM. 2021. p. 420–36.

MLA
Li, A., et al. “Hermes: An efficient federated learning framework for heterogeneous mobile clients.” Proceedings of the Annual International Conference on Mobile Computing and Networking, MOBICOM, 2021, pp. 420–36. Scopus, doi:10.1145/3447993.3483278.

NLM
Li A, Sun J, Li P, Pu Y, Li H, Chen Y. Hermes: An efficient federated learning framework for heterogeneous mobile clients. Proceedings of the Annual International Conference on Mobile Computing and Networking, MOBICOM. 2021. p. 420–436.
