LotteryFL: Empower Edge Intelligence with Personalized and Communication-Efficient Federated Learning
With the proliferation of mobile computing and the Internet of Things (IoT), massive numbers of mobile and IoT devices are connected to the Internet. These devices generate a huge amount of data every second at the network edge, and many artificial intelligence applications and services have been proposed for edge devices based on this distributed data. Federated learning (FL) proves to be an extremely viable option for distributed machine learning with enhanced privacy, which can help artificial intelligence applications unleash the potential of data residing at the network edge. Its primary goal is learning a global model that offers good performance for as many participants as possible. However, the data residing across devices is intrinsically statistically heterogeneous (i.e., non-IID data distribution), and edge devices usually have limited communication resources to transfer data. Such statistical heterogeneity (i.e., non-IID data) and communication efficiency are two critical bottlenecks that hinder the development of FL. In this work, we propose LotteryFL, a personalized and communication-efficient FL framework that exploits the Lottery Ticket hypothesis. In LotteryFL, each client learns a lottery ticket network (i.e., a subnetwork of the base model) by applying the Lottery Ticket hypothesis, and only these lottery ticket networks are communicated between the server and clients. Rather than learning a shared global model as in classic FL, each client learns a personalized model via LotteryFL; the communication cost can be significantly reduced due to the compact size of the lottery ticket networks. To support the training and evaluation of our framework, we construct non-IID datasets based on MNIST, CIFAR-10 and EMNIST by taking feature distribution skew, label distribution skew and quantity skew into consideration.
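The per-client pruning step at the heart of the Lottery Ticket hypothesis can be sketched as magnitude-based mask construction: the smallest-magnitude weights are zeroed out, and only the surviving subnetwork (weights plus binary mask) needs to be communicated. The helper below is a minimal illustrative sketch using NumPy, not the paper's implementation; the function name and prune rate are hypothetical.

```python
import numpy as np

def magnitude_prune_mask(weights, prune_rate):
    """Return a binary mask keeping the largest-magnitude weights.

    Hypothetical helper illustrating one lottery-ticket pruning step:
    the smallest `prune_rate` fraction of weights is zeroed out,
    leaving a compact subnetwork mask.
    """
    flat = np.abs(weights).flatten()
    k = int(len(flat) * prune_rate)
    if k == 0:
        return np.ones_like(weights)
    # k-th smallest absolute value serves as the pruning threshold
    threshold = np.partition(flat, k - 1)[k - 1]
    return (np.abs(weights) > threshold).astype(weights.dtype)

# Toy example: prune 50% of a small weight matrix.
rng = np.random.default_rng(0)
w = rng.normal(size=(4, 4))
mask = magnitude_prune_mask(w, 0.5)
pruned = w * mask  # only the surviving subnetwork is communicated
```

In the full framework, each client would iterate this prune-and-retrain cycle locally, so different clients end up with different masks and hence personalized subnetworks.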
Experiments on these non-IID datasets demonstrate that, compared with state-of-the-art approaches, LotteryFL can achieve as much as a 17.24% increase in inference accuracy and a 2.94x reduction in communication cost. We also demonstrate the viability of LotteryFL by showcasing the real-time performance of the deployed models on edge devices.