
SemiFL: Semi-Supervised Federated Learning for Unlabeled Clients with Alternate Training

Publication · Conference
Diao, E; Ding, J; Tarokh, V
Published in: Advances in Neural Information Processing Systems
January 1, 2022

Federated Learning allows the training of machine learning models by using the computation and private data resources of many distributed clients. Most existing results on Federated Learning (FL) assume the clients have ground-truth labels. However, in many practical scenarios, clients may be unable to label task-specific data due to a lack of expertise or resources. We propose SemiFL to address the problem of combining communication-efficient FL, such as FedAvg, with Semi-Supervised Learning (SSL). In SemiFL, clients have completely unlabeled data and can train for multiple local epochs to reduce communication costs, while the server has a small amount of labeled data. We provide a theoretical understanding of the success of data augmentation-based SSL methods to illustrate the bottleneck of a vanilla combination of communication-efficient FL with SSL. To address this issue, we propose alternate training to 'fine-tune the global model with labeled data' and 'generate pseudo-labels with the global model.' We conduct extensive experiments and demonstrate that our approach significantly improves the performance of a labeled server working with unlabeled clients that train for multiple local epochs. Moreover, our method outperforms many existing SSFL baselines and performs competitively with state-of-the-art FL and SSL results. Our code is available here.
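The alternate-training loop described in the abstract can be sketched on a toy problem. This is a minimal illustration, not the paper's implementation: the linear model, squared loss, noise-based "strong augmentation", and all function names (`sgd_step`, `fedavg`) are assumptions chosen to keep the example self-contained.

```python
# Hedged sketch of SemiFL-style alternate training on a toy linear model.
# Each round: (1) server fine-tunes the global model on its small labeled
# set, (2) the global model pseudo-labels client data, (3) clients run
# multiple local epochs on (augmented input, pseudo-label) pairs,
# (4) the server aggregates client models with FedAvg.
import numpy as np

rng = np.random.default_rng(0)

def sgd_step(w, X, y, lr=0.1, epochs=1):
    # Plain full-batch gradient descent on mean squared error.
    for _ in range(epochs):
        grad = 2.0 * X.T @ (X @ w - y) / len(y)
        w = w - lr * grad
    return w

def fedavg(client_weights):
    # FedAvg with equal client weighting (equal-sized local datasets here).
    return np.mean(client_weights, axis=0)

# Toy data: server holds a small labeled set; clients are fully unlabeled.
d = 3
w_true = np.array([1.0, -2.0, 0.5])
X_server = rng.normal(size=(20, d))
y_server = X_server @ w_true
client_X = [rng.normal(size=(50, d)) for _ in range(4)]

w_global = np.zeros(d)
for _ in range(30):
    # (1) Fine-tune the global model with labeled server data.
    w_global = sgd_step(w_global, X_server, y_server, epochs=5)
    # (2) Generate pseudo-labels with the global model.
    pseudo = [X @ w_global for X in client_X]
    # (3) Clients train multiple local epochs on a perturbed ("strongly
    #     augmented") view of their data against the pseudo-labels.
    local_ws = [
        sgd_step(w_global, X + 0.1 * rng.normal(size=X.shape), y, epochs=5)
        for X, y in zip(client_X, pseudo)
    ]
    # (4) Server aggregates client models.
    w_global = fedavg(local_ws)

print("distance to ground truth:", np.linalg.norm(w_global - w_true))
```

In this sketch the server's labeled fine-tuning is what keeps the pseudo-labels anchored to the true task, mirroring the paper's point that naively combining FedAvg with SSL (clients training on stale pseudo-labels for many local epochs) is the bottleneck that alternate training removes.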


Published In

Advances in Neural Information Processing Systems

ISSN

1049-5258

Publication Date

January 1, 2022

Volume

35

Related Subject Headings

  • 4611 Machine learning
  • 1702 Cognitive Sciences
  • 1701 Psychology
 

Citation

APA: Diao, E., Ding, J., & Tarokh, V. (2022). SemiFL: Semi-Supervised Federated Learning for Unlabeled Clients with Alternate Training. In Advances in Neural Information Processing Systems (Vol. 35).
Chicago: Diao, E., J. Ding, and V. Tarokh. “SemiFL: Semi-Supervised Federated Learning for Unlabeled Clients with Alternate Training.” In Advances in Neural Information Processing Systems, Vol. 35, 2022.
ICMJE: Diao E, Ding J, Tarokh V. SemiFL: Semi-Supervised Federated Learning for Unlabeled Clients with Alternate Training. In: Advances in Neural Information Processing Systems. 2022.
MLA: Diao, E., et al. “SemiFL: Semi-Supervised Federated Learning for Unlabeled Clients with Alternate Training.” Advances in Neural Information Processing Systems, vol. 35, 2022.
NLM: Diao E, Ding J, Tarokh V. SemiFL: Semi-Supervised Federated Learning for Unlabeled Clients with Alternate Training. Advances in Neural Information Processing Systems. 2022.
