Personalized Federated Learning for Text Classification with Gradient-Free Prompt Tuning

Publication, Conference
Wang, R; Yu, T; Zhang, R; Kim, S; Rossi, R; Zhao, H; Wu, J; Mitra, S; Yao, L; Henao, R
Published in: Findings of the Association for Computational Linguistics: NAACL 2024
January 1, 2024

In this paper, we study personalized federated learning for text classification with Pretrained Language Models (PLMs). We identify two challenges in efficiently leveraging PLMs for personalized federated learning: 1) Communication. PLMs are usually large, incurring substantial communication costs in a federated setting. 2) Local Training. Training with PLMs generally requires back-propagation, during which memory consumption can be several times that of forward propagation. This may not be affordable when the PLMs are trained locally on resource-constrained clients, e.g., mobile devices with limited memory. To address these challenges, we propose a training framework that combines a discrete local search approach for gradient-free local training with a compression mechanism inspired by the linear word analogy, which allows clients to communicate discretely indexed tokens and thus significantly reduces communication costs. Experiments show that our gradient-free framework achieves superior performance compared with baselines.
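To make the abstract's idea concrete, the following is a minimal illustrative sketch of gradient-free discrete local search over prompt token indices. All names and hyperparameters here are hypothetical stand-ins, not the paper's implementation: in the paper's setting, the black-box `score` would be a forward-pass-only evaluation of a frozen PLM with the candidate prompt prepended, and only the discrete token indices (not gradients or embeddings) would need to be communicated.

```python
import random

def discrete_local_search(score, vocab_size, prompt_len, steps, seed=0):
    """Hill-climb over discrete token indices using only forward evaluations.

    `score` is a black-box objective (higher is better); no gradients are
    computed, so memory stays at forward-pass levels.
    """
    rng = random.Random(seed)
    prompt = [rng.randrange(vocab_size) for _ in range(prompt_len)]
    best = score(prompt)
    for _ in range(steps):
        pos = rng.randrange(prompt_len)        # pick one prompt slot
        cand = prompt.copy()
        cand[pos] = rng.randrange(vocab_size)  # propose a single-token swap
        s = score(cand)
        if s > best:                           # keep only improving proposals
            prompt, best = cand, s
    return prompt, best

# Toy black-box objective standing in for a frozen PLM's forward pass:
# reward prompts whose token indices are close to a fixed target.
target = [3, 1, 4, 1, 5]
toy_score = lambda p: -sum(abs(a - b) for a, b in zip(p, target))

prompt, best = discrete_local_search(
    toy_score, vocab_size=10, prompt_len=5, steps=500
)
```

Because the search state is just a short list of integer token indices, a client in a federated round would only need to transmit those indices, which is the communication saving the abstract describes.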

Published In

Findings of the Association for Computational Linguistics: NAACL 2024

DOI

10.18653/v1/2024.findings-naacl.286

Publication Date

January 1, 2024

Start / End Page

4597 / 4612
 

Citation

APA
Wang, R., Yu, T., Zhang, R., Kim, S., Rossi, R., Zhao, H., … Henao, R. (2024). Personalized Federated Learning for Text Classification with Gradient-Free Prompt Tuning. In Findings of the Association for Computational Linguistics: NAACL 2024 (pp. 4597–4612). https://doi.org/10.18653/v1/2024.findings-naacl.286

Chicago
Wang, R., T. Yu, R. Zhang, S. Kim, R. Rossi, H. Zhao, J. Wu, S. Mitra, L. Yao, and R. Henao. “Personalized Federated Learning for Text Classification with Gradient-Free Prompt Tuning.” In Findings of the Association for Computational Linguistics: NAACL 2024, 4597–4612, 2024. https://doi.org/10.18653/v1/2024.findings-naacl.286.

ICMJE
Wang R, Yu T, Zhang R, Kim S, Rossi R, Zhao H, et al. Personalized Federated Learning for Text Classification with Gradient-Free Prompt Tuning. In: Findings of the Association for Computational Linguistics: NAACL 2024. 2024. p. 4597–612.

MLA
Wang, R., et al. “Personalized Federated Learning for Text Classification with Gradient-Free Prompt Tuning.” Findings of the Association for Computational Linguistics: NAACL 2024, 2024, pp. 4597–612. Scopus, doi:10.18653/v1/2024.findings-naacl.286.

NLM
Wang R, Yu T, Zhang R, Kim S, Rossi R, Zhao H, Wu J, Mitra S, Yao L, Henao R. Personalized Federated Learning for Text Classification with Gradient-Free Prompt Tuning. Findings of the Association for Computational Linguistics: NAACL 2024. 2024. p. 4597–4612.
