
Unlocking the Potential of Federated Learning: The Symphony of Dataset Distillation via Deep Generative Latents

Publication, Conference
Jia, Y; Vahidian, S; Sun, J; Zhang, J; Kungurtsev, V; Gong, NZ; Chen, Y
Published in: Lecture Notes in Computer Science
January 1, 2025

Data heterogeneity presents significant challenges for federated learning (FL). Recently, dataset distillation techniques, performed at the client level, have been introduced to attempt to mitigate some of these challenges. In this paper, we propose a highly efficient FL dataset distillation framework on the server side, significantly reducing both the computational and communication demands on local devices while enhancing the clients’ privacy. Unlike previous strategies that perform dataset distillation on local devices and upload synthetic data to the server, our technique enables the server to leverage prior knowledge from pre-trained deep generative models to synthesize essential data representations from a heterogeneous model architecture. This process allows local devices to train smaller surrogate models while enabling the training of a larger global model on the server, effectively minimizing resource utilization. We substantiate our claim with a theoretical analysis, demonstrating the asymptotic resemblance of the process to the hypothetical ideal of completely centralized training on a heterogeneous dataset. Empirical evidence from our comprehensive experiments indicates our method’s superiority, delivering an accuracy enhancement of up to 40% over non-dataset-distillation techniques in highly heterogeneous FL contexts, and surpassing existing dataset-distillation methods by 18%. In addition to the high accuracy, our framework converges faster than the baselines because the server trains on a single multi-modal distribution rather than on several sets of heterogeneous data distributions. Our code is available at https://github.com/jyqhahah/FedDGM.git.
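
As a rough illustration of the server-side flow the abstract describes, the sketch below distills a small synthetic set by optimizing latent codes of a frozen generative model against uploaded client surrogate models, then trains a larger global model on the result. This is a minimal, hypothetical PyTorch sketch: the model sizes, the untrained stand-in generator and client models, and the simple averaged cross-entropy objective are illustrative assumptions rather than the authors' method; the actual implementation is in the repository linked above.

import torch
import torch.nn as nn
import torch.nn.functional as F

NUM_CLASSES, LATENT_DIM, IMG_DIM = 10, 64, 32 * 32 * 3

def small_client_model():
    # Lightweight surrogate model a client would train on its local data.
    return nn.Sequential(nn.Flatten(), nn.Linear(IMG_DIM, 128), nn.ReLU(),
                         nn.Linear(128, NUM_CLASSES))

class Generator(nn.Module):
    # Stand-in for a pre-trained deep generative model: maps latent codes to images.
    def __init__(self):
        super().__init__()
        self.net = nn.Sequential(nn.Linear(LATENT_DIM, 256), nn.ReLU(),
                                 nn.Linear(256, IMG_DIM), nn.Tanh())

    def forward(self, z):
        return self.net(z)

def distill_on_server(client_models, generator, images_per_class=10, steps=200):
    # Optimize latent codes (not raw pixels) so the generated samples are
    # confidently classified by the uploaded client surrogate models.
    for p in generator.parameters():
        p.requires_grad_(False)  # the pre-trained generator stays frozen
    labels = torch.arange(NUM_CLASSES).repeat_interleave(images_per_class)
    z = torch.randn(len(labels), LATENT_DIM, requires_grad=True)
    opt = torch.optim.Adam([z], lr=0.05)
    for _ in range(steps):
        opt.zero_grad()
        synth = generator(z)
        # Average cross-entropy over all client surrogates as a simple distillation signal.
        loss = sum(F.cross_entropy(m(synth), labels) for m in client_models) / len(client_models)
        loss.backward()
        opt.step()
    return generator(z).detach(), labels

def train_global_model(synth_x, synth_y, epochs=50):
    # The server trains a larger global model on the distilled synthetic set.
    model = nn.Sequential(nn.Flatten(), nn.Linear(IMG_DIM, 512), nn.ReLU(),
                          nn.Linear(512, 512), nn.ReLU(),
                          nn.Linear(512, NUM_CLASSES))
    opt = torch.optim.SGD(model.parameters(), lr=0.01, momentum=0.9)
    for _ in range(epochs):
        opt.zero_grad()
        F.cross_entropy(model(synth_x), synth_y).backward()
        opt.step()
    return model

if __name__ == "__main__":
    clients = [small_client_model() for _ in range(3)]  # placeholders; in FL these arrive trained from clients
    generator = Generator()                             # placeholder; in practice a pre-trained generative model
    synth_x, synth_y = distill_on_server(clients, generator)
    global_model = train_global_model(synth_x, synth_y)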


Published In

Lecture Notes in Computer Science

DOI

10.1007/978-3-031-73229-4_2

EISSN

1611-3349

ISSN

0302-9743

Publication Date

January 1, 2025

Volume

15136 LNCS

Start / End Page

18 / 33

Related Subject Headings

  • Artificial Intelligence & Image Processing
  • 46 Information and computing sciences
 

Citation

APA: Jia, Y., Vahidian, S., Sun, J., Zhang, J., Kungurtsev, V., Gong, N. Z., & Chen, Y. (2025). Unlocking the Potential of Federated Learning: The Symphony of Dataset Distillation via Deep Generative Latents. In Lecture Notes in Computer Science (Vol. 15136 LNCS, pp. 18–33). https://doi.org/10.1007/978-3-031-73229-4_2
Chicago: Jia, Y., S. Vahidian, J. Sun, J. Zhang, V. Kungurtsev, N. Z. Gong, and Y. Chen. “Unlocking the Potential of Federated Learning: The Symphony of Dataset Distillation via Deep Generative Latents.” In Lecture Notes in Computer Science, 15136 LNCS:18–33, 2025. https://doi.org/10.1007/978-3-031-73229-4_2.
ICMJE: Jia Y, Vahidian S, Sun J, Zhang J, Kungurtsev V, Gong NZ, et al. Unlocking the Potential of Federated Learning: The Symphony of Dataset Distillation via Deep Generative Latents. In: Lecture Notes in Computer Science. 2025. p. 18–33.
MLA: Jia, Y., et al. “Unlocking the Potential of Federated Learning: The Symphony of Dataset Distillation via Deep Generative Latents.” Lecture Notes in Computer Science, vol. 15136 LNCS, 2025, pp. 18–33. Scopus, doi:10.1007/978-3-031-73229-4_2.
NLM: Jia Y, Vahidian S, Sun J, Zhang J, Kungurtsev V, Gong NZ, Chen Y. Unlocking the Potential of Federated Learning: The Symphony of Dataset Distillation via Deep Generative Latents. Lecture Notes in Computer Science. 2025. p. 18–33.
