EFFGAN: Ensembles of fine-tuned federated GANs

Images generated by (a) FedGAN [Rasouli et al. (2020)] and (b) our EFFGAN, trained on the FashionMNIST dataset. The rows correspond to models trained with local epochs E = 1, 5, 10, 20, 50 (top to bottom).

Generative adversarial networks have proven to be a powerful tool for learning complex and high-dimensional data distributions, but issues such as mode collapse have been shown to make them difficult to train. This problem is even harder when the data is decentralized over several clients in a federated learning setup, as problems such as client drift and non-iid data make it hard for federated averaging to converge. In this work, we study how to learn a data distribution when training data is heterogeneously decentralized over clients and cannot be shared. Our goal is to sample from this distribution centrally, while the data never leaves the clients. Using standard benchmark image datasets, we show that existing approaches fail in this setting, experiencing so-called client drift when the local number of epochs becomes too large. We thus propose a novel approach we call EFFGAN: Ensembles of fine-tuned federated GANs. Being an ensemble of local expert generators, EFFGAN is able to learn the data distribution over all clients and mitigate client drift. It is able to train with a large number of local epochs, making it more communication efficient than previous works.
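The core mechanism is simple enough to sketch. Below is a minimal, hypothetical PyTorch illustration, under the assumption that a FedAvg-pretrained generator is copied to each client and fine-tuned locally (training loops omitted), after which the server samples by drawing one expert generator uniformly at random per image. The `Generator` architecture, `LATENT_DIM`, and all function names are illustrative assumptions, not the paper's actual code.

```python
import torch
import torch.nn as nn

LATENT_DIM = 100  # illustrative latent dimensionality


class Generator(nn.Module):
    """Toy generator for 28x28 grayscale images (e.g. FashionMNIST)."""

    def __init__(self, latent_dim: int = LATENT_DIM):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(latent_dim, 256), nn.ReLU(),
            nn.Linear(256, 28 * 28), nn.Tanh(),
        )

    def forward(self, z: torch.Tensor) -> torch.Tensor:
        return self.net(z).view(-1, 1, 28, 28)


def fedavg(state_dicts: list[dict]) -> dict:
    """Plain federated averaging of client generator weights,
    used during the pretraining phase before local fine-tuning."""
    return {
        key: sum(sd[key] for sd in state_dicts) / len(state_dicts)
        for key in state_dicts[0]
    }


def sample_effgan(generators: list[Generator], n_samples: int) -> torch.Tensor:
    """Sample from the ensemble: for each image, pick one fine-tuned
    expert generator uniformly at random and decode a fresh latent."""
    idx = torch.randint(len(generators), (n_samples,))
    z = torch.randn(n_samples, LATENT_DIM)
    return torch.stack(
        [generators[i](z[j : j + 1]).squeeze(0) for j, i in enumerate(idx)]
    )


# Usage sketch: pretend these are client copies of the FedAvg-pretrained
# generator after local fine-tuning (the local GAN training is omitted here).
clients = [Generator() for _ in range(5)]
with torch.no_grad():
    images = sample_effgan(clients, n_samples=16)
print(images.shape)  # torch.Size([16, 1, 28, 28])
```

Drawing the expert uniformly per sample makes the ensemble's output distribution a uniform mixture of the clients' local generator distributions, so the ensemble can cover all clients' data without a single set of averaged weights having to represent it alone.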

Ebba Ekblom, Edvin Listo Zec, Olof Mogren

IEEE International Conference on Big Data
PDF fulltext
arXiv: 2206.11682
BibTeX

Olof Mogren, PhD, RISE Research Institutes of Sweden.