
Specialized federated learning using a mixture of experts

The proposed framework visualized.

Federated learning has received attention for its efficiency and privacy benefits in settings where data is distributed among devices. Although federated learning shows significant promise as a key approach when data cannot be shared or centralized, current incarnations show limited privacy properties and have shortcomings when applied to common real-world scenarios. One such scenario is heterogeneous data among devices, where data may come from different generating distributions. In this paper, we propose a federated learning framework using a mixture of experts to balance the specialist nature of a locally trained model with the generalist knowledge of a global model in a federated learning setting. Our results show that the mixture of experts model is better suited as a personalized model for devices when data is heterogeneous, outperforming both global and local models. Furthermore, our framework gives strict privacy guarantees, which allows clients to select parts of their data that may be excluded from the federation. The evaluation shows that the proposed solution is robust to the setting where some users require a strict privacy setting and do not disclose their models to a central server at all, opting out of the federation partially or entirely. The proposed framework is general enough to include any kind of machine learning model, and can even combine models of different kinds.
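The abstract describes combining a locally trained specialist model with the federated global model through a mixture of experts. Below is a minimal PyTorch sketch of that general idea, where a gating network weighs the two models per example; the class name PersonalizedMoE, the gating architecture, and the layer sizes are illustrative assumptions, not the paper's exact design.

```python
import torch
import torch.nn as nn


class PersonalizedMoE(nn.Module):
    """Mixture of a local specialist and a federated global model.

    A gating network maps each input to a weight in [0, 1] that decides
    how much to trust the local expert versus the global model.
    """

    def __init__(self, local_expert: nn.Module, global_model: nn.Module, in_dim: int):
        super().__init__()
        self.local_expert = local_expert   # trained only on the client's own data
        self.global_model = global_model   # received from the federation
        self.gate = nn.Sequential(nn.Linear(in_dim, 1), nn.Sigmoid())

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        g = self.gate(x)  # shape (batch, 1), broadcasts over output dimensions
        return g * self.local_expert(x) + (1.0 - g) * self.global_model(x)


# Usage with hypothetical sizes: 10-dimensional inputs, 3 output classes.
local = nn.Linear(10, 3)
glob = nn.Linear(10, 3)
moe = PersonalizedMoE(local, glob, in_dim=10)
logits = moe(torch.randn(4, 10))
```

In this sketch the global model's weights would come from the federation, while the local expert and the gate stay on the device, which is one way a client could keep part of its data out of the shared training described above.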

Edvin Listo Zec, Olof Mogren, John Martinsson, Leon René Sütfeld, Daniel Gillblad

arXiv preprint, 2020

Olof Mogren, PhD, RISE Research Institutes of Sweden.