Dynamic Sampling and Selective Masking for Communication-Efficient Federated Learning

Shaoxiong Ji, Wenqi Jiang, Anwar Walid, Xue Li

Research output: Journal article › Article › Scientific › peer-reviewed

Abstract

Federated learning (FL) is a novel machine learning setting that enables on-device intelligence via decentralized training and federated optimization. The rapid development of deep neural networks advances learning techniques for modeling complex problems and gives rise to federated deep learning under the federated setting. However, the tremendous number of model parameters places a heavy transmission load on the communication network. This article introduces two approaches for improving communication efficiency: dynamic sampling and top-k selective masking. The former dynamically controls the fraction of client models selected in each round, while the latter selects the parameters with the top-k largest difference values for federated updating. Experiments on convolutional image classification and recurrent language modeling, conducted on three public datasets, demonstrate the effectiveness of the proposed methods.
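The abstract describes the two mechanisms only at a high level. The sketch below is a minimal illustration of the general ideas: zeroing out all but the k largest-magnitude entries of a client update (top-k selective masking) and varying the fraction of sampled clients across rounds (dynamic sampling). The function names, the linear annealing schedule, and the FedAvg-style aggregation are assumptions made for illustration and are not taken from the paper.

```python
import numpy as np

def top_k_mask(delta: np.ndarray, k: int) -> np.ndarray:
    """Keep only the k entries of a parameter update with the largest
    absolute difference; zero out the rest before transmission."""
    flat = delta.ravel()
    if k >= flat.size:
        return delta
    keep = np.argpartition(np.abs(flat), -k)[-k:]  # indices of k largest |delta|
    masked = np.zeros_like(flat)
    masked[keep] = flat[keep]
    return masked.reshape(delta.shape)

def dynamic_client_fraction(round_idx: int, total_rounds: int,
                            start: float = 0.5, end: float = 0.1) -> float:
    """Linearly anneal the fraction of sampled clients over training.
    (Assumed schedule; the paper's exact control rule may differ.)"""
    t = round_idx / max(total_rounds - 1, 1)
    return start + (end - start) * t

# Toy server loop: sample a dynamic fraction of clients and aggregate
# their sparsified updates in a FedAvg-style manner.
rng = np.random.default_rng(0)
num_clients, dim, rounds = 20, 1000, 5
global_weights = np.zeros(dim)
for r in range(rounds):
    frac = dynamic_client_fraction(r, rounds)
    selected = rng.choice(num_clients, size=max(1, int(frac * num_clients)),
                          replace=False)
    updates = []
    for _ in selected:
        local_delta = rng.normal(size=dim)          # stand-in for a real local update
        updates.append(top_k_mask(local_delta, k=dim // 10))
    global_weights += np.mean(updates, axis=0)      # average of masked updates
```

Masking the update rather than the weights keeps the transmitted tensor mostly zeros, so only the retained indices and values need to be sent, which is where the communication savings would come from in such a scheme.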

Original language: English
Journal: IEEE Intelligent Systems
Volume: 37
Issue number: 2
Pages: 27-34
Number of pages: 8
ISSN: 1541-1672
DOI - permanent links
Status: Published - 2022
Published externally: Yes
OKM publication type: A1 Original article in a scientific journal, peer-reviewed

Additional information

Publisher Copyright:
© 2001-2011 IEEE.

Fields of science

  • 113 Computer and information sciences
