Speaker
Description
Federated learning aims to train machine learning models on data stored across many personal devices. The key idea is that the data is never transferred to a central location; instead, models are trained locally and then aggregated centrally. Our research aims to reduce the burden on the central cloud component by using local communication at the edge: devices exchange information directly with one another and perform the aggregation step among themselves in a decentralized manner. Beyond the decentralized learning algorithm, we introduce additional techniques to reduce communication, such as subsampling and compression. We demonstrate that our approach is comparable to the centralized version in terms of convergence speed as a function of the amount of information exchanged.
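
To make the idea concrete, below is a minimal sketch of the kind of decentralized, gossip-style aggregation with subsampling and compression that the description outlines. All names here (gossip_round, subsample, quantize) and the specific choices (random pairwise averaging, uniform scalar quantization) are illustrative assumptions for exposition, not the authors' actual algorithm or implementation.

```python
import numpy as np

def subsample(params, fraction, rng):
    """Pick a random subset of coordinates to transmit (subsampling)."""
    k = max(1, int(fraction * params.size))
    idx = rng.choice(params.size, size=k, replace=False)
    return idx, params[idx]

def quantize(values, levels=256):
    """Uniform scalar quantization to cut per-value bandwidth (compression)."""
    lo, hi = values.min(), values.max()
    if hi == lo:
        return values.copy()
    step = (hi - lo) / (levels - 1)
    return lo + np.round((values - lo) / step) * step

def gossip_round(models, fraction, rng):
    """One decentralized aggregation round: devices are paired at random,
    exchange a subsampled and quantized slice of their model vectors,
    and average the shared coordinates in place -- no central server."""
    order = rng.permutation(len(models))
    for i, j in zip(order[0::2], order[1::2]):
        idx, vals_i = subsample(models[i], fraction, rng)
        vals_j = models[j][idx]
        avg = (quantize(vals_i) + quantize(vals_j)) / 2.0
        models[i][idx] = avg
        models[j][idx] = avg

# Toy run: 10 devices, each holding a 100-dimensional local model.
# Repeated gossip rounds should drive every model toward the common mean,
# approximating what a central aggregator would compute.
rng = np.random.default_rng(0)
models = [rng.normal(size=100) for _ in range(10)]
target = np.mean(models, axis=0)
for _ in range(200):
    gossip_round(models, fraction=0.2, rng=rng)
spread = max(np.linalg.norm(m - target) for m in models)
print(f"max distance to the initial mean after gossiping: {spread:.4f}")
```

In this sketch, the communication cost per round is controlled by two knobs: fraction bounds how many coordinates are sent (subsampling), and levels bounds how many bits each coordinate needs (compression), which is one plausible way to trade convergence speed against the amount of information exchanged.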