Federated Learning in Large-Scale Applications

Master Thesis

Google uses federated learning to build machine learning models from data that remains distributed across users' devices. Each user trains the model locally on their own data and sends only the model updates to the server, which aggregates all updates to optimize the global model. This technique is prone to several attacks that aim at revealing information about individual users. Currently, all aggregation operations happen on Google's servers, and the scalability of federated learning is questionable, since millions of users may take part in the training process.

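To make the train-locally-then-aggregate loop concrete, the sketch below shows one round of federated averaging in Python. It is a minimal illustration only: the synthetic client data, the least-squares local objective, and all function names are assumptions for the example, not part of Google's actual implementation.

import numpy as np

def local_update(global_weights, client_data, lr=0.1, epochs=1):
    # Hypothetical local training: one gradient step per epoch on a
    # least-squares objective, standing in for the client's real optimizer.
    w = global_weights.copy()
    X, y = client_data
    for _ in range(epochs):
        grad = X.T @ (X @ w - y) / len(y)
        w -= lr * grad
    return w

def federated_round(global_weights, clients):
    # Server-side aggregation: weighted average of the client updates,
    # with weights proportional to each client's number of samples.
    updates, sizes = [], []
    for X, y in clients:
        updates.append(local_update(global_weights, (X, y)))
        sizes.append(len(y))
    sizes = np.array(sizes, dtype=float)
    return sum(w * (n / sizes.sum()) for w, n in zip(updates, sizes))

# Example: three clients sharing a linear model with 5 parameters.
rng = np.random.default_rng(0)
clients = [(rng.normal(size=(20, 5)), rng.normal(size=20)) for _ in range(3)]
weights = np.zeros(5)
for _ in range(10):
    weights = federated_round(weights, clients)

In this sketch the server is the single point that sees every update, which is exactly the centralized setting whose scalability and privacy implications the thesis examines.
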
In this thesis, we aim at:

• Analyzing the scalability of federated learning in the centralized setting

• Proposing a decentralized federated learning approach that allows users to participate in the aggregation operations, thus improving the protection of their data.