SUPR
Federated learning and decentralised AI
Dnr:

NAISS 2023/7-3

Type:

SSC

Principal Investigator:

Addi Ait-Mlouk

Affiliation:

Högskolan i Skövde

Start Date:

2023-01-30

End Date:

2024-02-01

Primary Classification:

10201: Computer Sciences

Secondary Classification:

10205: Software Engineering

Tertiary Classification:

10299: Other Computer and Information Science

Allocation

Abstract

In domains where data is sensitive or private, there is great value in methods that can learn in a distributed manner without the data ever leaving the local devices. Federated Learning (FL) has recently been proposed as a solution to this collaborative machine learning and data privacy problem. In federated learning, multiple parties (cross-device or cross-silo) collaborate to train new models while keeping their data local and private. Instead of moving data to a central storage system or cloud for model training, code is moved to the data owners’ local sites, and incremental local updates are combined into a global model. In this way, only model parameters are shared: training proceeds through entirely local model updates on the private data nodes, which then send their updates to a central server that combines them (for example, by averaging model parameters) into a global model. Because the data never leaves its local storage while the global model is trained, FL enhances data privacy and reduces the risk of eavesdropping to a certain extent. Hence, more users will be willing to take part in collaborative model training, and better inference models can be built.
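The parameter-averaging scheme described above can be sketched in a few lines. This is a minimal, self-contained simulation, not the project's actual implementation: the names (`local_update`, `federated_averaging`) and the toy least-squares objective are illustrative assumptions, with one gradient step standing in for local training on each node's private data.

```python
import numpy as np

# Hypothetical local update: each node trains on its private data and
# returns updated parameters. One gradient-descent step on a simple
# least-squares objective stands in for local training here.
def local_update(params, X, y, lr=0.1):
    grad = 2 * X.T @ (X @ params - y) / len(y)  # MSE gradient
    return params - lr * grad

def federated_averaging(global_params, node_data, rounds=100):
    """Combine purely local updates into a global model by averaging
    parameters, weighted by each node's data size (FedAvg-style)."""
    for _ in range(rounds):
        updates, sizes = [], []
        for X, y in node_data:  # raw data never leaves the node
            updates.append(local_update(global_params.copy(), X, y))
            sizes.append(len(y))
        weights = np.array(sizes) / sum(sizes)
        # Server step: weighted average of the nodes' parameter vectors
        global_params = sum(w * u for w, u in zip(weights, updates))
    return global_params

rng = np.random.default_rng(0)
true_w = np.array([2.0, -1.0])
node_data = []
for _ in range(3):  # three private data silos
    X = rng.normal(size=(50, 2))
    y = X @ true_w + 0.01 * rng.normal(size=50)
    node_data.append((X, y))

w = federated_averaging(np.zeros(2), node_data)
print(w)  # approaches true_w although no raw data was pooled
```

Only the parameter vectors cross the network in each round; the server never sees `X` or `y`, which is the privacy property the abstract relies on.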