SUPR
One Shot Federated Learning with Diffusion Models
Dnr:

NAISS 2025/22-610

Type:

NAISS Small Compute

Principal Investigator:

Obaidullah Zaland

Affiliation:

UmeĂĄ universitet

Start Date:

2025-04-16

End Date:

2026-05-01

Primary Classification:

10210: Artificial Intelligence

Webpage:

Allocation

Abstract

Federated learning (FL) enables collaborative learning without centralizing data, but it incurs significant communication costs because of the many communication rounds between clients and the server. One-shot federated learning (OSFL) addresses this by forming a global model in a single communication round, typically relying on server-side model distillation or auxiliary dataset generation, most often via pre-trained diffusion models (DMs). Existing DM-assisted OSFL methods, however, usually employ classifier-guided DMs, which require training an auxiliary classifier at each client and thus introduce additional computational overhead. In this work, we will explore the effectiveness of diffusion models with classifier-free guidance for OSFL. In addition, we will investigate the impact of DM-assisted OSFL in incremental federated learning.
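
To illustrate the distinction the abstract draws, the following is a minimal sketch (not the project's implementation; it assumes PyTorch and uses a toy stand-in for the noise-prediction network) of classifier-free guidance: the conditional and unconditional predictions of a single diffusion model are combined, so no auxiliary classifier gradient, and hence no per-client classifier training, is needed.

    # Sketch only: classifier-free guidance combines conditional and
    # unconditional noise predictions from one model,
    # eps_hat = eps_uncond + w * (eps_cond - eps_uncond),
    # instead of using the gradient of an auxiliary classifier.
    import torch

    class ToyEpsModel(torch.nn.Module):
        """Toy stand-in noise predictor; a real DM would use a conditional U-Net with timesteps."""
        def __init__(self, dim=16, num_classes=10):
            super().__init__()
            # The last embedding index serves as the "null" (unconditional) label.
            self.embed = torch.nn.Embedding(num_classes + 1, dim)
            self.net = torch.nn.Linear(2 * dim, dim)

        def forward(self, x_t, label):
            return self.net(torch.cat([x_t, self.embed(label)], dim=-1))

    def cfg_epsilon(model, x_t, label, null_label, guidance_scale=3.0):
        """Classifier-free guidance for one denoising step."""
        eps_cond = model(x_t, label)
        eps_uncond = model(x_t, null_label)
        return eps_uncond + guidance_scale * (eps_cond - eps_uncond)

    model = ToyEpsModel()
    x_t = torch.randn(4, 16)              # noisy samples at some timestep
    labels = torch.tensor([0, 1, 2, 3])   # class conditions (e.g. labels reported by clients)
    null = torch.full_like(labels, 10)    # index of the unconditional ("null") embedding
    eps_hat = cfg_epsilon(model, x_t, labels, null)
    print(eps_hat.shape)                  # torch.Size([4, 16])

In a DM-assisted OSFL setting, such guided sampling would run on the server to generate auxiliary data for training the global model; the hypothetical names above (ToyEpsModel, cfg_epsilon) are for illustration only.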