SUPR
Communication-efficient DML using temporal correlations.
Dnr:

NAISS 2025/22-165

Type:

NAISS Small Compute

Principal Investigator:

Adrian Edin

Affiliation:

Linköpings universitet

Start Date:

2025-02-13

End Date:

2025-06-01

Primary Classification:

20203: Communication Systems

Webpage:

Allocation

Abstract

Distributed machine learning (DML) enables the scalable processing of large datasets and the training of complex models by distributing computations across multiple nodes or machines. However, communication between the distributed agents and a central parameter server becomes a bottleneck for training efficiency and learning performance, especially when the communication links run over wireless networks. This project is part of an effort to find methods that increase the communication and energy efficiency of DML systems, for example through data compression.
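As a hedged illustration of the kind of data compression the abstract alludes to (this is a generic sketch, not the project's actual method), a common scheme in DML is top-k gradient sparsification: each worker transmits only the k largest-magnitude entries of its gradient as (index, value) pairs, reducing uplink traffic from O(d) to O(k) per round.

```python
# Illustrative sketch only: top-k gradient sparsification, a standard
# compression technique in distributed learning. All names here are
# made up for the example; nothing is taken from the project itself.

def compress_topk(grad, k):
    """Keep the k largest-magnitude entries; return (index, value) pairs."""
    idx = sorted(range(len(grad)), key=lambda i: abs(grad[i]), reverse=True)[:k]
    return [(i, grad[i]) for i in idx]

def decompress(pairs, d):
    """Rebuild a dense d-dimensional gradient from the sparse pairs."""
    dense = [0.0] * d
    for i, v in pairs:
        dense[i] = v
    return dense

grad = [0.02, -1.5, 0.3, 0.0, 4.0, -0.1]
pairs = compress_topk(grad, k=2)           # only 2 of 6 entries are sent
restored = decompress(pairs, d=len(grad))  # server-side reconstruction
```

Temporal correlation between successive gradients, as hinted at in the project title, could further help such schemes, e.g. by compressing the difference between consecutive updates instead of the raw gradient.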