SUPR
Equivariant deep learning
Dnr:

NAISS 2024/22-8

Type:

NAISS Small Compute

Principal Investigator:

Axel Flinth

Affiliation:

UmeƄ universitet

Start Date:

2024-02-01

End Date:

2025-02-01

Primary Classification:

10105: Computational Mathematics

Webpage:

Allocation

Abstract

In certain learning tasks, there are obvious symmetries in the data and/or the learning task. As an example, a graph does not change when its nodes are relabeled. There is reason to believe that a machine learning system should benefit from exploiting these symmetries, or equivariances. The aim of this project is to understand how equivariant deep neural network models function from a mathematical perspective. Of particular interest is how their training differs from that of their non-equivariant counterparts. Our interests are mainly theoretical, and the networks we will train will typically be relatively small. Performing the experiments on the clusters is still appropriate, since the experiments may be unwieldy to run on our local machines (e.g. when repeating experiments many times).
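
To make the symmetry mentioned above concrete, the following minimal sketch (not part of the proposal itself) checks in NumPy that a simple message-passing graph layer is permutation equivariant: relabeling the nodes of the input graph permutes the output features in the same way. The layer and all variable names are illustrative assumptions, not the networks studied in the project.

    # Minimal sketch: permutation equivariance of a simple graph layer.
    # All names here are illustrative assumptions.
    import numpy as np

    rng = np.random.default_rng(0)

    def message_passing_layer(A, X, W):
        """One graph layer: aggregate neighbour features, then apply a linear map."""
        return np.tanh((A + np.eye(len(A))) @ X @ W)

    n, d = 5, 3
    A = rng.integers(0, 2, size=(n, n))
    A = np.triu(A, 1); A = A + A.T          # random undirected adjacency matrix
    X = rng.normal(size=(n, d))             # node features
    W = rng.normal(size=(d, d))             # layer weights

    P = np.eye(n)[rng.permutation(n)]       # random permutation (node relabeling)

    out = message_passing_layer(A, X, W)
    out_relabeled = message_passing_layer(P @ A @ P.T, P @ X, W)

    # Equivariance: relabeling the input relabels the output identically.
    assert np.allclose(P @ out, out_relabeled)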