Gauge Equivariant Convolutional Neural Networks

Dnr:

NAISS 2023/5-393

Type:

NAISS Medium Compute

Principal Investigator:

Daniel Persson

Affiliation:

Chalmers tekniska högskola

Start Date:

2023-10-01

End Date:

2024-10-01

Primary Classification:

10799: Other Natural Sciences not elsewhere specified

Despite the overwhelming success of deep neural networks, we are still at a loss to explain exactly how deep learning works, and why it works so well. What is the mathematical framework underlying deep learning? One promising direction is to consider symmetries as an underlying design principle for network architectures. This can be implemented by constructing deep neural networks on a group G that acts transitively on the input data. This is directly relevant, for instance, in the case of spherical signals, where G is a rotation group.
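The group-convolution idea can be illustrated with a minimal sketch (not part of the proposal itself): a lifting convolution over the cyclic rotation group C4 acting on planar images. Correlating the input with all four rotated copies of a filter yields a feature map indexed by the group, and rotating the input corresponds to rotating each map and cyclically permuting the rotation channels. All names below are illustrative.

```python
import numpy as np

def corr2d(x, k):
    # plain "valid" 2D cross-correlation
    H, W = x.shape
    h, w = k.shape
    out = np.zeros((H - h + 1, W - w + 1))
    for i in range(out.shape[0]):
        for j in range(out.shape[1]):
            out[i, j] = np.sum(x[i:i + h, j:j + w] * k)
    return out

def lifting_conv(x, psi):
    # correlate with the four 90-degree rotations of the filter:
    # output is a function on the group C4 x Z^2
    return np.stack([corr2d(x, np.rot90(psi, g)) for g in range(4)])

rng = np.random.default_rng(0)
x = rng.standard_normal((8, 8))     # toy square image
psi = rng.standard_normal((3, 3))   # toy square filter

out = lifting_conv(x, psi)
out_rot = lifting_conv(np.rot90(x), psi)

# equivariance: rotating the input by 90 degrees equals rotating each
# feature map and cyclically shifting the rotation channel
expected = np.stack([np.rot90(out[(g - 1) % 4]) for g in range(4)])
assert np.allclose(out_rot, expected)
```

The same construction generalizes from C4 on the plane to continuous rotation groups acting on the sphere, which is the setting the project targets.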
Even more generally, it is natural to consider the question of how to train neural networks in the case of "non-Euclidean data". Relevant applications include omnidirectional computer vision, biomedicine, and climate observations, just to mention a few situations where data is naturally "non-flat". Mathematically, this calls for developing a theory of deep learning on manifolds, or even more exotic structures, like graphs or algebraic varieties.
The aim of this project is to use techniques and theorems from mathematics and physics to develop a general framework for efficiently applying convolutional neural networks (CNNs) to non-Euclidean data. The formalism will be applied to concrete problems arising in autonomous driving, where such a framework is highly desirable. In particular, it will be applied to image recognition and object detection from fisheye cameras, as well as to interpolated point clouds arising from the lidars mounted on the self-driving vehicle. This part of the project will be pursued in collaboration with Zenseact.
This project is a continuation of the ongoing project (SNIC 2022/5-207), which resulted in the following publications:
https://arxiv.org/abs/2105.13926
https://arxiv.org/abs/2105.05400
https://arxiv.org/abs/2202.03990
https://arxiv.org/abs/2307.07313