SUPR
Numerical methods for optimization problems arising in machine learning
Dnr:

NAISS 2024/22-256

Type:

NAISS Small Compute

Principal Investigator:

Måns Williamson

Affiliation:

Lunds universitet

Start Date:

2024-02-28

End Date:

2025-03-01

Primary Classification:

10105: Computational Mathematics

Abstract

The project aims to investigate how numerical methods for solving differential equations can be applied to large-scale optimization problems, such as training neural networks. An example of this is the stochastic gradient descent (SGD) algorithm, which can be viewed as a stochastic version of the explicit Euler scheme applied to the gradient flow equation. Convergence results for such methods are typically stated in expectation, and verifying them numerically requires heavy Monte Carlo simulation over a large number of random seeds to obtain good estimates of the expectation.
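To illustrate the connection described above, the following minimal sketch (not the project's actual code) treats SGD as a stochastic explicit Euler step on the gradient flow θ' = -∇f(θ), and estimates the expected final objective by averaging over random seeds. The quadratic objective f(θ) = ½‖θ‖², the step size, the noise model, and the seed count are all illustrative assumptions.

```python
import numpy as np

def grad(theta):
    # Gradient of the toy objective f(theta) = 0.5 * ||theta||^2 (an assumption,
    # standing in for the gradient of a neural-network loss)
    return theta

def sgd(theta0, h, n_steps, noise_scale, rng):
    # Stochastic explicit Euler discretization of the gradient flow
    # theta' = -grad f(theta):  theta_{k+1} = theta_k - h * (grad f(theta_k) + noise),
    # where the additive Gaussian noise models the stochastic gradient estimate.
    theta = theta0.copy()
    for _ in range(n_steps):
        noise = noise_scale * rng.standard_normal(theta.shape)
        theta = theta - h * (grad(theta) + noise)
    return theta

# Monte Carlo estimate over random seeds: convergence statements are in
# expectation, so we average the final objective across many independent runs.
theta0 = np.ones(10)
losses = []
for seed in range(200):
    rng = np.random.default_rng(seed)
    theta = sgd(theta0, h=0.1, n_steps=100, noise_scale=0.5, rng=rng)
    losses.append(0.5 * np.dot(theta, theta))

print(np.mean(losses))  # Monte Carlo estimate of E[f(theta_K)]
```

With `noise_scale = 0` this reduces to the deterministic explicit Euler scheme for the gradient flow; the loop over seeds is exactly the kind of repeated simulation that makes estimating such expectations computationally heavy.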