The project aims to investigate how numerical methods for solving differential equations can be applied to large-scale optimization problems, such as training neural networks. A canonical example is stochastic gradient descent (SGD), which can be viewed as a stochastic version of the explicit Euler scheme applied to the gradient flow equation. Convergence results for such stochastic methods are typically stated in expectation, so verifying them empirically requires Monte Carlo simulation over a large number of random seeds to obtain accurate estimates of that expectation.
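To make the correspondence concrete: explicit Euler with step size h applied to the gradient flow dθ/dt = −∇f(θ) gives the update θ_{k+1} = θ_k − h ∇f(θ_k), and replacing the exact gradient with an unbiased noisy estimate turns this into SGD with learning rate h. The sketch below illustrates both this view and the Monte Carlo averaging over seeds; the additive Gaussian noise, the toy quadratic objective, and all parameter values are illustrative assumptions, not part of the project specification.

```python
import numpy as np

def sgd_euler(grad, theta0, h, n_steps, rng):
    """Explicit Euler on the gradient flow d(theta)/dt = -grad(theta),
    with the exact gradient replaced by a noisy estimate (i.e. SGD)."""
    theta = np.array(theta0, dtype=float)
    for _ in range(n_steps):
        # Stochastic gradient: exact gradient plus illustrative Gaussian noise.
        g = grad(theta) + rng.normal(scale=0.1, size=theta.shape)
        theta = theta - h * g  # one Euler / SGD step with step size h
    return theta

# Toy quadratic objective f(theta) = ||theta||^2 / 2, so grad f = theta.
f = lambda th: 0.5 * np.sum(th**2)
grad_f = lambda th: th

# Monte Carlo estimate of E[f(theta_K)] over independent random seeds.
losses = [
    f(sgd_euler(grad_f, [1.0, -2.0], h=0.1, n_steps=200,
                rng=np.random.default_rng(seed)))
    for seed in range(100)
]
print("mean loss:", np.mean(losses),
      "std. error:", np.std(losses) / np.sqrt(len(losses)))
```

The standard error printed at the end shrinks only like 1/√(number of seeds), which is why accurate estimates of expected behavior demand many repeated runs.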