Solving Kernel Ridge Regression with Gradient-Based Optimization Methods
Dnr:

NAISS 2025/22-1098

Type:

NAISS Small Compute

Principal Investigator:

Oskar Allerbo

Affiliation:

Kungliga Tekniska högskolan

Start Date:

2025-09-01

End Date:

2026-09-01

Primary Classification:

10106: Probability Theory and Statistics (Statistics with medical aspects at 30118 and with social aspects at 50907)

Webpage:

Abstract

Kernel ridge regression is usually solved via its closed-form expression, which requires inverting a generally quite large matrix. This inversion can be avoided by instead solving the problem iteratively, using gradient-based optimization methods. Besides the reduced computational cost, this approach also makes it possible to change the kernel during training, which can give rise to both benign overfitting and double descent. This is investigated in https://arxiv.org/abs/2311.01762. The article is in its final stages but would benefit from additional experiments, including bootstrap confidence intervals, which more computational resources would greatly facilitate.
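To illustrate the idea (this is a minimal sketch, not the project's actual code), the snippet below fits kernel ridge regression on synthetic data in two ways: via the closed-form solve of (K + λI)α = y, and via plain gradient descent on an equivalent strongly convex quadratic whose unique minimizer is the same α. The RBF kernel, data, and step-size rule are all assumptions chosen for the example; each gradient step costs only a matrix-vector product, avoiding the O(n³) inversion.

```python
import numpy as np

# Synthetic 1-D regression data (illustrative assumption).
rng = np.random.default_rng(0)
n = 50
X = rng.uniform(-3, 3, size=(n, 1))
y = np.sin(X[:, 0]) + 0.1 * rng.standard_normal(n)

def rbf_kernel(A, B, bandwidth=1.0):
    sq_dists = ((A[:, None, :] - B[None, :, :]) ** 2).sum(axis=-1)
    return np.exp(-sq_dists / (2 * bandwidth**2))

K = rbf_kernel(X, X)
lam = 0.1

# Closed form: alpha = (K + lam*I)^(-1) y  -- an O(n^3) solve/inversion.
alpha_closed = np.linalg.solve(K + lam * np.eye(n), y)

# Iterative alternative: gradient descent on the quadratic
#   J(alpha) = 0.5 * alpha^T (K + lam*I) alpha - y^T alpha,
# whose gradient is (K + lam*I) alpha - y, so the minimizer solves
# the same linear system; each step is only a matrix-vector product.
A = K + lam * np.eye(n)
step = 1.0 / np.trace(A)  # safe step: the trace bounds the largest eigenvalue
alpha = np.zeros(n)
for _ in range(20000):
    alpha -= step * (A @ alpha - y)

print(np.max(np.abs(alpha - alpha_closed)))  # small: iterates match the closed form
```

In practice one would use conjugate gradients or momentum rather than plain gradient descent, but the structure is the same, and it is this iterative formulation that allows the kernel to be changed between steps during training.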