Solving Kernel Ridge Regression with Gradient-Based Optimization Methods
Dnr:

NAISS 2024/22-971

Type:

NAISS Small Compute

Principal Investigator:

Oskar Allerbo

Affiliation:

Kungliga Tekniska högskolan

Start Date:

2024-08-19

End Date:

2025-09-01

Primary Classification:

10106: Probability Theory and Statistics

Webpage:

Abstract

Kernel ridge regression is usually solved via its closed-form expression, which requires inverting a generally quite large matrix. This matrix inversion can be avoided by instead solving the problem iteratively, using gradient-based optimization methods. Apart from the reduced computational cost, this approach also makes it possible both to use penalties other than the ridge penalty and to change the kernel during training. Penalties other than ridge are investigated in https://arxiv.org/abs/2306.16838, where we mainly focus on robust kernel regression, obtained by replacing the ridge norm with the infinity norm. Changing the kernel during training is investigated in https://arxiv.org/abs/2311.01762. Both articles are in their final stages but would benefit from additional experiments, including bootstrap confidence intervals, something that would be greatly facilitated by more computational resources.
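
As a minimal sketch of the idea (not code from the papers: the Gaussian kernel, the synthetic data, and all parameter values below are illustrative assumptions), the closed-form solution and a plain gradient-descent solver of the same ridge-penalized objective can be compared as follows.

import numpy as np

def rbf_kernel(X1, X2, bandwidth=1.0):
    # Gaussian (RBF) kernel matrix; the kernel choice is an assumption here
    sq_dists = ((X1[:, None, :] - X2[None, :, :]) ** 2).sum(-1)
    return np.exp(-sq_dists / (2 * bandwidth ** 2))

rng = np.random.default_rng(0)
X = rng.uniform(-2, 2, size=(100, 1))
y = np.sin(2 * X[:, 0]) + 0.1 * rng.standard_normal(100)

K = rbf_kernel(X, X)
lam = 0.1  # ridge penalty strength (illustrative value)

# Closed-form solution: alpha = (K + lam * I)^{-1} y, an O(n^3) linear solve
alpha_closed = np.linalg.solve(K + lam * np.eye(len(y)), y)

# Iterative alternative: gradient descent on
#   L(alpha) = ||y - K alpha||^2 / 2 + (lam / 2) * alpha^T K alpha,
# whose gradient is (K K + lam K) alpha - K y.
M = K @ K + lam * K                # Hessian of L (symmetric positive definite)
b = K @ y
step = 1.0 / np.linalg.norm(M, 2)  # safe step size: 1 / largest eigenvalue
alpha = np.zeros(len(y))
for _ in range(50_000):
    alpha -= step * (M @ alpha - b)

# The fitted values from the two approaches should agree closely
print(np.max(np.abs(K @ alpha - K @ alpha_closed)))

The point of the iterative route is flexibility: the ridge gradient above can be swapped for, e.g., a (sub)gradient of an infinity-norm penalty, or K can be recomputed with new hyperparameters between iterations, which are the two directions the preprints cited above investigate.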