SUPR
Generating Lagrangians with Large Language Models
Dnr:

NAISS 2024/22-359

Type:

NAISS Small Compute

Principal Investigator:

Yong Sheng Koay

Affiliation:

Uppsala universitet

Start Date:

2024-03-13

End Date:

2025-04-01

Primary Classification:

10301: Subatomic Physics

Webpage:

Abstract

In this project, we explore the use of transformer models to generate particle-theory Lagrangians. By treating Lagrangians as complex, rule-based constructs similar to linguistic expressions, we employ transformer architectures, proven in language-processing tasks, to model and predict Lagrangians. The ultimate goal of this project is to develop an AI system capable of formulating theoretical explanations for experimental observations, a significant step towards integrating artificial intelligence into the iterative process of theoretical physics.
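As a minimal, hypothetical illustration of the idea of treating a Lagrangian as a linguistic expression, the sketch below tokenizes a QED-like Lagrangian string into a symbol sequence and maps it to integer IDs, the form a transformer would consume. The field names, index notation, and tokenizer rules are assumptions for illustration, not the project's actual vocabulary or pipeline.

```python
import re

# Hypothetical example: a QED-like Lagrangian written as a flat symbol string.
# The symbol names here are illustrative only.
LAGRANGIAN = "- 1/4 F_{mu nu} F^{mu nu} + i psibar gamma^{mu} D_{mu} psi - m psibar psi"

# Match identifiers, rational coefficients, integers, and single-character operators.
TOKEN_RE = re.compile(r"[A-Za-z]+|\d+/\d+|\d+|[\^_{}+\-*/]")

def tokenize(expr: str) -> list[str]:
    """Split a Lagrangian string into a sequence of symbol tokens."""
    return TOKEN_RE.findall(expr)

def build_vocab(tokens: list[str]) -> dict[str, int]:
    """Assign each distinct token an integer ID, preserving first-seen order."""
    return {tok: i for i, tok in enumerate(dict.fromkeys(tokens))}

tokens = tokenize(LAGRANGIAN)
vocab = build_vocab(tokens)
ids = [vocab[t] for t in tokens]  # integer sequence, ready for an embedding layer
```

In this framing, the grammar of a Lagrangian (index contraction, symmetry requirements, allowed operators) plays the role that syntax plays in natural language, which is what makes sequence models a plausible fit.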