We aim to train various Transformer architectures on the resource. More precisely, we will train the models to generate solutions for a specific physical setting: the search for vacuum solutions of String Theory, the leading candidate for a theory of Quantum Gravity. Identifying solutions capable of describing the physics of our universe is a complex task that demands substantial computational resources. If one aims to study general solutions, the problem grows rapidly in complexity, so sufficiently deep Transformers will be required. We also aim to fine-tune pre-trained LLMs to adapt them to this physical setting.