Change is Key!
Dnr: NAISS 2023/6-103
Type: NAISS Medium Storage
Principal Investigator: Nina Tahmasebi
Affiliation: Göteborgs universitet
Start Date: 2023-04-19
End Date: 2024-05-01
Primary Classification: 10208: Language Technology (Computational Linguistics)

Abstract

Within the RJ-funded research program Change is Key!, we are working on semantic change detection for Swedish and other languages using large-scale NLP models (such as contextual embeddings). Our current aim is to study semantic change across multiple time periods, which increases the amount of computation needed, and to add sense discrimination on top of context vector representations. In our projects we will train transformer models under different linguistic conditions and analyze the models' outputs (hidden layers) to create better and more interpretable computational models of language. The projects will last 6-8 months, during which we plan to train transformer models on different NLP and linguistic tasks across several languages. Training the models will require the use of GPUs (and CPUs) for long periods of time, as well as storage for our textual resources and datasets and for saving our trained models. In addition, we plan to conduct a deep analysis of the transformers' hidden representations, which will be done offline and will require even more storage capacity.
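
As an illustration only (not part of the proposal), the sketch below shows one common way contextual embeddings of the kind described above can be extracted from a transformer's hidden layers and compared across time periods. The Hugging Face transformers library, the bert-base-multilingual-cased checkpoint, the target_embedding helper, and the Swedish example sentences are all assumptions chosen for demonstration; the project's actual models, corpora, and change measures may differ.

import torch
from transformers import AutoModel, AutoTokenizer

# Placeholder checkpoint; the project's own Swedish/multilingual models may differ.
MODEL_NAME = "bert-base-multilingual-cased"
tokenizer = AutoTokenizer.from_pretrained(MODEL_NAME)
model = AutoModel.from_pretrained(MODEL_NAME)
model.eval()

def target_embedding(sentence, target):
    # Encode the sentence and take the last hidden layer for every token.
    enc = tokenizer(sentence, return_tensors="pt")
    with torch.no_grad():
        hidden = model(**enc).last_hidden_state[0]  # shape: (seq_len, hidden_size)
    # Locate the subword span of the target word and average its vectors.
    target_ids = tokenizer(target, add_special_tokens=False)["input_ids"]
    ids = enc["input_ids"][0].tolist()
    for i in range(len(ids) - len(target_ids) + 1):
        if ids[i:i + len(target_ids)] == target_ids:
            return hidden[i:i + len(target_ids)].mean(dim=0)
    raise ValueError(f"'{target}' not found in '{sentence}'")

# Toy corpora standing in for two time periods (invented example sentences).
period_1 = ["En mus sprang över golvet.", "Katten jagade en liten mus."]
period_2 = ["Jag klickade med min mus.", "Datorns mus slutade fungera."]

emb_1 = torch.stack([target_embedding(s, "mus") for s in period_1]).mean(dim=0)
emb_2 = torch.stack([target_embedding(s, "mus") for s in period_2]).mean(dim=0)

# Cosine distance between period-averaged embeddings as a simple change score.
distance = 1 - torch.nn.functional.cosine_similarity(emb_1, emb_2, dim=0)
print(f"Semantic change score for 'mus': {distance.item():.3f}")

In practice, averaging context vectors per period is only one option; sense-discrimination approaches would instead cluster the per-occurrence vectors before comparing periods, which is where the additional computation and storage mentioned above come in.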