Change is Key!
Dnr: NAISS 2024/5-148

Type: NAISS Medium Compute

Principal Investigator: Nina Tahmasebi

Affiliation: Göteborgs universitet

Start Date: 2024-05-01

End Date: 2025-05-01

Primary Classification: 10208: Language Technology (Computational Linguistics)

Abstract

In the "Change is Key!" research program, funded by RJ, we aim to advance the study of semantic change by leveraging the power of state-of-the-art large-scale NLP models. Our focus extends to understanding semantic change over multiple time periods and incorporating sense discrimination into our analysis, enhancing the precision of context vector representations. Our studies typically contain experiments for Swedish in addition to a set of baselines, including but not limited to English, German, Spanish, Latin, Russian, etc. In our projects we will train transformer models under different linguistic conditions and analyze the models outputs (hidden layers) to create better and more interpretable computational models of language. We plan to train transformer models on different NLP and linguistic tasks across several languages. Training the models would require the use of GPUs (and CPUs) for long periods of time, as well as storage for our textual resources and datasets, and for saving our trained models. In addition to that, we plan to conduct a deep analysis of transformers hidden representations, which will be done off-line, and would require even more storage capacity. Our previous experiments have already resulted in several publications which would not have been possible without the use of NAISS infrastructure.