NAISS
SUPR
Self-Supervised Foundation Model for Automated Seizure Detection in Rodent EEG
Dnr: NAISS 2026/3-358
Type: NAISS Medium
Principal Investigator: Marco Ledri
Affiliation: Lunds universitet
Start Date: 2026-04-28
End Date: 2027-05-01
Primary Classification: 30105: Neurosciences


Abstract

Epilepsy research relies heavily on rodent models, where continuous electroencephalography (EEG) monitoring generates thousands of hours of recordings per study. Identifying seizures in these recordings remains a critical bottleneck: manual review by trained experts is prohibitively time-consuming, while existing automated detection tools suffer from high false-positive rates and poor generalisation across laboratories.

We are developing NED-Net (Neural Event Detection Network), an open-source platform for automated seizure detection in rodent EEG. The platform currently combines traditional signal processing methods (spike-train analysis, spectral band indexing, autocorrelation-based detection) with a supervised 1D U-Net for per-sample seizure segmentation. While the supervised model achieves good performance when sufficient annotated data is available, its effectiveness is limited by the scarcity of expert annotations — a common constraint in preclinical epilepsy research, where labelling is expensive and time-consuming.

This project aims to address this limitation by pre-training a self-supervised foundation model on a large corpus of unlabelled rodent EEG data. Self-supervised learning enables models to learn general signal representations from raw data without requiring annotations, and these representations can subsequently be fine-tuned for specific tasks with very few labelled examples. This approach has proven transformative in natural language processing (BERT) and speech recognition (wav2vec 2.0), but has not yet been applied to rodent electrophysiology at scale.

We will adapt the BENDR architecture (Kostas et al., 2021, Frontiers in Human Neuroscience), which applies the wav2vec 2.0 framework to EEG data. BENDR uses a convolutional encoder to compress raw EEG into a sequence of learned representations, followed by a transformer contextualiser trained with a contrastive masked prediction objective.
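The contrastive masked-prediction objective behind wav2vec 2.0 and BENDR can be sketched in a few lines. The following is a toy illustration, not project code: random latents stand in for the convolutional encoder's output, noisy copies stand in for the transformer contextualiser's predictions, and the shapes, mask span, distractor count, and temperature are all made-up values.

```python
import numpy as np

rng = np.random.default_rng(0)

# Stand-in for the conv encoder's output on one EEG segment:
# T downsampled timesteps, each a D-dimensional latent vector.
T, D = 50, 16
latents = rng.normal(size=(T, D))

# Mask a contiguous span of timesteps, as in masked pre-training.
mask = np.zeros(T, dtype=bool)
mask[10:20] = True

# Stand-in for the contextualiser's predictions at masked positions
# (here simply noisy copies of the true latents).
preds = latents + 0.1 * rng.normal(size=latents.shape)

def info_nce(pred, target, distractors, temperature=0.1):
    """InfoNCE loss for one masked position: the prediction should be
    more similar (cosine) to the true latent than to the distractors."""
    cands = np.vstack([target[None, :], distractors])  # true latent first
    sims = cands @ pred / (
        np.linalg.norm(cands, axis=1) * np.linalg.norm(pred) + 1e-8
    )
    logits = sims / temperature
    logits -= logits.max()                     # numerical stability
    probs = np.exp(logits) / np.exp(logits).sum()
    return -np.log(probs[0] + 1e-12)           # true latent is index 0

# Average the loss over masked positions, sampling distractor latents
# from unmasked timesteps of the same segment.
losses = []
for t in np.flatnonzero(mask):
    idx = rng.choice(np.flatnonzero(~mask), size=10, replace=False)
    losses.append(info_nce(preds[t], latents[t], latents[idx]))
loss = float(np.mean(losses))
```

In real pre-training the loss is backpropagated through both the contextualiser and the encoder; the point here is only the structure of the objective, in which the prediction at each masked position must identify the true latent among distractors.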
The original model was pre-trained on approximately 2,500 hours of human EEG. We propose to pre-train on approximately 25,000 hours of continuous rodent EEG recordings from our laboratory, representing over 50 animals monitored continuously for 21 or more days each. To our knowledge, this would be the first EEG foundation model trained on rodent data, and the largest self-supervised EEG pre-training effort in preclinical neuroscience.

The pre-trained model will be integrated into the NED-Net platform as a feature extraction and confidence scoring module, complementing the existing supervised detection pipeline. We will evaluate the model's ability to improve seizure detection accuracy, particularly in few-shot scenarios where only a small number of annotated seizures are available — the typical situation faced by laboratories beginning a new study.

All code, training procedures, and model weights will be released as open-source software, enabling other epilepsy research laboratories to either fine-tune our pre-trained model on their own data or pre-train from scratch using our training pipeline. This project directly addresses a significant unmet need in preclinical epilepsy research and has the potential to substantially reduce the manual annotation burden across the field.
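The few-shot scenario can be made concrete with a toy sketch: assume a frozen pre-trained encoder turns each labelled EEG window into a fixed-length feature vector, so that only a lightweight logistic-regression head needs training on the handful of annotated windows a new study provides. All dimensions and the synthetic "features" below are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical few-shot setup: a frozen pre-trained encoder maps each
# EEG window to a D-dimensional feature vector; only this small head
# is trained on the few labelled windows available.
D = 16                     # feature dimension (illustrative)
n = 20                     # e.g. 10 seizure / 10 background windows
y = np.array([0, 1] * (n // 2))

# Synthetic stand-in features: seizure windows are shifted along a
# random direction, mimicking a class-separating pre-trained feature.
direction = rng.normal(size=D)
direction /= np.linalg.norm(direction)
X = rng.normal(size=(n, D))
X[y == 1] += 3.0 * direction

# Logistic-regression head trained by plain gradient descent.
w, b = np.zeros(D), 0.0
for _ in range(500):
    p = 1.0 / (1.0 + np.exp(-(X @ w + b)))   # predicted P(seizure)
    w -= 0.3 * (X.T @ (p - y)) / n           # gradient of the log-loss
    b -= 0.3 * float(np.mean(p - y))

train_acc = float(np.mean(((X @ w + b) > 0) == y))
```

In the actual project the head would be fine-tuned on real annotated seizures (and parts of the encoder possibly unfrozen); the sketch only shows why frozen pre-trained features plus a tiny trained head can work with very few labels.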