SUPR
Classification and analysis of multidimensional respiratory-driven oscillatory brain activity and its effect on visual perception
Dnr:

NAISS 2023/5-73

Type:

NAISS Medium Compute

Principal Investigator:

Artin Arshamian

Affiliation:

Karolinska Institutet

Start Date:

2023-02-23

End Date:

2024-03-01

Primary Classification:

30105: Neurosciences

Allocation

Abstract

Animal studies have demonstrated that respiration creates neural oscillations in the central nervous system that propagate globally to sensory (e.g., visual and auditory cortex) and multisensory brain regions critical for perception, multisensory integration, and active sensing, and directly influence them. These oscillations are generated by two different oscillators in the brain: one is located in the brainstem and is active during both nose and mouth breathing; the other is located in the forebrain and is active only during nose breathing. Cross-frequency phase-amplitude coupling is a mechanism of information transfer in the brain that links respiration to oscillatory amplitudes. In mice, slow respiration rhythms in the olfactory bulb are transmitted through the cortex to modulate the amplitude of faster oscillations in upstream areas. This modulation is due to the coupling of the respiratory rhythm to specific neural rhythms that code for transient brain states of heightened susceptibility to sensory stimulation. These phasic cycles of neural excitability determine the intensity of early sensory responses and vary over the respiration cycle, being tightly coupled to brain oscillations. For example, a decrease in alpha power has been shown to increase detection rates for near-threshold stimuli and performance levels in cognitive tasks. It has therefore been hypothesized that these respiratory oscillations serve similar functions in humans, but next to nothing is known about these processes in humans. In this set of studies, we will use EEG to measure different types of breathing (e.g., via the nose or mouth, fast and slow) during both rest and while subjects perform different visual perceptual tasks. We will first study whether it is possible to classify respiration type (inhale, exhale, route, frequency) from EEG source-extracted data from the olfactory bulb and other sensory regions (e.g., visual and auditory cortices).
Specifically, we want to do this at the single-trial level (i.e., over each respiratory cycle). Moreover, we want to study phase-amplitude coupling across the whole respiratory cycle and across multiple anatomical nodes and oscillatory frequencies. Next, we want to relate these data to behavioral data using representational similarity analysis, a computational technique that uses pairwise comparisons of stimuli to reveal their representation in a higher-order space. Taken together, these studies will give fundamental insights into the mechanisms by which respiration shapes brain activity and behavior during rest and active sensory perception.
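The phase-amplitude coupling analysis outlined above could, in principle, be computed as in the following sketch, which uses a Tort-style modulation index (amplitude of a fast rhythm binned by the phase of a slow rhythm, scored by divergence from uniformity). The function name, filter settings, and frequency bands are illustrative assumptions, not the project's actual pipeline.

```python
import numpy as np
from scipy.signal import butter, filtfilt, hilbert


def modulation_index(sig, fs, phase_band, amp_band, n_bins=18):
    """Tort-style phase-amplitude coupling (KL-based modulation index).

    sig: 1-D signal sampled at fs Hz. phase_band could be a slow
    respiratory band and amp_band a faster band such as alpha;
    the bands here are placeholders, not the study's choices.
    """
    def bandpass(x, lo, hi):
        b, a = butter(4, [lo / (fs / 2), hi / (fs / 2)], btype="band")
        return filtfilt(b, a, x)

    # Phase of the slow rhythm, amplitude envelope of the fast rhythm
    phase = np.angle(hilbert(bandpass(sig, *phase_band)))
    amp = np.abs(hilbert(bandpass(sig, *amp_band)))

    # Bin the fast-rhythm amplitude by slow-rhythm phase
    bins = np.linspace(-np.pi, np.pi, n_bins + 1)
    mean_amp = np.array(
        [amp[(phase >= bins[i]) & (phase < bins[i + 1])].mean()
         for i in range(n_bins)]
    )
    p = mean_amp / mean_amp.sum()

    # KL divergence from the uniform distribution, normalized to [0, 1]
    return (np.log(n_bins) + np.sum(p * np.log(p))) / np.log(n_bins)
```

A coupled signal (fast-rhythm amplitude modulated by slow-rhythm phase) yields a higher index than an unmodulated one; single-trial use would apply this per respiratory cycle.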
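The representational similarity analysis mentioned above can likewise be sketched: build a representational dissimilarity matrix (RDM) from pairwise comparisons of condition-wise response patterns, then compare RDMs at the second order. The correlation-distance and Spearman choices below are common defaults assumed for illustration, not necessarily this project's settings.

```python
import numpy as np
from scipy.spatial.distance import pdist
from scipy.stats import spearmanr


def rdm(patterns):
    """Representational dissimilarity matrix in condensed form.

    patterns: (n_conditions, n_features) array, e.g. one source-space
    response pattern per visual stimulus (shapes are illustrative).
    Returns 1 - Pearson r for each pair of conditions.
    """
    return pdist(patterns, metric="correlation")


def rsa_compare(rdm_a, rdm_b):
    """Second-order comparison of two RDMs via Spearman rank correlation,
    so only the ordering of pairwise dissimilarities matters."""
    rho, _ = spearmanr(rdm_a, rdm_b)
    return rho
```

In this framework, an EEG-derived RDM (e.g., from a respiratory phase bin) could be compared against a behavioral RDM built from the same stimulus pairs.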