SUPR
Simulations of cortical spiking neural network models and development of brain-like computing network architectures for applications in data analytics
Dnr:

NAISS 2023/5-484

Type:

NAISS Medium Compute

Principal Investigator:

Pawel Herman

Affiliation:

Kungliga Tekniska högskolan

Start Date:

2023-12-01

End Date:

2024-12-01

Primary Classification:

10201: Computer Sciences

Secondary Classification:

10203: Bioinformatics (Computational Biology) (applications to be 10610)

Tertiary Classification:

20699: Other Medical Engineering

Allocation

Abstract

The proposal specified in this application is a continuation of our previous project aimed at building large-scale brain-like neural networks. We continue the development and simulation of two major families of models: i) spiking neural networks constrained by biophysical details derived from the neocortex, with the aim of explaining complex cognitive memory functions, and ii) brain-like computing architectures with mixed spiking and non-spiking networks aimed at solving machine learning and spatio-temporal pattern recognition tasks. With regard to simulations of cortical systems aimed at capturing cognitive phenomena, the main emphasis is on coupling cognitive function and neural dynamics so that analysis can be performed in close relationship with mesoscopic biological recordings obtained from our experimental collaborators. We have developed large-scale neural network models in the NEST simulator (using MPI parallelism) and have validated different alternative implementations. We are currently extending our simulations beyond NEST to exploit the GPU parallelism provided by newer biologically detailed neural network simulation environments, exploring alternatives such as GeNN and CARLSim that offer hybrid GPU and CPU (multi-threading and MPI) parallelism. This development is crucial for our new plans to build and simulate a multi-area neural system that exhibits memory and decision-making functionality. We also continue to investigate the fundamental problem of learning, particularly in the context of short-term memory encoding. To this end we exploit our Hebbian plasticity rule to account phenomenologically for synaptic learning processes. We develop brain-like computing algorithms in the context of specific instances of pattern recognition and temporal prediction problems, including scenarios where information is organised in sequences.
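As a minimal illustration of the kind of phenomenological synaptic learning referred to above, the sketch below implements a generic Hebbian outer-product weight update in plain Python. This is an assumption-laden toy example, not the project's actual plasticity rule (e.g. it omits the probabilistic traces of BCPNN-style rules); the learning rate `eta` and the binary activity patterns are invented for the example.

```python
# Generic Hebbian weight update: strengthen the connection w[i][j]
# in proportion to the co-activity of presynaptic unit j and
# postsynaptic unit i. Illustrative only, not the project's rule.

def hebbian_update(weights, pre, post, eta=0.1):
    """Return weights after one Hebbian step: w_ij += eta * post_i * pre_j."""
    return [
        [w_ij + eta * post[i] * pre[j] for j, w_ij in enumerate(row)]
        for i, row in enumerate(weights)
    ]

# Toy binary activity: 3 presynaptic units, 2 postsynaptic units.
pre = [1, 0, 1]
post = [1, 1]
w = [[0.0] * len(pre) for _ in range(len(post))]
w = hebbian_update(w, pre, post)
# Only co-active pairs (pre=1 and post=1) gain weight; the rest stay 0.
```

Repeating such updates over a set of activity patterns stores their pairwise correlations in the weight matrix, which is the basic mechanism behind Hebbian associative memory.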
In this research direction, we are translating insights and experience from our past cognitive/neuroscientific modelling into large-scale machine learning algorithms. Our recent work has demonstrated that our model can perform representation learning, a key characteristic of most modern machine learning systems. We plan to continue our efforts towards a new generation of cortex-inspired hierarchical networks for pattern recognition, with a focus on unsupervised representation learning. Following up on our recent advances, we intend to invest more effort in building a network hierarchy with local learning rules, as well as in studying the network's capability to derive spatiotemporal representations. This work, unlike the aforementioned efforts on non-spiking networks with specific applications in mind, will be more generic and oriented towards brain-like computing methodology. The HPC environment is an excellent platform for developing and evaluating the performance of the resulting brain-like neural network systems. The grand goal of our research is code able to execute an abstract cortex-scale model in real time with a simulation time step of 1 ms, while performing real-world perceptual, associative memory, decision-making, and motor output tasks. This work has already greatly benefited from HPC resources: Beskow, Tegner (PDC, Sweden), and Vega (IZUM, Slovenia).