This project aims to benchmark Spiking Neural Network (SNN) simulations on a modern high-performance computing (HPC) system, targeting use cases that span both computational neuroscience and computer science. The primary objective is to establish a clear, quantitative baseline of how state-of-the-art CPU-based systems handle various classes of SNN workloads, which can subsequently be used to evaluate and inform the design of specialized accelerators.
The work will utilize the NEST simulator as the primary software framework for running large-scale, biologically inspired models. As a neuroscience-oriented use case, cortical microcircuit simulations will be employed to stress memory capacity, communication overhead, and scalability across many nodes and cores. As a more algorithmic and constraint-solving example, SNN-based Sudoku solving will be used to represent discrete optimisation and graph-like computation. Finally, a liquid state machine applied to MNIST digit classification will provide a machine-learning-oriented workload, enabling an analysis of accuracy, throughput, and energy–performance trade-offs for SNNs in pattern recognition tasks.
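To make the intended measurements concrete, the sketch below shows a minimal PyNEST benchmark run in the spirit of the microcircuit use case, timing network construction and simulation separately. It assumes NEST 3.x (where the recording device is called "spike_recorder"), and the population sizes, rates, weights, and delays are illustrative placeholders rather than the parameters of the full cortical microcircuit model.

```python
# Minimal sketch of a microcircuit-style NEST benchmark (assumes NEST 3.x).
# All numerical parameters are illustrative placeholders.
import time
import nest

nest.ResetKernel()
nest.SetKernelStatus({"resolution": 0.1})  # simulation time step in ms

t0 = time.time()

# Excitatory and inhibitory integrate-and-fire populations.
exc = nest.Create("iaf_psc_alpha", 8000)
inh = nest.Create("iaf_psc_alpha", 2000)

# External Poisson drive and a spike recorder for a sample of neurons.
noise = nest.Create("poisson_generator", params={"rate": 8000.0})
rec = nest.Create("spike_recorder")

# Random convergent connectivity; weights in pA for current-based synapses.
conn = {"rule": "fixed_indegree", "indegree": 100}
nest.Connect(exc, exc + inh, conn, {"weight": 20.0, "delay": 1.5})
nest.Connect(inh, exc + inh, conn, {"weight": -100.0, "delay": 1.5})
nest.Connect(noise, exc + inh, syn_spec={"weight": 20.0, "delay": 1.5})
nest.Connect(exc[:100], rec)

t_build = time.time() - t0

t0 = time.time()
nest.Simulate(1000.0)  # biological time in ms
t_sim = time.time() - t0

spikes = nest.GetStatus(rec, "n_events")[0]
print(f"build: {t_build:.2f} s, simulate: {t_sim:.2f} s, spikes: {spikes}")
```

Separating build time from simulation time in this way is what allows memory capacity and communication overhead to be attributed to the correct phase of a run; the same structure carries over to the Sudoku and liquid-state-machine workloads.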
Across these three use cases, the project will systematically study strong and weak scaling, memory behaviour, and communication patterns on the target HPC system, as sketched below. The resulting data will provide realistic performance and scalability baselines for different SNN workloads. These baselines will be valuable for the broader HPC and neuromorphic communities and will serve as a reference point for future work on software optimisation and hardware accelerator design for large-scale SNN simulation.
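The scaling analysis itself reduces to a small calculation over measured wall-clock times. The sketch below shows the intended derivation of strong-scaling speedup and efficiency and of weak-scaling efficiency; the timings are hypothetical placeholders standing in for the repeated benchmark runs that the project will collect at each node and core count.

```python
# Sketch of the planned scaling analysis; all timings are hypothetical placeholders.
core_counts = [1, 2, 4, 8, 16]

# Strong scaling: fixed total problem size, wall-clock time in seconds.
strong_times = {1: 400.0, 2: 210.0, 4: 115.0, 8: 65.0, 16: 40.0}

# Weak scaling: problem size grows proportionally with core count.
weak_times = {1: 100.0, 2: 104.0, 4: 110.0, 8: 121.0, 16: 138.0}

print("cores  strong-speedup  strong-eff  weak-eff")
for p in core_counts:
    speedup = strong_times[1] / strong_times[p]   # S(p) = T(1) / T(p)
    strong_eff = speedup / p                      # E(p) = S(p) / p
    weak_eff = weak_times[1] / weak_times[p]      # ideal weak scaling keeps T constant
    print(f"{p:5d}  {speedup:14.2f}  {strong_eff:10.2f}  {weak_eff:8.2f}")
```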