SUPR
Building competence of performing large scale experiments in the field of Hyperdimensional Computing with HPC
Dnr: NAISS 2024/22-442
Type: NAISS Small Compute
Principal Investigator: Evgeny Osipov
Affiliation: Luleå tekniska universitet
Start Date: 2024-04-02
End Date: 2025-05-01
Primary Classification: 10201: Computer Sciences
Webpage:

Allocation

Abstract

We intend to build competence in using HPC within the scope of a research project funded by VR, the Swedish Research Council. The overarching goal of the project is to address the grand challenge of fast and power-efficient Artificial Intelligence (AI) by solving the challenges of neuromorphic (NM) computing. Solving this challenge is important for the more rapid emergence of intelligent technologies such as self-driving cars, autonomous robots, and large-scale information retrieval systems, which in turn are essential for the sustainable development of cities and industries as well as for building resilient infrastructures.

The emerging NM computing technology, i.e., neuromorphic processors (Davies, M., et al. (2018). Loihi: A Neuromorphic Manycore Processor with On-Chip Learning. IEEE Micro, 38(1):82–99) and in-memory computing (Karunaratne, G., et al. (2020). In-Memory Hyperdimensional Computing. Nature Electronics, pages 1–14), promises orders-of-magnitude improvements in the power efficiency of computations. It is expected to be a game-changing technology for fast and energy-efficient AI. This project intends to bridge the NM computing usability gap, which the neuromorphic research community regards as one of its most important challenges, by developing an algebraic programming methodology for the intuitive realization of AI models on neuromorphic hardware.

The theoretical core of the project is Vector Symbolic Architectures, also known as hyperdimensional computing (VSA/HDC) (Kanerva, P. (2009). Hyperdimensional Computing: An Introduction to Computing in Distributed Representation with High-Dimensional Random Vectors. Cognitive Computation, 1:139–159). Executing the project requires large-scale experiments involving massive computations on vectors of extremely high dimensionality, up to a million.
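To make the scale of these computations concrete, the following is a minimal NumPy sketch of the core VSA/HDC operations: binding, bundling, and similarity. It assumes the bipolar (MAP-style) model with {-1, +1} entries; the dimensionality, function names, and model choice are illustrative assumptions, not the project's actual implementation.

```python
import numpy as np

rng = np.random.default_rng(0)
D = 10_000  # dimensionality for illustration; the project targets up to a million

def random_hv():
    """Random bipolar hypervector with i.i.d. entries in {-1, +1}."""
    return rng.choice([-1, 1], size=D)

def bind(a, b):
    """Binding: elementwise multiplication; the result is dissimilar to both inputs."""
    return a * b

def bundle(*vs):
    """Bundling (superposition): elementwise majority sign of the summed vectors.

    An odd number of inputs avoids zero entries from ties.
    """
    return np.sign(np.sum(vs, axis=0)).astype(int)

def sim(a, b):
    """Cosine similarity; near 0 for unrelated random hypervectors."""
    return a @ b / (np.linalg.norm(a) * np.linalg.norm(b))

x, y = random_hv(), random_hv()
xy = bind(x, y)

print(sim(x, y))            # unrelated hypervectors: near 0
print(sim(bind(xy, y), x))  # unbinding with y recovers x: similarity 1.0
print(sim(bundle(x, y, xy), x))  # a bundle stays similar to its components
```

Because binding in this model is self-inverse (multiplying by y twice cancels out), structured data can be stored and queried with simple elementwise arithmetic; at million-dimensional scale, running many such experiments is what motivates the HPC allocation.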