Efficient Deep Learning
Dnr:

NAISS 2024/22-1108

Type:

NAISS Small Compute

Principal Investigator:

Morteza Haghir Chehreghani

Affiliation:

Chalmers tekniska högskola

Start Date:

2024-08-30

End Date:

2025-09-01

Primary Classification:

10201: Computer Sciences

Webpage:

Allocation

Abstract

This project explores the development and enhancement of efficient deep learning models, with a focus on optimizing computational resources and memory usage. As deep learning models continue to grow in size and complexity, their demands on processing power and memory have escalated, leading to challenges in scalability, deployment, and energy consumption. The project aims to address these challenges by investigating novel techniques, including model pruning, quantization, low-rank approximations, and hardware-aware neural architecture search. By implementing and evaluating these methods, the project seeks to reduce the computational overhead and memory footprint of deep learning models without significantly compromising their performance. This research will contribute to creating more sustainable and deployable AI systems, particularly for resource-constrained environments such as mobile devices and edge computing, as well as for cloud infrastructures.
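
To make three of the named techniques concrete, the sketch below is a minimal, hypothetical illustration, not code from the proposal: it applies magnitude-based pruning, a low-rank factorization, and dynamic int8 quantization to a single linear layer in PyTorch. The layer sizes, pruning amount (50%), and target rank (64) are arbitrary illustration choices, and hardware-aware architecture search is not covered.

    # Minimal sketch (illustrative only): pruning, low-rank approximation,
    # and dynamic quantization of one linear layer in PyTorch.
    import torch
    import torch.nn as nn
    import torch.nn.utils.prune as prune

    layer = nn.Linear(512, 512)  # example layer; all sizes are arbitrary

    # Pruning: zero out the 50% of weights with the smallest magnitude.
    prune.l1_unstructured(layer, name="weight", amount=0.5)
    prune.remove(layer, "weight")  # bake the zeros into the weight tensor
    print(f"sparsity: {(layer.weight == 0).float().mean().item():.0%}")

    # Low-rank approximation: W (512x512) ~= A @ B with rank r, cutting
    # the parameter count from 512*512 to 2*512*r when r << 512.
    r = 64
    U, S, Vh = torch.linalg.svd(layer.weight.detach(), full_matrices=False)
    A = U[:, :r] * S[:r]   # (512, r), singular values folded into U
    B = Vh[:r, :]          # (r, 512)
    low_rank = nn.Sequential(nn.Linear(512, r, bias=False),
                             nn.Linear(r, 512))
    low_rank[0].weight.data.copy_(B)   # first factor:  x -> B x
    low_rank[1].weight.data.copy_(A)   # second factor: B x -> A B x ~= W x
    low_rank[1].bias.data.copy_(layer.bias.data)

    x = torch.randn(8, 512)
    print(f"max error of rank-{r} factorization: "
          f"{(layer(x) - low_rank(x)).abs().max().item():.4f}")

    # Dynamic quantization: store Linear weights as int8 for CPU inference.
    quantized = torch.quantization.quantize_dynamic(
        low_rank, {nn.Linear}, dtype=torch.qint8)
    print(quantized(x).shape)

In practice, each compression step would be followed by fine-tuning to recover accuracy; the sketch shows only the mechanics of the transformations.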