Machine learning training for automatic tracking of insect heads
Dnr:

NAISS 2024/22-799

Type:

NAISS Small Compute

Principal Investigator:

Emily Baird

Affiliation:

Stockholms universitet

Start Date:

2024-06-11

End Date:

2024-10-01

Primary Classification:

10611: Ecology

Allocation

Abstract

Describing the field of view (FOV) of animal eyes can shed light on how animals use visual information from their environment to guide behaviour. FOVs, however, are not fixed but instead vary with the orientation of the eye and can therefore be used to sample different parts of visual space. For animals that cannot move their eyes within their heads, such as butterflies and bumblebees, it is possible to use their head movements to understand how their FOVs are oriented in space while they perform different behavioural tasks. While my lab has now established a reliable method for reconstructing insect FOVs from 3D volumetric scans of their eyes, we lack reliable, high-throughput methods for analysing how they orient their heads, and therefore their eyes, in space while performing different behavioural tasks. We have recently started using machine learning approaches to identify head orientation from video footage, but one major hurdle for this approach is the time taken to train and retrain a neural network model until it reaches sufficient accuracy for the desired task. Each training run can take over a day on our relatively powerful desktop computers, even with the use of a GPU. To overcome this hurdle, we would like to apply to use the GPUs available at your facility. Access to more powerful GPUs will allow our machine learning work to proceed more efficiently and will, in turn, enable us to improve the speed and accuracy with which we can predict the fields of view of behaving insects.
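
For illustration only, the following is a minimal sketch of the kind of GPU-accelerated training loop described above, written in PyTorch: a small convolutional network regressing head coordinates from video frames. The abstract does not specify the actual software, model architecture, or data format used in the project, so all names, shapes, and hyperparameters here are hypothetical placeholders.

# Illustrative sketch (assumptions: PyTorch, a simple coordinate-regression
# network, synthetic placeholder data). The project's real model and data
# pipeline may differ.
import torch
import torch.nn as nn
from torch.utils.data import DataLoader, TensorDataset

# Train on the GPU when one is available, otherwise fall back to the CPU.
device = torch.device("cuda" if torch.cuda.is_available() else "cpu")

# Placeholder data: 256 grayscale frames (1 x 128 x 128) with 2D head coordinates.
frames = torch.rand(256, 1, 128, 128)
coords = torch.rand(256, 2)
loader = DataLoader(TensorDataset(frames, coords), batch_size=32, shuffle=True)

# Small convolutional regressor standing in for the real head-tracking network.
model = nn.Sequential(
    nn.Conv2d(1, 16, 3, stride=2, padding=1), nn.ReLU(),
    nn.Conv2d(16, 32, 3, stride=2, padding=1), nn.ReLU(),
    nn.AdaptiveAvgPool2d(1), nn.Flatten(),
    nn.Linear(32, 2),
).to(device)

optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)
loss_fn = nn.MSELoss()

for epoch in range(5):
    for x, y in loader:
        x, y = x.to(device), y.to(device)  # move each batch to the GPU
        optimizer.zero_grad()
        loss = loss_fn(model(x), y)
        loss.backward()
        optimizer.step()
    print(f"epoch {epoch}: loss {loss.item():.4f}")

In practice, each such epoch would run over many thousands of annotated video frames and a much larger network, which is why a full training cycle on a desktop GPU can take a day or more and why access to more capable GPUs would shorten the train-evaluate-retrain loop.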