Deep learning-based DoA estimation under adverse conditions
Dnr:

NAISS 2025/22-1566

Type:

NAISS Small Compute

Principal Investigator:

Chenyang Yan

Affiliation:

Kungliga Tekniska högskolan

Start Date:

2025-11-11

End Date:

2026-12-01

Primary Classification:

20205: Signal Processing

Webpage:

Allocation

Abstract

This project aims to develop and evaluate deep learning–based direction-of-arrival (DoA) estimation algorithms that remain robust under adverse weather conditions, with direct relevance to safety-critical sensing systems such as railway level-crossing monitoring. Classical subspace-based DoA methods (e.g., MUSIC, ESPRIT) rely on ideal array-manifold coherence and stationary noise assumptions. Under heavy rain, fog, or snow, mmWave radar returns exhibit attenuation, backscattering, and small-scale random phase distortions that violate these assumptions and degrade estimation accuracy. Recent physics-based S-matrix models indicate that such distortions introduce non-stationary, angle-dependent perturbations that cannot be captured by conventional covariance models. As a result, robust data-driven approaches capable of learning weather-induced manifold deformations from large datasets are needed.

The planned research focuses on designing convolutional and transformer-based neural architectures that map sample covariance matrices or raw complex radar snapshots to accurate DoA predictions under a broad range of weather severities. The goal is twofold: (i) to learn invariant representations that compensate for random medium-induced distortions, and (ii) to quantify the performance gap between classical methods, purely data-driven models, and hybrid physics-informed neural networks. In addition, the project will evaluate multi-source scenarios where weather distortions reduce signal subspace separability, requiring models with improved robustness and generalization.

High-fidelity data generation is central to the study. Radar signals will be simulated using both narrowband and broadband system models, realistic antenna geometries (ULA and non-ULA), and weather-dependent propagation effects generated from physical rain models (e.g., S-matrix-based scattering, ITU-R attenuation curves, lognormal drop size distributions). Training datasets will span a wide variety of SNRs, aperture sizes, weather intensities, and source counts to ensure generalizable learning. Complementary real mmWave radar data from TI IWR6843AOP and AWR2944 demonstrators will be included to validate domain-transfer performance.

The project requires NAISS Small Compute 2025 resources primarily for large-scale deep learning training and hyperparameter optimization. The models involve millions of parameters and must be trained on datasets containing 0.5–2 million radar snapshots per experimental configuration. GPU acceleration is essential to support mini-batch training, spectral-domain data augmentation, and mixed-precision optimization. Multiple training runs are necessary to benchmark architectures, explore model robustness under different weather distributions, and quantify generalization across sensor configurations. Additionally, GPU resources will be used to simulate large multi-source datasets, which require repeated matrix factorizations, stochastic channel sampling, and high-resolution angle sweeps.

Expected outcomes include: (i) a systematic comparison of deep learning, physics-informed, and subspace methods under controlled adverse-weather conditions; (ii) new neural architectures that explicitly integrate structured covariance models into the learning pipeline; (iii) performance-bound analysis comparing empirical RMSE to analytical Cramér–Rao lower bounds under weather-induced distortions; and (iv) open-source datasets and training pipelines facilitating reproducible research in adverse-weather DoA estimation.
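For illustration, a minimal sketch (NumPy assumed) of the kind of snapshot-level simulation and sample-covariance computation described above. The per-element lognormal attenuation and random phase jitter are a simplified stand-in for the S-matrix and ITU-R rain models named in the abstract, and all parameter values are placeholders rather than the project's actual configuration.

    # Minimal sketch: narrowband ULA snapshots with a simplified weather
    # perturbation, followed by the sample covariance matrix that would be
    # fed to a learning-based estimator. Values are illustrative only.
    import numpy as np

    rng = np.random.default_rng(0)

    M = 8                                 # ULA elements
    N = 256                               # snapshots per realisation
    d = 0.5                               # element spacing in wavelengths
    snr_db = 10.0
    doa_deg = np.array([-12.0, 20.0])     # true source angles (hypothetical)
    K = len(doa_deg)

    # Ideal ULA steering matrix, shape (M, K)
    m = np.arange(M)[:, None]
    A = np.exp(1j * 2 * np.pi * d * m * np.sin(np.deg2rad(doa_deg))[None, :])

    # Simplified weather-induced manifold distortion: per-element lognormal
    # attenuation and small random phase jitter (severity values are assumed).
    atten = rng.lognormal(mean=-0.05, sigma=0.1, size=(M, 1))
    phase = rng.normal(scale=0.15, size=(M, 1))
    A_w = A * atten * np.exp(1j * phase)

    # Complex Gaussian source signals and noise at the chosen SNR
    s = (rng.standard_normal((K, N)) + 1j * rng.standard_normal((K, N))) / np.sqrt(2)
    noise_pow = 10 ** (-snr_db / 10)
    n = np.sqrt(noise_pow / 2) * (rng.standard_normal((M, N))
                                  + 1j * rng.standard_normal((M, N)))

    X = A_w @ s + n                       # received snapshots, shape (M, N)
    R = X @ X.conj().T / N                # sample covariance matrix, shape (M, M)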
The project supports the broader goal of developing resilient, weather-robust sensing systems for intelligent transportation and contributes to Sweden’s strategic research agenda in safe, reliable, and autonomous mobility technologies.
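As a further illustration, a minimal sketch (PyTorch assumed) of one candidate convolutional covariance-to-angle mapping of the kind the abstract describes: the real and imaginary parts of the sample covariance matrix are stacked as two image channels and mapped to a discretised angular spectrum. The layer sizes, the 181-point angle grid, and the sigmoid per-angle output are illustrative assumptions, not the project's final architecture.

    # Minimal sketch: CNN mapping a complex sample covariance matrix to a
    # per-angle source-presence spectrum on a fixed grid. Dimensions and
    # layer choices are hypothetical.
    import torch
    import torch.nn as nn

    class CovToSpectrum(nn.Module):
        def __init__(self, n_antennas: int = 8, n_angles: int = 181):
            super().__init__()
            self.features = nn.Sequential(
                nn.Conv2d(2, 32, kernel_size=3, padding=1), nn.ReLU(),
                nn.Conv2d(32, 64, kernel_size=3, padding=1), nn.ReLU(),
            )
            self.head = nn.Sequential(
                nn.Flatten(),
                nn.Linear(64 * n_antennas * n_antennas, 256), nn.ReLU(),
                nn.Linear(256, n_angles), nn.Sigmoid(),  # per-angle probability
            )

        def forward(self, R: torch.Tensor) -> torch.Tensor:
            # R: complex sample covariance, shape (batch, M, M)
            x = torch.stack((R.real, R.imag), dim=1)     # (batch, 2, M, M)
            return self.head(self.features(x))

    # Usage on random placeholder covariances (batch of 4, M = 8):
    spectrum = CovToSpectrum()(torch.randn(4, 8, 8, dtype=torch.complex64))

Peaks of the predicted spectrum would then be read off as DoA estimates; the grid-based formulation is one common choice for learning-based DoA estimation and is shown here only to make the planned covariance-to-angle pipeline concrete.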