This research project applies deep-learning-based perception methods to analyze traffic participant behavior and support spatial planning at urban intersections, with a specific focus on Nordic environmental challenges. Using multi-sensor datasets that combine 4D millimeter-wave radar, LiDAR, and cameras, we evaluate the robustness of infrastructure-level perception systems under adverse weather conditions such as snowfall and low visibility. The methodology involves training 3D object detection frameworks (e.g., OpenPCDet) to detect and track diverse traffic participants, including emerging entities such as delivery robots. The refined perception outputs, namely trajectories and interaction patterns, will serve as empirical evidence for optimizing intersection geometry and for proactive safety assessment. This work bridges the gap between high-fidelity sensor perception and macro-level urban transportation spatial planning.
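To make the trajectory-extraction step concrete, the sketch below shows one simple way per-frame 3D detection centroids (e.g., from an OpenPCDet-style detector) could be associated into trajectories using greedy nearest-centroid matching. The detection format, distance threshold, and function names here are illustrative assumptions, not the project's actual pipeline or OpenPCDet's output schema.

```python
import math

def associate_tracks(frames, max_dist=2.0):
    """Greedy nearest-centroid association of detections into trajectories.

    frames: list of frames; each frame is a list of (x, y) detection
            centroids (a simplified stand-in for 3D box centers).
    Returns: dict mapping track_id -> list of (frame_index, centroid).
    NOTE: a minimal sketch; production trackers typically use Kalman
    filtering and Hungarian assignment (e.g., AB3DMOT-style pipelines).
    """
    tracks = {}    # track_id -> list of (frame_index, centroid)
    last_pos = {}  # track_id -> most recent centroid
    next_id = 0
    for t, detections in enumerate(frames):
        unmatched = list(detections)
        # Match each active track to its nearest unclaimed detection.
        for tid, pos in list(last_pos.items()):
            if not unmatched:
                break
            best = min(unmatched, key=lambda d: math.dist(pos, d))
            if math.dist(pos, best) <= max_dist:
                tracks[tid].append((t, best))
                last_pos[tid] = best
                unmatched.remove(best)
        # Remaining detections spawn new tracks (e.g., a delivery robot
        # entering the intersection).
        for d in unmatched:
            tracks[next_id] = [(t, d)]
            last_pos[next_id] = d
            next_id += 1
    return tracks

# Two objects moving in parallel across three frames yield two tracks.
frames = [[(0.0, 0.0), (10.0, 0.0)],
          [(0.5, 0.0), (10.5, 0.0)],
          [(1.0, 0.0), (11.0, 0.0)]]
tracks = associate_tracks(frames)
```

The resulting per-object trajectories are the kind of intermediate representation that interaction-pattern analysis and intersection-geometry assessment could consume downstream.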