This project aims to develop scalable and physically consistent reduced-order models for complex physical systems using Implicit Neural Fields (INFs). The focus is on deformable objects and continuum mechanics, where the underlying dynamics are governed by partial differential equations and high-dimensional spatiotemporal states. Implicit neural representations are particularly well suited for this setting, as they allow continuous modeling of fields over space and time and naturally handle irregular discretizations such as point clouds.
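To make the idea of a continuous field representation concrete, the following is a minimal sketch, not the project's actual architecture: a small coordinate MLP that maps a space-time coordinate (x, y, t) to a scalar field value. The layer sizes, the SIREN-style sine activations, and the random (untrained) weights are all illustrative assumptions; the point is only that such a network can be queried at arbitrary, irregularly sampled points.

```python
import numpy as np

rng = np.random.default_rng(0)

def init_mlp(sizes, rng):
    """Randomly initialized MLP weights (illustrative, untrained)."""
    return [(rng.normal(0, 1.0 / np.sqrt(m), (m, n)), np.zeros(n))
            for m, n in zip(sizes[:-1], sizes[1:])]

def field(coords, params):
    """Evaluate the implicit field u(x, y, t) at arbitrary coordinates.

    `coords` is an (N, 3) array of (x, y, t) samples; because the network
    is a function of continuous coordinates, an irregular point cloud can
    be queried directly, with no fixed grid or mesh.
    """
    h = coords
    for W, b in params[:-1]:
        h = np.sin(h @ W + b)  # sine activations, as in SIREN-style INFs
    W, b = params[-1]
    return h @ W + b

params = init_mlp([3, 64, 64, 1], rng)  # (x, y, t) -> scalar field value
pts = rng.uniform(-1, 1, (500, 3))      # irregular space-time point cloud
u = field(pts, params)                  # one field value per query point
```

Extending this picture to the 3D setting of the project simply means querying (x, y, z, t) coordinates, which is what drives the growth in model capacity discussed below.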
The core objective is to perform model order reduction (MOR) by learning a low-dimensional latent representation of the system dynamics while preserving physical structure. To this end, the proposed approach incorporates physics- and geometry-informed architectural constraints, ensuring that the learned reduced-order model preserves meaningful dynamics, stability, and physical consistency. This is critical for downstream tasks such as long-term prediction, interpolation, and generalization to unseen configurations.
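One standard way to build stability into a reduced-order model by architecture, shown here as an illustrative sketch rather than the project's specific design, is to parameterize linear latent dynamics dz/dt = A z with A = (W − Wᵀ) − L Lᵀ. The skew-symmetric part conserves the latent "energy" ‖z‖², while −L Lᵀ can only dissipate it, so the latent trajectory cannot blow up for any choice of W and L. The latent dimension and the explicit Euler roll-out below are assumptions made for the sketch.

```python
import numpy as np

rng = np.random.default_rng(0)
d = 8  # latent dimension (illustrative)

# Stable-by-construction parameterization of the latent dynamics:
# A = (W - W^T) - L L^T. The skew-symmetric part is energy-conserving,
# the -L L^T part is dissipative, so d/dt ||z||^2 = -2 ||L^T z||^2 <= 0.
W = rng.normal(size=(d, d))
L = rng.normal(size=(d, d))
A = (W - W.T) - L @ L.T

z = rng.normal(size=d)
e0 = z @ z          # initial latent energy
dt = 1e-3
for _ in range(1000):
    z = z + dt * (A @ z)  # explicit Euler roll-out of dz/dt = A z
e1 = z @ z          # final latent energy: non-increasing up to O(dt^2)
```

In the learned model this constraint would apply to the latent dynamics network, so that long-horizon roll-outs remain bounded regardless of how the training data were sampled.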
Preliminary experiments and prior work on related datasets indicate that training physics-informed implicit neural field models on 2D spatiotemporal data requires substantial computational resources. In particular, a single training run on an NVIDIA RTX A5000 GPU typically takes between 6 and 24 hours, depending on model size and training configuration. The present project extends this framework to 3D point cloud data, which significantly increases both the required model capacity and the computational cost. As a result, access to dedicated GPU resources is essential not only for training the proposed models, but also for systematic hyperparameter tuning and for fair comparison with state-of-the-art methods of comparable computational complexity.