Analyse multivariate time series with machine learning and time-lagged correlation. We will simulate brain dynamics using a Hopf model, based on the Hopf oscillator equations, which describe the evolution of an oscillator's complex-valued state variable over time. The goal of the simulation is to optimise the model parameters so that the simulated functional connectivity (FC) of the brain matches the empirical FC obtained from functional magnetic resonance imaging (fMRI). There are roughly 10,000 observations, each of size ~400x1000 (i.e. 400 time series per observation).
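As a sketch of the time-lagged correlation mentioned above, the FC at lag k can be computed as the Pearson correlation between each region's series and every other region's series shifted by k samples. The function name `lagged_fc` and the population-std normalisation are illustrative assumptions, not part of the original code:

```python
import numpy as np

def lagged_fc(ts, lag=0):
    """Functional connectivity at a given time lag (hypothetical helper).

    ts  : (N, T) array, one time series per region
    lag : shift in samples; lag=0 gives the usual Pearson FC matrix
    """
    N, T = ts.shape
    a = ts[:, : T - lag] if lag else ts   # x_i(t)
    b = ts[:, lag:] if lag else ts        # x_j(t + lag)
    a = (a - a.mean(axis=1, keepdims=True)) / a.std(axis=1, keepdims=True)
    b = (b - b.mean(axis=1, keepdims=True)) / b.std(axis=1, keepdims=True)
    return (a @ b.T) / a.shape[1]         # (N, N) matrix of corr(x_i(t), x_j(t+lag))
```

For a 400x1000 observation this yields one 400x400 matrix per lag; at lag 0 the result is the standard FC matrix.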
Additionally, each observation is accompanied by a 400x400 matrix representing the structural connectivity, which couples the oscillators in the model.
The code then performs various processing steps on the fMRI data, including bandpass filtering and calculation of empirical FC matrices for different time lags. It also defines the parameters of the Hopf model, such as the natural frequency, coupling strength, and noise level. The model is simulated using the Euler method, and the resulting time series data are used to compute the simulated FC matrices.
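A common formulation of the coupled Hopf (Stuart-Landau) model, integrated with the Euler-Maruyama scheme, is sketched below. The function name, parameter defaults, and diffusive coupling via the structural connectivity matrix are assumptions chosen to match the description (natural frequency, coupling strength, noise level), not the original implementation:

```python
import numpy as np

def simulate_hopf(C, omega, a=-0.02, G=0.5, sigma=0.02, dt=0.1, n_steps=1000, seed=0):
    """Euler-Maruyama integration of coupled Hopf oscillators (illustrative sketch).

    C      : (N, N) structural connectivity matrix
    omega  : (N,) natural frequencies in rad per unit time
    a      : bifurcation parameter (a < 0 -> noise-driven fluctuations)
    G      : global coupling strength
    sigma  : noise standard deviation
    """
    rng = np.random.default_rng(seed)
    N = C.shape[0]
    z = 0.1 * (rng.standard_normal(N) + 1j * rng.standard_normal(N))
    out = np.empty((N, n_steps))
    for t in range(n_steps):
        # local Hopf dynamics plus diffusive coupling through the SC matrix
        coupling = G * (C @ z - C.sum(axis=1) * z)
        dz = z * (a + 1j * omega - np.abs(z) ** 2) + coupling
        noise = sigma * np.sqrt(dt) * (rng.standard_normal(N) + 1j * rng.standard_normal(N))
        z = z + dt * dz + noise
        out[:, t] = z.real   # the real part serves as the BOLD-like signal
    return out
```

The simulated FC matrices are then obtained by applying the same (lagged) correlation computation to this output as to the empirical data.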
Finally, the code performs an optimization procedure to find the best-fitting model parameters that minimize the difference between the simulated and empirical FC matrices. The optimization uses gradient descent, iteratively updating the model parameters, regenerating simulated data, and re-evaluating the cost function until convergence. The final output of the code is the set of optimized model parameters and the simulated FC matrices that best match the empirical FC.
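One minimal way to realise this fitting loop, since the simulator itself is not differentiable in closed form, is finite-difference gradient descent on a scalar parameter such as the global coupling. The cost function (1 minus the correlation of the FC upper triangles) and the helper names are assumptions for illustration:

```python
import numpy as np

def fc_cost(fc_sim, fc_emp):
    """Cost: 1 - Pearson correlation between the upper triangles of two FC matrices."""
    iu = np.triu_indices_from(fc_emp, k=1)
    return 1.0 - np.corrcoef(fc_sim[iu], fc_emp[iu])[0, 1]

def fit_coupling(simulate, fc_emp, G0=0.5, lr=0.5, eps=0.05, n_iter=50):
    """Finite-difference gradient descent on the coupling G (hypothetical sketch).

    simulate : callable mapping G to a simulated FC matrix (assumed given)
    """
    G = G0
    for _ in range(n_iter):
        # central-difference estimate of d(cost)/dG
        grad = (fc_cost(simulate(G + eps), fc_emp) -
                fc_cost(simulate(G - eps), fc_emp)) / (2 * eps)
        G -= lr * grad
    return G
```

In practice each cost evaluation requires a full simulation, so the same scheme extends to several parameters at the cost of two simulations per parameter per iteration.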