Seminar by Benjamin Peherstorfer
Speaker
Benjamin Peherstorfer (Courant Institute of Mathematical Sciences)
Title
Neural Galerkin schemes for model reduction of transport-dominated problems
Date
March 05, 2024 16:00 CET (UTC+01:00, Europe/Rome)
March 05, 2024 10:00 EST (UTC-05:00, US/Eastern)
March 05, 2024 09:00 CST (UTC-06:00, US/Central)
March 05, 2024 07:00 PST (UTC-08:00, US/Pacific)
Abstract
Nonlinear parametrizations such as deep neural networks can circumvent the Kolmogorov barrier of classical model reduction methods that seek linear approximations in subspaces. However, numerically fitting ("training") nonlinear parametrizations is challenging because (a) training data (residual measurements) need to be sampled so that the empirical loss estimates the population loss well, and (b) training errors quickly accumulate and amplify over time. This work introduces Neural Galerkin schemes, which build on the Dirac-Frenkel variational principle to train nonlinear parametrizations sequentially in time. The accumulation of error over the sequential-in-time training is addressed by updating only randomized sparse subsets of the parameters at each time step, motivated by dropout, which addresses the similar issue of overfitting due to neuron co-adaptation. Additionally, an adaptive sampling scheme is proposed that judiciously places residual evaluations so that few of them suffice for training. In numerical experiments with a wide range of evolution equations, the proposed scheme outperforms classical linear methods, especially for problems with transport-dominated features and high-dimensional spatial domains.
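For readers who want a concrete picture of the sequential-in-time training described above, the following is a minimal sketch (not the speaker's code) of one explicit-Euler Neural Galerkin step with a randomized sparse parameter update, written in JAX. It assumes an evolution equation of the form u_t = f(x, u); the toy network `u_net`, the right-hand side `rhs_f`, and the helper `step_sparse` are hypothetical names chosen for illustration, and the uniform residual sample points stand in for the adaptive sampling scheme mentioned in the abstract.

```python
import jax
import jax.numpy as jnp

def u_net(theta, x):
    # Toy shallow network u(x; theta) = sum_i a_i * tanh(w_i * x + b_i).
    a, w, b = jnp.split(theta, 3)
    return jnp.sum(a * jnp.tanh(w * x + b))

def rhs_f(x, u):
    # Placeholder right-hand side f(x, u) of the evolution equation u_t = f.
    return -u

def step_sparse(theta, xs, dt, key, n_active=10, reg=1e-6):
    """One explicit-Euler Neural Galerkin step that updates only a
    randomized sparse subset of the parameters (dropout-style)."""
    # Rows of J are gradients of u(x; theta) w.r.t. theta at each sample x.
    J = jax.vmap(jax.grad(u_net), in_axes=(None, 0))(theta, xs)
    u_vals = jax.vmap(u_net, in_axes=(None, 0))(theta, xs)
    f_vals = jax.vmap(rhs_f)(xs, u_vals)
    # Draw the randomized sparse subset of parameter indices to update.
    idx = jax.random.choice(key, theta.size, (n_active,), replace=False)
    Js = J[:, idx]
    # Dirac-Frenkel/Galerkin system M theta_dot = F, restricted to the
    # active indices and regularized for invertibility.
    M = Js.T @ Js / xs.size + reg * jnp.eye(n_active)
    F = Js.T @ f_vals / xs.size
    theta_dot = jnp.linalg.solve(M, F)
    # Only the active parameters move; the rest stay fixed this step.
    return theta.at[idx].add(dt * theta_dot)
```

A short usage loop under the same assumptions; in the actual schemes the sample points would be adapted over time rather than held fixed:

```python
key = jax.random.PRNGKey(0)
theta = 0.1 * jax.random.normal(key, (30,))   # 10 neurons -> 30 parameters
xs = jnp.linspace(-1.0, 1.0, 64)              # residual samples (uniform here)
for _ in range(100):
    key, sub = jax.random.split(key)
    theta = step_sparse(theta, xs, dt=1e-3, key=sub)
```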