Module:   MAT870  Zurich Colloquium in Applied and Computational Mathematics

Understanding and Accelerating Subsampled Natural Gradient Algorithms for Scientific Machine Learning

Talk by Gil Goldshlager

Date: 17.09.25  Time: 16.30 - 18.00  Room: ETH HG G 19.2

Over the last two years, subsampled natural gradient descent (SNGD) has demonstrated breakthrough performance for parametric optimization tasks in scientific machine learning, including neural network wavefunctions and physics-informed neural networks. In this talk, I will first present an accelerated algorithm called SPRING, which improves the convergence of SNGD at no extra cost. I will then turn to the theoretical aspects of these algorithms and explain why previous analytical approaches based on stochastic optimization theory fail to explain the empirical observations. Finally, I will show how a different perspective rooted in randomized linear algebra overcomes these limitations and provides an accurate and detailed understanding of the convergence properties of both SNGD and SPRING. Beyond explaining existing algorithms, the new analytical framework also suggests new pathways towards faster and more robust training algorithms for scientific machine learning.
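To make the subject of the abstract concrete, the sketch below illustrates one generic subsampled natural gradient step in NumPy: the Gauss-Newton/Fisher system is estimated from a small subsample and solved in batch space. It is a minimal illustration under assumed interfaces (the helper names jac_fn, residual_fn, and all shapes and hyperparameters are assumptions for exposition), not the speaker's implementation of SNGD or SPRING.

    import numpy as np

    def sngd_step(params, jac_fn, residual_fn, batch, damping=1e-3, lr=1e-2):
        """One illustrative subsampled natural gradient step.

        Assumed interfaces:
          jac_fn(params, batch)      -> J, shape (batch_size, n_params)
          residual_fn(params, batch) -> r, shape (batch_size,)

        The update direction is the damped minimum-norm solution of
        J @ delta ~= r, computed in the small batch space:
          (J J^T + damping * I) alpha = r,  delta = J^T alpha.
        """
        J = jac_fn(params, batch)                      # subsampled Jacobian
        r = residual_fn(params, batch)                 # subsampled residuals
        K = J @ J.T + damping * np.eye(J.shape[0])     # small (b x b) system
        alpha = np.linalg.solve(K, r)
        delta = J.T @ alpha                            # lift back to parameter space
        return params - lr * delta

Because the linear solve involves only a batch-sized matrix, its cost scales with the subsample size rather than the number of parameters, which is what makes subsampling attractive in the settings mentioned in the abstract.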