Tom O'Leary-Roseberry

Seminar:

Using model structure to design efficient neural network surrogates for parametric PDE problems.
Wednesday, March 10, 2021
10AM – 11AM
Zoom Meeting


Neural network surrogates show promise in making tractable the solution of many-query problems that require numerous evaluations of a parameter-to-output map (e.g. uncertainty quantification, Bayesian inversion, optimal experimental design). These evaluations are prohibitive when the parametric map is high-dimensional and involves expensive solution of partial differential equations (PDEs). Typical machine learning methodologies rely on an abundance of data, which is infeasible to generate for high-dimensional nonlinear physics-based problems.

In this talk, we propose to construct surrogates for high-dimensional PDE-governed parametric maps in the form of projected neural networks that parsimoniously capture the geometry and intrinsic low-dimensionality of these maps, as revealed by derivative-based dimension reduction. Such surrogates exploit the low-dimensional structure of the map and require little training data, while achieving better generalization accuracy than full, unprojected neural networks. Numerical experiments are shown for PDE parameter-to-observable regression problems.
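To give a flavor of the idea, the toy sketch below (not the speaker's implementation; the map q, matrix W, and sample sizes are all illustrative assumptions) shows one common form of derivative-based dimension reduction: averaging outer products of the map's gradients over parameter samples and taking the dominant eigenvectors as a reduced input basis. For a map that truly depends on only a few hidden directions, the eigenvalue spectrum decays sharply, revealing the intrinsic low dimensionality a projected surrogate can exploit.

```python
import numpy as np

# Toy parametric map q(m) = g(W m) with hidden 2-dimensional structure.
rng = np.random.default_rng(0)
d, r = 50, 2                      # ambient and intrinsic dimensions
W = rng.standard_normal((r, d))   # hidden low-rank directions (illustrative)

def q(m):
    z = W @ m
    return np.array([np.sin(z[0]) + z[1] ** 2])

def jac(m, eps=1e-6):
    # Finite-difference gradient of the scalar-output map.
    J = np.zeros(d)
    for i in range(d):
        e = np.zeros(d)
        e[i] = eps
        J[i] = (q(m + e) - q(m - e))[0] / (2 * eps)
    return J

# Average outer products of gradients over parameter samples.
n_samples = 100
H = np.zeros((d, d))
for _ in range(n_samples):
    g = jac(rng.standard_normal(d))
    H += np.outer(g, g) / n_samples

eigvals, eigvecs = np.linalg.eigh(H)   # ascending eigenvalue order
V = eigvecs[:, ::-1][:, :r]            # dominant eigenvectors: reduced basis

# Spectrum collapses after r = 2 directions.
print(eigvals[::-1][:4])
```

A surrogate would then be trained on the reduced coordinates V.T @ m rather than the full 50-dimensional input, which is what makes training with few data feasible.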

Time permitting, we will show that efficient matrix-free stochastic Newton methods can exploit low-dimensional information in neural network training loss landscapes. These methods have per-iteration computational costs similar to those of traditional stochastic first-order methods, but with better convergence properties and less sensitivity to hyperparameter tuning. Further, they can explicitly use curvature information to avoid saddle points and truncate noisy directions of the stochastic Hessians to improve generalization.
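The "matrix-free" part can be illustrated with a small sketch (again a toy stand-in, not the speaker's method; the ill-conditioned quadratic loss and all parameter values are assumptions): the Hessian is never formed, only Hessian-vector products are applied, and an inexact Newton step is obtained by conjugate gradients on H p = -g.

```python
import numpy as np

# Ill-conditioned quadratic loss, a stand-in for a curved training landscape.
rng = np.random.default_rng(1)
d = 20
A = np.diag(np.logspace(0, 2, d))     # condition number 1e2

loss = lambda w: 0.5 * w @ A @ w

def grad(w):
    return A @ w

def hvp(w, v, eps=1e-6):
    # Matrix-free Hessian-vector product via finite differences of gradients.
    return (grad(w + eps * v) - grad(w - eps * v)) / (2 * eps)

def cg(w, b, iters=50, tol=1e-10):
    # Conjugate gradients on H x = b using only Hessian-vector products.
    x = np.zeros_like(b)
    r = b.copy()
    p = r.copy()
    rs = r @ r
    for _ in range(iters):
        Hp = hvp(w, p)
        alpha = rs / (p @ Hp)
        x += alpha * p
        r -= alpha * Hp
        rs_new = r @ r
        if rs_new < tol:
            break
        p = r + (rs_new / rs) * p
        rs = rs_new
    return x

w = rng.standard_normal(d)
step = cg(w, -grad(w))                # inexact Newton step: H p = -g
w_newton = w + step
```

Because curvature information rescales each direction by its eigenvalue, a single Newton step makes far more progress on stiff directions than a gradient step would; in the stochastic setting, truncating the noisiest Hessian directions plays the regularizing role described in the abstract.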

Bio
Tom O'Leary-Roseberry is a postdoctoral researcher working with Omar Ghattas at the Oden Institute. His research interests include scientific machine learning and stochastic nonconvex optimization. He is interested in designing efficient methods for constructing machine learning models to assist in the solution of high-dimensional stochastic many-query problems that arise in computational engineering settings, as well as designing efficient matrix-free Newton methods for the solution of stochastic nonconvex optimization problems. He received his Ph.D. in CSEM at the Oden Institute in 2020, and a BA in Mathematics and BSE in Engineering Mechanics from the University of Wisconsin–Madison in 2015.

Note: the Babuška Forum series was started by Professor Ivo Babuška several years ago to expose students to interesting and curious topics relevant to computational engineering and science with technical content at the graduate student level (i.e. the focus of the lectures is on main ideas with some technical content). Seminar credit is given to those students who attend.

For questions, please contact: stefan@oden.utexas.edu

Hosted by Stefan Henneking
