Talks and Presentations

Science-guided Machine Learning for Forward, Inverse, and Control Problems

March 01, 2023

Talk, Grand Valley State University, Michigan, USA

Scientific machine learning (SciML) is an interdisciplinary field that solves complex scientific problems by combining computational and algorithmic techniques with machine learning methods. This talk will cover the most recent developments in SciML. We will highlight the limitations of current methodologies and explore new ideas to address them. Several exemplar problems will be investigated, including optimal control for dynamical systems and inference on chaotic models. Following that, we will look at how our methods might be used in applications such as climate modeling, robotics, and biology.

On Time-stepping Methods for Gradient-flow Optimization

March 01, 2022

Talk, The Canadian Applied and Industrial Mathematics Society / SCMAI 2022, Ontario, Canada

Gradient-based optimization methods are essential to neural network training in many applications. The evolution of the neural network parameters can be considered as an ODE system evolving in pseudo-time toward a local minimum of the objective function. This interpretation allows us to use different time-stepping schemes for the optimization. We will show that existing gradient-descent and momentum methods such as SGD and ADAM can be viewed as special time-discretizations of the continuous gradient flow. We will also consider using IMEX and high-order schemes for improved efficiency in the optimization. Some demonstrations on small test problems will be presented.
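As a small illustration of the gradient-flow viewpoint (a sketch for this page, not material from the talk): plain gradient descent with step size h is exactly one forward-Euler step per iteration applied to the gradient flow x' = -f'(x). The objective f(x) = x^2 below is a hypothetical toy example.

```python
# Gradient descent on f(x) = x^2, read as forward-Euler time stepping
# of the continuous gradient flow x'(t) = -f'(x).

def grad(x):
    """Gradient of the toy objective f(x) = x^2."""
    return 2.0 * x

def forward_euler_gd(x0, h, steps):
    """Each iteration is one explicit Euler step: x_{k+1} = x_k - h * f'(x_k)."""
    x = x0
    for _ in range(steps):
        x = x - h * grad(x)
    return x

# With h = 0.1 the iterate contracts by a factor (1 - 2h) = 0.8 per step,
# so it converges to the minimizer x* = 0.
x_final = forward_euler_gd(1.0, h=0.1, steps=50)
```

Viewed this way, the learning rate is just the pseudo-time step, and swapping forward Euler for another discretization (momentum, IMEX, higher order) changes the optimizer, not the flow.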

Linearly Implicit General Linear Methods

March 02, 2021

Talk, SIAM Conference on Computational Science and Engineering 2021, Atlanta, GA

Linearly implicit Runge-Kutta methods provide a fitting balance between implicit treatment of stiff systems and computational cost. We extend the class of linearly implicit Runge-Kutta methods to include multi-stage and multi-step methods. We discuss the order conditions needed to achieve high stage order and overall accuracy while admitting arbitrary Jacobians. Several classes of linearly implicit general linear methods (GLMs) are discussed based on existing families such as type-II and type-IV GLMs, two-step Runge-Kutta methods, parallel IMEX GLMs, and BDF methods. We investigate the stability implications for stiff problems and provide numerical studies of the behavior of our methods compared to others.
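To show the basic idea behind linearly implicit methods (a minimal sketch for this page, not one of the GLM schemes from the talk): the simplest member, a one-stage Rosenbrock-Euler step, replaces the nonlinear solve of implicit Euler with a single linear solve (I - hJ)k = f(y_n). The scalar stiff test problem y' = λy with λ = -50 is a hypothetical example.

```python
# One-stage linearly implicit (Rosenbrock-Euler) method on the stiff
# scalar test problem y' = lam * y:
#   solve (1 - h*J) k = f(y_n), then set y_{n+1} = y_n + h*k.

lam = -50.0

def f(y):
    return lam * y

def rosenbrock_euler(y0, h, steps, J=lam):
    """In the scalar case the linear solve (1 - h*J) k = f(y) is a division."""
    y = y0
    for _ in range(steps):
        k = f(y) / (1.0 - h * J)
        y = y + h * k
    return y

# With h = 0.1 explicit Euler diverges here (|1 + h*lam| = 4 > 1),
# while this linearly implicit step contracts by 1/6 per step.
y_final = rosenbrock_euler(1.0, h=0.1, steps=100)
```

One linear solve per step is the whole implicit cost, which is the trade-off the abstract refers to; W-method variants further allow an approximate Jacobian in place of the exact J.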

Discrete Multirate GARK Schemes

March 01, 2019

Talk, SIAM Conference on Computational Science and Engineering 2019, Spokane, WA

Multirate time integration schemes apply different step sizes to different components of a system based on the local dynamics of the components. Local selection of step sizes allows increased computational efficiency while maintaining the desired solution accuracy. The multirate idea is elegant and has been around for decades; however, difficulties faced in the construction of high-order multirate schemes have hampered their application. Seeking to overcome these challenges, our work focuses on the design of high-order multirate methods using the theoretical framework of generalized additive Runge-Kutta (GARK) methods, which provides the generic order conditions and the stability analyses. Of special interest is deriving methods that avoid unnecessary coupling between the components of the system and allow straightforward transitions to different step sizes between the steps. We present multirate GARK schemes of up to order four that are explicit-explicit, implicit-explicit, and explicit-implicit in different components. We present numerical experiments illustrating the performance of these new schemes.
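To make the multirate idea concrete (an illustrative sketch for this page, not one of the high-order GARK schemes from the talk): the simplest multirate scheme takes one forward-Euler macro-step for the slow component and m Euler micro-steps for the fast component. The partitioned linear system below (s' = -s slow, q' = -100q + s fast) is a hypothetical example.

```python
# Simplest multirate scheme: forward Euler with one macro-step H for the
# slow variable s and m micro-steps h = H/m for the fast variable q, on
#   s' = -s          (slow dynamics)
#   q' = -100*q + s  (fast dynamics, coupled to s)

def multirate_euler(s0, q0, H, m, steps):
    s, q = s0, q0
    h = H / m                        # fast micro-step
    for _ in range(steps):
        s_new = s + H * (-s)         # one macro-step for the slow part
        for _ in range(m):
            # micro-steps for the fast part; the slow value is frozen
            # over the macro-step (the coupling choice matters for order)
            q = q + h * (-100.0 * q + s)
        s = s_new
    return s, q

# m = 10 micro-steps keep the fast component stable (|1 - 100*h| < 1)
# while the slow component advances with the ten-times-larger step H.
s_end, q_end = multirate_euler(1.0, 0.0, H=0.01, m=10, steps=100)
```

The coupling terms, i.e. which value of the slow variable the fast substeps see (and vice versa), are exactly where the GARK framework supplies the order conditions for going beyond first order.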