Estimating Committor Functions via Deep Adaptive Sampling on Rare Transition Paths

Release Time: 2025-04-17

Speaker:  Kejun Tang, Great Bay University.

Time:        14:00, Apr. 18th.

Location: SIST 1C-101

Host:        Qifeng Liao

Abstract:

Committor functions are central to investigating rare but important events in molecular simulations, yet computing them is known to suffer from the curse of dimensionality. Recently, estimating the committor function with neural networks has gained attention for its potential in high-dimensional problems. Training such networks, however, requires transition data, and sampling these data from straightforward simulations of rare events is highly inefficient. The scarcity of transition data makes approximating the committor function challenging. To address this problem, we propose an efficient framework for generating data points in the transition state region that help train neural networks to approximate the committor function. We design a Deep Adaptive Sampling method for TRansition paths (DASTR), in which deep generative models generate samples that effectively capture the information of transitions. In particular, we treat a non-negative function in the integrand of the loss functional as an unnormalized probability density function and approximate it with a deep generative model. New samples drawn from the generative model concentrate in the transition state region, with few falling elsewhere; this distribution provides effective samples for approximating the committor function and significantly improves accuracy. We demonstrate the effectiveness of the proposed method through both simulations and realistic examples.
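The core idea can be illustrated in one dimension with a minimal numerical sketch. This is not the DASTR implementation from the talk: the double-well potential, the crude sigmoid surrogate for the committor, and all parameter values below are illustrative assumptions, and simple weighted resampling stands in for the deep generative model. The sketch treats the non-negative integrand of the committor loss, |∇q|² e^{-βV}, as an unnormalized density and draws samples proportional to it, so the resulting points concentrate near the transition state region at the barrier rather than in the metastable wells.

```python
import numpy as np

rng = np.random.default_rng(0)
beta = 3.0  # inverse temperature (illustrative value)

def V(x):
    """Double-well potential with wells at x = -1 and x = +1."""
    return (x**2 - 1.0)**2

def q_guess(x):
    """Crude committor surrogate: sigmoid rising from well A to well B."""
    return 1.0 / (1.0 + np.exp(-4.0 * x))

def grad_q(x):
    """Analytic derivative of the sigmoid surrogate."""
    q = q_guess(x)
    return 4.0 * q * (1.0 - q)

# Candidate pool drawn uniformly over the domain
x = rng.uniform(-1.5, 1.5, size=20000)

# Unnormalized density: the non-negative integrand of the loss functional
w = grad_q(x)**2 * np.exp(-beta * V(x))
p = w / w.sum()

# Resample candidates proportional to w -- a simple stand-in for
# sampling from the trained deep generative model in DASTR
idx = rng.choice(x.size, size=5000, p=p)
samples = x[idx]

# Fraction of samples near the barrier (|x| < 0.5); a uniform pool
# would put only about 1/3 of its points there
frac_transition = np.mean(np.abs(samples) < 0.5)
print(frac_transition)
```

Because the weight w is large where the committor varies rapidly and the Boltzmann factor is not negligible, the resampled points cluster around the barrier at x = 0, which is exactly where training data for the committor network are scarcest under brute-force sampling.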

Bio:

Kejun Tang is an Assistant Professor in the School of Sciences at Great Bay University. He obtained his PhD from the School of Information Science and Technology at ShanghaiTech University and has worked at Pengcheng National Laboratory, the PKU-Changsha Institute for Computing and Digital Economy, and Shenzhen University of Advanced Technology. His main research interests lie at the intersection of scientific computing and machine learning, focusing on tensor computing, deep generative models, uncertainty quantification, and the interplay between differential equations and neural networks. His main research results have been published in the Journal of Computational Physics (JCP), SIAM Journal on Scientific Computing (SISC), Journal of Scientific Computing (JSC), and ICLR. He has also served as a reviewer for multiple journals and conferences, including the Journal of Computational Physics, SIAM Journal on Scientific Computing, Journal of Scientific Computing, and ICLR.