Maximum Conditional Entropy Hamiltonian Monte Carlo Sampler

Publisher: 闻天明 | Release Time: 2020-10-16

Tencent Meeting: 432 960 536


Sign-in for this lecture must be done on-site.
--------------------------------------------

Speaker:   Prof. Jinglai Li

Time:       15:00-16:00, Oct. 22

Location:  SIST 1A 402

Host:       Prof. Qifeng Liao

 

Abstract:

The performance of the Hamiltonian Monte Carlo (HMC) sampler depends critically on algorithm parameters such as the total integration time and the numerical integration stepsize. Parameter tuning is particularly challenging when the mass matrix of the HMC sampler is adapted. In this work we propose a Kolmogorov-Sinai entropy (KSE) based design criterion for optimizing these algorithm parameters, which avoids some potential issues with the commonly used jumping-distance-based measures. For near-Gaussian distributions, we derive the optimal algorithm parameters with respect to the KSE criterion analytically. As a byproduct, the KSE criterion also provides a theoretical justification for adapting the mass matrix in the HMC sampler. Based on these results, we propose an adaptive HMC algorithm and demonstrate its performance with numerical examples.
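For readers unfamiliar with the parameters the abstract refers to, the following is a minimal sketch of a standard HMC sampler (not the speaker's KSE-based method), showing where the stepsize, the number of leapfrog steps (stepsize × steps = total integration time), and the mass matrix enter. All names and the toy Gaussian target are illustrative assumptions.

```python
import numpy as np

# Toy target: a standard 2-D Gaussian (log density up to a constant).
def log_prob(q):
    return -0.5 * np.dot(q, q)

def grad_log_prob(q):
    return -q

def hmc_step(q, eps, n_steps, M_inv, rng):
    """One HMC transition. `eps` is the stepsize, `n_steps` the number of
    leapfrog steps, and `M_inv` the inverse mass matrix -- the parameters
    whose tuning the abstract discusses."""
    # Sample momentum p ~ N(0, M).
    p = rng.multivariate_normal(np.zeros(len(q)), np.linalg.inv(M_inv))
    q_new, p_new = q.copy(), p.copy()
    # Leapfrog integration of the Hamiltonian dynamics.
    p_new += 0.5 * eps * grad_log_prob(q_new)
    for _ in range(n_steps - 1):
        q_new += eps * M_inv @ p_new
        p_new += eps * grad_log_prob(q_new)
    q_new += eps * M_inv @ p_new
    p_new += 0.5 * eps * grad_log_prob(q_new)
    # Metropolis accept/reject on the Hamiltonian
    # (potential energy + kinetic energy).
    h_old = -log_prob(q) + 0.5 * p @ M_inv @ p
    h_new = -log_prob(q_new) + 0.5 * p_new @ M_inv @ p_new
    if np.log(rng.random()) < h_old - h_new:
        return q_new
    return q

rng = np.random.default_rng(0)
M_inv = np.eye(2)                 # identity mass matrix for this toy example
q = np.zeros(2)
samples = []
for _ in range(2000):
    q = hmc_step(q, eps=0.3, n_steps=10, M_inv=M_inv, rng=rng)
    samples.append(q)
samples = np.array(samples)
print(samples.shape)
```

An adaptive scheme such as the one proposed in the talk would adjust `eps`, `n_steps`, and `M_inv` during sampling according to a design criterion, rather than fixing them as above.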

 

Bio:

Jinglai Li received the B.Sc. degree in Applied Mathematics from Sun Yat-sen University in 2002 and the PhD degree in Mathematics from SUNY Buffalo in 2007. He then did postdoctoral research at Northwestern University (2007-2010) and MIT (2010-2012). He subsequently worked at Shanghai Jiao Tong University (Associate Professor, 2012-2018) and the University of Liverpool (Reader, 2018-2020). Jinglai Li joined the University of Birmingham in 2020, where he is currently a Professor of Mathematical Optimisation. His current research interests are in scientific computing, computational statistics, uncertainty quantification, data science, and their applications.

SIST-Seminar 18245