Entropy Estimation via Normalizing Flow

Release Time: 2021-10-07

Speaker:    Prof. Jinglai Li, University of Birmingham

Time:         15:00-16:00, Oct. 8, 2021

Location:   SIST 1C 101

Host:          Prof. Qifeng Liao

 

Abstract:

Entropy estimation is an important problem in information theory and statistical science. Many popular entropy estimators suffer from estimation bias that grows quickly with dimensionality, rendering them unsuitable for high-dimensional problems. In this work we propose a transform-based method for high-dimensional entropy estimation, which consists of two main ingredients. First, by modifying the k-NN based entropy estimator, we propose a new estimator that has a small estimation bias for samples drawn from a distribution close to uniform. Second, we design a normalizing-flow-based mapping that pushes samples toward a uniform distribution, and we derive the relation between the entropy of the original samples and that of the transformed ones. As a result, the entropy of a given set of samples is estimated by first transforming them toward a uniform distribution and then applying the proposed estimator to the transformed samples. Numerical experiments demonstrate the effectiveness of the method on high-dimensional entropy estimation problems.
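
The abstract gives only a high-level description of the method, so the sketch below is an illustrative assumption rather than the speaker's actual estimator: it pairs a standard Kozachenko-Leonenko k-NN entropy estimate with the change-of-variables identity H(X) = H(T(X)) - E[log|det ∇T(X)|] for an invertible map T. The function names (knn_entropy, flow_based_entropy) and the flow argument, assumed to return transformed points together with the per-sample log-determinant of its Jacobian, are hypothetical placeholders.

import numpy as np
from scipy.special import digamma, gammaln
from scipy.spatial import cKDTree

def knn_entropy(samples, k=5):
    # Plain Kozachenko-Leonenko k-NN entropy estimate (in nats).
    # A generic baseline, not the modified estimator from the talk.
    n, d = samples.shape
    tree = cKDTree(samples)
    # Distance to the k-th nearest neighbour (k+1 queries to exclude the point itself).
    eps = tree.query(samples, k=k + 1)[0][:, -1]
    # log of the volume of the d-dimensional unit ball.
    log_ball_vol = (d / 2) * np.log(np.pi) - gammaln(d / 2 + 1)
    return digamma(n) - digamma(k) + log_ball_vol + d * np.mean(np.log(eps))

def flow_based_entropy(samples, flow, k=5):
    # `flow` is a placeholder for a trained invertible map (e.g. a normalizing
    # flow pushing samples toward a uniform distribution), assumed to return the
    # transformed points and log|det Jacobian| evaluated at each sample.
    # Change of variables: H(X) = H(T(X)) - E[log|det dT/dx|].
    z, log_det = flow(samples)
    return knn_entropy(z, k=k) - np.mean(log_det)

In this sketch the flow absorbs the geometry of the data, so the k-NN estimate is applied to points whose distribution is close to uniform, which is where such estimators behave best; the Jacobian correction then recovers the entropy of the original samples.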

Bio:

Jinglai Li received the B.Sc. degree in Applied Mathematics from Sun Yat-sen University in 2002 and the PhD degree in Mathematics from SUNY Buffalo in 2007. After completing his PhD, he did postdoctoral research at Northwestern University (2007-2010) and MIT (2010-2012). He subsequently worked at Shanghai Jiao Tong University (Associate Professor, 2012-2018) and the University of Liverpool (Reader, 2018-2020). He joined the University of Birmingham in 2020, where he is currently a Professor of Mathematical Optimisation. His current research interests are in scientific computing, computational statistics, uncertainty quantification, data science, and their applications.