Nonconvex Approaches for Data Science

Release Time: 2019-06-03

Speaker:    Dr. Zhihui Zhu

Time:        10:00-12:00, June 4

Location:    SIST 1A-200

Host:       Prof. Manolis Tsakiris

Abstract:

As technological advances in fields such as the Internet, medicine, finance, and remote sensing produce larger and more complex data sets, we face the challenge of efficiently and effectively extracting meaningful information from large-scale, high-dimensional signals and data. Many modern approaches to this challenge (e.g., deep learning) naturally involve nonconvex optimization formulations. In theory, however, finding a global minimum of a general nonconvex problem is difficult; even obtaining a local minimizer can be computationally hard.

In this lecture, I will present specific benign geometric properties, coupled with algorithmic developments, for nonconvex optimization: (i) a benign global landscape, in the sense that every local minimum is a global minimum and all other critical points are strict saddles, which ensures that iterative algorithms converge from arbitrary or random initializations; and (ii) a sufficiently large basin of attraction around the global minima, which enables optimization algorithms that converge rapidly to a global minimum from a data-driven initialization. I will use two canonical examples, low-rank matrix factorization and robust subspace learning, to illustrate the details of this approach. I will also discuss the effectiveness of this geometric analysis framework in the context of training shallow neural networks, blind deconvolution, robust principal component analysis, and related problems. At the end of the lecture, I will discuss open problems and future directions for enriching this framework.
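To make the first point concrete, below is a minimal sketch (an illustration, not the speaker's code) of plain gradient descent from a random initialization on the symmetric low-rank matrix factorization objective f(U) = ||UU^T - M||_F^2, whose landscape is known from prior work to have no spurious local minima when M is exactly rank r. The problem sizes, step size, and iteration count are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative sizes: a rank-3 ground truth M = U* U*^T of dimension 50 x 50.
n, r = 50, 3
U_star = rng.standard_normal((n, r))
M = U_star @ U_star.T

def objective(U):
    # f(U) = ||U U^T - M||_F^2
    return np.linalg.norm(U @ U.T - M, "fro") ** 2

def gradient(U):
    # grad f(U) = 4 (U U^T - M) U
    return 4.0 * (U @ U.T - M) @ U

# Plain gradient descent from a random initialization; the step size is a
# conservative choice tied to the spectral norm of M.
U = rng.standard_normal((n, r))
step = 1.0 / (20.0 * np.linalg.norm(M, 2))
for _ in range(5000):
    U -= step * gradient(U)

print("final objective:", objective(U))
print("relative error :", np.linalg.norm(U @ U.T - M, "fro") / np.linalg.norm(M, "fro"))
```

Because every local minimum of this objective is global, one expects the relative error to shrink toward zero from a generic random start; the data-driven initialization in point (ii) plays the analogous role in settings where such a global guarantee is unavailable.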

Bio:

Zhihui Zhu is a Postdoctoral Fellow in the Mathematical Institute for Data Science at the Johns Hopkins University. He received his B.Eng. degree in communications engineering from Zhejiang University of Technology (Jianxing Honors College) in 2012, and his Ph.D. degree in electrical engineering from the Colorado School of Mines in 2017, where his research received a Graduate Research Award. His research interests span machine learning, signal processing, data science, and optimization. His current research focuses largely on the theory and applications of nonconvex optimization and low-dimensional models in large-scale machine learning and signal processing problems.

SIST-Seminar 18164