When Are Nonconvex Optimization Problems Not Scary?
Date: 2016/6/2


Speaker: Ju Sun

Time: Jun 2, 3:30pm - 4:30pm.

Location: Lecture Hall, Administration Center

Abstract:

General nonconvex optimization problems are NP-hard. In applied disciplines, however, nonconvex problems abound, and heuristic algorithms are often surprisingly effective. The ability of nonconvex heuristics to find high-quality solutions for practical problems remains largely mysterious.

In this talk, I will describe a family of nonconvex problems that can be solved efficiently. This family has the characteristic structure that (1) every local minimizer is also a global minimizer, and (2) the objective function has negative directional curvature around all saddle points (“ridable saddles”). Natural (nonconvex) formulations for a number of important problems in signal processing and machine learning lie in this family, including the eigenvector problem, complete dictionary learning (CDL), generalized phase retrieval (GPR), orthogonal tensor decomposition, and various synchronization problems. This benign geometric structure allows a number of optimization methods to efficiently find a global minimizer without special initializations. To corroborate this, I will describe the second-order trust-region method. This geometric approach to solving nonconvex problems has led to new computational guarantees for several practical problems, such as CDL and GPR.
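To give a concrete feel for this geometry, here is a small Python sketch (not part of the original announcement) for the simplest member of the family, the eigenvector problem: minimize f(x) = -x'Ax/2 over the unit sphere. Its only minimizers are the top eigenvectors of A, and every other critical point is a ridable saddle. The sketch uses plain Riemannian gradient steps plus negative-curvature steps as a stand-in for the trust-region method mentioned above; the specific matrix, step size, and tolerances are illustrative assumptions.

```python
# Minimal sketch (not the talk's trust-region method): gradient descent plus
# negative-curvature ("ridable saddle") steps for
#     minimize  f(x) = -x' A x / 2   subject to  ||x|| = 1,
# whose only local (hence global) minimizers are +/- the top eigenvector of A
# and whose other critical points are saddles with negative curvature.
import numpy as np

rng = np.random.default_rng(0)
n = 20
Q, _ = np.linalg.qr(rng.standard_normal((n, n)))
A = Q @ np.diag(np.linspace(0.0, 1.0, n)) @ Q.T   # symmetric, eigenvalues 0, ..., 1

def proj(x, g):                 # projection onto the tangent space of the sphere at x
    return g - (x @ g) * x

def rgrad(x):                   # Riemannian gradient of f at x
    return proj(x, -A @ x)

def rhess(x):                   # Riemannian Hessian of f at x, as a matrix
    P = np.eye(n) - np.outer(x, x)
    return -P @ A @ P + (x @ A @ x) * P

evals, evecs = np.linalg.eigh(A)                    # eigenvalues in ascending order
x = evecs[:, -2] + 1e-6 * rng.standard_normal(n)    # start essentially at a saddle point
x /= np.linalg.norm(x)

eta, tol_g, tol_h = 0.2, 1e-8, 1e-4
for it in range(20000):
    g = rgrad(x)
    if np.linalg.norm(g) > tol_g:
        step = -g                         # ordinary gradient step
    else:
        lam, V = np.linalg.eigh(rhess(x))
        if lam[0] < -tol_h:
            step = V[:, 0]                # escape the saddle along negative curvature
        else:
            break                         # approximate second-order stationary point
    x = x + eta * step
    x /= np.linalg.norm(x)                # retract back onto the sphere

print("iterations:", it)
print("|<x, top eigenvector>| =", abs(x @ evecs[:, -1]))   # ~1.0: global minimizer found
```

Even starting essentially at a saddle point, the loop escapes and aligns with the top eigenvector. The same two ingredients, first-order information away from critical points and second-order (curvature) information near saddle points, are what the trust-region method described in the talk uses in a single unified step.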

To complete and enrich the framework is an ongoing research effort. I will highlight challenges from both theoretical and algorithmic sides.

Joint work with Qing Qu and John Wright. An overview article is available online at http://arxiv.org/abs/1510.06096; see also my PhD thesis (http://sunju.org/pub/docs/thesis.pdf) for details.

Bio:

Ju Sun received his B.Eng. degree in Computer Engineering (with a minor in Mathematics) from the National University of Singapore in 2008 (2004 - 2008), and his PhD degree in Electrical Engineering from Columbia University in 2016 (2011 - 2016). In Fall 2016, he will join Professor Emmanuel Candes’ group at Stanford University as a postdoctoral researcher. His research interests lie at the intersection of computer vision, machine learning, numerical optimization, signal/image processing, information theory, and compressive sensing, with a focus on modeling, harnessing, and computing with low-dimensional structures in massive data, with provable guarantees and practical algorithms. Recently, he has been particularly interested in explaining the surprising effectiveness of nonconvex optimization heuristics arising in practical problems. He maintains a webpage dedicated to this topic: http://sunju.org/research/nonconvex/ .


SIST-Seminar 16039