Title: Exploiting Second Order Sparsity in Big Data Optimization
Speaker: Prof. Toh Kim Chuan (National University of Singapore)
Time: 2019-05-09 (Thursday), 10:00
Place: G507
Abstract: In this talk, we shall demonstrate how second order sparsity (SOS) can be exploited to design highly efficient algorithms for important optimization problems, such as sparse optimization models in machine learning, semidefinite programming, and many others.
The SOS property appears naturally when one applies a semismooth Newton (SSN) method to solve the subproblems in an augmented Lagrangian method (ALM) designed for certain classes of structured convex optimization problems. With an in-depth analysis of the underlying generalized Jacobians and a sophisticated numerical implementation, one can solve the subproblems at surprisingly low cost. For Lasso problems with sparse solutions, the cost of solving a single ALM subproblem by our second order method is comparable to, or even lower than, that of a single iteration of many first order methods.
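As a minimal sketch of where the SOS property arises (our own illustration, not part of the original abstract; the notation below, including the penalty parameter \sigma and the Newton direction d, is assumed), consider the Lasso problem

\[
\min_{x \in \mathbb{R}^n} \; \tfrac{1}{2}\|Ax - b\|^2 + \lambda \|x\|_1 .
\]

The SSN method applied to an ALM subproblem involves the proximal mapping of \lambda\|\cdot\|_1 (soft thresholding), whose generalized Jacobian contains diagonal 0-1 matrices \Theta = \mathrm{Diag}(\theta). The semismooth Newton system then takes the schematic form

\[
(I + \sigma A \Theta A^{\top})\, d = r, \qquad A \Theta A^{\top} = A_J A_J^{\top}, \quad J = \{\, i : \theta_i = 1 \,\},
\]

so its cost is governed by |J|, the number of nonzeros in the current iterate, rather than by the full dimension n. This diagonal, support-restricted structure is the second order sparsity referred to above.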
Consequently, with the fast convergence of the SSN-based ALM, we are able to solve many challenging large-scale convex optimization problems in big data applications efficiently and robustly. As an illustration, we present a highly efficient software package called SuiteLasso for solving various well-known Lasso-type problems.
This talk is based on joint work with Xudong Li (Fudan U.) and Defeng Sun (PolyU).