Department of Mathematics Seminar No. 1819: Exploiting Second Order Sparsity in Big Data Optimization

Posted: 2019/05/08 by 龚惠英

Title: Exploiting Second Order Sparsity in Big Data Optimization
Speaker: Prof. 卓金全 (National University of Singapore)
Time: Thursday, May 9, 2019, 10:00
Venue: Room G507, Main Campus
Invited by: Prof. 白延琴
Organizer: Department of Mathematics
Abstract: In this talk, we shall demonstrate how second order sparsity (SOS) in important optimization problems, such as sparse optimization models in machine learning, semidefinite programming, and many others, can be exploited to design highly efficient algorithms.
The SOS property appears naturally when one applies a semismooth Newton (SSN) method to solve the subproblems in an augmented Lagrangian method (ALM) designed for certain classes of structured convex optimization problems. With an in-depth analysis of the underlying generalized Jacobians and a sophisticated numerical implementation, one can solve the subproblems at surprisingly low cost. For lasso problems with sparse solutions, the cost of solving a single ALM subproblem by our second order method is comparable to, or even lower than, that of a single iteration of many first order methods.
Consequently, with the fast convergence of the SSN-based ALM, we are able to solve many challenging large-scale convex optimization problems in big data applications efficiently and robustly. For illustration, we present a highly efficient software package called SuiteLasso for solving various well-known lasso-type problems.
This talk is based on joint work with Xudong Li (Fudan U.) and Defeng Sun (PolyU).
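To make the second order sparsity mechanism described in the abstract concrete, here is a minimal NumPy sketch, assuming the dual ALM formulation of the lasso problem min_x 0.5*||Ax - b||^2 + lam*||x||_1. It is not code from the speaker's SuiteLasso package, and the names (newton_direction, sigma, z) are illustrative assumptions. The key point: the generalized Jacobian of the soft-thresholding operator is a 0/1 diagonal matrix Theta supported on the few active coordinates, so the m x m Newton system (I + sigma * A Theta A^T) d = rhs collapses, via the Sherman-Morrison-Woodbury identity, to a k x k system, where k is the number of active coordinates.

```python
import numpy as np

def soft_threshold(z, lam):
    """Proximal map of lam * ||.||_1 (soft-thresholding)."""
    return np.sign(z) * np.maximum(np.abs(z) - lam, 0.0)

def newton_direction(A, rhs, z, lam, sigma):
    """Solve (I + sigma * A @ Theta @ A.T) d = rhs exploiting SOS.

    Theta = Diag(theta) with theta_i = 1 iff |z_i| > lam, i.e. one
    element of the generalized Jacobian of soft_threshold at z.
    Only the k active columns of A enter the system, so the
    Sherman-Morrison-Woodbury identity turns the m x m solve into
    a k x k solve, where k is the sparsity of soft_threshold(z, lam).
    """
    active = np.abs(z) > lam            # diagonal of Theta (0/1 entries)
    AJ = A[:, active]                   # m x k, with k typically << n
    k = AJ.shape[1]
    if k == 0:                          # Theta = 0: system is I d = rhs
        return rhs.copy()
    small = np.eye(k) / sigma + AJ.T @ AJ   # k x k matrix, not m x m
    return rhs - AJ @ np.linalg.solve(small, AJ.T @ rhs)

# Quick check against a dense m x m reference solve.
rng = np.random.default_rng(0)
m, n, lam, sigma = 50, 2000, 1.0, 1.0
A = rng.standard_normal((m, n))
z = np.zeros(n)
z[:5] = 3.0                             # sparse point: only 5 active coordinates
rhs = rng.standard_normal(m)
d = newton_direction(A, rhs, z, lam, sigma)
theta = (np.abs(z) > lam).astype(float)
V = np.eye(m) + sigma * (A * theta) @ A.T   # dense I + sigma * A Theta A^T
assert np.allclose(V @ d, rhs)
```

With k = 5 active coordinates out of n = 2000, each Newton step costs on the order of m*k^2 + k^3 operations instead of the m^3 of a dense factorization. This is the sense in which sparsity in the generalized Jacobian, rather than in the data itself, makes each step of a second order method surprisingly cheap.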

 


Faculty and staff are welcome to attend!


