Title: A random perturbation approach to some stochastic approximation algorithms in optimization
Speaker: Prof. Wenqing Hu (Missouri S&T)
Time: 10:00, Sunday, May 27, 2018
Venue: Room G309, Main Campus
Host: 敖平
Abstract:
Many large-scale learning problems in modern statistics and machine learning can be reduced to solving stochastic optimization problems, i.e., searching for (local) minimum points of the expectation of a random objective function (loss function). These optimization problems are usually solved by stochastic approximation algorithms, which are recursive update rules with random inputs at each iteration. In this talk, we will consider several such stochastic approximation algorithms, including stochastic gradient descent, stochastic composite gradient descent, and the stochastic heavy-ball method. By introducing diffusion processes that approximate the discrete recursive schemes, we will analyze the convergence behavior of these algorithms through their diffusion limits, using delicate techniques from stochastic analysis and asymptotic methods, in particular random perturbations of dynamical systems. This talk is based on a series of joint works with Chris Junchi Li (Princeton), Weijie Su (UPenn), and Haoyi Xiong (Missouri S&T -> Baidu).
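For readers unfamiliar with the recursive schemes mentioned above, the following minimal Python sketch illustrates the kind of update rules involved: plain stochastic gradient descent and the stochastic heavy-ball (momentum) method, run on a toy quadratic loss with artificial gradient noise. The loss function, step size eta, and momentum beta are illustrative assumptions for the sketch, not the speaker's actual setting or results.

    import numpy as np

    def noisy_grad(x, rng):
        # Gradient of a toy quadratic loss f(x) = 0.5 * ||x||^2,
        # corrupted by Gaussian noise to mimic a stochastic gradient oracle.
        return x + 0.1 * rng.standard_normal(x.shape)

    def sgd(x0, eta=0.01, steps=1000, seed=0):
        # Stochastic gradient descent: x_{k+1} = x_k - eta * g(x_k, xi_k).
        rng = np.random.default_rng(seed)
        x = np.array(x0, dtype=float)
        for _ in range(steps):
            x = x - eta * noisy_grad(x, rng)
        return x

    def heavy_ball(x0, eta=0.01, beta=0.9, steps=1000, seed=0):
        # Stochastic heavy-ball: the SGD step plus a momentum term
        # beta * (x_k - x_{k-1}) carried over from the previous iterate.
        rng = np.random.default_rng(seed)
        x_prev = np.array(x0, dtype=float)
        x = x_prev.copy()
        for _ in range(steps):
            x_next = x - eta * noisy_grad(x, rng) + beta * (x - x_prev)
            x_prev, x = x, x_next
        return x

    print(sgd([2.0, -1.5]))         # both iterates drift toward the minimizer at 0
    print(heavy_ball([2.0, -1.5]))

In the small-step-size regime, such recursions are naturally compared with continuous-time diffusion processes, which is the viewpoint taken in the talk.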