
School of Science Academic Seminar, Week 1, Winter Term, 2018-2019 Academic Year (7)


Academic Seminar


Title: Globally and quadratically convergent Newton method for sparse optimization


Speaker: Shenglong Zhou


Affiliation: University of Southampton


Time: January 21, 2019, 10:40-11:20


Venue: Room 101, Siyuan Building


Abstract: Newton's method for continuous optimization is notoriously slow when the linear system solved at each step is large (e.g., more than ten thousand variables/equations), yet it converges fast (often at a quadratic rate) once the iterates enter a neighborhood of a solution. How does it perform on sparsity constrained optimization (SCO), a combinatorial problem? This talk answers that question. Theoretically, we give the first proof that the proposed Newton method converges to a stationary point of SCO globally and quadratically under standard assumptions. Numerically, solving a linear system with n variables at each Newton step costs O(n^3) in general, whereas the proposed method enjoys a significantly lower complexity of O(s^3), where s << n is a given scalar controlling the number of non-zeros in a solution; this makes the method computationally fast even for large n. Compared with other state-of-the-art methods on compressed sensing in signal processing and sparse logistic regression in machine learning with big data, the proposed method exhibits remarkable advantages.
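The O(s^3) per-step cost in the abstract comes from restricting the Newton system to a candidate support of size s instead of solving an n-by-n system. A minimal sketch of that idea for compressed sensing (least squares under an l0 constraint), assuming an IHT-style support selection; the function name `support_newton` and all parameter choices are hypothetical illustrations, not the speaker's exact algorithm:

```python
import numpy as np

def support_newton(A, b, s, max_iter=50, eta=1.0, tol=1e-10):
    """Sketch of a restricted-Newton step for min ||Ax-b||^2 s.t. ||x||_0 <= s.

    Each iteration picks a support T of size s by hard-thresholding a
    gradient step, then solves the s-by-s Newton system on T only,
    so the per-iteration linear-algebra cost is O(s^3), not O(n^3).
    """
    m, n = A.shape
    x = np.zeros(n)
    for _ in range(max_iter):
        grad = A.T @ (A @ x - b)                     # full gradient
        # hard-threshold a gradient step to choose the support T, |T| = s
        T = np.argsort(-np.abs(x - eta * grad))[:s]
        # restricted Newton step: for least squares the Hessian on T is
        # A_T^T A_T, so only an s x s linear system is solved
        AT = A[:, T]
        xT = np.linalg.solve(AT.T @ AT, AT.T @ b)
        x_new = np.zeros(n)
        x_new[T] = xT
        if np.linalg.norm(x_new - x) < tol:
            x = x_new
            break
        x = x_new
    return x

# small compressed-sensing example with a 3-sparse ground truth
rng = np.random.default_rng(0)
A = rng.standard_normal((40, 100))
x_true = np.zeros(100)
x_true[[5, 20, 77]] = [2.0, -1.5, 1.0]
b = A @ x_true
x_hat = support_newton(A, b, s=3)
print("residual:", np.linalg.norm(A @ x_hat - b))
```

Once the support stabilizes, the iteration reduces to Newton's method on a fixed s-dimensional subproblem, which is where the fast local convergence described in the abstract comes from.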


Host: Lingchen Kong

All students and faculty are warmly welcome to attend!