
[Academic Seminar] School of Mathematics Academic Seminar by Prof. Zhang Chao: Adaptive smoothing mini-batch stochastic accelerated gradient method for nonsmooth convex stochastic composite optimization

Published: 2022-03-14

Academic Seminar, School of Mathematics, Yunnan Normal University

Title: Adaptive smoothing mini-batch stochastic accelerated gradient method for nonsmooth convex stochastic composite optimization

Speaker: Zhang Chao (Beijing Jiaotong University)

Time: March 14, 2022, 10:30-12:00 AM

Venue: Tencent Meeting, ID 981741076


Abstract: This paper considers a class of convex constrained nonsmooth stochastic composite optimization problems whose objective function is the sum of a differentiable convex component and a general nonsmooth convex component. The nonsmooth component is not required to have an easily computable proximal operator, nor the max structure needed by Nesterov's smoothing technique. To solve such problems, we propose an adaptive smoothing mini-batch stochastic accelerated gradient (AdaSMSAG) method, which combines stochastic approximation, Nesterov's accelerated gradient method, and smoothing methods that allow general smoothing approximations. Convergence of the method is established. Moreover, the order of the worst-case iteration complexity is better than that of state-of-the-art stochastic approximation methods. Numerical results illustrate the efficiency of the proposed AdaSMSAG method for risk management in portfolio optimization and for a family of Wasserstein distributionally robust support vector machine problems with real data.
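For readers unfamiliar with the ingredients named in the abstract, the sketch below is a minimal illustration (not the paper's AdaSMSAG algorithm) of how a smoothing approximation of a nonsmooth term, mini-batch stochastic gradients, and Nesterov-style extrapolation can be combined. The synthetic data, the Huber smoothing of an l1 penalty, the fixed step size, and the 1/sqrt(k) schedule for the smoothing parameter are all assumptions chosen for illustration.

```python
import numpy as np

# Illustrative sketch only, NOT the paper's AdaSMSAG method: a mini-batch
# stochastic accelerated gradient loop where the nonsmooth l1 term is replaced
# by a Huber-type smoothing approximation whose parameter mu decreases over
# iterations. Problem, step size, and schedules are assumptions.

rng = np.random.default_rng(0)

# Synthetic problem: minimize 0.5 * E[(a_i^T x - b_i)^2] + lam * ||x||_1
n, d, lam = 2000, 50, 0.1
A = rng.standard_normal((n, d))
x_true = np.where(rng.random(d) < 0.2, rng.standard_normal(d), 0.0)
b = A @ x_true + 0.1 * rng.standard_normal(n)

def grad_smooth_part(x, batch):
    """Mini-batch stochastic gradient of the differentiable component."""
    Ab = A[batch]
    return Ab.T @ (Ab @ x - b[batch]) / len(batch)

def grad_huber_l1(x, mu):
    """Gradient of a Huber smoothing of lam * ||x||_1 with parameter mu."""
    return lam * np.clip(x / mu, -1.0, 1.0)

x = y = np.zeros(d)
t = 1.0
for k in range(1, 501):
    mu = 1.0 / np.sqrt(k)                 # decreasing smoothing parameter
    step = 0.01                           # fixed step size for simplicity
    batch = rng.choice(n, size=32, replace=False)
    g = grad_smooth_part(y, batch) + grad_huber_l1(y, mu)
    x_new = y - step * g                  # gradient step at extrapolated point
    t_new = 0.5 * (1.0 + np.sqrt(1.0 + 4.0 * t * t))
    y = x_new + (t - 1.0) / t_new * (x_new - x)   # Nesterov extrapolation
    x, t = x_new, t_new

print("objective:", 0.5 * np.mean((A @ x - b) ** 2) + lam * np.abs(x).sum())
```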



About the speaker: Zhang Chao is a professor and doctoral supervisor at the School of Mathematics and Statistics, Beijing Jiaotong University, and received a Ph.D. from Hirosaki University, Japan, in 2008 under the supervision of Professor Xiaojun Chen. Zhang's research focuses on the theory and algorithms of stochastic optimization and nonsmooth optimization and has produced several internationally leading results, including 8 research papers in top journals of the field such as SIAM J. Optim., Math. Programming, SIAM J. Sci. Comput., IEEE Trans. Image Process., IEEE Trans. Neural Networks, and Transportation Res. Part B. Zhang has organized one international academic conference, has completed one NSFC Young Scientists Fund project and one NSFC General Program project, and is currently the principal investigator of one NSFC General Program project and one Beijing Municipal Natural Science Foundation General Program project.


