Title: Solving 10,000-Dimensional Optimization Problems Using Inaccurate Function Values: An Old Algorithm
Speaker: Zaikun Zhang (张在坤)

Host: Weifeng Gao (高卫峰)
Time: July 8, 10:00
Venue: Room 119, Administrative Annex Building (行政辅楼)
Speaker Bio: Zaikun Zhang received his bachelor's degree from Jilin University in 2007 and his PhD from the Chinese Academy of Sciences in 2012. He is currently a professor and doctoral supervisor at the School of Mathematics, Sun Yat-sen University, and a Yixian Outstanding Scholar. His main research interests are optimization theory and algorithms, in particular derivative-free methods, methods based on inexact information, and randomized methods. His representative works have appeared in Math. Program., SIAM J. Optim., and SIAM J. Sci. Comput. He has served as principal investigator of five ECS/GRF projects funded by the Hong Kong Research Grants Council and participated in one National Key R&D Program project of the Ministry of Science and Technology, and he was selected for a national-level young talent program in 2023. In 2024, his team received the Operations Research Application Award of the Science and Technology Award of the Operations Research Society of China, in recognition of its contributions to derivative-free optimization algorithms, software, and their industrial applications.
Abstract:
We reintroduce a derivative-free subspace optimization framework originating from Chapter 5 of [Z. Zhang, On Derivative-Free Optimization Methods, PhD thesis, Chinese Academy of Sciences, Beijing, 2012 (supervisor Ya-xiang Yuan)]. At each iteration, the framework defines a low-dimensional subspace based on an approximate gradient, and then solves a subproblem in this subspace to generate a new iterate. We sketch the global convergence and worst-case complexity analysis of the framework, elaborate on its implementation, and present some numerical results on problems with dimension as high as 10,000.
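As a rough illustration of the iteration just described, a minimal Python sketch is given below. It is not the NEWUOAs implementation or the solver discussed in the talk: the function names, the finite-difference gradient estimate, the two-dimensional subspace, and the random-sampling subproblem solver are all illustrative assumptions.

    import numpy as np

    def approx_gradient(f, x, h=1e-6):
        # Forward-difference gradient estimate; a generic stand-in for whatever
        # approximate gradient the framework assumes is available.
        fx = f(x)
        g = np.empty(x.size)
        for i in range(x.size):
            e = np.zeros(x.size)
            e[i] = h
            g[i] = (f(x + e) - fx) / h
        return g

    def subspace_dfo(f, x0, max_iter=100, radius=1.0):
        # Illustrative loop: at each iteration, build a subspace spanned by the
        # approximate gradient and the previous step, then minimize f over random
        # trial points in that subspace (a crude stand-in for a real subproblem solver).
        x = np.asarray(x0, dtype=float)
        prev_step = None
        rng = np.random.default_rng(0)
        for _ in range(max_iter):
            g = approx_gradient(f, x)
            if np.linalg.norm(g) < 1e-8:
                break
            cols = [g] if prev_step is None else [g, prev_step]
            Q, _ = np.linalg.qr(np.column_stack(cols))  # orthonormal basis of the subspace
            best_y = np.zeros(Q.shape[1])
            best_val = f(x)
            for _ in range(50):
                y = rng.uniform(-radius, radius, Q.shape[1])
                val = f(x + Q @ y)
                if val < best_val:
                    best_y, best_val = y, val
            step = Q @ best_y
            if np.linalg.norm(step) == 0.0:
                radius *= 0.5  # no progress in the subspace; shrink the trial region
                continue
            x = x + step
            prev_step = step
        return x

    if __name__ == "__main__":
        # Toy usage on a 100-dimensional separable quadratic.
        f = lambda z: float(np.sum((z - 1.0) ** 2))
        x = subspace_dfo(f, np.zeros(100))
        print("final objective value:", f(x))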
The same framework was presented by Zhang at ICCOPT 2013 in Lisbon under the title “A Derivative-Free Optimization Algorithm with Low-Dimensional Subspace Techniques for Large-Scale Problems”, although it remained nearly forgotten by the community until very recently. An algorithm following this framework, named NEWUOAs, was implemented by Zhang in MATLAB in 2011 (https://github.com/newuoas/newuoas), ported to Modula-3 by Nystroem (Intel) in 2017, and included in cm3 in 2019 (https://github.com/modula3/cm3/blob/master/caltech-other/newuoa/src/NewUOAs.m3).
This talk is based on [Z. Zhang, Scalable Derivative-Free Optimization Algorithms With Low-Dimensional Subspace Techniques, arXiv:2501.04536, 2025].