0/1 Optimization Solving Sample Average Approximation for Chance Constrained Programming
Speaker: Dr. Shenglong Zhou, Imperial College London    Time: November 23, 2022, 18:30

Venue: Tencent Meeting 999-558-284

Inviter: Prof. Hailin Sun

Abstract: Sample average approximation (SAA) is a tractable approach to chance constrained programming, a challenging problem in stochastic programming. The constraint is usually characterized by the 0/1 loss function, which creates enormous difficulties in designing numerical algorithms. Most current methods build on the SAA reformulation, such as binary integer programming or its relaxations. However, no viable algorithms have been developed to tackle SAA directly, let alone with theoretical guarantees. In this paper, we investigate a novel 0/1 constrained optimization problem, which provides a new way to address SAA. Specifically, by deriving the Bouligand tangent and Fréchet normal cones of the 0/1 constraint, we establish several optimality conditions, including one that can be equivalently expressed by a system of equations, thereby allowing us to design a smoothing Newton-type method. We show that the proposed algorithm has a locally quadratic convergence rate and high numerical performance.
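As background for the abstract, the toy sketch below illustrates how a chance constraint P(g(x, ξ) ≤ 0) ≥ 1 − α is replaced by its sample average approximation through the 0/1 loss. It is only a minimal illustration of the SAA feasibility check, not the speaker's algorithm; the linear constraint g(x, ξ) = ξᵀx − b, the sample size, and the risk level α are hypothetical choices made for the example.

```python
# Minimal illustrative sketch (assumed setup, not the method presented in the talk):
# SAA replaces the chance constraint  P(g(x, xi) <= 0) >= 1 - alpha  with
#     (1/N) * sum_i 1[g(x, xi_i) > 0] <= alpha,
# where 1[.] is the 0/1 loss.  Here g(x, xi) = xi^T x - b is a hypothetical
# linear uncertain constraint.
import numpy as np

def saa_violation_rate(x, xi_samples, b):
    """Empirical violation probability (1/N) * sum 1[g(x, xi_i) > 0]."""
    g_values = xi_samples @ x - b        # g(x, xi_i) for every sample xi_i
    return np.mean(g_values > 0.0)       # average of the 0/1 loss over samples

# Usage: check SAA feasibility of a candidate x at risk level alpha.
rng = np.random.default_rng(0)
xi_samples = rng.normal(size=(1000, 2))  # N = 1000 samples of xi in R^2
x, b, alpha = np.array([0.3, 0.4]), 1.0, 0.05
print(saa_violation_rate(x, xi_samples, b) <= alpha)  # True iff x is SAA-feasible
```

The talk addresses the hard part that this sketch leaves out: optimizing over x subject to this nonconvex 0/1 constraint, for which the abstract describes tangent/normal cone based optimality conditions and a smoothing Newton-type method.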

Speaker biography: Shenglong Zhou is currently a Research Fellow at the Department of Electrical and Electronic Engineering, Imperial College London, United Kingdom. He received his BSc degree in Information and Computing Science and MSc degree in Operational Research from Beijing Jiaotong University, China, in 2011 and 2014, respectively, and received his PhD degree in Operational Research from the University of Southampton, United Kingdom, in 2018.

His research interests include the theory and methods of optimization in the areas of sparse optimization, 0/1 loss optimization, low-rank matrix optimization, bilevel optimization, and machine learning-related optimization. He received the New World Mathematics Awards in 2019 and served as the workshop chair of IEEE VTC2022-Fall. Part of his research has been published in SIOPT, SISC, ACHA, IEEE TPAMI, IEEE TSP, and JMLR.


