Venue: Academic Activity Room 526, Xingjian Building
Host: Prof. Xingju Cai (蔡邢菊)
Abstract: Large-scale optimization problems arising from data science and statistics often seek optimal solutions with certain structured sparsity properties. In this talk, we shall introduce a dual semismooth Newton based proximal point algorithm (PPDNA) to solve such problems and explain how our method can be much more efficient than various first-order methods. The key idea is to exploit the second-order sparsity of the solutions, in addition to data sparsity, so that the per-iteration cost of our second-order method is as low as that of first-order methods. We demonstrate that by incorporating the PPDNA within an adaptive sieving framework, we can efficiently generate the solution paths of large-scale problems corresponding to a sequence of regularization parameters. We shall illustrate the high efficiency of our approach on several popular models, including convex clustering, lasso, and exclusive lasso.
Short Biography: Kim-Chuan Toh is currently a Chair Professor in the Department of Mathematics at the National University of Singapore. He works extensively on convex programming, particularly large-scale matrix optimization problems such as semidefinite programming, and sparse optimization problems arising from data science and machine learning. He currently serves as a co-editor of Mathematical Programming, an area editor of Mathematical Programming Computation, and an associate editor of several journals, including SIAM Journal on Optimization and Operations Research. He received the Farkas Prize in 2017 from the INFORMS Optimization Society, as well as the triennial Beale-Orchard-Hays Prize in 2018 and the Paul Tseng Memorial Lectureship in 2024 from the Mathematical Optimization Society. He is a Fellow of the Society for Industrial and Applied Mathematics and a Fellow of the Singapore National Academy of Science.