International Joint Conference on Theoretical Computer Science – Frontier of Algorithmic Wisdom

August 15-19, 2022, City University of Hong Kong, Hong Kong



Undergraduate Research Forum


Characterizing Parametric and Convergence Stability in Nonconvex Nonsmooth Optimization

Hanyu Li

Peking University

Abstract:
In this talk, we introduce two notions of stability in minimizing continuous functions: parametric stability, which asks how local optima change in response to perturbations of the problem's input parameters, and convergence stability, which asks whether an iterative method started near a local optimum can be expected to stay near it and eventually converge to it. We give rather tight conditions for both notions, covering a wide range of functions and optimization algorithms studied in nonconvex nonsmooth optimization. We further present an application to the optimization of deep neural networks: for an overparameterized smooth neural network trained with gradient descent under quadratic loss, we prove that no zero-loss point is convergence-stable. This indicates that a slight difference in initialization, even between initializations arbitrarily close to a global minimum, can lead to qualitatively different trained parameters.
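For concreteness, convergence stability can be formalized in the spirit of Lyapunov stability combined with local convergence. This is a hedged reading of the prose above, not the talk's official definition, which may differ in detail:

    \[
    \forall \varepsilon > 0 \;\exists \delta > 0:\quad
    \|x_0 - x^\star\| < \delta \;\Longrightarrow\;
    \sup_{t \ge 0} \|x_t - x^\star\| \le \varepsilon
    \ \text{ and } \
    \lim_{t \to \infty} x_t = x^\star,
    \]

where \(x_0\) is the initialization and \((x_t)\) is the iterate sequence produced by the optimization method. Under this reading, the neural-network result above says that no zero-loss point admits such a \(\delta\).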
This talk is based on joint work with Xiaotie Deng and Ningyuan Li.
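As a toy numerical illustration of the initialization-sensitivity phenomenon (a minimal sketch, not the paper's construction; the model, step size, and perturbation sizes below are all illustrative choices), consider gradient descent with quadratic loss on the overparameterized scalar model f(w1, w2) = w1 * w2 fitting the target value 1. The zero-loss set is the hyperbola {w1 w2 = 1}, and two initializations equally close to the global minimum (1, 1) settle at two different global minima:

    import numpy as np

    # Toy model (illustrative, not from the talk): f(w1, w2) = w1 * w2,
    # quadratic loss L(w) = (w1*w2 - 1)^2. The zero-loss set is the
    # hyperbola {w1*w2 = 1}, so every global minimum sits on a flat
    # manifold of other global minima.

    def loss(w):
        return (w[0] * w[1] - 1.0) ** 2

    def grad(w):
        # dL/dw1 = 2*(w1*w2 - 1)*w2,  dL/dw2 = 2*(w1*w2 - 1)*w1
        r = 2.0 * (w[0] * w[1] - 1.0)
        return np.array([r * w[1], r * w[0]])

    def gradient_descent(w0, lr=0.05, steps=5000):
        w = np.array(w0, dtype=float)
        for _ in range(steps):
            w -= lr * grad(w)
        return w

    # Two initializations at the same tiny distance from the minimum (1, 1).
    wa = gradient_descent([1.001, 1.000])
    wb = gradient_descent([1.000, 1.001])

    # Both runs reach (numerically) zero loss, but at different minima:
    # the quantity w1^2 - w2^2 is preserved by gradient flow (and nearly
    # preserved by these updates), and it has opposite signs for the two
    # initializations, so the two limits cannot coincide.
    print("run A:", wa, "loss:", loss(wa))
    print("run B:", wb, "loss:", loss(wb))

The displacement between the two limits is small in this two-dimensional toy; the theorem stated in the abstract is far stronger, ruling out convergence stability at every zero-loss point of an overparameterized smooth network.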

Bio:
Hanyu Li is an incoming PhD student at Peking University, where he received his bachelor's degree in computer science. His current research focuses on computational game theory and continuous optimization.