
Yuxin Chen, Princeton University


Random initialization and implicit regularization in nonconvex statistical estimation
When: 12 April 2018, 4:00 PM to 5:00 PM
Where: 104 Thomas Building

Recent years have seen a flurry of activity in designing provably efficient nonconvex procedures for solving statistical estimation problems. Due to the highly nonconvex nature of the empirical loss, state-of-the-art procedures often require suitable initialization and proper regularization (e.g., trimming, a regularized cost, projection) to guarantee fast convergence. For vanilla procedures such as gradient descent, however, prior theory is often far from optimal or lacking altogether.

This talk is concerned with a striking phenomenon arising in two nonconvex problems, phase retrieval and matrix completion: even in the absence of careful initialization and explicit regularization, gradient descent converges to the optimal solution within a logarithmic number of iterations, thus achieving near-optimal statistical and computational guarantees at once. These results are obtained by exploiting the statistical models in the analysis of the optimization algorithm, via a leave-one-out approach that decouples the statistical dependence between the gradient descent iterates and the data. As a byproduct, for noisy matrix completion we demonstrate that gradient descent achieves near-optimal entrywise error control.
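To make the phase-retrieval setting concrete, the following is a minimal sketch of the kind of vanilla procedure the talk analyzes: plain gradient descent on the quadratic loss with a random starting point and no trimming, truncation, or explicit regularization. It is not the authors' code; the dimensions, step size, norm calibration, and iteration count are illustrative assumptions.

```python
import numpy as np

# Real-valued phase retrieval: observe y_i = (a_i^T x)^2 and run vanilla
# gradient descent on f(z) = (1/4m) * sum_i ((a_i^T z)^2 - y_i)^2.
rng = np.random.default_rng(0)
n, m = 100, 1000                 # signal dimension and number of measurements (illustrative)
x = rng.standard_normal(n)       # ground-truth signal
A = rng.standard_normal((m, n))  # Gaussian sensing vectors a_i as rows
y = (A @ x) ** 2                 # phaseless (quadratic) measurements

# Random initialization: no spectral method, only a crude norm calibration,
# using the fact that y.mean() estimates ||x||^2 for Gaussian measurements.
z = rng.standard_normal(n)
z *= np.sqrt(y.mean()) / np.linalg.norm(z)

eta = 0.1 / y.mean()             # heuristic step size on the order of 1/||x||^2

for _ in range(500):
    Az = A @ z
    grad = A.T @ ((Az ** 2 - y) * Az) / m  # gradient of f at z
    z -= eta * grad

# The signal is identifiable only up to a global sign flip.
err = min(np.linalg.norm(z - x), np.linalg.norm(z + x)) / np.linalg.norm(x)
print(f"relative error: {err:.2e}")
```

With a modest oversampling ratio such as m = 10n, runs of this kind typically drive the relative error to near machine precision after a few hundred iterations, which is the behavior the talk's theory explains.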

This is joint work with Cong Ma, Kaizheng Wang, Yuejie Chi, and Jianqing Fan.

