
Yian Ma, University of California, Berkeley


Bridging MCMC and Optimization
When
24 January 2019 from 3:30 PM to 4:30 PM
Where
201 Thomas Building

In this talk, Yian Ma will discuss three ingredients of optimization theory in the context of MCMC: non-convexity, acceleration, and stochasticity. He will focus on a class of non-convex objective functions arising from mixture models. For that class of objective functions, he will demonstrate that the computational complexity of a simple MCMC algorithm scales linearly with the model dimension, while the corresponding optimization problems are NP-hard.
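To make the setting concrete, here is a minimal sketch (not from the talk; all names and parameter values are illustrative assumptions) of a simple MCMC algorithm of this flavor: the unadjusted Langevin algorithm sampling from a two-component Gaussian mixture.

```python
import numpy as np

# Illustrative sketch: unadjusted Langevin algorithm on a 1D mixture
#   p(x) propto 0.5 * N(x; -2, 1) + 0.5 * N(x; 2, 1).
# Update rule: x <- x + eta * grad log p(x) + sqrt(2 * eta) * N(0, 1).

rng = np.random.default_rng(0)

def grad_log_p(x, mu1=-2.0, mu2=2.0):
    # Gradient of the log mixture density (unit-variance components).
    w1 = np.exp(-0.5 * (x - mu1) ** 2)
    w2 = np.exp(-0.5 * (x - mu2) ** 2)
    return (w1 * (mu1 - x) + w2 * (mu2 - x)) / (w1 + w2)

def langevin_samples(n_steps=20000, eta=0.1, x0=0.0):
    x, out = x0, []
    for _ in range(n_steps):
        x = x + eta * grad_log_p(x) + np.sqrt(2 * eta) * rng.standard_normal()
        out.append(x)
    return np.array(out)

samples = langevin_samples()
```

Even though the mixture density is non-log-concave, the chain moves between both modes near -2 and +2; each step costs only one gradient evaluation, so the per-iteration cost grows linearly in the dimension.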

He will then study MCMC algorithms as optimization over the KL-divergence in the space of measures. By incorporating a momentum variable, he will discuss an algorithm that performs accelerated gradient descent over the KL-divergence. Using optimization-like ideas, a suitable Lyapunov function is constructed to prove that an accelerated convergence rate is obtained.
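A minimal sketch of the momentum idea, under the assumption of a standard Gaussian target (this is illustrative, not the talk's algorithm or its proof): underdamped Langevin dynamics augments the position with a momentum variable, analogous to acceleration in optimization.

```python
import numpy as np

# Illustrative sketch: underdamped (momentum-augmented) Langevin dynamics,
# discretized with simple Euler steps. v is the momentum variable; gamma
# is a friction parameter. All parameter values are assumptions.

rng = np.random.default_rng(1)

def underdamped_langevin(grad_log_p, n_steps=5000, eta=0.05, gamma=1.0, x0=3.0):
    x, v = x0, 0.0
    xs = []
    for _ in range(n_steps):
        v = v + eta * (grad_log_p(x) - gamma * v) \
            + np.sqrt(2.0 * gamma * eta) * rng.standard_normal()
        x = x + eta * v
        xs.append(x)
    return np.array(xs)

# Standard Gaussian target: grad log p(x) = -x.
xs = underdamped_langevin(lambda x: -x)
```

The chain drifts from its starting point toward the target and then fluctuates around it; the Lyapunov-function analysis mentioned above is what turns this intuition into an accelerated convergence rate.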

Finally, he will present a complete recipe for constructing stochastic gradient MCMC algorithms, which translates the task of finding a valid sampler into one of choosing two matrices. He will then describe how stochastic gradient MCMC algorithms can be applied to problems involving temporally correlated data, where the challenge arises from the need to break the dependencies when considering minibatches of observations.
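The two matrices in the recipe are a positive semidefinite diffusion matrix D and a skew-symmetric curl matrix Q. A toy sketch of the idea (the matrix names follow the recipe framework; the code itself is an illustrative assumption, not the talk's implementation): the sampler simulates dz = -(D + Q) grad H(z) dt + Gamma(z) dt + sqrt(2D) dW, where the correction term Gamma vanishes for constant D and Q.

```python
import numpy as np

# Illustrative sketch: a sampler parameterized by a PSD matrix D and a
# skew-symmetric matrix Q, Euler-discretized. For constant D and Q the
# correction term Gamma is zero and is omitted.

rng = np.random.default_rng(2)

def recipe_sampler(grad_H, D, Q, z0, n_steps=5000, eta=0.05):
    z = np.array(z0, dtype=float)
    d = len(z)
    # Noise with per-step covariance 2 * eta * D (small jitter for stability).
    noise_chol = np.linalg.cholesky(2.0 * eta * D + 1e-12 * np.eye(d))
    out = []
    for _ in range(n_steps):
        z = z - eta * (D + Q) @ grad_H(z) + noise_chol @ rng.standard_normal(d)
        out.append(z.copy())
    return np.array(out)

# Choosing D = I and Q = 0 recovers plain Langevin dynamics. Here the
# energy is H(z) = ||z||^2 / 2, i.e. a standard Gaussian target.
d = 2
samples = recipe_sampler(lambda z: z, D=np.eye(d), Q=np.zeros((d, d)),
                         z0=np.zeros(d))
```

Different choices of D and Q recover different known samplers (for example, a momentum method arises from a block-structured Q coupling position and momentum), which is what reduces sampler design to choosing these two matrices.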

Bio:

Yian Ma is currently a post-doctoral fellow at the University of California, Berkeley, hosted by Michael I. Jordan at the Foundations of Data Analysis Institute and RISELab. Prior to that, he obtained his PhD from the Department of Applied Mathematics at the University of Washington, working with Emily B. Fox at Mode Lab and Hong Qian. Before that, he obtained his bachelor's degree from the Department of Computer Science and Engineering at Shanghai Jiao Tong University.
