Yian Ma, from the Statistics Department, will present an overview of methods that combine sampling and optimization to address large-scale inference problems, in particular stochastic gradient MCMC techniques.
Stochastic gradient MCMC for independent and correlated data
Abstract: In this talk, we will present a general recipe for constructing stochastic gradient samplers that translates the task of finding a valid sampler into one of choosing two matrices. Importantly, any continuous Markov process that samples from the target distribution can be written in our framework. We then describe how SG-MCMC algorithms can be applied to problems involving temporally dependent data, where the challenge is to break these dependencies when forming minibatches of observations. We propose an algorithm that harnesses the inherent memory decay of the process and provably leads to the correct stationary distribution.
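For context, the recipe of the second paper below specifies a sampler by a positive semidefinite diffusion matrix D(z) and a skew-symmetric curl matrix Q(z), via the dynamics dz = [-(D(z) + Q(z)) ∇H(z) + Γ(z)] dt + √(2 D(z)) dW(t), where Γ_i(z) = Σ_j ∂(D_ij(z) + Q_ij(z))/∂z_j. As a minimal illustration (a sketch, not code from the talk; the function names and arguments are assumptions), here is the simplest instance, D = I and Q = 0, which recovers stochastic gradient Langevin dynamics (SGLD):

import numpy as np

# Illustrative sketch only: SGLD as the simplest instance of the recipe,
# taking D = I and Q = 0 with Euler step size eps/2. For constant D and Q
# the correction term Gamma vanishes. All names here are assumptions for
# the example, not the talk's code.
def sgld_step(theta, minibatch, grad_log_prior, grad_log_lik, n_data, eps, rng):
    """One SGLD update from an unbiased minibatch gradient estimate."""
    n_batch = len(minibatch)
    # Unbiased stochastic estimate of the gradient of log p(theta, data):
    grad = grad_log_prior(theta) + (n_data / n_batch) * sum(
        grad_log_lik(theta, x) for x in minibatch
    )
    # Gradient step plus injected Gaussian noise of variance eps:
    return theta + 0.5 * eps * grad + np.sqrt(eps) * rng.standard_normal(theta.shape)

Non-trivial choices of D and Q recover other known samplers, such as stochastic gradient HMC and Riemannian variants; the sketch above is only the identity-matrix corner of the design space.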
Papers: https://arxiv.org/abs/1811.08413, http://papers.nips.cc/paper/5891-a-complete-recipe-for-stochastic-gradient-mcmc
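The first paper above treats the temporally dependent setting. A rough sketch of the buffering idea, under assumed names (an illustration of subsequence subsampling with buffers, not the paper's code): sample a contiguous subsequence, extend it with short buffers on each side to approximate the missing context, and compute the stochastic gradient only over the central portion, relying on the memory decay of the process.

def buffered_subsequence(series, subseq_len, buffer_len, rng):
    """Sample a contiguous subsequence padded with left/right buffers.

    Hypothetical helper for illustration: `series` is an indexable time
    series and `rng` a numpy Generator, e.g. np.random.default_rng(0).
    """
    T = len(series)
    start = rng.integers(buffer_len, T - subseq_len - buffer_len + 1)
    # Buffered window over which inference is run; the buffers absorb
    # boundary effects from cutting the sequence.
    window = series[start - buffer_len : start + subseq_len + buffer_len]
    # The stochastic gradient is computed only over this central slice.
    center = slice(buffer_len, buffer_len + subseq_len)
    return window, center

The intuition is that because dependence decays with lag, a modest buffer length controls the bias introduced by cutting the sequence, which is what makes minibatching correlated observations viable.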
The Berkeley Statistics and Machine Learning Discussion Group meets weekly to discuss current applications of statistics and machine learning across a wide variety of research domains and software methodologies. Register here to view, propose, and vote for this group's upcoming discussion topics. All interested members of the UC Berkeley and LBL communities are welcome and encouraged to attend. Questions may be directed to François Lanusse.
Speaker(s)

Yian Ma
Yian Ma is currently a postdoctoral fellow at the University of California, Berkeley, hosted by Michael I. Jordan at the Foundations of Data Analysis Institute and RISELab. His research focuses mainly on scalable inference methods and their theoretical guarantees. He has been designing new Bayesian inference algorithms, with a focus on applying them to time series data, and deriving theoretical guarantees for them.