When: Friday, February 20, 11:00 AM
Where: Pharmacy 240
Abstract
In this talk, we explore efficient Gaussian process surrogate modeling in two distinct contexts: bandit optimization and black-box posterior approximation. For optimization, we propose novel noise-free Bayesian optimization strategies that incorporate a random exploration step to enhance the accuracy of Gaussian process surrogate models. The new algorithms retain the ease of implementation of the classical GP-UCB algorithm, but the additional random exploration step accelerates their convergence, nearly achieving the optimal convergence rate. For black-box posterior approximation, we propose using optimization iterates as design points to build a Gaussian process surrogate model for the unnormalized log-posterior density when the likelihood is intractable. We show that the Hellinger distance between the true and approximate posterior distributions decays at a near-optimal rate. We demonstrate the effectiveness of our optimization algorithms on benchmark non-convex test functions and on a black-box engineering design problem, and showcase our posterior approximation approach in Bayesian inference for parameters of dynamical systems.
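
To make the optimization idea concrete, below is a minimal, self-contained sketch (Python with NumPy only) of GP-UCB augmented with a random exploration step on a 1-D toy problem. It is an illustration based on the abstract, not the speaker's exact algorithm: the kernel, candidate grid, exploration probability, confidence schedule, and toy objective are all assumptions.

```python
import numpy as np

# Illustrative sketch: GP-UCB with an added random-exploration step (1-D toy problem).
# All modeling choices below (RBF kernel, grid, exploration probability, beta schedule)
# are assumptions for demonstration, not the speaker's proposed algorithm.

rng = np.random.default_rng(0)

def rbf_kernel(A, B, lengthscale=0.2):
    d = A[:, None] - B[None, :]
    return np.exp(-0.5 * (d / lengthscale) ** 2)

def gp_posterior(X, y, Xstar, jitter=1e-8):
    # Noise-free GP regression; small jitter added for numerical stability.
    K = rbf_kernel(X, X) + jitter * np.eye(len(X))
    Ks = rbf_kernel(X, Xstar)
    Kss = rbf_kernel(Xstar, Xstar)
    alpha = np.linalg.solve(K, y)
    mean = Ks.T @ alpha
    v = np.linalg.solve(K, Ks)
    var = np.diag(Kss - Ks.T @ v)
    return mean, np.maximum(var, 0.0)

def f(x):
    # Hypothetical noise-free blackbox objective.
    return np.sin(3 * x) + 0.5 * np.cos(7 * x)

grid = np.linspace(0.0, 1.0, 200)       # candidate design points
X = np.array([rng.uniform(0.0, 1.0)])   # initial design
y = f(X)

for t in range(1, 30):
    if rng.random() < 0.2:               # random exploration step (probability assumed)
        x_next = rng.uniform(0.0, 1.0)
    else:                                # classical GP-UCB acquisition
        mean, var = gp_posterior(X, y, grid)
        beta = 2.0 * np.log((t + 1) ** 2)   # assumed confidence-width schedule
        x_next = grid[np.argmax(mean + np.sqrt(beta * var))]
    X = np.append(X, x_next)
    y = np.append(y, f(x_next))

print("best value found:", y.max(), "at x =", X[y.argmax()])
```

The only change relative to standard GP-UCB in this sketch is the branch that occasionally samples a query point uniformly at random, which spreads design points across the domain and keeps the surrogate accurate away from the incumbent optimum.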
Bio
Hwanwoo Kim is a postdoctoral associate in the Department of Statistical Science at Duke University, working with Eric Laber and Simon Mak. He received his Ph.D. in Computational and Applied Mathematics from the University of Chicago, where he studied statistical inverse problems under the guidance of Daniel Sanz-Alonso and stochastic approximation and optimization with Panos Toulis. His research interests lie broadly in developing computational and statistical tools for Bayesian and probabilistic modeling, experimental design, and sequential decision-making. He is also passionate about applying these methodological tools to problems arising in the domain sciences.
