Laplace Approximation in High-dimensional Bayesian Regression

Research paper by Rina Foygel Barber, Mathias Drton, Kean Ming Tan

Indexed on: 28 Mar '15 · Published on: 28 Mar '15 · Published in: Mathematics - Statistics


We consider Bayesian variable selection in sparse high-dimensional regression, where the number of covariates $p$ may be large relative to the sample size $n$, but at most a moderate number $q$ of covariates are active. Specifically, we treat generalized linear models. For a single fixed sparse model with a well-behaved prior distribution, classical theory proves that the Laplace approximation to the marginal likelihood of the model is accurate for sufficiently large sample size $n$. We extend this theory by giving results on uniform accuracy of the Laplace approximation across all models in a high-dimensional scenario in which $p$ and $q$, and thus also the number of considered models, may increase with $n$. Moreover, we show how this connection between marginal likelihood and Laplace approximation can be used to obtain consistency results for Bayesian approaches to variable selection in high-dimensional regression.
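To make the central object concrete, the following sketch computes the Laplace approximation to the log marginal likelihood for one fixed sparse model: logistic regression (a generalized linear model) with an illustrative $N(0, \tau^2 I)$ prior on the active coefficients. The model choice, the prior, and the Newton solver are assumptions for illustration, not the paper's specific setup; the approximation itself is the standard one, $\log p(y) \approx \log p(y, \hat\beta) + \tfrac{d}{2}\log 2\pi - \tfrac{1}{2}\log\det H$, where $\hat\beta$ is the posterior mode and $H$ the negative Hessian of the log joint at $\hat\beta$.

```python
import numpy as np

def laplace_evidence(X, y, tau=1.0, n_iter=50):
    """Laplace approximation to the log marginal likelihood of a
    logistic-regression model with an N(0, tau^2 I) prior on the
    coefficients (illustrative choices, not taken from the paper)."""
    n, d = X.shape

    def log_joint(beta):
        eta = X @ beta
        # Bernoulli log-likelihood plus Gaussian log-prior
        ll = y @ eta - np.sum(np.log1p(np.exp(eta)))
        lp = -0.5 * beta @ beta / tau**2 - 0.5 * d * np.log(2 * np.pi * tau**2)
        return ll + lp

    # Newton's method for the posterior mode (MAP estimate);
    # the prior term keeps the Hessian positive definite.
    beta = np.zeros(d)
    for _ in range(n_iter):
        p = 1.0 / (1.0 + np.exp(-X @ beta))
        grad = X.T @ (y - p) - beta / tau**2
        H = X.T @ (X * (p * (1 - p))[:, None]) + np.eye(d) / tau**2
        beta = beta + np.linalg.solve(H, grad)

    # Laplace approximation:
    # log p(y) ~= log p(y, beta_hat) + (d/2) log 2*pi - (1/2) log det H
    _, logdet = np.linalg.slogdet(H)
    return log_joint(beta) + 0.5 * d * np.log(2 * np.pi) - 0.5 * logdet
```

Comparing `laplace_evidence` across candidate active sets would give the (approximate) marginal likelihoods that a Bayesian variable-selection procedure ranks; the paper's contribution is showing this approximation holds uniformly over all such models even as $p$ and $q$ grow with $n$.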