We use Gaussian copulas, combined with fixed-form or free-form margins, as automated inference engines for variational approximation in generic hierarchical Bayesian models (the only model-specific inputs are the joint log likelihood-and-prior term and its derivatives). We evaluate how faithfully the univariate margins reproduce the peculiarities of the true posterior marginals, and how well the Gaussian copula captures posterior dependence across latent variables.
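As a minimal illustration of the underlying construction (a Python sketch, not the repository's MATLAB code; the correlation matrix and the log-normal/gamma margins are arbitrary illustration choices), a Gaussian copula couples arbitrary margins through a correlated latent Gaussian:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)

# Correlation matrix of the latent Gaussian (illustrative values)
C = np.array([[1.0, 0.7],
              [0.7, 1.0]])

# Step 1: draw correlated standard normals z ~ N(0, C)
z = rng.multivariate_normal(np.zeros(2), C, size=5000)

# Step 2: map to uniforms via the standard normal CDF
u = stats.norm.cdf(z)

# Step 3: push the uniforms through the inverse CDFs of the margins
# (here a log-normal and a gamma margin, chosen for illustration)
x1 = stats.lognorm.ppf(u[:, 0], s=0.5)
x2 = stats.gamma.ppf(u[:, 1], a=2.0)

# The margins are exact by construction; the dependence comes from C
print(np.corrcoef(np.log(x1), x2)[0, 1])
```

Changing the margins leaves the copula (the dependence structure) untouched, which is what lets the same inference engine serve many models.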
Shaobo Han, Xuejun Liao, David B. Dunson, and Lawrence Carin, "Variational Gaussian Copula Inference", The 19th International Conference on Artificial Intelligence and Statistics (AISTATS 2016), Cadiz, Spain, May 2016
>> demo_SkewNormal
>> demo_StudentT
>> demo_Gamma
>> demo_Beta
The accuracy of the marginal approximations for real, positive real, and truncated [0,1] variables is shown below:
>> demo_BivariateLN
We approximate a bivariate log-normal distribution using a bivariate Gaussian copula with (1) fixed-form log-normal margins and (2) free-form Bernstein-polynomial-based margins.
Baseline comparisons include:
- Gibbs sampler
- Mean-field VB
- VGC-LN-full: Gaussian copula with log-normal margins
- VGC-LN-diag: Independence copula with log-normal margins
- VGC-BP-full: Gaussian copula with Bernstein polynomial margins
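To make the free-form margin concrete: a Bernstein-polynomial density on [0,1] can be written as a mixture of Beta(k, N-k+1) densities with simplex weights, so its CDF is the matching mixture of Beta CDFs. A small Python sketch (illustrative only; the function name and the uniform-weight example are not from the repository):

```python
import numpy as np
from scipy import stats

def bernstein_cdf(u, w):
    """CDF of a Bernstein-polynomial density on [0,1]:
    a mixture of Beta(k, N-k+1) CDFs, k = 1..N, with
    nonnegative weights w summing to one."""
    N = len(w)
    k = np.arange(1, N + 1)
    # Shape (len(u), N): each Beta CDF evaluated at each point u
    B = stats.beta.cdf(np.asarray(u)[:, None], k, N - k + 1)
    return B @ w

# Uniform weights recover the uniform distribution on [0,1]
w = np.full(10, 0.1)
u = np.linspace(0, 1, 5)
print(bernstein_cdf(u, w))  # ≈ [0, 0.25, 0.5, 0.75, 1]
```

Learning the weights w (instead of fixing a parametric family) is what makes the VGC-BP margins free-form.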
>> demo_Horseshoe
The MCMC sampler is implemented in JAGS:
>> demo_JAGS_PoissonLogLinear
Variational Gaussian copula (VGC) inference:
>> demo_VGC_PoissonLogLinear
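As noted above, the only model-specific terms the inference engine needs are the joint log likelihood-and-prior and its derivatives. A hedged Python sketch of those two terms for a Poisson log-linear model (the function name, the Gaussian prior scale `tau`, and the example data are illustration choices, not the repository's implementation):

```python
import numpy as np

def poisson_loglinear_logjoint(beta, X, y, tau=1.0):
    """Log joint of a Poisson log-linear model,
        y_i ~ Poisson(exp(x_i' beta)),  beta ~ N(0, tau^2 I),
    returned up to the constant -sum(log y_i!), together with
    its gradient in beta. These are the only model-specific
    ingredients a VGC-style engine requires."""
    eta = X @ beta
    loglik = np.sum(y * eta - np.exp(eta))
    logprior = -0.5 * np.dot(beta, beta) / tau**2
    grad = X.T @ (y - np.exp(eta)) - beta / tau**2
    return loglik + logprior, grad
```

Swapping in a different model means supplying a different log-joint/gradient pair; the copula machinery stays the same.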
The univariate margins and pairwise posteriors (JAGS vs. VGC-BP) are shown below:
If you find this code helpful, please cite the work using the following information:
@inproceedings{VGC_2016,
title={Variational Gaussian Copula Inference},
author={Shaobo Han and Xuejun Liao and David B. Dunson and Lawrence Carin},
booktitle={The 19th International Conference on Artificial Intelligence and Statistics (AISTATS)},
year={2016},
}