MCMC Algorithms for Bayesian Inference in the Context of Big Data: Comparison and Assessment of Five Recent Algorithms
-
The PDF of the thesis is available at https://adrienhai.github.io/thesis.pdf
-
Implementations of the following algorithms can be found in the 'code' folder:
Quiroz, M., Kohn, R., Villani, M., and Tran, M.-N. (2018). Speeding up MCMC by efficient data subsampling. Journal of the American Statistical Association, (just-accepted):1–35.
Korattikara, A., Chen, Y., and Welling, M. (2014). Austerity in MCMC land: Cutting the Metropolis-Hastings budget. In International Conference on Machine Learning, pages 181–189.
Neiswanger, W., Wang, C., and Xing, E. (2013). Asymptotically exact, embarrassingly parallel MCMC. arXiv preprint arXiv:1311.4780.
Scott, S. L., Blocker, A. W., Bonassi, F. V., Chipman, H. A., George, E. I., and McCulloch, R. E. (2016). Bayes and big data: The consensus Monte Carlo algorithm. International Journal of Management Science and Engineering Management, 11(2):78–88.
Welling, M. and Teh, Y. W. (2011). Bayesian learning via stochastic gradient Langevin dynamics. In Proceedings of the 28th International Conference on Machine Learning (ICML-11), pages 681–688.
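-
To illustrate the flavour of these subsampling methods, here is a minimal sketch of stochastic gradient Langevin dynamics (Welling and Teh, 2011) on a toy conjugate-Gaussian model. It is not the implementation from the 'code' folder: the model, hyperparameters, and the fixed step size (the paper uses a decreasing schedule) are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy data: N observations from N(mu_true, 1); prior mu ~ N(0, 10^2).
N, mu_true = 10_000, 2.0
x = rng.normal(mu_true, 1.0, size=N)

def grad_log_prior(mu, prior_var=100.0):
    # Gradient of log N(mu | 0, prior_var).
    return -mu / prior_var

def grad_log_lik(mu, batch):
    # Gradient of the sum of log N(x_i | mu, 1) over the minibatch.
    return np.sum(batch - mu)

# Illustrative settings: minibatch size, fixed step size, iterations.
n, eps, T = 100, 1e-4, 5_000
mu, samples = 0.0, []
for t in range(T):
    batch = x[rng.integers(0, N, size=n)]
    # Unbiased stochastic gradient of the log-posterior: the minibatch
    # likelihood gradient is rescaled by N/n.
    grad = grad_log_prior(mu) + (N / n) * grad_log_lik(mu, batch)
    # Langevin update: half-step along the gradient plus injected noise.
    mu += 0.5 * eps * grad + rng.normal(0.0, np.sqrt(eps))
    samples.append(mu)

post_mean = np.mean(samples[T // 2:])  # discard the first half as burn-in
```

With these settings the chain settles near the true posterior mean (close to the sample mean of the data, about 2.0); each iteration touches only n = 100 of the N = 10,000 observations, which is the point of the method.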