Abstract
We propose the first embarrassingly parallel consensus variational inference algorithm and a new consensus Monte Carlo algorithm for efficient inference in Bayesian nonparametric mixture models. Both algorithms are based on a group clustering approach, and they substantially accelerate inference and reduce memory costs compared with standard Markov chain Monte Carlo and variational inference algorithms for clustering. We demonstrate that the proposed algorithms are significantly faster than competing methods while maintaining the same clustering accuracy. Owing to their simplicity and embarrassingly parallel nature, they are straightforward to implement and widely applicable beyond the models and applications considered in this paper.