Asymptotic posterior approximation and efficient MCMC sampling for Generalized Linear Mixed Models
Generalized linear mixed models (GLMMs) give statisticians, scientists, and analysts great flexibility to model data in a variety of situations. However, when GLMMs are analyzed in the Bayesian framework with Gibbs sampling, the full conditional distributions frequently have no recognizable form. Traditionally, complex sampling schemes are used to draw from these distributions. Our focus is to derive asymptotic normal approximations for these conditional distributions and to apply the theoretical results to speed up MCMC sampling.
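To illustrate the general idea (not the paper's specific construction), the sketch below uses a Laplace-style normal approximation to an unrecognizable full conditional as the proposal in an independence Metropolis-Hastings step. The target is a hypothetical Poisson-GLMM random-effect conditional with made-up values `y = 3` and `sigma2 = 1`; all names and numbers are assumptions for demonstration only.

```python
import math
import random


def log_target(u, y=3.0, sigma2=1.0):
    # Unnormalized log full conditional of a random effect in a
    # Poisson GLMM with a normal prior (hypothetical example):
    #   y*u - exp(u) - u^2 / (2*sigma2)
    return y * u - math.exp(u) - u * u / (2.0 * sigma2)


def laplace_approx(y=3.0, sigma2=1.0, iters=50):
    # Newton's method locates the mode of the log conditional;
    # the negative inverse Hessian at the mode gives the variance
    # of the approximating normal distribution.
    u = 0.0
    hess = -1.0
    for _ in range(iters):
        grad = y - math.exp(u) - u / sigma2
        hess = -math.exp(u) - 1.0 / sigma2
        u -= grad / hess
    return u, -1.0 / hess


def independence_mh(n=5000, seed=0):
    # Independence Metropolis-Hastings: propose from the fixed
    # normal approximation, accept/reject against the exact target.
    rng = random.Random(seed)
    mode, var = laplace_approx()
    sd = math.sqrt(var)

    def log_proposal(u):
        return -((u - mode) ** 2) / (2.0 * var)

    u, accepted, samples = mode, 0, []
    for _ in range(n):
        cand = rng.gauss(mode, sd)
        log_alpha = (log_target(cand) - log_target(u)) + (
            log_proposal(u) - log_proposal(cand)
        )
        if math.log(rng.random()) < log_alpha:
            u, accepted = cand, accepted + 1
        samples.append(u)
    return samples, accepted / n
```

When the normal approximation is close to the true conditional, the acceptance rate of such a sampler is high, which is the sense in which an asymptotic normal approximation can speed up MCMC relative to more elaborate sampling schemes.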