Reliable Estimation of Generalized Linear Mixed Models using Adaptive Quadrature

Published Web Location

http://www.stata-journal.com/article.html?article=st0005
No data is associated with this publication.
Creative Commons 'BY-NC-ND' version 4.0 license
Abstract

Generalized linear mixed models or multilevel regression models have become increasingly popular. Several methods have been proposed for estimating such models. However, to date there is no single method that can be assumed to work well in all circumstances in terms of both parameter recovery and computational efficiency. Stata's xt commands for two-level generalized linear mixed models (e.g., xtlogit) employ Gauss–Hermite quadrature to evaluate and maximize the marginal log likelihood. The method generally works very well, and often better than common contenders such as MQL and PQL, but there are cases where quadrature performs poorly. Adaptive quadrature has been suggested to overcome these problems in the two-level case. We have recently implemented a multilevel version of this method in gllamm, a program that fits a large class of multilevel latent variable models including multilevel generalized linear mixed models. As far as we know, this is the first time that adaptive quadrature has been proposed for multilevel models. We show that adaptive quadrature works well in problems where ordinary quadrature fails. Furthermore, even when ordinary quadrature works, adaptive quadrature is often computationally more efficient since it requires fewer quadrature points to achieve the same precision.
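To make the idea concrete, the following is a minimal sketch, not gllamm's implementation, of adaptive Gauss–Hermite quadrature for the marginal likelihood contribution of a single cluster in a two-level random-intercept logistic model. The function name, the numerical second-difference Hessian, and the use of NumPy/SciPy are illustrative assumptions rather than anything from the article; the essential step is that the standard quadrature nodes are shifted and rescaled using the mode and curvature of each cluster's integrand.

import numpy as np
from numpy.polynomial.hermite import hermgauss
from scipy.optimize import minimize_scalar
from scipy.stats import norm

def cluster_loglik_adaptive(y, X, beta, sigma, n_nodes=8):
    """Approximate the log marginal likelihood contribution of one cluster
    in a random-intercept logistic model,
      L_j = integral of prod_i Pr(y_ij | x_ij'beta + u) * N(u; 0, sigma^2) du,
    using adaptive Gauss-Hermite quadrature (illustrative sketch)."""
    nodes, weights = hermgauss(n_nodes)          # standard nodes a_k, weights w_k

    def neg_log_integrand(u):
        eta = X @ beta + u
        # log Bernoulli likelihood plus log normal density of the random intercept
        ll = np.sum(y * eta - np.log1p(np.exp(eta))) + norm.logpdf(u, 0.0, sigma)
        return -ll

    # Locate the mode mu_j of the integrand and approximate its curvature tau_j
    # (a numerical second difference stands in for the exact Hessian).
    mu = minimize_scalar(neg_log_integrand).x
    h = 1e-4
    hess = (neg_log_integrand(mu + h) - 2.0 * neg_log_integrand(mu)
            + neg_log_integrand(mu - h)) / h**2
    tau = 1.0 / np.sqrt(hess)

    # Adaptive rule: evaluate at u_k = mu + sqrt(2)*tau*a_k and reweight, so that
    # L_j is approximated by sum_k sqrt(2)*tau * w_k * exp(a_k^2) * integrand(u_k).
    u_k = mu + np.sqrt(2.0) * tau * nodes
    log_terms = (np.log(np.sqrt(2.0) * tau * weights) + nodes**2
                 - np.array([neg_log_integrand(u) for u in u_k]))
    # log-sum-exp for numerical stability
    m = log_terms.max()
    return m + np.log(np.sum(np.exp(log_terms - m)))

Ordinary Gauss–Hermite quadrature corresponds to fixing mu = 0 and tau = sigma for every cluster, which is why it can fail when the integrand is sharply peaked away from zero; the adaptive rescaling above is what allows a small number of nodes to suffice. In gllamm itself, adaptive quadrature is requested with the adapt option, with nip() controlling the number of integration points.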
