eScholarship
Open Access Publications from the University of California

UCLA Electronic Theses and Dissertations

Estimating Mean and Covariance Structure with Reweighted Least Squares

Abstract

Does Reweighted Least Squares (RLS) perform better than maximum likelihood (ML) in small samples for mean and covariance structure models? ML statistics in covariance structure analysis rest on the assumption of asymptotic normality; however, actual applications of structural equation modeling (SEM) in social and behavioral science research usually involve small samples. It has been found that chi-square tests often over-reject the null hypothesis Σ = Σ(θ): when the sample is small, the sample covariance matrix becomes ill-conditioned and yields unstable estimates. In certain SEM models, the parameter vector contains means as well as variances and covariances, yet whether RLS also works for mean and covariance structure remains unexamined. This research extends the examination of reweighted least squares to mean and covariance structure. Specifically, we replace the biased covariance matrix in the traditional GLS fit function (Browne, 1974) with the unbiased covariance matrix estimate derived from ML estimation. Moreover, under the assumption of multivariate normality, a Monte Carlo simulation study was carried out to compare the statistical performance of RLS with that of ML across different sample sizes. Based on empirical rejection frequencies and empirical means of the test statistics, this study shows that RLS performs much better than ML in mean and covariance structure models when sample sizes are small.
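As a rough illustration of the idea described above, the sketch below writes a GLS-style discrepancy for a mean and covariance structure model with a pluggable weight matrix: supplying the sample covariance matrix gives the classical GLS function, while supplying a covariance matrix obtained from ML estimation gives an RLS-type statistic. The function names, signatures, and the specific form of the discrepancy are illustrative assumptions for exposition, not code or formulas taken from the dissertation.

```python
import numpy as np

def gls_type_fit(xbar, S, mu_theta, Sigma_theta, W):
    """GLS-style discrepancy for a mean and covariance structure model.

    Passing W = S corresponds to the classical GLS weight; passing a
    covariance matrix obtained from ML estimation corresponds to the
    RLS-type reweighting discussed in the abstract (illustrative only).
    """
    W_inv = np.linalg.inv(W)
    d = xbar - mu_theta                      # mean residuals
    mean_part = d @ W_inv @ d                # quadratic form in the mean residuals
    R = (S - Sigma_theta) @ W_inv            # weighted covariance residuals
    cov_part = 0.5 * np.trace(R @ R)         # Browne (1974)-style covariance term
    return mean_part + cov_part

def test_statistic(n, fit_value):
    """Chi-square-type test statistic T = (n - 1) * F at the estimates."""
    return (n - 1) * fit_value

if __name__ == "__main__":
    # Tiny synthetic example in the spirit of the small-sample setting.
    rng = np.random.default_rng(0)
    p, n = 4, 50
    data = rng.multivariate_normal(np.zeros(p), np.eye(p), size=n)
    xbar = data.mean(axis=0)
    S = np.cov(data, rowvar=False)           # unbiased sample covariance
    mu0, Sigma0 = np.zeros(p), np.eye(p)     # hypothetical model-implied moments
    T_gls = test_statistic(n, gls_type_fit(xbar, S, mu0, Sigma0, W=S))
    T_rls = test_statistic(n, gls_type_fit(xbar, S, mu0, Sigma0, W=Sigma0))
    print(T_gls, T_rls)
```

In a Monte Carlo study like the one described, statistics of this form would be computed over many replications and compared against the reference chi-square distribution to obtain empirical rejection frequencies and empirical means of the test statistics.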
