
Markov-switching Model Selection Using Kullback-Leibler Divergence

Abstract

In Markov-switching regression models, we use the Kullback-Leibler (KL) divergence between the true and candidate models to select the number of states and the variables simultaneously. When the Akaike information criterion (AIC), an estimate of KL divergence, is applied, we find that it retains too many states and variables in the model. We therefore derive a new information criterion, the Markov switching criterion (MSC), which yields a marked improvement in state determination and variable selection because it imposes an appropriate penalty that mitigates the over-retention of states in the Markov chain. MSC performs well in Monte Carlo studies with single and multiple states, small and large samples, and low and high noise. It not only applies to Markov-switching regression models but also performs well in Markov-switching autoregression models. Finally, the usefulness of MSC is illustrated through applications to the U.S. business cycle and the effectiveness of media advertising.
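
The abstract does not give the MSC penalty term, so the sketch below only illustrates the AIC-based baseline the paper argues against: fitting Markov-switching regressions with different numbers of states and comparing AIC = 2k - 2 log L. The simulated data, the choice of candidate state counts, and the use of statsmodels' MarkovRegression are assumptions for illustration, not the authors' implementation; AIC is computed directly from the log-likelihood and parameter count so the comparison does not depend on any particular results attribute.

```python
# Illustrative sketch only: compares candidate Markov-switching regressions
# with different numbers of states using AIC. Per the paper, AIC tends to
# over-retain states, which motivates the stronger penalty in MSC (whose
# formula is not given in the abstract). Data and model choices are assumed.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(0)

# Simulate a two-state switching-mean series (hypothetical example data).
n = 400
states = np.zeros(n, dtype=int)
for t in range(1, n):
    stay = 0.95 if states[t - 1] == 0 else 0.90
    states[t] = states[t - 1] if rng.random() < stay else 1 - states[t - 1]
means = np.array([0.0, 2.0])
y = means[states] + rng.normal(scale=0.5, size=n)

# Fit candidate models with 2, 3, and 4 regimes and compare AIC = 2k - 2 log L.
for k in (2, 3, 4):
    mod = sm.tsa.MarkovRegression(y, k_regimes=k, switching_variance=True)
    res = mod.fit(search_reps=20)  # random starting-value search for stability
    aic = 2 * len(res.params) - 2 * res.llf
    print(f"k_regimes={k}: log-likelihood={res.llf:.1f}, AIC={aic:.1f}")
```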
