eScholarship
Open Access Publications from the University of California

Boundaries of Predictability: Noisy Predictive Regressions

  • Author(s): Torous, Walter; Valkanov, Rossen; et al.
Abstract

Even if returns are truly forecastable by variables such as the dividend yield, the noise in such a predictive regression may overwhelm the signal of the conditioning variable and render estimation, inference, and forecasting unreliable. Unfortunately, traditional asymptotic approximations are not suitable for investigating the small-sample properties of forecasting regressions with excessive noise. To systematically analyze predictive regressions, it is useful to quantify a forecasting variable's signal relative to the noisiness of returns in a given sample. We define an index of signal strength, or information accumulation, by renormalizing the signal-noise ratio. The novelty of our parameterization is that this index explicitly influences rates of convergence and can lead to inconsistent estimation and testing, unreliable R²s, and no out-of-sample forecasting power. Indeed, we prove that if the signal-noise ratio is close to zero, as is the case for many of the explanatory variables previously suggested in the finance literature, model-based forecasts will do no better than the corresponding simple unconditional mean return. Our analytic framework is general enough to capture most of the previous findings surrounding predictive regressions using dividend yields and other persistent forecasting variables.
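The central claim — that a near-zero signal-noise ratio leaves model-based forecasts no better than the unconditional mean — can be illustrated with a minimal Monte Carlo sketch. This is not the paper's estimator or parameterization; the AR(1) predictor, the slope of 0.02, and all other values are illustrative assumptions chosen so the noise dwarfs the signal:

```python
import numpy as np

rng = np.random.default_rng(0)

T = 200            # sample length (illustrative)
beta = 0.02        # tiny slope: signal-noise ratio near zero
sigma_eps = 1.0    # return noise dwarfs the signal

# A persistent predictor, e.g. a dividend-yield-like AR(1) variable
x = np.zeros(T)
for t in range(1, T):
    x[t] = 0.98 * x[t - 1] + rng.normal(scale=0.1)

# Returns: weak signal from the lagged predictor plus dominant noise
r = beta * x[:-1] + rng.normal(scale=sigma_eps, size=T - 1)

# Estimate on the first half, forecast out-of-sample on the second
half = (T - 1) // 2
X_in, r_in = x[:half], r[:half]
X_out, r_out = x[half:T - 1], r[half:]

# OLS fit of returns on the lagged predictor (slope b, intercept a)
b, a = np.polyfit(X_in, r_in, 1)

# Out-of-sample mean squared errors: regression vs. unconditional mean
mse_model = np.mean((r_out - (a + b * X_out)) ** 2)
mse_mean = np.mean((r_out - r_in.mean()) ** 2)

print(f"model MSE: {mse_model:.4f}, mean MSE: {mse_mean:.4f}")
```

With the signal this weak, the two out-of-sample errors are essentially indistinguishable: the fitted slope is dominated by estimation noise, so conditioning on the predictor buys nothing over the simple historical mean.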
