eScholarship
Open Access Publications from the University of California


UC Riverside Electronic Theses and Dissertations

The Second-Order Bias and MSE of Quantile and Expectile Estimators

Creative Commons BY-NC-SA 4.0 license
Abstract

This dissertation covers several topics in the second-order bias and mean squared error (MSE) of quantile and expectile estimators.

Chapter one presents the introduction of this dissertation. The finite sample theory using higher-order asymptotics provides better approximations of the bias and MSE for a class of estimators. Rilstone, Srivastava and Ullah (1996) provided the second-order bias results for conditional mean regression. The goal of this dissertation is to develop analytical results on the second-order bias and MSE for quantile and expectile estimators.

Chapter two develops new analytical results on the second-order bias up to order O(N^-1) and MSE up to order O(N^-2) of the conditional quantile regression estimators. First, we provide the general results on the second-order bias and MSE of conditional quantile estimators. The second-order bias result enables an improved bias correction and thus improved quantile estimation. In particular, we show that the second-order bias is much larger towards the tails of the conditional density than near the median, so the benefit of the second-order bias correction is greater when we are interested in the deeper tail quantiles, e.g., for the study of income distribution and financial risk management. The higher-order MSE result for the quantile estimation also enables us to better understand the sources of estimation uncertainty. Next, we consider three special cases of the general results: the unconditional quantile estimation, the conditional quantile regression with a binary covariate, and the instrumental variable quantile regression (IVQR). For each of these special cases, we provide the second-order bias and MSE to illustrate their behavior, which depends on certain parameters and distributional characteristics. The Monte Carlo simulation indicates that the bias is larger at the extreme low and high tail quantiles, and that the second-order bias-corrected estimator behaves better than the uncorrected one in both conditional and unconditional quantile regression. The second-order bias-corrected estimates are numerically much closer to the true parameters of the data generating processes. Since the higher-order bias and MSE decrease as the sample size increases or as the regression error variance decreases, the benefits of the finite sample theory are more apparent when there are larger sampling errors in estimation.
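The unconditional quantile case discussed above rests on a standard objective: the tau-th quantile minimizes the expected Koenker-Bassett check loss. The following sketch (illustrative Python, not code from the dissertation; the grid-search minimizer is a simplification chosen for transparency) shows that minimizing the average check loss over candidate values recovers the sample quantile:

```python
import numpy as np

def check_loss(u, tau):
    """Check function rho_tau(u) = u * (tau - 1{u < 0})."""
    return u * (tau - (u < 0))

def quantile_estimate(y, tau, grid_size=2001):
    """Estimate the tau-th unconditional quantile by minimizing the
    average check loss over a grid of candidate values."""
    grid = np.linspace(y.min(), y.max(), grid_size)
    losses = [check_loss(y - q, tau).mean() for q in grid]
    return grid[int(np.argmin(losses))]

rng = np.random.default_rng(0)
y = rng.normal(size=10_000)
q90 = quantile_estimate(y, 0.90)  # close to the N(0,1) 0.90 quantile (about 1.28)
```

Because the empirical check loss is convex and piecewise linear in the candidate value, its minimizer coincides with the sample quantile up to grid resolution; production code would use a linear-programming or specialized quantile regression solver instead.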

Chapter three develops the second-order asymptotic properties (bias and mean squared error) of the asymmetric least squares (ALS) or expectile estimator, extending the second-order asymptotic results for the symmetric least squares (LS) estimators of Rilstone, Srivastava and Ullah (1996). The LS gives the mean regression function while the ALS gives the "expectile" regression function, a generalization of the usual regression function. The second-order bias result enables an improved bias correction and thus improved ALS estimation. In particular, we show that the second-order bias is much larger as the asymmetry is stronger, so the benefit of the second-order bias correction is greater when we are interested in extreme expectiles, which are used as a risk measure in financial economics. The higher-order MSE result for the ALS estimation also enables us to better understand the sources of estimation uncertainty. The Monte Carlo simulation confirms the benefits of the second-order asymptotic theory and indicates that the second-order bias is larger at the extreme low and high expectiles, and that the second-order bias correction reduces the bias of the ALS estimator.
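The expectile described above is the minimizer of an asymmetrically weighted squared loss, which reduces to the mean when tau = 0.5. A minimal sketch (illustrative Python, not the dissertation's code) computes an unconditional expectile by iteratively reweighted least squares, whose fixed point satisfies the expectile first-order condition:

```python
import numpy as np

def expectile(y, tau, tol=1e-8, max_iter=100):
    """tau-th expectile of y: minimizes E[|tau - 1{y < m}| * (y - m)^2]
    over m, via iteratively reweighted least squares."""
    m = y.mean()  # starting value; tau = 0.5 recovers the mean exactly
    for _ in range(max_iter):
        w = np.where(y < m, 1 - tau, tau)  # asymmetric squared-loss weights
        m_new = np.average(y, weights=w)   # weighted least-squares update
        if abs(m_new - m) < tol:
            break
        m = m_new
    return m

rng = np.random.default_rng(1)
y = rng.normal(size=10_000)
e50 = expectile(y, 0.5)  # equals the sample mean
e90 = expectile(y, 0.9)  # lies above the mean, reflecting the asymmetry
```

Stronger asymmetry (tau far from 0.5) pushes the estimate toward the tails, which is where the chapter finds the second-order bias, and hence the payoff from bias correction, to be largest.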

Chapter four introduces the predictive quantile regression and predictive expectile regression. Predictive regression is a fundamental econometric model and is widely discussed in the finance literature. This chapter focuses on the second-order bias reduction for both regression models, which enables us to obtain better predictive estimates. An empirical application to stock return prediction using the dividend yield illustrates the benefit of the proposed second-order bias reduction method. We show that the bias is larger at the tails of the stock return distribution.

Chapter five contains the conclusion.
