Search Results

Showing 1–2 of 2 items for author: Garry D. A. Phillips


We compare several bias-correction methods in terms of mean squared error and remaining bias, including the residual bootstrap, the relatively unexplored Quenouille jackknife, and methods based on analytical approximation of moments. We introduce a new higher-order jackknife estimator for the AR(1) model with a constant. Simulation results are presented for four different error structures, including GARCH, and we include results for a relatively extreme case in which the errors are highly skewed and leptokurtic. We argue that the bootstrap and analytical-correction (COLS) approaches are to be favoured overall, although the jackknife methods are the least biased: COLS tends to have the lowest mean squared error, though the bootstrap also performs well.
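The Quenouille jackknife mentioned above corrects the downward bias of the OLS estimator of the AR(1) coefficient by combining the full-sample estimate with two half-sample estimates. A minimal simulation sketch of the idea follows; the function names, sample size, and parameter values are our own illustrative choices, not the paper's design:

```python
import numpy as np

def ols_ar1(y):
    # OLS estimate of rho in y_t = c + rho * y_{t-1} + e_t (model with constant)
    X = np.column_stack([np.ones(len(y) - 1), y[:-1]])
    return np.linalg.lstsq(X, y[1:], rcond=None)[0][1]

def quenouille_jackknife_ar1(y):
    # Quenouille half-sample jackknife: 2 * rho_full - average of the two
    # half-sample estimates, which removes the O(1/n) bias term.
    n = len(y)
    rho_full = ols_ar1(y)
    rho1 = ols_ar1(y[: n // 2])
    rho2 = ols_ar1(y[n // 2:])
    return 2.0 * rho_full - 0.5 * (rho1 + rho2)

# Illustrative Monte Carlo comparison (Gaussian errors only)
rng = np.random.default_rng(0)
rho_true, n, burn, reps = 0.9, 100, 50, 2000
est_ols, est_jk = [], []
for _ in range(reps):
    e = rng.standard_normal(n + burn)
    y = np.zeros(n + burn)
    for t in range(1, n + burn):
        y[t] = 0.5 + rho_true * y[t - 1] + e[t]
    y = y[burn:]  # discard burn-in so the series starts near stationarity
    est_ols.append(ols_ar1(y))
    est_jk.append(quenouille_jackknife_ar1(y))
print(f"OLS bias:       {np.mean(est_ols) - rho_true:+.4f}")
print(f"Jackknife bias: {np.mean(est_jk) - rho_true:+.4f}")
```

With a positive autoregressive parameter the OLS bias is negative and roughly of order 1/n, and the jackknife estimate should show markedly smaller bias, consistent with the abstract's finding that the jackknife methods are the least biased.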


In this paper we extend the results in [] in two directions. First, we show that bias-correcting the estimated mean reversion parameter also yields better finite-sample properties for the testing procedure based on a t-statistic in the near unit root situation, when the mean reversion parameter approaches its lower bound, than using the jackknife estimator of Phillips and Yu []. Second, although Tang and Chen [] demonstrate that the variance of the maximum likelihood estimator of the long-term mean parameter is of an order equal to the reciprocal of the sample size (the same order as the bias and variance of the mean reversion parameter estimator, so that it converges slowly to its true value), we show that the t-statistic related to that parameter does not exhibit large empirical size distortions and so does not need to be bias corrected in practice.
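The jackknife of Phillips and Yu referred to above combines the full-sample estimator with estimates from m non-overlapping consecutive subsamples. A minimal sketch of that combination follows, under our own illustrative setup: an AR(1) OLS estimator stands in for the estimator of the mean reversion parameter, and all names and values are hypothetical:

```python
import numpy as np

def ols_ar1(y):
    # OLS slope in y_t = c + rho * y_{t-1} + e_t
    X = np.column_stack([np.ones(len(y) - 1), y[:-1]])
    return np.linalg.lstsq(X, y[1:], rcond=None)[0][1]

def subsample_jackknife(y, estimator, m=2):
    # Jackknife in the style of Phillips and Yu:
    #   theta_J = m/(m-1) * theta_full - sum_i theta_i / (m^2 - m)
    # where theta_i come from m non-overlapping consecutive blocks.
    # For m = 2 this reduces to the Quenouille half-sample jackknife.
    n = len(y)
    theta_full = estimator(y)
    block = n // m
    theta_sub = [estimator(y[i * block:(i + 1) * block]) for i in range(m)]
    return (m / (m - 1)) * theta_full - sum(theta_sub) / (m * m - m)

# Example on a simulated near-unit-root AR(1) series
rng = np.random.default_rng(1)
rho, T = 0.95, 300
e = rng.standard_normal(T)
y = np.zeros(T)
for t in range(1, T):
    y[t] = rho * y[t - 1] + e[t]
print(subsample_jackknife(y, ols_ar1, m=2))
```

The weights m/(m-1) and 1/(m^2 - m) are chosen so that the first-order bias terms of the full-sample and subsample estimates cancel while the estimator remains consistent.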