World Conference Econometric Society, 2000, Seattle

Yue Fang, University of Oregon

When Should Time be Continuous? Volatility Modeling and Estimation of High-Frequency Data

Session: C-11-20, Tuesday 15 August 2000, by Fang, Yue

The paper studies the problem of volatility modeling and estimation of high-frequency data under continuous-record asymptotics. The approach decomposes the observed data into a price-diffusion component and a stationary component. The diffusion component may be identified as the "true" value of the underlying asset. The stationary component, termed the high-frequency "noise" (HFN), accommodates pertinent market-microstructure features. A simple condition on the HFN component, under which conventional volatility estimators based on noisy observations are consistent for the diffusion volatility, is derived and applied to Reuters FXFX data. It is shown that conventional volatility estimators produce substantial spurious volatility in high-frequency returns. Their failure to provide consistent estimates is due to the greater irregularity of the HFN sample path, which is induced, at least in part, by trader heterogeneity. In addition, an optimal sampling frequency is derived, which justifies the use of 10- to 15-minute sampling intervals - the benchmark noise filter used in many recent empirical studies dealing with high-frequency foreign-exchange data.
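The spurious-volatility effect the abstract describes can be illustrated with a minimal simulation: a diffusion ("true" price) contaminated by i.i.d. noise standing in for the HFN component, with realized variance computed at several sampling intervals. This is a sketch under assumed parameter values (daily variance, noise standard deviation, and a Brownian-motion diffusion with i.i.d. Gaussian noise), not the paper's actual model or data.

```python
import numpy as np

rng = np.random.default_rng(0)

# One 6-hour trading day of "true" log prices as a Brownian diffusion
# with daily (integrated) variance sigma2, observed once per second,
# then contaminated with i.i.d. noise standing in for the HFN component.
# All parameter values below are illustrative assumptions.
n = 6 * 60 * 60            # seconds in the trading day
sigma2 = 1e-4              # daily diffusion variance (assumed)
noise_sd = 5e-4            # std. dev. of the noise component (assumed)

true_p = np.cumsum(rng.normal(0.0, np.sqrt(sigma2 / n), n))
obs_p = true_p + rng.normal(0.0, noise_sd, n)

def realized_variance(p, step):
    """Sum of squared returns, sampling every `step`-th observation."""
    r = np.diff(p[::step])
    return float(np.sum(r * r))

# At high frequency the i.i.d.-noise bias dominates:
# E[RV] ~ sigma2 + 2 * (n / step) * noise_sd**2,
# so RV shrinks toward sigma2 as the sampling interval lengthens.
for minutes in (1 / 60, 1, 5, 15):
    step = max(1, int(minutes * 60))
    print(f"{minutes:>7.3f}-min sampling: RV = {realized_variance(obs_p, step):.6f}")
```

Sampling every second yields a realized variance roughly two orders of magnitude above the true diffusion variance, while coarser 5- to 15-minute grids bring it back near sigma2 at the cost of fewer return observations: the trade-off behind the optimal sampling frequency discussed in the paper.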

Submitted paper full-text in .pdf

File created by Jurgen Doornik with eswc2000.ox on 2-01-2001