New Economics Papers on Econometric Time Series
By: Tim Bollerslev (Department of Economics, Duke University, and NBER and CREATES); Viktor Todorov (Department of Finance, Kellogg School of Management, Northwestern University)
Abstract: We propose a new and flexible non-parametric framework for estimating the jump tails of Itô semimartingale processes. The approach is based on a relatively simple-to-implement set of estimating equations associated with the compensator for the jump measure, or its "intensity", that only utilizes the weak assumption of regular variation in the jump tails, along with in-fill asymptotic arguments for uniquely identifying the "large" jumps from the data. The estimation allows for very general dynamic dependencies in the jump tails, and restricts neither the continuous part of the process nor the temporal variation in the stochastic volatility. Implementing the new estimation procedure with actual high-frequency data for the S&P 500 aggregate market portfolio, we find strong evidence for richer and more complex dynamic dependencies in the jump tails than hitherto entertained in the literature. (An illustrative sketch follows this entry.)
Keywords: Extreme events, jumps, high-frequency data, jump tails, non-parametric estimation, stochastic volatility
JEL: C13 C14 G10 G12
Date: 2010-04-14
URL: http://d.repec.org/n?u=RePEc:aah:create:2010-16&r=ets
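The sketch below illustrates only the general idea behind tail estimation under regular variation, not the authors' estimating equations: separate "large" moves from diffusive ones with a truncation threshold of the familiar alpha * Delta_n^varpi form and apply a Hill-type estimator to the exceedances. The simulated returns, the threshold constants, and the choice of upper order statistics are all hypothetical.

```python
# Illustrative sketch only; not the estimating equations proposed in the paper.
import numpy as np

rng = np.random.default_rng(0)

# Simulated 5-minute log-returns: diffusive noise plus occasional heavy-tailed jumps.
n = 78 * 250                              # roughly one trading year of 5-minute returns (hypothetical)
sigma = 0.01 / np.sqrt(78)                # per-interval diffusive volatility
returns = sigma * rng.standard_normal(n)
jump_flag = rng.random(n) < 0.002
returns += jump_flag * rng.standard_t(df=3, size=n) * 10 * sigma

# Step 1: truncation threshold of the form alpha * Delta_n^varpi, as is common in
# in-fill asymptotics, to separate "large" jumps from diffusive moves.
delta_n = 1.0 / 78
alpha_trunc, varpi = 0.05, 0.49
threshold = alpha_trunc * delta_n ** varpi
exceed = np.abs(returns)[np.abs(returns) > threshold]

# Step 2: Hill-type estimator of the tail index on the exceedances, consistent
# under regular variation of the jump tail.
x = np.sort(exceed)[::-1]
k = max(len(x) // 5, 1)                   # number of upper order statistics (tuning choice)
hill_xi = np.mean(np.log(x[:k]) - np.log(x[k]))
print(f"{len(exceed)} exceedances, Hill tail-index estimate: {hill_xi:.3f}")
```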
By: Claudia Miani (Bank of Italy); Stefano Siviero (Bank of Italy)
Abstract: It has increasingly become standard practice to supplement point macroeconomic forecasts with an appraisal of the degree of uncertainty and the prevailing direction of risks. Several alternative approaches have been proposed in the literature to compute the probability distribution of macroeconomic forecasts; all of them rely on combining the predictive density of model-based forecasts with subjective judgment about the direction and intensity of prevailing risks. We propose a non-parametric, model-based simulation approach which does not require specific assumptions about the probability distribution of the sources of risk. The probability distribution of macroeconomic forecasts is computed by means of model-based stochastic simulations that re-sample from the historical distribution of risk factors and are designed to deliver the desired degree of skewness. By contrast, other approaches typically make a specific, parametric assumption about the distribution of risk factors. The approach is illustrated using the Bank of Italy’s Quarterly Macroeconometric Model. The results suggest that the distribution of macroeconomic forecasts quickly tends to become symmetric, even if all risk factors are assumed to be asymmetrically distributed. (An illustrative sketch follows this entry.)
Keywords: macroeconomic forecasts, stochastic simulations, balance of risks, uncertainty, fan charts
JEL: C14 C53 E37
Date: 2010-04
URL: http://d.repec.org/n?u=RePEc:bdi:wptemi:td_758_10&r=ets
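A generic sketch of the re-sampling idea, with a hypothetical AR(1) standing in for the Bank of Italy's Quarterly Model: residuals are re-drawn from their historical distribution with tilted probabilities to build in the desired skew, and fan-chart bands are read off the simulated paths. The model, the tilt, and all parameter values are made up for illustration.

```python
# Generic sketch: a hypothetical AR(1) stands in for the macro model.
import numpy as np

rng = np.random.default_rng(1)

phi, y0 = 0.8, 0.0
residuals = rng.standard_normal(500)              # stand-in for historical risk factors

# Tilt re-sampling probabilities toward adverse (negative) draws to induce the
# desired downside skew, without any parametric distributional assumption.
skew_tilt = 1.5
w = np.where(residuals < 0.0, skew_tilt, 1.0)
w /= w.sum()

horizon, n_paths = 8, 5000
paths = np.empty((n_paths, horizon))
for i in range(n_paths):
    y = y0
    shocks = rng.choice(residuals, size=horizon, p=w)   # non-parametric draws
    for h in range(horizon):
        y = phi * y + shocks[h]
        paths[i, h] = y

# Fan-chart bands: quantiles of the simulated forecast distribution at each horizon.
bands = np.quantile(paths, [0.05, 0.25, 0.50, 0.75, 0.95], axis=0)
print(np.round(bands, 2))
```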
By: Giovanni De Luca (Dipartimento di Statistica e Matematica per la Ricerca Economica, Università di Napoli Parthenope); Giampiero Gallo (Università degli Studi di Firenze, Dipartimento di Statistica "G. Parenti")
Abstract: In this paper we model the dynamics of realized volatility as a Multiplicative Error Model whose innovation term follows a mixture of distributions with time-varying mixing weights driven by the past behavior of volatility. The mixture treats the innovations as a source of time-varying volatility of volatility and is able to capture the right-tail behavior of the distribution of volatility. The empirical results show that there is no substantial difference in the one-step-ahead conditional expectations obtained under the various mixing schemes, but that fixing the mixing weights may be a binding constraint when deriving accurate quantiles of the predicted distribution. (An illustrative sketch follows this entry.)
Keywords: Multiplicative Error Models, Realized Volatility, Mixture Distributions
JEL: C22 C51 C53
Date: 2010-04
URL: http://d.repec.org/n?u=RePEc:fir:econom:wp2010_03&r=ets
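A simulation sketch of a Multiplicative Error Model whose innovation is a two-component Gamma mixture with a mixing weight that depends on lagged volatility. The recursion and every parameter value (the MEM coefficients, the Gamma shapes, the logit coefficients of the weight) are hypothetical and not the authors' estimated specification.

```python
# Simulation sketch with illustrative parameter values, not an estimated model.
import numpy as np

rng = np.random.default_rng(2)

T = 1000
omega, alpha, beta = 0.05, 0.25, 0.70     # MEM recursion for the conditional mean
a_thin, a_fat = 8.0, 2.0                  # Gamma shapes; both components have unit mean
g0, g1 = -3.0, 2.0                        # logit coefficients of the mixing weight

x = np.empty(T)
mu = np.empty(T)
x[0] = mu[0] = 1.0
for t in range(1, T):
    mu[t] = omega + alpha * x[t - 1] + beta * mu[t - 1]
    # time-varying weight on the fat-tailed component, driven by lagged volatility
    p_fat = 1.0 / (1.0 + np.exp(-(g0 + g1 * x[t - 1])))
    shape = a_fat if rng.random() < p_fat else a_thin
    eps = rng.gamma(shape, 1.0 / shape)   # unit-mean Gamma innovation
    x[t] = mu[t] * eps                    # multiplicative error structure

print(f"mean: {x.mean():.2f}  99th percentile: {np.quantile(x, 0.99):.2f}")
```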
By: Francq, Christian; Zakoian, Jean-Michel
Abstract: This paper studies the asymptotic properties of the quasi-maximum likelihood estimator of ARCH(1) models without strict stationarity constraints, and considers applications to testing problems. The estimator is unrestricted, in the sense that the value of the intercept, which cannot be consistently estimated in the explosive case, is not fixed. A specific behavior of the estimator of the ARCH coefficient is obtained at the boundary of the stationarity region, but this estimator remains consistent and asymptotically normal in every situation. The asymptotic variance differs between the stationary and non-stationary situations, but is consistently estimated, with the same estimator, in both cases. Tests of strict stationarity and non-stationarity are proposed. Their behavior is studied under the null hypothesis and under local alternatives. The tests developed for the ARCH(1) model are able to detect non-stationarity in more general GARCH models. A numerical illustration based on stock indices is provided. (An illustrative sketch follows this entry.)
Keywords: ARCH model; Inconsistency of estimators; Local power of tests; Nonstationarity; Quasi-Maximum Likelihood Estimation
JEL: C13 C12 C22 C01
Date: 2010-04
URL: http://d.repec.org/n?u=RePEc:pra:mprapa:22414&r=ets
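A minimal sketch of the Gaussian quasi-maximum likelihood estimator for an ARCH(1) model, with the intercept left unrestricted. The simulated sample, starting values, and optimizer settings are arbitrary; the code only illustrates the estimator whose asymptotic behavior the paper studies, not the proposed stationarity tests.

```python
# Sketch of the Gaussian QMLE for ARCH(1); illustration only.
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(3)

# Simulate eps_t = sigma_t * eta_t with sigma_t^2 = omega + alpha * eps_{t-1}^2.
T, omega0, alpha0 = 2000, 0.5, 0.9
eps = np.empty(T)
s2 = omega0
for t in range(T):
    eps[t] = np.sqrt(s2) * rng.standard_normal()
    s2 = omega0 + alpha0 * eps[t] ** 2

def neg_quasi_loglik(theta, eps):
    """Negative Gaussian quasi-log-likelihood; the intercept is left unrestricted."""
    omega, alpha = theta
    s2 = np.empty_like(eps)
    s2[0] = eps.var()                     # crude initialization of the volatility recursion
    for t in range(1, len(eps)):
        s2[t] = omega + alpha * eps[t - 1] ** 2
    return 0.5 * np.sum(np.log(s2) + eps ** 2 / s2)

res = minimize(neg_quasi_loglik, x0=np.array([0.1, 0.1]), args=(eps,),
               bounds=[(1e-6, None), (1e-6, None)], method="L-BFGS-B")
print("QMLE of (omega, alpha):", np.round(res.x, 3))
```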
By: Jesús Fernández-Villaverde (Department of Economics, University of Pennsylvania); Pablo Guerrón-Quintana (Federal Reserve Bank of Philadelphia); Juan F. Rubio-Ramírez (Department of Economics, Duke University)
Abstract: This paper compares the role of stochastic volatility versus changes in monetary policy rules in accounting for the time-varying volatility of U.S. aggregate data. Of special interest to us is understanding the sources of the great moderation of business cycle fluctuations that the U.S. economy experienced between 1984 and 2007. To explore this issue, we build a medium-scale dynamic stochastic general equilibrium (DSGE) model with both stochastic volatility and parameter drifting in the Taylor rule, and estimate it non-linearly using U.S. data and Bayesian methods. Methodologically, we show how to confront such a rich model with the data by exploiting the structure of the high-order approximation to the decision rules that characterize the equilibrium of the economy. Our main empirical findings are: 1) even after controlling for stochastic volatility (and there is a fair amount of it), there is overwhelming evidence of changes in monetary policy during the analyzed period; 2) however, these changes in monetary policy mattered little for the great moderation; 3) most of the great performance of the U.S. economy during the 1990s was a result of good shocks; and 4) the response of monetary policy to inflation under Burns, Miller, and Greenspan was similar, while it was much higher under Volcker. (An illustrative sketch follows this entry.)
Keywords: DSGE models, Stochastic volatility, Parameter drifting, Bayesian methods
JEL: E10 E30 C11
Date: 2010-04-15
URL: http://d.repec.org/n?u=RePEc:pen:papers:10-015&r=ets
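An isolated sketch of the two mechanisms the paper compares: a Taylor-type rule whose inflation response drifts over time, and a policy shock with log-AR(1) stochastic volatility. The rule, the stand-in inflation series, and all parameter values here are hypothetical; the paper embeds these features in a full medium-scale DSGE model and estimates them non-linearly with Bayesian methods.

```python
# Illustrative sketch of parameter drifting and stochastic volatility in isolation.
import numpy as np

rng = np.random.default_rng(4)

T = 200
phi_bar, rho_phi, sig_phi = 1.5, 0.98, 0.02          # drifting inflation response in the rule
logv_bar, rho_v, sig_v = np.log(0.25), 0.95, 0.10    # log-AR(1) volatility of the policy shock

phi_pi = np.full(T, phi_bar)
log_vol = np.full(T, logv_bar)
rate = np.empty(T)
infl = 2.0 + 0.5 * rng.standard_normal(T)            # stand-in inflation series (hypothetical)

for t in range(T):
    if t > 0:
        phi_pi[t] = phi_bar + rho_phi * (phi_pi[t - 1] - phi_bar) + sig_phi * rng.standard_normal()
        log_vol[t] = logv_bar + rho_v * (log_vol[t - 1] - logv_bar) + sig_v * rng.standard_normal()
    shock = np.exp(log_vol[t]) * rng.standard_normal()
    rate[t] = 1.0 + phi_pi[t] * (infl[t] - 2.0) + shock   # simple Taylor-type rule

print(f"mean policy rate: {rate.mean():.2f}  shock volatility range: "
      f"[{np.exp(log_vol).min():.2f}, {np.exp(log_vol).max():.2f}]")
```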