New Economics Papers on Econometric Time Series
By: | Pablo Duarte; Bernd Süssmuth |
Abstract: | Quarterly GDP figures are usually published with a delay of several weeks. A common way to generate GDP series of higher frequency, i.e. to nowcast GDP, is to use available indicators to calculate a single index by means of a common factor derived from a dynamic factor model (DFM). This paper deals with the implementation stage of this practice. We propose a two-tiered mechanism consisting of the identification of variables highly correlated with GDP as “core” indicators and a robustness check of these variables in the sense of extreme bounds analysis. The indicators selected in this way are used in an approximate DFM framework to nowcast Spanish GDP growth as an illustrative case. We show that our implementation produces more accurate nowcasts than both a benchmark stochastic process and an implementation based on the full set of core indicators. |
Keywords: | small-scale nowcasting models, Kalman Filter, extreme bounds analysis |
JEL: | C38 C53 |
Date: | 2014 |
URL: | http://d.repec.org/n?u=RePEc:ces:ceswps:_4574&r=ets |
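A minimal sketch of the two-tiered idea in Python (not the authors' code): correlate candidate indicators with GDP growth to select a "core" set, extract a single common factor by principal components as a simple stand-in for the approximate DFM, and map the factor into a nowcast through a bridge regression. The extreme-bounds robustness check is omitted, and the correlation threshold, the simulated data and the OLS bridge step are assumptions of this illustration.

import numpy as np

# Illustrative sketch with simulated data; threshold and factor loadings are assumptions.
rng = np.random.default_rng(0)
T, N = 80, 12                                   # quarters, candidate indicators
factor = rng.standard_normal(T)                 # latent common component
gdp_growth = 0.5 * factor + 0.3 * rng.standard_normal(T)
indicators = factor[:, None] * rng.uniform(0.1, 1.5, N) + rng.standard_normal((T, N))

# Step 1: keep only indicators highly correlated with GDP growth (the "core" set).
corr = np.array([np.corrcoef(gdp_growth, indicators[:, j])[0, 1] for j in range(N)])
core = indicators[:, np.abs(corr) > 0.4]

# Step 2: first principal component of the standardized core set as the common factor.
z = (core - core.mean(0)) / core.std(0)
_, _, vt = np.linalg.svd(z, full_matrices=False)
common_factor = z @ vt[0]

# Step 3: bridge regression of GDP growth on the factor gives the nowcast rule.
slope_intercept = np.polyfit(common_factor, gdp_growth, 1)
nowcast = np.polyval(slope_intercept, common_factor[-1])
print(f"nowcast of current-quarter GDP growth: {nowcast:.3f}")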
By: | Andreas Blöchl; Gebhard Flaig |
Abstract: | In this paper we use the Hodrick-Prescott filter to analyse global temperature data. We are especially concerned with a reliable estimation of the trend component at the end of the data sample. To this end we employ time-varying values of the penalization parameter; the optimal values are derived by a comparison with an ideal filter. The method is applied to temperature data for the northern hemisphere from 1850 to 2012. The main result is that, under the optimal specification of the flexible penalization, the trend component of temperature is still increasing, possibly at a somewhat slower pace. |
Keywords: | climate change, global warming, trend, Hodrick-Prescott filter, flexible penalization |
JEL: | Q54 C22 |
Date: | 2014 |
URL: | http://d.repec.org/n?u=RePEc:ces:ceswps:_4577&r=ets |
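The flexible-penalization HP filter can be written as a penalized least-squares problem whose normal equations are linear, so a time-varying penalty only changes the weighting matrix. The Python sketch below solves (I + D'ΛD)τ = y, with D the second-difference operator and Λ = diag(λ_t); the toy series and the particular λ schedule are assumptions of this illustration, whereas the paper derives the optimal values by comparison with an ideal filter.

import numpy as np

def hp_filter_flexible(y, lam):
    """HP trend with observation-specific penalties lam (length T-2)."""
    T = len(y)
    D = np.zeros((T - 2, T))                    # second-difference operator
    for t in range(T - 2):
        D[t, t:t + 3] = [1.0, -2.0, 1.0]
    A = np.eye(T) + D.T @ np.diag(lam) @ D      # normal equations of the penalized LS problem
    return np.linalg.solve(A, y)

# Toy temperature-like series: slow trend plus noise (illustrative, not the actual data).
rng = np.random.default_rng(1)
T = 160
y = 0.005 * np.arange(T) + 0.3 * np.sin(np.arange(T) / 20) + 0.2 * rng.standard_normal(T)

lam_const = np.full(T - 2, 100.0)               # conventional fixed penalty for annual data
lam_flex = np.linspace(100.0, 400.0, T - 2)     # assumed schedule varying towards the sample end
print(hp_filter_flexible(y, lam_const)[-1], hp_filter_flexible(y, lam_flex)[-1])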
By: | Matteo Barigozzi (London School of Economics and Political Science – Department of Statistics); Christian T. Brownlees (Universitat Pompeu Fabra – Department of Economics and Business & Barcelona GSE); Giampiero M. Gallo (Dipartimento di Statistica, Informatica, Applicazioni "G.Parenti", Università di Firenze); David Veredas (ECARES – Solvay Brussels School of Economics and Management – Université libre de Bruxelles) |
Abstract: | Realized volatilities measured on several assets exhibit a common secular trend and some idiosyncratic pattern. We accommodate this empirical regularity by extending the class of Multiplicative Error Models (MEMs) to a model in which the common trend is estimated nonparametrically while the idiosyncratic dynamics are assumed to follow univariate MEMs. Estimation theory based on seminonparametric methods is developed for this class of models for large cross-sections and large time dimensions. The methodology is illustrated using two panels of realized volatility measures between 2001 and 2008: the SPDR Sectoral Indices of the S&P500 and the constituents of the S&P100. Results show that the shape of the common volatility trend captures the overall level of risk in the market and that the idiosyncratic dynamics have a heterogeneous degree of persistence around the trend. An out-of-sample forecasting exercise shows that the proposed methodology improves volatility prediction over a number of benchmark specifications. |
Keywords: | Vector Multiplicative Error Model, Seminonparametric Estimation, Volatility. |
JEL: | C32 C51 G01 |
Date: | 2014–02 |
URL: | http://d.repec.org/n?u=RePEc:fir:econom:wp2014_02&r=ets |
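A stylized two-step version of the decomposition described above, in Python: the common trend is estimated nonparametrically from the cross-sectional average of the realized volatilities (here with a simple moving-average smoother), and a univariate MEM(1,1) is then fitted to one de-trended series by exponential quasi-maximum likelihood. The smoother, its bandwidth, and the simulated panel are assumptions of this sketch, not the authors' seminonparametric estimator.

import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(2)
T, N = 1000, 10
trend = 1.0 + 0.5 * np.sin(np.arange(T) / 150)                        # common secular trend
rv = trend[:, None] * rng.gamma(shape=4.0, scale=0.25, size=(T, N))   # unit-mean idiosyncratic part

# Step 1: nonparametric trend estimate = smoothed cross-sectional average.
bw = 25
trend_hat = np.convolve(rv.mean(axis=1), np.ones(bw) / bw, mode="same")
x = rv[:, 0] / trend_hat                                              # one de-trended series

# Step 2: MEM(1,1), x_t = mu_t * eps_t with mu_t = omega + alpha x_{t-1} + beta mu_{t-1},
# estimated by maximizing the exponential quasi-log-likelihood.
def neg_qll(theta, x):
    omega, alpha, beta = theta
    mu = np.empty_like(x)
    mu[0] = x.mean()
    for t in range(1, len(x)):
        mu[t] = omega + alpha * x[t - 1] + beta * mu[t - 1]
    return np.sum(np.log(mu) + x / mu)

res = minimize(neg_qll, x0=[0.1, 0.2, 0.7], args=(x,),
               bounds=[(1e-6, None), (0.0, 1.0), (0.0, 1.0)])
print("MEM(1,1) estimates (omega, alpha, beta):", res.x.round(3))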
By: | Giampiero M. Gallo (Dipartimento di Statistica, Informatica, Applicazioni "G.Parenti", Università di Firenze); Edoardo Otranto (Dipartimento di Scienze Cognitive e della Formazione, Università degli Studi di Messina) |
Abstract: | Realized volatility of financial time series generally shows a slow-moving average level from the early 2000s to recent times, with alternating periods of turmoil and quiet. Modeling such a pattern has been tackled in the literature in various ways, with solutions ranging from long memory to Markov switching and spline interpolation. In this paper, we explore the extension of Multiplicative Error Models to include Markovian dynamics (MS-MEM). Such a model is able to capture sudden changes in volatility following an abrupt crisis and to accommodate different dynamic responses within each regime. The model is applied to the realized volatility of the S&P500 index: next to an interesting interpretation of the regimes in terms of market events, the MS-MEM has better in-sample fitting capability and achieves good out-of-sample forecasting performance relative to alternative specifications. |
Keywords: | MEM, regime switching, realized volatility, volatility persistence, volatility forecasting |
JEL: | C22 C24 C58 |
Date: | 2014–02 |
URL: | http://d.repec.org/n?u=RePEc:fir:econom:wp2014_03&r=ets |
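To illustrate the regime-switching mechanics, the Python sketch below runs a Hamilton filter for a stylized two-regime MEM with an exponential error. To keep it short it sidesteps the path dependence of a full MS-MEM by letting each regime's conditional mean depend only on the lagged observation; the parameter values, transition matrix and simulated data are assumptions of this illustration, not estimates from the paper.

import numpy as np

def ms_mem_filter(x, params, P, pi0):
    """Hamilton filter; params[k] = (omega_k, alpha_k), P = regime transition matrix."""
    T, K = len(x), len(params)
    prob = np.array(pi0, dtype=float)                       # filtered regime probabilities
    filtered = np.zeros((T, K))
    loglik = 0.0
    for t in range(T):
        x_lag = x[t - 1] if t > 0 else x.mean()
        mu = np.array([w + a * x_lag for w, a in params])   # regime-specific conditional means
        dens = np.exp(-x[t] / mu) / mu                      # exponential density given each regime
        pred = P.T @ prob                                   # one-step-ahead regime probabilities
        joint = pred * dens
        lik = joint.sum()
        loglik += np.log(lik)
        prob = joint / lik
        filtered[t] = prob
    return filtered, loglik

rng = np.random.default_rng(3)
x = np.r_[rng.gamma(4, 0.25, 400), rng.gamma(4, 0.75, 200), rng.gamma(4, 0.25, 400)]
params = [(0.2, 0.8), (0.6, 0.8)]                           # calm vs. turbulent regime (assumed)
P = np.array([[0.98, 0.02], [0.05, 0.95]])
filtered, ll = ms_mem_filter(x, params, P, pi0=[0.5, 0.5])
print("log-likelihood:", round(ll, 2), "| last filtered probabilities:", filtered[-1].round(3))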
By: | Makoto Takahashi (Center for the Study of Finance and Insurance, Osaka University and Department of Finance, Kellogg School of Management, Northwestern University); Toshiaki Watanabe (Institute of Economic Research, Hitotsubashi University); Yasuhiro Omori (Faculty of Economics, The University of Tokyo) |
Abstract: | The realized stochastic volatility model of Takahashi, Omori, and Watanabe (2009), which incorporates the asymmetric stochastic volatility model with the realized volatility, is extended with a more general form of bias correction in realized volatility and a wider class of return distributions, the generalized hyperbolic skew Student's t-distribution. The extensions make it possible to adjust for the bias due to market microstructure noise and non-trading hours, which possibly depends on the level of volatility, and to account for the heavy tails and skewness of financial returns. With a Bayesian estimation scheme via the Markov chain Monte Carlo method, the model enables us to estimate the parameters of the return distribution and of the model jointly. It also makes it possible to forecast volatility and return quantiles by sampling jointly from their posterior distributions. The model is applied to quantile forecasts of financial returns, such as value-at-risk and expected shortfall, as well as volatility forecasts, and these forecasts are evaluated by several backtesting procedures. Empirical results for SPDR, the S&P 500 exchange-traded fund, show that the heavy tails and skewness of daily returns are important for the model fit and the quantile forecasts but not for the volatility forecasts, and that the additional bias correction improves the quantile forecasts but does not substantially improve the model fit or the volatility forecasts. |
Date: | 2014–02 |
URL: | http://d.repec.org/n?u=RePEc:tky:fseres:2014cf921&r=ets |
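The Python sketch below simulates the basic realized stochastic volatility structure the paper builds on: a daily return equation, a measurement equation for the log realized measure with a bias term, and an AR(1) log-volatility state. It is a Gaussian, symmetric simplification without the skew Student's t errors, the level-dependent bias correction or the MCMC estimation, and all parameter values are assumptions of this illustration.

import numpy as np

rng = np.random.default_rng(4)
T = 1500
mu, phi, sigma_eta = -0.5, 0.97, 0.15    # log-volatility dynamics (assumed values)
xi, sigma_u = -0.3, 0.20                 # bias and noise in the log realized measure

h = np.empty(T)
h[0] = mu
for t in range(1, T):
    h[t] = mu + phi * (h[t - 1] - mu) + sigma_eta * rng.standard_normal()

returns = np.exp(h / 2) * rng.standard_normal(T)        # return equation
log_rv = xi + h + sigma_u * rng.standard_normal(T)      # measurement equation with bias xi

# A crude 1% value-at-risk given today's volatility state; the paper instead
# samples returns and volatilities jointly from the posterior predictive distribution.
sim = np.exp(h[-1] / 2) * rng.standard_normal(100_000)
print("1% VaR estimate:", round(float(np.quantile(sim, 0.01)), 4))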
By: | Hautsch, Nikolaus; Okhrin, Ostap; Ristig, Alexander |
Abstract: | We propose an iterative procedure for the efficient estimation of models with complex log-likelihood functions and a potentially large number of parameters relative to the number of observations. Given consistent but inefficient estimates of sub-vectors of the parameter vector, the procedure yields computationally tractable, consistent and asymptotically efficient estimates of all parameters. We show asymptotic normality and derive the estimator's asymptotic covariance as a function of the number of iteration steps. To mitigate the curse of dimensionality in highly parameterized models, we combine the procedure with a penalization approach that yields sparsity and reduces model complexity. Small-sample properties of the estimator are illustrated for two time series models in a simulation study. In an empirical application, we use the proposed method to estimate the connectedness between companies by extending the approach of Diebold and Yilmaz (2014) to a high-dimensional non-Gaussian setting. |
Keywords: | Multi-Step estimation, Sparse estimation, Multivariate time series, Maximum likelihood estimation, Copula |
JEL: | C13 C32 C50 |
Date: | 2014 |
URL: | http://d.repec.org/n?u=RePEc:zbw:cfswop:450&r=ets |
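A toy Python illustration of the blockwise iteration idea: start from consistent but inefficient estimates of parameter sub-vectors (here the two marginal means of a bivariate Gaussian, estimated separately), then cycle over the blocks, each time maximizing the full log-likelihood in one block with the others held fixed. The model, the number of sweeps and the absence of the penalization step are assumptions of this sketch, not the authors' estimator.

import numpy as np
from scipy.optimize import minimize_scalar
from scipy.stats import multivariate_normal

rng = np.random.default_rng(5)
rho_true = 0.6
data = rng.multivariate_normal([1.0, -1.0], [[1.0, rho_true], [rho_true, 1.0]], size=500)

def negll(m1, m2, rho):
    # Full negative log-likelihood of the bivariate Gaussian with unit variances.
    cov = np.array([[1.0, rho], [rho, 1.0]])
    return -multivariate_normal(mean=[m1, m2], cov=cov).logpdf(data).sum()

# Step 0: consistent but inefficient starting values from the marginals alone.
m1, m2, rho = data[:, 0].mean(), data[:, 1].mean(), 0.0

# Iterate: update each block holding the other blocks at their current values.
for _ in range(5):
    rho = minimize_scalar(lambda r: negll(m1, m2, r), bounds=(-0.99, 0.99), method="bounded").x
    m1 = minimize_scalar(lambda m: negll(m, m2, rho), bounds=(-5.0, 5.0), method="bounded").x
    m2 = minimize_scalar(lambda m: negll(m1, m, rho), bounds=(-5.0, 5.0), method="bounded").x

print("iterated estimates (mean1, mean2, rho):", round(m1, 3), round(m2, 3), round(rho, 3))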