New Economics Papers on Econometric Time Series
By: | Xu, Yongdeng (Cardiff Business School) |
Abstract: | This paper proposes a new class of multivariate volatility models that utilise high-frequency data. We call this model the DCC-HEAVY model, as its key ingredients are the Engle (2002) DCC model and the Shephard and Sheppard (2012) HEAVY model. We discuss the model's dynamics and highlight its differences from DCC-GARCH models. Specifically, in the DCC-HEAVY model the dynamics of the conditional variances are driven by the lagged realized variances, while the dynamics of the conditional correlations are driven by the lagged realized correlations (a schematic of these recursions follows this entry). The new model removes the well-known asymptotic bias in DCC-GARCH model estimation and has more desirable asymptotic properties. We also derive a quasi-maximum likelihood estimator and provide closed-form formulas for multi-step forecasts. Empirical results suggest that the DCC-HEAVY model outperforms the DCC-GARCH model both in- and out-of-sample. |
Keywords: | HEAVY model, Multivariate volatility, High-frequency data, Forecasting, Wishart distribution |
JEL: | C32 C58 G17 |
Date: | 2019–02 |
URL: | http://d.repec.org/n?u=RePEc:cdf:wpaper:2019/5&r=all |
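To make the dynamics described above concrete, here is a minimal schematic of HEAVY-type variance and DCC-type correlation recursions; the notation and exact functional forms are our illustration, not the paper's specification:

    h_{i,t} = \omega_i + \alpha_i \, RV_{i,t-1} + \beta_i \, h_{i,t-1}
    R_t     = (1 - a - b)\,\bar{R} + a \, RC_{t-1} + b \, R_{t-1}

where RV_{i,t-1} is the lagged realized variance of series i, RC_{t-1} is the lagged realized correlation matrix, and \bar{R} is a constant correlation target. The contrast with DCC-GARCH is that lagged realized measures, rather than lagged squared returns and outer products of returns, drive the updates.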
By: | Lubik, Thomas A.; Matthes, Christian; Verona, Fabio |
Abstract: | We study the behavior of key macroeconomic variables in the time and frequency domain. For this purpose, we decompose U.S. time series into various frequency components (a sketch of such a decomposition follows this entry). This allows us to identify a set of stylized facts: GDP growth is largely a high-frequency phenomenon, whereas inflation and nominal interest rates are characterized largely by low-frequency components. In contrast, unemployment is a medium-term phenomenon. We use these decompositions jointly in a structural VAR where we identify monetary policy shocks using a sign restriction approach. We find that monetary policy shocks affect these key variables in a broadly similar manner across all frequency bands. Finally, we assess the ability of standard DSGE models to replicate these findings. While the models generally capture low-frequency movements via stochastic trends and business cycle fluctuations through various frictions, they fail at capturing the medium-term cycle. |
JEL: | C32 C51 E32 |
Date: | 2019–02–20 |
URL: | http://d.repec.org/n?u=RePEc:bof:bofrdp:2019_005&r=all |
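The abstract does not state which filter the authors use; as an illustration of splitting a series into frequency bands, here is a minimal FFT-based band-pass sketch (the band edges and function names are our assumptions):

    import numpy as np

    def bandpass_fft(x, low_period, high_period):
        """Keep only cycles with periods in [low_period, high_period] observations."""
        n = len(x)
        freqs = np.fft.rfftfreq(n, d=1.0)   # cycles per observation
        fx = np.fft.rfft(x - x.mean())
        mask = np.zeros_like(freqs, dtype=bool)
        nonzero = freqs > 0
        periods = np.empty_like(freqs)
        periods[nonzero] = 1.0 / freqs[nonzero]
        mask[nonzero] = (periods[nonzero] >= low_period) & (periods[nonzero] <= high_period)
        return np.fft.irfft(fx * mask, n=n)

    # Illustrative quarterly bands: high frequency (< 8 quarters), business
    # cycle (8-32 quarters), medium term (32-120 quarters).
    x = np.random.randn(400).cumsum()       # stand-in for a macro series
    high   = bandpass_fft(x, 2, 8)
    cycle  = bandpass_fft(x, 8, 32)
    medium = bandpass_fft(x, 32, 120)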
By: | Gao, Yan; Zhang, Xinyu; Wang, Shouyang; Chong, Terence Tai Leung; Zou, Guohua |
Abstract: | This paper develops a frequentist model averaging approach for threshold model specifications. The resulting estimator is proved to be asymptotically optimal in the sense of achieving the lowest possible squared errors. In particular, when combining estimators from threshold autoregressive models, this approach is also proved to be asymptotically optimal. Simulation results show that in situations where the existing model averaging approach is not applicable, our proposed approach performs well (a sketch of squared-error-minimizing combination weights follows this entry); in the other situations, it performs marginally better than other commonly used model selection and model averaging methods. An empirical application of our approach to US unemployment data is given. |
Keywords: | Asymptotic optimality, Generalized cross-validation, Model averaging, Threshold model |
JEL: | C13 C52 |
Date: | 2017–11–28 |
URL: | http://d.repec.org/n?u=RePEc:pra:mprapa:92036&r=all |
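The paper's criterion is a generalized cross-validation statistic; the sketch below instead minimizes plain squared cross-validation error over simplex weights, which conveys the idea (the interface and the simplex constraint are our assumptions):

    import numpy as np
    from scipy.optimize import minimize

    def cv_weights(cv_preds, y):
        """Combination weights minimizing squared cross-validation error.

        cv_preds: (n_obs, n_models) out-of-fold predictions; y: (n_obs,) targets.
        """
        n_models = cv_preds.shape[1]
        sse = lambda w: np.sum((y - cv_preds @ w) ** 2)
        cons = ({'type': 'eq', 'fun': lambda w: w.sum() - 1.0},)
        bounds = [(0.0, 1.0)] * n_models
        w0 = np.full(n_models, 1.0 / n_models)
        return minimize(sse, w0, bounds=bounds, constraints=cons).x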
By: | Joshua C. C. Chan |
Abstract: | Bayesian vector autoregressions are widely used for macroeconomic forecasting and structural analysis. Until recently, however, most empirical work had considered only small systems with a few variables, due to parameter proliferation concerns and computational limitations. We first review a variety of shrinkage priors that are useful for tackling the parameter proliferation problem in large Bayesian VARs (a schematic of one common shrinkage prior follows this entry), followed by a detailed discussion of efficient sampling methods for overcoming the computational problem. We then give an overview of some recent models that incorporate various important model features into conventional large Bayesian VARs, including stochastic volatility and non-Gaussian and serially correlated errors. Efficient estimation methods for fitting these more flexible models are also discussed. These models and methods are illustrated using a forecasting exercise that involves a real-time macroeconomic dataset. The corresponding MATLAB code is also provided. |
Date: | 2019–02 |
URL: | http://d.repec.org/n?u=RePEc:een:camaaa:2019-19&r=all |
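The survey covers a range of priors; the classic Minnesota prior is one standard example, sketched below (the parameterization is a common textbook variant, and the paper's own code is in MATLAB):

    import numpy as np

    def minnesota_prior(n_vars, n_lags, lam=0.2, sigma=None):
        """Minnesota-style prior moments for VAR coefficients.

        Prior mean: own first lag = 1 (random-walk prior), all else 0.
        Prior variance shrinks with lag length and cross-variable scale;
        many variants add an extra cross-variable tightness factor.
        """
        sigma = np.ones(n_vars) if sigma is None else sigma
        mean = np.zeros((n_vars, n_vars * n_lags))
        var = np.zeros_like(mean)
        for i in range(n_vars):
            for l in range(1, n_lags + 1):
                for j in range(n_vars):
                    col = (l - 1) * n_vars + j
                    if l == 1 and i == j:
                        mean[i, col] = 1.0      # random-walk prior mean
                    scale = 1.0 if i == j else sigma[i] / sigma[j]
                    var[i, col] = (lam * scale / l) ** 2
        return mean, var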
By: | Chambers, Marcus J; Taylor, AM Robert |
Abstract: | We consider a model of deterministic one-time parameter change in a continuous time autoregressive model around a deterministic trend function. The exact discrete time analogue model is detailed and compared to corresponding parameter change models adopted in the discrete time literature (the simplest exact discretization is sketched after this entry). The relationships between the parameters in the continuous time model and the discrete time analogue model are also explored. Our results show that the discrete time models used in the literature can be justified by the corresponding continuous time model, with only a minor modification needed for the (most likely) case where the changepoint does not coincide with one of the discrete time observation points. The implications of our results for a number of extant discrete time models and testing procedures are discussed. |
Date: | 2019–02–14 |
URL: | http://d.repec.org/n?u=RePEc:esy:uefcwp:24072&r=all |
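For readers unfamiliar with exact discrete-time analogues, the first-order case without the paper's trend and break terms (our simplification) works as follows: sampling the process dx(t) = a x(t) dt + \sigma dW(t) at interval h gives exactly

    x_{th} = e^{ah} \, x_{(t-1)h} + \varepsilon_t,
    \qquad \varepsilon_t \sim N\!\left(0, \; \sigma^2 (e^{2ah} - 1) / (2a)\right),

so the discrete-time autoregressive coefficient e^{ah} maps one-to-one into the continuous-time parameter a.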
By: | Nikolaos Passalis; Anastasios Tefas; Juho Kanniainen; Moncef Gabbouj; Alexandros Iosifidis |
Abstract: | Deep Learning (DL) models can be used to tackle time series analysis tasks with great success. However, the performance of DL models can degrade rapidly if the data are not appropriately normalized. This issue is even more apparent when DL is used for financial time series forecasting tasks, where the non-stationary and multimodal nature of the data poses significant challenges and severely affects the performance of DL models. In this work, a simple yet effective neural layer that adaptively normalizes the input time series, taking into account the distribution of the data, is proposed (a sketch of such a layer follows this entry). The proposed layer is trained in an end-to-end fashion using back-propagation and can lead to significant performance improvements. The effectiveness of the proposed method is demonstrated using a large-scale limit order book dataset. |
Date: | 2019–02 |
URL: | http://d.repec.org/n?u=RePEc:arx:papers:1902.07892&r=all |
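The abstract does not detail the layer's architecture; below is a minimal PyTorch sketch of the general idea, a normalization layer whose shift and scale are learnable functions of each window's own statistics (the class name and structure are our assumptions, not the authors' exact architecture):

    import torch
    import torch.nn as nn

    class AdaptiveNorm(nn.Module):
        """Sketch of an adaptive input-normalization layer.

        Shifts and scales each input window using statistics of the window
        itself, with learnable linear maps deciding how the statistics are
        used. Trained end-to-end with the rest of the network.
        """
        def __init__(self, n_features):
            super().__init__()
            self.shift = nn.Linear(n_features, n_features, bias=False)
            self.scale = nn.Linear(n_features, n_features, bias=False)

        def forward(self, x):            # x: (batch, time, features)
            mu = x.mean(dim=1, keepdim=True)
            x = x - self.shift(mu)       # learnable shift from window mean
            sd = x.std(dim=1, keepdim=True) + 1e-8
            return x / self.scale(sd).abs().clamp_min(1e-8)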
By: | Fan Yingying; Lv Jinchi; Sharifvaghefi Mahrad; Uematsu Yoshimasa |
Abstract: | Interpretability and stability are two features desired in many contemporary big data applications arising in economics and finance. While the former is enjoyed to some extent by many existing forecasting approaches, the latter, in the sense of controlling the fraction of wrongly discovered features, which can greatly enhance interpretability, is still largely underdeveloped in econometric settings. To this end, we exploit the general framework of model-X knockoffs introduced recently in Candes, Fan, Janson and Lv (2018), which is unconventional for reproducible large-scale inference in that it is completely free of the use of p-values for significance testing, and suggest a new method of intertwined probabilistic factors decoupling (IPAD) for stable interpretable forecasting with knockoffs inference in high-dimensional models. The recipe of the method is to construct the knockoff variables by assuming a latent factor model, which is widely exploited in economics and finance for the association structure of covariates (a sketch of this construction follows this entry). Our method is distinct from the existing literature in several respects: we estimate the covariate distribution from the data instead of assuming it is known when constructing the knockoff variables; our procedure does not require any sample splitting; we provide theoretical justification for asymptotic false discovery rate control; and we establish the theory for the power analysis. Several simulation examples and a real data analysis further demonstrate that the newly suggested method has appealing finite-sample performance, with the desired interpretability and stability, compared to some popularly used forecasting methods. |
Date: | 2019–01 |
URL: | http://d.repec.org/n?u=RePEc:toh:dssraa:92&r=all |
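As a rough illustration of building knockoff-style variables from an estimated factor model (this follows the abstract's recipe only in spirit; the function name, PCA estimator, and residual resampling are our assumptions, and the paper's construction and guarantees are more involved):

    import numpy as np

    def factor_knockoffs(X, n_factors, rng=None):
        """Knockoff-style copies of X from an estimated latent factor model.

        Fit factors by PCA, then recombine the common component with
        resampled idiosyncratic residuals.
        """
        rng = np.random.default_rng(rng)
        Xc = X - X.mean(axis=0)
        U, s, Vt = np.linalg.svd(Xc, full_matrices=False)
        common = U[:, :n_factors] * s[:n_factors] @ Vt[:n_factors]
        resid = Xc - common
        # Resample residual rows to break the feature-response association
        resampled = resid[rng.permutation(len(resid))]
        return common + resampled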
By: | Pinto, Jeronymo Marcondes; Marçal, Emerson Fernandes |
Abstract: | Our paper evaluates two novel methods for selecting the best forecasting model, or combination of models, based on a machine learning approach. Both methods choose the "best" model, or combination of models, from a set of candidates by cross-validation. The first (MGB) is based on the seminal paper of Granger-Bates (1969), but the weights are estimated by cross-validation on the training set. The second selects the model with the best forecasting performance in the process described above, which we call CvML (Cross-Validation Machine Learning Method); its selection step is sketched after this entry. The following models are used: exponential smoothing, SARIMA, artificial neural networks and threshold autoregression (TAR). Model specifications are chosen via the R packages forecast and TSA. Both methods, CvML and MGB, are applied to these models to generate forecasts from one up to twelve periods ahead. The data are monthly series of industrial production indices for seven countries: Canada, Brazil, Belgium, Germany, Portugal, the UK and the USA, collected from the OECD database, with 504 observations. We choose the average forecast combination, the Granger-Bates method, the MCS model, and the naive and seasonal naive models as benchmarks. Our results suggest that MGB did not perform well. However, CvML had a lower mean absolute error for most countries and forecast horizons, particularly at longer horizons, surpassing all the proposed benchmarks. Similar results hold for the absolute mean forecast error. |
Date: | 2019–02 |
URL: | http://d.repec.org/n?u=RePEc:fgv:eesptd:498&r=all |
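A minimal sketch of the CvML selection step, choosing the model with the lowest expanding-window cross-validation error (the callable interface, window scheme, and error metric are our assumptions):

    import numpy as np

    def cvml_select(models, y, min_train=100, horizon=1):
        """Pick the model with the lowest expanding-window CV error.

        models: list of fit-and-forecast callables f(train) -> forecast
        of the next `horizon` values.
        """
        errors = np.zeros(len(models))
        for t in range(min_train, len(y) - horizon):
            train, target = y[:t], y[t:t + horizon]
            for m, f in enumerate(models):
                errors[m] += np.mean(np.abs(f(train) - target))
        return int(np.argmin(errors))

    # Usage with toy "models": a naive forecast and a mean forecast.
    naive = lambda tr: np.repeat(tr[-1], 1)
    mean_ = lambda tr: np.repeat(tr.mean(), 1)
    best = cvml_select([naive, mean_], np.random.randn(200), min_train=150)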
By: | Galí, Jordi; Gambetti, Luca |
Abstract: | Unconditional reduced form estimates of a conventional wage Phillips curve for the U.S. economy point to a decline in its slope coefficient in recent years, as well as a shrinking role of lagged price inflation in the determination of wage inflation (an illustrative textbook wage Phillips curve is sketched after this entry). We provide estimates of a conditional wage Phillips curve, based on a structural decomposition of wage, price and unemployment data generated by a VAR with time-varying coefficients, identified by a combination of long-run and sign restrictions. Our estimates show that the key qualitative findings from the unconditional reduced form regressions also emerge in the conditional evidence, suggesting that they are not entirely driven by endogeneity problems or possible changes over time in the importance of wage markup shocks. The conditional evidence, however, suggests that actual changes in the slope of the wage Phillips curve may not have been as large as implied by the unconditional estimates. |
JEL: | E24 E31 |
Date: | 2019–01 |
URL: | http://d.repec.org/n?u=RePEc:cpr:ceprdp:13452&r=all |
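For reference, a textbook wage Phillips curve of the kind the abstract refers to (our illustrative form, not the authors' exact specification):

    \Delta w_t = \alpha + \beta \, u_t + \gamma \, \pi_{t-1} + \varepsilon_t,

where \Delta w_t is wage inflation, u_t the unemployment rate, and \pi_{t-1} lagged price inflation; the abstract's findings concern a decline in the slope coefficient \beta and in the weight \gamma on lagged price inflation.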
By: | Carriero, Andrea; Corsello, Francesco; Marcellino, Massimiliano |
Abstract: | Global developments play an important role for domestic inflation rates. Earlier literature has found that a substantial amount of the variation in a large set of national inflation rates can be explained by a single global factor. However, inflation volatility has typically been neglected, even though it is clearly relevant both from a policy point of view and for structural analysis and forecasting. We study the evolution of inflation rates in several countries, using a novel model that allows for commonality in both levels and volatilities, in addition to country-specific components (a stylized version of this factor structure is sketched after this entry). We find that inflation volatility is indeed important, and a substantial fraction of it can be attributed to a global factor that also drives inflation levels and their persistence. While various phenomena may contribute to global inflation dynamics, it turns out that since the early 1990s the level and volatility of the estimated global factor have been correlated with Chinese PPI inflation and oil inflation. The extent of commonality among core inflation rates and volatilities is substantially smaller than for overall inflation, which leaves scope for national monetary policies. Finally, we show that the point and density forecasting performance of the model is good relative to standard benchmarks, which provides additional evidence on its reliability. |
Keywords: | Forecasting; Global factors; Inflation; Large datasets; Multivariate Autoregressive Index models; Reduced rank regressions; Volatility |
JEL: | C32 C53 E31 E37 |
Date: | 2019–01 |
URL: | http://d.repec.org/n?u=RePEc:cpr:ceprdp:13470&r=all |
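A stylized level-and-volatility factor structure of the kind the abstract describes (our schematic; the paper's Multivariate Autoregressive Index model with stochastic volatility is richer):

    \pi_{it} = \lambda_i f_t + e_{it},
    \qquad \log h_{it} = \gamma_i \log h_{g,t} + v_{it},

where f_t is the global level factor, h_{g,t} the global volatility factor, and \lambda_i, \gamma_i the loadings of country i; e_{it} and v_{it} are country-specific components.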
By: | Ekaterina V. Peneva; Nadia Sadee |
Abstract: | In this Note, we take another look at residual seasonality in several measures of core inflation. |
Date: | 2019–02–12 |
URL: | http://d.repec.org/n?u=RePEc:fip:fedgfn:2019-02-12-2&r=all |
By: | Adam Elbourne (CPB Netherlands Bureau for Economic Policy Analysis); Kan Ji (CPB Netherlands Bureau for Economic Policy Analysis) |
Abstract: | This research re-examines the findings of the existing literature on the effects of unconventional monetary policy. It concludes that the existing estimates, based on vector autoregressions in combination with zero and sign restrictions, do not successfully isolate unconventional monetary policy shocks from other shocks impacting the euro area economy. We show that altering existing published studies by imposing the incorrect assumption that expansionary monetary shocks shrink the ECB's balance sheet, or even by ignoring all information about the stance of monetary policy, results in the same shocks and, therefore, the same estimated responses of output and prices (the sign-restriction mechanics at issue are sketched after this entry). As a consequence, it is implausible that the shocks previously identified in the literature are true unconventional monetary policy shocks. Since correctly isolating unconventional monetary policy shocks is a prerequisite for estimating their effects, the conclusions from previous vector autoregression models are unwarranted. We show this lack of identification for different specifications of the vector autoregression models and different sample periods. |
JEL: | C32 E52 |
Date: | 2019–02 |
URL: | http://d.repec.org/n?u=RePEc:cpb:discus:391&r=all |
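A minimal sketch of the sign-restriction machinery the paper scrutinizes (the interface is our assumption; the paper's full VAR setup is richer). The paper's point is that flipping the sign check on the balance-sheet column leaves the accepted output and price responses essentially unchanged:

    import numpy as np

    def sign_restricted_draws(sigma_u, checks, n_draws=1000, rng=0):
        """Accept rotations of a VAR's Cholesky factor passing sign checks.

        sigma_u: residual covariance; checks: function taking an impact
        matrix and returning True if the candidate shock satisfies the
        sign restrictions.
        """
        rng = np.random.default_rng(rng)
        chol = np.linalg.cholesky(sigma_u)
        accepted = []
        for _ in range(n_draws):
            z = rng.standard_normal(sigma_u.shape)
            q, _ = np.linalg.qr(z)          # random orthogonal rotation
            impact = chol @ q
            if checks(impact):
                accepted.append(impact)
        return accepted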
By: | Canova, Fabio; Matthes, Christian |
Abstract: | We consider a set of potentially misspecified structural models, geometrically combine their likelihood functions (the combination is sketched after this entry), and estimate the parameters using composite methods. Composite estimators may be preferable to likelihood-based estimators in terms of mean squared error. Composite models may be superior to individual models in the Kullback-Leibler sense. We describe Bayesian quasi-posterior computations and compare the approach to Bayesian model averaging, finite mixture methods, and robustness procedures. We robustify inference using the composite posterior distribution of the parameters and the pool of models. We provide estimates of the marginal propensity to consume and evaluate the role of technology shocks for output fluctuations. |
Keywords: | Bayesian model averaging; composite likelihood; finite mixture; model misspecification |
JEL: | C13 C51 E17 |
Date: | 2019–02 |
URL: | http://d.repec.org/n?u=RePEc:cpr:ceprdp:13511&r=all |
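The geometric combination of likelihoods takes the standard composite-likelihood form (the paper's treatment of the weights and the quasi-posterior may differ in detail):

    L_C(\theta) = \prod_{k=1}^{K} L_k(\theta)^{\omega_k},
    \qquad \omega_k \ge 0, \; \sum_{k=1}^{K} \omega_k = 1,

where L_k is the likelihood of model k; the Bayesian quasi-posterior is then proportional to the prior times L_C(\theta).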
By: | Laséen, Stefan; Lindé, Jesper; Ratto, Marco |
Abstract: | In this paper, we study identification and misspecification problems in standard closed- and open-economy empirical New-Keynesian DSGE models used in monetary policy analysis. We find that model misspecification still appears to be a first-order issue in monetary DSGE models, and argue that it is model misspecification that may benefit the most from moving from a classical to a Bayesian framework. We also argue that lack of identification should neither be ignored nor be assumed to affect all DSGE models. Fortunately, identification problems can be readily assessed on a case-by-case basis by applying recently developed pre-tests of identification. |
Keywords: | Bayesian estimation; Closed economy; DSGE model; Maximum Likelihood Estimation; Monte-Carlo methods; Open economy |
JEL: | C13 C51 E30 |
Date: | 2019–01 |
URL: | http://d.repec.org/n?u=RePEc:cpr:ceprdp:13492&r=all |