on Econometric Time Series
Issue of 2021‒03‒01
fourteen papers chosen by Jaqueson K. Galimberti, Auckland University of Technology
By: | Alain Hecq; Marie Ternes; Ines Wilms |
Abstract: | Mixed-frequency Vector AutoRegressions (MF-VAR) model the dynamics between variables recorded at different frequencies. However, as the number of series and high-frequency observations per low-frequency period grow, MF-VARs suffer from the "curse of dimensionality". We curb this curse through a regularizer that permits various hierarchical sparsity patterns by prioritizing the inclusion of coefficients according to the recency of the information they contain. Additionally, we investigate the presence of nowcasting relations by sparsely estimating the MF-VAR error covariance matrix. We study predictive Granger causality relations in a MF-VAR for the U.S. economy and construct a coincident indicator of GDP growth. |
Date: | 2021–02 |
URL: | http://d.repec.org/n?u=RePEc:arx:papers:2102.11780&r=all |
By: | Tobias Hartl |
Abstract: | This paper aims to provide reliable estimates for the COVID-19 contact rate of a Susceptible-Infected-Recovered (SIR) model. From observable data on confirmed, recovered, and deceased cases, a noisy measurement of the contact rate can be constructed. To filter out measurement errors and seasonality, a novel unobserved components (UC) model is set up. It specifies the log contact rate as a latent, fractionally integrated process of unknown integration order. The fractional specification reflects key characteristics of aggregate social behavior such as strong persistence and gradual adjustment to new information. A computationally simple modification of the Kalman filter, termed the fractional filter, is introduced. It makes it possible to estimate UC models with richer long-run dynamics, and provides a closed-form expression for the prediction error of UC models. Based on the latter, a conditional-sum-of-squares (CSS) estimator for the model parameters is set up and shown to be consistent and asymptotically normally distributed. The resulting contact rate estimates for several countries are well in line with the chronology of the pandemic, and permit the identification of different contact regimes generated by policy interventions. As the fractional filter is shown to provide precise contact rate estimates at the end of the sample, it bears great potential for monitoring the pandemic in real time. |
Date: | 2021–02 |
URL: | http://d.repec.org/n?u=RePEc:arx:papers:2102.10067&r=all |
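The observable-data construction described in the abstract can be sketched with a discretised SIR recursion (a minimal illustration with toy case counts and an assumed population size; not the author's code):

```python
# Hypothetical sketch: backing out a noisy contact-rate measurement from a
# discretised SIR model, given cumulative confirmed (C), recovered (R) and
# deceased (D) counts. All numbers below are toy data, not real figures.
N = 1_000_000  # assumed population size

C = [100, 150, 225, 330, 480]   # cumulative confirmed
R = [0, 12, 35, 70, 116]        # cumulative recovered
D = [0, 0, 0, 0, 0]             # cumulative deceased

S = [N - c for c in C]                        # susceptible
I = [c - r - d for c, r, d in zip(C, R, D)]   # currently infected

# Discrete SIR: S_{t+1} - S_t = -beta_t * S_t * I_t / N  =>  solve for beta_t
beta = [-N * (S[t + 1] - S[t]) / (S[t] * I[t]) for t in range(len(S) - 1)]
```

Each element of `beta` is a noisy period-by-period measurement of the contact rate; the paper's UC model would then filter such a series.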
By: | Ding, Y. |
Abstract: | We propose a model that extends Smetanina's (2017) original RT-GARCH model by allowing conditional heteroskedasticity in the variance of the volatility process. We show that, in this simple setting, both volatility and the volatility of volatility can be filtered and forecast simultaneously. The volatility forecast function follows a second-order difference equation, as opposed to the first-order equation under GARCH(1,1) and RT-GARCH(1,1). Empirical studies confirm the presence of conditional heteroskedasticity in the volatility process, and the standardised residuals of returns are close to Gaussian under this model. We show that the model yields better in-sample nowcasts and out-of-sample forecasts of volatility. |
Keywords: | GARCH, diffusion limit, forecasting, volatility of volatility |
JEL: | C22 C32 C53 C58 |
Date: | 2021–02–16 |
URL: | http://d.repec.org/n?u=RePEc:cam:camdae:2112&r=all |
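For comparison, the first-order forecast recursion of a standard GARCH(1,1), which the paper's model generalises to second order, can be sketched as follows (parameter values are illustrative assumptions, not estimates from the paper):

```python
# Standard GARCH(1,1) k-step-ahead variance forecast:
#   E[h_{t+k}] = alpha0 + (alpha1 + beta1) * E[h_{t+k-1}]
# a first-order difference equation decaying geometrically toward the
# unconditional variance. Parameters below are assumed for illustration.
alpha0, alpha1, beta1 = 0.05, 0.08, 0.9
h = 1.5                                   # current conditional variance
forecasts = []
for _ in range(10):
    h = alpha0 + (alpha1 + beta1) * h     # forecast recursion
    forecasts.append(h)

long_run = alpha0 / (1 - alpha1 - beta1)  # unconditional variance (= 2.5 here)
# forecasts increase monotonically toward long_run
```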
By: | Luke De Clerk; Sergey Savel'ev |
Abstract: | We analyse a GARCH(1,1) model with the aim of fitting the higher-order moments of different companies' stock prices. When we assume a Gaussian conditional distribution, the model fails to capture the empirical data. We show instead that a double-Gaussian conditional probability better captures the higher-order moments of the data. To demonstrate this point, we construct regions (phase diagrams) in higher-order moment space where a GARCH(1,1) model can fit the higher-order moments, and compare them with empirical data from different sectors of the economy. We find that the ability of the GARCH model to fit higher-order moments is dictated by the time window our data spans. In particular, a GARCH(1,1) model with a double-Gaussian conditional probability (a GARCH-double-normal model) cannot necessarily fit the statistical moments of a given time series: the GARCH-double-normal model only allows the fitting of time series of specific lengths. This is indicated by the migration of the companies' data out of the region of the phase diagram where GARCH is able to fit these higher-order moments. To overcome the non-stationarity of our modelling, we assume that one of the parameters of the GARCH model, $\alpha_0$, is time dependent. |
Date: | 2021–02 |
URL: | http://d.repec.org/n?u=RePEc:arx:papers:2102.11627&r=all |
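A minimal sketch of the double-Gaussian idea (mixture weights and all parameter values are our assumptions, not the paper's): simulate a GARCH(1,1) series whose innovations are drawn from an equal-weight mixture of two Gaussians, and verify that its kurtosis exceeds the Gaussian value of 3.

```python
import math
import random

random.seed(0)
alpha0, alpha1, beta1 = 0.1, 0.1, 0.8  # assumed GARCH(1,1) parameters

def double_gaussian():
    # Equal-weight mixture of N(0, 0.5^2) and N(0, 1.323^2); the scales are
    # chosen so the mixture has approximately unit variance.
    sigma = 0.5 if random.random() < 0.5 else 1.323
    return random.gauss(0.0, sigma)

h, returns = 1.0, []
for _ in range(20000):
    eps = double_gaussian()
    r = math.sqrt(h) * eps
    returns.append(r)
    h = alpha0 + alpha1 * r * r + beta1 * h  # GARCH(1,1) recursion

m2 = sum(r ** 2 for r in returns) / len(returns)
m4 = sum(r ** 4 for r in returns) / len(returns)
kurtosis = m4 / m2 ** 2  # exceeds 3: heavier tails than a Gaussian
```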
By: | Jiangtao Duan; Jushan Bai; Xu Han |
Abstract: | This paper estimates the break point for large-dimensional factor models with a single structural break in factor loadings at a common unknown date. First, we propose a quasi-maximum likelihood (QML) estimator of the change point based on the second moments of the factors, which are estimated by principal component analysis. We show that the QML estimator is consistent when the covariance matrix of the pre- or post-break factor loadings, or both, is singular. When the loading matrix undergoes a rotational type of change while the number of factors remains constant over time, the QML estimator incurs a stochastically bounded estimation error. In this case, we establish the asymptotic distribution of the QML estimator. Simulation results confirm the good finite-sample performance of the estimator. In addition, we demonstrate empirical applications of the proposed method by applying it to estimate the break points in a U.S. macroeconomic dataset and a stock return dataset. |
Date: | 2021–02 |
URL: | http://d.repec.org/n?u=RePEc:arx:papers:2102.12666&r=all |
By: | Xiuqin Xu; Ying Chen |
Abstract: | The volatility of financial asset returns can be used to gauge risk in the financial market. We propose a deep stochastic volatility model (DSVM) based on the framework of deep latent variable models. It uses flexible deep learning models to automatically detect the dependence of future volatility on past returns, past volatilities, and stochastic noise, and thus provides a flexible volatility model without the need to manually select features. We develop a scalable inference and learning algorithm based on variational inference. In real data analysis, the DSVM outperforms several popular alternative volatility models. In addition, compared with commonly used GARCH-type models on a large U.S. stock market data set, the predicted volatility of the DSVM provides a more reliable risk measure that better reflects the risk in the financial market, rising more quickly to a higher level when the market becomes riskier and falling to a lower level when the market is more stable. |
Date: | 2021–02 |
URL: | http://d.repec.org/n?u=RePEc:arx:papers:2102.12658&r=all |
By: | Jianqing Fan; Ricardo Masini; Marcelo C. Medeiros |
Abstract: | Factor models and sparse models are two widely used methods for imposing a low-dimensional structure in high dimensions, and they are seemingly mutually exclusive. In this paper, we propose a simple lifting method that combines the merits of both in a supervised learning methodology that efficiently exploits all the information in high-dimensional datasets. The method is based on a very flexible linear model for panel data, called the factor-augmented regression model, with observable and latent common factors as well as idiosyncratic components as high-dimensional covariates. This model not only includes both factor regression and sparse regression as special cases but also significantly weakens the cross-sectional dependence, and hence facilitates model selection and interpretability. The methodology consists of three steps; at each step, the remaining cross-sectional dependence can be inferred by a novel test for covariance structure in high dimensions. We develop asymptotic theory for the factor-augmented sparse regression model and demonstrate the validity of the multiplier bootstrap for testing high-dimensional covariance structure. This is further extended to testing high-dimensional partial covariance structures. The theory and methods are supported by an extensive simulation study and by applications to the construction of a partial covariance network of financial returns for the constituents of the S\&P500 index and to a prediction exercise for a large panel of macroeconomic time series from the FRED-MD database. |
Date: | 2021–02 |
URL: | http://d.repec.org/n?u=RePEc:arx:papers:2102.11341&r=all |
By: | Fischer, Manfred M.; Hauzenberger, Niko; Huber, Florian; Pfarrhofer, Michael |
Abstract: | Time-varying parameter (TVP) regressions commonly assume that time-variation in the coefficients is determined by a simple stochastic process such as a random walk. While such models are capable of capturing a wide range of dynamic patterns, the true nature of time variation might stem from other sources, or arise from different laws of motion. In this paper, we propose a flexible TVP VAR that assumes the TVPs to depend on a panel of partially latent covariates. The latent parts of these covariates differ in their state dynamics and thus capture smoothly evolving or abruptly changing coefficients. To determine which of these covariates are important, and thus to decide on the appropriate state evolution, we introduce Bayesian shrinkage priors to perform model selection. As an empirical application, we forecast the US term structure of interest rates and show that our approach performs well relative to a set of competing models. We then show how the model can be used to explain structural breaks in coefficients related to the US yield curve. |
Keywords: | Bayesian shrinkage, interest rate forecasting, latent effect modifiers, MCMC sampling, time-varying parameter regression
Date: | 2021–02–22 |
URL: | http://d.repec.org/n?u=RePEc:wiw:wus046:8006&r=all |
By: | Sergey Seleznev (Bank of Russia, Russian Federation); Natalia Turdyeva (Bank of Russia, Russian Federation); Ramis Khabibullin (Bank of Russia, Russian Federation); Anna Tsvetkova (Bank of Russia, Russian Federation) |
Abstract: | This paper describes the seasonal adjustment algorithm used by the Bank of Russia to clean up data for its weekly publication ‘Monitoring of Sectoral Financial Flows’. We have developed a simple and fast procedure, based on a set of trigonometric functions and dummy variables, that demonstrates good results in terms of various quality metrics and can easily be modified to work with more flexible model specifications. |
Keywords: | daily seasonal adjustment, time series, sectoral financial flows, Bayesian estimator
JEL: | C11 C22 E32 E37 |
Date: | 2020–12 |
URL: | http://d.repec.org/n?u=RePEc:bkr:wpaper:wps65&r=all |
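The trigonometric part of such a procedure can be sketched as follows (a minimal illustration with a toy daily series and a weekly cycle; not the Bank of Russia's actual code, which also includes dummy variables and Bayesian estimation). Over an integer number of cycles, sin/cos regressors at the seasonal Fourier frequencies are orthogonal, so each coefficient reduces to a simple projection:

```python
import math

period = 7                      # weekly seasonality in daily data
T = 28 * period                 # an integer number of weekly cycles
# Toy series: level + one weekly harmonic + a non-seasonal component
y = [10.0 + 2.0 * math.sin(2 * math.pi * t / period)
     + 0.1 * math.cos(0.7 * t) for t in range(T)]

seasonal = [0.0] * T
for k in range(1, period // 2 + 1):        # harmonics k = 1, 2, 3
    s = [math.sin(2 * math.pi * k * t / period) for t in range(T)]
    c = [math.cos(2 * math.pi * k * t / period) for t in range(T)]
    # OLS coefficients via orthogonal projection: sum(s^2) = T/2
    b_s = 2.0 / T * sum(yt * st for yt, st in zip(y, s))
    b_c = 2.0 / T * sum(yt * ct for yt, ct in zip(y, c))
    seasonal = [sz + b_s * st + b_c * ct
                for sz, st, ct in zip(seasonal, s, c)]

adjusted = [yt - st for yt, st in zip(y, seasonal)]  # seasonally adjusted
```

The estimated seasonal component recovers the injected weekly harmonic; richer specifications would add holiday dummies and estimate all coefficients jointly.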
By: | Gregory Casey (Williams College); Marc Klemp (University of Copenhagen) |
Abstract: | We provide a simple framework for interpreting instrumental variable regressions when there is a gap in time between the impact of the instrument and the measurement of the endogenous variable, highlighting a particular violation of the exclusion restriction that can arise in this setting. In the presence of this violation, conventional IV regressions do not consistently estimate a structural parameter of interest. Building on our framework, we develop a simple empirical method to estimate the long-run effect of the endogenous variable. We use our bias correction method to examine the role of institutions in economic development, following Acemoglu et al. (2001). We find long-run coefficients that are smaller than the coefficients from the original work, demonstrating the quantitative importance of our framework. |
Keywords: | Long-Run Economic Development, Instrumental Variable Regression |
JEL: | C10 C30 O10 O40 |
Date: | 2021–01–14 |
URL: | http://d.repec.org/n?u=RePEc:wil:wileco:2021-02&r=all |
By: | Zhen Zeng; Tucker Balch; Manuela Veloso |
Abstract: | Time series forecasting is essential for decision making in many domains. In this work, we address the challenge of predicting the price evolution of multiple potentially interacting financial assets, a problem of obvious importance for governments, banks, and investors. Statistical methods such as the Auto Regressive Integrated Moving Average (ARIMA) model are widely applied to these problems. In this paper, we propose a novel approach to the economic time series forecasting of multiple financial assets via video prediction. Given past prices of multiple potentially interacting financial assets, we aim to predict their future evolution. Instead of treating the snapshot of prices at each time point as a vector, we spatially lay out these prices in 2D as an image, so that we can harness the power of CNNs in learning a latent representation for these financial assets. The history of these prices thus becomes a sequence of images, and our goal becomes predicting future images. We build on a state-of-the-art video prediction method for forecasting future images. Our experiments involve predicting the price evolution of nine financial assets traded in U.S. stock markets. The proposed method outperforms baselines including ARIMA, Prophet, and variations of the proposed method, demonstrating the benefits of harnessing the power of CNNs for economic time series forecasting. |
Date: | 2021–02 |
URL: | http://d.repec.org/n?u=RePEc:arx:papers:2102.12061&r=all |
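The vector-to-image layout can be illustrated in a few lines (the 3x3 arrangement and the price values are hypothetical; the paper does not specify this exact layout):

```python
# One snapshot of nine asset prices, laid out as a 3x3 "image" so that a
# sequence of snapshots becomes a sequence of images for a CNN.
prices_t = [101.2, 98.7, 350.1, 45.3, 12.9, 220.0, 77.7, 58.4, 133.3]

rows, cols = 3, 3  # assumed layout for nine assets
image = [prices_t[r * cols:(r + 1) * cols] for r in range(rows)]
# image[r][c] holds the price of asset r*cols + c under this layout
```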
By: | Andrea Carriero; Todd E. Clark; Massimiliano Marcellino; Elmar Mertens |
Abstract: | Incoming data in 2020 posed sizable challenges for the use of VARs in economic analysis: Enormous movements in a number of series have had strong effects on parameters and forecasts constructed with standard VAR methods. We propose the use of VAR models with time-varying volatility that include a treatment of the COVID extremes as outlier observations. Typical VARs with time-varying volatility assume changes in uncertainty to be highly persistent. Instead, we adopt an outlier-adjusted stochastic volatility (SV) model for VAR residuals that combines transitory and persistent changes in volatility. In addition, we consider the treatment of outliers as missing data. Evaluating forecast performance over the last few decades in quasi-real time, we find that the outlier-augmented SV scheme does at least as well as a conventional SV model, while both outperform standard homoskedastic VARs. Point forecasts made in 2020 from heteroskedastic VARs are much less sensitive to outliers in the data, and the outlier-adjusted SV model generates more reasonable gauges of forecast uncertainty than a standard SV model. At least pre-COVID, a close alternative to the outlier-adjusted model is an SV model with t-distributed shocks. Treating outliers as missing data also generates better-behaved forecasts than the conventional SV model. However, since uncertainty about the incidence of outliers is ignored in that approach, it leads to strikingly tight predictive densities. |
Keywords: | Bayesian VARs; stochastic volatility; outliers; pandemics; forecasts |
JEL: | C53 E17 E37 F47 |
Date: | 2021–02–02 |
URL: | http://d.repec.org/n?u=RePEc:fip:fedcwq:89757&r=all |
By: | Jaqueson Galimberti (School of Economics, Faculty of Business, Economics and Law at AUT University) |
Abstract: | This paper evaluates how the way agents weight information when forming expectations can affect the econometric estimation of models with adaptive learning. One key new finding is that misspecification of the uncertainty about initial beliefs under constant-gain least squares learning can generate a time-varying profile of the weights given to past observations, distorting the estimation and behavioural interpretation of this mechanism in small samples of data. This result is derived under a new representation of the learning algorithm that penalizes the effects of misspecification of the learning initials. Simulations of a forward-looking Phillips curve model with learning indicate that (i) misspecification of the uncertainty about initials can lead to substantial biases in estimates of the relevance of expectations for inflation, and (ii) these biases can spill over to estimates of the responsiveness of inflation to output gaps. An empirical application with U.S. data shows the relevance of these effects. |
Keywords: | expectations, adaptive learning, bounded rationality, macroeconomics |
JEL: | E70 D83 D84 D90 E37 C32 C63 |
Date: | 2021–02 |
URL: | http://d.repec.org/n?u=RePEc:aut:wpaper:202101&r=all |
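Constant-gain least squares learning itself can be sketched in a scalar setting (the gain value, data-generating process, and initial beliefs below are our assumptions for illustration): the constant gain implies geometrically declining weights on past observations, which is what sensitivity to the initial beliefs distorts in small samples.

```python
import random

random.seed(1)
gamma = 0.05          # constant gain (assumed)
theta, R = 0.0, 1.0   # initial belief and second-moment estimate (assumed)

true_theta = 2.0      # parameter of the toy data-generating process
for _ in range(2000):
    x = random.gauss(1.0, 1.0)
    y = true_theta * x + random.gauss(0.0, 0.1)
    R = R + gamma * (x * x - R)                        # moment recursion
    theta = theta + gamma * (x / R) * (y - x * theta)  # belief update
# theta settles in a neighbourhood of true_theta
```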
By: | Barend Abeln; Jan P.A.M. Jacobs |
Abstract: | The COVID-19 crisis has had a huge impact on economies all over the world. In this note we compare seasonal adjustments of X13 and CAMPLET before and after the COVID-19 crisis. We show results of quasi-real-time analyses for quarterly real GDP and monthly Consumption of Households in the Netherlands, and STL and CAMPLET seasonal adjustments for the weekly series of US Initial Claims. We find that differences in seasonally adjusted (SA) values are generally small and that X13 and STL seasonal adjustments are subject to revision. From the analysis of the weekly initial claims series we learn that STL and CAMPLET SA values closely follow the non-seasonally-adjusted values. In addition, the COVID-19 crisis caused a structural increase in initial claims: before the crisis, initial claims fluctuated around a lower level than after it. |
Keywords: | COVID-19 crisis, seasonal adjustment, real GDP, consumption of households, initial claims
Date: | 2021–02–18 |
URL: | http://d.repec.org/n?u=RePEc:cir:cirwor:2021s-05&r=all |