New Economics Papers on Econometric Time Series
By: | Koen Jochmans; Taisuke Otsu |
Abstract: | The use of two-way fixed-effect models is widespread. The presence of incidental parameter bias, however, invalidates statistical inference based on the likelihood. In this paper we consider modifications to the (profile) likelihood that yield asymptotically unbiased estimators as well as likelihood-ratio and score tests with correct size. The modifications are widely applicable and easy to implement. Our examples illustrate that the modifications can lead to dramatic improvements relative to the maximum likelihood method in terms of both point estimation and inference. |
Keywords: | asymptotic bias, bias correction, fixed effects, information bias, modified profile likelihood, panel data, MCMC, penalization, rectangular-array asymptotics |
JEL: | C12 C14 |
Date: | 2018–02 |
URL: | http://d.repec.org/n?u=RePEc:cep:stiecm:598&r=ets |
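The abstract above concerns incidental-parameter bias in fixed-effect likelihoods. A classic one-way illustration of the mechanism (the Neyman–Scott problem) is sketched below: the profile MLE of the error variance is biased by a factor (T-1)/T, and a simple degrees-of-freedom adjustment removes the bias. This is a generic textbook sketch, not the authors' modified-profile-likelihood estimator.

```python
import numpy as np

# Neyman-Scott illustration of incidental-parameter bias (one-way fixed effects):
# y_it = alpha_i + eps_it with eps_it ~ N(0, sigma2), N units, T periods.
rng = np.random.default_rng(0)
N, T, sigma2 = 500, 4, 1.0
y = rng.normal(size=(N, 1)) + rng.normal(scale=np.sqrt(sigma2), size=(N, T))

resid = y - y.mean(axis=1, keepdims=True)        # profile out the fixed effects
sigma2_mle = (resid ** 2).sum() / (N * T)        # profile MLE: converges to sigma2*(T-1)/T
sigma2_adj = (resid ** 2).sum() / (N * (T - 1))  # degrees-of-freedom corrected estimator

print(sigma2_mle, sigma2_adj)  # roughly 0.75 vs 1.00 for T = 4
```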
By: | Javier Hidalgo; Marcia M Schafgans |
Abstract: | This paper addresses inference in large panel data models in the presence of both cross-sectional and temporal dependence of unknown form. We are interested in making inferences without relying on the choice of any smoothing parameter, as is the case with the often employed HAC estimator for the covariance matrix. To that end, we propose a cluster estimator for the asymptotic covariance of the estimators and a valid bootstrap which accommodates the nonparametric nature of both temporal and cross-sectional dependence. Our approach is based on the observation that the spectral representation of the fixed effect panel data model is such that the errors become approximately temporally uncorrelated. Our proposed bootstrap can be viewed as a wild bootstrap in the frequency domain. We present Monte Carlo simulations to shed light on the small sample performance of our inferential procedure and illustrate our results using an empirical example. |
Keywords: | Large panel data models, cross-sectional strong dependence, central limit theorems, clustering, discrete Fourier transformation, nonparametric bootstrap algorithms |
JEL: | C12 C13 C23 |
Date: | 2017–12 |
URL: | http://d.repec.org/n?u=RePEc:cep:stiecm:597&r=ets |
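The bootstrap in the abstract above rests on the observation that, after a discrete Fourier transformation, the errors are approximately uncorrelated across frequencies, so a wild bootstrap can perturb the transformed residuals independently. The following is a minimal sketch of that generic idea for a single series, using hypothetical Rademacher weights; it is not the authors' exact algorithm for the fixed-effect panel model.

```python
import numpy as np

def frequency_domain_wild_bootstrap(resid, rng):
    """One wild-bootstrap replicate: perturb the DFT of the residuals with external weights."""
    d = np.fft.rfft(resid - resid.mean())        # DFT coefficients (approximately uncorrelated)
    w = rng.choice([-1.0, 1.0], size=d.shape)    # Rademacher weights, one per frequency
    return np.fft.irfft(d * w, n=len(resid))     # map back to the time domain

rng = np.random.default_rng(0)
resid = rng.standard_normal(200)
boot_resid = frequency_domain_wild_bootstrap(resid, rng)
```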
By: | Giovanni Angelini; Emanuele Bacchiocchi; Giovanni Caggiano; Luca Fanelli |
Abstract: | We propose a new non-recursive identification scheme for uncertainty shocks, which exploits breaks in the unconditional volatility of macroeconomic variables. This identification approach allows us to simultaneously address two major questions in the empirical literature on uncertainty: (i) Does the relationship between uncertainty and economic activity change across macroeconomic regimes? (ii) Is uncertainty a major cause or effect (or both) of decline in economic activity? Empirical results based on a small-scale VAR with US monthly data for the period 1960-2015 suggest that (i) the effects of uncertainty shocks are regime-dependent, and (ii) uncertainty is an exogenous source of decline in economic activity, rather than an endogenous response to it. |
Keywords: | heteroscedasticity, identification, non-recursive SVAR, uncertainty shocks, volatility regime |
JEL: | C32 C51 E44 G01 |
Date: | 2017 |
URL: | http://d.repec.org/n?u=RePEc:ces:ceswps:_6799&r=ets |
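The paper above identifies uncertainty shocks through breaks in unconditional volatility. A standard way to exploit two volatility regimes, in the spirit of identification through heteroskedasticity, is to diagonalize the two regime covariance matrices simultaneously. The sketch below shows that recovery under the assumptions Sigma1 = B B' and Sigma2 = B Lam B'; it need not coincide with the authors' full non-recursive scheme.

```python
import numpy as np
from scipy.linalg import eigh

def impact_matrix_from_regimes(sigma1, sigma2):
    """Recover B (up to column order and sign) from Sigma1 = B B' and Sigma2 = B Lam B'."""
    lam, V = eigh(sigma2, sigma1)   # generalized eigenproblem; V.T @ sigma1 @ V = I
    B = np.linalg.inv(V.T)
    return B, lam

# quick check on simulated regime covariances
rng = np.random.default_rng(0)
B_true = rng.standard_normal((3, 3))
lam_true = np.diag([0.5, 1.0, 4.0])
B_hat, lam_hat = impact_matrix_from_regimes(B_true @ B_true.T, B_true @ lam_true @ B_true.T)
assert np.allclose(B_hat @ B_hat.T, B_true @ B_true.T)
```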
By: | Fisher, Mark (Federal Reserve Bank of Atlanta); Jensen, Mark J. (Federal Reserve Bank of Atlanta) |
Abstract: | Change point models using hierarchical priors share in the information of each regime when estimating the parameter values of a regime. Because of this sharing, hierarchical priors have been very successful when estimating the parameter values of short-lived regimes and predicting the out-of-sample behavior of the regime parameters. However, the hierarchical priors have been parametric. Their parametric nature leads to global shrinkage that biases the parameter estimates of extraordinary regimes toward the value of the average regime. To overcome this shrinkage, we model the hierarchical prior nonparametrically by letting the hyperparameter's prior (that is, the hyperprior) be unknown and modeling it with a Dirichlet process prior. To apply a nonparametric hierarchical prior to the probability of a break occurring, we extend the change point model to a multiple-change-point panel model. The hierarchical prior then shares in the cross-sectional information of the break processes to estimate the transition probabilities. We apply our multiple-change-point panel model to a longitudinal data set of actively managed, U.S. equity, mutual fund returns to measure fund performance and investigate the chances of a skilled fund being skilled in the future. |
Keywords: | Bayesian nonparametric analysis; change points; Dirichlet process; hierarchical priors; mutual fund performance |
JEL: | C11 C14 C41 G11 G17 |
Date: | 2018–02–01 |
URL: | http://d.repec.org/n?u=RePEc:fip:fedawp:2018-02&r=ets |
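The paper above replaces the parametric hyperprior with a Dirichlet process. A truncated stick-breaking construction is the usual way to draw from such a nonparametric prior; the sketch below illustrates only that building block (the base measure and truncation level are hypothetical), not the authors' full multiple-change-point panel sampler.

```python
import numpy as np

def draw_dp_stick_breaking(alpha, base_sampler, n_atoms, rng):
    """Draw a truncated random distribution G ~ DP(alpha, G0) via stick breaking."""
    betas = rng.beta(1.0, alpha, size=n_atoms)
    sticks = np.concatenate(([1.0], np.cumprod(1.0 - betas)[:-1]))
    weights = betas * sticks          # weights sum to just under 1; remainder is the tail
    atoms = base_sampler(n_atoms)     # atom locations drawn from the base measure G0
    return weights, atoms

rng = np.random.default_rng(0)
# hypothetical Beta(1, 9) base measure for a regime-break probability
weights, atoms = draw_dp_stick_breaking(2.0, lambda n: rng.beta(1.0, 9.0, size=n), 50, rng)
```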
By: | He, Zhongfang |
Abstract: | This paper proposes a class of parametric correlation models that apply a two-layer autoregressive-moving-average structure to the dynamics of correlation matrices. The proposed model contains the Dynamic Conditional Correlation model of Engle (2002) and the Varying Correlation model of Tse and Tsui (2002) as special cases and offers greater flexibility in a parsimonious way. Performance of the proposed model is illustrated in a simulation exercise and an application to the U.S. stock indices. |
Keywords: | ARMA, Bayes, MCMC, multivariate GARCH, time series |
JEL: | C01 C11 C13 C58 |
Date: | 2018–02–10 |
URL: | http://d.repec.org/n?u=RePEc:pra:mprapa:84820&r=ets |
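The model above nests the Dynamic Conditional Correlation model of Engle (2002) as a special case. As a point of reference, the sketch below implements the standard DCC(1,1) correlation recursion for GARCH-standardised residuals; it covers only that special case, not the two-layer ARMA structure proposed in the paper.

```python
import numpy as np

def dcc_correlation_path(std_resid, a, b):
    """DCC(1,1): Q_t = (1-a-b) S + a e_{t-1} e_{t-1}' + b Q_{t-1}, with R_t = corr(Q_t)."""
    T, N = std_resid.shape
    S = np.corrcoef(std_resid, rowvar=False)   # unconditional correlation target
    Q = S.copy()
    R = np.empty((T, N, N))
    for t in range(T):
        d = 1.0 / np.sqrt(np.diag(Q))
        R[t] = Q * np.outer(d, d)              # rescale Q_t to a correlation matrix
        e = std_resid[t]
        Q = (1.0 - a - b) * S + a * np.outer(e, e) + b * Q
    return R

rng = np.random.default_rng(0)
R_path = dcc_correlation_path(rng.standard_normal((250, 3)), a=0.05, b=0.93)
```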
By: | Yuzhi Cai (School of Management, Swansea University); Julian Stander (Plymouth University) |
Abstract: | This paper develops a novel density forecasting method, which does not require estimation of the model itself, for financial time series following a threshold GARCH model. Instead, Bayesian inference is performed about an induced multiple threshold one-step ahead value-at-risk process at a single quantile level. This is achieved by a quasi-likelihood approach that uses quantile information. We describe simulation studies that provide insight into our method and illustrate it using empirical work on market returns. The results show that our forecasting method outperforms some benchmark models for density forecasting of financial returns. |
Keywords: | Density forecasting, multiple thresholds, one-step ahead value-at-risk (VaR), quantile regression, quasi-likelihood. |
JEL: | C1 C5 |
Date: | 2018–02–27 |
URL: | http://d.repec.org/n?u=RePEc:swn:wpaper:2018-23&r=ets |
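The object being forecast in the paper above is the one-step-ahead value-at-risk implied by a threshold GARCH process at a single quantile level. The sketch below computes that VaR for a basic two-regime (GJR-type) threshold GARCH(1,1) with hypothetical parameter values and a Gaussian innovation assumption; the paper's quasi-likelihood approach avoids estimating the GARCH model itself, so this only illustrates the target quantity.

```python
import numpy as np
from scipy.stats import norm

def one_step_var_tgarch(returns, omega, alpha, gamma, beta, tau):
    """One-step-ahead VaR at level tau implied by a GJR-type threshold GARCH(1,1)."""
    sigma2 = np.var(returns)  # initialise at the sample variance
    for r in returns:
        # sigma2_t = omega + (alpha + gamma * 1[r_{t-1} < 0]) * r_{t-1}^2 + beta * sigma2_{t-1}
        sigma2 = omega + (alpha + gamma * (r < 0.0)) * r ** 2 + beta * sigma2
    return np.sqrt(sigma2) * norm.ppf(tau)  # Gaussian innovations assumed here

rng = np.random.default_rng(0)
var_5pct = one_step_var_tgarch(rng.standard_normal(500) * 0.01, 1e-6, 0.05, 0.08, 0.90, 0.05)
```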
By: | Yuzhi Cai (School of Management, Swansea University); Guodong Li (University of Hong Kong) |
Abstract: | We develop a novel quantile function threshold GARCH model for studying the distribution function, rather than the volatility function, of financial returns that follow a threshold GARCH model. We propose a Bayesian method that performs estimation and forecasting simultaneously, which allows us to handle multiple thresholds easily and ensures that the forecasts take into account the variation in the model parameters. We apply the method to simulated data and Nasdaq returns. We show that our model is robust to model specification errors and outperforms some commonly used threshold GARCH models. |
Keywords: | Density forecasts, financial returns, quantile function, threshold GARCH |
JEL: | C10 C51 C53 |
Date: | 2018–02–27 |
URL: | http://d.repec.org/n?u=RePEc:swn:wpaper:2018-22&r=ets |
By: | Asai, M.; McAleer, M.J. |
Abstract: | The paper develops a new realized matrix-exponential GARCH (MEGARCH) model, which uses the information of returns and the realized measure of the co-volatility matrix simultaneously. The paper also considers an alternative multivariate asymmetric function to develop news impact curves. We consider Bayesian MCMC estimation to allow non-normal posterior distributions. For three US financial assets, we compare the realized MEGARCH models with existing multivariate GARCH class models. The empirical results indicate that the realized MEGARCH models outperform the other models in terms of in-sample and out-of-sample performance. The news impact curves based on the posterior densities provide reasonable results. |
Keywords: | Multivariate GARCH, Realized Measure, Matrix-Exponential, Bayesian Markov chain Monte Carlo method, Asymmetry |
JEL: | C11 C32 |
Date: | 2018–01–01 |
URL: | http://d.repec.org/n?u=RePEc:ems:eureir:104259&r=ets |
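The defining feature of a matrix-exponential GARCH specification is that the dynamics are placed on the matrix logarithm of the conditional covariance, so mapping back through the matrix exponential guarantees a symmetric positive-definite covariance without parameter restrictions. The sketch below illustrates only that mapping, with a hypothetical log-covariance state; it is not the authors' realized MEGARCH recursion or its Bayesian estimation.

```python
import numpy as np
from scipy.linalg import expm

# hypothetical symmetric state A_t = log(Sigma_t) for three assets
A_t = np.array([[ 0.20,  0.10,  0.00],
                [ 0.10,  0.30, -0.05],
                [ 0.00, -0.05,  0.25]])
Sigma_t = expm(A_t)                                # conditional covariance matrix
assert np.all(np.linalg.eigvalsh(Sigma_t) > 0.0)   # positive definite by construction
```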
By: | Leon, Costas |
Abstract: | In this paper, Singular Spectrum Analysis (SSA) is presented and applied to US air traffic emplacements for the period Jan. 1954 – Sept. 2011. I decompose the US air traffic emplacements into trend, cycle, seasonal and noise components. I then apply several spectral criteria to evaluate SSA as a seasonal adjustment filter. Beyond the trend, SSA detects strong cyclical and seasonal components and leaves a GARCH process as the residual. SSA performs quite well as a seasonal adjustment mechanism in the case of the GARCH process, and it performs even better in the case of a simulated white noise process. SSA is a serious candidate in economics for filtering, denoising, smoothing and seasonal adjustment. |
Keywords: | Singular Spectrum Analysis, Seasonal Adjustment, Spectral Analysis, Economic Time Series, Air Traffic Emplacements. |
JEL: | C10 C50 E3 E37 |
Date: | 2018–02–15 |
URL: | http://d.repec.org/n?u=RePEc:pra:mprapa:84594&r=ets |
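The decomposition in the paper above is carried out by Singular Spectrum Analysis. The basic SSA steps (embedding the series in a trajectory matrix, taking its SVD, and diagonally averaging the elementary components) are sketched below; the window length L and the grouping of components into trend, cycle and seasonal parts are user choices, and this is a generic sketch rather than the paper's exact implementation.

```python
import numpy as np

def ssa_components(x, L):
    """Basic SSA: embed, take the SVD, and diagonally average each elementary component."""
    x = np.asarray(x, dtype=float)
    N, K = len(x), len(x) - L + 1
    X = np.column_stack([x[i:i + L] for i in range(K)])   # L x K trajectory matrix
    U, s, Vt = np.linalg.svd(X, full_matrices=False)
    comps = []
    for i in range(len(s)):
        Xi = s[i] * np.outer(U[:, i], Vt[i])               # rank-one elementary matrix
        # anti-diagonal averaging maps Xi back to a series of length N
        comps.append([Xi[::-1].diagonal(k).mean() for k in range(-(L - 1), K)])
    return np.array(comps)  # rows sum to the original series

# grouping the leading components as trend/seasonal and the rest as noise is left to the user
```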
By: | Timmermann, Allan G |
Abstract: | Our review highlights some of the key challenges in financial forecasting problems along with opportunities arising from the unique features of financial data. We analyze the difficulty of establishing predictability in an environment with a low signal-to-noise ratio, persistent predictors, and instability in predictive relations arising from competitive pressures and investors' learning. We discuss approaches for forecasting the mean, variance, and probability distribution of asset returns. Finally, we cover how to evaluate financial forecasts while accounting for the possibility that numerous forecasting models may have been considered, leading to concerns of data mining. |
Date: | 2018–02 |
URL: | http://d.repec.org/n?u=RePEc:cpr:ceprdp:12692&r=ets |
By: | Chang, Jinyuan; Yao, Qiwei; Zhou, Wen |
Abstract: | We propose a new omnibus test for vector white noise using the maximum absolute autocorrelations and cross-correlations of the component series. Based on an approximation by the L∞-norm of a normal random vector, the critical value of the test can be evaluated by bootstrapping from a multivariate normal distribution. In contrast to the conventional white noise test, the new method is proved to be valid for testing the departure from white noise that is not independent and identically distributed. We illustrate the accuracy and the power of the proposed test by simulation, which also shows that the new test outperforms several commonly used methods including, for example, the Lagrange multiplier test and the multivariate Box–Pierce portmanteau tests, especially when the dimension of the time series is high in relation to the sample size. The numerical results also indicate that the performance of the new test can be further enhanced when it is applied to pre-transformed data obtained via the time series principal component analysis proposed by Chang, Guo and Yao (arXiv:1410.2323). The proposed procedures have been implemented in an R package. |
Keywords: | autocorrelation; normal approximation; parametric bootstrap; portmanteau test; time series principal component analysis; vector white noise |
JEL: | C1 |
Date: | 2017–02–18 |
URL: | http://d.repec.org/n?u=RePEc:ehl:lserod:68531&r=ets |
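The test statistic described above is the maximum absolute auto- and cross-correlation over a set of lags, scaled by the square root of the sample size. The sketch below computes that statistic for a multivariate series; the critical value, which the paper evaluates by bootstrapping from a multivariate normal distribution, is not reproduced here.

```python
import numpy as np

def max_abs_correlation_stat(y, K):
    """sqrt(n) * max over lags 1..K and all (i, j) pairs of the sample cross-correlations."""
    y = np.asarray(y, dtype=float)
    n, p = y.shape
    z = (y - y.mean(axis=0)) / y.std(axis=0)   # standardise each component series
    stat = 0.0
    for k in range(1, K + 1):
        rho_k = (z[k:].T @ z[:-k]) / n         # p x p lag-k sample cross-correlation matrix
        stat = max(stat, np.abs(rho_k).max())
    return np.sqrt(n) * stat

rng = np.random.default_rng(0)
t_n = max_abs_correlation_stat(rng.standard_normal((400, 5)), K=10)
```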