on Econometric Time Series |
By: | Tetsuya Takaishi |
Abstract: | The stochastic volatility (SV) model is one of the volatility models that infer the latent volatility of asset returns. Bayesian inference for the SV model is performed by the hybrid Monte Carlo (HMC) algorithm, which is superior to other Markov chain Monte Carlo methods in sampling the volatility variables. We perform HMC simulations of the SV model for two liquid stock returns traded on the Tokyo Stock Exchange and measure the volatilities of those returns. We then calculate the accuracy of the volatility measurement using the realized volatility as a proxy for the true volatility and compare the SV model with the GARCH model, another widely used volatility model. Using this accuracy measure, we find that the SV model empirically outperforms the GARCH model. |
Date: | 2013–05 |
URL: | http://d.repec.org/n?u=RePEc:arx:papers:1305.3184&r=ets |
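As a rough sketch of the kind of computation involved (not the authors' implementation: a standard discrete-time SV specification is assumed, and the AR(1) volatility parameters mu, phi and sig are held fixed), one HMC update of the latent log-volatility path can be written as:

```python
import numpy as np

def log_post_and_grad(h, r, mu, phi, sig):
    """Log posterior (up to a constant) and gradient for log-volatilities h
    under r_t ~ N(0, exp(h_t)) with the AR(1) prior
    h_t = mu + phi*(h_{t-1} - mu) + N(0, sig^2)."""
    lp = np.sum(-0.5 * h - 0.5 * r**2 * np.exp(-h))      # measurement part
    g = -0.5 + 0.5 * r**2 * np.exp(-h)
    e = h[1:] - mu - phi * (h[:-1] - mu)                  # AR(1) prior residuals
    lp += -0.5 * (1 - phi**2) * (h[0] - mu)**2 / sig**2 - 0.5 * np.sum(e**2) / sig**2
    g[0] += -(1 - phi**2) * (h[0] - mu) / sig**2
    g[1:] += -e / sig**2
    g[:-1] += phi * e / sig**2
    return lp, g

def hmc_step(h, r, mu, phi, sig, eps=0.01, n_leap=10, rng=None):
    """One hybrid (Hamiltonian) Monte Carlo update of the whole volatility path."""
    if rng is None:
        rng = np.random.default_rng()
    p = rng.standard_normal(h.size)
    lp, g = log_post_and_grad(h, r, mu, phi, sig)
    ham0 = -lp + 0.5 * p @ p
    h_new = h.copy()
    p = p + 0.5 * eps * g                     # leapfrog: initial half momentum step
    for _ in range(n_leap):
        h_new = h_new + eps * p               # full position step
        lp, g = log_post_and_grad(h_new, r, mu, phi, sig)
        p = p + eps * g                       # full momentum step
    p = p - 0.5 * eps * g                     # undo the extra half step
    ham1 = -lp + 0.5 * p @ p
    if np.log(rng.uniform()) < ham0 - ham1:   # Metropolis accept/reject
        return h_new, True
    return h, False
```

Because the whole path is updated jointly along the gradient, mixing tends to be much better than single-site Metropolis updates of each h_t, which is the advantage the abstract refers to.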
By: | Marco Zamparo; Fulvio Baldovin; Michele Caraglio; Attilio L. Stella |
Abstract: | We present and discuss a stochastic model of financial assets dynamics based on the idea of an inverse renormalization group strategy. With this strategy we construct the multivariate distributions of elementary returns based on the scaling with time of the probability density of their aggregates. In its simplest version the model is the product of an endogenous auto-regressive component and a random rescaling factor embodying exogenous influences. Mathematical properties like increments' stationarity and ergodicity can be proven. Thanks to the relatively low number of parameters, model calibration can be conveniently based on a method of moments, as exemplified in the case of historical data of the S&P500 index. The calibrated model accounts very well for many stylized facts, like volatility clustering, power law decay of the volatility autocorrelation function, and multiscaling with time of the aggregated return distribution. In agreement with empirical evidence in finance, the dynamics is not invariant under time reversal and, with suitable generalizations, skewness of the return distribution and leverage effects can be included. The analytical tractability of the model opens interesting perspectives for applications, for instance in terms of obtaining closed formulas for derivative pricing. Further important features are the possibility of making contact, in certain limits, with auto-regressive models widely used in finance, and the possibility of partially resolving the endogenous and exogenous components of the volatility, with consistent results when applied to historical series. |
Date: | 2013–05 |
URL: | http://d.repec.org/n?u=RePEc:arx:papers:1305.3243&r=ets |
By: | Siem Jan Koopman (VU University Amsterdam); Kai Ming Lee (VU University Amsterdam) |
Abstract: | Unobserved components time series models decompose a time series into a trend, a season, a cycle, an irregular disturbance, and possibly other components. These models have been successfully applied to many economic time series. The standard assumption of a linear model, often appropriate after a logarithmic transformation of the data, facilitates estimation, testing, forecasting and interpretation. However, in some settings the linear-additive framework may be too restrictive. In this paper, we formulate a non-linear unobserved components time series model which allows interactions between the trend-cycle component and the seasonal component. The resulting model is cast into a non-linear state space form and estimated by the extended Kalman filter, adapted for models with diffuse initial conditions. We apply our model to UK travel data and US unemployment and production series, and show that it can capture increasing seasonal variation and cycle dependent seasonal fluctuations. |
Keywords: | Seasonal interaction; Unobserved components; Non-linear state space models |
JEL: | C13 C22 |
URL: | http://d.repec.org/n?u=RePEc:dgr:uvatin:0000028&r=ets |
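A generic extended Kalman filter predict/update step of the kind the estimation relies on can be sketched as follows (an illustrative recursion for x_t = f(x_{t-1}) + w_t, y_t = h(x_t) + v_t; the paper's specific trend/seasonal interaction model and its diffuse initialisation are not reproduced here):

```python
import numpy as np

def ekf_step(a, P, y, f, F_jac, h, H_jac, Q, R):
    """One extended Kalman filter step: linearise the transition f and
    measurement h around the current state estimate a."""
    # predict
    a_pred = f(a)
    F = F_jac(a)
    P_pred = F @ P @ F.T + Q
    # update
    H = H_jac(a_pred)
    v = y - h(a_pred)                        # innovation
    S = H @ P_pred @ H.T + R                 # innovation variance
    K = P_pred @ H.T @ np.linalg.inv(S)      # Kalman gain
    a_new = a_pred + K @ v
    P_new = (np.eye(len(a)) - K @ H) @ P_pred
    return a_new, P_new
```

With linear f and h the step reduces to the ordinary Kalman filter, which is a convenient sanity check.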
By: | Siem Jan Koopman (VU University Amsterdam); Thuy Minh Nguyen (Deutsche Bank, London) |
Abstract: | We show that efficient importance sampling for nonlinear non-Gaussian state space models can be implemented by computationally efficient Kalman filter and smoothing methods. The result provides some new insights but it primarily leads to a simple and fast method for efficient importance sampling. A simulation study and empirical illustration provide some evidence of the computational gains. |
Keywords: | Kalman filter, Monte Carlo maximum likelihood, Simulation smoothing |
JEL: | C32 C51 |
Date: | 2012–01–12 |
URL: | http://d.repec.org/n?u=RePEc:dgr:uvatin:2012008&r=ets |
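The core importance-sampling idea, estimating an intractable likelihood by reweighting draws from a tractable candidate density, can be sketched in log space as follows (a generic illustration, not the paper's Kalman-filter-based construction of the efficient candidate):

```python
import numpy as np

def is_loglik(logp_joint, logg, draws):
    """Log of the importance-sampling estimate of a likelihood
    L = E_g[ p(y, h) / g(h) ], with draws h_i ~ g, computed stably
    via the log-sum-exp trick."""
    logw = logp_joint(draws) - logg(draws)   # log importance weights
    m = logw.max()
    return m + np.log(np.mean(np.exp(logw - m)))
```

The efficiency of the estimator hinges entirely on how well g matches the integrand, which is exactly what the paper's Kalman-filter-based candidate construction is designed to achieve.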
By: | Geert Mesters (Netherlands Institute for the Study of Crime and Law Enforcement, and VU University Amsterdam); Siem Jan Koopman (VU University Amsterdam) |
Abstract: | An exact maximum likelihood method is developed for the estimation of parameters in a nonlinear non-Gaussian dynamic panel data model with unobserved random individual-specific and time-varying effects. We propose an estimation procedure based on the importance sampling technique. In particular, a sequence of conditional importance densities is derived which integrates out all random effects from the joint distribution of endogenous variables. We disentangle the integration over both the cross-section and the time series dimensions. The estimation method facilitates the flexible modeling of large panels in both dimensions. We evaluate the method in a Monte Carlo study for dynamic panel data models with observations from the Student's t distribution. We finally present an extensive empirical study into the interrelationships between the economic growth figures of countries listed in the Penn World Tables. It is shown that our dynamic panel data model can provide an insightful analysis of common and heterogeneous features in world-wide economic growth. |
Keywords: | Panel data, Non-Gaussian, Importance sampling, Random effects, Student's t, Economic growth |
JEL: | C33 C51 F44 |
Date: | 2012–02–06 |
URL: | http://d.repec.org/n?u=RePEc:dgr:uvatin:2012009&r=ets |
By: | Siem Jan Koopman (VU University Amsterdam); Andre Lucas (VU University Amsterdam); Marcel Scharth (VU University Amsterdam) |
Abstract: | We study whether and when parameter-driven time-varying parameter models lead to forecasting gains over observation-driven models. We consider dynamic count, intensity, duration, volatility and copula models, including new specifications that have not been studied earlier in the literature. In an extensive Monte Carlo study, we find that observation-driven generalised autoregressive score (GAS) models have similar predictive accuracy to correctly specified parameter-driven models. In most cases, differences in mean squared errors are smaller than 1% and model confidence sets have low power when comparing these two alternatives. We also find that GAS models outperform many familiar observation-driven models in terms of forecasting accuracy. The results point to a class of observation-driven models with comparable forecasting ability to parameter-driven models, but lower computational complexity. |
Keywords: | Generalised autoregressive score model, Importance sampling, Model confidence set, Nonlinear state space model, Weibull-gamma mixture |
JEL: | C53 C58 C22 |
Date: | 2012–03–06 |
URL: | http://d.repec.org/n?u=RePEc:dgr:uvatin:2012020&r=ets |
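A minimal example of an observation-driven GAS filter is the Gaussian case with inverse-Fisher-information scaling, in which the score recursion reduces to a GARCH-like updating equation (parameter values below are illustrative):

```python
import numpy as np

def gas_gaussian_variance(y, omega=0.05, alpha=0.1, beta=0.9):
    """GAS(1,1) filter for a time-varying variance f_t under a Gaussian
    observation density. With inverse-Fisher-information scaling the scaled
    score is s_t = y_t^2 - f_t, so the update is
    f_{t+1} = omega + alpha * s_t + beta * f_t."""
    f = np.empty(len(y) + 1)
    f[0] = np.var(y)              # simple initialisation
    for t in range(len(y)):
        s = y[t]**2 - f[t]        # scaled score of the Gaussian log density
        f[t+1] = omega + alpha * s + beta * f[t]
    return f
```

The point of the GAS framework is that the same score-driven recursion applies to counts, durations, intensities and copulas by swapping in the relevant observation density.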
By: | Lennart Hoogerheide (VU University Amsterdam); Anne Opschoor (Erasmus University Rotterdam); Herman K. van Dijk (Erasmus University Rotterdam, and VU University Amsterdam) |
Abstract: | A class of adaptive sampling methods is introduced for efficient posterior and predictive simulation. The proposed methods are robust in the sense that they can handle target distributions that exhibit non-elliptical shapes such as multimodality and skewness. The basic method makes use of sequences of importance weighted Expectation Maximization steps in order to efficiently construct a mixture of Student-t densities that approximates accurately the target distribution - typically a posterior distribution, of which we only require a kernel - in the sense that the Kullback-Leibler divergence between target and mixture is minimized. We label this approach Mixture of t by Importance Sampling and Expectation Maximization (MitISEM). The constructed mixture is used as a candidate density for quick and reliable application of either Importance Sampling (IS) or the Metropolis-Hastings (MH) method. We also introduce three extensions of the basic MitISEM approach. First, we propose a method for applying MitISEM in a sequential manner. Second, we introduce a permutation-augmented MitISEM approach. Third, we propose a partial MitISEM approach, which aims at approximating the joint distribution by estimating a product of marginal and conditional distributions. This division can substantially reduce the dimension of the approximation problem, which facilitates the application of adaptive importance sampling for posterior simulation in more complex models with larger numbers of parameters. Our results indicate that the proposed methods can substantially reduce the computational burden in econometric models like DCC or mixture GARCH models and a mixture instrumental variables model. |
Keywords: | mixture of Student-t distributions, importance sampling, Kullback-Leibler divergence, Expectation Maximization, Metropolis-Hastings algorithm, predictive likelihood, DCC GARCH, mixture GARCH, instrumental variables |
JEL: | C11 C22 C26 |
Date: | 2012–03–23 |
URL: | http://d.repec.org/n?u=RePEc:dgr:uvatin:2012026&r=ets |
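The role the fitted candidate plays can be illustrated with plain self-normalised importance sampling under a single Student-t candidate (MitISEM itself fits a mixture of t densities by EM; the function names and settings below are illustrative):

```python
import math
import numpy as np

def t_logpdf(x, df, loc, scale):
    """Log density of a location-scale Student-t distribution."""
    z = (x - loc) / scale
    c = (math.lgamma((df + 1) / 2) - math.lgamma(df / 2)
         - 0.5 * math.log(df * math.pi) - math.log(scale))
    return c - (df + 1) / 2 * np.log1p(z**2 / df)

def is_posterior_mean(log_kernel, df, loc, scale, n=100_000, seed=0):
    """Self-normalised importance sampling for a 1-D target known only
    through its log kernel, using a Student-t candidate density."""
    rng = np.random.default_rng(seed)
    x = loc + scale * rng.standard_t(df, size=n)
    logw = log_kernel(x) - t_logpdf(x, df, loc, scale)
    w = np.exp(logw - logw.max())             # normalise for stability
    return np.sum(w * x) / np.sum(w)
```

The fat tails of the t candidate keep the importance weights bounded for a wide class of targets; MitISEM extends this to mixtures so that multimodal and skewed posteriors are covered as well.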
By: | Falk Brauning (VU University Amsterdam); Siem Jan Koopman (VU University Amsterdam) |
Abstract: | We explore a new approach to the forecasting of macroeconomic variables based on a dynamic factor state space analysis. Key economic variables are modeled jointly with principal components from a large time series panel of macroeconomic indicators using a multivariate unobserved components time series model. When the key economic variables are observed at a low frequency and the panel of macroeconomic variables is at a high frequency, we can use our approach for both nowcasting and forecasting purposes. Given a dynamic factor model as the data generation process, we provide Monte Carlo evidence for the finite-sample justification of our parsimonious and feasible approach. We also provide empirical evidence for a U.S. macroeconomic dataset. The unbalanced panel contains quarterly and monthly variables. The forecasting accuracy is measured against a set of benchmark models. We conclude that our dynamic factor state space analysis can lead to higher forecasting precision when panel size and time series dimensions are moderate. |
Keywords: | Kalman filter, Mixed frequency, Nowcasting, Principal components, State space model, Unobserved components time series model |
JEL: | C33 C53 E17 |
Date: | 2012–04–20 |
URL: | http://d.repec.org/n?u=RePEc:dgr:uvatin:2012042&r=ets |
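Extracting the principal components that enter such a model can be sketched as follows (standardised data, factors via the SVD; an illustrative helper, not the paper's exact procedure):

```python
import numpy as np

def principal_components(X, k):
    """First k principal components of a (T x N) macro panel, computed from
    the SVD of the standardised data, as in dynamic factor nowcasting set-ups."""
    Z = (X - X.mean(0)) / X.std(0)          # standardise each indicator
    U, s, Vt = np.linalg.svd(Z, full_matrices=False)
    return U[:, :k] * s[:k]                 # T x k estimated factors
```

The factors then enter the unobserved components model as observed series alongside the key low-frequency variables.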
By: | Istvan Barra (VU University Amsterdam); Lennart Hoogerheide (VU University Amsterdam); Siem Jan Koopman (VU University Amsterdam); Andre Lucas (VU University Amsterdam) |
Abstract: | We propose a new methodology for the Bayesian analysis of nonlinear non-Gaussian state space models with a Gaussian time-varying signal, where the signal is a function of a possibly high-dimensional state vector. The novelty of our approach is the development of proposal densities for the joint posterior density of parameter and state vectors: a mixture of Student's t-densities as the marginal proposal density for the parameter vector, and a Gaussian density as the conditional proposal density for the signal given the parameter vector. We argue that a highly efficient procedure emerges when these proposal densities are used in an independent Metropolis-Hastings algorithm. A particular feature of our approach is that smoothed estimates of the states and an estimate of the marginal likelihood are obtained directly as an output of the algorithm. Our methods are computationally efficient and produce more accurate estimates when compared to recently proposed alternatives. We present extensive simulation evidence for stochastic volatility and stochastic intensity models. For our empirical study, we analyse the performance of our method for stock return data and corporate default panel data. |
Keywords: | nonlinear non-Gaussian state space model, Bayesian inference, Monte Carlo estimation, Metropolis-Hastings algorithm, mixture of Student's t-distributions |
JEL: | C11 C15 C22 C32 C58 |
Date: | 2012–03–26 |
URL: | http://d.repec.org/n?u=RePEc:dgr:uvatin:2012050&r=ets |
By: | Jiangyu Ji (VU University Amsterdam); Andre Lucas (VU University Amsterdam, and Duisenberg school of finance) |
Abstract: | We propose a new semiparametric observation-driven volatility model where the form of the error density directly influences the volatility dynamics. This feature distinguishes our model from standard semiparametric GARCH models. The link between the estimated error density and the volatility dynamics follows from the application of the generalized autoregressive score framework of Creal, Koopman, and Lucas (2012). We provide simulated evidence for the estimation efficiency and forecast accuracy of the new model, particularly if errors are fat-tailed and possibly skewed. In an application to equity return data we find that the model also does well in density forecasting. |
Keywords: | volatility clustering, Generalized Autoregressive Score model, kernel density estimation, density forecast evaluation |
JEL: | C10 C14 C22 |
Date: | 2012–05–22 |
URL: | http://d.repec.org/n?u=RePEc:dgr:uvatin:2012055&r=ets |
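The nonparametric ingredient, a kernel estimate of the error density, can be sketched as follows (Gaussian kernel with Silverman's rule-of-thumb bandwidth; illustrative only):

```python
import numpy as np

def kde_gauss(x, data, bw=None):
    """Gaussian kernel density estimate evaluated at the points x,
    the nonparametric building block of the semiparametric model."""
    if bw is None:
        bw = 1.06 * data.std() * len(data) ** (-1 / 5)   # Silverman's rule
    z = (x[:, None] - data[None, :]) / bw
    return np.exp(-0.5 * z**2).sum(1) / (len(data) * bw * np.sqrt(2 * np.pi))
```

In the paper's set-up the score of this estimated density (rather than of an assumed Gaussian or Student-t density) is what drives the volatility recursion.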
By: | Francisco Blasques (VU University Amsterdam); Siem Jan Koopman (VU University Amsterdam); Andre Lucas (VU University Amsterdam) |
Abstract: | We characterize the dynamic properties of Generalized Autoregressive Score (GAS) processes by identifying regions of the parameter space that imply stationarity and ergodicity. We show how these regions are affected by the choice of parameterization and scaling, which are key features of GAS models compared to other observation driven models. The Dudley entropy integral is used to ensure the non-degeneracy of such regions. Furthermore, we show how to obtain bounds for these regions in models for time-varying means, variances, or higher-order moments. |
Keywords: | Dudley integral, Durations, Higher-order models, Nonlinear dynamics, Time-varying parameters, Volatility |
JEL: | C13 C22 C58 |
Date: | 2012–06–22 |
URL: | http://d.repec.org/n?u=RePEc:dgr:uvatin:2012059&r=ets |
By: | H. Peter Boswijk (University of Amsterdam); Michael Jansson (UC Berkeley, and CREATES); Morten Ø. Nielsen (Queen's University, and CREATES) |
Abstract: | We suggest improved tests for cointegration rank in the vector autoregressive (VAR) model and develop asymptotic distribution theory and local power results. The tests are (quasi-)likelihood ratio tests based on a Gaussian likelihood, but of course the asymptotic results apply more generally. The power gains relative to existing tests are due to two factors. First, instead of basing our tests on the conditional (with respect to the initial observations) likelihood, we follow the recent unit root literature and base our tests on the full likelihood as in, e.g., Elliott, Rothenberg, and Stock (1996). Secondly, our tests incorporate a 'sign' restriction which generalizes the one-sided unit root test. We show that the asymptotic local power of the proposed tests dominates that of existing cointegration rank tests. |
Keywords: | Cointegration rank, efficiency, likelihood ratio test, vector autoregression |
JEL: | C12 C32 |
Date: | 2012–09–21 |
URL: | http://d.repec.org/n?u=RePEc:dgr:uvatin:2012097&r=ets |
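For reference, the classical conditional-likelihood trace statistics that the proposed tests improve upon can be computed as follows (VAR(1), no deterministic terms; a textbook sketch, not the paper's full-likelihood test):

```python
import numpy as np

def johansen_trace_stats(Y):
    """Trace statistics for cointegration rank in a VAR(1) without
    deterministic terms; Y is a T x p matrix of levels."""
    dY = np.diff(Y, axis=0)                  # first differences
    Ylag = Y[:-1]                            # lagged levels
    T = dY.shape[0]
    S00 = dY.T @ dY / T
    S11 = Ylag.T @ Ylag / T
    S01 = dY.T @ Ylag / T
    # eigenvalues of S11^{-1} S10 S00^{-1} S01 (squared canonical correlations)
    M = np.linalg.solve(S11, S01.T) @ np.linalg.solve(S00, S01)
    lam = np.sort(np.linalg.eigvals(M).real)[::-1]
    return np.array([-T * np.sum(np.log(1 - lam[r:])) for r in range(len(lam))])
```

The statistic for rank r is compared against its nonstandard asymptotic distribution; the paper's contribution is a full-likelihood version with a sign restriction that has higher local power than this conditional-likelihood construction.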
By: | Monica Billio (University of Venice, GRETA Assoc. and School for Advanced Studies in Venice); Roberto Casarin (University of Venice, GRETA Assoc. and School for Advanced Studies in Venice); Francesco Ravazzolo (Norges Bank and BI Norwegian Business School); Herman K. van Dijk (Erasmus University Rotterdam, VU University Amsterdam) |
Abstract: | We propose a Bayesian combination approach for multivariate predictive densities which relies upon a distributional state space representation of the combination weights. Several specifications of multivariate time-varying weights are introduced with a particular focus on weight dynamics driven by the past performance of the predictive densities and the use of learning mechanisms. In the proposed approach the model set can be incomplete, meaning that all models can be individually misspecified. A Sequential Monte Carlo method is proposed to approximate the filtering and predictive densities. The combination approach is assessed using statistical and utility-based performance measures for evaluating density forecasts. Simulation results indicate that, for a set of linear autoregressive models, the combination strategy is successful in selecting, with probability close to one, the true model when the model set is complete, and it is able to detect parameter instability when the model set includes the true model that has generated subsamples of data. For the macro series we find that incompleteness of the models is relatively large in the 1970s, at the beginning of the 1980s and during the recent financial crisis, and lower during the Great Moderation. With respect to returns of the S&P 500 series, we find that an investment strategy using a combination of predictions from professional forecasters and from a white noise model puts more weight on the white noise model in the early 1990s and switches to giving more weight to the professional forecasts over time. |
Keywords: | Density Forecast Combination, Survey Forecast, Bayesian Filtering, Sequential Monte Carlo |
JEL: | C11 C15 C53 E37 |
Date: | 2012–11–07 |
URL: | http://d.repec.org/n?u=RePEc:dgr:uvatin:2012118&r=ets |
By: | Manabu Asai (Soka University, Japan, and University of Pennsylvania); Michael McAleer (Erasmus School of Economics, Kyoto University, Japan, and Complutense University of Madrid, Spain) |
Abstract: | There has recently been growing interest in modeling and estimating alternative continuous time multivariate stochastic volatility models. We propose a continuous time fractionally integrated Wishart stochastic volatility (FIWSV) process. We derive the conditional Laplace transform of the FIWSV model in order to obtain a closed form expression of moments. We conduct a two-step procedure, namely estimating the parameter of fractional integration via log-periodogram regression in the first step, and estimating the remaining parameters via the generalized method of moments in the second step. Monte Carlo results for the procedure show reasonable performance in finite samples. The empirical results for the bivariate data of the S&P 500 and FTSE 100 indexes show that the data favor the new FIWSV process rather than the one-factor and two-factor models of Wishart autoregressive processes for the covariance structure. |
Keywords: | Diffusion process; Multivariate stochastic volatility; Long memory; Fractional Brownian motion; Generalized method of moments |
JEL: | C32 C51 G13 |
Date: | 2013–01–31 |
URL: | http://d.repec.org/n?u=RePEc:dgr:uvatin:2013025&r=ets |
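The first step, estimating the fractional integration parameter d by log-periodogram (GPH) regression, can be sketched as follows (the bandwidth choice m = sqrt(n) is an illustrative default, not the paper's):

```python
import numpy as np

def gph_estimate(x, m=None):
    """Log-periodogram (GPH) regression estimate of the fractional
    integration parameter d: regress log I(lambda_j) on
    -2*log(2*sin(lambda_j/2)) over the m lowest Fourier frequencies."""
    n = len(x)
    m = m or int(n ** 0.5)                              # number of low frequencies
    freqs = 2 * np.pi * np.arange(1, m + 1) / n
    fft = np.fft.fft(x - x.mean())
    I = np.abs(fft[1:m + 1]) ** 2 / (2 * np.pi * n)     # periodogram ordinates
    reg = -2 * np.log(2 * np.sin(freqs / 2))            # regressor; slope = d
    reg_c = reg - reg.mean()
    return np.sum(reg_c * np.log(I)) / np.sum(reg_c ** 2)
```

For a short-memory series the estimate should be near zero; for a fractionally integrated series it recovers d up to sampling noise of order 1/sqrt(m).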
By: | Francisco Blasques (VU University Amsterdam) |
Abstract: | This paper proposes the use of a double correlation coefficient as a nonparametric measure of phase-dependence in time-varying correlations. An asymptotically Gaussian test statistic for the null hypothesis of no phase-dependence is derived from the proposed measure. Finite-sample distributions, power and size are analyzed in a Monte Carlo exercise. An application of this test provides evidence that correlation strength between major macroeconomic aggregates is both time-varying and phase-dependent in the business cycle. |
Keywords: | nonparametric, phase-dependence, time-varying correlation |
JEL: | C01 C14 C32 |
Date: | 2013–04–04 |
URL: | http://d.repec.org/n?u=RePEc:dgr:uvatin:2013054&r=ets |
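A crude way to see phase-dependence in correlations, namely computing sample correlations separately within each phase, can be sketched as follows (this is not the paper's double correlation coefficient, only an illustration of the phenomenon being tested):

```python
import numpy as np

def phase_correlations(x, y, phase):
    """Sample correlation of (x, y) computed separately within each business
    cycle phase; phase is a boolean array, e.g. True for expansions."""
    corr_up = np.corrcoef(x[phase], y[phase])[0, 1]
    corr_down = np.corrcoef(x[~phase], y[~phase])[0, 1]
    return corr_up, corr_down
```

A large gap between the two phase correlations is the kind of evidence the paper's test statistic formalises.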
By: | Jiaqi Chen; Michael L. Tindall |
Abstract: | This paper describes the structure of a rule-based econometric forecasting system designed to produce multi-equation econometric models. The working system builds an econometric forecasting equation for each series submitted and produces forecasts of the series. It employs information criteria and cross-validation in the equation-building process, and it uses Bayesian model averaging to combine forecasts of individual series. The system outperforms standard benchmarks on a variety of national economic datasets. |
Keywords: | Econometrics |
Date: | 2013 |
URL: | http://d.repec.org/n?u=RePEc:fip:feddop:1:x:1&r=ets |
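A standard way to turn information criteria into combination weights, in the spirit of the Bayesian model averaging step, can be sketched as follows (illustrative; the system's actual weighting scheme is not specified in the abstract):

```python
import numpy as np

def bma_weights(ic):
    """Model combination weights from information criterion values
    (e.g. AIC or BIC): w_i proportional to exp(-ic_i / 2), computed
    after subtracting the minimum for numerical stability."""
    d = ic - ic.min()
    w = np.exp(-d / 2)
    return w / w.sum()
```

The combined forecast is then the weight-averaged forecast across the candidate equations, with the best-fitting equation receiving the largest weight.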
By: | Fady Barsoum (Department of Economics, University of Konstanz, Germany); Sandra Stankiewicz (Department of Economics, University of Konstanz, Germany) |
Abstract: | For modelling mixed-frequency data with a business cycle pattern we introduce the Markov-switching Mixed Data Sampling model with unrestricted lag polynomial (MS-U-MIDAS). Models of the MIDAS class usually use lag polynomials of a specific functional form, which impose some structure on the weights of the regressors included in the model. This may deteriorate the predictive power of the model if the imposed structure differs from the data generating process. When the difference between the available data frequencies is small and there is no risk of parameter proliferation, using an unrestricted lag polynomial may not only simplify model estimation, but also improve forecasting performance. We therefore allow the parameters of the unrestricted MIDAS model to change according to a Markov-switching scheme, in order to account for the business cycle pattern observed in many macroeconomic variables. We apply this model to a large dataset with the help of factor analysis. Monte Carlo experiments and an empirical forecasting comparison carried out for U.S. GDP growth show that models of the MS-U-MIDAS class exhibit similar or better nowcasting and forecasting performance than their counterparts with restricted lag polynomials. |
Keywords: | Markov-switching, Business cycle, Mixed-frequency data analysis, Forecasting |
JEL: | C22 C53 E37 |
Date: | 2013–05–08 |
URL: | http://d.repec.org/n?u=RePEc:knz:dpteco:1310&r=ets |
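Building the unrestricted MIDAS regressor matrix, in which each of the k most recent high-frequency observations enters with a free coefficient instead of through a restricted lag polynomial, can be sketched as follows (the function name and interface are illustrative):

```python
import numpy as np

def umidas_design(x_high, k, m):
    """Unrestricted MIDAS regressor matrix: for each low-frequency period t,
    the k most recent high-frequency observations of x enter as separate
    regressors (m high-frequency observations per low-frequency period).
    Periods without k available lags are dropped."""
    T = len(x_high) // m
    rows = []
    for t in range(T):
        end = (t + 1) * m                        # last high-freq obs of period t
        if end - k < 0:
            continue
        rows.append(x_high[end - k:end][::-1])   # most recent observation first
    return np.array(rows)
```

Each column of the resulting matrix gets its own coefficient in the regression, which is exactly the "unrestricted lag polynomial" the abstract contrasts with the usual Almon- or beta-weighted MIDAS schemes.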