New Economics Papers on Econometrics
By: | Xun Tang (Department of Economics, University of Pennsylvania) |
Abstract: | In this paper we study the identification and estimation of a class of binary regressions where conditional medians of additive disturbances are bounded between known or exogenously identified functions of regressors. This class includes several important microeconometric models, such as simultaneous discrete games with incomplete information, binary regressions with censored regressors, and binary regressions with interval data or measurement errors on regressors. We characterize the identification region of linear coefficients in this class of models and show how point identification can be achieved in various microeconometric models under fairly general restrictions on structural primitives. We define a novel two-step smooth extreme estimator and prove its consistency for the identification region of coefficients. We also provide encouraging Monte Carlo evidence of the estimator's performance in finite samples. |
Keywords: | Binary response, median dependence, games with incomplete information, censored regressors, interval data, measurement error, partial identification, point identification, consistent estimation |
JEL: | C14 C25 C51 |
Date: | 2009–01–20 |
URL: | http://d.repec.org/n?u=RePEc:pen:papers:09-003&r=ecm |
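A minimal sketch of the model class may help fix ideas; the bounding functions are written here as c_L and c_U, which is assumed notation rather than the paper's own:

\[
  y_i = \mathbf{1}\{x_i'\beta + \varepsilon_i \ge 0\}, \qquad
  c_L(x_i) \le \operatorname{med}(\varepsilon_i \mid x_i) \le c_U(x_i),
\]
\[
  x'\beta + c_L(x) \ge 0 \;\Rightarrow\; \Pr(y = 1 \mid x) \ge \tfrac{1}{2},
  \qquad
  x'\beta + c_U(x) \le 0 \;\Rightarrow\; \Pr(y = 1 \mid x) \le \tfrac{1}{2}.
\]

These one-sided restrictions on the choice probabilities are what generate the identification region for the coefficients.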
By: | Nikolay Gospodinov (Concordia University); Ye Tao (Concordia University) |
Abstract: | This paper proposes a bootstrap unit root test in models with GARCH(1,1) errors and establishes its asymptotic validity under mild moment and distributional restrictions. While the proposed bootstrap test for a unit root shares the power-enhancing properties of its asymptotic counterpart (Ling and Li, 2003), it offers a number of important advantages. In particular, the bootstrap procedure does not require explicit estimation of nuisance parameters that enter the distribution of the test statistic, and it corrects the substantial size distortions of the asymptotic test that occur for strongly heteroskedastic processes. The simulation results demonstrate the excellent finite-sample properties of the bootstrap unit root test for a wide range of GARCH specifications. |
Keywords: | Unit root test; GARCH; Bootstrap |
JEL: | C12 C15 C22 |
Date: | 2009–01 |
URL: | http://d.repec.org/n?u=RePEc:crd:wpaper:09001&r=ecm |
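To make the bootstrap logic concrete, here is a minimal sketch of a residual-based bootstrap for a Dickey-Fuller-type test statistic; it resamples residuals i.i.d. instead of refitting a GARCH(1,1) filter as the paper's procedure does, so it illustrates only the mechanics, not the actual test.

import numpy as np

def df_t_stat(y):
    # t-statistic on rho in the regression dy_t = rho * y_{t-1} + e_t
    dy, ylag = np.diff(y), y[:-1]
    rho = (ylag @ dy) / (ylag @ ylag)
    resid = dy - rho * ylag
    sigma2 = (resid @ resid) / (len(dy) - 1)
    return rho / np.sqrt(sigma2 / (ylag @ ylag)), resid

def bootstrap_pvalue(y, n_boot=999, seed=0):
    # left-tailed p-value from the bootstrap null distribution
    rng = np.random.default_rng(seed)
    stat, resid = df_t_stat(y)
    boot = np.empty(n_boot)
    for b in range(n_boot):
        e_star = rng.choice(resid, size=len(resid), replace=True)
        # rebuild the series under the null of a unit root (rho = 0)
        y_star = np.concatenate(([y[0]], y[0] + np.cumsum(e_star)))
        boot[b] = df_t_stat(y_star)[0]
    return np.mean(boot <= stat)

rng = np.random.default_rng(1)
y = np.cumsum(rng.standard_normal(300))   # a true unit root process
print(bootstrap_pvalue(y))                # large p-value expected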
By: | Fabio Canova; Filippo Ferroni |
Abstract: | We propose a method to estimate time-invariant cyclical DSGE models using the information provided by a variety of filtering approaches. We treat data filtered with alternative procedures as contaminated proxies of the relevant model-based quantities and estimate structural and nonstructural parameters jointly using an unobservable-components structure. We employ simulated data to illustrate the properties of the procedure and compare our estimates with those obtained when just one filter is used. We revisit the role of money in the transmission of monetary business cycles. |
Keywords: | DSGE models, Filters, Structural estimation, Business cycles |
JEL: | E32 C32 |
Date: | 2009–01 |
URL: | http://d.repec.org/n?u=RePEc:upf:upfgen:1135&r=ecm |
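One way to formalize the "contaminated proxies" idea, under assumed notation (x_{jt} the cyclical series produced by filter j, c_t the model-based cyclical quantity):

\[
  x_{jt} = \lambda_j c_t + u_{jt}, \qquad j = 1, \dots, J,
\]

where u_{jt} is filter-specific measurement error. Stacking the J filtered versions of each observable yields the unobservable-components structure within which structural and nonstructural parameters are estimated jointly.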
By: | Bernd Droge; Deniz Dilan Karaman Örsal |
Abstract: | The purpose of this paper is to propose a new likelihood-based panel cointegration test in the presence of a linear time trend in the data generating process. This new test is an extension of the likelihood ratio (LR) test of Saikkonen & Lütkepohl (2000) for trend-adjusted data to the panel data framework, and is called the panel SL test. The idea is first to take the average of the individual LR (trace) statistics over the cross-sections and then to standardize the test statistic with the appropriate asymptotic moments. Under the null hypothesis, this standardized statistic has a limiting normal distribution as the number of time periods (T) and the number of cross-sections (N) tend to infinity sequentially. In addition to the approximation based on asymptotic moments, a second approximation approach involving the moments from a vector autoregressive process of order one is also introduced. By means of a Monte Carlo study, the finite-sample size and size-adjusted power properties of the test are investigated. The test shows reasonable size as T and N increase and has high power even in small samples. |
Keywords: | Panel Cointegration Test, Likelihood Ratio, Time Trend, Monte Carlo Study |
JEL: | C33 C12 C15 |
Date: | 2009–01 |
URL: | http://d.repec.org/n?u=RePEc:hum:wpaper:sfb649dp2009-005&r=ecm |
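The averaging and standardization steps can be written compactly; E(LR) and Var(LR) denote the appropriate asymptotic (or VAR(1)-based) moments, and the notation is assumed rather than taken from the paper:

\[
  \bar{\Lambda}_{NT} = \frac{1}{N} \sum_{i=1}^{N} LR_{iT}, \qquad
  Z_{NT} = \sqrt{N}\,\frac{\bar{\Lambda}_{NT} - E(LR)}{\sqrt{\operatorname{Var}(LR)}}
  \;\xrightarrow{d}\; N(0,1),
\]

as T tends to infinity followed by N, sequentially.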
By: | Nikolay Gospodinov (Concordia University); Taisuke Otsu (Yale University) |
Abstract: | This paper investigates statistical properties of the local GMM (LGMM) estimator for some time series models defined by conditional moment restrictions. First, we consider Markov processes with possible conditional heteroskedasticity of unknown form and establish the consistency, asymptotic normality, and semi-parametric efficiency of the estimator. Second, inspired by simulation results showing that the LGMM estimator has a significantly smaller bias than the OLS estimator, we undertake a higher-order asymptotic expansion and analyze the bias properties of the LGMM estimator. The structure of the asymptotic expansion of the LGMM estimator reveals an interesting contrast with the OLS estimator that helps to explain the bias reduction in the LGMM estimator. The practical importance of these findings is evaluated in terms of a bond and option pricing exercise based on a diffusion model for the spot interest rate. |
Keywords: | Conditional moment restrictions; Local GMM; Higher-order expansion; Conditional heteroskedasticity |
JEL: | C13 C22 G12 |
Date: | 2008–12 |
URL: | http://d.repec.org/n?u=RePEc:crd:wpaper:08010&r=ecm |
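A common textbook formulation of the local GMM idea for conditional moment restrictions E[\rho(z_t, \theta_0) \mid x_t] = 0 is sketched below; the paper's exact objective may differ:

\[
  \hat{m}(x, \theta) =
  \frac{\sum_{t=1}^{T} K\!\left(\frac{x_t - x}{h}\right) \rho(z_t, \theta)}
       {\sum_{t=1}^{T} K\!\left(\frac{x_t - x}{h}\right)}, \qquad
  \hat{\theta} = \arg\min_{\theta} \sum_{t=1}^{T}
  \hat{m}(x_t, \theta)'\, \hat{W}_t\, \hat{m}(x_t, \theta),
\]

where K is a kernel, h a bandwidth, and \hat{W}_t a weighting matrix. Localizing the moment conditions at each observed x_t is what delivers the efficiency gains over unconditional GMM.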
By: | Davy Paindaveine |
Abstract: | This paper proposes several extensions of the concept of runs to the multivariate setup, and studies the resulting tests of multivariate randomness against serial dependence. Two types of multivariate runs are defined: (i) an elliptical extension of the spherical runs proposed by Marden (1999), and (ii) an original concept of matrix-valued runs. The resulting runs tests themselves exist in various versions, one of which is a function only of the number of data-based hyperplanes separating pairs of observations. All proposed multivariate runs tests are affine-invariant and highly robust: in particular, they allow for heteroskedasticity and do not require any moment assumption. Their limiting distributions are derived under the null hypothesis and under sequences of local vector ARMA alternatives. Asymptotic relative efficiencies with respect to Gaussian Portmanteau tests are computed, and show that, while Marden-type runs tests suffer severe consistency problems, tests based on matrix-valued runs perform uniformly well for moderate-to-large dimensions. A Monte Carlo study confirms the theoretical results and investigates the robustness properties of the proposed procedures. A real data example is also treated, and shows that combining both types of runs tests may provide some insight into the reason why rejection occurs; hence Marden-type runs tests, despite their lack of consistency, are also of interest for practical purposes. |
Keywords: | elliptical distributions, interdirections, local asymptotic normality, multivariate signs, shape matrix |
Date: | 2009 |
URL: | http://d.repec.org/n?u=RePEc:eca:wpaper:2009_002&r=ecm |
By: | S. Sanfelici; S. Ogawa |
Abstract: | We are concerned with the problem of parameter estimation in finance, namely the estimation of the spot volatility in the presence of so-called microstructure noise. In [16] a scheme based on the technique of multi-step regularization was presented, and it was shown that this scheme can work in a real-time manner. However, the main drawback of this scheme is that it requires a large amount of observation data. The aim of the present paper is to introduce an improvement of the scheme such that the modified estimator can work more efficiently and with a data set of smaller size. The technical aspects of the implementation of the scheme and its performance on simulated data are analyzed. The proposed scheme is tested against other estimators, namely a realized-volatility-type estimator, the Fourier estimator and two kernel estimators. |
Keywords: | Spot volatility, Nonparametric estimation, Multi-step regularization, Microstructure |
JEL: | G10 C14 C22 |
Date: | 2008 |
URL: | http://d.repec.org/n?u=RePEc:par:dipeco:2008-me02&r=ecm |
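As a point of reference for the comparison above, here is a minimal sketch of a kernel-smoothed, realized-volatility-type spot variance estimator; the Gaussian kernel and all names are illustrative assumptions, not the paper's scheme.

import numpy as np

def spot_variance(times, prices, t0, bandwidth):
    # kernel-weighted sum of squared log returns around t0,
    # normalized by the kernel-weighted elapsed time
    r = np.diff(np.log(prices))
    mid = 0.5 * (times[1:] + times[:-1])   # time stamp of each increment
    w = np.exp(-0.5 * ((mid - t0) / bandwidth) ** 2)
    dt = np.diff(times)
    return (w @ r**2) / (w @ dt)

# Example on simulated constant-volatility data with sigma = 0.2.
rng = np.random.default_rng(0)
n, sigma = 10_000, 0.2
t = np.linspace(0.0, 1.0, n + 1)
r = sigma * np.sqrt(1.0 / n) * rng.standard_normal(n)
p = 100.0 * np.exp(np.concatenate(([0.0], np.cumsum(r))))
print(spot_variance(t, p, t0=0.5, bandwidth=0.05))  # near sigma^2 = 0.04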
By: | Laurent Ferrara (Centre d'Economie de la Sorbonne et Banque de France); Dominique Guegan (Paris School of Economics - Centre d'Economie de la Sorbonne); Patrick Rakotomarolahy (Centre d'Economie de la Sorbonne) |
Abstract: | This paper formalizes the process of forecasting unbalanced monthly data sets in order to obtain robust nowcasts and forecasts of the quarterly GDP growth rate through semi-parametric modelling. This innovative approach relies on the use of non-parametric methods, based on nearest-neighbor and radial basis function approaches, to forecast the monthly variables involved in the parametric modelling of GDP using bridge equations. A real-time experiment is carried out on euro area vintage data in order to anticipate, with a lead ranging from six months to one month, the GDP flash estimate for the whole zone. |
Keywords: | Euro area GDP, real-time nowcasting, forecasting, non-parametric methods. |
JEL: | C22 C53 E32 |
Date: | 2008–11 |
URL: | http://d.repec.org/n?u=RePEc:mse:cesdoc:b08082&r=ecm |
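A minimal sketch of the nearest-neighbor step described above, forecasting a monthly series one step ahead; the embedding dimension m, the number of neighbors k and the uniform neighbor weights are illustrative assumptions.

import numpy as np

def knn_forecast(x, m=6, k=5):
    # one-step-ahead forecast: average of the successors of the k delay
    # vectors most similar to the last m observations
    last = x[-m:]
    dists = [np.linalg.norm(x[i:i + m] - last) for i in range(len(x) - m)]
    nearest = np.argsort(dists)[:k]
    return np.mean([x[i + m] for i in nearest])

# Example: a noisy seasonal monthly series, 20 years long.
rng = np.random.default_rng(2)
months = np.arange(240)
series = np.sin(2 * np.pi * months / 12) + 0.1 * rng.standard_normal(240)
print(knn_forecast(series))   # close to the next point of the seasonal cycle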
By: | Hallin, M.; Akker, R. van den; Werker, B.J.M. (Tilburg University, Center for Economic Research) |
Abstract: | AMS 1980 subject classification: 62G10 and 62G20. |
Keywords: | Dickey-Fuller test; local asymptotic normality. |
JEL: | C12 C22 |
Date: | 2009 |
URL: | http://d.repec.org/n?u=RePEc:dgr:kubcen:20092&r=ecm |
By: | Don Harding (La Trobe University); Adrian Pagan (QUT) |
Abstract: | Macroeconometric and financial researchers often use secondary or constructed binary random variables that differ in terms of their statistical properties from the primary random variables used in micro-econometric studies. One important difference between primary and secondary binary variables is that, while the former are, in many instances, independently distributed (i.d.), the latter are rarely i.d. We show how popular rules for constructing the binary states interact with the stochastic processes of the variables they are constructed from, so that the binary states need to be treated as Markov processes. Consequently, one needs to recognize this dependence when performing analyses with the binary variables, and it is not valid to adopt a model such as a static Probit that fails to do so. Moreover, these binary variables are often censored, in that they are constructed in such a way that sequences of them possess the same sign. Such censoring imposes restrictions upon the DGP of the binary states and creates difficulties if one tries to utilize a dynamic Probit model with them. Given this, we describe methods for modelling these variables that both respect their Markov-process nature and explicitly deal with any censoring constraints. An application is provided that investigates the relation between the business cycle and the yield spread. |
Keywords: | Business cycle, binary variable, Markov process, Probit model, yield curve |
JEL: | C22 C53 E32 E37 |
Date: | 2009–01–21 |
URL: | http://d.repec.org/n?u=RePEc:qut:auncer:2009_39&r=ecm |
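The point about dependence is easy to see in code: a minimal sketch that fits a Markov transition matrix to a constructed 0/1 state series, in contrast to the single unconditional probability a static model implicitly uses; the example series is illustrative.

import numpy as np

def transition_matrix(s):
    # MLE of P[i, j] = Pr(s_t = j | s_{t-1} = i) for a 0/1 state series
    counts = np.zeros((2, 2))
    for prev, curr in zip(s[:-1], s[1:]):
        counts[prev, curr] += 1
    return counts / counts.sum(axis=1, keepdims=True)

# Long runs of identical states, as produced by censored dating rules.
s = np.array([0] * 20 + [1] * 8 + [0] * 30 + [1] * 6 + [0] * 16)
print(transition_matrix(s))   # diagonal near 1: strong state dependence
print(s.mean())               # the single number a static model would use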
By: | Damba Lkhagvasuren (Concordia University); Ragchaasuren Galindev (Queens University Belfast) |
Abstract: | The finite-state Markov-chain approximation method developed by Tauchen (1986) and Tauchen and Hussey (1991) is widely used in economics, finance and econometrics for solving functional equations in which state variables follow an autoregressive process. For highly persistent processes, the method requires a large number of discrete values for the state variables to produce close approximations, which leads to an undesirable reduction in computational speed, especially in the multidimensional case. This paper proposes an alternative method of discretizing vector autoregressions. The method works well as an approximation and remains numerically efficient over a wide range of the parameter space. |
Keywords: | Finite State Markov-Chain Approximation, Transition Matrix, Numerical Methods, VAR |
JEL: | C15 C63 |
Date: | 2008–09 |
URL: | http://d.repec.org/n?u=RePEc:crd:wpaper:08012&r=ecm |
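For reference, a minimal sketch of the Tauchen (1986) discretization of a scalar AR(1) y' = rho*y + eps, eps ~ N(0, sigma^2), the method whose cost in the highly persistent case motivates the alternative above; the grid width m = 3 is a common default, not the paper's choice.

import numpy as np
from scipy.stats import norm

def tauchen(n, rho, sigma, m=3.0):
    # grid of n points spanning +/- m stationary standard deviations
    std_y = sigma / np.sqrt(1.0 - rho ** 2)
    grid = np.linspace(-m * std_y, m * std_y, n)
    step = grid[1] - grid[0]
    P = np.empty((n, n))
    for i in range(n):
        mu = rho * grid[i]   # conditional mean given current state i
        # interior cells: normal mass between adjacent midpoints
        P[i, :] = (norm.cdf((grid + step / 2 - mu) / sigma)
                   - norm.cdf((grid - step / 2 - mu) / sigma))
        # boundary cells absorb the tails
        P[i, 0] = norm.cdf((grid[0] + step / 2 - mu) / sigma)
        P[i, -1] = 1.0 - norm.cdf((grid[-1] - step / 2 - mu) / sigma)
    return grid, P

grid, P = tauchen(n=9, rho=0.95, sigma=0.1)
print(P.sum(axis=1))   # every row sums to one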
By: | David E. Giles (Department of Economics, University of Victoria); Hui Feng (Department of Economics, Business & Mathematics, King's College, University of Western Ontario) |
Abstract: | We derive analytic expressions for the biases, to O(n^{-1}), of the maximum likelihood estimators of the parameters of the generalized Pareto distribution. Using these expressions to bias-correct the estimators is found to be extremely effective in terms of bias reduction, and generally results in some reduction in relative mean squared error. The analytic bias-corrected estimators are also shown to be dramatically superior to the alternative of bias-correction via the bootstrap. |
Keywords: | Bias reduction; Extreme values; Generalized Pareto distribution; Peaks over threshold |
JEL: | C13 C16 C41 C46 |
Date: | 2009–01–23 |
URL: | http://d.repec.org/n?u=RePEc:vic:vicewp:0902&r=ecm |
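The generic form of an analytic (Cox-Snell-type) bias correction of this kind is as follows, with b(\cdot) the function the paper derives explicitly for the generalized Pareto parameters:

\[
  E(\hat{\theta} - \theta) = \frac{b(\theta)}{n} + O(n^{-2}), \qquad
  \tilde{\theta} = \hat{\theta} - \frac{b(\hat{\theta})}{n},
\]

so that the corrected estimator satisfies E(\tilde{\theta} - \theta) = O(n^{-2}).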
By: | D'Agostino, Antonello (Central Bank and Financial Services Authority of Ireland); McQuinn, Kieran (Central Bank and Financial Services Authority of Ireland); O'Brien, Derry (Central Bank and Financial Services Authority of Ireland) |
Abstract: | In this paper we present "now-casts" of Irish GDP using timely data from a panel of 41 different variables. The approach seeks to resolve two issues which commonly confront forecasters of GDP: how to make parsimonious use of the many different series that can potentially influence GDP, and how to reconcile the within-quarter release of many of these series with the quarterly estimates of GDP. The now-casts in this paper are generated by, first, using dynamic factor analysis to extract a common factor from the panel data set and, second, using bridging equations to relate the monthly data to the quarterly GDP estimates. We conduct an out-of-sample forecasting simulation exercise in which the results of the now-casting exercise are compared with those of a standard benchmark model. |
Date: | 2008–11 |
URL: | http://d.repec.org/n?u=RePEc:cbi:wpaper:9/rt/08&r=ecm |
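A minimal sketch of the two steps described above, under strong simplifying assumptions (one factor extracted by principal components, a single bridge regression, months forming complete quarters); all names and the simulated inputs are illustrative.

import numpy as np

def common_factor(panel):
    # first principal component of a standardized (T_months x N_series) panel
    z = (panel - panel.mean(axis=0)) / panel.std(axis=0)
    _, _, vt = np.linalg.svd(z, full_matrices=False)
    return z @ vt[0]

def bridge_nowcast(factor, gdp_growth):
    # bridge equation: OLS of quarterly GDP growth on the quarterly mean
    # of the monthly factor, then fit the latest (unpublished) quarter
    fq = factor.reshape(-1, 3).mean(axis=1)
    X = np.column_stack([np.ones(len(gdp_growth)), fq[:len(gdp_growth)]])
    beta, *_ = np.linalg.lstsq(X, gdp_growth, rcond=None)
    return beta[0] + beta[1] * fq[-1]

rng = np.random.default_rng(4)
panel = rng.standard_normal((120, 41))           # 10 years of 41 monthly series
gdp = rng.standard_normal(39)                    # 39 published quarters
print(bridge_nowcast(common_factor(panel), gdp)) # now-cast of quarter 40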
By: | Jesús Fernández-Villaverde |
Abstract: | In this paper, I review the literature on the formulation and estimation of dynamic stochastic general equilibrium (DSGE) models with a special emphasis on Bayesian methods. First, I discuss the evolution of DSGE models over the last couple of decades. Second, I explain why the profession has decided to estimate these models using Bayesian methods. Third, I briefly introduce some of the techniques required to compute and estimate these models. Fourth, I illustrate the techniques under consideration by estimating a benchmark DSGE model with real and nominal rigidities. I conclude by offering some pointers for future research. |
JEL: | C11 C13 E10 |
Date: | 2009–01 |
URL: | http://d.repec.org/n?u=RePEc:nbr:nberwo:14677&r=ecm |
By: | Dellino, G.; Kleijnen, J.P.C.; Meloni, C. (Tilburg University, Center for Economic Research) |
Abstract: | Optimization of simulated systems is tackled by many methods, but most methods assume known environments. This article, however, develops a 'robust' methodology for uncertain environments. This methodology uses Taguchi's view of the uncertain world, but replaces his statistical techniques by Response Surface Methodology (RSM). George Box originated RSM, and Douglas Montgomery recently extended RSM to robust optimization of real (non-simulated) systems. We combine Taguchi's view with RSM for simulated systems, and apply the resulting methodology to classic Economic Order Quantity (EOQ) inventory models. Our results demonstrate that in general robust optimization requires order quantities that differ from the classic EOQ. |
Keywords: | Pareto frontier;bootstrap;Latin hypercube sampling |
Date: | 2008 |
URL: | http://d.repec.org/n?u=RePEc:dgr:kubcen:200869&r=ecm |
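A minimal sketch of the contrast the article draws, using the classic EOQ cost function and a Taguchi-style mean-plus-spread criterion over demand scenarios; the scenario values and the particular robustness criterion are illustrative assumptions.

import numpy as np

def total_cost(Q, D, K, h):
    # annual ordering cost K*D/Q plus holding cost h*Q/2
    return K * D / Q + h * Q / 2.0

K, h = 100.0, 2.0                       # ordering and holding cost parameters
D0 = 1000.0                             # nominal annual demand
eoq = np.sqrt(2.0 * K * D0 / h)         # classic EOQ at the nominal demand

scenarios = np.array([600.0, 1000.0, 1400.0])    # uncertain demand scenarios
grid = np.linspace(0.5 * eoq, 1.5 * eoq, 201)
costs = np.array([[total_cost(Q, D, K, h) for D in scenarios] for Q in grid])
# Taguchi-style robust choice: trade off mean cost against its spread
score = costs.mean(axis=1) + costs.std(axis=1)
print(f"classic EOQ: {eoq:.1f}  robust Q: {grid[np.argmin(score)]:.1f}")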
By: | Ozlem Tasseven (Okan University, banking and Finance Department, Istanbul Turkey) |
Abstract: | In this paper the HEGY testing procedure (Hylleberg et al. 1990) for analyzing seasonal unit roots is re-examined by allowing for seasonal mean shifts with exogenous break points. Using Monte Carlo experiments, the distribution of the HEGY and extended HEGY tests for seasonal unit roots subject to mean shifts and the small-sample behavior of the test statistics are investigated. Based on an empirical analysis of conventional money demand relationships in the Turkish economy, our results indicate that seasonal unit roots appear for the GDP deflator, real M2 and the expected-inflation variables, while seasonal unit roots at the annual frequency seem to disappear for real M1 balances once possible structural changes in one or more seasons in the 1994 and 2001 crisis years are taken into account. |
Keywords: | HEGY seasonal unit root test, Deterministic seasonality, Structural breaks, Money demand, Turkish economy |
JEL: | C01 C15 C51 C88 E41 |
Date: | 2008–09 |
URL: | http://d.repec.org/n?u=RePEc:voj:wpaper:200843&r=ecm |
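For quarterly data, the HEGY auxiliary regression referred to above takes the textbook form (notation assumed; the mean-shift extension adds break dummies to the deterministic term \mu_t):

\begin{align*}
  \Delta_4 y_t &= \pi_1 y_{1,t-1} + \pi_2 y_{2,t-1} + \pi_3 y_{3,t-2}
                  + \pi_4 y_{3,t-1} + \mu_t + \varepsilon_t,\\
  y_{1,t} &= (1 + L + L^2 + L^3)\, y_t, \quad
  y_{2,t} = -(1 - L + L^2 - L^3)\, y_t, \quad
  y_{3,t} = -(1 - L^2)\, y_t,
\end{align*}

where \pi_1 = 0 corresponds to the zero-frequency unit root, \pi_2 = 0 to the semi-annual root, and \pi_3 = \pi_4 = 0 to the annual-frequency roots.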
By: | Mishra, SK |
Abstract: | The classical canonical correlation analysis is extremely greedy in maximizing the squared correlation between two sets of variables. As a result, if one of the variables in dataset 1 is very highly correlated with another variable in dataset 2, the canonical correlation will be very high irrespective of the correlations among the rest of the variables in the two datasets. We propose here an alternative measure of association between two sets of variables that does not permit the greed of a select few variables in the datasets to prevail upon their fellow variables so much as to deprive the latter of contributing to the representative variables, or canonical variates. Our proposed Representation-Constrained Canonical Correlation (RCCCA) Analysis has the Classical Canonical Correlation Analysis (CCCA) at one end (λ=0) and the Classical Principal Component Analysis (CPCA) at the other (as λ tends to be very large). In between, it gives a compromise solution. By a proper choice of λ, one can avoid the hijacking of the representation of two datasets by a lone couple of highly correlated variables across those datasets. This advantage of the RCCCA over the CCCA deserves serious attention from researchers using statistical tools for data analysis. |
Keywords: | Representation; constrained; canonical; correlation; principal components; variates; global optimization; particle swarm; ordinal variables; computer program; FORTRAN |
JEL: | C13 C43 C63 C61 C89 |
Date: | 2009–01–22 |
URL: | http://d.repec.org/n?u=RePEc:pra:mprapa:12948&r=ecm |
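A minimal sketch of the λ=0 end of the spectrum, classical canonical correlation computed from the SVD of the whitened cross-covariance, which also reproduces the "greed" the abstract objects to; the RCCCA penalty itself is not implemented here.

import numpy as np

def first_canonical_corr(X, Y, ridge=1e-8):
    # largest singular value of the whitened cross-covariance matrix
    Xc, Yc = X - X.mean(axis=0), Y - Y.mean(axis=0)
    Sxx = Xc.T @ Xc + ridge * np.eye(X.shape[1])   # small ridge for stability
    Syy = Yc.T @ Yc + ridge * np.eye(Y.shape[1])
    Lx, Ly = np.linalg.cholesky(Sxx), np.linalg.cholesky(Syy)
    M = np.linalg.solve(Lx, Xc.T @ Yc) @ np.linalg.inv(Ly).T
    return np.linalg.svd(M, compute_uv=False)[0]

# One highly correlated pair drives the canonical correlation to ~1,
# regardless of the remaining columns in either dataset.
rng = np.random.default_rng(3)
X = rng.standard_normal((500, 3))
Y = rng.standard_normal((500, 3))
Y[:, 0] = X[:, 0] + 0.01 * rng.standard_normal(500)
print(first_canonical_corr(X, Y))   # close to 1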
By: | María José Rodríguez; Esther Ruiz |
Abstract: | In this paper, we compare the statistical properties of some of the most popular GARCH models with leverage effect when their parameters satisfy the positivity, stationarity and finite fourth-order moment restrictions. We show that the EGARCH specification is the most flexible, while the GJR model may have important limitations when restricted to have finite kurtosis. On the other hand, we show empirically that the conditional standard deviations estimated by the TGARCH and EGARCH models are almost identical and very similar to those estimated by the APARCH model. However, the estimates of the QGARCH and GJR models differ among themselves and with respect to the other three specifications. |
Keywords: | EGARCH, GJR, QGARCH, TGARCH, APARCH |
JEL: | C22 |
Date: | 2009–01 |
URL: | http://d.repec.org/n?u=RePEc:cte:wsrepe:ws090301&r=ecm |
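For reference, two of the specifications compared above in their textbook parameterizations (the paper's notation may differ); with z_t = \varepsilon_t / \sigma_t:

\begin{align*}
  \text{GJR:} \quad & \sigma_t^2 = \omega
    + \big(\alpha + \gamma\,\mathbf{1}\{\varepsilon_{t-1} < 0\}\big)\,
      \varepsilon_{t-1}^2 + \beta\,\sigma_{t-1}^2,\\
  \text{EGARCH:} \quad & \log \sigma_t^2 = \omega + \beta \log \sigma_{t-1}^2
    + \alpha\big(|z_{t-1}| - E|z_{t-1}|\big) + \gamma\, z_{t-1},
\end{align*}

where the leverage effect corresponds to \gamma > 0 in the GJR model and \gamma < 0 in EGARCH: negative shocks raise volatility by more than positive shocks of the same size.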