NEP: New Economics Papers on Econometrics |
By: | Ladislav Kristoufek |
Abstract: | We examine the performance of six estimators of power-law cross-correlations -- the detrended cross-correlation analysis, the detrending moving-average cross-correlation analysis, the height cross-correlation analysis, the averaged periodogram estimator, the cross-periodogram estimator and the local cross-Whittle estimator -- under heavy-tailed distributions. The selection of estimators allows us to separate them into time domain and frequency domain estimators. By varying the characteristic exponent of the $\alpha$-stable distributions, which controls the tail behavior, we report several interesting findings. First, the frequency domain estimators are practically unaffected by heavy tails in terms of bias. Second, the time domain estimators are upward biased under heavy tails, but they have lower variance than the frequency domain estimators for short series. Third, the most appropriate estimator depends on the distributional properties and the length of the analyzed series. In addition, we discuss the implications of these results for empirical applications and offer theoretical explanations. |
Date: | 2016–02 |
URL: | http://d.repec.org/n?u=RePEc:arx:papers:1602.05385&r=ecm |
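
As a concrete reference for the first estimator named in the abstract, here is a minimal Python sketch of the detrended cross-correlation analysis (DCCA) fluctuation function; the toy series, scale grid and linear detrending order are illustrative assumptions, not the authors' setup:

import numpy as np

def dcca_fluctuation(x, y, scales, order=1):
    """Detrended cross-covariance F2_xy(s) for each box size s."""
    X = np.cumsum(x - x.mean())            # integrated profiles
    Y = np.cumsum(y - y.mean())
    F2 = []
    for s in scales:
        covs = []
        for b in range(len(X) // s):
            seg = slice(b * s, (b + 1) * s)
            t = np.arange(s)
            # residuals from a polynomial trend fitted within each box
            rx = X[seg] - np.polyval(np.polyfit(t, X[seg], order), t)
            ry = Y[seg] - np.polyval(np.polyfit(t, Y[seg], order), t)
            covs.append(np.mean(rx * ry))
        F2.append(np.mean(covs))
    return np.array(F2)

rng = np.random.default_rng(0)
x = rng.standard_normal(10_000)
y = 0.6 * x + 0.8 * rng.standard_normal(10_000)   # correlated, no long memory
scales = np.unique(np.logspace(1, 3, 15).astype(int))
F2 = dcca_fluctuation(x, y, scales)
# F2(s) ~ s^(2*lambda); half the log-log slope estimates lambda (~0.5 here)
lam = np.polyfit(np.log(scales), np.log(np.abs(F2)), 1)[0] / 2
print(round(lam, 2))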
By: | Papa Ousmane Cissé (Centre d'Economie de la Sorbonne et Université Gaston Berger - LERSTAD); Abdou Kâ Diongue (Université Gaston Berger - LERSTAD); Dominique Guegan (Centre d'Economie de la Sorbonne) |
Abstract: | In this paper, we introduce a new model, called the Fractionally Integrated Separable Spatial Autoregressive process with Seasonality and denoted Seasonal FISSAR, for two-dimensional spatial data. We focus on the class of separable spatial models whose correlation structure can be expressed as a product of correlations. This new specification allows us to take into account the seasonality patterns observed in spatial data. We investigate the properties of the new model, providing stationarity conditions and explicit expressions for the autocovariance function and the spectral density function. We establish the asymptotic behaviour of the spectral density function near the seasonal frequencies and perform simulations to illustrate the behaviour of the model. |
Keywords: | seasonality; spatial short memory; seasonal long memory; two-dimensional data; separable process; spatial stationary process; spatial autocovariance |
JEL: | C02 C21 C51 C52 |
Date: | 2016–01 |
URL: | http://d.repec.org/n?u=RePEc:mse:cesdoc:16013&r=ecm |
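
The basic building block of such fractionally integrated models is the filter $(1-B)^d$; below is a minimal one-dimensional Python sketch of its binomial-expansion weights (the Seasonal FISSAR model composes filters of this kind across the two spatial dimensions and the seasonal lags; the toy data are illustrative):

import numpy as np

def frac_diff_weights(d, n):
    """Coefficients of (1 - B)^d: pi_0 = 1, pi_k = pi_{k-1} * (k - 1 - d) / k."""
    w = np.ones(n)
    for k in range(1, n):
        w[k] = w[k - 1] * (k - 1 - d) / k
    return w

def frac_diff(x, d):
    """Apply (1 - B)^d to a series using all available past observations."""
    w = frac_diff_weights(d, len(x))
    return np.array([w[: t + 1][::-1] @ x[: t + 1] for t in range(len(x))])

rng = np.random.default_rng(1)
eps = rng.standard_normal(1000)
x = frac_diff(eps, -0.3)          # (1 - B)^{-0.3} eps: a long-memory series
print(frac_diff(x, 0.3)[-5:])     # re-differencing recovers the white noise
print(eps[-5:])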
By: | Anusha (Indira Gandhi Institute of Development Research) |
Abstract: | This paper examines the statistical reliability of univariate filters for estimating the trend in leading indicators of cyclical changes. For this purpose, three measures are used: mean square error for quantitative accuracy, minimal revisions with additional data for statistical accuracy, and directional accuracy to capture the property of signaling cyclical movements. Our focus is on the widely used Hodrick-Prescott and Henderson filters and their generalizations to penalized splines and RKHS (Reproducing Kernel Hilbert Space) embeddings, respectively. The trends fitted by the filters are compared on Indian and US industrial production data and a simulated series. We find that although Henderson smoothers based on RKHS perform better than the classical filter, they do not outperform spline-based methods on the selected criteria for Indian macroeconomic time series. Overall, the findings suggest that in cases when penalized splines converge in quasi real time, they are better than the HP filter on all three criteria. |
Keywords: | Hodrick-Prescott filter, Penalized splines, Henderson smoothers in RKHS, end-of-sample reliability, leading indicators |
JEL: | C32 E37 |
URL: | http://d.repec.org/n?u=RePEc:ind:igiwpp:2015-030&r=ecm |
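
For reference, the Hodrick-Prescott filter is available in statsmodels; the sketch below illustrates, on a toy monthly series, the end-of-sample revision issue that the paper's second criterion measures (the series and smoothing parameter are illustrative):

import numpy as np
from statsmodels.tsa.filters.hp_filter import hpfilter

rng = np.random.default_rng(2)
t = np.arange(240)                 # 20 years of toy monthly "production" data
y = 0.02 * t + np.sin(2 * np.pi * t / 48) + 0.3 * rng.standard_normal(240)

cycle, trend = hpfilter(y, lamb=129600)        # standard lambda for monthly data

# end-of-sample reliability: drop the last year, refit, and compare the trend
# estimate at the old endpoint with its full-sample revision
cycle_s, trend_s = hpfilter(y[:-12], lamb=129600)
print(abs(trend_s[-1] - trend[227]))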
By: | García, A. |
Abstract: | Existing inference procedures to perform counterfactual decomposition of the difference between distributional features, applicable when data is fully observed, are not suitable for censored outcomes. This may explain the lack of counterfactual analyses using target variables related to duration outcomes, typically observed under right censoring. For instance, there are many studies performing counterfactual decomposition of the gender wage gaps, but very few on gender unemployment duration gaps. We provide an Oaxaca-Blinder type decomposition method of the mean for censored data. Consistent estimation of the decomposition components is based on a prior estimator of the joint distribution of duration and covariates under suitable restrictions on the censoring mechanism. To decompose other distributional features, such as the median or the Gini coefficient, we propose an inferential method for the counterfactual decomposition by introducing restrictions on the functional form of the conditional distribution of duration given covariates. We provide formal justification for asymptotic inference and study the finite sample performance through Monte Carlo experiments. Finally, we apply the proposed methodology to the analysis of unemployment duration gaps in Spain. This study suggests that factors beyond the workers' socioeconomic characteristics play a relevant role in explaining the difference between several unemployment duration distribution features such as the mean, the probability of being long term unemployed and the Gini coefficient. |
Keywords: | Oaxaca-Blinder Decomposition, Right Censoring, Counterfactual Outcomes, Duration Data, Hazard Models, Unemployment Duration, Gender Gaps. |
JEL: | C14 C24 C41 J64 |
Date: | 2016–01–18 |
URL: | http://d.repec.org/n?u=RePEc:col:000092:014186&r=ecm |
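
For the fully observed benchmark case, the classical Oaxaca-Blinder decomposition splits a mean gap into a composition ("explained") part and a coefficient ("unexplained") part. A minimal Python sketch on simulated data follows; handling right censoring, the paper's actual contribution, is not attempted here:

import numpy as np

def ols(X, y):
    return np.linalg.lstsq(X, y, rcond=None)[0]

rng = np.random.default_rng(3)
n = 5000
# two groups differing in both covariate means and coefficients
Xa = np.column_stack([np.ones(n), rng.normal(1.0, 1.0, n)])
Xb = np.column_stack([np.ones(n), rng.normal(0.5, 1.0, n)])
ya = Xa @ np.array([1.0, 0.8]) + rng.standard_normal(n)
yb = Xb @ np.array([0.5, 0.6]) + rng.standard_normal(n)

ba, bb = ols(Xa, ya), ols(Xb, yb)
gap = ya.mean() - yb.mean()
explained = (Xa.mean(0) - Xb.mean(0)) @ bb      # composition (covariate) part
unexplained = Xa.mean(0) @ (ba - bb)            # coefficient (structure) part
print(gap, explained + unexplained)             # the two parts sum to the gap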
By: | Chambers, Marcus J |
Abstract: | This paper analyses the effects of sampling frequency on detrending methods based on an underlying continuous time representation of the process of interest. Such an approach has the advantage of allowing for the explicit - and different - treatment of the ways in which stock and flow variables are actually observed. Some general results are provided before the focus turns to three particular detrending methods that have found widespread use in the conduct of tests for a unit root - GLS detrending, OLS detrending, and first differencing - each corresponding to a particular value of the generic detrending parameter. In addition, three different scenarios concerning sampling frequency and data span, in each of which the number of observations increases, are considered for each detrending method. The limit properties of the detrending coefficient estimates, as well as an invariance principle for the detrended variable, are derived. An example of the application of the techniques to testing for a unit root, using GLS detrending on an intercept, is provided, and the results of a simulation exercise analysing the size and power properties of the test in the three sampling scenarios are reported. |
Keywords: | Continuous time; detrending; sampling frequency |
Date: | 2016 |
URL: | http://d.repec.org/n?u=RePEc:esx:essedp:16062&r=ecm |
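
As a reminder of the GLS detrending step used in the paper's unit root example, here is a minimal Python sketch of Elliott-Rothenberg-Stock quasi-differencing on an intercept (cbar = -7 is the standard noncentrality choice for the intercept-only case; the toy data are illustrative):

import numpy as np

def gls_detrend_intercept(y, cbar=-7.0):
    """GLS-detrend on an intercept: quasi-difference with abar = 1 + cbar/T,
    estimate the mean by OLS on the quasi-differenced data, then subtract it."""
    T = len(y)
    abar = 1.0 + cbar / T
    yq = np.r_[y[0], y[1:] - abar * y[:-1]]       # quasi-differenced y
    zq = np.r_[1.0, np.full(T - 1, 1.0 - abar)]   # quasi-differenced constant
    mu = (zq @ yq) / (zq @ zq)
    return y - mu

rng = np.random.default_rng(4)
y = 2.0 + np.cumsum(rng.standard_normal(500))     # unit root plus intercept
yd = gls_detrend_intercept(y)   # feed into an ADF regression without deterministics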
By: | Wang, Xuexin |
Abstract: | In this paper we propose a new consistent conditional moment test, which synergizes Bierens' approach with the consistent test of overidentifying restrictions. It relies on a transformation-based empirical process combining both approaches. This new empirical process enjoys several advantages. First, it is not affected by uncertainty from the parameter estimation. Moreover, this estimation-effect-free property requires a much less restrictive rate condition than the consistent test of overidentifying restrictions alone. Furthermore, the integrated conditional moment (ICM) test based on the new empirical process has power against Pitman local alternatives. We prove, under some regularity conditions, the admissibility of the ICM test based on this transformation-based empirical process when heteroskedasticity of unknown form is present, extending the result of Bierens and Ploberger (1997). The new consistent test also allows us to propose a much simpler bootstrap procedure than the standard ones. A version of the Bierens (1990) test based on the new empirical process is also discussed, and its asymptotic properties are analyzed. Monte Carlo simulations show that the Bierens (1990) test based on the new empirical process is more powerful against a large number of alternatives when heteroskedasticity of unknown form is present. |
Keywords: | Consistent Conditional Moment Test; Consistent Test of Overidentifying Restrictions; ICM Test; Admissibility |
JEL: | C12 C21 |
Date: | 2015–07–01 |
URL: | http://d.repec.org/n?u=RePEc:pra:mprapa:69005&r=ecm |
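
For orientation, the classical ICM statistic integrates a weighted empirical process of residuals over a nuisance-parameter space. The Python sketch below computes that baseline statistic with a bounded transform, a complex exponential weight and a uniform weighting measure, all common illustrative choices; it is not the paper's transformation-based process, and critical values would come from simulation or bootstrap:

import numpy as np

def icm_statistic(x, u, n_xi=500, bound=5.0, seed=0):
    """Monte Carlo approximation of
    T = integral | n^{-1/2} sum_j u_j exp(i xi Phi(x_j)) |^2 dmu(xi),
    with Phi = arctan and mu uniform on [-bound, bound]."""
    rng = np.random.default_rng(seed)
    phi = np.arctan(x)                      # bounded one-to-one transform
    xi = rng.uniform(-bound, bound, n_xi)   # draws from the weighting measure
    z = np.exp(1j * np.outer(xi, phi)) @ u / np.sqrt(len(u))
    return np.mean(np.abs(z) ** 2)          # averaging over xi ~ the integral

rng = np.random.default_rng(5)
n = 400
x = rng.standard_normal(n)
y = 0.5 * x + 0.3 * x ** 2 + rng.standard_normal(n)   # nonlinear alternative
X = np.column_stack([np.ones(n), x])
u = y - X @ np.linalg.lstsq(X, y, rcond=None)[0]      # residuals of linear null
print(icm_statistic(x, u))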
By: | Schoonees, P.C.; Groenen, P.J.F.; van de Velden, M. |
Abstract: | A least-squares bilinear clustering framework for modelling three-way data, where each observation consists of an ordinary two-way matrix, is introduced. The method combines bilinear decompositions of the two-way matrices into overall means, row margins, column margins and row-column interactions with clustering along the third way. Different clusterings are defined for each part of the decomposition, so that up to four different classifications are defined jointly. The computational burden is greatly reduced by the orthogonality of the bilinear model, such that the joint clustering problem reduces to separate ones which can be handled independently. Three of these sub-problems are specific cases of $k$-means clustering; a special algorithm is formulated for the row-column interactions, which are displayed in clusterwise biplots. The method is illustrated via two empirical examples, and the interpretation of the interaction biplots is discussed. |
Keywords: | Three-way data, bilinear decomposition, k-means cluster analysis, least-squares estimation, biplots. |
Date: | 2015–02–16 |
URL: | http://d.repec.org/n?u=RePEc:ems:eureir:77757&r=ecm |
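
A schematic Python analogue of the decompose-then-cluster idea: each two-way matrix is split into orthogonal parts (overall mean, row margins, column margins, interactions) and separate k-means clusterings are run along the third way. This is a deliberate simplification, not the authors' least-squares algorithm or their special treatment of the interactions:

import numpy as np
from sklearn.cluster import KMeans

rng = np.random.default_rng(6)
data = rng.standard_normal((60, 8, 5))   # 60 observations of an 8 x 5 matrix

def bilinear_parts(M):
    """Orthogonal split: overall mean, row margins, column margins, interactions."""
    m = M.mean()
    row = M.mean(axis=1, keepdims=True) - m
    col = M.mean(axis=0, keepdims=True) - m
    inter = M - m - row - col
    return m, row.ravel(), col.ravel(), inter.ravel()

parts = [bilinear_parts(M) for M in data]
rows = np.array([p[1] for p in parts])
inters = np.array([p[3] for p in parts])

# separate clusterings along the third way, one per decomposition part
row_labels = KMeans(n_clusters=3, n_init=10, random_state=0).fit_predict(rows)
inter_labels = KMeans(n_clusters=3, n_init=10, random_state=0).fit_predict(inters)
print(np.bincount(row_labels), np.bincount(inter_labels))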
By: | Egger, Peter; Staub, Kevin E |
Abstract: | Many empirical gravity models are now based on generalized linear models (GLM), of which the Poisson pseudo-maximum likelihood estimator is a prominent example and the most frequently used estimator. Previous literature on the performance of these estimators has primarily focused on the role of the variance function for the estimators' behavior. We add to this literature by studying the small-sample performance of estimators in a data-generating process that is fully consistent with general equilibrium economic models of international trade. Economic theory suggests that (i) importer- and exporter-specific effects need to be accounted for in estimation, and (ii) these effects are correlated with bilateral trade costs through general-equilibrium (or balance-of-payments) restrictions. We compare the performance of structural estimators, fixed effects estimators, and quasi-differences estimators in such settings, using the GLM approach as a unifying framework. |
Keywords: | fixed effects; generalized linear models; gravity models |
JEL: | C23 F14 |
Date: | 2015–02 |
URL: | http://d.repec.org/n?u=RePEc:cpr:ceprdp:10428&r=ecm |
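
A minimal Python sketch of the workhorse specification under study: a PPML gravity regression with exporter and importer fixed effects, estimated as a Poisson GLM. The simulated data only crudely mimic the correlation between country effects and trade costs that the paper builds into its data-generating process:

import numpy as np
import pandas as pd
import statsmodels.api as sm
import statsmodels.formula.api as smf

rng = np.random.default_rng(7)
n_c = 20
pairs = [(i, j) for i in range(n_c) for j in range(n_c) if i != j]
df = pd.DataFrame(pairs, columns=["exporter", "importer"])
df["lndist"] = rng.uniform(0.0, 3.0, len(df))
fe = 0.5 * rng.standard_normal(n_c)            # country-specific effects
mu = np.exp(1.0 - 1.0 * df["lndist"].to_numpy()
            + fe[df["exporter"]] + fe[df["importer"]])
df["trade"] = rng.poisson(mu)

ppml = smf.glm("trade ~ lndist + C(exporter) + C(importer)",
               data=df, family=sm.families.Poisson()).fit()
print(ppml.params["lndist"])                   # distance elasticity, near -1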
By: | Markku Lanne (University of Helsinki and CREATES); Jani Luoto (University of Helsinki) |
Abstract: | We propose a sequential Monte Carlo (SMC) method augmented with an importance sampling step for estimation of DSGE models. In addition to being theoretically well motivated, the new method facilitates the assessment of estimation accuracy. Furthermore, in order to alleviate the problem of multimodal posterior distributions due to poor identification of DSGE models when uninformative prior distributions are assumed, we recommend imposing data-driven identification constraints and devise a procedure for finding them. An empirical application to the Smets-Wouters (2007) model demonstrates the properties of the estimation method, and shows how the problem of multimodal posterior distributions caused by parameter redundancy is eliminated by identification constraints. Out-of-sample forecast comparisons as well as Bayes factors lend support to the constrained model. |
Keywords: | Particle filter, importance sampling, Bayesian identification |
JEL: | D58 C11 C32 C52 |
Date: | 2015–08–18 |
URL: | http://d.repec.org/n?u=RePEc:aah:create:2015-37&r=ecm |
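
For readers unfamiliar with the machinery, the sketch below is a toy Python likelihood-tempering SMC sampler (reweight, resample, Metropolis move) for a one-parameter Gaussian posterior. It is a generic illustration under simple assumptions; the paper's importance-sampling augmentation and the DSGE likelihood are not reproduced:

import numpy as np
from scipy import stats

rng = np.random.default_rng(8)
data = rng.normal(1.5, 1.0, 50)          # toy data; target: posterior of mu

def loglik(mu):
    return stats.norm.logpdf(data[None, :], mu[:, None], 1.0).sum(axis=1)

n_part = 2000
phis = np.linspace(0.0, 1.0, 21)         # tempering schedule, prior to posterior
mu = rng.normal(0.0, 3.0, n_part)        # particles drawn from the N(0, 9) prior
ll = loglik(mu)
for phi0, phi1 in zip(phis[:-1], phis[1:]):
    w = np.exp((phi1 - phi0) * (ll - ll.max()))       # incremental weights
    idx = rng.choice(n_part, n_part, p=w / w.sum())   # multinomial resampling
    mu, ll = mu[idx], ll[idx]
    prop = mu + 0.3 * rng.standard_normal(n_part)     # random-walk MH move
    ll_p = loglik(prop)
    lr = (phi1 * (ll_p - ll)
          + stats.norm.logpdf(prop, 0, 3) - stats.norm.logpdf(mu, 0, 3))
    acc = np.log(rng.uniform(size=n_part)) < lr
    mu[acc], ll[acc] = prop[acc], ll_p[acc]
print(mu.mean())                          # close to the analytic posterior mean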
By: | de Bruijn, L.P.; Segers, R.; Franses, Ph.H.B.F. |
Abstract: | This paper puts forward a new data collection method to measure daily consumer confidence at the individual level. The data thus obtained allow us to statistically analyze the dynamic correlation of such a consumer confidence indicator and to draw inference on transition rates. The latter is not possible with the currently available monthly data collected by statistical agencies on the basis of repeated cross-sections. In an application to measuring Dutch consumer confidence, we show that the incremental information content of the novel indicator helps to better forecast consumption. |
Keywords: | Consumer confidence, Randomized sampling, Markov transition model, consumption |
JEL: | C33 C42 C81 E20 |
Date: | 2014–11–01 |
URL: | http://d.repec.org/n?u=RePEc:ems:eureir:77640&r=ecm |
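
With individual-level daily data, transition rates have a simple maximum-likelihood estimator: row-normalized transition counts. The Python sketch below uses three illustrative confidence states on simulated respondents; with repeated cross-sections this computation is impossible, which is the point made in the abstract:

import numpy as np

rng = np.random.default_rng(9)
# states: 0 = pessimistic, 1 = neutral, 2 = optimistic
P_true = np.array([[0.80, 0.15, 0.05],
                   [0.10, 0.80, 0.10],
                   [0.05, 0.15, 0.80]])
panel = np.zeros((500, 60), dtype=int)     # 500 respondents, 60 days
for i in range(500):
    for t in range(1, 60):
        panel[i, t] = rng.choice(3, p=P_true[panel[i, t - 1]])

counts = np.zeros((3, 3))
for i, t in np.ndindex(500, 59):           # tally observed transitions
    counts[panel[i, t], panel[i, t + 1]] += 1
P_hat = counts / counts.sum(axis=1, keepdims=True)
print(np.round(P_hat, 2))                  # close to P_true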
By: | Hamidi Sahneh, Mehdi |
Abstract: | We propose a test for noncausal vector autoregressive representations generated by non-Gaussian shocks. We prove that in these models the Wold innovations are a martingale difference sequence if and only if the model is correctly specified. We then propose a test based on a generalized spectral density to check the martingale difference property of the Wold innovations. Our approach does not require identifying and estimating the noncausal models: no specific estimation method is required, and the test has the appealing nuisance-parameter-free property. The test statistic uses all lags in the sample and has a convenient asymptotic standard normal distribution under the null hypothesis. A Monte Carlo study is conducted to examine the finite-sample performance of our test. |
Keywords: | Explosive Bubble; Identification; Noncausal Process; Vector Autoregressive. |
JEL: | C32 C5 |
Date: | 2013–08–05 |
URL: | http://d.repec.org/n?u=RePEc:pra:mprapa:68867&r=ecm |
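
A crude finite-lag illustration of the property the test exploits: the toy process below is serially uncorrelated white noise, yet predictable from nonlinear functions of its own past, so it is not a martingale difference. The paper's generalized spectral test covers all lags; this Python sketch uses an arbitrary example and a simple regression check:

import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(10)
n = 5000
e = rng.standard_normal(n + 2)
u = e[2:] + e[1:-1] * e[:-2]      # white noise, but not a martingale difference

print(np.corrcoef(u[1:], u[:-1])[0, 1])    # ~ 0: no linear predictability

# regress u_t on nonlinear functions of its past; under the martingale
# difference null all slopes are zero
y = u[2:]
X = sm.add_constant(np.column_stack([u[1:-1],
                                     u[1:-1] ** 2,
                                     u[1:-1] * u[:-2]]))
print(sm.OLS(y, X).fit().f_pvalue)         # rejects the null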
By: | Sundström, David (Department of Economics, Umeå University) |
Abstract: | Auction theory suggests that as the number of bidders (competition) increases, the sizes of the participants' bids decrease. An issue in the empirical literature on auctions is which measurement(s) of competition to use. Utilizing a dataset on public procurements containing measurements of both the actual and the potential number of bidders, I find that a workhorse model of public procurements is best fitted to the data using only the actual number of bidders as the measure of competition. Acknowledging that all measurements of competition may be erroneous, I propose an instrumental variable estimator that (given my data) brings about a competition effect bounded by those generated from models using the actual and the potential number of bidders, respectively. In addition, some asymptotic results are provided for non-linear least squares estimators obtained from a dependent variable transformation model. |
Keywords: | dependent variable transformation model; instrumental variable; measurement error; non-linear least squares |
JEL: | C26 C51 D22 D44 |
Date: | 2016–01–11 |
URL: | http://d.repec.org/n?u=RePEc:hhs:umnees:0920&r=ecm |
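
A minimal Python sketch of the errors-in-variables logic behind such an instrumental variable estimator: when two noisy measurements of the latent number of bidders are available, one can instrument for the other. The data-generating process is purely illustrative and not the paper's procurement model:

import numpy as np

def tsls(y, X, Z):
    """Two-stage least squares: project X on the instruments Z, then OLS."""
    Pz = Z @ np.linalg.solve(Z.T @ Z, Z.T)
    Xhat = Pz @ X
    return np.linalg.solve(Xhat.T @ X, Xhat.T @ y)

rng = np.random.default_rng(11)
n = 2000
n_true = rng.poisson(5, n) + 1          # latent number of bidders
n_obs = n_true + rng.poisson(1, n)      # error-ridden measurement (regressor)
n_alt = n_true + rng.poisson(1, n)      # second noisy measurement (instrument)
logbid = 2.0 - 0.1 * n_true + 0.5 * rng.standard_normal(n)

X = np.column_stack([np.ones(n), n_obs])
Z = np.column_stack([np.ones(n), n_alt])
print(tsls(logbid, X, Z)[1])            # near -0.1; OLS on n_obs is attenuated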
By: | Seok Young Hong (Institute for Fiscal Studies); Oliver Linton (Institute for Fiscal Studies and University of Cambridge); Hui Jun Zhang (Institute for Fiscal Studies) |
Abstract: | We propose several multivariate variance ratio statistics. We derive the asymptotic distribution of the statistics, and of scalar functions thereof, under the null hypothesis that returns are unpredictable after a constant mean adjustment (i.e., under the Efficient Market Hypothesis). We do not impose the no-leverage assumption of Lo and MacKinlay (1988), yet our asymptotic standard errors are relatively simple and, in particular, do not require the selection of a bandwidth parameter. We extend the framework to allow for a smoothly varying risk premium in calendar time, and show that the limiting distribution is the same as in the constant mean adjustment case. We show the limiting behaviour of the statistic under a multivariate fads model and under a moderately explosive bubble process: these alternative hypotheses give opposite predictions with regard to the long run value of the statistics. We apply the methodology to three weekly size-sorted CRSP portfolio returns from 1962 to 2013 in three subperiods. We find evidence of a reduction of linear predictability in the most recent period for small and medium cap stocks. We find similar results for the main UK stock indexes. The main findings are not substantially affected by allowing for a slowly varying risk premium. |
Date: | 2014–06 |
URL: | http://d.repec.org/n?u=RePEc:ifs:cemmap:29/14&r=ecm |
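
For the univariate building block, here is a minimal Python sketch of the Lo-MacKinlay variance ratio computed from overlapping k-period returns; the paper's multivariate statistics and their standard errors are not attempted:

import numpy as np

def variance_ratio(r, k):
    """Variance of k-period (overlapping) returns over k times the 1-period
    variance: 1 under the martingale null, >1 under momentum, <1 under reversal."""
    r = r - r.mean()
    k_period = np.convolve(r, np.ones(k), mode="valid")   # rolling k-sums
    return np.var(k_period) / (k * np.var(r))

rng = np.random.default_rng(12)
iid = rng.standard_normal(5000)
print(variance_ratio(iid, 4))          # ~ 1.0

ar = np.zeros(5000)                    # mildly predictable AR(1) returns
for t in range(1, 5000):
    ar[t] = 0.2 * ar[t - 1] + rng.standard_normal()
print(variance_ratio(ar, 4))           # > 1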
By: | Koloch, Grzegorz |
Abstract: | In this paper we provide formulae for the likelihood function, filtration densities and prediction densities of a linear state space model in which shocks are allowed to be skewed. In particular, we work with the closed skew normal (csn) distribution, see González-Farías et al. (2004), which nests the normal distribution as a special case. Closure of the csn distribution with respect to all necessary transformations in the state space setting is guaranteed by a simple state dimension reduction procedure which does not influence the value of the likelihood function. The presented formulae allow for estimation, filtration and prediction of vector autoregressions and first order perturbations of DSGE models with skewed shocks. This makes it possible to assess asymmetries in shocks, observed data, impulse responses and forecast confidence intervals. Advantages of the outlined approach may include capturing asymmetric inflation risks in central banks' forecasts or producing more plausible probabilities of deep but rare recessionary episodes with DSGE/VAR filtration. Exemplary estimation results are provided which show that, within a linear setting with skewness, the frequency of big shocks can be rather plausibly identified. |
Keywords: | Maximum likelihood estimation, state space models, closed skew-normal distribution, DSGE, VAR |
JEL: | C13 C51 E32 |
Date: | 2016–01–25 |
URL: | http://d.repec.org/n?u=RePEc:pra:mprapa:69001&r=ecm |
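
The Gaussian Kalman filter is the special case nested by the closed skew normal filtration formulae. Below is a minimal Python sketch of the textbook log-likelihood recursion on a toy scalar AR(1) state; the csn-specific formulae of the paper are not reproduced:

import numpy as np

def kalman_loglik(ys, T, Z, Q, H, a0, P0):
    """Log-likelihood of a linear Gaussian state space model:
    state  a_t = T a_{t-1} + eta_t,  eta ~ N(0, Q)
    obs    y_t = Z a_t + eps_t,      eps ~ N(0, H)."""
    a, P, ll = a0, P0, 0.0
    for y in ys:
        a, P = T @ a, T @ P @ T.T + Q                # predict
        v = y - Z @ a                                # innovation
        F = Z @ P @ Z.T + H
        ll += -0.5 * (len(v) * np.log(2 * np.pi)
                      + np.log(np.linalg.det(F))
                      + v @ np.linalg.solve(F, v))
        K = P @ Z.T @ np.linalg.inv(F)               # update
        a, P = a + K @ v, P - K @ Z @ P
    return ll

rng = np.random.default_rng(13)
x, ys = 0.0, []
for _ in range(300):                                 # simulate AR(1) plus noise
    x = 0.9 * x + rng.standard_normal()
    ys.append(np.array([x + 0.7 * rng.standard_normal()]))
print(kalman_loglik(ys, np.array([[0.9]]), np.array([[1.0]]),
                    np.array([[1.0]]), np.array([[0.49]]),
                    np.zeros(1), np.eye(1)))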
By: | Dilip M. Nachane (Indira Gandhi Institute of Development Research) |
Abstract: | In recent years DSGE (dynamic stochastic general equilibrium) models have come to play an increasing role in central banks, as an aid in the formulation of monetary policy (and, increasingly after the global crisis, for maintaining financial stability). Compared to other widely prevalent econometric models (such as VARs or large-scale econometric models), DSGE models are less atheoretic, with secure micro-foundations based on the optimizing behavior of rational economic agents. Apart from being "structural", the models bring out the key role of expectations and (being of a general equilibrium nature) can help the policy maker by explicitly projecting the macroeconomic scenarios in response to various contemplated policy interventions. Additionally, the models, in spite of being strongly tied to theory, can be "taken to the data" in a meaningful way. A major feature of these models is that their theoretical underpinnings lie in what has now come to be called the New Consensus Macroeconomics (NCM). Using the prototype real business cycle model as an illustration, this paper brings out the econometric structure underpinning such models. Estimation and inferential issues are discussed at length, with special emphasis on the role of Bayesian maximum likelihood methods. A detailed analytical critique is also presented, together with some promising leads for future research. |
Keywords: | real business cycle, log-linearization, stochastic singularity, Bayesian maximum likelihood, complexity theory, agent-based modeling, robustness |
JEL: | C52 E32 |
Date: | 2016–01 |
URL: | http://d.repec.org/n?u=RePEc:ind:igiwpp:2016-004&r=ecm |
By: | Pingel, Ronnie (Department of Statistics, Uppsala University); Waernbaum, Ingeborg (IFAU - Institute for Evaluation of Labour Market and Education Policy) |
Abstract: | Propensity score-based estimators are commonly used to estimate causal effects in evaluation research. To reduce bias in observational studies, researchers might be tempted to include many, perhaps correlated, covariates when estimating the propensity score model. Taking into account that the propensity score is estimated, this study investigates how the efficiency of matching, inverse probability weighting and doubly robust estimators changes in the case of correlated covariates. Propositions regarding the large sample variances under certain assumptions on the data generating process are given. The propositions are supplemented by several numerical large sample and finite sample results from a wide range of models. The results show that correlation may increase or decrease the variances of the estimators. Several factors influence how correlation affects the variance of the estimators, including the choice of estimator, the strength of the confounding towards outcome and treatment, and whether a constant or non-constant causal effect is present. |
Keywords: | Double robust; inverse probability weighting; matching; observational study |
JEL: | C13 C40 C52 |
Date: | 2015–02–12 |
URL: | http://d.repec.org/n?u=RePEc:hhs:ifauwp:2015_003&r=ecm |
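
A minimal Python sketch of one of the estimators under study: inverse probability weighting with an estimated logit propensity score, on a toy design with two correlated confounders. The paper's variance propositions are not reproduced; this only shows the estimator itself:

import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(14)
n = 5000
X = rng.multivariate_normal([0, 0], [[1.0, 0.6], [0.6, 1.0]], n)  # correlated
ps = 1.0 / (1.0 + np.exp(-(0.5 * X[:, 0] + 0.5 * X[:, 1])))
D = (rng.uniform(size=n) < ps).astype(float)
Y = 1.0 * D + X[:, 0] + X[:, 1] + rng.standard_normal(n)          # effect = 1

logit = sm.Logit(D, sm.add_constant(X)).fit(disp=0)    # estimated score
e = logit.predict(sm.add_constant(X))
ate = np.mean(D * Y / e) - np.mean((1 - D) * Y / (1 - e))
print(ate)    # ~ 1.0; its sampling variance depends on the covariate correlation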
By: | Le-Yu Chen (Institute for Fiscal Studies and Academia Sinica); Sokbae (Simon) Lee (Institute for Fiscal Studies); Myung Jae Sung (Institute for Fiscal Studies) |
Abstract: | The estimation problem in this paper is motivated by maximum score estimation of preference parameters in the binary choice model under uncertainty, in which the decision rule is affected by conditional expectations. The preference parameters are estimated in two stages: we estimate conditional expectations nonparametrically in the first stage and then the preference parameters in the second stage based on Manski's (1975, 1985) maximum score estimator, using the choice data and first stage estimates. This setting can be extended to maximum score estimation with nonparametrically generated regressors. The paper establishes consistency and derives the rate of convergence of the two-stage maximum score estimator. Moreover, the paper provides sufficient conditions under which the two-stage estimator is asymptotically equivalent in distribution to the corresponding single-stage estimator that treats the first stage input as known. The paper also presents Monte Carlo simulation results on the finite-sample behavior of the two-stage estimator. |
Date: | 2014–05 |
URL: | http://d.repec.org/n?u=RePEc:ifs:cemmap:27/14&r=ecm |
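
A minimal Python sketch of Manski's single-stage maximum score estimator with fully observed regressors, maximized by grid search under the usual scale normalization; the heteroskedastic errors illustrate that only a conditional median restriction is needed. The paper's nonparametric first stage is omitted:

import numpy as np

rng = np.random.default_rng(15)
n = 2000
x = rng.standard_normal((n, 2))
u = (1.0 + 0.5 * np.abs(x[:, 0])) * rng.standard_normal(n)  # median-zero errors
beta = np.array([1.0, 0.5])               # normalized so that |beta_1| = 1
y = (x @ beta + u > 0).astype(float)

def score(b):
    """Manski's score: fraction of correct sign predictions, up to a constant."""
    return np.mean((2.0 * y - 1.0) * np.sign(x @ b))

grid = np.linspace(-3.0, 3.0, 601)        # search over b_2 with b_1 fixed at 1
b2_hat = max(grid, key=lambda b2: score(np.array([1.0, b2])))
print(b2_hat)     # roughly 0.5 (maximum score converges at a slow cube-root rate)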
By: | Shin, Minchul (University of Illinois); Zhong, Molin (Board of Governors of the Federal Reserve System (U.S.)) |
Abstract: | We suggest using "realized volatility" as a volatility proxy to aid in model-based multivariate bond yield density forecasting. To do so, we develop a general estimation approach to incorporate volatility proxy information into dynamic factor models with stochastic volatility. The resulting model parameter estimates are highly efficient, which one hopes would translate into superior predictive performance. We explore this conjecture in the context of density prediction of U.S. bond yields by incorporating realized volatility into a dynamic Nelson-Siegel (DNS) model with stochastic volatility. The results clearly indicate that using realized volatility improves density forecasts relative to popular specifications in the DNS literature that neglect realized volatility. |
Keywords: | Dynamic factor model; forecasting; stochastic volatility; term structure of interest rates; dynamic Nelson-Siegel model |
JEL: | C5 E4 G1 |
Date: | 2015–12–18 |
URL: | http://d.repec.org/n?u=RePEc:fip:fedgfe:2015-115&r=ecm |
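
For reference, the deterministic part of the dynamic Nelson-Siegel model: the three factor loadings and a cross-sectional OLS extraction of level, slope and curvature for a single date (lambda = 0.0609 is the common Diebold-Li choice for maturities in months). The stochastic-volatility dynamics and the realized volatility measurement equation of the paper are not attempted:

import numpy as np

def ns_loadings(maturities, lam=0.0609):
    """Nelson-Siegel loadings for level, slope and curvature factors."""
    m = np.asarray(maturities, dtype=float)
    slope = (1.0 - np.exp(-lam * m)) / (lam * m)
    curve = slope - np.exp(-lam * m)
    return np.column_stack([np.ones_like(m), slope, curve])

rng = np.random.default_rng(16)
mats = np.array([3, 6, 12, 24, 36, 60, 120])      # maturities in months
L = ns_loadings(mats)
f_true = np.array([5.0, -1.0, 0.5])               # level, slope, curvature
yields = L @ f_true + 0.05 * rng.standard_normal(len(mats))

# cross-sectional OLS recovers the factors for one date; the DNS model then
# puts a law of motion (here, with stochastic volatility) on these factors
f_hat = np.linalg.lstsq(L, yields, rcond=None)[0]
print(np.round(f_hat, 2))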
By: | D'Agostino, Antonello (European Stability Mechanism); Giannone, Domenico (Federal Reserve Bank of New York); Lenza, Michele (European Central Bank); Modugno, Michele (Board of Governors of the Federal Reserve System (U.S.)) |
Abstract: | We develop a framework for measuring and monitoring business cycles in real time. Following a long tradition in macroeconometrics, inference is based on a variety of indicators of economic activity, treated as imperfect measures of an underlying index of business cycle conditions. We extend existing approaches by allowing for heterogeneous lead-lag patterns of the various indicators along the business cycle. The framework is well suited to high-frequency monitoring of current economic conditions in real time - nowcasting - since inference can be conducted in the presence of mixed-frequency data and irregular patterns of data availability. Our assessment of the underlying index of business cycle conditions is accurate and more timely than popular alternatives, including the Chicago Fed National Activity Index (CFNAI). A formal real-time forecasting evaluation shows that the framework produces well-calibrated probability nowcasts that resemble the consensus assessment of the Survey of Professional Forecasters. |
Keywords: | Current Economic Conditions; Dynamic Factor Models; Dynamic Heterogeneity; Business Cycles; Real Time; Nowcasting. |
JEL: | C11 C32 C38 E32 |
Date: | 2015–08–06 |
URL: | http://d.repec.org/n?u=RePEc:fip:fedgfe:2015-66&r=ecm |
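
As a crude static analogue of the underlying index, the sketch below extracts the first principal component of standardized indicators from a simulated one-factor panel in Python. The paper's state-space treatment of lead-lag heterogeneity, mixed frequencies and ragged data edges is far richer:

import numpy as np

rng = np.random.default_rng(17)
T, N = 300, 12
f = np.zeros(T)                       # latent business cycle index, AR(1)
for t in range(1, T):
    f[t] = 0.8 * f[t - 1] + rng.standard_normal()
loadings = rng.uniform(0.5, 1.5, N)
panel = np.outer(f, loadings) + rng.standard_normal((T, N))

Z = (panel - panel.mean(0)) / panel.std(0)     # standardize the indicators
_, _, Vt = np.linalg.svd(Z, full_matrices=False)
index = Z @ Vt[0]                              # first principal component
print(round(abs(np.corrcoef(index, f)[0, 1]), 2))   # tracks the latent index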
By: | Svetunkov, Ivan; Kourentzes, Nikolaos |
Abstract: | Exponential smoothing has been one of the most popular forecasting methods for business and industry, and its simplicity and transparency have made it very attractive. Nonetheless, modelling and identifying trends has met with mixed success, resulting in the development of various modifications of trend models. We present a new approach to time series modelling, using the notion of "information potential" and the theory of functions of complex variables. A new exponential smoothing method that uses this approach, "complex exponential smoothing" (CES), is proposed. It has an underlying statistical model, described here, and several advantages over conventional exponential smoothing models: it allows modelling and forecasting of both trended and level time series, effectively sidestepping the model selection problem. CES is evaluated on real data, demonstrating better performance than established benchmarks and other exponential smoothing methods. |
Keywords: | Forecasting, exponential smoothing, ETS, model selection, information potential, complex variables |
JEL: | C5 C53 |
Date: | 2015–05–01 |
URL: | http://d.repec.org/n?u=RePEc:pra:mprapa:69394&r=ecm |
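
For the conventional baseline that CES generalizes, here is a minimal Python sketch of simple exponential smoothing; the complex-variable CES recursion itself is not reproduced:

import numpy as np

def ses_forecast(y, alpha):
    """Simple exponential smoothing: l_t = alpha * y_t + (1 - alpha) * l_{t-1};
    the one-step-ahead forecast is the final smoothed level."""
    level = y[0]
    for obs in y[1:]:
        level = alpha * obs + (1.0 - alpha) * level
    return level

rng = np.random.default_rng(18)
y = 10.0 + np.cumsum(0.2 * rng.standard_normal(200))   # slowly drifting level
print(ses_forecast(y, alpha=0.3))                      # forecast for period 201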