NEP: New Economics Papers on Econometrics
By: | P.A.V.B. Swamy; George S. Tavlas; Stephen G. Hall; George Hondroyiannis |
Abstract: | Misspecifications of econometric models can lead to biased coefficients and error terms, which in turn can lead to incorrect inference and incorrect models. Specific techniques, such as instrumental variables, attempt to deal with individual forms of model misspecification, but they can typically address only one problem at a time. This paper proposes a general method for estimating underlying parameters in the presence of a range of unknown model misspecifications. It is argued that this method can consistently estimate the direct effect of an independent variable on a dependent variable, with all of its other determinants held constant, even in the presence of a misspecified functional form, measurement error, and omitted variables.
Keywords: | Misspecified model; Correct interpretation of coefficients; Appropriate assumption; Time-varying coefficient model; Coefficient driver |
JEL: | C13 C19 C22 |
Date: | 2008–08 |
URL: | http://d.repec.org/n?u=RePEc:lec:leecon:08/27&r=ecm |
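As a rough illustration of the coefficient-driver idea behind time-varying coefficient estimation (a hypothetical one-regressor, one-driver specification, not the authors' estimator): if the slope on x_t is gamma_1t = pi_0 + pi_1 z_t + error, substitution turns the model into a regression of y on x and the interaction x*z, so the driver components of the coefficient can be recovered by least squares.

    import numpy as np

    rng = np.random.default_rng(0)
    T = 500
    x = rng.normal(size=T)                               # regressor
    z = rng.normal(size=T)                               # coefficient driver
    gamma1 = 1.0 + 0.5 * z + 0.1 * rng.normal(size=T)    # time-varying slope
    y = 2.0 + gamma1 * x + rng.normal(size=T)

    # Substituting the driver equation gives y = a + pi0*x + pi1*(x*z) + noise,
    # so OLS on [1, x, x*z] recovers (a, pi0, pi1).
    X = np.column_stack([np.ones(T), x, x * z])
    coef, *_ = np.linalg.lstsq(X, y, rcond=None)
    print("intercept, pi0, pi1:", coef)                  # roughly (2.0, 1.0, 0.5)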
By: | Marcin Owczarczuk (Warsaw School of Economics) |
Abstract: | This paper presents maximum score type estimators for linear, binomial, tobit and truncated regression models. These estimators estimate the normalized vector of slopes and do not provide an estimator of the intercept, although it may appear in the model. Strong consistency is proved. In addition, in the case of truncated and tobit regression models, maximum score estimators allow the sample to be restricted so as to make the ordinary least squares method consistent.
Keywords: | maximum score estimation, linear regression, tobit, truncated, binomial, semiparametric |
JEL: | C24 C25 C21 |
Date: | 2008–08–01 |
URL: | http://d.repec.org/n?u=RePEc:wse:wpaper:28&r=ecm |
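To fix ideas, a toy version of maximum score estimation for a binary model (synthetic data, not the paper's estimators): with the slope vector normalized to unit length, the estimator maximizes the fraction of observations whose sign of x'b matches the outcome. In two dimensions the unit vector can be parameterized by an angle and the score maximized by grid search; note that no intercept is estimated, consistent with the abstract.

    import numpy as np

    rng = np.random.default_rng(1)
    n = 1000
    X = rng.normal(size=(n, 2))
    beta = np.array([0.8, 0.6])                 # true normalized slopes
    y = (X @ beta + rng.logistic(size=n) > 0).astype(int)

    def score(b):
        # fraction of observations where the sign of x'b matches y
        return np.mean((X @ b > 0) == (y == 1))

    # Grid search over the unit circle (the ||b|| = 1 normalization).
    angles = np.linspace(0, 2 * np.pi, 3601)
    cands = np.column_stack([np.cos(angles), np.sin(angles)])
    best = cands[np.argmax([score(b) for b in cands])]
    print("estimated normalized slopes:", best)  # close to (0.8, 0.6)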
By: | Arnab Bhattacharjee |
Abstract: | Several omnibus tests of the proportional hazards assumption have been proposed in the literature. In the two-sample case, tests have also been developed against ordered alternatives like monotone hazard ratio and monotone ratio of cumulative hazards. Here we propose a natural extension of these partial orders to the case of continuous covariates. The work is motivated by applications in biomedicine and economics where covariate effects often decay over lifetime. We develop tests for the proportional hazards assumption against ordered alternatives and propose a graphical method to identify the nature of departures from proportionality. The proposed tests do not make restrictive assumptions on the underlying regression model, and are applicable in the presence of multiple covariates and frailty. Small sample performance and applications to real data highlight the usefulness of the framework and methodology.
Keywords: | Two-sample tests, Increasing hazard ratio, Continuous covariate, Proportional hazards, Frailty, Partial orders, Time-varying coefficients.
JEL: | C12 C14 C41 |
Date: | 2008–07 |
URL: | http://d.repec.org/n?u=RePEc:san:wpecon:0807&r=ecm |
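For intuition, a crude version of the graphical idea in the two-sample case (hypothetical, uncensored data; the paper's tests are far more general): under proportional hazards the ratio of Nelson-Aalen cumulative hazards is roughly flat in time, while a monotone ratio points to the ordered alternatives discussed above.

    import numpy as np

    def nelson_aalen(times):
        """Nelson-Aalen cumulative hazard for uncensored data."""
        t = np.sort(times)
        at_risk = len(t) - np.arange(len(t))      # subjects still at risk
        return t, np.cumsum(1.0 / at_risk)

    rng = np.random.default_rng(2)
    t0, H0 = nelson_aalen(rng.exponential(1.0, 300))   # group 0: flat hazard
    t1, H1 = nelson_aalen(rng.weibull(1.5, 300))       # group 1: rising hazard

    # Evaluate the ratio H1/H0 on a common grid; a trend indicates departure
    # from proportionality, a flat line is consistent with it.
    grid = np.linspace(0.1, 2.0, 20)
    ratio = np.interp(grid, t1, H1) / np.interp(grid, t0, H0)
    print(np.round(ratio, 2))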
By: | Joachim Wilde |
Abstract: | The inference in probit models relies on the assumption of normality. However, tests of this assumption are not implemented in standard econometric software. Therefore, the paper presents a simple representation of the Bera-Jarque-Lee test that does not require any matrix algebra. Furthermore, this representation is used to compare the Bera-Jarque-Lee test with the RESET-type test proposed by Papke and Wooldridge (1996).
Date: | 2007–12 |
URL: | http://d.repec.org/n?u=RePEc:iwh:dispap:13-07&r=ecm |
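The RESET-type test of Papke and Wooldridge mentioned above is straightforward to sketch: fit the probit, add powers of the fitted index as extra regressors, and test their joint significance with a likelihood ratio statistic. A minimal illustration using statsmodels (synthetic data; this is the RESET-type variant, not the Bera-Jarque-Lee representation derived in the paper):

    import numpy as np
    import statsmodels.api as sm
    from scipy import stats

    rng = np.random.default_rng(3)
    n = 2000
    X = sm.add_constant(rng.normal(size=(n, 2)))
    y = (X @ np.array([0.2, 1.0, -0.5]) + rng.normal(size=n) > 0).astype(int)

    base = sm.Probit(y, X).fit(disp=0)
    xb = X @ base.params                          # fitted index
    X_aug = np.column_stack([X, xb**2, xb**3])    # RESET-style augmentation
    aug = sm.Probit(y, X_aug).fit(disp=0)

    lr = 2 * (aug.llf - base.llf)                 # LR statistic, 2 restrictions
    print("LR =", lr, "p =", stats.chi2.sf(lr, df=2))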
By: | Frölich, Markus (University of Mannheim); Melly, Blaise (Brown University) |
Abstract: | This paper shows nonparametric identification of quantile treatment effects (QTE) in the regression discontinuity design (RDD) and proposes simple estimators. Quantile treatment effects are a very helpful tool to characterize the effects of certain interventions on the outcome distribution. The distributional impacts of social programs such as welfare, education, training programs and unemployment insurance are of great interest to economists.
Keywords: | quantile treatment effect, causal effect, endogeneity, regression discontinuity |
JEL: | C13 C14 C21 |
Date: | 2008–08 |
URL: | http://d.repec.org/n?u=RePEc:iza:izadps:dp3638&r=ecm |
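For intuition, in a sharp design the QTE at quantile tau can be approximated naively by differencing conditional quantiles of the outcome just above and just below the cutoff (a crude fixed-bandwidth sketch with synthetic data, not the estimators proposed in the paper):

    import numpy as np

    rng = np.random.default_rng(4)
    n = 5000
    r = rng.uniform(-1, 1, n)                     # running variable, cutoff at 0
    d = (r >= 0).astype(float)                    # sharp design: treated at cutoff
    y = 0.5 * r + d * rng.normal(1.0, 0.5, n) + (1 - d) * rng.normal(0.0, 1.0, n)

    h = 0.1                                       # bandwidth around the cutoff
    below = y[(r < 0) & (r > -h)]
    above = y[(r >= 0) & (r < h)]
    for tau in (0.25, 0.5, 0.75):
        qte = np.quantile(above, tau) - np.quantile(below, tau)
        print(f"QTE at tau={tau}: {qte:.2f}")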
By: | Chew Lian Chua (Melbourne Institute of Applied Economic and Social Research, The University of Melbourne); G. C. Lim (Melbourne Institute of Applied Economic and Social Research, The University of Melbourne); Penelope Smith (Westpac Banking Corporation, Sydney) |
Abstract: | This paper provides a Bayesian approach to inference on a multi-state latent factor intensity model, whose probability density functions are highly analytically intractable. The sampling algorithm used to obtain posterior distributions of the model parameters includes a particle filter step and a Metropolis-Hastings step within a Gibbs sampler. A simulated example demonstrates the feasibility and accuracy of this sampling algorithm. The approach is applied to the case of credit ratings transition matrices.
Date: | 2008–08 |
URL: | http://d.repec.org/n?u=RePEc:iae:iaewps:wp2008n16&r=ecm |
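The particle filter step is the key ingredient: for a given parameter draw it delivers a simulation-based estimate of the otherwise intractable likelihood, which can then enter a Metropolis-Hastings acceptance ratio inside the Gibbs sampler. A generic bootstrap particle filter for a toy Gaussian state-space model (a schematic sketch, not the paper's multi-state intensity model):

    import numpy as np

    def bootstrap_filter_loglik(y, phi, sigma_x, sigma_y, n_part=500, seed=0):
        """Log-likelihood estimate for x_t = phi*x_{t-1} + N(0, sigma_x^2),
        y_t = x_t + N(0, sigma_y^2), via a bootstrap particle filter."""
        rng = np.random.default_rng(seed)
        const = -0.5 * np.log(2 * np.pi * sigma_y**2)          # Gaussian constant
        x = rng.normal(0.0, sigma_x, n_part)                   # initial particles
        loglik = 0.0
        for yt in y:
            x = phi * x + rng.normal(0.0, sigma_x, n_part)     # propagate
            logw = -0.5 * ((yt - x) / sigma_y) ** 2            # measurement weight
            w = np.exp(logw - logw.max())
            loglik += logw.max() + np.log(w.mean()) + const    # p(y_t | y_1:t-1)
            x = x[rng.choice(n_part, n_part, p=w / w.sum())]   # resample
        return loglik

    # This estimate would be compared across parameter draws in an MCMC step.
    y = np.random.default_rng(1).normal(size=50)
    print(bootstrap_filter_loglik(y, phi=0.9, sigma_x=0.3, sigma_y=0.5))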
By: | James Mitchell; A. S. Jore; S. P. Vahey
Abstract: | Clark and McCracken (2008) argue that combining real-time point forecasts from VARs of output, prices and interest rates improves point forecast accuracy in the presence of uncertain model instabilities. In this paper, we generalize their approach to consider forecast density combinations and evaluations. Whereas Clark and McCracken (2008) show that the point forecast errors from particular equal-weight pairwise averages are typically comparable to or better than those of benchmark univariate time series models, we show that neither approach produces accurate real-time forecast densities for recent US data. If greater weight is given to models that allow for the shifts in volatilities associated with the Great Moderation, predictive density accuracy improves substantially.
Date: | 2008–01 |
URL: | http://d.repec.org/n?u=RePEc:nsr:niesrd:303&r=ecm |
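The payoff from reweighting can be illustrated with average log scores: given two predictive densities evaluated at the realizations, compare an equal-weight pool with one whose weights track past density fit (a toy sketch with synthetic data, unrelated to the paper's real-time data set):

    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(6)
    y = rng.normal(0, 0.5, 200)                       # realizations (low volatility)
    # Two candidate predictive densities: one ignores the volatility shift.
    p1 = stats.norm.pdf(y, 0, 1.0)                    # high-volatility model
    p2 = stats.norm.pdf(y, 0, 0.5)                    # allows for the moderation

    equal = np.log(0.5 * p1 + 0.5 * p2).mean()        # equal-weight pool
    # Recursive log-score weights: weight each model by its past density fit.
    odds = np.exp(np.cumsum(np.log(p2)) - np.cumsum(np.log(p1)))
    w2 = np.concatenate([[0.5], (odds / (1 + odds))[:-1]])   # lagged weights
    weighted = np.log((1 - w2) * p1 + w2 * p2).mean()
    print(f"avg log score: equal {equal:.3f}, weighted {weighted:.3f}")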
By: | Philippe Moës (National Bank of Belgium, Research Department) |
Abstract: | Structural time series models applied to the factor inputs of a production function often lead to small output gaps and consequently to erratic measures of potential growth. We introduce a dual cycle model, an extension of the multivariate trend plus cycle model with phase shifts à la Rünstler. The dual cycle model combines two types of models: the trend plus cycle model and the cyclical trend model, where the cycle appears in the growth rate of a variable. This property enables hysteresis to be taken into account. Hysteresis is likely to show up in unemployment, but it can also affect the capital stock due to the existence of long investment cycles. In the proposed model, hysteresis may affect all the factor inputs of the production function and phase shifts are extended to the dual cycles. Genuine measures of potential growth can be computed that are hysteresis-free and less prone to volatility. A complementary measure of the output gap that takes hysteresis into account can be derived.
Keywords: | Output gap, potential growth, hysteresis, structural time series models |
JEL: | C32 E32 |
Date: | 2008–08 |
URL: | http://d.repec.org/n?u=RePEc:nbb:reswpp:200808-20&r=ecm |
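For orientation, a schematic of the two building blocks in generic structural time series notation (not the paper's exact specification): the trend-plus-cycle model places the cycle in the level of the series, while the cyclical trend model places it in the growth rate of the trend, which is the channel through which hysteresis enters.

    \text{trend plus cycle:} \quad y_t = \mu_t + \psi_t + \varepsilon_t
    \text{cyclical trend:}   \quad y_t = \mu_t + \varepsilon_t, \qquad \Delta\mu_t = \beta + \psi_t + \eta_t

Here \mu_t is the trend, \psi_t the stochastic cycle, and \varepsilon_t, \eta_t disturbances; in the second form the cycle \psi_t feeds permanently into the level of \mu_t, producing hysteresis.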
By: | Sylvia Kaufmann (Oesterreichische Nationalbank, Economic Studies Division, P.O. Box 61, A-1010 Vienna)
Abstract: | The information contained in a large panel data set is used to date historical turning points of the Austrian business cycle and to forecast future ones. We estimate groups of series with similar time series dynamics and link the groups with a dynamic structure. The dynamic structure identifies a group of leading and a group of coincident series. Robust results across data vintages are obtained when series-specific information is incorporated in the design of the prior group probability distribution. The results are consistent with common expectations; in particular, the group of leading series includes Austrian confidence indicators and survey data, German survey indicators, some trade data, and, interestingly, the Austrian and the German stock market indices. The forecast evaluation confirms that the Markov switching panel with dynamic structure performs well when compared to other specifications.
Keywords: | Bayesian clustering, parameter heterogeneity, latent dynamic structure, Markov switching, panel data, turning points. |
JEL: | C23 E32 |
Date: | 2008–06–19 |
URL: | http://d.repec.org/n?u=RePEc:onb:oenbwp:144&r=ecm |
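At the core of such Markov switching models is the Hamilton filter, which recursively updates regime probabilities as data arrive. A bare-bones two-state version for a series with a switching mean (a generic sketch, not the paper's clustered panel setup):

    import numpy as np
    from scipy import stats

    def hamilton_filter(y, mu, sigma, P):
        """Two-state filter: mu = (mu0, mu1), P[i, j] = Pr(s_t = j | s_{t-1} = i)."""
        prob = np.array([0.5, 0.5])                # initial regime probabilities
        filtered = []
        for yt in y:
            pred = prob @ P                        # predicted regime probabilities
            lik = stats.norm.pdf(yt, mu, sigma)    # density under each regime
            prob = pred * lik
            prob /= prob.sum()                     # Bayes update
            filtered.append(prob.copy())
        return np.array(filtered)

    # Example: a low-mean "recession" regime at the end of the sample.
    rng = np.random.default_rng(7)
    y = np.concatenate([rng.normal(0.8, 0.5, 50), rng.normal(-0.5, 0.5, 20)])
    probs = hamilton_filter(y, mu=np.array([0.8, -0.5]), sigma=0.5,
                            P=np.array([[0.95, 0.05], [0.10, 0.90]]))
    print("Pr(recession) at end of sample:", probs[-1, 1])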
By: | Laura Auria; Rouslan A. Moro |
Abstract: | This paper introduces a statistical technique, Support Vector Machines (SVM), which the Deutsche Bundesbank considers as an alternative for company rating. Special attention is paid to the features of the SVM that yield higher accuracy in classifying companies as solvent or insolvent. The advantages and disadvantages of the method are discussed. The SVM is compared with more traditional approaches, such as logistic regression (Logit) and discriminant analysis (DA), using Deutsche Bundesbank data on annual income statements and balance sheets of German companies. The out-of-sample accuracy tests confirm that the SVM outperforms both DA and Logit on bootstrapped samples.
Keywords: | Company rating, bankruptcy analysis, support vector machines |
JEL: | C13 G33 C45 |
Date: | 2008 |
URL: | http://d.repec.org/n?u=RePEc:diw:diwwpp:dp811&r=ecm |
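A minimal scikit-learn comparison in the spirit of the exercise (synthetic ratios standing in for the confidential Bundesbank balance-sheet data):

    import numpy as np
    from sklearn.svm import SVC
    from sklearn.linear_model import LogisticRegression
    from sklearn.model_selection import train_test_split

    rng = np.random.default_rng(8)
    n = 1000
    ratios = rng.normal(size=(n, 3))              # stand-in financial ratios
    # Nonlinear solvency rule so the SVM's kernel has something to exploit.
    insolvent = (ratios[:, 0] ** 2 + ratios[:, 1] * ratios[:, 2] > 1).astype(int)
    Xtr, Xte, ytr, yte = train_test_split(ratios, insolvent, random_state=0)

    svm = SVC(kernel="rbf", C=1.0).fit(Xtr, ytr)
    logit = LogisticRegression().fit(Xtr, ytr)
    print("SVM out-of-sample accuracy:  ", svm.score(Xte, yte))
    print("Logit out-of-sample accuracy:", logit.score(Xte, yte))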
By: | T M Christensen (QUT); A S Hurn (QUT); K A Lindsay (University of Glasgow) |
Abstract: | Finding the minimum of an objective function, such as a least squares or negative log-likelihood function, with respect to the unknown model parameters is a problem often encountered in econometrics. Consequently, students of econometrics and applied econometricians are usually well-grounded in the broad differences between the numerical procedures employed to solve these problems. Often, however, relatively little time is given to understanding the practical subtleties of implementing these schemes when faced with ill-behaved problems. This paper addresses some of the details involved in practical optimisation, such as dealing with constraints on the parameters, specifying starting values, termination criteria and analytical gradients, and illustrates some of the general ideas with several instructive examples.
Keywords: | gradient algorithms, unconstrained optimisation, generalised method of moments. |
JEL: | C13 C63 |
Date: | 2008–08–18 |
URL: | http://d.repec.org/n?u=RePEc:qut:auncer:2008-21&r=ecm |
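These practical points map directly onto optimizer options: supplying an analytical gradient, bounding the parameters, choosing sensible starting values, and setting an explicit termination tolerance. A scipy sketch with a generic nonlinear least squares objective (illustrative, not one of the paper's examples):

    import numpy as np
    from scipy.optimize import minimize

    rng = np.random.default_rng(9)
    x = rng.normal(size=200)
    y = 1.5 * np.exp(0.8 * x) + rng.normal(0, 0.1, 200)

    def sse(theta):                                # sum-of-squares objective
        a, b = theta
        r = y - a * np.exp(b * x)
        return r @ r

    def grad(theta):                               # analytical gradient
        a, b = theta
        e = np.exp(b * x)
        r = y - a * e
        return np.array([-2 * (r @ e), -2 * a * (r @ (x * e))])

    res = minimize(sse, x0=[1.0, 0.5],             # sensible starting values
                   jac=grad, method="L-BFGS-B",
                   bounds=[(0, None), (None, 2)],  # parameter constraints
                   options={"gtol": 1e-8})         # explicit termination criterion
    print(res.x, res.success)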
By: | Aureo de Paula (Department of Economics, University of Pennsylvania) |
Abstract: | This paper studies inference in a continuous time game where an agent's decision to quit an activity depends on the participation of other players. In equilibrium, similar actions can be explained not only by direct influences but also by correlated factors. Our model can be seen as a simultaneous duration model with multiple decision makers and interdependent durations. We study the problem of determining the existence and uniqueness of equilibrium stopping strategies in this setting. This paper provides results and conditions for the detection of these endogenous effects. First, we show that the presence of such effects is a necessary and sufficient condition for simultaneous exits. This allows us to set up a nonparametric test for the presence of such influences which is robust to multiple equilibria. Second, we provide conditions under which parameters in the game are identified. Finally, we apply the model to data on desertion in the Union Army during the American Civil War and find evidence of endogenous influences. |
Keywords: | duration models, social interactions, empirical games, optimal stopping |
JEL: | C10 C70 D70 |
Date: | 2004–10–01 |
URL: | http://d.repec.org/n?u=RePEc:pen:papers:08-032&r=ecm |
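The first result suggests a simple diagnostic: count exact ties in exit times and compare with what independent durations would produce, for example by permuting the pairing. A crude permutation sketch on hypothetical paired durations (the paper's nonparametric test is more refined):

    import numpy as np

    rng = np.random.default_rng(10)
    # Pairs of durations; some pairs quit simultaneously (endogenous influence).
    n = 400
    t1 = np.round(rng.exponential(1.0, n), 1)
    t2 = np.where(rng.random(n) < 0.15, t1, np.round(rng.exponential(1.0, n), 1))

    observed = np.mean(t1 == t2)                   # observed tie frequency
    # Null distribution: break the pairing by permuting the second member.
    null = np.array([np.mean(t1 == rng.permutation(t2)) for _ in range(2000)])
    print("observed tie rate:", observed)
    print("p-value:", np.mean(null >= observed))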
By: | Chauvet, Marcelle; Potter, Simon |
Abstract: | This paper examines the predictive content of coincident variables for monitoring U.S. recessions in the presence of instabilities. We propose several specifications of a probit model for classifying phases of the business cycle. We find strong evidence in favor of the ones that allow for the possibility that the economy has experienced recurrent breaks. The recession probabilities of these models provide a clearer classification of the business cycle into expansion and recession periods, and they perform better at correctly calling recessions and avoiding false recession signals. Overall, the sensitivity, specificity, and accuracy of these models are far superior, as is their ability to signal recessions in a timely manner. The results indicate the importance of considering recurrent breaks for monitoring business cycles.
Keywords: | Recession; Instability; Bayesian Methods; Probit model; Breaks. |
JEL: | E32 C35 C11 |
Date: | 2007–12–31 |
URL: | http://d.repec.org/n?u=RePEc:pra:mprapa:10149&r=ecm |
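Sensitivity and specificity of recession calls, as used in the evaluation above, reduce to a confusion-matrix calculation once the model's probabilities are thresholded. A generic sketch (hypothetical probabilities and recession dates):

    import numpy as np

    def classification_rates(prob, actual, threshold=0.5):
        """Sensitivity, specificity and accuracy of recession signals."""
        call = prob >= threshold
        tp = np.sum(call & (actual == 1))          # recessions correctly called
        tn = np.sum(~call & (actual == 0))         # expansions correctly called
        sens = tp / np.sum(actual == 1)
        spec = tn / np.sum(actual == 0)
        acc = (tp + tn) / len(actual)
        return sens, spec, acc

    # Hypothetical model probabilities against NBER-style 0/1 recession dates.
    actual = np.array([0, 0, 1, 1, 1, 0, 0, 0, 1, 0])
    prob = np.array([0.1, 0.2, 0.7, 0.8, 0.6, 0.3, 0.1, 0.2, 0.4, 0.1])
    print(classification_rates(prob, actual))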
By: | Stephen G. Hall; George Hondroyiannis; P.A.V.B. Swamy; George S. Tavlas |
Abstract: | The New Keynesian Phillips Curve (NKPC) specifies a relationship between inflation, a forcing variable, and the current period’s expectation of future inflation. Most empirical estimates of the NKPC, typically based on Generalized Method of Moments (GMM) estimation, have found a significant role for lagged inflation, producing a “hybrid” NKPC. Using U.S. quarterly data, this paper examines whether the role of lagged inflation in the NKPC might be the spurious outcome of specification biases. Like previous investigators, we employ GMM estimation and, like those investigators, we find a significant effect for lagged inflation. We also use time-varying coefficient (TVC) estimation, a procedure that allows us to directly confront specification biases and spurious relationships. Using three separate measures of expected inflation, we find strong support for the view that, under TVC estimation, the coefficient on expected inflation is near unity and that the role of lagged inflation in the NKPC is spurious.
Keywords: | New Keynesian Phillips Curve; time-varying coefficients; spurious relationships |
JEL: | C51 E31 |
Date: | 2008–08 |
URL: | http://d.repec.org/n?u=RePEc:lec:leecon:08/26&r=ecm |
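The GMM step in such exercises amounts to instrumenting expected inflation with lagged variables in the moment condition E[(pi_t - gamma_f pi_{t+1} - gamma_b pi_{t-1} - lambda x_t) z_t] = 0. A compact two-stage least squares version showing the mechanics (synthetic data; the TVC estimation is the paper's separate contribution):

    import numpy as np

    def tsls(y, X, Z):
        """Two-stage least squares: instruments Z for regressors X."""
        Xhat = Z @ np.linalg.lstsq(Z, X, rcond=None)[0]   # first stage
        return np.linalg.lstsq(Xhat, y, rcond=None)[0]    # second stage

    rng = np.random.default_rng(11)
    T = 400
    x = rng.normal(size=T)                        # forcing variable (e.g. output gap)
    pi = np.zeros(T)
    for t in range(1, T):                         # crude inflation process
        pi[t] = 0.5 * pi[t - 1] + 0.3 * x[t] + 0.2 * rng.normal()

    y = pi[2:-1]                                               # pi_t
    X = np.column_stack([pi[3:], pi[1:-2], x[2:-1]])           # pi_{t+1}, pi_{t-1}, x_t
    Z = np.column_stack([pi[1:-2], pi[:-3], x[2:-1], x[1:-2]]) # lagged instruments
    print("gamma_f, gamma_b, lambda:", tsls(y, X, Z))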
By: | Heckman, James J. (University of Chicago) |
Abstract: | This paper uses data available from the National Opinion Research Center's (NORC) survey on religious attitudes and powerful statistical methods to evaluate the effect of prayer on the attitude of God toward human beings. |
Keywords: | unobserved variables, kernel estimator |
JEL: | Z12 |
Date: | 2008–08 |
URL: | http://d.repec.org/n?u=RePEc:iza:izadps:dp3636&r=ecm |
By: | Agapie, Adriana |
Abstract: | This paper shows that, in the case of an econometric model that is highly sensitive to the data, stochastic optimization algorithms perform better than classical gradient techniques. In addition, we show that the Repetitive Stochastic Guesstimation (RSG) algorithm, invented by Charemza, is closer to Simulated Annealing (SA) than to Genetic Algorithms (GAs), so we construct hybrids of RSG and SA to study their joint behavior. All the algorithms involved are evaluated on a short form of the Romanian macro model, derived from Dobrescu (1996). The subject of optimization is the model’s solution, as a function of the initial values (in the first stage) and of the objective functions (in the second stage). We find that a priori information helps “elitist” algorithms (such as RSG and SA) obtain the best results; on the other hand, when one has equal belief concerning the choice among different objective functions, GA gives a straight answer. Analyzing the average relative bias of the model’s solution confirms the efficiency of the stochastic optimization methods presented.
Keywords: | underground economy, Laffer curve, informal activity, fiscal policy, transition, macroeconomic model, stochastic optimization, evolutionary algorithms, Repetitive Stochastic Guesstimation
JEL: | E17 C15 C65 |
Date: | 2008–08 |
URL: | http://d.repec.org/n?u=RePEc:rjr:wpiecf:080825&r=ecm |
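Of the algorithms compared, simulated annealing is the simplest to sketch: accept uphill moves with a probability that shrinks as the temperature cools, which lets the search escape local minima that trap gradient methods. A generic implementation (not the paper's RSG-SA hybrids):

    import numpy as np

    def simulated_annealing(f, x0, steps=5000, temp0=1.0, seed=0):
        """Minimize f, accepting worse moves with a cooling probability."""
        rng = np.random.default_rng(seed)
        x = np.asarray(x0, dtype=float)
        fx = f(x)
        best, fbest = x.copy(), fx
        for k in range(steps):
            temp = temp0 / (1 + k)                       # cooling schedule
            cand = x + rng.normal(0.0, 0.1, size=x.shape)
            fc = f(cand)
            if fc < fx or rng.random() < np.exp(-(fc - fx) / temp):
                x, fx = cand, fc                         # accept the move
                if fx < fbest:
                    best, fbest = x.copy(), fx
        return best, fbest

    # Multimodal test objective where pure gradient methods can get stuck.
    f = lambda v: np.sum(v**2) + 2 * np.sin(5 * v).sum()
    print(simulated_annealing(f, x0=[2.0, -2.0]))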