New Economics Papers on Forecasting
By: | Marie Diron (Brevan Howard Asset Management); Benoît Mojon (Corresponding author: European Central Bank, Kaiserstrasse 29, 60311 Frankfurt am Main, Germany) |
Abstract: | This paper first shows that the forecast error incurred when assuming that future inflation will be equal to the inflation target announced by the central bank is typically at least as small as, and often smaller than, the forecast errors of model-based and published inflation forecasts. It then shows that there are substantial benefits in having rule-of-thumb agents who simply trust that the central bank will deliver its pre-announced inflation objective.
Keywords: | Monetary policy, credibility, inflation targeting, inflation forecast. |
JEL: | E5 |
Date: | 2005–12 |
URL: | http://d.repec.org/n?u=RePEc:ecb:ecbwps:20050564&r=for |
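A minimal sketch of the rule-of-thumb comparison described in the abstract above, on synthetic data; the target value, the AR(1) benchmark, and the evaluation window are our assumptions, not the authors' setup:

    # Sketch: compare the forecast error of "forecast = inflation target"
    # against a recursively fitted AR(1) forecast, on simulated inflation.
    import numpy as np

    rng = np.random.default_rng(0)
    target = 2.0                      # assumed inflation target (percent)
    T = 200
    infl = np.empty(T)
    infl[0] = target
    for t in range(1, T):             # inflation mean-reverting to the target
        infl[t] = target + 0.8 * (infl[t - 1] - target) + rng.normal(0, 0.5)

    # Recursive one-step-ahead forecasts over the last 50 observations.
    errs_target, errs_ar = [], []
    for t in range(150, T):
        b1, b0 = np.polyfit(infl[:t - 1], infl[1:t], 1)  # OLS AR(1) fit
        errs_ar.append(infl[t] - (b0 + b1 * infl[t - 1]))
        errs_target.append(infl[t] - target)

    rmse = lambda e: np.sqrt(np.mean(np.square(e)))
    print(f"RMSE, target rule: {rmse(errs_target):.3f}")
    print(f"RMSE, AR(1):       {rmse(errs_ar):.3f}")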
By: | Frédérick Demers; Annie De Champlain |
Abstract: | The authors investigate the behaviour of core inflation in Canada to analyze three key issues: (i) homogeneity in the response of various price indexes to demand or real exchange rate shocks relative to the response of aggregate core inflation; (ii) whether using disaggregate data helps to improve the forecast of core inflation; and (iii) whether using monthly data helps to improve quarterly forecasts. The authors show that the response of inflation to output-gap or real exchange rate shocks varies considerably across the components, although the average response remains low; they also show that the average response has decreased over time. To forecast monthly inflation, the use of disaggregate data is a significant improvement over the use of aggregate data. However, the improvements in forecasts of quarterly rates of inflation are only minor. Overall, it remains difficult to properly model and forecast monthly core inflation in Canada. |
Keywords: | Econometric and statistical methods; Inflation and prices |
JEL: | E37 C5 |
Date: | 2005 |
URL: | http://d.repec.org/n?u=RePEc:bca:bocawp:05-44&r=for |
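A stylized version of the aggregate-versus-disaggregate forecasting question studied above, with invented components; the weights, persistence parameters, and sample sizes are assumptions:

    # Sketch: forecast aggregate inflation directly vs. by aggregating
    # component forecasts, when components differ in persistence.
    import numpy as np

    rng = np.random.default_rng(1)
    T, K = 300, 4
    rho = np.array([0.2, 0.5, 0.7, 0.9])   # heterogeneous persistence
    w = np.array([0.4, 0.3, 0.2, 0.1])     # assumed basket weights
    x = np.zeros((T, K))
    for t in range(1, T):
        x[t] = rho * x[t - 1] + rng.normal(0, 1, K)
    agg = x @ w

    def ar1_forecast(series, t):
        b1, b0 = np.polyfit(series[:t - 1], series[1:t], 1)
        return b0 + b1 * series[t - 1]

    errs_direct, errs_disagg = [], []
    for t in range(250, T):
        errs_direct.append(agg[t] - ar1_forecast(agg, t))
        comp_fc = sum(w[k] * ar1_forecast(x[:, k], t) for k in range(K))
        errs_disagg.append(agg[t] - comp_fc)

    rmse = lambda e: np.sqrt(np.mean(np.square(e)))
    print("direct:", round(rmse(errs_direct), 3),
          " disaggregate:", round(rmse(errs_disagg), 3))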
By: | Nicoletta Batini (International Monetary Fund); Alejandro Justiniano (International Monetary Fund); Paul Levine (University of Surrey); Joseph Pearlman (London Metropolitan University) |
Abstract: | This paper provides a first attempt to quantify and at the same time utilize estimated measures of uncertainty for the design of robust interest rate rules. We estimate several variants of a linearized form of a New Keynesian model using quarterly US data. Both our theoretical and numerical results indicate that Inflation-Forecast-Based (IFB) rules are increasingly prone to the problem of indeterminacy as the forward horizon increases. As a consequence the stabilization performance of optimized rules of this type worsens too. Robust IFB rules can be designed to avoid indeterminacy in an uncertain environment, but at an increasing utility loss as rules become more forward-looking. |
Keywords: | robustness, Taylor rules, inflation-forecast-based rules, indeterminacy |
JEL: | E52 E37 E58 |
Date: | 2004–09 |
URL: | http://d.repec.org/n?u=RePEc:sur:surrec:0804&r=for |
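For concreteness, a generic inflation-forecast-based rule of the kind analyzed above can be written as (our notation, not necessarily the paper's exact specification):

    i_t = \rho\, i_{t-1} + (1-\rho)\,\theta\,\mathbb{E}_t[\pi_{t+j}], \qquad \rho \in [0,1),\quad \theta > 1,

where i_t is the policy rate, \pi_{t+j} is inflation j periods ahead, and j is the feedback horizon. The abstract's indeterminacy result says that, other things equal, a larger j shrinks the set of (\rho, \theta) pairs for which the rational expectations equilibrium is unique.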
By: | Amstad, Marlene; Fischer, Andreas M |
Abstract: | This paper analyzes the pass-through from import prices to CPI inflation in real time. Our strategy follows an event-study approach, which compares inflation forecasts before and after import price releases. Inflation forecasts are modelled using a dynamic factor procedure that relies on daily panels of Swiss data. We find strong evidence that monthly import price releases provide important information for CPI inflation forecasts and that the behaviour of updated forecasts is consistent with a time-varying pass-through. The robustness of this latter result is underpinned in two ways: an alternative CPI measure that excludes price components subject to administered pricing, as well as panels capturing different levels of information breadth. Besides implying a time-varying pass-through, our empirical findings cast doubt on a prominent role for sticky prices in the low pass-through findings.
Keywords: | common factors; daily panels; pass-through |
JEL: | E52 E58 |
Date: | 2005–12 |
URL: | http://d.repec.org/n?u=RePEc:cpr:ceprdp:5395&r=for |
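A toy version of the event-study regression implicit in the abstract above; the surprise series, the pass-through coefficient, and the noise scale are all invented:

    # Sketch: does the CPI-inflation nowcast revision on import-price
    # release days track the size of the import-price surprise?
    import numpy as np

    rng = np.random.default_rng(11)
    n = 60                                  # monthly release days (assumed)
    surprise = rng.normal(0, 1, n)          # import-price surprise
    passthrough = 0.3                       # assumed true pass-through
    revision = passthrough * surprise + rng.normal(0, 0.2, n)

    b, a = np.polyfit(surprise, revision, 1)
    print(f"estimated pass-through into the inflation forecast: {b:.2f}")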
By: | Helge Berger (Free University Berlin, Department of Economics, Boltzmannstr. 20, 12161 Berlin, Germany & CESifo.); Michael Ehrmann (European Central Bank, Kaiserstrasse 29, Postfach 16 03 19, 60066 Frankfurt am Main, Germany.); Marcel Fratzscher (European Central Bank,Kaiserstrasse 29, Postfach 16 03 19, 60066 Frankfurt am Main, Germany.) |
Abstract: | Monetary policy in the euro area is conducted within a multi-country, multi-cultural, and multi-lingual context involving multiple central banking traditions. How does this heterogeneity affect the ability of economic agents to understand and to anticipate monetary policy by the ECB? Using a database of surveys of professional ECB policy forecasters in 24 countries, we find remarkable differences in forecast accuracy, and show that they are partly related to geography and clustering around informational hubs, as well as to country-specific economic conditions and traditions of independent central banking in the past. In large part this heterogeneity can be traced to differences in forecasting models. While some systematic differences between analysts have been transitional and are indicative of learning, others are more persistent.
Keywords: | monetary policy; ECB; forecast; geography; history; heterogeneity; Taylor rule; learning; transmission; survey data; communication. |
JEL: | E52 E58 G14 |
Date: | 2006–01 |
URL: | http://d.repec.org/n?u=RePEc:ecb:ecbwps:20060578&r=for |
By: | Alex Cukierman (Tel-Aviv University and CentER, Tilburg University); Francesco Lippi (Banca d'Italia)
Abstract: | This paper characterizes endogenous monetary policy when policymakers are uncertain about the extent to which movements in output and inflation are due to changes in potential output or to cyclical demand and cost shocks. We refer to this informational limitation as the “information problem” (IP). The main results of the paper are: 1. Policy is likely to be excessively loose (restrictive) for some time when there is a large decrease (increase) in potential output in comparison with a full information benchmark. 2. Errors in forecasting potential output and the output gap are generally serially correlated. These findings provide a partial explanation for the inflation of the seventies and the price stability of the nineties. 3. A quantitative assessment, based on an empirical model of the US economy developed by Rudebusch and Svensson (1999), indicates that during and following periods of large changes in potential output the IP significantly affects the dynamics of inflation and output. 4. The increase in the Fed’s conservativeness between the seventies and the nineties, and a more realistic appreciation of the uncertainties surrounding potential output in the second period, imply that the IP had a stronger impact in the seventies than in the nineties.
Keywords: | monetary policy, potential output, filtering, inflation, output gap |
JEL: | E5 |
Date: | 2004–06 |
URL: | http://d.repec.org/n?u=RePEc:bdi:wptemi:td_493_04&r=for |
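A bare-bones version of the filtering problem behind the "information problem" above: the policymaker observes output but must infer potential output in real time. The state space and noise variances below are illustrative assumptions, not the paper's model:

    # Sketch: scalar Kalman filter extracting potential output y* from
    # observed output y = y* + cyclical noise, with y* a random walk.
    import numpy as np

    rng = np.random.default_rng(2)
    T = 120
    q, r = 0.1, 1.0                    # state and observation noise variances
    ystar = np.cumsum(rng.normal(0, np.sqrt(q), T))   # true potential output
    y = ystar + rng.normal(0, np.sqrt(r), T)          # observed output

    est, P = 0.0, 1.0                  # initial state estimate and variance
    errors = []
    for t in range(T):
        P = P + q                      # predict: y*_t = y*_{t-1} + shock
        K = P / (P + r)                # Kalman gain
        est = est + K * (y[t] - est)   # update with the new observation
        P = (1 - K) * P
        errors.append(ystar[t] - est)

    # Real-time estimation errors are serially correlated, as in result 2.
    e = np.array(errors)
    print("autocorrelation of filtering errors:",
          round(np.corrcoef(e[1:], e[:-1])[0, 1], 2))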
By: | Francis Vitek (University of British Columbia) |
Abstract: | This paper develops and estimates an unobserved components model for purposes of monetary policy analysis and inflation targeting in a small open economy. Cyclical components are modeled as a multivariate linear rational expectations model of the monetary transmission mechanism, while trend components are modeled as unobserved components in a way that ensures the existence of a well-defined balanced growth path. Full information maximum likelihood estimation of this unobserved components model, conditional on prior information concerning the values of trend components, provides a quantitative description of the monetary transmission mechanism in a small open economy, yields a mutually consistent set of indicators of inflationary pressure together with confidence intervals, and facilitates the generation of relatively accurate forecasts.
Keywords: | Monetary policy analysis; Inflation targeting; Small open economy; Unobserved components model; Indicators of inflationary pressure; Monetary transmission mechanism; Forecast performance evaluation |
JEL: | E52 F41 F47 |
Date: | 2005–12–27 |
URL: | http://d.repec.org/n?u=RePEc:wpa:wuwpma:0512019&r=for |
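The paper's trend-cycle decomposition is fully model-based; as a much simpler stand-in, a Hodrick-Prescott filter split of a simulated series shows the kind of unobserved-components output involved. The series and the smoothing parameter are assumptions, and the HP filter is our substitute, not the author's method:

    # Sketch: trend/cycle split via the Hodrick-Prescott filter, a crude
    # stand-in for a full unobserved-components model.
    import numpy as np

    rng = np.random.default_rng(3)
    T, lam = 160, 1600                         # quarterly data, usual lambda
    y = np.cumsum(0.5 + rng.normal(0, 1, T))   # simulated log output

    # The HP trend solves (I + lam * D'D) tau = y, with D the
    # second-difference matrix.
    D = np.zeros((T - 2, T))
    for i in range(T - 2):
        D[i, i:i + 3] = [1.0, -2.0, 1.0]
    trend = np.linalg.solve(np.eye(T) + lam * (D.T @ D), y)
    cycle = y - trend                          # the implied output gap
    print("std of cycle (pp):", round(cycle.std(), 2))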
By: | Refet Gurkaynak; Justin Wolfers |
Abstract: | In September 2002, a new market in “Economic Derivatives” was launched allowing traders to take positions on future values of several macroeconomic data releases. We provide an initial analysis of the prices of these options. We find that market-based measures of expectations are similar to survey-based forecasts although the market-based measures somewhat more accurately predict financial market responses to surprises in data. These markets also provide implied probabilities of the full range of specific outcomes, allowing us to measure uncertainty, assess its driving forces, and compare this measure of uncertainty with the dispersion of point-estimates among individual forecasters (a measure of disagreement). We also assess the accuracy of market-generated probability density forecasts. A consistent theme is that few of the behavioral anomalies present in surveys of professional forecasts survive in equilibrium, and that these markets are remarkably well calibrated. Finally we assess the role of risk, finding little evidence that risk-aversion drives a wedge between market prices and probabilities in this market. |
JEL: | C5 C82 D8 E3 E4 G15 |
Date: | 2006–01 |
URL: | http://d.repec.org/n?u=RePEc:nbr:nberwo:11929&r=for |
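A sketch of how prices of range (digital) options map into an implied probability distribution and an uncertainty measure, as used in the paper above; the bins and prices are invented, and risk premia are ignored (which the paper finds is roughly innocuous in this market):

    # Sketch: turn prices of binary options on outcome ranges into an
    # implied distribution, then read off the mean and uncertainty.
    import numpy as np

    bins = np.array([-0.5, 0.0, 0.5, 1.0, 1.5])        # outcome bins (assumed)
    prices = np.array([0.05, 0.20, 0.40, 0.25, 0.10])  # digital option prices

    probs = prices / prices.sum()     # normalized (risk-neutral) probabilities
    mean = (bins * probs).sum()       # market-implied point forecast
    uncertainty = np.sqrt(((bins - mean) ** 2 * probs).sum())
    print(f"implied mean {mean:.2f}, implied std (uncertainty) {uncertainty:.2f}")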
By: | Osmani Teixeira de Carvalho Guillén (IBMEC Business School - Rio de Janeiro and Banco Central do Brasil); João Victor Issler (Graduate School of Economics - EPGE, Getulio Vargas Foundation); George Athanasopoulos (Department of Economics and Business Statistics, Monash University) |
Abstract: | Using vector autoregressive (VAR) models and Monte-Carlo simulation methods we investigate the potential gains for forecasting accuracy and estimation uncertainty of two commonly used restrictions arising from economic relationships. The first reduces parameter space by imposing long-term restrictions on the behavior of economic variables as discussed by the literature on cointegration, and the second reduces parameter space by imposing short-term restrictions as discussed by the literature on serial-correlation common features (SCCF). Our simulations cover three important issues on model building, estimation, and forecasting. First, we examine the performance of standard and modified information criteria in choosing lag length for cointegrated VARs with SCCF restrictions. Second, we provide a comparison of forecasting accuracy of fitted VARs when only cointegration restrictions are imposed and when cointegration and SCCF restrictions are jointly imposed. Third, we propose a new estimation algorithm where short- and long-term restrictions interact to estimate the cointegrating and the cofeature spaces respectively. We have three basic results. First, ignoring SCCF restrictions has a high cost in terms of model selection, because standard information criteria too frequently choose inconsistent models, with too small a lag length. Criteria selecting lag and rank simultaneously have a superior performance in this case. Second, this translates into a superior forecasting performance of the restricted VECM over the VECM, with important improvements in forecasting accuracy - reaching more than 100% in extreme cases. Third, the new algorithm proposed here fares very well in terms of parameter estimation, even when we consider the estimation of long-term parameters, opening up the discussion of joint estimation of short- and long-term parameters in VAR models.
Keywords: | reduced rank models, model selection criteria, forecasting accuracy |
JEL: | C32 C53 |
Date: | 2006–01–02 |
URL: | http://d.repec.org/n?u=RePEc:ibr:dpaper:2006-01&r=for |
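A compact illustration of one model-selection step from the simulations above: choosing a VAR lag length by an information criterion on simulated data. The data-generating process is an assumption, and no cointegration or SCCF restrictions are imposed here:

    # Sketch: select VAR lag length by BIC on data simulated from a VAR(1).
    import numpy as np

    rng = np.random.default_rng(4)
    T, n = 200, 2
    A = np.array([[0.5, 0.1], [0.2, 0.4]])        # true VAR(1) coefficients
    y = np.zeros((T, n))
    for t in range(1, T):
        y[t] = y[t - 1] @ A.T + rng.normal(0, 1, n)

    def var_bic(y, p):
        T_eff = len(y) - p
        X = np.hstack([y[p - i - 1:len(y) - i - 1] for i in range(p)])
        X = np.hstack([np.ones((T_eff, 1)), X])   # constant plus p lags
        Y = y[p:]
        B, *_ = np.linalg.lstsq(X, Y, rcond=None)
        U = Y - X @ B
        sigma = (U.T @ U) / T_eff                 # residual covariance
        return np.log(np.linalg.det(sigma)) + B.size * np.log(T_eff) / T_eff

    for p in range(1, 5):
        print(f"p={p}: BIC={var_bic(y, p):.3f}")  # BIC should pick p=1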
By: | Mika Kuismanen (Research Department, European Central Bank, Kaiserstrasse 29, 60311 Frankfurt, Germany); Luigi Pistaferri (Stanford University, Department of Economics, 579 Serra Mall, Stanford, CA 94305-6072, U.S.A.) |
Abstract: | Most of the empirical literature on consumption behaviour over the last decades has focused on estimating Euler equations. However, there is now consensus that data-related problems make this approach unfruitful, especially for answering policy relevant issues. Alternatively, many papers have proposed using the consumption function to forecast behaviour. This paper follows in this tradition, by deriving an analytical consumption function in the presence of intertemporal non-separabilities, "superior information", and income shocks of different nature, both transitory and permanent. The results provide evidence for durability, and show that people are relatively better at forecasting short-term rather than long-term shocks. |
Keywords: | Consumption, Superior Information, Durability, Habit Persistence, Panel Data. |
JEL: | D11 D12 D82 E21 |
Date: | 2006–01 |
URL: | http://d.repec.org/n?u=RePEc:ecb:ecbwps:20060572&r=for |
By: | George Athanasopoulos; Farshid Vahid |
Abstract: | In this paper, we argue that there is no compelling reason for restricting the class of multivariate models considered for macroeconomic forecasting to VARs given the recent advances in VARMA modelling methodology and improvements in computing power. To support this claim, we use real macroeconomic data and show that VARMA models forecast macroeconomic variables more accurately than VAR models. |
Keywords: | Forecasting, Identification, Multivariate time series, Scalar components, VARMA models. |
JEL: | C32 C51 |
Date: | 2006–01 |
URL: | http://d.repec.org/n?u=RePEc:msh:ebswps:2006-4&r=for |
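To make the VAR-versus-VARMA comparison concrete, here is a sketch using statsmodels' VARMAX on simulated VARMA(1,1) data. This is not the authors' scalar-components methodology; unrestricted VARMA estimation is only loosely identified, which is precisely the problem their methodology addresses, so statsmodels may issue warnings:

    # Sketch: out-of-sample comparison of a fitted VARMA(1,1) against a
    # pure VAR, on data simulated from a VARMA(1,1) process.
    import numpy as np
    from statsmodels.tsa.api import VARMAX

    rng = np.random.default_rng(5)
    T, n = 300, 2
    A = np.array([[0.5, 0.2], [0.1, 0.4]])    # AR coefficient matrix
    M = np.array([[0.6, 0.0], [0.0, 0.6]])    # MA coefficient matrix
    e = rng.normal(0, 1, (T, n))
    y = np.zeros((T, n))
    for t in range(1, T):
        y[t] = y[t - 1] @ A.T + e[t] + e[t - 1] @ M.T

    train, test = y[:280], y[280:]
    for order in [(1, 1), (3, 0)]:            # VARMA(1,1) vs a VAR(3) proxy
        res = VARMAX(train, order=order, trend='n').fit(disp=False)
        fc = np.asarray(res.forecast(steps=len(test)))
        rmse = np.sqrt(np.mean((test - fc) ** 2))
        print(f"order {order}: multi-step RMSE {rmse:.3f}")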
By: | Wolfgang Härdle; Zdenek Hlavka; Gerhard Stahl |
Abstract: | The Value-at-Risk calculation reduces the dimensionality of the risk factor space. The main reasons for such simplifications are technical efficiency and the logic and statistical appropriateness of the model. In Chapter 2 we present three simple mappings: the mapping on the market index, the principal components model, and the model with equally correlated risk factors. The comparison of these models in Chapter 3 is based on the literature on the verification of weather forecasts (Murphy and Winkler 1992, Murphy 1997). Some considerations on the quantitative analysis are presented in the fourth chapter. In the last chapter, we present an empirical analysis of the DAX data using XploRe.
Keywords: | Value-at-Risk, market index model, principal components, random effects model, probability forecast |
JEL: | C51 C52 G20 |
Date: | 2006–01 |
URL: | http://d.repec.org/n?u=RePEc:hum:wpaper:sfb649dp2006-003&r=for |
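A sketch of the first of the three mappings above, the market index model, against a full-covariance VaR; the portfolio, betas, and return-generating process are invented:

    # Sketch: 99% Value-at-Risk from the full covariance matrix vs. a
    # one-factor "market index" mapping of the risk factor space.
    import numpy as np

    rng = np.random.default_rng(6)
    T, n = 1000, 10
    market = rng.normal(0, 0.01, T)                  # market index returns
    beta = rng.uniform(0.5, 1.5, n)
    returns = np.outer(market, beta) + rng.normal(0, 0.005, (T, n))
    w = np.full(n, 1.0 / n)                          # equally weighted portfolio

    z99 = 2.326                                      # standard normal 99% quantile
    full_sigma = np.sqrt(w @ np.cov(returns.T) @ w)  # full covariance mapping
    idx_sigma = abs(w @ beta) * market.std()         # index model: drops
                                                     # idiosyncratic risk
    print(f"VaR99, full covariance: {z99 * full_sigma:.4%}")
    print(f"VaR99, index model:     {z99 * idx_sigma:.4%}")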
By: | Tuomas A. Peltonen (European Central Bank, Postfach 16 03 19, 60066 Frankfurt am Main, Germany) |
Abstract: | This paper analyzes the predictability of emerging market currency crises by comparing the often-used probit model to a new method, namely a multi-layer perceptron artificial neural network (ANN) model. According to the results, both models were able to signal currency crises reasonably well in-sample, but the forecasting power of these models out-of-sample was found to be rather poor. Only in the case of the Russian (1998) crisis were both models able to signal the crisis well in advance. The results reinforced the view that developing a stable model that can predict or even explain currency crises is a challenging task.
Keywords: | Currency crises, emerging markets, artificial neural networks. |
JEL: | F31 E44 C25 C23 C45 |
Date: | 2006–01 |
URL: | http://d.repec.org/n?u=RePEc:ecb:ecbwps:20060571&r=for |
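A sketch of the probit-versus-MLP horse race on a synthetic crisis panel; the features, data-generating process, and network size are invented, not the paper's specification:

    # Sketch: probit vs. multi-layer perceptron on simulated crisis data.
    import numpy as np
    import statsmodels.api as sm
    from sklearn.neural_network import MLPClassifier

    rng = np.random.default_rng(7)
    N = 1000
    X = rng.normal(0, 1, (N, 3))       # e.g. overvaluation, reserves, credit
    latent = 1.2 * X[:, 0] - 0.8 * X[:, 1] + 0.5 * X[:, 0] * X[:, 2]
    crisis = (latent + rng.normal(0, 1, N) > 1.5).astype(int)

    X_tr, X_te = X[:700], X[700:]
    y_tr, y_te = crisis[:700], crisis[700:]

    probit = sm.Probit(y_tr, sm.add_constant(X_tr)).fit(disp=False)
    p_probit = probit.predict(sm.add_constant(X_te))
    mlp = MLPClassifier(hidden_layer_sizes=(8,), max_iter=2000,
                        random_state=0).fit(X_tr, y_tr)
    p_mlp = mlp.predict_proba(X_te)[:, 1]

    for name, p in [("probit", p_probit), ("MLP", p_mlp)]:
        hit = ((p > 0.5) == y_te).mean()       # out-of-sample hit rate
        print(f"{name}: out-of-sample hit rate {hit:.2%}")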
By: | Charles Engel; John H. Rogers |
Abstract: | We investigate the possibility that the large current account deficits of the U.S. are the outcome of optimizing behavior. We develop a simple long-run world equilibrium model in which the current account is determined by the expected discounted present value of its future share of world GDP relative to its current share of world GDP. The model suggests that under some reasonable assumptions about future U.S. GDP growth relative to the rest of the advanced countries -- more modest than the growth over the past 20 years -- the current account deficit is near optimal levels. We then explore the implications for the real exchange rate. Under some plausible assumptions, the model implies little change in the real exchange rate over the adjustment path, though the conclusion is sensitive to assumptions about tastes and technology. Then we turn to empirical evidence. A test of current account sustainability suggests that the U.S. is not on a long-run sustainable path. A direct test of our model finds that the dynamics of the U.S. current account -- the increasing deficits over the past decade -- are difficult to explain under a particular statistical model (Markov-switching) of expectations of future U.S. growth. But, if we use survey data on forecasted GDP growth in the G7, our very simple model appears to explain the evolution of the U.S. current account remarkably well. We conclude that expectations of robust performance of the U.S. economy relative to the rest of the advanced countries are a contender -- though not the only legitimate contender -- for explaining the U.S. current account deficit.
JEL: | F3 F4 |
Date: | 2006–01 |
URL: | http://d.repec.org/n?u=RePEc:nbr:nberwo:11921&r=for |
By: | Ando, Amy; Harrington, Winston (Resources For the Future); McConnell, Virginia (Resources For the Future) |
Abstract: | The expense and inconvenience of enhanced vehicle emissions testing using the full 240-second dynamometer test has led states to search for ways to shorten the test process. In fact, all states that currently use the IM240 allow some type of fast-pass, usually as early in the test as second 31, and Arizona allows vehicles to fast-fail after second 93. While these shorter tests save states millions of dollars in inspection lane and driver costs, there is a loss of information since test results are no longer comparable across vehicles. This paper presents a methodology for estimating full 240-second results from partial-test results for three pollutants: HC, CO, and NOx. Using a random sample of vehicles in Arizona that received full 240-second tests, we use regression analysis to estimate the relationship between emissions at second 240 and emissions at earlier seconds in the test. We examine the influence of other variables such as age, model-year group, and the pollution level itself on this relationship. We then use the estimated coefficients in several applications. First, we attempt to shed light on the frequent assertion that the results of the dynamometer test provide guidance for vehicle repair of failing vehicles. Using a probit analysis, we find that the probability that a failing vehicle will pass the test on the first retest is greater the further the test has progressed. Second, we test the accuracy of our estimates for forecasting fleet emissions from partial-test emissions results in Arizona. We find that forecast fleet average emissions are very close to the actual fleet averages.
URL: | http://d.repec.org/n?u=RePEc:rff:dpaper:dp-98-24&r=for |
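A toy version of the partial-test regression above; the simulated emissions data and the log-log specification are our assumptions, and the authors' specification may differ:

    # Sketch: predict full-test (second-240) emissions from a partial-test
    # reading at second 93, via OLS on simulated test data.
    import numpy as np

    rng = np.random.default_rng(8)
    N = 500
    e240 = rng.lognormal(0.0, 0.8, N)            # "true" full-test emissions
    e93 = e240 * rng.lognormal(-0.1, 0.3, N)     # noisy partial-test reading

    # log-log OLS: log(e240) = a + b * log(e93)
    b, a = np.polyfit(np.log(e93), np.log(e240), 1)
    pred = np.exp(a + b * np.log(e93))

    print(f"fitted slope: {b:.2f}")
    print(f"actual fleet mean {e240.mean():.3f}  vs  forecast {pred.mean():.3f}")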
By: | Darrell Duffie; Leandro Saita; Ke Wang
Abstract: | We provide maximum likelihood estimators of term structures of conditional probabilities of corporate default, incorporating the dynamics of firm-specific and macroeconomic covariates. For U.S. Industrial firms, based on over 390,000 firm-months of data spanning 1979 to 2004, the level and shape of the estimated term structure of conditional future default probabilities depends on a firm's distance to default (a volatility-adjusted measure of leverage), on the firm's trailing stock return, on trailing S&P 500 returns, and on U.S. interest rates, among other covariates. Distance to default is the most influential covariate. Default intensities are estimated to be lower with higher short-term interest rates. The out-of-sample predictive performance of the model is an improvement over that of other available models. |
JEL: | C41 G33 E44 |
Date: | 2006–01 |
URL: | http://d.repec.org/n?u=RePEc:nbr:nberwo:11962&r=for |
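The paper estimates the covariate dynamics by maximum likelihood; the stripped-down sketch below instead holds covariates constant and uses invented coefficients, just to show how a default intensity maps into a term structure of default probabilities:

    # Sketch: term structure of default probabilities from an intensity
    # that falls with distance to default and with short rates.
    import numpy as np

    def default_intensity(dtd, short_rate):
        # log-linear intensity (coefficients invented): lower with more
        # distance to default and, as the paper finds, with higher rates
        return np.exp(-0.5 - 0.8 * dtd - 0.3 * short_rate)

    horizons = np.arange(1, 11)                  # years
    for dtd in [1.0, 3.0, 6.0]:
        lam = default_intensity(dtd, short_rate=0.04)
        pd_term = 1.0 - np.exp(-lam * horizons)  # constant-intensity survival
        print(f"DtD={dtd}: 1y PD {pd_term[0]:.2%}, 10y PD {pd_term[-1]:.2%}")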
By: | Wernstedt, Kris (Resources For the Future); Hersh, Robert |
Abstract: | We examine the use of El Niño-Southern Oscillation (ENSO) forecasts for flood planning in the Pacific Northwest. Using theories of resource mobilization as a conceptual foundation, the paper relies on: 1) case studies of three communities vulnerable to flooding that have had access to long-term forecasts of ENSO conditions; and 2) analysis of data collected from a survey of nearly 60 local emergency managers, planners, and public works staff. We find that understanding the regulatory machinery and other institutions involved in using climate forecasts is critical to more effective use of these forecasts. Forecast use could be promoted by: 1) an extension service to broker climate information; 2) the identification or creation of federal authorities to fund activities to mitigate ENSO impacts; and 3) the proactive use of ENSO signals to identify areas most likely to be influenced by climate anomalies.
Keywords: | flooding, ENSO, La Niña, climate variability, climate forecast, natural hazards, water policy |
JEL: | Q2 |
URL: | http://d.repec.org/n?u=RePEc:rff:dpaper:dp-02-27&r=for |
By: | Wernstedt, Kris (Resources For the Future); Hersh, Robert |
Abstract: | Recent scientific and technical advances have increased the potential use of long-term seasonal climate forecasts for improving water resource management. This paper examines the role that forecasts, in particular those based on the El Niño-Southern Oscillation (ENSO) cycle, can play in flood planning in the Pacific Northwest. While strong evidence of an association between ENSO signals and flooding in the region exists, this association is open to more than one interpretation depending on: a) the metric used to test the strength of the association; b) the definition of critical flood events; c) site-specific features of watersheds; and d) the characteristics of flood management institutions. A better understanding and appreciation of such ambiguities, both institutional and statistical, is needed to facilitate the use of climate forecast information for flood planning and response.
Keywords: | Flooding, Climate, ENSO, Water Resources Planning, Water Policy, Water Management |
JEL: | Q2 |
URL: | http://d.repec.org/n?u=RePEc:rff:dpaper:dp-01-56-&r=for |
By: | Sedjo, Roger (Resources For the Future); Goetzl, Alberto
Abstract: | This discussion paper reports on a Workshop on Wood Fiber Supply Modeling held October 3-4, 1996, in Washington, DC. The purpose of this discussion paper is to provide an overview of some of the modeling work being done related to timber supply modeling and some of the issues related to the more useful application of wood fiber supply and projection models. This paper includes brief presentations of three commonly used long-term timber projection and forecasting models: the Timber Assessment Market Model (TAMM) of the Forest Service; the Cintrafor Global Trade Model (CGTM) of the University of Washington; and the Timber Supply Model (TSM) of Resources for the Future. Issues related to the usefulness of the models are also addressed, along with a discussion of some applications of other timber or fiber projection models. The usefulness of the models is considered both from a technical perspective and from the perspective of various model users.
URL: | http://d.repec.org/n?u=RePEc:rff:dpaper:dp-97-22&r=for |
By: | Marc Wildi (University of Technical Sciences, Zurich); Bernd Schips (Swiss Institute for Business Cycle Research (KOF), Swiss Federal Institute of Technology Zurich (ETH)) |
Abstract: | Estimation of signals at the current boundary of time series is an important task in many practical applications. In order to apply the symmetric filter at current time, model-based approaches typically rely on forecasts generated from a time series model in order to extend (stretch) the time series into the future. In this paper we analyze the performance of concurrent filters based on TRAMO and X-12-ARIMA for business survey data and compare the results to a new efficient estimation method which does not rely on forecasts. It is shown that both model-based procedures are subject to heavy model misspecification related to false unit root identification at frequency zero and at seasonal frequencies. Our results strongly suggest that the traditional model-based approach should not be used for problems involving multi-step ahead forecasts such as the determination of concurrent filters.
Keywords: | Signal extraction, concurrent filter, unit root, amplitude and time delay.
Date: | 2004–12 |
URL: | http://d.repec.org/n?u=RePEc:kof:wpskof:04-96&r=for |
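A minimal sketch of the "stretch with forecasts" approach that the paper evaluates: extend the series with model forecasts so a symmetric moving average can be applied at the current boundary. The AR(1)-in-differences extension model and the flat 13-term filter are our assumptions:

    # Sketch: extend a series with AR(1) forecasts of its differences,
    # then apply a symmetric moving average at the boundary.
    import numpy as np

    rng = np.random.default_rng(9)
    T, half = 100, 6                      # 13-term symmetric MA (half = 6)
    y = np.cumsum(rng.normal(0, 1, T))    # some trending series

    d = np.diff(y)                        # AR(1) on first differences
    b1, b0 = np.polyfit(d[:-1], d[1:], 1)
    ext = [y[-1]]
    dlast = d[-1]
    for _ in range(half):
        dlast = b0 + b1 * dlast           # iterate the forecast forward
        ext.append(ext[-1] + dlast)
    y_ext = np.concatenate([y, ext[1:]])  # the stretched series

    weights = np.ones(2 * half + 1) / (2 * half + 1)
    concurrent = weights @ y_ext[T - 1 - half:T + half]  # estimate at time T-1
    print(f"signal estimate at the boundary: {concurrent:.3f}")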
By: | Rob J Hyndman; Muhammad Akram |
Abstract: | This paper discusses the instability of eleven nonlinear state space models that underlie exponential smoothing. Hyndman et al. (2002) proposed a framework of 24 state space models for exponential smoothing, including the well-known simple exponential smoothing, Holt's linear and Holt-Winters' additive and multiplicative methods. This was extended to 30 models with Taylor's (2003) damped multiplicative methods. We show that eleven of these 30 models are unstable, having infinite forecast variances. The eleven models are those with additive errors and either multiplicative trend or multiplicative seasonality, as well as the models with multiplicative errors, multiplicative trend and additive seasonality. The multiplicative Holt-Winters' model with additive errors is among the eleven unstable models. We conclude that: (1) a model with a multiplicative trend or a multiplicative seasonal component should also have a multiplicative error; and (2) a multiplicative trend should not be mixed with additive seasonality.
Keywords: | exponential smoothing, forecast variance, nonlinear models, prediction intervals, stability, state space models. |
JEL: | C53 C22 |
Date: | 2006–01 |
URL: | http://d.repec.org/n?u=RePEc:msh:ebswps:2006-3&r=for |
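One way to see the instability result is to simulate one of the eleven flagged models, additive errors with multiplicative trend, and watch the sample forecast variance as the horizon grows. The parameter values and starting states below are invented, and because the explosion is driven by rare paths where the level nears zero, exact numbers vary across runs:

    # Sketch: simulate an additive-error, multiplicative-trend state space
    # model; the e/level term in the trend equation lets rare paths where
    # the level nears zero blow up the forecast distribution.
    import numpy as np

    rng = np.random.default_rng(10)
    alpha, beta = 0.5, 0.1
    n_paths, H = 50000, 40
    level = np.full(n_paths, 2.0)          # small level: trouble shows sooner
    trend = np.full(n_paths, 1.02)

    for h in range(1, H + 1):
        e = rng.normal(0, 1.0, n_paths)
        mu = level * trend                 # one-step-ahead mean
        y = mu + e                         # additive observation error
        trend = trend + beta * e / level   # division by the level: the problem
        level = mu + alpha * e
        if h % 10 == 0:
            print(f"h={h:2d}: sample variance of y = {y.var():.3e}")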
By: | Aldy, Joseph (Resources For the Future) |
Abstract: | Understanding and considering the distribution of per capita carbon dioxide (CO2) emissions is important in designing international climate change proposals and incentives for participation. I evaluate historic international emissions distributions and forecast future distributions to assess whether per capita emissions have been converging or will converge. I find evidence of convergence among 23 member countries of the Organisation for Economic Co-operation and Development (OECD), whereas emissions appear to be diverging for an 88-country global sample over 1960–2000. Forecasts based on a Markov chain transition matrix provide little evidence of future emissions convergence and indicate that emissions may diverge in the near term. I also review the shortcomings of environmental Kuznets curve regressions and structural models in characterizing future emissions distributions. |
Keywords: | emissions distributions, environmental Kuznets curve, Markov chain transition matrix |
JEL: | O40 Q54 Q56 |
URL: | http://d.repec.org/n?u=RePEc:rff:dpaper:dp-05-53&r=for |
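A sketch of the Markov chain machinery used for the forecasts above: iterate a transition matrix over per capita emissions classes until the distribution settles. The five-state matrix below is invented, not the paper's estimate:

    # Sketch: iterate a Markov transition matrix over emissions quintiles
    # to its long-run (ergodic) distribution.
    import numpy as np

    # rows: current quintile; columns: quintile next period
    P = np.array([[0.85, 0.15, 0.00, 0.00, 0.00],
                  [0.10, 0.75, 0.15, 0.00, 0.00],
                  [0.00, 0.10, 0.80, 0.10, 0.00],
                  [0.00, 0.00, 0.10, 0.80, 0.10],
                  [0.00, 0.00, 0.00, 0.10, 0.90]])

    dist = np.full(5, 0.2)               # start from a uniform distribution
    for _ in range(200):                 # iterate dist' = dist @ P
        dist = dist @ P
    print("ergodic distribution:", np.round(dist, 3))
    # Mass piling up in the extreme quintiles indicates divergence;
    # mass concentrating in the middle indicates convergence.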