NEP: New Economics Papers on Forecasting
By: | Costantini, Mauro (Department of Economics and Finance, Brunel University London, United Kingdom); Gunter, Ulrich (Austrian National Bank, Vienna, Austria); Kunst, Robert M. (Department of Economics and Finance, Institute for Advanced Studies, Vienna, Austria and Department of Economics, University of Vienna, Austria) |
Abstract: | We study the benefits of forecast combinations based on forecast-encompassing tests relative to uniformly weighted forecast averages across rival models. For a realistic simulation design, we generate multivariate time-series samples of size 40 to 200 from a macroeconomic DSGE-VAR model. Constituent forecasts of the combinations are formed from four linear autoregressive specifications, one of them a more sophisticated factor-augmented vector autoregression (FAVAR). The forecaster is assumed not to know the true data-generating model. Results depend on the prediction horizon. While one-step prediction fails to support test-based combinations at all sample sizes, the test-based procedure clearly dominates at prediction horizons greater than two. |
Keywords: | Combining forecasts, encompassing tests, model selection, time series, DSGE-VAR model
JEL: | C15 C32 C53 |
Date: | 2012–10 |
URL: | http://d.repec.org/n?u=RePEc:ihs:ihsesp:292&r=for |
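To make the combination logic above concrete, here is a minimal sketch contrasting an equal-weight forecast average with a combination gated by a forecast-encompassing test. The Harvey-Leybourne-Newbold-style regression, the synthetic series, and all parameter values are illustrative assumptions, not the paper's DSGE-VAR design or its FAVAR constituents.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
T = 200
y = 0.1 * rng.standard_normal(T).cumsum() + rng.standard_normal(T)  # target series
f1 = y + rng.normal(0.0, 1.0, T)          # forecasts from rival model 1
f2 = y + rng.normal(0.0, 0.8, T)          # forecasts from rival model 2

f_avg = 0.5 * (f1 + f2)                   # uniformly weighted forecast average

# Encompassing regression: e1_t = lambda * (f2_t - f1_t) + u_t.
# Rejecting lambda = 0 means f1 does not encompass f2, so combining should help.
e1 = y - f1
d = f2 - f1
lam = (d @ e1) / (d @ d)                  # OLS slope, no intercept
resid = e1 - lam * d
se = np.sqrt((resid @ resid) / (T - 1) / (d @ d))
p_val = 2 * stats.t.sf(abs(lam / se), df=T - 1)

# Test-based combination: shift weight toward f2 only if encompassing is rejected
f_test = f1 + lam * d if p_val < 0.05 else f1

for name, f in [("equal weights", f_avg), ("test-based", f_test)]:
    print(f"{name}: MSE = {np.mean((y - f) ** 2):.3f}")
```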
By: | Medel, Carlos A.; Salgado, Sergio C. |
Abstract: | We test two questions: (i) is the Bayesian Information Criterion (BIC) more parsimonious than the Akaike Information Criterion (AIC)? and (ii) is the BIC better than the AIC for forecasting purposes? Using simulated data, we provide statistical inference for both hypotheses, first individually and then jointly with a multiple-hypothesis testing procedure that better controls the type-I error. Both testing procedures deliver the same result: the BIC shows an in- and out-of-sample superiority over the AIC only in a long-sample context.
Keywords: | AIC; BIC; time-series models; overfitting; forecast comparison; joint hypothesis testing |
JEL: | C51 C53 C52 C22 |
Date: | 2012–10–25 |
URL: | http://d.repec.org/n?u=RePEc:pra:mprapa:42235&r=for |
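A minimal sketch of the AIC/BIC comparison on simulated data: fit AR(p) models by OLS and score each candidate order with both criteria. The AR(2) data-generating process and sample size are assumptions for illustration; the paper's individual and joint hypothesis tests are not reproduced.

```python
import numpy as np

def fit_ar(y, p):
    """OLS fit of an AR(p) with intercept; returns residual variance, #parameters."""
    Y = y[p:]
    X = np.column_stack([np.ones(len(Y))] + [y[p - k:-k] for k in range(1, p + 1)])
    beta, *_ = np.linalg.lstsq(X, Y, rcond=None)
    return np.mean((Y - X @ beta) ** 2), p + 1

rng = np.random.default_rng(1)
T = 400
y = np.zeros(T)
for t in range(2, T):                          # true DGP: AR(2)
    y[t] = 0.5 * y[t - 1] - 0.3 * y[t - 2] + rng.standard_normal()

for p in range(1, 7):
    sigma2, k = fit_ar(y, p)
    n = T - p                                  # effective sample size
    aic = n * np.log(sigma2) + 2 * k
    bic = n * np.log(sigma2) + k * np.log(n)   # heavier log(n) penalty
    print(f"p={p}: AIC={aic:7.1f}  BIC={bic:7.1f}")
# BIC's log(n) penalty grows with the sample, so it favors smaller orders than AIC.
```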
By: | Nelson Mark (Department of Economics, University of Notre Dame) |
Abstract: | Factor analysis performed on a panel of 23 nominal exchange rates from January 1999 to December 2010 yields three common factors. This paper identifies the euro/dollar, Swiss franc/dollar and yen/dollar exchange rates as empirical counterparts to these common factors. These empirical factors explain a large proportion of exchange rate variation over time and have significant in-sample and out-of-sample predictive power.
Keywords: | Exchange Rates, Common Factors, Forecasting |
JEL: | F31 F37 |
Date: | 2012–03 |
URL: | http://d.repec.org/n?u=RePEc:nod:wpaper:011&r=for |
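The factor extraction step can be sketched with principal components on a simulated stand-in panel; the loadings, noise scale, and correlation check below are illustrative assumptions, not the paper's estimates from the 23-currency panel.

```python
import numpy as np

rng = np.random.default_rng(2)
T, N = 144, 23                            # monthly observations, 23 currencies
F_true = rng.standard_normal((T, 3))      # three latent common factors
Lam = rng.standard_normal((N, 3))         # factor loadings
X = F_true @ Lam.T + 0.5 * rng.standard_normal((T, N))

Z = (X - X.mean(axis=0)) / X.std(axis=0)  # standardize each series
U, s, Vt = np.linalg.svd(Z, full_matrices=False)
factors = U[:, :3] * s[:3]                # first three principal components
print("variance explained by 3 factors:", float((s[:3] ** 2).sum() / (s ** 2).sum()))

# Correlate each estimated factor with each series to find its closest empirical
# counterpart, as the paper does for the euro, Swiss franc and yen dollar rates.
corr = np.corrcoef(np.column_stack([factors, X]).T)[:3, 3:]
print("series most aligned with each factor:", np.abs(corr).argmax(axis=1))
```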
By: | Michael W. McCracken; Giorgio Valente |
Abstract: | Economic value calculations are increasingly used to compare the predictive performance of competing models of asset returns. However, they lack a rigorous way to validate their evidence. This paper proposes a new methodology to test whether the utility gains accruing to investors using competing predictive models are equal to zero. Monte Carlo evidence indicates that our testing procedure, which accounts for estimation error in the asymptotic variance of the test statistic, provides accurately sized and powerful tests in empirically relevant sample sizes. We apply the proposed test statistics to revisit the predictability of the US equity premium by means of various predictors.
Keywords: | Forecasting |
Date: | 2012 |
URL: | http://d.repec.org/n?u=RePEc:fip:fedlwp:2012-049&r=for |
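As a rough stand-in for the proposed test, here is a Diebold-Mariano-style t-test on period-by-period utility differentials with a Newey-West long-run variance; the paper's correction for estimation error in the asymptotic variance is not reproduced, and all data below are simulated.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(3)
T = 240
u_model = rng.normal(0.020, 0.10, T)     # realized utility, predictive model
u_bench = rng.normal(0.015, 0.10, T)     # realized utility, benchmark model

d = u_model - u_bench                    # period-by-period utility differential
L = int(T ** (1 / 3))                    # truncation lag for Newey-West
gamma = [d.var() if k == 0 else np.cov(d[k:], d[:-k])[0, 1] for k in range(L + 1)]
lrv = gamma[0] + 2 * sum((1 - k / (L + 1)) * gamma[k] for k in range(1, L + 1))
t_stat = d.mean() / np.sqrt(lrv / T)     # test of zero mean utility gain
p_val = 2 * stats.norm.sf(abs(t_stat))
print(f"mean utility gain = {d.mean():.4f}, t = {t_stat:.2f}, p = {p_val:.3f}")
```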
By: | Gary Koop (University of Strathclyde); Luca Onorante (European Central Bank) |
Abstract: | This paper uses forecasts from the European Central Bank’s Survey of Professional Forecasters to investigate the relationship between inflation and inflation expectations in the euro area. We use theoretical structures based on the New Keynesian and Neoclassical Phillips curves to inform our empirical work, and dynamic model averaging to ensure an econometric specification that captures potential changes. We use both regression-based and VAR-based methods. The paper confirms that there have been shifts in the Phillips curve and identifies three sub-periods in the EMU: an initial period of price stability, a few years in which inflation was driven mainly by external shocks, and the financial crisis, during which the New Keynesian Phillips curve outperforms alternative formulations. This finding underlines the importance of introducing informed judgment into forecasting models and is also important for the conduct of monetary policy, as the crisis entails changes in the effect of expectations on inflation and a resurgence of the “sacrifice ratio”.
Keywords: | Inflation expectations, survey of professional forecasters, Phillips curve, Bayesian, financial crisis
JEL: | E31 C53 C11
Date: | 2012–02 |
URL: | http://d.repec.org/n?u=RePEc:ecb:ecbwps:20121422&r=for |
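The dynamic model averaging step can be sketched with Raftery-style forgetting over two candidate Phillips-curve regressions; the fixed coefficients, forgetting factor, and simulated structural break below are assumptions, far simpler than the paper's full Bayesian setup.

```python
import numpy as np

rng = np.random.default_rng(4)
T = 200
x1, x2 = rng.standard_normal(T), rng.standard_normal(T)
# Inflation loads on x1 in the first half and on x2 in the second:
# a structural shift that DMA should track through its model weights.
y = np.where(np.arange(T) < 100, 0.8 * x1, 0.8 * x2) + 0.3 * rng.standard_normal(T)

preds = np.column_stack([0.8 * x1, 0.8 * x2])  # model 1 uses x1, model 2 uses x2
alpha, sigma = 0.95, 0.3                       # forgetting factor, noise scale
w = np.full(2, 0.5)                            # initial model probabilities
path = np.empty((T, 2))
for t in range(T):
    w = w ** alpha
    w /= w.sum()                               # forgetting-discounted prior
    w *= np.exp(-0.5 * ((y[t] - preds[t]) / sigma) ** 2)  # Gaussian likelihood
    w /= w.sum()                               # posterior model probabilities
    path[t] = w
print("weight on model 1, first half:", path[:100, 0].mean().round(2),
      " second half:", path[100:, 0].mean().round(2))
```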
By: | Eric R. Sims (Department of Economics, University of Notre Dame) |
Abstract: | This paper uses survey expectations data from both Germany and the United States to construct empirical proxies for time-varying business-level uncertainty. Access to the confidential micro data from the German IFO Business Climate Survey permits construction of uncertainty measures based both on ex-ante disagreement and on ex-post forecast errors. Ex-ante disagreement is strongly correlated with dispersion in ex-post forecast errors, lending credence to the widespread practice of proxying for uncertainty with disagreement. Surprise movements in either measure are associated with significant reductions in production that abate fairly quickly. We extend our analysis to US data, measuring uncertainty with forecast disagreement from the Business Outlook Survey administered by the Federal Reserve Bank of Philadelphia. In contrast to the German case, surprise increases in forecast dispersion lead to large and persistent reductions in production and employment.
Keywords: | Macroeconomic Uncertainty, Forecast Errors |
JEL: | E |
Date: | 2012–06 |
URL: | http://d.repec.org/n?u=RePEc:nod:wpaper:014&r=for |
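A minimal sketch of the two proxies, computed from a hypothetical panel of survey responses: ex-ante disagreement as the cross-sectional dispersion of forecasts, and the ex-post dispersion of realized forecast errors. The latent uncertainty process and panel dimensions are assumptions.

```python
import numpy as np

rng = np.random.default_rng(5)
T, N = 60, 500                                     # months, responding firms
sigma_t = 1.0 + 0.5 * np.sin(np.arange(T) / 6)     # latent time-varying uncertainty
common = rng.standard_normal(T)                    # common component of outcomes
forecasts = common[:, None] + sigma_t[:, None] * rng.standard_normal((T, N))
outcomes = common + 0.2 * rng.standard_normal(T)

disagreement = forecasts.std(axis=1)               # ex-ante proxy
error_dispersion = (forecasts - outcomes[:, None]).std(axis=1)  # ex-post proxy
print("corr(disagreement, error dispersion):",
      np.corrcoef(disagreement, error_dispersion)[0, 1].round(2))
```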
By: | Ronkainen, Vesa (Financial Supervisory Authority)
Abstract: | This work studies and develops tools to quantify and manage the risks and uncertainty relating to the pricing of annuities in the long run. To this end, an idealized Monte Carlo simulation model is formulated, estimated and implemented, which enables one to investigate some typical pension and life insurance products. The main risks in pension insurance relate to investment performance and mortality/longevity development. We first develop stochastic models for equity and bond returns. The S&P 500 yearly total return is modeled by an uncorrelated and Normally distributed process to which exogenous Gamma distributed negative shocks arrive with Geometrically distributed interarrival times. This regime switching jump model takes into account the empirical observations of infrequent exceptionally large losses. The 5-year US government bond yearly total return is modeled as an ARMA(1,1) process after suitably log-transforming the returns. This model is able to generate long term interest rate cycles and allows rapid year-to-year corrections in the returns. We also address the parameter uncertainty in these models.

We then develop a stochastic model for mortality. The chosen mortality forecasting model is the well-known model of Lee and Carter (1992), in which we use Bayesian MCMC methods for inference concerning the time index. Our analysis with a local version of the model showed that the assumptions of the Lee-Carter model are not fully compatible with Finnish mortality data. In particular, we found that mortality has been lower than average for the cohort born in wartime. However, because the forecasts of the two models were not significantly different, we chose the more parsimonious Lee-Carter model. Although our main focus is on the total population data, we also analysed the data for males and females separately. Finally, we build a flexible model for the dependence structure that allows us to generate stochastic scenarios in which mortality and economic processes are either uncorrelated, correlated or shock-correlated.

By using the simulation model to generate stochastic pension cash-flows, we are then able to analyse the financing of longevity risk in pension insurance and the resulting risk management issues. This is accomplished via three case studies. Two of these concentrate on the pricing and solvency questions of a pension portfolio. The first study covers a single cohort of different sizes, and the second allows for multiple cohorts of annuitants. The final case study discusses individual pension insurance from the customer and long-term points of view.

Realistic statistical long-term risk measurement is the key theme of this work, and so we compare our simulation results with the Value-at-Risk (VaR) approach. The results show that the limitations of the basic VaR approach must be carefully accounted for in applications. The VaR approach is the most commonly used risk measurement methodology in insurance and finance applications. For instance, it underlies the solvency capital requirement in Solvency II, which we also discuss in this work.
Keywords: | equities; stocks; jump model; bond; longevity; Lee-Carter model; stochastic mortality; cohort mortality; dependence model; asymmetric dependence; parameter uncertainty; stochastic annuity; pension; cohort size; solvency; internal model |
JEL: | G12 J11 |
Date: | 2012–05–25 |
URL: | http://d.repec.org/n?u=RePEc:hhs:bofism:2012_044&r=for |
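The equity-return model described in the abstract has a direct simulation: i.i.d. Normal yearly returns hit by Gamma-distributed negative shocks arriving at Geometrically distributed intervals (equivalently, an independent per-year shock probability). All parameter values below are illustrative assumptions, not the paper's estimates.

```python
import numpy as np

rng = np.random.default_rng(6)
n_paths, n_years = 10_000, 50
mu, sigma = 0.08, 0.16        # normal-regime mean and volatility (assumed)
p_shock = 0.10                # per-year shock probability => Geometric gaps
shape, scale = 2.0, 0.10      # Gamma loss-size parameters (assumed)

returns = rng.normal(mu, sigma, (n_paths, n_years))
hit = rng.random((n_paths, n_years)) < p_shock     # Geometric interarrival times
returns -= hit * rng.gamma(shape, scale, (n_paths, n_years))

wealth = np.prod(1.0 + returns, axis=1)            # 50-year accumulation per path
print("mean yearly return:", returns.mean().round(3))
print("5% quantile of terminal wealth:", np.quantile(wealth, 0.05).round(2))
```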
By: | Jeffrey S. Racine; Christopher F. Parmeter |
Abstract: | When comparing two competing approximate models using a particular loss function, the one having the smallest `expected true error' for that loss function is expected to lie closest to the underlying data generating process (DGP) given this loss function and is therefore to be preferred. In this chapter we consider a data-driven method for testing whether or not two competing approximate models are equivalent in terms of their expected true error (i.e., their expected performance on unseen data drawn from the same DGP). The proposed test is quite flexible with regard to the types of models that can be compared (i.e., nested versus non-nested, parametric versus nonparametric) and is applicable in cross-sectional and time-series settings. Moreover, in time-series settings our method overcomes two of the drawbacks associated with dominant approaches, namely, their reliance on only one split of the data and the need for a sufficiently large `hold-out' sample for these tests to possess adequate power.
Keywords: | approximate, misspecified, model selection, predictive accuracy, data mining |
Date: | 2012–10 |
URL: | http://d.repec.org/n?u=RePEc:mcm:deptwp:2012-13&r=for |
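A minimal sketch of the multi-split idea: compare two approximate models by their average test-set loss over many random train/test splits rather than a single hold-out. The models, DGP, and split scheme are assumptions; the authors' actual test statistic and its null distribution are not reproduced.

```python
import numpy as np

rng = np.random.default_rng(7)
n = 300
x = rng.uniform(-2, 2, n)
y = np.sin(x) + 0.3 * rng.standard_normal(n)   # DGP unknown to both models

diffs = []
for _ in range(500):                           # many random splits, not just one
    idx = rng.permutation(n)
    tr, te = idx[:225], idx[225:]              # 75/25 train/test split
    losses = []
    for deg in (1, 3):                         # model A: linear; model B: cubic
        coef = np.polyfit(x[tr], y[tr], deg)
        losses.append(np.mean((y[te] - np.polyval(coef, x[te])) ** 2))
    diffs.append(losses[0] - losses[1])
diffs = np.array(diffs)
print("mean test-loss difference (A - B):", diffs.mean().round(4))
print("share of splits where the cubic wins:", (diffs > 0).mean().round(2))
```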
By: | Michael McAleer (Econometric Institute, Erasmus School of Economics, Erasmus University Rotterdam and Tinbergen Institute, The Netherlands; Institute of Economic Research, Kyoto University; and Department of Quantitative Economics, Complutense University of Madrid); Juan-Angel Jimenez-Martin (Department of Quantitative Economics, Complutense University of Madrid); Teodosio Perez-Amaral (Department of Quantitative Economics, Complutense University of Madrid)
Abstract: | The Basel II Accord requires that banks and other Authorized Deposit-taking Institutions (ADIs) communicate their daily risk forecasts to the appropriate monetary authorities at the beginning of each trading day, using one or more risk models to measure Value-at-Risk (VaR). The risk estimates of these models are used to determine the capital requirements and associated capital costs of ADIs, depending in part on the number of previous violations, in which realised losses exceed the estimated VaR. In this paper we define risk management in terms of choosing from a variety of risk models and discuss the selection of optimal risk models. A new approach to model selection for predicting VaR is proposed, consisting of combining alternative risk models, and we compare conservative and aggressive strategies for choosing between VaR models. We then examine how different risk management strategies performed during the 2008–09 global financial crisis. These issues are illustrated using the Standard and Poor’s 500 Composite Index.
Keywords: | Value-at-Risk (VaR), daily capital charges, violation penalties, optimizing strategy, risk forecasts, aggressive or conservative risk management strategies, Basel Accord, global financial crisis. |
JEL: | G32 G11 G17 C53 C22 |
Date: | 2012–11 |
URL: | http://d.repec.org/n?u=RePEc:kyo:wpaper:832&r=for |
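A minimal sketch of conservative versus aggressive combination strategies: take, respectively, the most and the least extreme daily VaR forecast across rival models, then count violations. The two stub VaR models, the heavy-tailed return process, and all parameters are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(8)
T = 1000
returns = 0.01 * rng.standard_t(df=5, size=T)      # heavy-tailed daily returns

window, q = 250, 0.01                              # rolling window, 1% VaR level
var_hist = np.full(T, np.nan)
var_norm = np.full(T, np.nan)
for t in range(window, T):
    past = returns[t - window:t]
    var_hist[t] = np.quantile(past, q)             # historical-simulation VaR
    var_norm[t] = past.mean() - 2.326 * past.std() # Normal (parametric) VaR

models = np.column_stack([var_hist, var_norm])[window:]
r = returns[window:]
conservative = models.min(axis=1)                  # most extreme forecast each day
aggressive = models.max(axis=1)                    # least extreme forecast each day
for name, v in [("conservative", conservative), ("aggressive", aggressive)]:
    print(name, "violations:", int((r < v).sum()), "of", len(r))
```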
By: | Dirk Bergemann; Stephen Morris |
Date: | 2012 |
URL: | http://d.repec.org/n?u=RePEc:cla:levarc:786969000000000601&r=for |