on Forecasting |
By: | Heni Boubaker (International University of Rabat, BEAR LAB, Technopolis Rabat-Shore Rocade Rabat-Sale, Morocco); Giorgio Canarella (Department of Economics, Lee Business School, University of Nevada, Las Vegas; Las Vegas, Nevada); Rangan Gupta (Department of Economics, University of Pretoria, Pretoria, 0002, South Africa); Stephen M. Miller (Department of Economics, Lee Business School, University of Nevada, Las Vegas; Las Vegas, Nevada) |
Abstract: | This paper proposes a hybrid modelling approach for forecasting stock market returns and volatilities. The model, called the ARFIMA-WLLWNN model, integrates the advantages of the ARFIMA model, a wavelet decomposition technique (the maximal overlap discrete wavelet transform, MODWT, with the Daubechies least asymmetric wavelet filter) and an artificial neural network (the local linear wavelet neural network, LLWNN). The model is developed in two phases. In phase one, a wavelet decomposition improves the forecasting accuracy of the LLWNN neural network, resulting in the Wavelet Local Linear Wavelet Neural Network (WLLWNN) model. The Back Propagation (BP) and Particle Swarm Optimization (PSO) learning algorithms optimize the WLLWNN structure. In phase two, the residuals of an ARFIMA model of the conditional mean become the input to the WLLWNN model. The hybrid ARFIMA-WLLWNN model is evaluated using daily closing prices for the Dow Jones Industrial Average (DJIA) index over 01/01/2010 to 02/11/2020. The experimental results indicate that the PSO-optimized version of the hybrid ARFIMA-WLLWNN outperforms the LLWNN, WLLWNN, ARFIMA-LLWNN and ARFIMA-HYGARCH models and provides more accurate out-of-sample forecasts over validation horizons of one, five and twenty-two days. |
Keywords: | Wavelet decomposition, WLLWNN, Neural network, ARFIMA, HYGARCH |
JEL: | C45 C58 G17 |
Date: | 2020–06 |
URL: | http://d.repec.org/n?u=RePEc:pre:wpaper:202056&r=all |
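The ARFIMA component of the hybrid model rests on fractional differencing of order d. A minimal sketch of the standard fractional-difference filter (the function names and the truncation at the sample length are ours, not the paper's):

```python
import numpy as np

def frac_diff_weights(d, n):
    # binomial-expansion weights of (1 - L)^d: w_0 = 1, w_k = w_{k-1} * (k - 1 - d) / k
    w = np.empty(n)
    w[0] = 1.0
    for k in range(1, n):
        w[k] = w[k - 1] * (k - 1 - d) / k
    return w

def frac_diff(x, d):
    # apply the truncated filter to a series x
    x = np.asarray(x, dtype=float)
    w = frac_diff_weights(d, len(x))
    return np.array([w[: k + 1][::-1] @ x[: k + 1] for k in range(len(x))])
```

For d = 1 the weights collapse to the ordinary first difference (1, -1, 0, ...); fractional d between 0 and 1 gives the long-memory filter ARFIMA builds on.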
By: | Rossi, Barbara |
Abstract: | This article provides guidance on how to evaluate and improve the forecasting ability of models in the presence of instabilities, which are widespread in economic time series. Empirically relevant examples include predicting the financial crisis of 2007-2008, as well as, more broadly, fluctuations in asset prices, exchange rates, output growth and inflation. In the context of unstable environments, I discuss how to assess models' forecasting ability; how to robustify models' estimation; and how to correctly report measures of forecast uncertainty. Importantly, and perhaps surprisingly, breaks in models' parameters are neither necessary nor sufficient to generate time variation in models' forecasting performance: thus, one should not test for breaks in models' parameters, but rather evaluate their forecasting ability in a robust way. In addition, local measures of models' forecasting performance are more appropriate than traditional, average measures. |
Keywords: | business cycles; Density forecasts; Forecast Confidence Intervals; Forecasting; great recession; inflation; Instabilities; output growth; Structural Breaks; Time variation |
JEL: | D1 E21 E4 E52 H31 I3 |
Date: | 2020–03 |
URL: | http://d.repec.org/n?u=RePEc:cpr:ceprdp:14472&r=all |
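The article's preference for local over average performance measures can be illustrated with a rolling-window RMSE of forecast errors (a sketch; the window length is an illustrative choice, not one from the article):

```python
import numpy as np

def rolling_rmse(forecast_errors, window):
    # local root-mean-square error over a moving window, as opposed to a
    # single full-sample average that can mask episodes of poor performance
    e2 = np.asarray(forecast_errors, dtype=float) ** 2
    return np.sqrt(np.convolve(e2, np.ones(window) / window, mode="valid"))
```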
By: | Hernández, Juan R. |
Abstract: | The neutral band is the interval where deviations from Covered Interest Parity (CIP) are not considered meaningful arbitrage opportunities. The band is determined by transaction costs and the risk associated with arbitrage. Seemingly large deviations from CIP in the foreign exchange markets for the US Dollar crosses with Sterling, the Euro and the Mexican Peso have been the norm since the Global Financial Crisis, and the topic has attracted a great deal of attention in the literature. However, there are no estimates of the neutral band with which to assess whether deviations from CIP reflect actual arbitrage opportunities. This paper proposes an estimate of the neutral band based on the one-step-ahead density forecast obtained from a stochastic volatility model. Comparison across models is made using the log-score statistic and the probability integral transform. The stochastic volatility models have the best fit and forecasting performance, and hence deliver superior neutral-band estimates. |
Keywords: | Covered interest parity; stochastic volatility; forward filtering backward smoothing; auxiliary particle filter; density forecast |
JEL: | C53 C58 F31 F37 |
Date: | 2020 |
URL: | http://d.repec.org/n?u=RePEc:pra:mprapa:100744&r=all |
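The log score and the probability integral transform used here for model comparison can be sketched for the simplest case of Gaussian one-step-ahead density forecasts (the paper's stochastic volatility densities are not Gaussian; function names are ours):

```python
import numpy as np
from scipy.stats import norm

def log_score(y, mu, sigma):
    # average log predictive density of realized y under N(mu, sigma^2)
    # forecasts; higher is better
    return norm.logpdf(y, loc=mu, scale=sigma).mean()

def pit(y, mu, sigma):
    # probability integral transform: approximately uniform on [0, 1]
    # when the density forecasts are well calibrated
    return norm.cdf(y, loc=mu, scale=sigma)
```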
By: | Constantin Bürgi; Dorine Boumans |
Abstract: | This paper introduces a new test of predictive performance and market timing for categorical forecasts based on contingency tables when the user has a non-categorical loss function. For example, a user might be interested in the return of an underlying variable instead of just its direction. The new test statistic can also be used to determine whether directional forecasts are derived from non-directional forecasts and whether point forecasts have predictive value when transformed into directional forecasts. The tests are applied to the categorical exchange rate forecasts in the ifo Institute's World Economic Survey and to the point forecasts for quarterly GDP in the Philadelphia Fed's Survey of Professional Forecasters. We find that the loss function matters, as exchange rate forecasters perform better under non-categorical loss functions, and that the GDP forecasts have value up to two quarters ahead. |
Keywords: | contingency tables, categorical forecast, profitability, World Economic Survey, directional accuracy, market timing, forecast value |
JEL: | C12 C52 E37 F37 |
Date: | 2020 |
URL: | http://d.repec.org/n?u=RePEc:ces:ceswps:_8266&r=all |
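The contingency tables the test builds on pair predicted and realized directions. A minimal sketch of constructing the 2x2 hit table and applying a standard chi-square independence test (not the authors' new statistic, which additionally accounts for non-categorical loss):

```python
import numpy as np
from scipy.stats import chi2_contingency

def direction_table(forecast, actual):
    # 2x2 table of predicted vs. realized direction (up / down)
    up_f = np.asarray(forecast) > 0
    up_a = np.asarray(actual) > 0
    return np.array([
        [np.sum(up_f & up_a), np.sum(up_f & ~up_a)],
        [np.sum(~up_f & up_a), np.sum(~up_f & ~up_a)],
    ])

table = direction_table([1, 1, -1, -1, 1, -1], [1, -1, -1, 1, 1, -1])
chi2, p, dof, _ = chi2_contingency(table)  # independence test on the table
```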
By: | Laura Liu; Hyungsik Roger Moon; Frank Schorfheide |
Abstract: | We use dynamic panel data models to generate density forecasts for daily Covid-19 infections for a panel of countries/regions. At the core of our model is a specification that assumes that the growth rate of active infections can be represented by autoregressive fluctuations around a downward sloping deterministic trend function with a break. Our fully Bayesian approach allows us to flexibly estimate the cross-sectional distribution of heterogeneous coefficients and then implicitly use this distribution as a prior to construct Bayes forecasts for the individual time series. According to our model, there is substantial uncertainty about the evolution of infection rates, due both to parameter uncertainty and to the realization of future shocks. We find that over a one-week horizon the empirical coverage frequency of our interval forecasts is close to the nominal credible level. Weekly forecasts from our model are published at https://laurayuliu.com/covid19-panel-forecast/. |
JEL: | C11 C23 C53 |
Date: | 2020–05 |
URL: | http://d.repec.org/n?u=RePEc:nbr:nberwo:27248&r=all |
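The deterministic part of the infection-growth specification, a downward-sloping trend with a break, can be sketched as a piecewise-linear function (parameter names are ours; the paper estimates these within a Bayesian panel):

```python
import numpy as np

def broken_trend(t, mu0, slope1, slope2, t_break):
    # downward-sloping deterministic trend whose slope changes at t_break;
    # the growth rate of active infections fluctuates around this path
    t = np.asarray(t, dtype=float)
    return mu0 + slope1 * np.minimum(t, t_break) + slope2 * np.maximum(t - t_break, 0.0)
```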
By: | Dendramis, Yiannis; Kapetanios, George; Marcellino, Massimiliano |
Abstract: | In the aftermath of the recent financial crisis there has been considerable focus on methods for predicting macroeconomic variables when their behavior is subject to abrupt changes, associated for example with crisis periods. In this paper we propose similarity based approaches as a way to handle parameter instability, and apply them to macroeconomic forecasting. The rationale is that clusters of past data that match the current economic conditions can be more informative for forecasting than the entire past behavior of the variable of interest. We apply our methods to predict both simulated data in a set of Monte Carlo experiments, and a broad set of key US macroeconomic indicators. The forecast evaluation exercises indicate that similarity-based approaches perform well, in general, in comparison with other common time-varying forecasting methods, and particularly well during crisis episodes. |
Keywords: | empirical similarity; Forecast comparison; Kernel estimation; Macroeconomic forecasting; parameter time variation |
Date: | 2020–03 |
URL: | http://d.repec.org/n?u=RePEc:cpr:ceprdp:14469&r=all |
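The similarity idea, weighting past episodes by how closely their conditions match today's, can be sketched with a Gaussian kernel (a stylized one-step forecast; the bandwidth h and variable layout are illustrative, not the paper's):

```python
import numpy as np

def similarity_forecast(X_past, y_next, x_now, h):
    # weight each past period by the kernel similarity of its conditions
    # X_past[t] to the current conditions x_now, then average the outcomes
    d2 = ((np.asarray(X_past) - np.asarray(x_now)) ** 2).sum(axis=1)
    w = np.exp(-d2 / (2.0 * h ** 2))
    w /= w.sum()
    return float(w @ np.asarray(y_next))
```

Periods whose conditions are far from the current ones receive essentially zero weight, which is what makes the forecast robust to regimes that no longer apply.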
By: | Afees A. Salisu (Centre for Econometric & Allied Research, University of Ibadan, Ibadan, Nigeria); Rangan Gupta (Department of Economics, University of Pretoria, Pretoria, 0002, South Africa); Ahamuefula E. Ogbonna (Centre for Econometric & Allied Research, University of Ibadan; Department of Statistics, University of Ibadan, Ibadan, Nigeria) |
Abstract: | We forecast macroeconomic and financial uncertainties of the US over the period 1960:Q3 to 2018:Q4, based on a large data set of 303 predictors, using a wide array of constant-parameter and time-varying models. We find that uncertainty is indeed forecastable. While accurate point forecasts can be achieved without incorporating time variation in the parameters of the small-scale models for macroeconomic uncertainty and the large-scale models for financial uncertainty, time variation, along with a large data set, is required to produce precise density forecasts for both types of uncertainty. |
Keywords: | Macroeconomic and financial uncertainties, large number of predictors, constant parameter and time-varying models, forecasting |
JEL: | C22 C53 C55 |
Date: | 2020–06 |
URL: | http://d.repec.org/n?u=RePEc:pre:wpaper:202058&r=all |
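One common way to exploit such a large predictor set, not necessarily the specification used here, is to compress it into a few factors before forecasting, for example via principal components:

```python
import numpy as np
from sklearn.decomposition import PCA

# stand-in for a standardized predictor panel (T periods x 303 predictors)
rng = np.random.default_rng(0)
X = rng.standard_normal((200, 303))

factors = PCA(n_components=5).fit_transform(X)
# the factors would then enter a small forecasting regression for uncertainty
```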
By: | Mueller, H.; Rauh, C. |
Abstract: | There is a growing interest in prevention in several policy areas, which provides a strong motivation for a better integration of machine learning into models of decision making. In this article we propose a framework to tackle conflict prevention. A key problem of conflict forecasting for prevention is that predicting the start of conflict in previously peaceful countries must overcome a low baseline risk. To make progress on this hard problem, we combine unsupervised with supervised machine learning. Specifically, the latent Dirichlet allocation (LDA) model is used for feature extraction from 4.1 million newspaper articles, and these features are then used in a random forest model to predict conflict. The output of the forecast model is then analyzed in a framework of cost minimization in which excessive intervention costs due to false positives can be traded off against the damage and destruction caused by conflict. News text is able to provide a useful forecast for this hard problem even when evaluated in such a cost-benefit framework. The aggregation into topics allows the forecast to rely on subtle signals from news that are positively or negatively related to conflict risk. |
Keywords: | Conflict prediction, Conflict trap, Topic models, LDA, Random forest, News text, Machine learning |
JEL: | F51 C53 |
Date: | 2020–03–10 |
URL: | http://d.repec.org/n?u=RePEc:cam:camdae:2015&r=all |
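The two-stage pipeline, unsupervised topic extraction followed by a supervised classifier, can be sketched with scikit-learn (synthetic document-term counts and labels stand in for the 4.1 million articles and observed conflict onsets):

```python
import numpy as np
from sklearn.decomposition import LatentDirichletAllocation
from sklearn.ensemble import RandomForestClassifier

rng = np.random.default_rng(0)
counts = rng.integers(0, 5, size=(40, 30))   # document-term count matrix
labels = np.array([0, 1] * 20)               # 1 = conflict onset (synthetic)

# stage 1: LDA compresses the text into topic shares
topics = LatentDirichletAllocation(n_components=2, random_state=0).fit_transform(counts)

# stage 2: a random forest maps topic shares to predicted conflict risk
clf = RandomForestClassifier(n_estimators=50, random_state=0).fit(topics, labels)
risk = clf.predict_proba(topics)[:, 1]
```

The predicted risk can then be thresholded inside the paper's cost-minimization framework, trading off false-positive intervention costs against conflict damage.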
By: | Andrew Atkeson; Karen Kopecky; Tao Zha |
Abstract: | This paper presents a procedure for estimating and forecasting disease scenarios for COVID-19 using a structural SIR model of the pandemic. Our procedure combines the flexibility of noteworthy reduced-form approaches for estimating the progression of the COVID-19 pandemic to date with the benefits of a simple SIR structural model for interpreting these estimates and constructing forecast and counterfactual scenarios. We present forecast scenarios for a devastating second wave of the pandemic as well as for a long and slow continuation of current levels of infections and daily deaths. In our counterfactual scenarios, we find that there is no clear answer to the question of whether earlier mitigation measures would have reduced the long run cumulative death toll from this disease. In some cases, we find that it would have, but in other cases, we find the opposite — earlier mitigation would have led to a higher long-run death toll. |
JEL: | C01 C02 C11 |
Date: | 2020–06 |
URL: | http://d.repec.org/n?u=RePEc:nbr:nberwo:27335&r=all |
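The structural SIR core can be sketched as a discrete-time update of the susceptible, infected and recovered population shares (a textbook Euler step; the paper's estimation layer is not reproduced here):

```python
def sir_step(S, I, R, beta, gamma, dt=1.0):
    # one Euler step of the SIR model: beta governs transmission, gamma recovery
    new_infections = beta * S * I
    new_recoveries = gamma * I
    return (S - new_infections * dt,
            I + (new_infections - new_recoveries) * dt,
            R + new_recoveries * dt)
```

The population shares always sum to one, and infections grow as long as beta * S exceeds gamma, which is what drives the second-wave scenarios.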
By: | William D. Larson (Federal Housing Finance Agency); Tara M. Sinclair (George Washington University) |
Abstract: | Near term forecasts, also called nowcasts, are most challenging but also most important when the economy experiences an abrupt change. In this paper, we explore the performance of models with different information sets and data structures in order to best nowcast US initial unemployment claims in spring of 2020 in the midst of the COVID-19 pandemic. We show that the best model, particularly near the structural break in claims, is a state-level panel model that includes dummy variables to capture the variation in timing of state-of-emergency declarations. Autoregressive models perform poorly at first but catch up relatively quickly. Models including Google Trends are outperformed by alternative models in nearly all periods. Our results suggest that in times of structural change there may be simple approaches to exploit relevant information in the cross sectional dimension to improve forecasts. |
JEL: | C53 E24 E27 J64 R23 |
Date: | 2020–06 |
URL: | http://d.repec.org/n?u=RePEc:hfa:wpaper:20-02&r=all |
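The state-of-emergency dummies can be sketched on a toy state-week panel (the numbers and variable names are invented; the paper's model includes richer panel structure):

```python
import numpy as np
import pandas as pd

# toy panel: soe_week is each state's state-of-emergency declaration week
df = pd.DataFrame({
    "state": ["A"] * 4 + ["B"] * 4,
    "week": [1, 2, 3, 4] * 2,
    "claims": [10, 11, 40, 42, 9, 30, 31, 30],
    "soe_week": [3] * 4 + [2] * 4,
})
df["post_soe"] = (df["week"] >= df["soe_week"]).astype(int)

# pooled OLS of log claims on an intercept and the declaration dummy
X = np.column_stack([np.ones(len(df)), df["post_soe"].to_numpy()])
beta, *_ = np.linalg.lstsq(X, np.log(df["claims"].to_numpy(dtype=float)), rcond=None)
```

The dummy turns on at different weeks in different states, which is exactly the cross-sectional timing variation the panel model exploits.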
By: | Christiane Baumeister; Dimitris Korobilis; Thomas K. Lee |
Abstract: | This paper evaluates alternative indicators of global economic activity and other market fundamentals in terms of their usefulness for forecasting real oil prices and global petroleum consumption. We find that world industrial production is one of the most useful indicators that has been proposed in the literature. However, by combining measures from a number of different sources we can do even better. Our analysis results in a new index of global economic conditions and new measures for assessing future tightness of energy demand and expected oil price pressures. |
Keywords: | energy demand, forecasting, stochastic volatility, oil price pressures, petroleum consumption |
JEL: | C11 C32 C52 Q41 Q47 |
Date: | 2020 |
URL: | http://d.repec.org/n?u=RePEc:ces:ceswps:_8282&r=all |
By: | S M Raju; Ali Mohammad Tarif |
Abstract: | Bitcoin is the first digital decentralized cryptocurrency and has shown a significant increase in market capitalization in recent years. The objective of this paper is to determine the predictable price direction of Bitcoin in USD using machine learning techniques and sentiment analysis. Twitter and Reddit have attracted a great deal of attention from researchers studying public sentiment. We apply sentiment analysis and supervised machine learning principles to tweets extracted from Twitter and to Reddit posts, and analyze the correlation between Bitcoin price movements and sentiment in tweets. We explore several supervised machine learning algorithms to develop a prediction model and provide an informative analysis of future market prices. Because it is difficult to evaluate the exact nature of a time series with an ARIMA model, producing appropriate forecasts is often very difficult. We therefore also implement Recurrent Neural Networks (RNN) with long short-term memory (LSTM) cells. We thus analyze time series predictions of Bitcoin prices with greater efficiency using LSTM techniques and compare their predictability, together with sentiment analysis of Bitcoin tweets, to the standard ARIMA method. The root-mean-square errors (RMSE) of the LSTM models are 198.448 (single feature) and 197.515 (multi-feature), whereas the RMSE of the ARIMA model is 209.263, showing that the multi-feature LSTM yields the most accurate results. |
Date: | 2020–06 |
URL: | http://d.repec.org/n?u=RePEc:arx:papers:2006.14473&r=all |
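The RMSE figures quoted above are computed with the usual root-mean-square error; for reference (a generic helper, not the authors' code):

```python
import numpy as np

def rmse(y_true, y_pred):
    # root-mean-square error between realized and predicted prices
    y_true = np.asarray(y_true, dtype=float)
    y_pred = np.asarray(y_pred, dtype=float)
    return float(np.sqrt(np.mean((y_true - y_pred) ** 2)))
```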
By: | Christoph Berninger; Almond Stöcker; David Rügamer |
Abstract: | Motivated by an application to German interest rates, we propose a time-varying autoregressive model for short- and long-term prediction of time series that exhibit temporarily non-stationary behavior but are assumed to mean-revert in the long run. We use a Bayesian formulation to incorporate prior assumptions on the mean-reverting process into the model and thereby regularize predictions in the far future. We use MCMC-based inference, deriving the relevant full conditional distributions and employing a Metropolis-Hastings-within-Gibbs sampler to sample from the posterior (predictive) distribution. By combining data-driven short-term predictions with long-term distributional assumptions, our model is competitive with existing methods over short horizons while yielding reasonable predictions in the long run. We apply our model to interest rate data and contrast its forecasting performance with that of a 2-Additive-Factor Gaussian model as well as with the predictions of a dynamic Nelson-Siegel model. |
Date: | 2020–06 |
URL: | http://d.repec.org/n?u=RePEc:arx:papers:2006.05750&r=all |
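The mean-reversion mechanism behind the long-run regularization can be sketched with the h-step forecast of a stationary AR(1) (a stylized special case of the time-varying model; the symbols are ours):

```python
def ar1_forecast(y_t, mu, phi, h):
    # h-step-ahead forecast of an AR(1) around mean mu with persistence |phi| < 1:
    # the forecast decays geometrically from the current value toward mu
    return mu + phi ** h * (y_t - mu)
```

As h grows the forecast collapses to mu, which is the long-run anchoring the prior assumptions are meant to impose.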
By: | Akash Doshi; Alexander Issa; Puneet Sachdeva; Sina Rafati; Somnath Rakshit |
Abstract: | Forecasting stock prices can be framed as a time series prediction problem, for which Long Short-Term Memory (LSTM) neural networks are often used because their architecture is specifically built to solve such problems. In this paper, we consider the design of a trading strategy that performs portfolio optimization using LSTM stock price predictions for four different companies. We then customize the loss function used to train the LSTM to increase the profit earned. Moreover, we propose a data-driven approach for the optimal selection of window length and multi-step prediction length, and consider the addition of analyst calls as technical indicators to a multi-stack bidirectional LSTM strengthened by the addition of attention units. We find that the LSTM model with the customized loss function improves the performance of the trading bot over a regressive baseline such as ARIMA, while the addition of analyst calls improves performance only for certain datasets. |
Date: | 2020–06 |
URL: | http://d.repec.org/n?u=RePEc:arx:papers:2006.04992&r=all |
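A profit-oriented customization of the training loss can be sketched as a squared error that penalizes wrong directional calls more heavily (a hypothetical loss for illustration, not the one the authors train with):

```python
import numpy as np

def direction_weighted_loss(y_true, y_pred, penalty=2.0):
    # squared error, up-weighted whenever the predicted sign of the price
    # change disagrees with the realized sign
    y_true = np.asarray(y_true, dtype=float)
    y_pred = np.asarray(y_pred, dtype=float)
    err = (y_true - y_pred) ** 2
    wrong = np.sign(y_true) != np.sign(y_pred)
    return float(np.mean(np.where(wrong, penalty * err, err)))
```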