New Economics Papers
on Risk Management
Issue of 2008‒10‒07
thirteen papers chosen by



  1. Risk Management and the Costs of the Banking Crisis By Patrick Honohan
  2. The Maximum Lq-Likelihood Method: an Application to Extreme Quantile Estimation in Finance By Davide Ferrari; Sandra Paterlini
  3. Bank Failures: The Limitations of Risk Modelling By Patrick Honohan
  4. Combining Canadian Interest-Rate Forecasts By David Jamieson Bolder; Yuliya Romanyuk
  5. Risk management in electricity markets: hedging and market incompleteness By Bert Willems; Joris Morbee
  6. Measuring bank capital requirements through Dynamic Factor analysis By Andrea Cipollini; Giuseppe Missaglia
  7. Credit Crises, Risk Management Systems and Liquidity Modelling By Frank Milne
  8. Differential Evolution for Multiobjective Portfolio Optimization By Thiemo Krink; Sandra Paterlini
  9. Real-time measurement of business conditions By S. Boragan Aruoba; Francis X. Diebold; Chiara Scotti
  10. Can Time-Varying Risk of Rare Disasters Explain Aggregate Stock Market Volatility? By Jessica Wachter
  11. Returns to investors in stocks in new industries By Gerald P. Dwyer, Jr.; Cora Barnhart
  12. Empirical analysis of corporate credit lines By Gabriel Jiménez; José A. López; Jesús Saurina
  13. Evaluating Value-at-Risk models via Quantile regressions By Wagner Piazza Gaglianone; Oliver Linton; Luiz Renato Lima

  1. By: Patrick Honohan
    Abstract: The 2007-8 banking crisis in the advanced economies has exposed deficiencies in risk management and prudential regulation approaches that rely too heavily on mechanical, albeit sophisticated, risk management models. These have aggravated private and economic losses, while perhaps protecting the taxpayer from bearing quite as high a share of the direct costs as in typical crises of the past. Policymakers and bankers need to recognize the limitations of rules-based regulation and restore a more discretionary and holistic approach to risk management.
    Date: 2008–09–29
    URL: http://d.repec.org/n?u=RePEc:iis:dispap:iiisdp262&r=rmg
  2. By: Davide Ferrari; Sandra Paterlini
    Abstract: Estimating financial risk is a critical issue for banks and insurance companies. Recently, quantile estimation based on Extreme Value Theory (EVT) has found a successful domain of application in this context, outperforming other approaches. Given a parametric model provided by EVT, a natural approach is Maximum Likelihood estimation. Although the resulting estimator is asymptotically efficient, the number of observations available to estimate the parameters of EVT models is often too small for the large-sample properties to be trustworthy. In this paper, we study a new estimator of the parameters, the Maximum Lq-Likelihood estimator (MLqE), introduced by Ferrari and Yang (2007). The MLqE is characterized by a distortion parameter q and extends the traditional log-likelihood maximization procedure: when q → 1, the new estimator approaches the traditional Maximum Likelihood Estimator (MLE), recovering its desirable asymptotic properties; when q ≠ 1 and the sample size is moderate or small, the MLqE trades bias for variance, resulting in an overall gain in accuracy (Mean Squared Error). We show that the MLqE can outperform the standard MLE when estimating tail probabilities and quantiles of the Generalized Extreme Value (GEV) and the Generalized Pareto (GP) distributions. First, we assess the relative efficiency of the MLqE and the MLE for various sample sizes, using Monte Carlo simulations. Second, we analyze the performance of the MLqE for extreme quantile estimation using real-world financial data. (A minimal sketch of the estimator follows this entry.)
    Keywords: Maximum Likelihood, Extreme Value Theory, q-Entropy, Tail-related Risk Measures
    Date: 2007–06
    URL: http://d.repec.org/n?u=RePEc:mod:recent:001&r=rmg
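    Sketch: a minimal Python illustration of the Lq-likelihood idea for a Generalized Pareto tail. This is this digest's reading, not the authors' code; the function names, starting values and simulated data are illustrative assumptions, and q = 1 recovers the plain MLE.
      import numpy as np
      from scipy import optimize, stats

      def lq(u, q):
          # Lq function: tends to log(u) as q -> 1.
          if np.isclose(q, 1.0):
              return np.log(u)
          return (u ** (1.0 - q) - 1.0) / (1.0 - q)

      def mlq_fit_gpd(x, q):
          # Maximum Lq-likelihood fit of a GPD to exceedances x.
          def neg_lq_lik(params):
              xi, log_sigma = params
              dens = stats.genpareto.pdf(x, c=xi, scale=np.exp(log_sigma))
              return np.inf if np.any(dens <= 0) else -np.sum(lq(dens, q))
          res = optimize.minimize(neg_lq_lik, x0=[0.1, np.log(x.std())],
                                  method="Nelder-Mead")
          return res.x[0], np.exp(res.x[1])

      # Small simulated sample: the regime where the bias-variance
      # trade-off of the MLqE is meant to pay off.
      x = stats.genpareto.rvs(c=0.2, scale=1.0, size=50, random_state=1)
      for q in (1.0, 0.95):   # q = 1 is the ordinary MLE
          xi, sigma = mlq_fit_gpd(x, q)
          print(f"q={q}: xi={xi:.3f}, sigma={sigma:.3f}")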
  3. By: Patrick Honohan
    Abstract: Overconfidence on the part of bankers and regulators in mechanical risk management models is an important and distinctive driver of bank failures in the current crisis. This paper illustrates the process by drawing on brief case studies of a handful of the biggest failures and losses. There are significant implications for a more holistic and less mechanical approach to risk management and prudential regulation.
    Date: 2008–10–01
    URL: http://d.repec.org/n?u=RePEc:iis:dispap:iiisdp263&r=rmg
  4. By: David Jamieson Bolder; Yuliya Romanyuk
    Abstract: Model risk is a constant danger for financial economists using interest-rate forecasts for the purposes of monetary policy analysis, portfolio allocations, or risk-management decisions. Using multiple models does not necessarily solve the problem: it greatly increases the work required and still leaves the question of which model's forecast to use. Simply put, structural shifts or regime changes (not to mention possible model misspecifications) make it difficult for any single model to capture all trends in the data and to dominate all alternative approaches. To address this issue, we examine various techniques for combining or averaging alternative models in the context of forecasting the Canadian term structure of interest rates using both yield and macroeconomic data. Following Bolder and Liu (2007), we study alternative implementations of four empirical term structure models: the Diebold and Li (2003) approach and three associated generalizations. The analysis is performed using more than 400 months of data, ranging from January 1973 to July 2007. We examine a number of model-averaging schemes in both frequentist and Bayesian settings, both following the literature in this field (such as de Pooter, Ravazzolo and van Dijk (2007)) and introducing some new combination approaches. The forecasts from individual models and combination schemes are evaluated in a number of ways; preliminary results show that model averaging generally helps mitigate model risk, and that simple combination schemes tend to outperform their more complex counterparts. Such findings carry significant implications for central-banking analysis: a unified approach to accounting for model uncertainty can lead to improved forecasts and, consequently, better decisions. (A sketch of the simplest combination schemes follows this entry.)
    Keywords: Interest rates; Econometric and statistical methods
    JEL: C11 E43 E47
    Date: 2008
    URL: http://d.repec.org/n?u=RePEc:bca:bocawp:08-34&r=rmg
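    Sketch: the two simplest combination schemes of the kind the preliminary results favour, as this digest understands them: an equal-weight average and inverse-MSE weights from past out-of-sample errors. The data and weighting rule are illustrative assumptions, not the Bank of Canada implementation.
      import numpy as np

      def combine(forecasts, past_errors=None):
          # forecasts: (n_models,) point forecasts for one horizon.
          # past_errors: (n_models, n_past) out-of-sample errors,
          # or None for the simple equal-weight average.
          if past_errors is None:
              w = np.full(len(forecasts), 1.0 / len(forecasts))
          else:
              inv_mse = 1.0 / np.mean(past_errors ** 2, axis=1)
              w = inv_mse / inv_mse.sum()
          return w @ forecasts

      rng = np.random.default_rng(1)
      truth = 4.0                            # "true" future yield, per cent
      scale = np.array([0.1, 0.3, 0.5])      # three hypothetical models
      fcsts = truth + rng.normal(0.0, scale)
      errs = rng.normal(0.0, scale[:, None], size=(3, 24))  # past errors
      print("equal weights:", combine(fcsts))
      print("inverse-MSE  :", combine(fcsts, errs))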
  5. By: Bert Willems; Joris Morbee
    Abstract: The high volatility of electricity markets gives producers and retailers an incentive to hedge their exposure to electricity prices by buying and selling derivatives. This paper studies how welfare and investment incentives are affected when markets for derivatives are introduced, and to what extent this depends on market completeness. We develop an equilibrium model of the electricity market with risk-averse firms and a set of traded financial products, more specifically forwards and an increasing number of options. Using this model, we first show that aggregate welfare in the market increases with the number of derivatives offered. If firms are concerned about large negative shocks to their profitability due to liquidity constraints, option markets are particularly attractive from a welfare point of view. Second, we demonstrate that increasing the number of derivatives improves the investment decisions of small firms (especially when firms are risk-averse), because the additional financial markets signal to firms how they can reduce the overall sector risk. The information content of prices also increases: the quality of investment decisions based on risk-free probabilities, inferred from market prices, improves as markets become more complete. Finally, we show that government intervention may be needed, because private investors may not have the right incentives to create the optimal number of markets. (A toy illustration of market completeness follows this entry.)
    Date: 2008–08
    URL: http://d.repec.org/n?u=RePEc:ete:ceswps:ces0823&r=rmg
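    Sketch: a toy illustration of the completeness mechanism, not the authors' equilibrium model; the five price states and strikes are made up. Each extra option strike spans one more state of the world, which is what lets a richer derivative menu improve hedging.
      import numpy as np

      spot = np.array([20.0, 40.0, 60.0, 80.0, 100.0])   # spot states, EUR/MWh
      forward = spot - spot.mean()                        # payoff of one forward
      calls = [np.maximum(spot - k, 0.0) for k in (20.0, 40.0, 60.0, 80.0)]

      # Rank of the payoff matrix = number of independently hedgeable states.
      for n_opts in range(len(calls) + 1):
          payoffs = np.vstack([forward] + calls[:n_opts])
          rank = np.linalg.matrix_rank(payoffs)
          print(f"forward + {n_opts} option(s): spans {rank} of {len(spot)} states")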
  6. By: Andrea Cipollini; Giuseppe Missaglia
    Abstract: In this paper, using industry sector stock returns as proxies of firm asset values, we obtain through-the-cycle bank capital requirements by Monte Carlo simulation of a bank loan portfolio loss density. We depart from the Basel 2 analytical formula developed by Gordy (2003) for the computation of economic capital by, first, allowing dynamic heterogeneity in the factor loadings and, second, accounting for stochastic dependent recoveries. Dynamic heterogeneity in the factor loadings is introduced by using dynamic forecasts from a Dynamic Factor model fitted to a large dataset of macroeconomic credit drivers. The empirical findings show a decrease in the degree of portfolio credit risk once we move from the Basel 2 analytic formula to the Dynamic Factor model specification. (A one-factor simulation sketch follows this entry.)
    Keywords: Dynamic Factor Model, Forecasting, Stochastic Simulation, Risk Management, Banking
    JEL: C32 C53 E17 G21 G33
    Date: 2008–02
    URL: http://d.repec.org/n?u=RePEc:mod:recent:010&r=rmg
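    Sketch: the one-factor Monte Carlo baseline that the paper departs from, with homogeneous factor loadings and a crude stand-in for dependent recoveries (LGD worsening with the systematic factor). All parameters (PD, correlation, LGD rule) are illustrative assumptions, not the paper's calibration.
      import numpy as np
      from scipy import stats

      rng = np.random.default_rng(2)
      n_sims, n_loans = 20_000, 500
      pd_, rho = 0.02, 0.15                 # default prob., asset correlation
      thresh = stats.norm.ppf(pd_)          # default threshold on asset value

      Z = rng.standard_normal(n_sims)                        # systematic factor
      eps = rng.standard_normal((n_sims, n_loans))           # idiosyncratic shocks
      default = np.sqrt(rho) * Z[:, None] + np.sqrt(1 - rho) * eps < thresh

      # LGD worsens when the systematic factor is bad: dependent recoveries.
      lgd = np.clip(0.45 - 0.2 * Z[:, None]
                    + 0.1 * rng.standard_normal((n_sims, n_loans)), 0.0, 1.0)

      loss = (default * lgd).mean(axis=1)   # portfolio loss rate per scenario
      el, var999 = loss.mean(), np.quantile(loss, 0.999)
      print(f"EL={el:.4f}  VaR(99.9%)={var999:.4f}  capital={var999 - el:.4f}")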
  7. By: Frank Milne (Queen's University)
    Abstract: This paper explores the theoretical structure and implementation of Risk Management (RM) systems in financial institutions, using the current credit crisis as a test of these models' deficiencies. The paper suggests possible modifications to these systems to allow for "liquidity" in asset trading, links the modifications to the theory of banking and financial crises, and suggests possible ways in which regulators and central banks could exploit or modify RM systems to test for systemic risks.
    Keywords: Credit Risk, Risk Management, Liquidity
    JEL: G18 G21 L13
    Date: 2008–09
    URL: http://d.repec.org/n?u=RePEc:jdi:wpaper:1&r=rmg
  8. By: Thiemo Krink; Sandra Paterlini
    Abstract: Financial portfolio optimization is a challenging problem. First, the problem is multiobjective (minimize risk and maximize profit) and the objective functions are often multimodal and non-smooth (e.g. Value at Risk). Second, managers often face real-world constraints, which are typically non-linear. Hence, conventional optimization techniques, such as quadratic programming, cannot be used. Stochastic search heuristics can be an attractive alternative. In this paper, we propose a new multiobjective algorithm for portfolio optimization: DEMPO - Differential Evolution for Multiobjective Portfolio Optimization. The main advantage of this new algorithm is its generality, i.e., the ability to tackle a portfolio optimization task as it is, without simplifications. Our empirical results show the capability of our approach to obtain highly accurate results in reasonable runtime, in comparison with quadratic programming and another state-of-the-art search heuristic, the so-called NSGA II. (A sketch of the underlying DE kernel follows this entry.)
    Keywords: Portfolio Optimization, Multiobjective, Real-world Constraints, Value at Risk, Expected Shortfall, Differential Evolution
    JEL: G11 C61 D81
    Date: 2008–06
    URL: http://d.repec.org/n?u=RePEc:mod:recent:021&r=rmg
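    Sketch: the single-objective Differential Evolution kernel (mutation, binomial crossover, greedy selection) on which approaches like DEMPO build. The Pareto ranking and real-world constraints of the paper are not reproduced here, and the covariance matrix is simulated; this is a sketch under those assumptions, not the authors' algorithm.
      import numpy as np

      rng = np.random.default_rng(3)
      n_assets, pop_size, n_gens, F, CR = 5, 40, 200, 0.7, 0.9
      A = rng.normal(size=(n_assets, n_assets))
      cov = A @ A.T / n_assets              # illustrative return covariance

      def fitness(w):
          # Portfolio variance, long-only weights normalised to sum to 1.
          w = np.abs(w)
          w = w / w.sum()
          return w @ cov @ w

      pop = rng.random((pop_size, n_assets))
      fit = np.array([fitness(w) for w in pop])
      for _ in range(n_gens):
          for i in range(pop_size):
              a, b, c = pop[rng.choice(pop_size, 3, replace=False)]
              mutant = a + F * (b - c)                 # differential mutation
              cross = rng.random(n_assets) < CR        # binomial crossover
              cross[rng.integers(n_assets)] = True     # keep >= 1 mutant gene
              trial = np.where(cross, mutant, pop[i])
              f_trial = fitness(trial)
              if f_trial <= fit[i]:                    # greedy selection
                  pop[i], fit[i] = trial, f_trial

      best = np.abs(pop[fit.argmin()])
      best /= best.sum()
      print("min-variance weights:", best.round(3))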
  9. By: S. Boragan Aruoba; Francis X. Diebold; Chiara Scotti
    Abstract: We construct a framework for measuring economic activity at high frequency, potentially in real time. We use a variety of stock and flow data observed at mixed frequencies (including very high frequencies), and we use a dynamic factor model that permits exact filtering. We illustrate the framework in a prototype empirical example and a simulation study calibrated to the example. (A scalar filtering sketch follows this entry.)
    Keywords: Business conditions
    Date: 2008
    URL: http://d.repec.org/n?u=RePEc:fip:fedpwp:08-19&r=rmg
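    Sketch: a scalar caricature of the state-space idea, with gaps (NaN) standing in for indicators released at lower frequencies; the paper's model has several indicators and a richer state, and all parameter values here are made up.
      import numpy as np

      rng = np.random.default_rng(4)
      rho, lam, q_var, r_var, T = 0.95, 1.0, 0.1, 0.5, 200

      # Latent business-conditions factor and a partially observed indicator.
      x = np.zeros(T)
      for t in range(1, T):
          x[t] = rho * x[t - 1] + rng.normal(0.0, np.sqrt(q_var))
      y = lam * x + rng.normal(0.0, np.sqrt(r_var), T)
      y[np.arange(T) % 3 != 0] = np.nan     # two of three obs "missing"

      m, P = 0.0, 1.0                       # filtered mean and variance
      filtered = np.empty(T)
      for t in range(T):
          m, P = rho * m, rho ** 2 * P + q_var        # predict
          if not np.isnan(y[t]):                      # update only when data arrive
              K = P * lam / (lam ** 2 * P + r_var)
              m, P = m + K * (y[t] - lam * m), (1.0 - K * lam) * P
          filtered[t] = m

      print("corr(filtered, true factor):",
            round(np.corrcoef(filtered, x)[0, 1], 3))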
  10. By: Jessica Wachter
    Abstract: This paper introduces a model in which the probability of a rare disaster varies over time. I show that the model can account for the high equity premium and high volatility in the aggregate stock market. At the same time, the model generates a low mean and volatility for the government bill rate, as well as economically significant excess stock return predictability. The model is set in continuous time, assumes recursive preferences, and is solved in closed form. Recursive preferences, as well as time-variation in the disaster probability, are shown to be key to the model's success.
    JEL: G12
    Date: 2008–10
    URL: http://d.repec.org/n?u=RePEc:nbr:nberwo:14386&r=rmg
  11. By: Gerald P. Dwyer, Jr.; Cora Barnhart
    Abstract: We examine the returns to investors in publicly traded stock in new industries, using data from the United States on sellers of own-brand personal computers, airlines and airplane manufacturers, automobile manufacturers, railroads, and telegraphs. We find that a relatively small number of companies generate outstanding returns while many firms fail, and that individual stocks in new industries typically have highly volatile returns. Compared with indexes for the same period, expected returns are higher for two industries, lower for one industry, and roughly the same for two industries. Portfolios of firms in new industries generally have lower Sharpe ratios than the overall market.
    Date: 2008
    URL: http://d.repec.org/n?u=RePEc:fip:fedawp:2008-21&r=rmg
  12. By: Gabriel Jiménez (Banco de España); José A. López (Federal Reserve Bank of San Francisco); Jesús Saurina (Banco de España)
    Abstract: Since bank credit lines are a major source of corporate funding, we examine the determinants of credit line usage with a comprehensive database of Spanish corporate credit lines. A line’s default status is a key factor driving its usage, which increases as a firm’s financial condition worsens. Line usage decreases by roughly 10% for each year of its life. Lender characteristics, such as the number and length of a firm’s banking relationships, are found to affect a firm’s usage decisions, and credit line usage is found to be inversely related to macroeconomic conditions.
    Keywords: credit lines, firm default, bank lending, exposure at default
    JEL: E32 G18 M21
    Date: 2008–10
    URL: http://d.repec.org/n?u=RePEc:bde:wpaper:0821&r=rmg
  13. By: Wagner Piazza Gaglianone; Oliver Linton; Luiz Renato Lima
    Abstract: This paper is concerned with evaluating Value at Risk estimates. It is well known that using only binary variables for this purpose sacrifices too much information. However, most of the specification tests (also called backtests) available in the literature, such as Christoffersen (1998) and Engle and Manganelli (2004), are based on such variables. In this paper we propose a new backtest that does not rely solely on binary variables. We show that the new backtest provides a sufficient condition for assessing the performance of a quantile model, whereas the existing ones do not. The proposed methodology allows us to identify periods of increased risk exposure based on a quantile regression model (Koenker and Xiao, 2002). Our theoretical findings are corroborated through a Monte Carlo simulation and an empirical exercise with daily S&P500 time series. (A sketch of the quantile-regression backtest follows this entry.)
    Date: 2008–09
    URL: http://d.repec.org/n?u=RePEc:fgv:epgewp:679&r=rmg
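    Sketch: the core idea on simulated data: run a quantile regression of returns on the VaR forecast at level tau; under a correct VaR model the tau-quantile of returns given the forecast is the forecast itself, so the intercept should be 0 and the slope 1. The volatility path and the confidence-interval shortcut are this digest's assumptions, not the paper's exact test statistic.
      import numpy as np
      import statsmodels.api as sm

      rng = np.random.default_rng(5)
      T, tau = 1000, 0.05
      sigma = np.abs(rng.normal(1.0, 0.2, T))   # toy volatility path
      returns = sigma * rng.standard_normal(T)
      var_fcst = -1.645 * sigma                 # 5% VaR from a normal model

      # Quantile regression of returns on the VaR forecast at level tau.
      fit = sm.QuantReg(returns, sm.add_constant(var_fcst)).fit(q=tau)
      print(fit.params)       # should be close to (0, 1)
      print(fit.conf_int())   # a joint Wald test of (0, 1) is the backtest idea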

General information on the NEP project can be found at https://nep.repec.org. For comments please write to the director of NEP, Marco Novarese at <director@nep.repec.org>. Put “NEP” in the subject, otherwise your mail may be rejected.
NEP’s infrastructure is sponsored by the School of Economics and Finance of Massey University in New Zealand.