New Economics Papers
on Risk Management
Issue of 2010‒10‒02
sixteen papers chosen by



  1. Basel III and responding to the recent Financial Crisis: progress made by the Basel Committee in relation to the need for increased bank capital and increased quality of loss absorbing capital By Ojo, Marianne
  2. Price of Risk - Recent Evidence from Large Financials By Karim Youssef; Manmohan Singh
  3. Estimation of operational value-at-risk in the presence of minimum collection threshold: An empirical study By Chernobai, Anna; Menn, Christian; Rachev, Svetlozar T.; Trück, Stefan
  4. Time series analysis for financial market meltdowns By Young Shin Kim; Rachev, Svetlozar T.; Bianchi, Michele Leonardo; Mitov, Ivan; Fabozzi, Frank J.
  5. A semiparametric Bayesian approach to the analysis of financial time series with applications to value at risk estimation By Concepción Ausín; Pedro Galeano; Pulak Ghosh
  6. Model Selection and Testing of Conditional and Stochastic Volatility Models By Massimiliano Caporin; Michael McAleer
  7. Aggregation of exponential smoothing processes with an application to portfolio risk evaluation By SBRANA, Giacomo; SILVESTRINI, Andrea
  8. Simulation of Risk Processes By Burnecki, Krzysztof; Weron, Rafal
  9. Scaling portfolio volatility and calculating risk contributions in the presence of serial cross-correlations By Nikolaus Rab; Richard Warnung
  10. Efficient Evaluation of Multidimensional Time-Varying Density Forecasts with an Application to Risk Management By Evarist Stoja; Arnold Polanski
  11. On the Safety Loading for Chain Ladder Estimates: A Monte Carlo Simulation Study By Magda Schiegl
  12. Commodities inventory effect By CARPANTIER, Jean-François
  13. On Cross-risk Vulnerability By Yannick Malevergne; Beatrice Rey
  14. Ruin probabilities in a finite-horizon risk model with investment and reinsurance By Rosario Romera; Wolfgang Runggaldier
  15. Financial protection of the state against natural disasters : a primer By Ghesquiere, Francis; Mahul, Olivier
  16. Modeling of Interest Rate Term Structures under Collateralization and its Implications By Masaaki Fujii; Yasufumi Shimada; Akihiko Takahashi

  1. By: Ojo, Marianne
    Abstract: Developments since the introduction of the 1988 Basel Capital Accord have resulted in a growing realisation that new forms of risk have emerged and that previously existing and managed forms require further redress. The revised Capital Accord, Basel II, evolved into a form of meta-regulation – a type of regulation which involves the risk management of internal risks within firms. The 1988 Basel Accord was adopted as a means of achieving two primary objectives: Firstly, “…to help strengthen the soundness and stability of the international banking system – this being facilitated where international banking organisations were encouraged to supplement their capital positions; and secondly, to mitigate competitive inequalities.” As well as briefly outlining various efforts and measures which have been undertaken and adopted by several bodies in response to the recent Financial Crisis, this paper considers why efforts aimed at developing a new framework, namely Basel III, have been undertaken, and the global developments which have precipitated the need for such a framework. Further, it attempts to evaluate the strengths and flaws inherent in the present and future regulatory frameworks by drawing a comparison between Basel II and the enhanced framework which will eventually be referred to as Basel III.
    Keywords: capital; cyclicality; buffers; risk; regulation; internal controls; equity; liquidity; losses; forward looking provisions; silent participations; Basel III
    JEL: E0 K2 E32 E58 E44
    Date: 2010–09–22
    URL: http://d.repec.org/n?u=RePEc:pra:mprapa:25291&r=rmg
  2. By: Karim Youssef; Manmohan Singh
    Abstract: Probability of default (PD) measures have been widely used in estimating potential losses of, and contagion among, large financial institutions. In a period of financial stress, however, the existing methods to compute PDs can generate loss estimates that vary significantly. This paper discusses three issues that should be taken into account in using PD-based methodologies for loss or contagion analyses: (i) the use of “risk-neutral probabilities” vs. “real-world probabilities”; (ii) the divergence between movements in credit and equity markets during periods of financial stress; and (iii) the assumption of stochastic vs. fixed recovery for financial institutions’ assets. All three elements have nontrivial implications for providing an accurate estimate of default probabilities and associated losses as inputs for setting policies related to large banks in distress.
    Date: 2010–07–22
    URL: http://d.repec.org/n?u=RePEc:imf:imfwpa:10/190&r=rmg
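    Sketch: The abstract’s first issue (risk-neutral vs. real-world probabilities) interacts with its third (the recovery assumption). A minimal Python illustration — not the authors’ code; the 500 bp spread and the recovery values are hypothetical — uses the standard “credit triangle” approximation, spread ≈ (1 − recovery) × default intensity, to show how strongly a fixed-recovery assumption drives the implied risk-neutral PD.
```python
import numpy as np

spread = 0.05                     # hypothetical 500 bp CDS spread
for R in (0.40, 0.20, 0.0):       # alternative fixed recovery assumptions
    h = spread / (1.0 - R)        # implied risk-neutral default intensity
    pd_1y = 1.0 - np.exp(-h)      # one-year risk-neutral PD
    print(f"recovery {R:.0%}: intensity {h:.3f}, 1y risk-neutral PD {pd_1y:.2%}")
```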
  3. By: Chernobai, Anna; Menn, Christian; Rachev, Svetlozar T.; Trück, Stefan
    Abstract: The recently finalized Basel II Capital Accord requires banks to adopt a procedure to estimate the operational risk capital charge. The Advanced Measurement Approaches, which are currently mandated for all large internationally active US banks, require the use of historical operational loss data. Operational loss databases are typically subject to a minimum recording threshold of roughly $10,000. We demonstrate that ignoring such thresholds leads to biases in the corresponding parameter estimates. Using publicly available operational loss data, we analyze the effects of model misspecification on the resulting expected loss, Value-at-Risk, and Conditional Value-at-Risk figures, and show that underestimation of the regulatory capital is a consequence of such model error. The choice of an adequate loss distribution is conducted via in-sample goodness-of-fit procedures and backtesting, using both classical and robust methodologies.
    Date: 2010
    URL: http://d.repec.org/n?u=RePEc:zbw:kitwps:4&r=rmg
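    Sketch: A minimal Python sketch of the paper’s central point, under the simplifying assumption of lognormal severities (the $10,000 threshold is from the abstract; all other numbers are made up): fitting only the observed losses while ignoring the collection threshold biases the parameter estimates, whereas maximizing the truncated likelihood f(x)/(1 − F(H)) recovers them.
```python
import numpy as np
from scipy import stats, optimize

rng = np.random.default_rng(0)
mu_true, sigma_true, H = 9.0, 2.0, 10_000.0        # hypothetical parameters
losses = rng.lognormal(mu_true, sigma_true, 50_000)
observed = losses[losses >= H]                     # minimum collection threshold

# Naive MLE: ignores the threshold and is biased.
logs = np.log(observed)
print("naive MLE:     mu=%.2f  sigma=%.2f" % (logs.mean(), logs.std()))

# Truncated MLE: density f(x) / (1 - F(H)) for x >= H.
def negloglik(theta):
    mu, sigma = theta
    if sigma <= 0:
        return np.inf
    z = (np.log(observed) - mu) / sigma
    logf = stats.norm.logpdf(z) - np.log(sigma) - np.log(observed)
    logtail = stats.norm.logsf((np.log(H) - mu) / sigma)
    return -(logf - logtail).sum()

res = optimize.minimize(negloglik, x0=[logs.mean(), logs.std()],
                        method="Nelder-Mead")
print("truncated MLE: mu=%.2f  sigma=%.2f" % tuple(res.x))
```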
  4. By: Young Shin Kim; Rachev, Svetlozar T.; Bianchi, Michele Leonardo; Mitov, Ivan; Fabozzi, Frank J.
    Abstract: There appears to be a consensus that the recent instability in global financial markets may be attributable in part to the failure of financial modeling. More specifically, current risk models have failed to properly assess the risks associated with large adverse stock price behavior. In this paper, we first discuss the limitations of classical time series models for forecasting financial market meltdowns. Then we set forth a framework capable of forecasting both extreme events and highly volatile markets. Based on the empirical evidence presented in this paper, our framework offers an improvement over prevailing models for evaluating stock market risk exposure during distressed market periods.
    Keywords: ARMA-GARCH model, α-stable distribution, tempered stable distribution, value-at-risk (VaR), average value-at-risk (AVaR)
    Date: 2010
    URL: http://d.repec.org/n?u=RePEc:zbw:kitwps:2&r=rmg
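    Sketch: The paper’s framework combines ARMA-GARCH dynamics with (tempered) α-stable innovations; implementing those distributions takes more than a few lines, so this sketch substitutes a heavy-tailed Student-t for the innovation law and uses illustrative (not estimated) GARCH(1,1) parameters to show how the tail assumption moves the one-day VaR.
```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
r = 0.01 * rng.standard_t(df=4, size=1000)          # synthetic daily returns

omega, alpha, beta = 1e-6, 0.08, 0.90               # illustrative parameters
h = np.empty_like(r)
h[0] = r.var()
for t in range(1, len(r)):                          # GARCH(1,1) variance recursion
    h[t] = omega + alpha * r[t - 1] ** 2 + beta * h[t - 1]
sigma_next = np.sqrt(omega + alpha * r[-1] ** 2 + beta * h[-1])

var_normal = -stats.norm.ppf(0.01) * sigma_next     # Gaussian innovations
t5 = stats.t(df=5)
var_t = -t5.ppf(0.01) / np.sqrt(t5.var()) * sigma_next  # unit-variance Student-t
print(f"1% one-day VaR: normal {var_normal:.4f}, Student-t {var_t:.4f}")
```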
  5. By: Concepción Ausín; Pedro Galeano; Pulak Ghosh
    Abstract: Financial time series analysis deals with the understanding of data collected on financial markets. Several parametric distribution models have been entertained for describing, estimating and predicting the dynamics of financial time series. Alternatively, this article considers a Bayesian semiparametric approach. In particular, the usual parametric distributional assumptions of the GARCH-type models are relaxed by entertaining the class of location-scale mixtures of Gaussian distributions with a Dirichlet process prior on the mixing distribution, leading to a Dirichlet process mixture model. The proposed specification allows for greater flexibility in capturing both the skewness and kurtosis frequently observed in financial returns. The Bayesian model provides statistical inference with finite sample validity. Furthermore, it is also possible to obtain predictive distributions for the Value at Risk (VaR), which has become the most widely used measure of market risk for practitioners. Through a simulation study, we demonstrate the performance of the proposed semiparametric method and compare results with the ones from a normal distribution assumption. We also demonstrate the superiority of our proposed semiparametric method using real data from the Bombay Stock Exchange Index (BSE-30) and the Hang Seng Index (HSI).
    Keywords: Bayesian estimation, Deviance information criterion, Dirichlet process mixture, Financial time series, Location-scale Gaussian mixture, Markov chain Monte Carlo
    Date: 2010–09
    URL: http://d.repec.org/n?u=RePEc:cte:wsrepe:ws103822&r=rmg
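    Sketch: The paper’s Dirichlet process mixture requires MCMC, which is not attempted here. As a much-simplified finite analogue (all weights and parameters below are illustrative, not estimated), this sketch shows why a location-scale Gaussian mixture matters for VaR: it produces a fatter lower tail, and hence a larger 1% VaR, than a single Gaussian with the same overall mean and variance.
```python
import numpy as np

rng = np.random.default_rng(2)
w, mu, sd = [0.9, 0.1], [0.0, -0.01], [0.008, 0.03]  # hypothetical two-component mixture

comp = rng.choice(2, size=200_000, p=w)              # sample component labels
x = rng.normal(np.asarray(mu)[comp], np.asarray(sd)[comp])

var_mix = -np.quantile(x, 0.01)                      # 1% VaR under the mixture
gauss = rng.normal(x.mean(), x.std(), 200_000)       # moment-matched single Gaussian
var_gauss = -np.quantile(gauss, 0.01)
print(f"1% VaR: mixture {var_mix:.4f}  vs  matched Gaussian {var_gauss:.4f}")
```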
  6. By: Massimiliano Caporin (Department of Economics and Management "Marco Fanno", University of Padova); Michael McAleer (Erasmus University Rotterdam, Tinbergen Institute, The Netherlands, and Institute of Economic Research, Kyoto University)
    Abstract: This paper focuses on the selection and comparison of alternative non-nested volatility models. We review the traditional in-sample methods commonly applied in the volatility framework, namely diagnostic checking procedures, information criteria, and conditions for the existence of moments and asymptotic theory, as well as the out-of-sample model selection approaches, such as mean squared error and Model Confidence Set approaches. The paper develops some innovative loss functions which are based on Value-at-Risk forecasts. Finally, we present an empirical application based on simple univariate volatility models, namely GARCH, GJR, EGARCH, and Stochastic Volatility, which are widely used to capture asymmetry and leverage.
    Keywords: Volatility model selection, volatility model comparison, non-nested models, model confidence set, Value-at-Risk forecasts, asymmetry, leverage
    JEL: C11 C22 C52
    Date: 2010–09
    URL: http://d.repec.org/n?u=RePEc:kyo:wpaper:724&r=rmg
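    Sketch: The paper develops its own VaR-based loss functions, which are not reproduced here. As a flavour of the idea, this sketch scores two hypothetical constant VaR forecast series with the standard asymmetric quantile (“tick”) loss, under which the true α-quantile forecast minimizes the expected loss.
```python
import numpy as np

def tick_loss(returns, var_forecast, alpha=0.01):
    """Asymmetric quantile loss for VaR forecasts (lower mean loss is better)."""
    u = returns - (-var_forecast)          # gap between the return and the -VaR line
    return np.mean((alpha - (u < 0)) * u)

rng = np.random.default_rng(3)
r = 0.01 * rng.standard_normal(1000)       # synthetic N(0, 0.01^2) returns
var_a = np.full(r.size, 0.0233)            # model A: the correct 1% Gaussian VaR
var_b = np.full(r.size, 0.0150)            # model B: understates the tail
print(f"mean tick loss, model A: {tick_loss(r, var_a):.6f}")   # typically lower
print(f"mean tick loss, model B: {tick_loss(r, var_b):.6f}")
```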
  7. By: SBRANA, Giacomo (Université de Strasbourg, BETA, F-67085 Strasbourg, France); SILVESTRINI, Andrea (Bank of Italy, Economics, Research and International Relations Area, Economic and Financial Statistics Department, I-00184 Roma, Italy)
    Abstract: In this paper we propose a unified framework to analyse contemporaneous and temporal aggregation of exponential smoothing (EWMA) models. Focusing on a vector IMA(1,1) model, we obtain a closed-form representation for the parameters of the contemporaneously and temporally aggregated process as a function of the parameters of the original one. In the framework of EWMA estimates of volatility, we present an application dealing with Value-at-Risk (VaR) prediction at different sampling frequencies for an equally weighted portfolio composed of multiple indices. We apply the aggregation results by inferring the decay factor in the portfolio volatility equation from the estimated vector IMA(1,1) model of squared returns. Empirical results show that VaR predictions delivered using this suggested approach are at least as accurate as those obtained by applying the standard univariate RiskMetrics™ methodology.
    Keywords: contemporaneous and temporal aggregation, EWMA, volatility, Value-at-Risk
    JEL: C10 C32 C43
    Date: 2010–07–01
    URL: http://d.repec.org/n?u=RePEc:cor:louvco:2010039&r=rmg
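    Sketch: A minimal sketch of the RiskMetrics-style EWMA recursion around which the paper’s IMA(1,1) aggregation results are built (an EWMA forecast is optimal for an IMA(1,1) process), shown on synthetic data with the classic RiskMetrics decay factors of 0.94 for daily and 0.97 for monthly data. The paper itself infers the aggregated decay factor from the estimated vector IMA(1,1) model rather than fixing it a priori.
```python
import numpy as np

rng = np.random.default_rng(4)
r = 0.01 * rng.standard_normal(500)        # synthetic daily portfolio returns

def ewma_variance(returns, lam):
    """RiskMetrics recursion: s2_t = lam * s2_{t-1} + (1 - lam) * r_{t-1}^2."""
    s2 = returns.var()                     # initialize at the sample variance
    for x in returns:
        s2 = lam * s2 + (1.0 - lam) * x * x
    return s2

print("daily vol:  ", np.sqrt(ewma_variance(r, 0.94)))
monthly = r.reshape(-1, 20).sum(axis=1)    # aggregate to ~monthly returns
print("monthly vol:", np.sqrt(ewma_variance(monthly, 0.97)))
```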
  8. By: Burnecki, Krzysztof; Weron, Rafal
    Abstract: This paper is intended as a guide to simulation of risk processes. A typical model for insurance risk, the so-called collective risk model, treats the aggregate loss as having a compound distribution with two main components: one characterizing the arrival of claims and another describing the severity (or size) of loss resulting from the occurrence of a claim. The collective risk model is often used in health insurance and in general insurance, whenever the main risk components are the number of insurance claims and the amount of the claims. It can also be used for modeling other non-insurance product risks, such as credit and operational risk. In this paper we present efficient simulation algorithms for several classes of claim arrival processes.
    Keywords: Risk process; Claim arrival process; Homogeneous Poisson process (HPP); Non-homogeneous Poisson process (NHPP); Mixed Poisson process; Cox process; Renewal process.
    JEL: C63 C24 G32 C15
    Date: 2010
    URL: http://d.repec.org/n?u=RePEc:pra:mprapa:25444&r=rmg
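    Sketch: A minimal simulation of the collective risk model described above, with homogeneous Poisson claim arrivals and lognormal severities (all parameter values are illustrative). Conditional on the claim count, HPP arrival times are distributed as sorted uniforms, and ruin can only occur at claim instants, so checking the reserve just after each claim suffices.
```python
import numpy as np

rng = np.random.default_rng(5)
u, c, lam, T = 100.0, 20.0, 10.0, 1.0        # capital, premium rate, intensity, horizon

def min_reserve():
    n = rng.poisson(lam * T)                 # number of claims on [0, T]
    if n == 0:
        return u
    times = np.sort(rng.uniform(0.0, T, n))  # HPP arrival times given the count
    sizes = rng.lognormal(0.0, 1.0, n)       # claim severities
    reserve = u + c * times - np.cumsum(sizes)
    return reserve.min()                     # reserve just after each claim

ruined = sum(min_reserve() < 0 for _ in range(10_000))
print("estimated finite-horizon ruin probability:", ruined / 10_000)
```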
  9. By: Nikolaus Rab; Richard Warnung
    Abstract: In practice, the daily volatility of portfolio returns is transformed to longer holding periods by multiplying by the square root of time, which assumes that returns are not serially correlated. Under this assumption, this scaling procedure can also be applied to the contributions of the assets in the portfolio to volatility. Trading at exchanges located in different time zones can lead to significant serial cross-correlations of the returns of these assets when close prices are used, as is common in practice. These serial correlations cause the square-root-of-time rule to fail. Moreover, volatility contributions in this setting turn out to be misleading due to non-synchronous correlations. We address this issue and provide alternative procedures for scaling volatility and calculating risk contributions for arbitrary holding periods.
    Date: 2010–09
    URL: http://d.repec.org/n?u=RePEc:arx:papers:1009.3638&r=rmg
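    Sketch: A univariate version of the paper’s point (the paper treats the multivariate, cross-correlation case): for serially correlated returns, the h-day variance picks up autocovariance terms that the square-root-of-time rule drops, shown here for synthetic MA(1) returns where only the lag-one autocovariance is non-zero.
```python
import numpy as np

rng = np.random.default_rng(6)
eps = rng.standard_normal(5001)
r = 0.01 * (eps[1:] + 0.4 * eps[:-1])      # MA(1) returns: lag-1 autocorrelation

h = 10                                     # target holding period in days
gamma0 = np.var(r)                         # lag-0 autocovariance (variance)
gamma1 = np.cov(r[1:], r[:-1])[0, 1]       # lag-1 autocovariance

# Var(sum of h returns) = h*gamma0 + 2*(h-1)*gamma1 for an MA(1) process.
sqrt_rule = np.sqrt(h * gamma0)
adjusted = np.sqrt(h * gamma0 + 2 * (h - 1) * gamma1)
print(f"sqrt-of-time vol {sqrt_rule:.4f}  vs  adjusted vol {adjusted:.4f}")
```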
  10. By: Evarist Stoja; Arnold Polanski
    Abstract: We propose two simple evaluation methods for time-varying density forecasts of continuous higher-dimensional random variables. Both methods are based on the probability integral transformation for unidimensional forecasts. The first method tests multinormal densities and relies on the rotation of the coordinate system. The advantage of the second method is not only its applicability to any continuous distribution but also the evaluation of the forecast accuracy in specific regions of its domain, as defined by the user’s interest. We show that the latter property is particularly useful for evaluating a multidimensional generalization of the Value at Risk. In simulations and in an empirical study, we examine the performance of both tests.
    Keywords: Multivariate Density Forecast Evaluation, Probability Integral Transformation, Multidimensional Value at Risk, Monte Carlo Simulations
    JEL: C52 C53
    Date: 2009–12
    URL: http://d.repec.org/n?u=RePEc:bri:uobdis:09/617&r=rmg
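    Sketch: The unidimensional building block of both proposed methods is the probability integral transformation (PIT): under a correct forecast density, the PIT values are i.i.d. uniform on (0,1). A minimal check on synthetic data, using a Kolmogorov-Smirnov test purely for illustration:
```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(7)
y = rng.normal(0.0, 1.0, 1000)             # realizations from a N(0,1) process

pit_good = stats.norm.cdf(y, 0.0, 1.0)     # PIT under the correct density
pit_bad = stats.norm.cdf(y, 0.0, 2.0)      # PIT under an overstated variance

print("correct forecast, KS p-value:", stats.kstest(pit_good, "uniform").pvalue)
print("wrong forecast,   KS p-value:", stats.kstest(pit_bad, "uniform").pvalue)
```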
  11. By: Magda Schiegl
    Abstract: A method is developed for analysing the risk of setting too low a reserve level when using the Chain Ladder method. We answer the question of how much safety loading, in terms of the Chain Ladder standard error, has to be added to the Chain Ladder reserve in order to reach a specified security level in loss reserving. This is an important question in the framework of the integrated risk management of an insurance company. Furthermore, we investigate the relative bias of Chain Ladder estimators. We use Monte Carlo simulation as well as the collective model of risk theory in each cell of the run-off table. We analyse the deviation between Chain Ladder reserves and Monte Carlo simulated reserves statistically. Our results document the dependence on the claim number and claim size distribution types and parameters.
    Date: 2010–09
    URL: http://d.repec.org/n?u=RePEc:arx:papers:1009.4143&r=rmg
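    Sketch: The Chain Ladder estimate around which the paper’s Monte Carlo study revolves, computed on a tiny hypothetical run-off triangle of cumulative claims (rows are accident years, columns are development years). The safety loading the paper studies would then be added on top of the reserve figure produced below.
```python
import numpy as np

tri = np.array([[100., 160., 180., 190.],   # hypothetical cumulative claims
                [110., 170., 200., np.nan],
                [120., 190., np.nan, np.nan],
                [130., np.nan, np.nan, np.nan]])
n = tri.shape[0]

f = []                                      # age-to-age development factors
for j in range(n - 1):
    rows = n - 1 - j                        # rows with columns j and j+1 observed
    f.append(tri[:rows, j + 1].sum() / tri[:rows, j].sum())

reserve = 0.0
for i in range(1, n):                       # project each accident year to ultimate
    latest = tri[i, n - 1 - i]              # latest observed cumulative claim
    reserve += latest * np.prod(f[n - 1 - i:]) - latest
print("development factors:", np.round(f, 3), " Chain Ladder reserve:", round(reserve, 1))
```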
  12. By: CARPANTIER, Jean-François (Université catholique de Louvain, CORE, B-1348 Louvain-la-Neuve, Belgium)
    Abstract: Asymmetric GARCH models were developed for equity stocks to take into account the larger response of the conditional variance to negative price shocks. We show that these asymmetric GARCH models are also relevant for modelling commodity prices. Contrary to the equity case, positive shocks are the main contributors to the conditional variance of commodity prices. The theory of storage, by relating the state of the inventories of a commodity to its conditional variance, is a serious candidate to explain the phenomenon, as positive price shocks for commodities usually serve as proxies for the deterioration of the inventories. We find that this inverse leverage effect, or “inventory effect”, is relatively robust across different subsamples, diverse types of commodities and different ways of specifying the asymmetry, though weaker than the leverage effect for equity stocks. Appropriately specifying the asymmetric conditional variance of commodities could improve risk management, hedging strategies or Value-at-Risk estimates. Incidentally, the inventory effect sheds some new light on the debate about the origin of the leverage effect.
    Keywords: GARCH, asymmetries, leverage effect, inventory, commodities, Value-at-Risk
    JEL: C22 G13 Q14
    Date: 2010–07–01
    URL: http://d.repec.org/n?u=RePEc:cor:louvco:2010040&r=rmg
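    Sketch: In GJR-GARCH terms, the inventory effect amounts to flipping the indicator in the asymmetry term: the extra variance loading attaches to positive rather than negative lagged shocks. A minimal variance recursion with illustrative (not estimated) parameters:
```python
import numpy as np

def gjr_variance(r, omega=1e-6, alpha=0.05, gamma=0.08, beta=0.88,
                 asym_on_positive=True):
    """h_t = omega + (alpha + gamma * 1{flagged sign}) * r_{t-1}^2 + beta * h_{t-1}."""
    h = np.empty_like(r)
    h[0] = r.var()
    for t in range(1, len(r)):
        flag = (r[t - 1] > 0) if asym_on_positive else (r[t - 1] < 0)
        h[t] = omega + (alpha + gamma * flag) * r[t - 1] ** 2 + beta * h[t - 1]
    return h

rng = np.random.default_rng(8)
r = 0.01 * rng.standard_normal(250)                    # synthetic returns
h_commodity = gjr_variance(r, asym_on_positive=True)   # "inventory effect"
h_equity = gjr_variance(r, asym_on_positive=False)     # classic leverage effect
print(f"last-day variance: inventory {h_commodity[-1]:.2e}, leverage {h_equity[-1]:.2e}")
```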
  13. By: Yannick Malevergne (COACTIS - Université Lumière - Lyon II : EA4161 - Université Jean Monnet - Saint-Etienne); Beatrice Rey (SAF - Laboratoire de Sciences Actuarielle et Financière - Université Claude Bernard - Lyon I : EA2429)
    Abstract: We introduce the notion of cross-risk vulnerability to generalize the concept of risk vulnerability introduced by Gollier and Pratt [Gollier, C., Pratt, J.W. 1996. Risk vulnerability and the tempering effect of background risk. Econometrica 64, 1109–1124]. While risk vulnerability captures the idea that the presence of an unfair financial background risk should make risk-averse individuals behave in a more risk-averse way with respect to an independent financial risk, cross-risk vulnerability extends this idea to the impact of a non-financial background risk on the financial risk. It provides an answer to the question of the impact of a background risk on the optimal coinsurance rate and on the optimal deductible level. We derive necessary and sufficient conditions for a bivariate utility function to exhibit cross-risk vulnerability both toward an actuarially neutral background risk and toward an unfair background risk. We also analyze the question of the sub-additivity of risk premia and show to what extent cross-risk vulnerability provides an answer.
    Keywords: Risk aversion; Risk vulnerability; Multivariate risk; Background risk
    Date: 2009–10–01
    URL: http://d.repec.org/n?u=RePEc:hal:journl:halshs-00520050_v1&r=rmg
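    Sketch: A numerical illustration of the univariate risk-vulnerability idea that the paper generalizes (its bivariate, cross-risk setting is not attempted here): for a CRRA agent, adding an independent zero-mean background risk raises the premium the agent would pay to shed a financial risk. All numbers are illustrative.
```python
import numpy as np
from scipy import optimize

rng = np.random.default_rng(9)
w, gamma = 10.0, 3.0
u = lambda x: x ** (1.0 - gamma) / (1.0 - gamma)   # CRRA utility
x = rng.normal(0.0, 1.0, 200_000)                  # financial risk (zero mean)
y = rng.choice([-2.0, 2.0], 200_000)               # background risk (zero mean)

def premium(background):
    base = w + background                          # wealth with/without background risk
    eu = u(base + x).mean()                        # expected utility bearing the x-risk
    # pi solves E[u(base - pi)] = E[u(base + x)]
    return optimize.brentq(lambda p: u(base - p).mean() - eu, -1.0, 5.0)

print("risk premium without background risk:", round(premium(0.0), 4))
print("risk premium with background risk:   ", round(premium(y), 4))
```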
  14. By: Rosario Romera; Wolfgang Runggaldier
    Abstract: A finite horizon insurance model is studied where the risk/reserve process can be controlled by reinsurance and investment in the financial market. Obtaining explicit optimal solutions for the minimizing ruin probability problem is a difficult task. Therefore, we consider an alternative method commonly used in ruin theory, which consists in deriving inequalities that can be used to obtain upper bounds for the ruin probabilities and then choose the control to minimize the bound. We finally specialize our results to the particular, but relevant, case of exponentially distributed claims and compare for this case our bounds with the classical Lundberg bound.
    Keywords: Risk process, Reinsurance and investment, Lundberg’s inequality; MSC codes: 91B30, 93E20, 60J28
    Date: 2010–09
    URL: http://d.repec.org/n?u=RePEc:cte:wsrepe:ws103721&r=rmg
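    Sketch: The classical benchmark the paper compares against, reproduced for the exponential-claims case it specializes to: with Poisson(λ) arrivals, Exp(β) claims and premium rate c, the adjustment coefficient is R = β − λ/c, the infinite-horizon ruin probability is ψ(u) = (λ/(cβ))·exp(−Ru), and Lundberg’s inequality states ψ(u) ≤ exp(−Ru). All parameter values below are illustrative.
```python
import numpy as np

lam, beta, c = 10.0, 1.0, 12.0               # claim intensity, claim rate, premium rate
R = beta - lam / c                           # adjustment coefficient
for u in (5.0, 10.0, 20.0):                  # initial capital levels
    psi = (lam / (c * beta)) * np.exp(-R * u)   # exact ruin probability
    print(f"u={u:5.1f}  psi(u)={psi:.4e}  Lundberg bound={np.exp(-R * u):.4e}")
```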
  15. By: Ghesquiere, Francis; Mahul, Olivier
    Abstract: This paper has been prepared for policy makers interested in establishing or strengthening financial strategies to increase the financial response capacity of governments of developing countries in the aftermath of natural disasters, while protecting their long-term fiscal balances. It analyzes various aspects of emergency financing, including the types of instruments available, their relative costs and disbursement speeds, and how these can be combined to provide cost-effective financing for the different phases that follow a disaster. The paper explains why governments are usually better served by retaining most of their natural disaster risk while using risk transfer mechanisms to manage the excess volatility of their budgets or access immediate liquidity after a disaster. Finally, it discusses innovative approaches to disaster risk financing and provides examples of strategies that developing countries have implemented in recent years.
    Keywords: Debt Markets, Hazard Risk Management, Natural Disasters, Banks & Banking Reform, Insurance & Risk Mitigation
    Date: 2010–09–01
    URL: http://d.repec.org/n?u=RePEc:wbk:wbrwps:5429&r=rmg
  16. By: Masaaki Fujii (The University of Tokyo); Yasufumi Shimada (Shinsei Bank, Limited); Akihiko Takahashi (The University of Tokyo)
    Abstract: In recent years, we have observed a dramatic increase in collateralization as an important credit risk mitigation tool in the over-the-counter (OTC) market [6]. Combined with the significant and persistent widening of various basis spreads, such as Libor-OIS and cross-currency basis, practitioners have started to notice the importance of the difference between the funding cost of contracts and the Libors of the relevant currencies. In this article, we integrate the series of our recent works [1, 2, 4] and explain the consistent construction of term structures of interest rates in the presence of collateralization and all the relevant basis spreads, their no-arbitrage dynamics, as well as their implications for derivative pricing and risk management. In particular, we show the importance of the choice of collateral currency and of the embedded “cheapest-to-deliver” (CTD) option in a collateral agreement.
    Date: 2010–09
    URL: http://d.repec.org/n?u=RePEc:cfi:fseres:cf230&r=rmg
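    Sketch: The paper’s starting point is that a collateralized cash flow should be discounted at the collateral (OIS) rate rather than at Libor. A minimal numerical illustration with flat, hypothetical zero rates shows that the valuation gap is far from negligible at longer maturities:
```python
import numpy as np

cashflow, T = 1_000_000.0, 5.0             # payoff amount and maturity in years
ois, libor = 0.015, 0.020                  # hypothetical flat zero rates

pv_ois = cashflow * np.exp(-ois * T)       # collateralized (OIS) discounting
pv_libor = cashflow * np.exp(-libor * T)   # legacy Libor discounting
print(f"PV at OIS: {pv_ois:,.0f}  PV at Libor: {pv_libor:,.0f}  "
      f"gap: {pv_ois - pv_libor:,.0f}")
```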

General information on the NEP project can be found at https://nep.repec.org. For comments please write to the director of NEP, Marco Novarese at <director@nep.repec.org>. Put “NEP” in the subject; otherwise your mail may be rejected.
NEP’s infrastructure is sponsored by the School of Economics and Finance of Massey University in New Zealand.