New Economics Papers
on Risk Management
Issue of 2012‒10‒27
seventeen papers chosen by



  1. Measuring and Analysing Marginal Systemic Risk Contribution using CoVaR: A Copula Approach By Brice Hakwa; Manfred Jäger-Ambrożewicz; Barbara Rüdiger
  2. Nonparametric estimation of Value-at-Risk By Ramon Alemany; Catalina Bolancé; Montserrat Guillén
  3. Value at Risk Model Used to Stock Prices Prediction By Radim Gottwald
  4. Credit Risk Contagion and the Global Financial Crisis By Azusa Takeyama; Nick Constantinou; Dmitri Vinogradov
  5. Estimation of the Marginal Expected Shortfall: The Mean when a Related Variable is Extreme By Cai, J.; Einmahl, J.H.J.; Haan, L.F.M. de; Zhou, C.
  6. An Approximate Solution Method for Large Risk-Averse Markov Decision Processes By Marek Petrik; Dharmashankar Subramanian
  7. A Markov Copula Model of Portfolio Credit Risk with Stochastic Intensities and Random Recoveries By Bielecki, T.R.; Cousin, A.; Crépey, S.; Herbertsson, Alexander
  8. Do Hedge Funds Reduce Idiosyncratic Risk? By Namho Kang; Peter Kondor; Ronnie Sadka
  9. A Framework for Extracting the Probability of Default from Stock Option Prices By Azusa Takeyama; Nick Constantinou; Dmitri Vinogradov
  10. Cascading Failures in Bi-partite Graphs: Model for Systemic Risk Propagation By Xuqing Huang; Irena Vodenska; Shlomo Havlin; H. Eugene Stanley
  11. Diversification of Geographic Risk in Retail Bank Networks: Evidence from Bank Expansion after the Riegle-Neal Act By Victor Aguirregabiria; Robert Clark; Hui Wang
  12. Start-Up Size Strategy: Risk Management and Performance By André van Stel; Andrew Burke; José Maria Millan; Concepcion Roman
  13. A Copula Based Bayesian Approach for Paid-Incurred Claims Models for Non-Life Insurance Reserving By Gareth W. Peters; Alice X. D. Dong; Robert Kohn
  14. An introduction to particle integration methods: with applications to risk and insurance By P. Del Moral; G. W. Peters; Ch. Vergé
  15. Optimal Preventive Bank Supervision Combining Random Audits and Continuous Intervention By Mohamed Belhaj; Nataliya Klimenko
  16. Weighted Sets of Probabilities and Minimax Weighted Expected Regret: New Approaches for Representing Uncertainty and Making Decisions By Joseph Y. Halpern; Samantha Leung
  17. The interaction between the central bank and government in tail risk scenarios By Jan Willem van den End; Marco Hoeberichts

  1. By: Brice Hakwa; Manfred Jäger-Ambrożewicz; Barbara Rüdiger
    Abstract: This paper is devoted to the quantification and analysis of the marginal risk contribution of a given single financial institution i to the risk of a financial system s. Our work expands on the CoVaR concept proposed by Adrian and Brunnermeier as a tool for the measurement of marginal systemic risk contribution. We first give a mathematical definition of CoVaR_{\alpha}^{s|L^i=l}. Our definition improves the CoVaR concept by expressing CoVaR_{\alpha}^{s|L^i=l} as a function of a state l and of a given probability level \alpha, relative to i and s respectively. Based on copula theory, we connect CoVaR_{\alpha}^{s|L^i=l} to the partial derivatives of the copula through their probabilistic interpretation as conditional probabilities. Using this, we provide a closed formula for the calculation of CoVaR_{\alpha}^{s|L^i=l} for a large class of (marginal) distributions and dependence structures (linear and non-linear). Our formula allows a finer analysis of systemic risk using CoVaR, in the sense that CoVaR_{\alpha}^{s|L^i=l} can be expressed in terms of the marginal distributions of the losses of i and s and the copula between L^i and L^s. We discuss the implications of this in the context of the quantification and analysis of systemic risk contributions; for example, we analyse the marginal effects of L^i, L^s and the copula C on the risk contribution of i.
    Date: 2012–10
    URL: http://d.repec.org/n?u=RePEc:arx:papers:1210.4713&r=rmg
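    A minimal numerical sketch of such a closed formula for the special case of a Gaussian copula (the function and parameter names below are illustrative assumptions, not taken from the paper). The conditional cdf of the system loss given L^i = l is the partial derivative of the copula, \partial C(u,v)/\partial u; for the Gaussian copula, setting this equal to \alpha can be solved for v analytically and mapped back through the quantile function of L^s:

        import numpy as np
        from scipy.stats import norm

        def covar_gaussian(alpha, u, rho, F_s_inv):
            """Closed-form CoVaR under a Gaussian copula with correlation rho.

            alpha   : probability level for the system loss L^s
            u       : F_i(l), the marginal cdf of L^i evaluated at the state l
            rho     : Gaussian-copula correlation between L^i and L^s
            F_s_inv : quantile function of the system loss L^s
            """
            # Solve dC(u, v)/du = alpha for v, then map v back to loss units.
            v = norm.cdf(rho * norm.ppf(u) + np.sqrt(1 - rho**2) * norm.ppf(alpha))
            return F_s_inv(v)

        # Standard-normal system losses, institution stressed at its 99% state.
        print(covar_gaussian(alpha=0.95, u=0.99, rho=0.6, F_s_inv=norm.ppf))
        # approx. 0.6*2.326 + 0.8*1.645 = 2.71, versus an unconditional 95% VaR
        # of 1.645: conditioning on the stress state of i raises the quantile.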
  2. By: Ramon Alemany (Department of Econometrics, Riskcenter-IREA, University of Barcelona, Av. Diagonal, 690, 08034 Barcelona, Spain); Catalina Bolancé (Department of Econometrics, Riskcenter-IREA, University of Barcelona, Av. Diagonal, 690, 08034 Barcelona, Spain); Montserrat Guillén (Department of Econometrics, Riskcenter-IREA, University of Barcelona, Av. Diagonal, 690, 08034 Barcelona, Spain)
    Abstract: A method to estimate an extreme quantile that requires no distributional assumptions is presented. The approach is based on transformed kernel estimation of the cumulative distribution function (cdf). The proposed method consists of a double transformation kernel estimation. We derive optimal bandwidth selection methods that have a direct expression for the smoothing parameter. The bandwidth can accommodate to the given quantile level. The procedure is useful for large data sets and improves quantile estimation compared to other methods in heavy tailed distributions. Implementation is straightforward and R programs are available.
    Keywords: kernel estimation, bandwidth selection, quantile, risk measures.
    Date: 2012–10
    URL: http://d.repec.org/n?u=RePEc:xrp:wpaper:xreap2012-19&r=rmg
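    The core idea, a kernel-smoothed empirical cdf inverted at the desired quantile level, can be sketched as follows. This shows only the plain kernel-cdf step with a rule-of-thumb bandwidth, not the authors' double transformation or their optimal quantile-adaptive bandwidth selector; names and data below are assumptions:

        import numpy as np
        from scipy.stats import norm
        from scipy.optimize import brentq

        def kernel_cdf(x, data, h):
            """Gaussian-kernel estimate of the cdf at x."""
            return norm.cdf((x - data) / h).mean()

        def kernel_var(data, level=0.99, h=None):
            """Value-at-Risk as the `level`-quantile of the smoothed cdf."""
            h = h or 1.06 * data.std() * len(data) ** (-0.2)  # rule of thumb
            lo, hi = data.min() - 5 * h, data.max() + 5 * h
            return brentq(lambda x: kernel_cdf(x, data, h) - level, lo, hi)

        rng = np.random.default_rng(1)
        losses = rng.standard_t(df=4, size=5000)  # heavy-tailed toy losses
        print(kernel_var(losses, 0.99))           # compare np.quantile(losses, 0.99)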
  3. By: Radim Gottwald (Department of Finance, Faculty of Business and Economics, Mendel University in Brno)
    Abstract: The author focuses on the Value at Risk model, which is currently often adopted for risk analysis, particularly in banking and insurance. After characterizing the principles of the model, the Value at Risk is interpreted economically. Attention is paid to the distinct features of three sub-methods: the historical simulation method, the Monte Carlo method and the variance-covariance method. A number of empirical studies on the practical application of these methods are surveyed. The objective of the paper is to apply the Value at Risk model to shares from the SPAD segment of the Prague Stock Exchange between 2009 and 2011. A corresponding confidence interval, holding period, historical period and other essential parameters of the sub-methods are defined and chosen. Using historical stock values, various statistical indicators are calculated. The diversified Values at Risk of the sub-methods are benchmarked against the non-diversified ones. The results show that, for all three methods, the loss associated with the non-diversified Value at Risk is always higher than the loss associated with the diversified Value at Risk. Based on recent share developments, we can expect, with a selected probability, a drop in the value of the portfolio that differs depending on the method adopted. The methodology is further benchmarked against methodologies used in other papers applying the Value at Risk model. The contribution of this paper lies in the particular selection of applied methods, risk factors and the stock market. The methodology allows the risk level of investments in shares to be evaluated in a specific way, which will be appreciated by numerous financial entities when making investment decisions.
    Keywords: risk measurement, historical simulation method, Monte Carlo method, variance-covariance method
    JEL: C15 E37 G32
    Date: 2012–10
    URL: http://d.repec.org/n?u=RePEc:men:wpaper:30_2012&r=rmg
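    The three sub-methods compared in the paper can be illustrated on synthetic data (the toy portfolio below is an assumption; the paper itself uses SPAD shares):

        import numpy as np
        from scipy.stats import norm

        rng = np.random.default_rng(7)
        # Toy daily returns for three stocks, equal weights (assumed data).
        cov = [[4e-4, 1e-4, 5e-5], [1e-4, 3e-4, 8e-5], [5e-5, 8e-5, 5e-4]]
        returns = rng.multivariate_normal([3e-4, 2e-4, 4e-4], cov, size=750)
        w = np.full(3, 1 / 3)
        alpha = 0.95
        port = returns @ w

        # Historical simulation: empirical quantile of past portfolio returns.
        var_hist = -np.quantile(port, 1 - alpha)

        # Variance-covariance: normality assumption, analytic quantile.
        sigma = np.sqrt(w @ np.cov(returns.T) @ w)
        var_par = -(port.mean() + norm.ppf(1 - alpha) * sigma)

        # Monte Carlo: resimulate returns from the fitted distribution.
        sims = rng.multivariate_normal(returns.mean(0), np.cov(returns.T), 100_000) @ w
        var_mc = -np.quantile(sims, 1 - alpha)

        print(var_hist, var_par, var_mc)  # one-day 95% VaR, as positive loss fractions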
  4. By: Azusa Takeyama (Deputy Director and Economist, Institute for Monetary and Economic Studies, Bank of Japan (E-mail: azusa.takeyama@boj.or.jp)); Nick Constantinou (Lecturer, Essex Business School, University of Essex (E-mail: nconst@essex.ac.uk)); Dmitri Vinogradov (Lecturer, Essex Business School, University of Essex (E-mail: dvinog@essex.ac.uk))
    Abstract: This paper investigates how the market valuation of credit risk changed during 2008-2009 via a separation of the probability of default (PD) and the loss given default (LGD) of credit default swaps (CDSs), using the information implied by equity options. While the Lehman Brothers collapse in September 2008 harmed the stability of the financial systems in major industrialized countries, the CDS spreads of some major UK banks did not increase in response to this turmoil in financial markets, including the decline in their own stock prices. This implies that the CDS spreads of financial institutions may not reflect all of their credit risk, due to government interventions. Since CDS spreads are therefore not appropriate for analyzing the impact of government interventions on credit risk or the cross-sectional movement of credit risk, we investigate how government interventions affect the PD and LGD of financial institutions and how the PD and LGD of financial institutions are related to those of non-financial firms. Using principal component analysis, we demonstrate that the rise in the credit risk of financial institutions did not bring about a corresponding rise for non-financial firms (credit risk contagion) in either the US or the UK.
    Keywords: Credit Default Swap (CDS), Probability of Default (PD), Loss Given Default (LGD), Credit Risk Contagion
    JEL: C12 C53 G13
    Date: 2012–10
    URL: http://d.repec.org/n?u=RePEc:ime:imedps:12-e-15&r=rmg
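    The principal-component diagnostic can be sketched as follows: if the first principal component of a panel of credit-risk changes loads on financials and non-financials alike, credit risk co-moves across sectors; if it loads mainly on the financials, the rise did not spill over. The simulated panel below is an assumption, purely to show the mechanics:

        import numpy as np

        rng = np.random.default_rng(0)
        # Weekly credit-risk changes: 4 financials driven by a common factor,
        # 4 non-financial firms with idiosyncratic movements (assumed data).
        common = rng.normal(size=(200, 1))
        fin = 0.9 * common + 0.4 * rng.normal(size=(200, 4))
        nonfin = rng.normal(size=(200, 4))
        panel = np.hstack([fin, nonfin])

        # PCA on the correlation matrix of the panel.
        eigval, eigvec = np.linalg.eigh(np.corrcoef(panel.T))
        print(eigval[-1] / eigval.sum())  # variance share of the first component
        print(eigvec[:, -1].round(2))     # loadings: contagion would load all eight
                                          # series; here only the financials load.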
  5. By: Cai, J.; Einmahl, J.H.J.; Haan, L.F.M. de; Zhou, C. (Tilburg University, Center for Economic Research)
    Abstract: Denote the loss return on the equity of a financial institution as X and that of the entire market as Y. For a given very small value of p > 0, the marginal expected shortfall (MES) is defined as E(X | Y > Q_Y(1−p)), where Q_Y(1−p) is the (1−p)-th quantile of the distribution of Y. The MES is an important factor when measuring the systemic risk of financial institutions. For a wide nonparametric class of bivariate distributions, we construct an estimator of the MES and establish the asymptotic normality of the estimator when p ↓ 0, as the sample size n → ∞. Since we are in particular interested in the case p = O(1/n), we use extreme value techniques for deriving the estimator and its asymptotic behavior. The finite sample performance of the estimator and the adequacy of the limit theorem are shown in a detailed simulation study. We also apply our method to estimate the MES of three large U.S. investment banks.
    Keywords: Asymptotic normality; extreme values; tail dependence
    JEL: C13 C14
    Date: 2012
    URL: http://d.repec.org/n?u=RePEc:dgr:kubcen:2012080&r=rmg
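    For p not too small relative to the sample size, the MES has an obvious empirical counterpart: average X over the observations where Y exceeds its (1−p)-quantile. The sketch below implements only this naive estimator; the paper's contribution is precisely the extreme-value extrapolation needed when p = O(1/n), which is not reproduced here:

        import numpy as np

        def mes_empirical(x, y, p):
            """Average institution loss X on days when market loss Y exceeds
            its (1-p)-quantile; requires p*len(y) to be comfortably large."""
            return x[y > np.quantile(y, 1 - p)].mean()

        rng = np.random.default_rng(3)
        y = rng.standard_t(df=3, size=10_000)            # market losses
        x = 0.5 * y + rng.standard_t(df=3, size=10_000)  # tail-dependent losses
        print(mes_empirical(x, y, p=0.01))               # mean of X over ~100 extreme days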
  6. By: Marek Petrik; Dharmashankar Subramanian
    Abstract: Stochastic domains often involve risk-averse decision makers. While recent work has focused on how to model risk in Markov decision processes using risk measures, it has not addressed the problem of solving large risk-averse formulations. In this paper, we propose and analyze a new method for solving large risk-averse MDPs with hybrid continuous-discrete state spaces and continuous action spaces. The proposed method iteratively improves a bound on the value function using a linearity structure of the MDP. We demonstrate the utility and properties of the method on a portfolio optimization problem.
    Date: 2012–10
    URL: http://d.repec.org/n?u=RePEc:arx:papers:1210.4901&r=rmg
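    As a toy illustration of what a risk-averse MDP is (not the authors' approximate bound-improving method, which targets large hybrid state spaces), one can replace the expectation in value iteration with a coherent risk measure such as CVaR over the transition distribution. The three-state MDP below is invented:

        import numpy as np

        def cvar(values, probs, beta):
            """CVaR_beta: expected value over the worst beta-tail (lowest values)."""
            order = np.argsort(values)
            total, mass = 0.0, 0.0
            for v, p in zip(values[order], probs[order]):
                take = min(p, beta - mass)
                total += take * v
                mass += take
                if mass >= beta:
                    break
            return total / beta

        # Invented 3-state, 2-action MDP: P[a, s] is the next-state distribution.
        P = np.array([[[0.9, 0.1, 0.0], [0.2, 0.7, 0.1], [0.1, 0.2, 0.7]],
                      [[0.5, 0.4, 0.1], [0.1, 0.6, 0.3], [0.0, 0.3, 0.7]]])
        R = np.array([[1.0, 0.5, -1.0], [2.0, 0.8, -2.0]])  # reward R[a, s]
        gamma, beta = 0.9, 0.2
        V = np.zeros(3)
        for _ in range(500):  # value iteration with CVaR replacing the expectation
            Q = np.array([[R[a, s] + gamma * cvar(V, P[a, s], beta) for s in range(3)]
                          for a in range(2)])
            V = Q.max(axis=0)
        print(V.round(3), "policy:", Q.argmax(axis=0))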
  7. By: Bielecki, T.R. (Illinois Institute of Technology); Cousin, A. (Université de Lyon); Crépey, S. (Université d’Évry Val d’Essonne); Herbertsson, Alexander (Department of Economics, School of Business, Economics and Law, Göteborg University)
    Abstract: In [4], the authors introduced a Markov copula model of portfolio credit risk. This model solves the top-down versus bottom-up puzzle by achieving efficient joint calibration to single-name CDS and to multi-name CDO tranche data. In [4], we studied a general model that allows for stochastic default intensities and for random recoveries, and we conducted an empirical study of our model using both deterministic and stochastic default intensities, as well as deterministic and random recoveries. Since, for some “badly behaved” data sets, a satisfactory calibration accuracy can only be achieved through the use of random recoveries, and since for important applications, such as CVA computations for credit derivatives, the use of stochastic intensities is advocated by practitioners, an efficient implementation of our model that accounts for these two issues is very important. However, the details behind the implementation of the loss distribution in the case with random recoveries were not provided in [4], nor were the details on the stochastic default intensities. This paper is thus a complement to [4], focusing on a detailed description of the methodology we used to implement these two model features: random recoveries and stochastic intensities.
    Keywords: Portfolio Credit Risk; Markov Copula Model; Common Shocks; Stochastic Spreads; Random Recoveries
    JEL: C02 C63 G13 G32 G33
    Date: 2012–10–16
    URL: http://d.repec.org/n?u=RePEc:hhs:gunwpe:0545&r=rmg
  8. By: Namho Kang; Peter Kondor; Ronnie Sadka
    Abstract: This paper studies the effect of hedge-fund trading on idiosyncratic risk. We hypothesize that while hedge-fund activity would often reduce idiosyncratic risk, high initial levels of idiosyncratic risk might be further amplified due to fund loss limits. Panel-regression analyses provide supporting evidence for this hypothesis. The results are robust to sample selection and are further corroborated by a natural experiment using the Lehman bankruptcy as an exogenous adverse shock to hedge-fund trading. Hedge-fund capital also explains the increased idiosyncratic volatility of high-idiosyncratic-volatility stocks as well as the decreased idiosyncratic volatility of low-idiosyncratic-volatility stocks over the past few decades.
    Date: 2012–10–04
    URL: http://d.repec.org/n?u=RePEc:ceu:econwp:2012_15&r=rmg
  9. By: Azusa Takeyama (Deputy Director and Economist, Institute for Monetary and Economic Studies, Bank of Japan (E-mail: azusa.takeyama@boj.or.jp)); Nick Constantinou (Lecturer, Essex Business School, University of Essex (E-mail: nconst@essex.ac.uk)); Dmitri Vinogradov (Lecturer, Essex Business School, University of Essex (E-mail: dvinog@essex.ac.uk))
    Abstract: This paper develops a framework to estimate the probability of default (PD) implied in listed stock options. The underlying option pricing model measures PD as the intensity of a jump diffusion process in which the underlying stock price jumps to zero at default. We adopt a two-stage calibration algorithm to obtain a precise estimator of PD. In the calibration procedure, we improve the fit of the option pricing model by implementing a time-inhomogeneous term structure model within it. Since the term structure model perfectly fits the actual term structure, we resolve the estimation bias caused by the poor fit of a time-homogeneous term structure model. It is demonstrated that the PD estimator from listed stock options can provide meaningful insights into the pricing of credit derivatives such as credit default swaps.
    Keywords: probability of default (PD), option pricing under credit risk, perturbation method
    JEL: C12 C53 G13
    Date: 2012–10
    URL: http://d.repec.org/n?u=RePEc:ime:imedps:12-e-14&r=rmg
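    In the simplest constant-intensity version of such a jump-to-default model, a European call is priced by the Black-Scholes formula with the risk-free rate shifted from r to r + lambda (the compensated drift exactly offsets the survival discount), so the intensity can be backed out from an observed option price by root finding. This one-step sketch ignores the paper's two-stage calibration and time-inhomogeneous term structure; all inputs are assumptions:

        import numpy as np
        from scipy.stats import norm
        from scipy.optimize import brentq

        def bs_call(S, K, T, r, sigma):
            """Black-Scholes call with continuously compounded rate r."""
            d1 = (np.log(S / K) + (r + 0.5 * sigma**2) * T) / (sigma * np.sqrt(T))
            d2 = d1 - sigma * np.sqrt(T)
            return S * norm.cdf(d1) - K * np.exp(-r * T) * norm.cdf(d2)

        def implied_intensity(price, S, K, T, r, sigma):
            """If the stock jumps to zero at intensity lam, the call price equals
            bs_call with rate r + lam; invert that relation for lam."""
            return brentq(lambda lam: bs_call(S, K, T, r + lam, sigma) - price,
                          0.0, 5.0)

        lam = implied_intensity(price=9.0, S=100, K=100, T=1.0, r=0.02, sigma=0.2)
        print(lam, 1 - np.exp(-lam))  # jump intensity and the implied one-year PD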
  10. By: Xuqing Huang; Irena Vodenska; Shlomo Havlin; H. Eugene Stanley
    Abstract: In order to design complex networks that are robust and sustainable, we must understand systemic risk. As economic systems become increasingly interconnected, for example, a shock in a single financial network can provoke cascading failures throughout the system. The widespread effects of the current EU debt crisis and the 2008 world financial crisis occur because financial systems are characterized by complex relations that allow a local crisis to spread dramatically. We study US commercial bank balance sheet data and design a bi-partite banking network composed of (i) banks and (ii) bank assets. We propose a cascading failure model to simulate the crisis spreading process in a bi-partite banking network. We test our model using 2007 data to analyze failed banks. We find that, within realistic parameters, our model identifies a significant portion of the actual failed banks from the FDIC failed bank database from 2008 to 2011.
    Date: 2012–10
    URL: http://d.repec.org/n?u=RePEc:arx:papers:1210.4973&r=rmg
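    A stylized version of the cascade mechanics (invented toy balance sheets and parameters, not the paper's FDIC calibration): banks hold overlapping asset portfolios, an asset shock wipes out some banks' equity, and the failed banks' fire sales mark down asset prices, which can fail further banks:

        import numpy as np

        rng = np.random.default_rng(5)
        n_banks, n_assets = 50, 10
        # Toy holdings matrix: bank i's exposure to asset m (sparse, invented).
        H = rng.uniform(0, 1, (n_banks, n_assets)) * (rng.random((n_banks, n_assets)) < 0.3)
        equity = 0.05 * H.sum(axis=1)     # 5% capital buffer against total assets
        price = np.ones(n_assets)
        price[0] = 0.6                    # initial shock: asset 0 loses 40%

        alive = np.ones(n_banks, dtype=bool)
        while True:
            loss = H @ (1 - price)        # mark-to-market loss per bank
            failed_now = alive & (loss > equity)
            if not failed_now.any():
                break
            alive &= ~failed_now
            # Fire sales: mark each asset down in proportion to the share of it
            # held by the banks that just failed.
            sold = H[failed_now].sum(0) / np.maximum(H.sum(0), 1e-12)
            price *= 1 - 0.5 * sold
        print((~alive).sum(), "of", n_banks, "banks fail after the cascade")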
  11. By: Victor Aguirregabiria; Robert Clark; Hui Wang
    Abstract: The 1994 Riegle-Neal (RN) Act removed interstate banking restrictions in the US. The primary motivation was to permit geographic risk diversification (GRD). Using a factor model to measure banks' geographic risk, we show that RN expanded GRD possibilities in small states, but that few banks took advantage. Using our measure of geographic risk and a revealed preference approach, we identify preferences towards GRD separately from the contribution of other factors to branch network configuration. Risk has a negative effect on bank value, but this has been counterbalanced by economies of density/scale, reallocation/merging costs, and concerns for local market power.
    Keywords: Geographic risk diversification; Retail banking; Oligopoly competition; Branch networks; Riegle-Neal Act
    JEL: L13 L51 G21
    Date: 2012–10–15
    URL: http://d.repec.org/n?u=RePEc:tor:tecipa:tecipa-465&r=rmg
  12. By: André van Stel; Andrew Burke; José Maria Millan; Concepcion Roman
    Abstract: Start-up size is a key strategic decision for entrepreneurs. Should entrepreneurs start up close to minimum efficient scale, or should they take fewer risks and start up on a smaller scale? Previously, this strategic decision appeared to be a simple choice between a higher-risk/higher-reward large start-up and a lower-risk/lower-reward small start-up. However, recent research on the relationship between risk management and performance (Burke, 2009) indicates that in situations of greater uncertainty, and where innovation is incremental, a lower-risk small start-up size can enable greater reward through enhanced post-start-up flexibility and agility. In this paper we provide the first statistical test of the efficacy of start-up size strategies. We focus on employer businesses that provide jobs. We find that employer businesses that originally adopted a small-scale (own-account) start-up strategy have higher survival chances and entrepreneurial incomes than employer businesses that employed personnel immediately from start-up. We also find that prior entrepreneurial experience positively affects firm survival and entrepreneurial incomes. Given the high failure rates among start-ups and the associated difficulty for new enterprises to create sustainable jobs, the results highlight how strategic choices about firm start-up size and risk management can have an important bearing on new venture performance.
    Date: 2012–08–29
    URL: http://d.repec.org/n?u=RePEc:eim:papers:h201207&r=rmg
  13. By: Gareth W. Peters; Alice X. D. Dong; Robert Kohn
    Abstract: Our article considers the class of recently developed stochastic models that combine claims payments and incurred losses information into a coherent reserving methodology. In particular, we develop a family of Hierarchical Bayesian Paid-Incurred-Claims models, combining the claims reserving models of Hertig et al. (1985) and Gogol et al. (1993). In the process we extend the independent log-normal model of Merz et al. (2010) by incorporating different dependence structures using a data-augmented mixture copula Paid-Incurred-Claims model. The utility and influence of incorporating both payment and incurred losses into the estimation of the full predictive distribution of the outstanding loss liabilities, and of the resulting reserves, is demonstrated in the following cases: (i) an independent payment (P) data model; (ii) the independent Payment-Incurred Claims (PIC) data model of Merz et al. (2010); (iii) a novel dependent lag-year telescoping block-diagonal Gaussian copula PIC data model incorporating conjugacy via transformation; (iv) a novel data-augmented mixture Archimedean copula dependent PIC data model. Inference in such models is developed via a class of adaptive Markov chain Monte Carlo sampling algorithms. These incorporate a data-augmentation framework used to efficiently evaluate the likelihood for the copula-based PIC model in the loss reserving triangles. The adaptation strategy is based on representing a positive definite covariance matrix by the exponential of a symmetric matrix, as proposed by Leonard et al. (1992).
    Date: 2012–10
    URL: http://d.repec.org/n?u=RePEc:arx:papers:1210.3849&r=rmg
  14. By: P. Del Moral; G. W. Peters; Ch. Vergé
    Abstract: Interacting particle methods are increasingly used to sample from complex and high-dimensional distributions. These stochastic particle integration techniques can be interpreted as a universal acceptance-rejection sequential particle sampler equipped with adaptive and interacting recycling mechanisms. In practice, the particles evolve randomly around the space independently, and to each particle is associated a positive potential function. Periodically, particles with high potentials duplicate at the expense of low-potential particles, which die. This natural genetic-type selection scheme appears in numerous applications in applied probability, physics, Bayesian statistics, signal processing, biology, and information engineering. It is the intention of this paper to introduce them to risk modeling. From a purely mathematical point of view, these stochastic samplers can be interpreted as Feynman-Kac particle integration methods. These functional models are natural mathematical extensions of the traditional change of probability measures, commonly used to design importance sampling strategies. In this article, we provide a brief introduction to the stochastic modeling and the theoretical analysis of these particle algorithms. We conclude with an illustration of a subset of such methods applied to important risk measure and capital estimation problems in risk and insurance modelling.
    Date: 2012–10
    URL: http://d.repec.org/n?u=RePEc:arx:papers:1210.3851&r=rmg
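    The selection/mutation mechanism the abstract describes can be shown on the textbook rare-event problem of estimating a Gaussian tail probability by multilevel splitting: particles that survive a level are cloned (selection) and then moved by a Metropolis kernel (mutation), and the product of the level-crossing fractions estimates the tail probability. A minimal sketch under these assumptions (the paper's insurance applications are more elaborate):

        import numpy as np

        rng = np.random.default_rng(11)

        def tail_prob_splitting(a, n=10_000, n_levels=6, sigma=0.5):
            """Estimate P(Z > a), Z ~ N(0,1), with an interacting particle
            system: selection by level crossing, mutation by Metropolis moves."""
            levels = np.linspace(1.0, a, n_levels)
            z = rng.standard_normal(n)
            prob = 1.0
            for lvl in levels:
                alive = z > lvl                   # potential: survive above the level
                if not alive.any():
                    return 0.0
                prob *= alive.mean()
                z = rng.choice(z[alive], size=n)  # selection: clone the survivors
                for _ in range(5):                # mutation: N(0,1) restricted to (lvl, inf)
                    prop = z + sigma * rng.standard_normal(n)
                    ok = (prop > lvl) & (rng.random(n) < np.exp(0.5 * (z**2 - prop**2)))
                    z = np.where(ok, prop, z)
            return prob

        print(tail_prob_splitting(4.0))  # exact value: 1 - Phi(4) ~ 3.17e-5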
  15. By: Mohamed Belhaj (Centrale Marseille (Aix-Marseille School of Economics), CNRS & EHESS); Nataliya Klimenko (Aix-Marseille Université, Greqam)
    Abstract: Early regulator interventions into problem banks are one of the key suggestions of Basel II. However, no guidance is given on their design. To fill this gap, we outline an incentive-based preventive supervision strategy that eliminates bad asset management in banks. Two supervision techniques are combined: continuous regulator intervention and random audits. Random audit technologies differ as to quality and cost. Our design ensures good management without excessive supervision costs, through a gradual adjustment of supervision effort to the bank's financial health. We also consider preventive supervision in a setting where audits can be delegated to an independent audit agency, showing how to induce agency compliance with regulatory instructions in the least costly way.
    Keywords: banking supervision, random audit, incentives, moral hazard, delegation.
    JEL: G21 G28
    Date: 2012–01
    URL: http://d.repec.org/n?u=RePEc:aim:wpaimx:1201&r=rmg
  16. By: Joseph Y. Halpern; Samantha Leung
    Abstract: We consider a setting where an agent's uncertainty is represented by a set of probability measures, rather than a single measure. Measure-by-measure updating of such a set of measures upon acquiring new information is well known to suffer from problems; agents are not always able to learn appropriately. To deal with these problems, we propose using weighted sets of probabilities: a representation where each measure is associated with a weight, which denotes its significance. We describe a natural approach to updating in such a situation and a natural approach to determining the weights. We then show how this representation can be used in decision making, by modifying a standard approach to decision making (minimizing expected regret) to obtain minimax weighted expected regret (MWER). We provide an axiomatization that characterizes preferences induced by MWER in both the static and the dynamic case.
    Date: 2012–10
    URL: http://d.repec.org/n?u=RePEc:arx:papers:1210.4853&r=rmg
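    The MWER criterion is easy to state concretely: compute each act's regret in each state, take its expected regret under every measure in the set, scale by that measure's weight, and choose the act whose worst weighted expected regret is smallest. A toy decision table (all numbers are assumptions):

        import numpy as np

        # Acts x states payoff table (invented toy example).
        payoff = np.array([[10.0, 0.0],    # aggressive act
                           [ 4.0, 4.0],    # safe act
                           [ 6.0, 2.0]])   # intermediate act
        # Weighted set of probability measures over the two states.
        measures = np.array([[0.8, 0.2], [0.3, 0.7], [0.5, 0.5]])
        weights = np.array([1.0, 0.6, 0.9])

        # Regret of act a in state s: best payoff in s minus payoff(a, s).
        regret = payoff.max(axis=0) - payoff
        # Weighted expected regret of each act under each measure.
        wer = (regret @ measures.T) * weights   # shape: acts x measures
        mwer = wer.max(axis=1)                  # worst case over the measure set
        print(mwer.round(2), "-> choose act", mwer.argmin())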
  17. By: Jan Willem van den End; Marco Hoeberichts
    Abstract: We analyse the relationship between tail risk and crisis measures by governments and the central bank. Using an adjusted Merton model in a game-theoretical set-up, the analysis shows that the participation constraint for interventions by the central bank and the governments is less binding if the risk of contagion is high. The strategic interaction between governments and the central bank also influences the effectiveness of the interventions. A joint effort by the governments and the central bank leads to a better outcome. To prevent a bad equilibrium, a sizable commitment by both players is required. Our stylized model sheds light on the strategic interaction between EMU governments and the Eurosystem in the context of the Outright Monetary Transactions (OMT) program.
    Keywords: Financial crisis; Monetary policy; Central banks; Policy coordination
    JEL: E42 E52 E61 G01 G18
    Date: 2012–10
    URL: http://d.repec.org/n?u=RePEc:dnb:dnbwpp:352&r=rmg

General information on the NEP project can be found at https://nep.repec.org. For comments please write to the director of NEP, Marco Novarese at <director@nep.repec.org>. Put “NEP” in the subject, otherwise your mail may be rejected.
NEP’s infrastructure is sponsored by the School of Economics and Finance of Massey University in New Zealand.