nep-rmg New Economics Papers
on Risk Management
Issue of 2015‒08‒25
twenty papers chosen by



  1. Systemic risk measures and macroprudential stress tests. An assessment over the 2014 EBA exercise By Chiara Pederzoli; Costanza Torricelli
  2. A risk management approach to capital allocation By Véronique Maume-Deschamps; Didier Rullière; Khalil Said
  3. What Is the Best Risk Measure in Practice? A Comparison of Standard Measures By Suzanne Emmer; Marie Kratz; Dirk Tasche
  4. Counterparty risk and funding: Immersion and beyond By Stéphane Crépey; S. Song
  5. The Impact of Systemic Risk on the Diversification Benefits of a Risk Portfolio By Marc Busse; Michel Dacorogna; Marie Kratz
  6. Mapping Heat in the U.S. Financial System By Aikman, David; Kiley, Michael T.; Lee, Seung Jung; Palumbo, Michael G.; Warusawitharana, Missaka
  7. How Insurers Differ from Banks: A Primer on Systemic Regulation By Christian Thimann
  8. A recursive algorithm for multivariate risk measures and a set-valued Bellman's principle By Zachary Feinstein; Birgit Rudloff
  9. Realized Bank Risk during the Great Recession By Altunbas, Yener; Manganelli, Simone; Marques-Ibanez, David
  10. Impact of dependence on some multivariate risk indicators By Véronique Maume-Deschamps; Didier Rullière; Khalil Said
  11. On capital allocation by minimizing multivariate risk indicators By Véronique Maume-Deschamps; Didier Rullière; Khalil Said
  12. Risk aggregation with empirical margins: Latin hypercubes, empirical copulas, and convergence of sum distributions By Georg Mainik
  13. A new perspective for risk management: a study of the design of generic technology with a matroid model in C-K theory By Pascal Le Masson; Benoit Weil; Olga Kokshagina
  14. Basel III and SME access to credit : Evidence from France By Thomas Humblot
  15. The term structure of the price of variance risk By Andries, Marianne; Eisenbach, Thomas M.; Schmalz, Martin C.; Wang, Yichuan
  16. Accessibility Analysis of Risk Severity By Mengying Cui; David Levinson
  17. Vector Quantile Regression By Guillaume Carlier; Victor Chernozhukov; Alfred Galichon
  18. Heterogeneity and the formation of risk-sharing coalitions By Fernando Jaramillo; Hubert Kempf; Fabien Moizeau
  19. Robust normative comparisons of socially risky situations By Nicolas Gravel; Benoît Tarroux
  20. Construction of an Index that links Wind Speeds and Strong Claim Rate of Insurers after a Storm in France By Alexandre Mornet; Thomas Opitz; Michel Luzi; Stéphane Loisel

  1. By: Chiara Pederzoli; Costanza Torricelli
    Abstract: The European Banking Authority (EBA) stress tests, which aim to quantify banks’ capital shortfall in a potential future crisis (adverse economic scenario), have further stimulated the academic debate over systemic risk measures and their predictive/informative content. Focusing on market-based measures, Acharya et al. (2010) provide a theoretical background to justify the use of Marginal Expected Shortfall (MES) for predicting stress test results, and verify it on the first stress test conducted after the 2007-2008 crisis on the US banking system (SCAP, Supervisory Capital Assessment Program). The aim of this paper is to further test the goodness of MES as a predictive measure by analysing it in relation to the results of the 2014 European stress test exercise conducted by the EBA. Our results depend strongly on the index used to capture the systemic distress event: MES, based on a global market index, shows no association with the EBA stress test results, whereas F-MES, which is based on a financial market index, has significant informative and predictive power. Our results may carry useful regulatory implications for stress test exercises.
    Keywords: systemic risk, stress test, macroprudential regulation
    JEL: G01 G10 G28
    Date: 2015–07
    URL: http://d.repec.org/n?u=RePEc:mod:wcefin:15207&r=rmg
  2. By: Véronique Maume-Deschamps (ICJ - Institut Camille Jordan [Villeurbanne] - ECL - École Centrale de Lyon - Université Jean Monnet - Saint-Etienne - UCBL - Université Claude Bernard Lyon 1 - INSA - Institut National des Sciences Appliquées - CNRS); Didier Rullière (SAF - Laboratoire de Sciences Actuarielle et Financière - UCBL - Université Claude Bernard Lyon 1); Khalil Said (COACTIS - UL2 - Université Lumière - Lyon 2 - Université Jean Monnet - Saint-Etienne, SAF - Laboratoire de Sciences Actuarielle et Financière - UCBL - Université Claude Bernard Lyon 1)
    Abstract: The European insurance sector will soon face the application of the Solvency 2 regulatory norms, which will bring about a real change in risk management practices. The ORSA approach of the second pillar makes capital allocation an important exercise for all insurers, and especially for groups. For multi-branch firms, capital allocation has to be based on multivariate risk modeling. Several allocation methods are present in the literature and in insurers' practices. In this paper, we present a new risk allocation method, study its coherence using an axiomatic approach, and try to define what the best allocation choice for an insurance group is.
    Date: 2015–06–12
    URL: http://d.repec.org/n?u=RePEc:hal:wpaper:hal-01163180&r=rmg
  3. By: Suzanne Emmer (CREAR - Center of Research in Econo-finance and Actuarial sciences on Risk / Centre de Recherche Econo-financière et Actuarielle sur le Risque - Essec Business School); Marie Kratz (SID - Information Systems, Decision Sciences and Statistics Department - Essec Business School, MAP5 - MAP5 - Mathématiques Appliquées à Paris 5 - CNRS - UPD5 - Université Paris Descartes - Paris 5 - Institut National des Sciences Mathématiques et de leurs Interactions); Dirk Tasche (Prudential Regulation Authority - Bank of England)
    Abstract: Expected Shortfall (ES) has been widely accepted as a risk measure that is conceptually superior to Value-at-Risk (VaR). At the same time, however, it has been criticized for issues relating to backtesting. In particular, ES has been found not to be elicitable, which means that backtesting ES is less straightforward than, e.g., backtesting VaR. Expectiles have been suggested as potentially better alternatives to both ES and VaR. In this paper, we revisit commonly accepted desirable properties of risk measures like coherence, comonotonic additivity, robustness and elicitability. We check VaR, ES and Expectiles with regard to whether or not they enjoy these properties, with particular emphasis on Expectiles. We also consider their impact on capital allocation, an important issue in risk management. We find that, despite the caveats that apply to the estimation and backtesting of ES, it can be considered a good risk measure. In particular, there is insufficient evidence to justify an all-inclusive replacement of ES by Expectiles in applications, especially as we provide an alternative way of backtesting ES.
    Date: 2013–12–20
    URL: http://d.repec.org/n?u=RePEc:hal:wpaper:hal-00921283&r=rmg
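The three measures compared in this abstract can be illustrated numerically. The following is a minimal sketch (not from the paper) that estimates VaR, ES, and the tau-expectile from a simulated loss sample; the expectile is found by bisection on its asymmetric-least-squares first-order condition, and the loss distribution is an assumed standard normal.

```python
import numpy as np

def var(losses, alpha):
    """Empirical Value-at-Risk: the alpha-quantile of the loss distribution."""
    return np.quantile(losses, alpha)

def expected_shortfall(losses, alpha):
    """Empirical Expected Shortfall: mean loss at or beyond VaR_alpha."""
    q = np.quantile(losses, alpha)
    return losses[losses >= q].mean()

def expectile(losses, tau, tol=1e-10):
    """The tau-expectile e solves tau*E[(X-e)+] = (1-tau)*E[(e-X)+].
    The criterion g(e) is decreasing in e, so bisection finds the root."""
    lo, hi = losses.min(), losses.max()
    while hi - lo > tol:
        e = 0.5 * (lo + hi)
        g = tau * np.mean(np.maximum(losses - e, 0)) \
            - (1 - tau) * np.mean(np.maximum(e - losses, 0))
        if g > 0:      # e is below the root
            lo = e
        else:
            hi = e
    return 0.5 * (lo + hi)

rng = np.random.default_rng(0)
x = rng.standard_normal(100_000)   # stand-in loss sample
print(var(x, 0.975), expected_shortfall(x, 0.975), expectile(x, 0.975))
```

For a normal sample, the 97.5% expectile sits below the 97.5% VaR, which in turn sits below the 97.5% ES; the ordering VaR <= ES at the same level holds for any sample.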
  4. By: Stéphane Crépey (LaMME - Laboratoire de Mathématiques et Modélisation d'Evry - Institut national de la recherche agronomique (INRA) - Université d'Evry-Val d'Essonne - UPD5 - Université Paris Descartes - Paris 5 - ENSIIE - CNRS - UPEC UP12 - Université Paris-Est Créteil Val-de-Marne - Paris 12); S. Song (LaMME - Laboratoire de Mathématiques et Modélisation d'Evry - Institut national de la recherche agronomique (INRA) - Université d'Evry-Val d'Essonne - UPD5 - Université Paris Descartes - Paris 5 - ENSIIE - CNRS - UPEC UP12 - Université Paris-Est Créteil Val-de-Marne - Paris 12)
    Abstract: A basic reduced-form counterparty risk modeling approach hinges on a standard immersion hypothesis between a reference filtration and the filtration progressively enlarged by the default times of the two parties, also involving the continuity of some of the data at default time. This basic approach is too restrictive for application to credit derivatives, which are characterized by strong wrong-way risk, i.e. adverse dependence between the exposure and the credit riskiness of the counterparties, and gap risk, i.e. slippage between the portfolio and its collateral during the so-called cure period that separates default from liquidation. This paper shows how a suitable extension of the basic approach can be devised so that it can be applied in dynamic copula models of counterparty risk on credit derivatives. More generally, this method is applicable in any marked default times intensity setup satisfying a suitable integrability condition, which expresses that no mass is lost in a related measure change. The changed probability measure is not needed algorithmically. All one needs in practice is an explicit expression for the intensities of the marked default times.
    Date: 2014–03–04
    URL: http://d.repec.org/n?u=RePEc:hal:wpaper:hal-00989062&r=rmg
  5. By: Marc Busse (SCOR SE - SCOR SE); Michel Dacorogna (SCOR SE - SCOR SE); Marie Kratz (SID - Information Systems, Decision Sciences and Statistics Department - Essec Business School, MAP5 - MAP5 - Mathématiques Appliquées à Paris 5 - CNRS - UPD5 - Université Paris Descartes - Paris 5 - Institut National des Sciences Mathématiques et de leurs Interactions)
    Abstract: Risk diversification is the basis of insurance and investment. It is thus crucial to study the effects that could limit it. One of them is the existence of systemic risk that affects all the policies at the same time. We introduce here a probabilistic approach to examine the consequences of its presence on the risk loading of the premium of a portfolio of insurance policies. This approach could easily be generalized to investment risk. We see that, even with a small probability of occurrence, systemic risk can dramatically reduce the diversification benefits. It is clearly revealed via a non-diversifiable term that appears in the analytical expression of the variance of our models. We propose two ways of introducing it and discuss their advantages and limitations. By using both VaR and TVaR to compute the loading, we see that only the latter captures the full effect of systemic risk when its probability of occurrence is low.
    Date: 2013–12–06
    URL: http://d.repec.org/n?u=RePEc:hal:wpaper:hal-00914844&r=rmg
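The effect described in the abstract can be reproduced in a toy common-shock model (an illustrative assumption, not the authors' specification): with small probability a systemic event makes all claims comonotonic, and the per-policy TVaR loading then stops shrinking with portfolio size, while the VaR loading barely registers the low-probability event.

```python
import numpy as np

rng = np.random.default_rng(42)

def portfolio_losses(n_policies, n_sims, p_sys=0.01):
    """Aggregate loss of a portfolio of i.i.d. Exp(1) claims, except that
    with probability p_sys a systemic event makes all claims comonotonic
    (one common Exp(1) shock hits every policy)."""
    iid_sum = rng.gamma(n_policies, 1.0, size=n_sims)        # sum of n i.i.d. Exp(1)
    common = n_policies * rng.exponential(1.0, size=n_sims)  # comonotonic claims
    systemic = rng.random(n_sims) < p_sys
    return np.where(systemic, common, iid_sum)

def loading_per_policy(measure, n_policies, n_sims=200_000):
    """Risk loading per policy: (risk measure - expected loss) / n."""
    losses = portfolio_losses(n_policies, n_sims)
    return (measure(losses) - losses.mean()) / n_policies

var99 = lambda losses: np.quantile(losses, 0.99)
tvar99 = lambda losses: losses[losses >= np.quantile(losses, 0.99)].mean()

for n in (10, 1000):
    print(n, loading_per_policy(var99, n), loading_per_policy(tvar99, n))
```

With 1000 policies the VaR loading has nearly diversified away, but the TVaR loading remains large because the heavy systemic outcomes dominate the tail expectation, mirroring the abstract's point that only TVaR captures the full effect when the systemic probability is low.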
  6. By: Aikman, David (Bank of England); Kiley, Michael T. (Board of Governors of the Federal Reserve System (U.S.)); Lee, Seung Jung (Board of Governors of the Federal Reserve System (U.S.)); Palumbo, Michael G. (Board of Governors of the Federal Reserve System (U.S.)); Warusawitharana, Missaka (Board of Governors of the Federal Reserve System (U.S.))
    Abstract: We provide a framework for assessing the build-up of vulnerabilities in the U.S. financial system. We collect forty-four indicators of financial and balance-sheet conditions, cutting across measures of valuation pressures, nonfinancial borrowing, and financial-sector health. We place the data in economic categories, track their evolution, and develop an algorithmic approach to monitoring vulnerabilities that can complement the more judgmental approach of most official-sector organizations. Our approach picks up rising imbalances in the U.S. financial system through the mid-2000s, presaging the financial crisis. We also highlight several statistical properties of our approach: most importantly, our summary measures of system-wide vulnerabilities lead the credit-to-GDP gap (a key gauge in Basel III and related research) by a year or more. Thus, our framework may provide useful information for setting macroprudential policy tools such as the countercyclical capital buffer.
    Keywords: Early warning system; financial crisis; financial stability; financial vulnerabilities; heat maps; macroprudential policy; systemic risk; data visualization; countercyclical capital buffers
    JEL: G01 G12 G21 G23 G28
    Date: 2015–06–24
    URL: http://d.repec.org/n?u=RePEc:fip:fedgfe:2015-59&r=rmg
  7. By: Christian Thimann (PSE - Paris-Jourdan Sciences Economiques - CNRS - Institut national de la recherche agronomique (INRA) - EHESS - École des hautes études en sciences sociales - ENS Paris - École normale supérieure - Paris - École des Ponts ParisTech (ENPC), Axa - AXA, EEP-PSE - Ecole d'Économie de Paris - Paris School of Economics)
    Abstract: This paper aims to provide a conceptual distinction between banking and insurance with regard to systemic regulation. It discusses key differences and similarities as to how both sectors interact with the financial system. Insurers interact as financial intermediaries and through financial market investments, but do not share the features of banking that give rise to particular systemic risk in that sector, such as the institutional interconnectedness through the interbank market, the maturity transformation combined with leverage, the prevalence of liquidity risk and the operation of the payment system. The paper also draws attention to three salient features in insurance that need to be taken into account in systemic regulation: the quasi-absence of leverage, the fundamentally different role of capital and the ‘built-in bail-in’ of a significant part of insurance liabilities through policy-holder participation. Based on these considerations, the paper argues that if certain activities were to give rise to concerns about systemic risk in the case of insurers, regulatory responses other than capital surcharges may be more appropriate.
    Date: 2014–10–16
    URL: http://d.repec.org/n?u=RePEc:hal:psewpa:halshs-01074933&r=rmg
  8. By: Zachary Feinstein; Birgit Rudloff
    Abstract: A method for calculating multi-portfolio time consistent multivariate risk measures in discrete time is presented. Market models for $d$ assets with transaction costs or illiquidity and possible trading constraints are considered on a finite probability space. The set of capital requirements at each time and state is calculated recursively backwards in time along the event tree. We motivate why the proposed procedure can be seen as a set-valued Bellman's principle, that might be of independent interest within the growing field of set optimization. We give conditions under which the backwards calculation of the sets reduces to solving a sequence of linear, respectively convex vector optimization problems. Numerical examples are given and include superhedging under illiquidity, the set-valued entropic risk measure, and the multi-portfolio time consistent version of the relaxed worst case risk measure and of the set-valued average value at risk.
    Date: 2015–08
    URL: http://d.repec.org/n?u=RePEc:arx:papers:1508.02367&r=rmg
  9. By: Altunbas, Yener (Bangor Business School); Manganelli, Simone (European Central Bank); Marques-Ibanez, David (Board of Governors of the Federal Reserve System (U.S.))
    Abstract: In the years preceding the 2007-2009 financial crisis, forward-looking indicators of bank risk were concentrated and suggested unusually low expectations of bank default. We assess whether the ex-ante (i.e. prior to the crisis) cross-sectional variability in bank characteristics is related to the ex-post (i.e. during the crisis) materialization of bank risk. Our tailor-made dataset crucially accounts for the different dimensions of realized bank risk, including access to central bank liquidity during the crisis. We consistently find that less reliance on deposit funding, more aggressive credit growth, larger size and higher leverage were associated with larger levels of realized risk. The impact of these characteristics is particularly relevant for capturing the systemic dimensions of bank risk and tends to become stronger in the tail of the riskier banks. The majority of these characteristics also predicted bank risk as it materialized before the financial crisis.
    Keywords: Bank risk; business models; Great Recession
    JEL: E58 G15 G21 G32
    Date: 2015–08–04
    URL: http://d.repec.org/n?u=RePEc:fip:fedgif:1140&r=rmg
  10. By: Véronique Maume-Deschamps (ICJ - Institut Camille Jordan [Villeurbanne] - ECL - École Centrale de Lyon - UCBL - Université Claude Bernard Lyon 1 - Université Jean Monnet - Saint-Etienne - INSA - Institut National des Sciences Appliquées - CNRS); Didier Rullière (SAF - Laboratoire de Sciences Actuarielle et Financière - UCBL - Université Claude Bernard Lyon 1); Khalil Said (SAF - Laboratoire de Sciences Actuarielle et Financière - UCBL - Université Claude Bernard Lyon 1, COACTIS - UL2 - Université Lumière - Lyon 2 - Université Jean Monnet - Saint-Etienne)
    Abstract: The minimization of some multivariate risk indicators may be used as an allocation method, as proposed in Cénac et al. [6]. The aim of capital allocation is to choose a point in a simplex according to a given criterion. In a previous paper [17], we proved that the proposed allocation technique satisfies a set of coherence axioms. In the present one, we study the properties and asymptotic behavior of the allocation for some distribution models. We also analyze the impact of the dependence structure on the allocation using some copulas.
    Date: 2015–07–04
    URL: http://d.repec.org/n?u=RePEc:hal:wpaper:hal-01171395&r=rmg
  11. By: Véronique Maume-Deschamps (ICJ - Institut Camille Jordan [Villeurbanne] - ECL - École Centrale de Lyon - UCBL - Université Claude Bernard Lyon 1 - Université Jean Monnet - Saint-Etienne - INSA - Institut National des Sciences Appliquées - CNRS); Didier Rullière (SAF - Laboratoire de Sciences Actuarielle et Financière - UCBL - Université Claude Bernard Lyon 1); Khalil Said (COACTIS - UL2 - Université Lumière - Lyon 2 - Université Jean Monnet - Saint-Etienne, SAF - Laboratoire de Sciences Actuarielle et Financière - UCBL - Université Claude Bernard Lyon 1)
    Abstract: The issue of capital allocation in a multivariate context arises from the presence of dependence between the various risky activities, which may generate a diversification effect. Several allocation methods in the literature are based on the choice of a univariate risk measure and an allocation principle; others on optimizing a multivariate ruin probability or some multivariate risk indicators. In this paper, we focus on the latter technique. Using an axiomatic approach, we study its coherence properties. We give some explicit results in single-period cases. Finally, we analyze the impact of the dependence structure on the optimal allocation.
    Date: 2014–11–13
    URL: http://d.repec.org/n?u=RePEc:hal:wpaper:hal-01082559&r=rmg
  12. By: Georg Mainik
    Abstract: This paper studies convergence properties of multivariate distributions constructed by endowing empirical margins with a copula. This setting includes Latin Hypercube Sampling with dependence, also known as the Iman--Conover method. The primary question addressed here is the convergence of the component sum, which is relevant to risk aggregation in insurance and finance. This paper shows that a CLT for the aggregated risk distribution is not available, so that the underlying mathematical problem goes beyond classic functional CLTs for empirical copulas. This issue is relevant to Monte-Carlo based risk aggregation in all multivariate models generated by plugging empirical margins into a copula. Instead of a functional CLT, this paper establishes strong uniform consistency of the estimated sum distribution function and provides a sufficient criterion for the convergence rate $O(n^{-1/2})$ in probability. These convergence results hold for all copulas with bounded densities. Examples with unbounded densities include bivariate Clayton and Gauss copulas. The convergence results are not specific to the component sum and hold also for any other componentwise non-decreasing aggregation function. On the other hand, convergence of estimates for the joint distribution is much easier to prove, including CLTs. Beyond Iman--Conover estimates, the results of this paper apply to multivariate distributions obtained by plugging empirical margins into an exact copula or by plugging exact margins into an empirical copula.
    Date: 2015–08
    URL: http://d.repec.org/n?u=RePEc:arx:papers:1508.02749&r=rmg
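The construction studied in this paper can be sketched with synthetic stand-in margins (an illustrative assumption, not the paper's data): empirical margins are endowed with a Gaussian copula by reordering empirical draws along the ranks of correlated normals (the Iman-Conover idea), and the distribution of the component sum is then estimated by Monte Carlo.

```python
import numpy as np

rng = np.random.default_rng(1)

# Two empirical margins (synthetic stand-ins for observed claim data).
x_obs = rng.lognormal(0.0, 0.5, size=5_000)
y_obs = rng.gamma(2.0, 1.0, size=5_000)

def sample_sum(n, rho):
    """Iman-Conover-style draw: sample each margin empirically, then reorder
    the draws to follow the ranks of correlated Gaussians, which imposes a
    Gaussian copula with parameter rho on the pair. Returns the component sum."""
    cov = np.array([[1.0, rho], [rho, 1.0]])
    z = rng.multivariate_normal([0.0, 0.0], cov, size=n)
    ranks = z.argsort(axis=0).argsort(axis=0)   # rank of each draw, per column
    x = np.sort(rng.choice(x_obs, n))[ranks[:, 0]]
    y = np.sort(rng.choice(y_obs, n))[ranks[:, 1]]
    return x + y

s_indep = sample_sum(50_000, 0.0)
s_dep = sample_sum(50_000, 0.9)
print(np.quantile(s_indep, 0.99), np.quantile(s_dep, 0.99))
```

The reordering leaves each margin's empirical distribution untouched (so the mean of the sum is unchanged), while positive dependence fattens the upper tail of the aggregated risk, which is exactly the quantity whose convergence properties the paper investigates.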
  13. By: Pascal Le Masson (CGS - Centre de Gestion Scientifique - MINES ParisTech - École nationale supérieure des mines de Paris); Benoit Weil (CGS - Centre de Gestion Scientifique - MINES ParisTech - École nationale supérieure des mines de Paris); Olga Kokshagina (CGS - Centre de Gestion Scientifique - MINES ParisTech - École nationale supérieure des mines de Paris)
    Abstract: Risk management today has its main roots in the decision theory paradigm (Friedman and Savage 1948). It consists in making the optimal choice between given possible decisions and probable states of nature. In this paper we extend this model to include a design capacity to deal with risk situations. A design perspective adds a new action possibility to the model: designing a new alternative to deal with the probable states of nature. The newly designed alternative might also "create" new risks, so a design perspective also leads to modeling the emergence of new risks as an exogenous "design process". Hence a design perspective raises two issues: can we design an alternative that would lower the risk? Does this new alternative create new risks? We show (1) that minimizing known risks consists in designing an alternative whose success is independent of all the known risks; this alternative can be considered a generic technology. We show (2) that the design of this generic technology depends on the structure of the unknown, i.e. the structure of the space generated by the concept of a risk-free alternative. (3) We identify new strategies that address risk by dealing with the unknown.
    Date: 2015
    URL: http://d.repec.org/n?u=RePEc:hal:journl:hal-01083249&r=rmg
  14. By: Thomas Humblot (Larefi - Laboratoire d'analyse et de recherche en économie et finance internationales - Université Montesquieu - Bordeaux 4)
    Abstract: This paper investigates the potential effects of Basel III on SME access to bank credit. In an innovative empirical framework, French small firms are studied using microdata over the 2008-2013 period. We conclude that the new regulation will have an M-shaped impact. Eventually, Basel III eliminates low-profitability exposures regardless of their regulatory charge alleviations, restricts risky positions despite their profitability, and widens the SME funding gap. Only regulatory-adjusted dominant risk/return profiles are funded. On average, no reduction in credit maturity or volume is observable. The overall effect ultimately depends on banks' initial position.
    Date: 2014
    URL: http://d.repec.org/n?u=RePEc:hal:wpaper:hal-01096527&r=rmg
  15. By: Andries, Marianne (Toulouse School of Economics); Eisenbach, Thomas M. (Federal Reserve Bank of New York); Schmalz, Martin C. (University of Michigan); Wang, Yichuan (University of Michigan)
    Abstract: We estimate the term structure of the price of variance risk (PVR), which helps distinguish between competing asset-pricing theories. First, we measure the PVR as proportional to the Sharpe ratio of short-term holding returns of delta-neutral index straddles; second, we estimate the PVR in a Heston (1993) stochastic-volatility model. In both cases, the estimation is performed separately for different maturities. We find the PVR is negative and decreases in absolute value with maturity; it is more negative and its term structure is steeper when volatility is high. These findings are inconsistent with calibrations of established asset-pricing models that assume constant risk aversion across maturities.
    Keywords: volatility risk; option returns; straddle; term structure
    JEL: G12 G13
    Date: 2015–08–01
    URL: http://d.repec.org/n?u=RePEc:fip:fednsr:736&r=rmg
  16. By: Mengying Cui; David Levinson (Nexus (Networks, Economics, and Urban Systems) Research Group, Department of Civil Engineering, University of Minnesota)
    Abstract: Risk severity in transportation network analysis is defined as the effect of a link or network failure on the whole system. The change in accessibility (the reduction in the number of jobs that can be reached) is used as an integrated indicator of the severity of a link outage: the change in accessibility before and after removing a freeway segment from the network represents its risk severity. The analysis of the Minneapolis - St. Paul (Twin Cities) region shows that links near downtown Minneapolis have relatively higher risk severity than those in rural areas. The geographical distribution of the links with the highest risk severity shows that these links tend to be near or at intersections of freeways. The risk severity of these links based on accessibility to jobs and to workers at different time thresholds and during different dayparts is also analyzed. The research finds that network structure measures (betweenness, straightness, and closeness) help explain the severity of loss due to a network outage.
    Keywords: GPS data, congestion, network structure, accessibility, vulnerability
    JEL: R14 R41 R42
    Date: 2015
    URL: http://d.repec.org/n?u=RePEc:nex:wpaper:vulnerability&r=rmg
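The severity measure described in the abstract can be sketched with a toy network (hypothetical data, not the Twin Cities network): accessibility counts the jobs reachable within a travel-time threshold, and a link's risk severity is the accessibility lost when that link is removed.

```python
import heapq

def shortest_times(graph, source):
    """Dijkstra over a graph given as {node: [(neighbor, travel_time), ...]}."""
    dist = {source: 0.0}
    pq = [(0.0, source)]
    while pq:
        d, u = heapq.heappop(pq)
        if d > dist.get(u, float("inf")):
            continue
        for v, w in graph.get(u, []):
            nd = d + w
            if nd < dist.get(v, float("inf")):
                dist[v] = nd
                heapq.heappush(pq, (nd, v))
    return dist

def accessibility(graph, origin, jobs, threshold):
    """Number of jobs reachable from origin within the travel-time threshold."""
    dist = shortest_times(graph, origin)
    return sum(j for node, j in jobs.items()
               if dist.get(node, float("inf")) <= threshold)

def risk_severity(graph, link, origin, jobs, threshold):
    """Accessibility loss caused by removing one directed link."""
    u, v = link
    reduced = {n: [(m, w) for m, w in es if not (n == u and m == v)]
               for n, es in graph.items()}
    return accessibility(graph, origin, jobs, threshold) \
         - accessibility(reduced, origin, jobs, threshold)

# Toy network: A->B is a fast freeway segment, A->C->B a slow detour.
graph = {"A": [("B", 10), ("C", 15)], "C": [("B", 15)], "B": []}
jobs = {"B": 1000, "C": 200}
print(risk_severity(graph, ("A", "B"), "A", jobs, 20))  # → 1000
```

Removing the freeway segment pushes B beyond the 20-minute threshold via the detour, so all 1000 jobs at B drop out of the accessibility count: that drop is the link's risk severity in this sketch.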
  17. By: Guillaume Carlier (CEREMADE - CEntre de REcherches en MAthématiques de la DEcision - CNRS - Université Paris IX - Paris Dauphine); Victor Chernozhukov (CEREMADE - CEntre de REcherches en MAthématiques de la DEcision - CNRS - Université Paris IX - Paris Dauphine); Alfred Galichon (CEREMADE - CEntre de REcherches en MAthématiques de la DEcision - CNRS - Université Paris IX - Paris Dauphine)
    Abstract: We propose a notion of conditional vector quantile function and a vector quantile regression. A conditional vector quantile function (CVQF) of a random vector Y, taking values in ℝd given covariates Z=z, taking values in ℝk, is a map u↦QY∣Z(u,z), which is monotone, in the sense of being a gradient of a convex function, and such that given that vector U follows a reference non-atomic distribution FU, for instance the uniform distribution on a unit cube in ℝd, the random vector QY∣Z(U,z) has the distribution of Y conditional on Z=z. Moreover, we have a strong representation, Y=QY∣Z(U,Z) almost surely, for some version of U. The vector quantile regression (VQR) is a linear model for the CVQF of Y given Z. Under correct specification, the notion produces the strong representation Y=β(U)⊤f(Z), for f(Z) denoting a known set of transformations of Z, where u↦β(u)⊤f(Z) is a monotone map, the gradient of a convex function, and the quantile regression coefficients u↦β(u) have interpretations analogous to those of the standard scalar quantile regression. As f(Z) becomes a richer class of transformations of Z, the model becomes nonparametric, as in series modelling. A key property of VQR is the embedding of the classical Monge-Kantorovich optimal transportation problem at its core as a special case. In the classical case, where Y is scalar, VQR reduces to a version of the classical QR, and the CVQF reduces to the scalar conditional quantile function. Several applications to diverse problems, such as multiple Engel curve estimation and the measurement of financial risk, are considered.
    Date: 2015–06
    URL: http://d.repec.org/n?u=RePEc:hal:wpaper:hal-01169653&r=rmg
  18. By: Fernando Jaramillo (Universidad del Rosario - Facultad de Economia); Hubert Kempf (EEP-PSE - Ecole d'Économie de Paris - Paris School of Economics); Fabien Moizeau (CREM - Centre de Recherche en Economie et Management - CNRS - Université de Caen Basse-Normandie - UR1 - Université de Rennes 1)
    Abstract: We study the relationship between the distribution of individuals' attributes over the population and the extent of risk sharing in a risky environment. We consider a society where individuals voluntarily form risk-sharing groups in the absence of financial markets. We obtain a partition of society into distinct coalitions leading to partial risk sharing. When individuals differ only with respect to risk, the partition is homophily-based: the less risky agents congregate together and reject more risky ones into other coalitions. The distribution of risk affects the number and size of these coalitions. It turns out that individuals may pay a lower risk premium in more risky societies. We show that a higher heterogeneity in risk leads to a lower degree of partial risk sharing. The case of heterogeneous risk aversion generates similar results. The empirical evidence on partial risk sharing can be understood when the endogenous partition of society into risk-sharing coalitions is taken into account.
    Date: 2015–05
    URL: http://d.repec.org/n?u=RePEc:hal:journl:halshs-01075648&r=rmg
  19. By: Nicolas Gravel (AMSE - Aix-Marseille School of Economics - EHESS - École des hautes études en sciences sociales - Centre national de la recherche scientifique (CNRS) - Ecole Centrale Marseille (ECM) - AMU - Aix-Marseille Université); Benoît Tarroux (CREM - Centre de Recherche en Economie et Management - CNRS - Université de Caen Basse-Normandie - UR1 - Université de Rennes 1)
    Abstract: In this paper, we theoretically characterize robust, empirically implementable normative criteria for evaluating socially risky situations. Socially risky situations are modeled as distributions, among individuals, of lotteries on a finite set of state-contingent pecuniary consequences. Individuals are assumed to have selfish Von Neumann-Morgenstern preferences over these socially risky situations. We provide empirically implementable criteria that coincide with the unanimity, over a reasonably large class of such individual preferences, of anonymous and Pareto-inclusive Von Neumann-Morgenstern social rankings of risks. The implementable criteria can be interpreted as sequential expected poverty dominance. An illustration of the usefulness of the criteria for comparing the exposure to unemployment risk of different segments of the French and US workforce is also provided.
    Date: 2015–02
    URL: http://d.repec.org/n?u=RePEc:hal:journl:halshs-01057024&r=rmg
  20. By: Alexandre Mornet (Allianz, SAF - Laboratoire de Sciences Actuarielle et Financière - UCBL - Université Claude Bernard Lyon 1); Thomas Opitz (I3M - Institut de Mathématiques et de Modélisation de Montpellier - CNRS - UM2 - Université Montpellier 2 - Sciences et Techniques); Michel Luzi (Allianz); Stéphane Loisel (SAF - Laboratoire de Sciences Actuarielle et Financière - UCBL - Université Claude Bernard Lyon 1)
    Abstract: For insurance companies, wind storms represent a main source of volatility, leading to potentially huge aggregated claim amounts. In this article, we compare different constructions of a storm index allowing us to assess the economic impact of storms on an insurance portfolio by exploiting information from historical wind speed data. Contrary to historical insurance portfolio data, meteorological variables can be considered stationary between years and are easily available with long observation records; hence, they represent a valuable source of additional information for insurers if the relation between observed claims and wind speeds can be revealed. Since standard correlation measures between raw wind speeds and insurance claims are weak, a storm index focusing on high wind speeds can provide better information. This method has been used on the German territory by Klawa and Ulbrich and gave good results for yearly aggregated claims. Using historical meteorological and insurance data, we assess the consistency of the proposed index constructions and test their sensitivity to their various parameters and weights. Moreover, we are able to place the major insurance events since 1998 on a broader horizon of 40+ years. Our approach provides a meteorological justification for calculating the return periods of extreme storm-related insurance events whose magnitude has rarely been reached.
    Date: 2014–07
    URL: http://d.repec.org/n?u=RePEc:hal:wpaper:hal-01081758&r=rmg

General information on the NEP project can be found at https://nep.repec.org. For comments please write to the director of NEP, Marco Novarese at <director@nep.repec.org>. Put “NEP” in the subject, otherwise your mail may be rejected.
NEP’s infrastructure is sponsored by the School of Economics and Finance of Massey University in New Zealand.