New Economics Papers
on Risk Management
Issue of 2010‒10‒30
thirteen papers chosen by



  1. Mitigating the pro-cyclicality of Basel II By Rafael Repullo; Jesús Saurina; Carlos Trucharte
  2. Portfolio Management under Asymmetric Dependence and Distribution By Stefan Hlawatsch; Peter Reichling
  3. Systemic Risk Diagnostics By Bernd Schwaab; Andre Lucas; Siem Jan Koopman
  4. An Inquiry into Banking Portfolios and Financial Stability Surrounding "The Great Recession" By Garita, Gus
  5. Impact of Insurance for Operational Risk: Is it worthwhile to insure or be insured for severe losses? By Gareth W. Peters; Aaron D. Byrnes; Pavel V. Shevchenko
  6. A la Carte of Correlation Models: Which One to Choose? By Harry Zheng
  7. The information content of high-frequency data for estimating equity return models and forecasting risk By Dobrislav Dobrev; Pawel Szerszen
  8. Panel Data Models with Unobserved Multiple Time-Varying Effects to Estimate Risk Premium of Corporate Bonds By Bada, Oualid; Kneip, Alois
  9. Dynamic Coherent Acceptability Indices and their Applications to Finance By Tomasz R. Bielecki; Igor Cialenco; Zhao Zhang
  10. Panel Data Models with Unobserved Multiple Time-Varying Effects to Estimate Risk Premium of Corporate Bonds By Oualid Bada; Alois Kneip
  11. A Cross-Sectional Performance Measure for Portfolio Management. By Monica Billio; Ludovic Calès; Dominique Guegan
  12. Create Better Diversified High-Conviction Equity Portfolios using the Portfolio Diversification Index By Crezée, D.P.; Swinkels, L.A.P.
  13. Losses from Simulated Defaults in Canada's Large Value Transfer System By Nellie Zhang; Tom Hossfeld

  1. By: Rafael Repullo (CEMFI); Jesús Saurina (Banco de España); Carlos Trucharte (Banco de España)
    Abstract: Policy discussions on the recent financial crisis feature widespread calls to address the pro-cyclical effects of regulation. The main concern is that the new risk-sensitive bank capital regulation (Basel II) may amplify business cycle fluctuations. This paper compares the leading alternative procedures that have been proposed to mitigate this problem. We estimate a model of the probabilities of default (PDs) of Spanish firms during the period 1987–2008, and use the estimated PDs to compute the corresponding series of Basel II capital requirements per unit of loans. These requirements move significantly along the business cycle, ranging from 7.6% (in 2006) to 11.9% (in 1993). The comparison of the different procedures is based on the criterion of minimizing the root mean square deviations of each adjusted series with respect to the Hodrick-Prescott trend of the original series. The results show that the best procedures are either to smooth the input of the Basel II formula by using through-the-cycle PDs or to smooth the output with a multiplier based on GDP growth. Our discussion concludes that the latter is better in terms of simplicity, transparency, and consistency with banks’ risk pricing and risk management systems. For the portfolio of Spanish commercial and industrial loans and a 45% loss given default (LGD), the multiplier would amount to a 6.5% surcharge for each standard deviation in GDP growth. The surcharge would be significantly higher with cyclically varying LGDs.
    Keywords: Bank capital regulation, Basel II, Pro-cyclicality, Business cycles, Credit crunch
    JEL: E32 G28
    Date: 2010–09
    URL: http://d.repec.org/n?u=RePEc:bde:wpaper:1028&r=rmg
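    A rough illustration of the output-smoothing rule favoured in the abstract above: apply a GDP-growth multiplier of the form 1 + 0.065*z (the 6.5% surcharge per standard deviation of GDP growth quoted there) to a capital-requirement series, and score the result by the paper's criterion, the root mean square deviation from the Hodrick-Prescott trend of the original series. The series, the HP smoothing parameter, and the functional form of the multiplier are illustrative assumptions, not the authors' data or code.

        # Synthetic Basel II requirement series that falls in booms and rises in
        # recessions, smoothed with a GDP-growth multiplier as in the abstract.
        import numpy as np
        from statsmodels.tsa.filters.hp_filter import hpfilter

        rng = np.random.default_rng(0)
        gdp_growth = rng.normal(0.02, 0.02, 22)        # annual GDP growth, 1987-2008
        raw_req = 0.095 - 1.0 * (gdp_growth - 0.02)    # counter-moving requirement

        z = (gdp_growth - gdp_growth.mean()) / gdp_growth.std()
        adjusted_req = raw_req * (1 + 0.065 * z)       # 6.5% surcharge per SD of growth

        # Comparison criterion: RMSD of the adjusted series from the HP trend
        # of the original series (lambda = 100 is common for annual data).
        _, trend = hpfilter(raw_req, lamb=100)
        rmsd = np.sqrt(np.mean((adjusted_req - trend) ** 2))
        print(f"RMSD from HP trend: {rmsd:.4f}")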
  2. By: Stefan Hlawatsch (Faculty of Economics and Management, Otto-von-Guericke University Magdeburg); Peter Reichling (Faculty of Economics and Management, Otto-von-Guericke University Magdeburg)
    Abstract: The aim of our paper is to analyze the enhancement of portfolio management that comes from using more sophisticated assumptions about the distributions and dependencies of stock returns. We assume a skewed t-distribution of the returns according to Azzalini and Capitanio (2003) and a dependency structure following a Clayton copula. The risk measure applied in our portfolio selection is changed from the traditional portfolio variance to the downside-oriented conditional value-at-risk. The empirical results show a superior performance of our approach compared to the Markowitz approach and to the approach proposed by Hatherley and Alcock (2007) on a risk-adjusted basis. The approach is applied to daily returns of 16 stocks from the EURO STOXX 50.
    Keywords: Asymmetric Dependency, Copula, Skewed t-Distribution, Conditional Value-at-Risk, Portfolio Optimization
    JEL: C01 C13 C15 C16 C46 G11
    Date: 2010–07
    URL: http://d.repec.org/n?u=RePEc:mag:wpaper:100017&r=rmg
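    A minimal sketch of the two ingredients named in this abstract: sampling returns whose dependence follows a Clayton copula (via the Marshall-Olkin construction) and computing the portfolio's downside-oriented conditional value-at-risk by Monte Carlo. The Azzalini-Capitanio skewed t marginal is not available in scipy, so a symmetric Student-t stands in; the copula parameter, degrees of freedom, and weights are all illustrative.

        import numpy as np
        from scipy import stats

        rng = np.random.default_rng(1)
        n, theta, nu = 100_000, 2.0, 5                 # draws, Clayton theta, t dof

        # Marshall-Olkin sampling of the Clayton copula
        v = rng.gamma(1.0 / theta, 1.0, n)
        e = rng.exponential(1.0, (n, 2))
        u = (1.0 + e / v[:, None]) ** (-1.0 / theta)   # uniforms with Clayton dependence

        returns = stats.t.ppf(u, df=nu) * 0.01         # map to t marginals, 1% scale
        losses = -(returns @ np.array([0.5, 0.5]))     # equally weighted portfolio

        alpha = 0.99
        var = np.quantile(losses, alpha)
        cvar = losses[losses >= var].mean()            # CVaR: mean loss beyond VaR
        print(f"VaR {var:.4f}, CVaR {cvar:.4f}")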
  3. By: Bernd Schwaab (VU University Amsterdam, and European Central Bank); Andre Lucas (VU University Amsterdam); Siem Jan Koopman (VU University Amsterdam)
    Abstract: A macro-prudential policy maker can manage risks to financial stability only if such risks can be reliably assessed. For this purpose we propose a novel framework for systemic risk diagnostics based on state space methods. We assess systemic risk based on latent macro-financial and credit risk components in the U.S., the EU-27 area, and the rest of the world. The time-varying probability of a systemic event, defined as the simultaneous failure of a large number of bank and non-bank intermediaries, can be assessed both in and out of sample. Credit and macro-financial conditions are combined into a straightforward coincident indicator of systemic risk. In an empirical analysis of worldwide default data, we find that credit risk conditions can significantly and persistently decouple from business cycle conditions due to, for example, unobserved changes in credit supply. Such decoupling is an early warning signal for macro-prudential policy.
    Keywords: financial crisis; systemic risk; credit portfolio models; frailty-correlated defaults; state space methods
    JEL: G21 C33
    Date: 2010–10–18
    URL: http://d.repec.org/n?u=RePEc:dgr:uvatin:20100104&r=rmg
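    To make the abstract's definition of a systemic event concrete, the stylized sketch below drives conditionally independent bank failures with a single latent (frailty) factor, one-factor probit style, and estimates the probability that many intermediaries fail at once. This is only a toy counterpart of the paper's state space framework, with every parameter invented for illustration.

        import numpy as np
        from scipy.stats import norm

        rng = np.random.default_rng(2)
        N, k, sims = 100, 10, 20_000        # intermediaries, failure threshold, draws
        base_pd, rho = 0.01, 0.8            # unconditional PD, frailty loading

        f = rng.normal(size=sims)           # latent systemic factor draws
        # PD conditional on the factor (Vasicek one-factor form)
        pd_cond = norm.cdf((norm.ppf(base_pd) - rho * f) / np.sqrt(1 - rho**2))
        defaults = rng.binomial(N, pd_cond) # conditionally independent failures
        print(f"P(at least {k} of {N} fail) ~ {(defaults >= k).mean():.4f}")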
  4. By: Garita, Gus
    Abstract: Utilizing the extreme dependence structure and the conditional probability of joint failure (CPJF) between banks, this paper characterizes a risk-stability index (RSI) that quantifies (i) common distress of banks, (ii) distress between specific banks, and (iii) distress to a portfolio related to a specific bank. The results show that financial stability is a continuum; that the Korean and U.S. banking systems seem more prone to systemic risk; and that Asian banks experience the most persistent distress. Furthermore, a panel VAR indicates that "leaning against the wind" reduces the instability of a financial system.
    Keywords: Conditional probability of joint failure; contagion; dependence structure; distress; multivariate extreme value theory; panel VAR; persistence; risk.
    JEL: C10 F42 E44
    Date: 2010–10–18
    URL: http://d.repec.org/n?u=RePEc:pra:mprapa:25996&r=rmg
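    A simple empirical counterpart to the conditional probability of joint failure used above: the chance that bank B is in distress given that bank A is, with "distress" proxied by returns in the worst 5% of days. The data are synthetic and the estimator is a plain tail count; the paper instead works with multivariate extreme value theory.

        import numpy as np

        rng = np.random.default_rng(3)
        z = rng.standard_normal(5_000)                       # common driver
        ret_a = 0.8 * z + 0.6 * rng.standard_normal(5_000)   # bank A returns
        ret_b = 0.8 * z + 0.6 * rng.standard_normal(5_000)   # bank B returns

        q = 0.05                                             # distress threshold
        ua, ub = np.quantile(ret_a, q), np.quantile(ret_b, q)
        joint = (ret_a <= ua) & (ret_b <= ub)
        cpjf = joint.mean() / (ret_a <= ua).mean()           # P(B distressed | A distressed)
        print(f"CPJF estimate: {cpjf:.3f}")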
  5. By: Gareth W. Peters; Aaron D. Byrnes; Pavel V. Shevchenko
    Abstract: Under the Basel II standards, the Operational Risk (OpRisk) advanced measurement approach allows a provision for reduction of capital as a result of insurance mitigation of up to 20%. This paper studies the behaviour of different insurance policies in the context of capital reduction, for a range of possible extreme loss models and insurance policy scenarios, in a multi-period, multiple-risk setting. A Loss Distributional Approach (LDA) for modelling the annual loss process is considered, involving homogeneous compound Poisson processes for the annual losses with heavy-tailed severity models comprised of alpha-stable severities. There has been little analysis of such models to date, and it is believed that insurance models will play more of a role in OpRisk mitigation and capital reduction in the future. The first question of interest is when it would be worthwhile for a bank or financial institution to purchase insurance for heavy-tailed OpRisk losses under different insurance policy scenarios. The second question pertains to Solvency II and addresses what the insurer's capital would be for such operational risk scenarios under different policy offerings. In addition, we consider the insurer's perspective with respect to the fair premium, expressed as a percentage above the expected annual claim for each insurance policy. The intention is to address questions related to VaR reduction under Basel II, the SCR under Solvency II, and fair insurance premiums in OpRisk for different extreme loss scenarios. In the process we provide closed-form solutions for the distributions of the loss and claims processes in an LDA structure, as well as closed-form analytic solutions for the Expected Shortfall, SCR and MCR under Basel II and Solvency II. We also provide closed-form analytic solutions for the annual loss distribution of multiple risks including insurance mitigation.
    Date: 2010–10
    URL: http://d.repec.org/n?u=RePEc:arx:papers:1010.4406&r=rmg
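    A Monte Carlo sketch of the LDA setup described above: annual losses follow a compound Poisson process with heavy-tailed (alpha-stable) severities, a per-event policy with a deductible and cover limit mitigates each loss, and the recognized capital relief is capped at 20% as the abstract notes Basel II allows. All parameters are illustrative, and capital is proxied by a simulated 99.9% VaR rather than the paper's closed-form solutions.

        import numpy as np
        from scipy.stats import levy_stable

        rng = np.random.default_rng(4)
        years, lam = 5_000, 10                 # simulated years, Poisson intensity
        deductible, limit = 1.0, 20.0          # per-event policy terms

        gross, net = [], []
        for _ in range(years):
            n = rng.poisson(lam)
            x = np.abs(levy_stable.rvs(alpha=1.7, beta=0, size=n, random_state=rng))
            recovery = np.clip(x - deductible, 0, limit)   # layer paid by the insurer
            gross.append(x.sum())
            net.append((x - recovery).sum())

        var_g = np.quantile(gross, 0.999)                  # OpRisk capital ~ VaR 99.9%
        var_n = np.quantile(net, 0.999)
        capital = max(var_n, 0.8 * var_g)                  # 20% cap on insurance relief
        print(f"gross {var_g:.1f}, net {var_n:.1f}, recognized capital {capital:.1f}")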
  6. By: Harry Zheng
    Abstract: In this paper we propose a copula contagion mixture model for correlated default times. The model includes the well-known factor, copula, and contagion models as special cases. The key advantage of such a model is that we can study the interaction of the different models and their pricing impact. Specifically, we model the marginal default times to follow contagion intensity processes coupled with a copula dependence structure. We apply the total hazard construction method to generate ordered default times and numerically compare the pricing impact of the different models on basket CDSs and CDOs in the presence of exponential decay and counterparty risk.
    Date: 2010–10
    URL: http://d.repec.org/n?u=RePEc:arx:papers:1010.4053&r=rmg
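    A minimal two-name contagion sketch in the spirit of this abstract: each name defaults with a constant intensity that jumps when the other name defaults, and ordered default times are drawn directly from exponentials (valid here by memorylessness, since intensities are piecewise constant). The copula coupling and the general total hazard construction are omitted; all numbers are illustrative.

        import numpy as np

        rng = np.random.default_rng(5)
        lam = np.array([0.03, 0.05])          # base default intensities
        jump = 3.0                            # contagion multiplier on the survivor
        sims, horizon = 50_000, 5.0

        both = 0
        for _ in range(sims):
            t = rng.exponential(1 / lam)      # candidate first default times
            first = t.argmin()
            if t[first] > horizon:
                continue                      # no default within the horizon
            # survivor's intensity jumps to jump * lam after the first default
            t2 = t[first] + rng.exponential(1 / (jump * lam[1 - first]))
            both += t2 <= horizon
        print(f"P(both default within {horizon}y) ~ {both / sims:.4f}")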
  7. By: Dobrislav Dobrev; Pawel Szerszen
    Abstract: We demonstrate that the parameters controlling skewness and kurtosis in popular equity return models estimated at daily frequency can be obtained almost as precisely as if volatility were observable, simply by incorporating the strong information content of realized volatility measures extracted from high-frequency data. For this purpose, we introduce asymptotically exact volatility measurement equations in state space form and propose a Bayesian estimation approach. Our highly efficient estimates lead in turn to substantial gains in forecasting various risk measures at horizons ranging from a few days to a few months ahead, also taking parameter uncertainty into account. As a practical rule of thumb, we find that two years of high-frequency data often suffice to obtain the same level of precision as twenty years of daily data, making our approach particularly useful in finance applications where only short data samples are available or economically meaningful to use. Moreover, we find that, compared to model inference without high-frequency data, our approach largely eliminates underestimation of risk during bad times and overestimation of risk during good times. We assess the attainable improvements in VaR forecast accuracy on simulated data and provide an empirical illustration on stock returns during the financial crisis of 2007-2008.
    Date: 2010
    URL: http://d.repec.org/n?u=RePEc:fip:fedgif:1005&r=rmg
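    A small simulation of the mechanism the paper exploits: realized variance built from intraday returns is an approximately unbiased, low-noise measurement of the latent daily variance, which is why adding it to the estimation sharpens inference relative to daily returns alone. The volatility dynamics and sampling frequency below are illustrative, not the paper's model.

        import numpy as np

        rng = np.random.default_rng(6)
        days, m = 500, 78                       # trading days, 5-minute bars per day
        h = np.empty(days)                      # latent daily variance (log-AR(1))
        log_h = np.log(1e-4)
        for t in range(days):
            log_h = 0.98 * log_h + 0.02 * np.log(1e-4) + 0.15 * rng.standard_normal()
            h[t] = np.exp(log_h)

        intraday = rng.standard_normal((days, m)) * np.sqrt(h[:, None] / m)
        rv = (intraday ** 2).sum(axis=1)        # realized variance, one per day

        # Measurement-equation view: log rv_t = log h_t + noise of known size
        err = np.log(rv) - np.log(h)
        print(f"noise sd {err.std():.3f} vs theory sqrt(2/m) = {np.sqrt(2 / m):.3f}")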
  8. By: Bada, Oualid; Kneip, Alois
    Abstract: We use a panel cointegration model with multiple time-varying individual effects to control for the missing factors in the credit spread puzzle. Our model specification enables us to capture the unobserved dynamics of the systematic risk premia in the bond market. In order to estimate the dimensionality of the hidden risk factors jointly with the model parameters, we rely on a modified version of the iterated least squares method proposed by Bai, Kao, and Ng (2009). Our result confirms the presence of four common risk components affecting U.S. corporate bonds during the period between September 2006 and March 2008. However, a single risk factor is sufficient to describe the data for all time periods prior to mid-July 2007, when the subprime crisis was detected in the financial market. The dimensionality of the unobserved risk components therefore seems to reflect the degree of difficulty in diversifying the individual bond risks.
    Keywords: Panel Data Model; Factor Analysis; Credit Spread; Systematic Risk Premium
    JEL: C33
    Date: 2010–10–19
    URL: http://d.repec.org/n?u=RePEc:pra:mprapa:26006&r=rmg
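    A bare-bones version of the iterated least squares idea of Bai, Kao, and Ng (2009) that the abstract relies on: alternate between OLS for the slope given the factor structure and principal components on the residuals for the factors. The data are synthetic, the factor dimension is fixed rather than estimated, and the paper's cointegration refinements are omitted.

        import numpy as np

        rng = np.random.default_rng(7)
        N, T, r = 50, 120, 2                    # bonds, periods, common factors
        beta_true = 1.5
        x = rng.standard_normal((N, T))
        lam = rng.standard_normal((N, r))
        f = rng.standard_normal((T, r))
        y = beta_true * x + lam @ f.T + 0.5 * rng.standard_normal((N, T))

        beta = 0.0
        for _ in range(100):
            resid = y - beta * x
            # factors = top-r principal components of the residuals
            _, evecs = np.linalg.eigh(resid.T @ resid / (N * T))
            f_hat = np.sqrt(T) * evecs[:, -r:]               # normalized so f'f/T = I
            lam_hat = resid @ f_hat / T                      # loadings by OLS
            common = lam_hat @ f_hat.T
            beta = np.sum(x * (y - common)) / np.sum(x * x)  # pooled OLS given factors
        print(f"estimated beta: {beta:.3f} (true {beta_true})")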
  9. By: Tomasz R. Bielecki; Igor Cialenco; Zhao Zhang
    Abstract: In this paper we present a theoretical framework for studying coherent acceptability indices in a dynamic setup. We study dynamic coherent acceptability indices and dynamic coherent risk measures, and we establish a duality between them. We derive a representation theorem for dynamic coherent risk measures in terms of so-called dynamically consistent sequences of sets of probability measures. Based on these results, we give a specific construction of dynamic coherent acceptability indices. We also provide examples of dynamic coherent acceptability indices, both abstract ones and some that generalize selected classical financial measures of portfolio performance.
    Date: 2010–10
    URL: http://d.repec.org/n?u=RePEc:arx:papers:1010.4339&r=rmg
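    For orientation, the static counterpart of the duality established in this paper is the representation of Cherny and Madan (2009), which links a coherent acceptability index alpha to an increasing family of coherent risk measures (rho_x), or equivalently to an increasing family of sets of probability measures (D_x); the paper's contribution is the dynamically consistent analogue of this correspondence. In LaTeX notation:

        \alpha(X) = \sup\bigl\{\, x \in \mathbb{R}_+ : \rho_x(X) \le 0 \,\bigr\},
        \qquad
        \rho_x(X) = -\inf_{\mathbb{Q} \in \mathcal{D}_x} \mathbb{E}^{\mathbb{Q}}[X].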
  10. By: Oualid Bada; Alois Kneip
    Abstract: We use a panel cointegration model with multiple time-varying individual effects to control for the enigmatic missing factors in the credit spread puzzle. Our model specification enables us to capture the unobserved dynamics of the systematic risk premia in the bond market. In order to estimate the dimensionality of the hidden risk factors jointly with the model parameters, we rely on a modified version of the iterated least squares method proposed by Bai, Kao, and Ng (2009). Our result confirms the presence of four common risk components affecting U.S. corporate bonds during the period between September 2006 and March 2008. However, a single risk factor is sufficient to describe the data for all time periods prior to mid-July 2007, when the subprime crisis was detected in the financial market. The dimensionality of the unobserved risk components therefore seems to reflect the degree of difficulty in diversifying the individual bond risks.
    Keywords: Corporate Bond; Credit Spread; Systematic Risk Premium; Panel Data Model with Interactive Fixed Effects; Factor Analysis; Dimensionality Criteria; Panel Cointegration
    JEL: D44 D82
    Date: 2010–10
    URL: http://d.repec.org/n?u=RePEc:bon:bonedp:bgse19_2010&r=rmg
  11. By: Monica Billio (Dipartimento di Scienze Economiche - University Ca' Foscari of Venezia); Ludovic Calès (University Ca' Foscari of Venezia and Centre d'Économie de la Sorbonne); Dominique Guegan (Centre d'Economie de la Sorbonne - Paris School of Economics)
    Abstract: Sharpe-like ratios have traditionally been used to measure the performance of portfolio managers. However, they are known to suffer major drawbacks. Among them, two are intricate: (1) they are relative to a peer's performance and (2) the best score is generally assumed to correspond to a "good" portfolio allocation, with no guarantee on the goodness of this allocation. Last but not least, (3) these measures suffer significant estimation errors, leading to an inability to distinguish two managers' performances. In this paper, we propose a cross-sectional measure of portfolio performance dealing with these three issues. First, we define the score of a portfolio over a single period as the percentage of investable portfolios outperformed by this portfolio. This score quantifies the goodness of the allocation, remedying drawbacks (1) and (2). The new information brought by the cross-sectionality of this score is then discussed through applications. Second, we build a performance index as the average cross-sectional score over successive periods, whose estimation partially answers drawback (3). In order to assess its informativeness, and using empirical data, we compare its forecasts with those of the Sharpe and Sortino ratios. The results show that our measure is the most robust and informative, validating the utility of such a cross-sectional performance measure.
    Keywords: Performance measure, portfolio management, relative-value strategy, large portfolios, absolute return strategy, multivariate statistics, Generalized Hyperbolic Distribution.
    JEL: C14 C44
    Date: 2010–08
    URL: http://d.repec.org/n?u=RePEc:mse:cesdoc:10070&r=rmg
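    A Monte Carlo sketch of the single-period score defined in this abstract: the fraction of investable portfolios that the evaluated portfolio outperforms. The investable set is proxied here by random long-only weights drawn from a Dirichlet distribution, which is an assumption for illustration; the paper's investable set and return model may differ.

        import numpy as np

        rng = np.random.default_rng(8)
        n_assets, n_draws = 20, 50_000
        period_returns = rng.normal(0.01, 0.05, n_assets)   # one period, synthetic

        w_eval = np.full(n_assets, 1 / n_assets)            # portfolio being scored
        r_eval = w_eval @ period_returns

        w_rand = rng.dirichlet(np.ones(n_assets), n_draws)  # random feasible portfolios
        score = (w_rand @ period_returns < r_eval).mean()   # share outperformed
        print(f"cross-sectional score: {score:.3f}")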
  12. By: Crezée, D.P.; Swinkels, L.A.P.
    Abstract: We investigate the construction of well-diversified high-conviction equity portfolios, building on Rudin and Morgan (2006), who introduced the Portfolio Diversification Index (PDI) as a new measure of portfolio diversification and applied it to long/short equity hedge funds in an in-sample period. We are the first to investigate the out-of-sample properties of the PDI. Our research applies a novel portfolio selection algorithm to maximize the PDI of a portfolio of stocks in the S&P 500 Index over the period 2000 to 2009. We construct equally-weighted, well-diversified portfolios consisting of 5 to 30 stocks and compare these with randomly selected portfolios of the same sizes. Our results indicate that investors using our algorithm to maximize the PDI can improve the diversification of high-conviction equity portfolios. For example, a portfolio of 20 stocks constructed using the algorithm with the PDI behaves out-of-sample as if it contains 10 independent stocks, i.e. a PDI score of 10. Although this is less than the PDI score of 15 achieved in-sample, it is a significant improvement over the PDI score of 7 obtained by a randomly selected portfolio. Our results are robust with respect to the number of stocks in the investment portfolio and the time period under consideration.
    Keywords: diversification;portfolio construction;risk reduction
    Date: 2010–03–01
    URL: http://d.repec.org/n?u=RePEc:dgr:eureri:1765021037&r=rmg
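    A short sketch of the Portfolio Diversification Index of Rudin and Morgan (2006) on which the paper builds: PDI = 2 * sum_k k * W_k - 1, where the W_k are the eigenvalues of the return correlation matrix, sorted in descending order and normalized to sum to one, so that N independent stocks score N and a single dominant factor scores 1. The returns below are simulated, and the paper's selection algorithm, which maximizes this quantity over stock subsets, is not reproduced.

        import numpy as np

        def pdi(returns):
            corr = np.corrcoef(returns, rowvar=False)
            w = np.sort(np.linalg.eigvalsh(corr))[::-1]
            w = w / w.sum()                     # relative strengths W_k
            k = np.arange(1, len(w) + 1)
            return 2 * np.sum(k * w) - 1

        rng = np.random.default_rng(9)
        common = rng.standard_normal((500, 1))  # one market factor
        rets = 0.6 * common + 0.8 * rng.standard_normal((500, 20))   # 20 stocks
        print(f"PDI: {pdi(rets):.2f} (20 independent stocks would score ~20)")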
  13. By: Nellie Zhang; Tom Hossfeld
    Abstract: The Large Value Transfer System (LVTS) loss-sharing mechanism was designed to ensure that, in the event of a one-participant default, the collateral pledged by direct members of the system would be sufficient to cover the largest possible net debit position of a defaulting participant. However, this may not hold if the indirect effects of the default are taken into consideration, or if two participants default during the same payment cycle. The authors examine surviving participants' total losses under both one- and two-participant default conditions, taking into account the potential knock-on effects of the default. Their analysis includes the impact of a decline in the value of LVTS collateral following an unexpected default. Simulations of participant defaults indicate that the impact on the LVTS is generally small; surviving participants do incur end-of-day collateral shortfalls, but only rarely and in small amounts. Under the two-participant default scenario, the likelihood of the Bank of Canada having to provide funds to ensure LVTS settlement is reasonably low, as is the average residual-coverage amount. The majority of LVTS participants pledge as collateral securities issued by other system members; however, the impact of an issuer of such collateral defaulting is generally not significant in the LVTS.
    Keywords: Financial stability; Payment, clearing, and settlement systems; Financial institutions
    JEL: E47 G21
    Date: 2010
    URL: http://d.repec.org/n?u=RePEc:bca:bocadp:10-14&r=rmg
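    A stylized survivors-pay allocation of the kind the LVTS loss-sharing mechanism implements: a defaulter's collateral shortfall is allocated to surviving participants in proportion to the bilateral credit limits (BCLs) they granted to the defaulter. The numbers are invented for illustration and ignore the system's caps and the multi-round knock-on effects studied in the paper.

        import numpy as np

        bcl = np.array([100.0, 250.0, 150.0])   # BCLs granted by three survivors
        shortfall = 60.0                        # net debit minus pledged collateral

        losses = shortfall * bcl / bcl.sum()    # pro-rata loss allocation
        for i, loss in enumerate(losses):
            print(f"survivor {i + 1}: allocated loss {loss:.1f}")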

General information on the NEP project can be found at https://nep.repec.org. For comments please write to the director of NEP, Marco Novarese at <director@nep.repec.org>. Put “NEP” in the subject, otherwise your mail may be rejected.
NEP’s infrastructure is sponsored by the School of Economics and Finance of Massey University in New Zealand.