on Risk Management
Issue of 2010–10–23
twelve papers chosen by
By: | Michael McAleer (Erasmus University Rotterdam, Tinbergen Institute, The Netherlands, and Institute of Economic Research, Kyoto University); Juan-Ángel Jiménez-Martín (Department of Quantitative Economics, Complutense University of Madrid); Teodosio Pérez-Amaral (Department of Quantitative Economics, Complutense University of Madrid) |
Abstract: | A risk management strategy is proposed as being robust to the Global Financial Crisis (GFC) by selecting a Value-at-Risk (VaR) forecast that combines the forecasts of different VaR models. The robust forecast is based on the median of the point VaR forecasts of a set of conditional volatility models. This risk management strategy is GFC-robust in the sense that maintaining the same risk management strategies before, during and after a financial crisis would lead to comparatively low daily capital charges and violation penalties. The new method is illustrated by using the S&P500 index before, during and after the 2008-09 global financial crisis. We investigate the performance of a variety of single and combined VaR forecasts in terms of daily capital requirements and violation penalties under the Basel II Accord, as well as other criteria. The median VaR risk management strategy is GFC-robust as it provides stable results across different periods relative to other VaR forecasting models. The new strategy based on combined forecasts of single models is straightforward to incorporate into existing computer software packages that are used by banks and other financial institutions. |
Keywords: | Value-at-Risk (VaR), daily capital charges, robust forecasts, violation penalties, optimizing strategy, aggressive risk management strategy, conservative risk management strategy, Basel II Accord, global financial crisis |
JEL: | G32 G11 C53 C22 |
Date: | 2010–10 |
URL: | http://d.repec.org/n?u=RePEc:kyo:wpaper:727&r=rmg |
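The combined forecast proposed above is simply the median of the single-model point VaR forecasts. A minimal sketch of that combination step (the model names and VaR numbers below are illustrative, not taken from the paper):

```python
import numpy as np

# Hypothetical one-day-ahead 99% VaR forecasts (expressed as positive losses,
# in % of portfolio value) from several conditional volatility models.
var_forecasts = {
    "GARCH": 2.10,
    "EGARCH": 2.45,
    "GJR": 2.30,
    "RiskMetrics": 1.95,
    "Stochastic Volatility": 2.60,
}

def median_var(forecasts: dict) -> float:
    """GFC-robust combined forecast: the median of the point VaR forecasts."""
    return float(np.median(list(forecasts.values())))

print(median_var(var_forecasts))  # -> 2.3, the middle of the five forecasts
```

Because the median discards the most aggressive and most conservative models at each date, the combined forecast changes little when any single model breaks down during a crisis, which is the sense in which the strategy is GFC-robust.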
By: | Roberto Violi (Bank of Italy) |
Abstract: | This paper explores the implications of systemic risk in Credit Structured Finance (CSF). Risk measurement issues loomed large during the 2007-08 financial crisis, as the massive, unprecedented number of downgrades of AAA senior bond tranches inflicted severe losses on banks, calling into question the credibility of Rating Agencies. I discuss the limits of the standard risk framework in CSF (the Gaussian Single Risk Factor Model, GSRFM), popular among market participants. If implemented in a 'static' fashion, the GSRFM can substantially underprice risk at times of stress. I introduce a simple 'dynamic' version of the GSRFM that captures the impact of large systemic shocks (e.g. a financial meltdown) on the value of CSF bonds (ABS, CDO, CLO, etc.). I argue that proper 'dynamic' modeling of systemic risk is crucial for gauging the exposure to default contagion ('correlation risk'). Two policy implications are drawn from a 'dynamic' GSRFM: (i) when rating CSF deals, Agencies should disclose additional risk information (e.g. expected losses under stressed scenarios; asset correlation estimates); and (ii) a 'point-in-time' approach to rating CSF bonds is more appropriate than a 'through-the-cycle' approach.
Keywords: | structured finance, systemic risk, credit risk measures, bond pricing |
JEL: | E44 E65 G12 G13 G14 G18 G21 G24 G28 G34 |
Date: | 2010–09 |
URL: | http://d.repec.org/n?u=RePEc:bdi:wptemi:td_774_10&r=rmg |
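The Gaussian Single Risk Factor Model discussed above is the standard Vasicek setup, in which the default probability conditional on a realization m of the systemic factor rises sharply for large negative m. A minimal sketch of why a 'static' reading (pricing off the unconditional PD) underprices stress; all parameter values are illustrative:

```python
from math import sqrt
from statistics import NormalDist

N = NormalDist()  # standard normal CDF and inverse CDF

def conditional_pd(pd: float, rho: float, m: float) -> float:
    """Vasicek single-risk-factor conditional default probability:
    PD given a systemic factor realization m (m < 0 is a downturn),
    where pd is the unconditional PD and rho the asset correlation."""
    return N.cdf((N.inv_cdf(pd) - sqrt(rho) * m) / sqrt(1.0 - rho))

pd, rho = 0.01, 0.20
print(conditional_pd(pd, rho, 0.0))   # baseline conditional PD
print(conditional_pd(pd, rho, -3.0))  # meltdown-like shock: far higher PD
```

A 'dynamic' treatment, as the paper argues, must price the tranche against the whole distribution of m, including such stressed realizations, rather than against the unconditional PD alone.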
By: | Caporin, M.; McAleer, M.J. |
Abstract: | This paper focuses on the selection and comparison of alternative non-nested volatility models. We review the traditional in-sample methods commonly applied in the volatility framework, namely diagnostic checking procedures, information criteria, and conditions for the existence of moments and asymptotic theory, as well as the out-of-sample model selection approaches, such as mean squared error and Model Confidence Set approaches. The paper develops some innovative loss functions which are based on Value-at-Risk forecasts. Finally, we present an empirical application based on simple univariate volatility models, namely GARCH, GJR, EGARCH, and Stochastic Volatility that are widely used to capture asymmetry and leverage. |
Keywords: | volatility model selection; volatility model comparison; non-nested models; model confidence set; Value-at-Risk forecasts; asymmetry; leverage
Date: | 2010–10–12 |
URL: | http://d.repec.org/n?u=RePEc:dgr:eureir:1765020940&r=rmg |
By: | Benjamin M. Tabak; Dimas M. Fazio; Daniel O. Cajueiro |
Abstract: | This paper tests whether diversification of the credit portfolio at the bank level is associated with better performance and lower risk. We employ a new high-frequency (monthly) panel dataset constructed for the Brazilian banking system, with bank-level information on loans by economic sector. We find that loan portfolio concentration increases returns and also reduces default risk; there are significant size effects; foreign and public banks seem to be less affected by the degree of diversification. An important additional finding is that there is an increasing concentration trend after the outbreak of the recent international financial crisis, especially after the failure of Lehman Brothers.
Date: | 2010–10 |
URL: | http://d.repec.org/n?u=RePEc:bcb:wpaper:215&r=rmg |
By: | Alexie Alupoaiei |
Abstract: | In this paper I analyze the use of copulas in financial applications, namely to investigate the assumption of asymmetric dependence and to compute some measures of risk. For this purpose I use a portfolio consisting of four currencies from Central and Eastern Europe. Due to some stylized facts observed in exchange rate series, I filter the data with an ARMA-GJR model. The marginal distributions of the filtered residuals are fitted with a semi-parametric CDF, using a Gaussian kernel for the interior of the distribution and the Generalized Pareto Distribution for the tails. To obtain a better view of the dependence among the four currencies, I propose a decomposition of the large portfolio into three bivariate sub-portfolios. For each of them I compute Value-at-Risk and Conditional Value-at-Risk and then backtest the results.
Keywords: | Value-at-Risk, copula, Generalized Pareto Distribution |
Date: | 2010–10 |
URL: | http://d.repec.org/n?u=RePEc:cab:wpaefr:44&r=rmg |
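The semi-parametric marginal CDF described above (Gaussian kernel in the interior, GPD in the tails) can be sketched as a piecewise function. The thresholds, bandwidth and GPD parameters below are placeholders, not the paper's fitted values, which would come from the ARMA-GJR residuals:

```python
from math import erf, sqrt

def kernel_cdf(x, data, h):
    """Gaussian-kernel estimate of the CDF at x (used in the interior)."""
    return sum(0.5 * (1.0 + erf((x - d) / (h * sqrt(2.0)))) for d in data) / len(data)

def semiparam_cdf(x, data, h, u_lo, u_hi, xi, beta):
    """Piecewise semi-parametric CDF: kernel estimate between the thresholds
    u_lo and u_hi, GPD beyond them. xi (shape) and beta (scale) are assumed
    already fitted to the tail exceedances; xi != 0 is assumed for brevity."""
    n = len(data)
    if x > u_hi:                                  # upper GPD tail
        p_u = sum(d <= u_hi for d in data) / n
        return 1.0 - (1.0 - p_u) * (1.0 + xi * (x - u_hi) / beta) ** (-1.0 / xi)
    if x < u_lo:                                  # lower GPD tail, mirrored
        p_l = sum(d <= u_lo for d in data) / n
        return p_l * (1.0 + xi * (u_lo - x) / beta) ** (-1.0 / xi)
    return kernel_cdf(x, data, h)                 # kernel interior
```

Mapping each residual series through its own semi-parametric CDF yields approximately uniform margins, which is the input the copula estimation step requires.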
By: | Helder Ferreira de Mendonça; Délio José Cordeiro Galvão; Renato Falci Villela Loures |
Abstract: | The advance of globalization of the international financial market has implied more complex portfolio risk for banks. Furthermore, several developments, such as the growth of e-banking and the increase in accounting irregularities, call attention to operational risk. This article presents an analysis of the estimation of economic capital for operational risk in a case from the Brazilian banking industry, making use of Markov chains, extreme value theory, and peaks-over-threshold modelling. The findings indicate that some existing methods produce consistent results among institutions with similar loss-data characteristics. Moreover, even when methods with good fit, such as EVT-POT, are applied, the capital estimates can vary widely and become unrealistic.
Date: | 2010–10 |
URL: | http://d.repec.org/n?u=RePEc:bcb:wpaper:213&r=rmg |
By: | L. Spadafora; G. P. Berman; F. Borgonovi |
Abstract: | In the Black-Scholes context we consider the probability distribution function (PDF) of financial returns implied by the volatility smile, and we study the relation between the decay of its tails and the fitting parameters of the smile. We show that, using a scaling law derived from data, it is possible to obtain a new fitting procedure for the volatility smile that also accounts for the exponential decay of the real PDF of returns observed in financial markets. Our study finds application in risk management, where characterizing the tails of the financial returns PDF plays a central role in risk estimation.
Date: | 2010–10 |
URL: | http://d.repec.org/n?u=RePEc:arx:papers:1010.2184&r=rmg |
By: | Mapa, Dennis S.; Cayton, Peter Julian; Lising, Mary Therese |
Abstract: | Financial institutions hold risks in their investments that can potentially affect their ability to serve their clients. For banks to weigh their risks, Value-at-Risk (VaR) methodology is used, which involves studying the distribution of losses and formulating a statistic from this distribution. Among the myriad of models, this paper proposes a method of formulating VaR using the Generalized Pareto Distribution (GPD) with time-varying parameters driven by explanatory variables (TiVEx) in a peaks-over-thresholds (POT) model. The time-varying parameters are linked to the linear predictor variables through link functions. To estimate the parameters of the linear predictors, maximum likelihood estimation is used, with the time-varying parameters substituted into the likelihood function of the GPD. The test series used for the paper was the Philippine Peso-US Dollar exchange rate, spanning January 2, 1997 to March 13, 2009. Explanatory variables used were GARCH volatilities, quarter dummies, the number of holiday-weekends passed, and an annual trend. Three selected permutations of the TiVEx-POT model, obtained by dropping covariates, were also estimated. Results show that econometric models and static POT models performed better in predicting losses from exchange rate risk, but simple TiVEx models have potential as part of the VaR modelling philosophy, since they maintain a consistent green status on the number of exemptions and lower quadratic loss values.
Keywords: | Value-at-Risk; Extreme Value Theory; Generalized Pareto Distribution; Time-Varying Parameters; Use of Explanatory Variables; GARCH modeling; Peaks-over-Thresholds Model |
JEL: | G12 C53 C22 C01 |
Date: | 2009–12 |
URL: | http://d.repec.org/n?u=RePEc:pra:mprapa:25772&r=rmg |
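The static POT benchmark against which the TiVEx variants are compared follows the standard GPD tail formula, VaR_q = u + (beta/xi) * (((n/N_u) * (1 - q))^(-xi) - 1). The sketch below uses a method-of-moments GPD fit as a simple stand-in for the maximum likelihood estimation employed in the paper, and assumes xi != 0:

```python
import numpy as np

def fit_gpd_moments(exceedances):
    """Method-of-moments GPD fit: shape xi and scale beta from the
    mean and variance of the threshold exceedances."""
    m, s2 = np.mean(exceedances), np.var(exceedances, ddof=1)
    xi = 0.5 * (1.0 - m * m / s2)
    beta = 0.5 * m * (m * m / s2 + 1.0)
    return xi, beta

def pot_var(losses, u, q):
    """Static POT Value-at-Risk at level q, with threshold u:
    fit a GPD to the exceedances, then invert its tail."""
    losses = np.asarray(losses)
    exc = losses[losses > u] - u
    xi, beta = fit_gpd_moments(exc)
    n, n_u = len(losses), len(exc)
    return u + (beta / xi) * (((n / n_u) * (1.0 - q)) ** (-xi) - 1.0)
```

In the TiVEx extension, xi and beta would instead be functions of the explanatory variables through link functions, re-estimated within the GPD likelihood rather than fixed over the sample.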
By: | Bogdan Chiriacescu |
Abstract: | The importance of credit risk assessment and monitoring has increased since the recent financial turmoil. This paper presents a toolkit for credit risk modeling that follows the top-down approach proposed by Wilson (1997). The analysis is conducted separately for the household and corporate sectors, by means of panel techniques and seemingly unrelated equations, using default data aggregated at the county or business-sector level. The results indicate that the determinants of default on bank loans for the household sector are unemployment, the exchange rate, industrial production, indebtedness and interest rate spreads, while for the corporate sector the output gap, indebtedness and the exchange rate are the main factors. Comparing the two models, it emerges that in the case of adverse macroeconomic developments default events occur sooner in the corporate sector than in the household sector. There are two possible explanations: i) there is no personal bankruptcy law for individuals in Romania, and ii) public administration, which employs an important part of the work force, appears to adjust more slowly during recessions. Furthermore, a stress-testing analysis is performed on arbitrarily built portfolios by considering the impact of macroeconomic shocks on the probabilities of default over a one-year horizon.
Keywords: | credit risk models, top-down approach |
Date: | 2010–10 |
URL: | http://d.repec.org/n?u=RePEc:cab:wpaefr:46&r=rmg |
By: | Antonella Campana (Dept. SEGeS, University of Molise); Paola Ferretti (Dept. of Applied Mathematics and Advanced School of Economics, University Ca'Foscari of Venice) |
Abstract: | With reference to the risk-adjusted premium principle, in this paper we study excess of loss reinsurance with reinstatements in the case in which the aggregate claims are generated by a discrete distribution. In particular, we focus on conditions ensuring the feasibility of the initial premium, for example with respect to the limit on the payment of each claim. Comonotonic exchangeability shows the way towards a more general definition of the initial premium: some properties characterizing the proposed premium are presented.
Keywords: | Excess of loss reinsurance; reinstatements; distortion risk measures; initial premium; exchangeability. |
JEL: | G22 |
Date: | 2010–10 |
URL: | http://d.repec.org/n?u=RePEc:vnm:wpaper:203&r=rmg |
By: | Richard Watt (University of Canterbury); Francisco J. Vazquez |
Abstract: | Traditionally, downside risk aversion is the study of the placement of a pure risk (a secondary risk) on either the upside or the downside of a primary two-state risk. When the decision maker prefers to have the secondary risk placed on the upside rather than the downside of the primary lottery, he is said to display downside risk aversion. The literature on the intensity of downside risk aversion has been clear on the point that greater prudence is not equivalent to greater downside risk aversion, although the two concepts are linked. In the present paper we present a new, and we argue equally natural, concept of the downside risk aversion of a decision maker, namely the fraction of a zero mean risk that the decision maker would optimally place on the upside. We then consider how this measure can be used to identify the intensity of downside risk aversion. Specifically, we show that greater downside risk aversion in our model can be accurately measured by a relationship that is very similar to, although somewhat stronger than, greater prudence. |
Keywords: | risk aversion; prudence; downside risk |
JEL: | D8 |
Date: | 2010–05–14 |
URL: | http://d.repec.org/n?u=RePEc:cbt:econwp:10/61&r=rmg |
By: | Fabio Fornari (European Central Bank, Kaiserstrasse 29, 60311 Frankfurt am Main, Germany.); Wolfgang Lemke (European Central Bank, Kaiserstrasse 29, 60311 Frankfurt am Main, Germany.) |
Abstract: | We forecast recession probabilities for the United States, Germany and Japan. The predictions are based on the widely-used probit approach, but the dynamics of the regressors are endogenized using a VAR. The combined model is called a 'ProbVAR'. At any point in time, the ProbVAR can generate conditional recession probabilities for any sequence of forecast horizons. At the same time, the ProbVAR is as easy to implement as traditional probit regressions. The slope of the yield curve turns out to be a successful predictor, but forecasts can be markedly improved by adding other financial variables such as the short-term interest rate, stock returns or corporate bond spreads. The forecasting performance is very good for the United States: in the out-of-sample exercise (1995 to 2009), the best ProbVAR specification correctly identifies the ex-post classification of recessions and non-recessions 95% of the time for the one-quarter forecast horizon and 87% of the time for the four-quarter horizon. Moreover, the ProbVAR turns out to improve significantly upon survey forecasts. Relative to the good performance reached for the United States, the ProbVAR forecasts are slightly worse for Germany, but considerably inferior for Japan.
Keywords: | Recessions, forecasting, probit, VAR
JEL: | C25 C32 E32 E37
Date: | 2010–10 |
URL: | http://d.repec.org/n?u=RePEc:ecb:ecbwps:20101255&r=rmg |
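The ProbVAR mechanics — iterate the regressor dynamics forward, then map each forecast through the probit link — can be sketched with a single regressor (the yield-curve slope) and an AR(1) standing in for the VAR. All coefficients below are made up for the sketch, not estimates from the paper:

```python
from math import erf, sqrt

def phi_cdf(z: float) -> float:
    """Standard normal CDF (the probit link)."""
    return 0.5 * (1.0 + erf(z / sqrt(2.0)))

# Illustrative coefficients: probit P(recession) = Phi(a + b * slope),
# slope dynamics slope_{t+1} = c + rho * slope_t.
a, b = -1.2, -0.8
c, rho = 0.02, 0.9

def recession_path(slope0: float, horizons: int) -> list:
    """Conditional recession probabilities for horizons 1..H: at each step
    the regressor is forecast through its own dynamics (the 'VAR' part),
    then mapped to a probability (the 'probit' part)."""
    probs, s = [], slope0
    for _ in range(horizons):
        s = c + rho * s                   # VAR step: forecast the slope
        probs.append(phi_cdf(a + b * s))  # probit step: map to a probability
    return probs

print(recession_path(-0.5, 4))  # inverted curve: elevated, slowly decaying probs
```

With these coefficients an inverted curve (negative slope) produces much higher recession probabilities than a steep one, and the probabilities decay as the slope mean-reverts, which mirrors how the ProbVAR delivers a full term structure of recession risk from one fitted model.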