New Economics Papers
on Risk Management
Issue of 2012‒02‒20
twelve papers chosen by



  1. Time-varying conditional Johnson SU density in value-at-risk (VaR) methodology By Cayton, Peter Julian A.; Mapa, Dennis S.
  2. Calibration of factor models with equity data: parade of correlations By Baranovski, Alexander L.
  3. A Dynamical Approach to Operational Risk Measurement By Marco Bardoscia; Roberto Bellotti
  4. Contagion in financial networks: a threat index By Gabrielle Demange
  5. Harmonising Basel III and the Dodd Frank Act through greater collaboration between standard setters and national supervisors By Ojo, Marianne
  6. Pitfalls in modeling loss given default of bank loans By Hibbeln, Martin; Gürtler, Marc
  7. Volatility timing and portfolio selection: How best to forecast volatility By Adam E Clements; Annastiina Silvennoinen
  8. Portfolio Management for Private Investors Based on the State Preference Approach By Fäßler, Robert; Kraus, Christina; Weiler, Sebastian M.; Abukadyrova, Kamila
  9. Harmonising Basel III and the Dodd Frank Act through international accounting standards: reasons why international accounting standards should serve as “thermostats” By Ojo, Marianne
  10. Ensemble properties of high frequency data and intraday trading rules By Fulvio Baldovin; Francesco Camana; Massimiliano Caporin; Attilio L. Stella
  11. Stable solutions for optimal reinsurance problems involving risk measures. By Balbás, Alejandro; Balbás, Beatriz; Heras, Antonio
  12. What is Systemic Risk and what can be done about it? A Legal Perspective By Joanna Gray

  1. By: Cayton, Peter Julian A.; Mapa, Dennis S.
    Abstract: A stylized fact of financial time series is that return volatility departs from normality, with leverage effects and heavier tails that produce extreme losses of greater magnitude. Value-at-risk is a standard method of forecasting possible future losses on investments. A procedure for estimating value-at-risk using a time-varying conditional Johnson SU distribution is introduced and assessed alongside standard econometric models. The Johnson distribution makes it possible to model the higher parameters with a time-varying structure using maximum likelihood estimation techniques. Two modeling procedures based on the Johnson distribution are introduced: joint estimation of the volatility and higher parameters, and a two-step procedure in which the volatility is estimated separately from the higher parameters. The procedures are demonstrated on Philippine foreign exchange rates and the Philippine stock exchange index, and assessed with forecast evaluation measures against alternative value-at-risk methodologies. The research opens up modeling procedures in which manipulation of the higher parameters can be integrated into the value-at-risk methodology.
    Keywords: Time Varying Parameters; GARCH models; Nonnormal distributions; Risk Management
    JEL: G12 C53 G32 C22
    Date: 2012–01
    URL: http://d.repec.org/n?u=RePEc:pra:mprapa:36206&r=rmg
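To illustrate the two-step procedure described in entry 1 above, here is a minimal sketch in Python. It is an assumption-laden illustration, not the authors' code: an EWMA filter stands in for the paper's GARCH volatility step, the Johnson SU shape parameters are held constant rather than time-varying, and `returns` is a hypothetical array of daily log returns.

```python
import numpy as np
from scipy import stats

def johnson_su_var(returns, alpha=0.01, lam=0.94):
    """Two-step VaR sketch: EWMA volatility, then a Johnson SU quantile.

    Step 1 is a stand-in for the paper's GARCH filter; step 2 fits
    constant Johnson SU shape parameters to standardized returns,
    whereas the paper lets the higher parameters vary over time.
    """
    r = np.asarray(returns, dtype=float)
    # Step 1: EWMA conditional variance (RiskMetrics-style stand-in for GARCH).
    sigma2 = np.empty_like(r)
    sigma2[0] = r.var()
    for t in range(1, len(r)):
        sigma2[t] = lam * sigma2[t - 1] + (1 - lam) * r[t - 1] ** 2
    z = r / np.sqrt(sigma2)  # standardized returns
    # Step 2: ML fit of Johnson SU shape/location/scale to the residuals.
    a, b, loc, scale = stats.johnsonsu.fit(z)
    q = stats.johnsonsu.ppf(alpha, a, b, loc=loc, scale=scale)
    # One-day-ahead VaR at level alpha, reported as a positive loss.
    sigma_next = np.sqrt(lam * sigma2[-1] + (1 - lam) * r[-1] ** 2)
    return -q * sigma_next

# Example with simulated heavy-tailed returns:
rng = np.random.default_rng(0)
rets = stats.t.rvs(df=4, size=2000, random_state=rng) * 0.01
print(johnson_su_var(rets, alpha=0.01))
```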
  2. By: Baranovski, Alexander L.
    Abstract: This paper describes maximum-likelihood estimation of equity correlations, which can be used as proxies for asset correlations. In a Gaussian framework the ML estimators are given in closed form. On this basis the impact of Lehman’s collapse on the dynamics of correlations is investigated: after the Lehman failure in September 2008, correlations rose across all economic sectors.
    Keywords: intra/inter asset correlations; maximum likelihood estimation; single risk factor model; normal mixture; VAR of equity portfolio
    JEL: C10
    Date: 2012–01–30
    URL: http://d.repec.org/n?u=RePEc:pra:mprapa:36300&r=rmg
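Since the closed-form Gaussian ML estimator of a pairwise correlation is the sample correlation (with 1/n normalization), the rolling estimation underlying entry 2 can be sketched as below. The window length, the simulated series and the mid-sample regime shift mimicking the Lehman failure are illustrative assumptions, not the paper's data.

```python
import numpy as np

def rolling_ml_corr(x, y, window=250):
    """Rolling Gaussian ML estimate of the correlation of two return series.

    For bivariate normal data the ML estimator is the sample correlation,
    so each window is computed in closed form.
    """
    x, y = np.asarray(x, float), np.asarray(y, float)
    out = np.full(len(x), np.nan)
    for t in range(window, len(x) + 1):
        xs, ys = x[t - window:t], y[t - window:t]
        xc, yc = xs - xs.mean(), ys - ys.mean()
        out[t - 1] = (xc @ yc) / np.sqrt((xc @ xc) * (yc @ yc))
    return out

# Two correlated series whose common-factor loading jumps mid-sample,
# mimicking a Lehman-style regime shift:
rng = np.random.default_rng(1)
n = 1000
common = rng.standard_normal(n)
load = np.where(np.arange(n) < n // 2, 0.3, 0.8)
a = load * common + np.sqrt(1 - load**2) * rng.standard_normal(n)
b = load * common + np.sqrt(1 - load**2) * rng.standard_normal(n)
print(rolling_ml_corr(a, b)[[499, 999]])  # before vs. after the shift
```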
  3. By: Marco Bardoscia; Roberto Bellotti
    Abstract: We propose a dynamical model for the estimation of Operational Risk in banking institutions. Operational Risk is the risk that a financial loss occurs as the result of failed processes; examples of operational losses are those generated by internal fraud, human error or failed transactions. In order to encompass the most heterogeneous set of processes, in our approach the losses of each process are generated by the interplay among random noise, interactions with other processes and the efforts the bank makes to avoid losses. We show how some relevant parameters of the model can be estimated from a database of historical operational losses, validate the estimation procedure and test the forecasting power of the model. Advantages of our approach over traditional statistical techniques are that it allows us to follow the whole time evolution of the losses and to take into account correlations among the processes at different times.
    Date: 2012–02
    URL: http://d.repec.org/n?u=RePEc:arx:papers:1202.2532&r=rmg
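The abstract of entry 3 does not spell out the loss dynamics, so the sketch below simulates one plausible reading only: a guessed linear specification in which each process's loss is driven by random noise, cross-process interactions and a loss-avoidance effort term. The coefficient matrix `W` and the `effort` parameter are made-up illustrations, not the paper's model.

```python
import numpy as np

def simulate_op_losses(W, effort, sigma, T=1000, seed=0):
    """Simulate coupled operational-loss processes (illustrative form only).

    x[t] = W @ x[t-1] - effort + noise, with losses the positive part, so
    interactions (W), bank effort and random shocks all play a role.
    """
    rng = np.random.default_rng(seed)
    n = W.shape[0]
    x = np.zeros(n)
    losses = np.empty((T, n))
    for t in range(T):
        x = W @ x - effort + sigma * rng.standard_normal(n)
        losses[t] = np.maximum(x, 0.0)  # only positive values count as losses
    return losses

# Three interacting processes; process 0 feeds losses into process 1.
W = np.array([[0.5, 0.0, 0.0],
              [0.4, 0.3, 0.0],
              [0.0, 0.0, 0.2]])
losses = simulate_op_losses(W, effort=0.1, sigma=1.0)
print(losses.mean(axis=0))  # average loss per process
```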
  4. By: Gabrielle Demange (PSE - Paris-Jourdan Sciences Economiques - CNRS : UMR8545 - Ecole des Hautes Etudes en Sciences Sociales (EHESS) - Ecole des Ponts ParisTech - Ecole Normale Supérieure de Paris - ENS Paris - INRA, EEP-PSE - Ecole d'Économie de Paris - Paris School of Economics - Ecole d'Économie de Paris)
    Abstract: An intricate web of claims and obligations ties together the balance sheets of a wide variety of financial institutions. When defaults occur, these interbank claims generate externalities across institutions and may spread default and bankruptcy. Building on a simple model for the joint determination of the repayments of interbank claims, this paper introduces a measure of the threat that a bank poses to the system. This measure, called the threat index, may help determine how to inject cash into banks so as to increase debt reimbursement, or to assess the contributions of individual institutions to the risk in the system. Although the threat index and the default level of a bank both reflect some form of weakness and are affected by the whole liability network, the two indicators differ. As a result, injecting cash into the banks with the largest default levels may not be optimal.
    Keywords: Contagion ; Systemic risk ; Financial linkages ; Bankruptcy
    Date: 2012–01
    URL: http://d.repec.org/n?u=RePEc:hal:psewpa:halshs-00662513&r=rmg
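Entry 4 builds on a model for the joint determination of interbank repayments. A standard formalisation of such joint repayments is an Eisenberg-Noe-style clearing fixed point, sketched below under made-up liabilities and cash endowments; the paper's threat index itself is not reproduced here.

```python
import numpy as np

def clearing_payments(L, e, tol=1e-12, max_iter=10_000):
    """Eisenberg-Noe-style clearing vector by fixed-point iteration.

    L[i, j] is the nominal liability of bank i to bank j; e[i] is bank i's
    outside cash. Each bank pays min(total owed, cash + interbank receipts),
    with payments split proportionally to nominal liabilities.
    """
    L = np.asarray(L, float)
    e = np.asarray(e, float)
    pbar = L.sum(axis=1)  # total nominal obligations per bank
    with np.errstate(invalid="ignore", divide="ignore"):
        Pi = np.where(pbar[:, None] > 0, L / pbar[:, None], 0.0)  # payout shares
    p = pbar.copy()
    for _ in range(max_iter):
        p_new = np.minimum(pbar, e + Pi.T @ p)  # pay what is owed or what is available
        if np.max(np.abs(p_new - p)) < tol:
            break
        p = p_new
    return p

# Three banks: bank 0 owes 10 to bank 1, bank 1 owes 10 to bank 2, etc.
L = np.array([[0., 10., 0.],
              [0., 0., 10.],
              [5., 0., 0.]])
e = np.array([2., 1., 8.])
print(clearing_payments(L, e))  # banks short of cash default partially
```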
  5. By: Ojo, Marianne
    Abstract: The preceding paper, “Harmonising Basel III and the Dodd Frank Act through International Accounting Standards – Reasons why International Accounting Standards Should Serve as “Thermostats””, considered one vital means whereby the Basel III framework and the Dodd Frank Act could achieve a respectable degree of harmonization. This paper considers another important means of effectively achieving the aims and objectives of these major regulatory reforms, which are aimed at greater financial stability. In so doing, it highlights the challenges encountered by the Basel III framework, as well as those encountered by the Dodd Frank Act – particularly in the areas of enforcement, coordination and communication. In facilitating better enforcement, the need for high-level principles, bright-line rules and a more effective mandate is emphasized. Furthermore, greater collaboration between standard setters and national supervisors requires effective coordination and communication mechanisms that ensure vital decisions and information are communicated in a timely, accurate, effective and complete manner.
    Keywords: financial stability; Volcker Rule; Basel III; Dodd Frank; European Systemic Risk Board; supervisors; Basel Committee; coordination; information asymmetry; regulation; high level principles
    JEL: D0 E02 K2 D8 G01
    Date: 2012–01–25
    URL: http://d.repec.org/n?u=RePEc:pra:mprapa:36164&r=rmg
  6. By: Hibbeln, Martin; Gürtler, Marc
    Abstract: The parameter loss given default (LGD) of loans plays a crucial role in risk-based decision making by banks, including risk-adjusted pricing. Depending on the quality of their LGD estimates, banks can gain a significant competitive advantage. For bank loans, estimation is usually based on discounted recovery cash flows, leading to workout LGDs. In this paper, we reveal several problems that may occur when modeling workout LGDs and that lead to LGD estimates which are biased or have low explanatory power. Based on a data set of 71,463 defaulted bank loans, we analyze these issues and derive recommendations for avoiding these problems. Because the observation period of recovery cash flows is restricted, length-biased sampling occurs: long workout processes are underrepresented in the sample, leading to an underestimation of LGDs. Write-offs and recoveries are often driven by different influencing factors, which the empirical literature on LGD modeling ignores. We propose a two-step approach for modeling LGDs of non-defaulted loans which accounts for these differences and improves explanatory power. For LGDs of defaulted loans, the type of default and the length of the default period have high explanatory power, but estimates relying on these variables can lead to a significant underestimation of LGDs. We propose a model for defaulted loans which makes use of these influencing factors and leads to consistent LGD estimates.
    Keywords: Credit risk, Bank loans, Loss given default, Forecasting
    JEL: G21 G28
    Date: 2011
    URL: http://d.repec.org/n?u=RePEc:zbw:tbsifw:if35v1&r=rmg
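A minimal sketch of a two-step LGD model in the spirit of entry 6: a logistic model for whether a workout ends in write-off and a separate regression for the recovery rate, combined into an expected LGD. The synthetic features and the assumption that write-offs recover nothing are illustrative, not the paper's specification.

```python
import numpy as np
from sklearn.linear_model import LinearRegression, LogisticRegression

rng = np.random.default_rng(2)
n = 5_000
X = rng.standard_normal((n, 3))  # hypothetical loan features
writeoff = rng.random(n) < 1 / (1 + np.exp(-(X[:, 0] - 0.5)))
recovery = np.clip(0.6 - 0.2 * X[:, 1] + 0.1 * rng.standard_normal(n), 0, 1)

# Step 1: probability that the workout ends in a write-off.
clf = LogisticRegression().fit(X, writeoff)
p_wo = clf.predict_proba(X)[:, 1]

# Step 2: recovery rate conditional on no write-off, modeled separately
# because write-offs and recoveries may have different drivers.
reg = LinearRegression().fit(X[~writeoff], recovery[~writeoff])
rec_hat = np.clip(reg.predict(X), 0, 1)

# Combine: a write-off is assumed to recover nothing in this toy setting.
lgd_hat = p_wo * 1.0 + (1 - p_wo) * (1 - rec_hat)
print(lgd_hat[:5])
```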
  7. By: Adam E Clements (QUT); Annastiina Silvennoinen (QUT)
    Abstract: Within the context of volatility timing and portfolio selection, this paper considers how best to estimate a volatility model. Two issues are dealt with: the frequency of the data used to construct volatility estimates, and the loss function used to estimate the parameters of a volatility model. We find support for the use of intraday data for estimating volatility, which is consistent with earlier research. We also find that the choice of loss function is important, and show that a simple mean squared error loss overall provides the best forecasts of volatility upon which to form optimal portfolios.
    Keywords: Volatility, volatility timing, utility, portfolio allocation, realized volatility
    JEL: C22 G11 G17
    Date: 2011–10–12
    URL: http://d.repec.org/n?u=RePEc:qut:auncer:2011_7&r=rmg
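A minimal sketch of estimating a volatility model's parameter by directly minimising a mean squared error forecast loss, in the spirit of entry 7. The EWMA specification and the squared-return proxy for realized variance are illustrative assumptions, not the paper's setup.

```python
import numpy as np
from scipy.optimize import minimize_scalar

def ewma_var(r, lam):
    """One-step-ahead EWMA variance forecasts for a return series."""
    s = np.empty_like(r)
    s[0] = r.var()
    for t in range(1, len(r)):
        s[t] = lam * s[t - 1] + (1 - lam) * r[t - 1] ** 2
    return s

def mse_loss(lam, r, proxy):
    """MSE between variance forecasts and a realized-variance proxy."""
    return np.mean((ewma_var(r, lam) - proxy) ** 2)

rng = np.random.default_rng(3)
r = rng.standard_normal(2000) * 0.01
proxy = r ** 2  # squared returns as a crude realized-variance proxy
res = minimize_scalar(mse_loss, bounds=(0.8, 0.999), args=(r, proxy),
                      method="bounded")
print(res.x)  # decay parameter chosen by the MSE forecast loss
```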
  8. By: Fäßler, Robert; Kraus, Christina; Weiler, Sebastian M.; Abukadyrova, Kamila
    Abstract: Within the existing portfolio theory, risk assessment relies on assumptions of normally distributed returns or on correlations estimated from historical data. In the financial crisis of 2008/09, however, the correlations between risky assets rose sharply, while their returns showed outliers in the negative range for which a normality assumption has no explanatory content. This paper therefore uses the state preference approach to derive, implicitly, the probability beliefs and risk attitude of the capital market. To this end, a square payoff matrix is constructed from the market prices of selected assets in January 2011 and their payoffs at a specified future date. The payoffs are forecast with a multivariate regression approach for a fixed set of macroeconomic states. It is shown that the state-price distribution of the capital market can be obtained as an approximate solution under the principle of arbitrage-free markets and reveals the market's risk attitude. By adjusting the payoffs for the risk-free interest rate and the asset-specific risk premium, the state prices can, for example, be interpreted as the capital market's true probabilities and carried over into the minimum-variance portfolio model, where, under specified assumptions, they can be used to construct and optimize portfolios.
    JEL: G11 G13
    Date: 2011
    URL: http://d.repec.org/n?u=RePEc:zbw:bayfat:201103&r=rmg
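The core computation in entry 8 is backing implicit state prices out of a square payoff matrix under no-arbitrage. A minimal sketch with a made-up three-asset, three-state example follows; the paper's regression-based payoff forecasts are not reproduced.

```python
import numpy as np

# Rows: assets; columns: macroeconomic states. Entry [i, s] is the payoff
# of asset i if state s occurs (made-up numbers for illustration).
payoff = np.array([[105., 105., 105.],   # riskless bond
                   [130., 100., 70.],    # equity
                   [95., 100., 110.]])   # defensive asset
price = np.array([99.75, 95.0, 96.5])    # observed market prices

# No-arbitrage: price = payoff @ q, so the state-price vector q solves
# a square linear system when the payoff matrix has full rank.
q = np.linalg.solve(payoff, price)
print("state prices:", q)

# Risk-neutral (pseudo-)probabilities: rescale state prices to sum to 1.
pi = q / q.sum()
print("implied probabilities:", pi)
```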
  9. By: Ojo, Marianne
    Abstract: Why should differences between regulatory and accounting policies be mitigated? Because mitigating such differences could facilitate convergence – as well as financial stability. The paper “Fair Value Accounting and Procyclicality: Mitigating Regulatory and Accounting Policy Differences through Regulatory Structure Reforms and Enforced Self Regulation” illustrates how the implementation of accounting standards and policies has, in certain instances, contrasted with Basel Committee initiatives aimed at mitigating procyclicality and facilitating forward-looking provisioning. That paper also highlights how and why differences between regulatory and accounting policies could (and should) be mitigated. The present paper focuses on how recent regulatory reforms – with particular reference to the Dodd Frank Act – impact fair value measurements. Other potential implications for accounting measurements and valuation will also be considered. Given the tendency for discrepancies to arise between regulatory and accounting policies, and owing to discrepancies between Basel III and the Dodd Frank Act, would a more imposing and commanding role for international standards not serve as a powerful weapon for harmonizing Basel III and Dodd Frank – whilst mitigating regulatory and accounting policy differences?
    Keywords: financial stability; OTC derivatives markets; counterparty risks; disclosure; information asymmetry; transparency; living wills; Volcker Rule; Basel III; Basel II; pro cyclicality; international auditing standards; Dodd Frank Act; fair values
    JEL: E02 D0 K2 D8 G01 E3
    Date: 2012–01–24
    URL: http://d.repec.org/n?u=RePEc:pra:mprapa:36149&r=rmg
  10. By: Fulvio Baldovin; Francesco Camana; Massimiliano Caporin; Attilio L. Stella
    Abstract: We demonstrate that a stochastic model consistent with the scaling properties of financial assets is able to replicate the empirical statistical properties of S&P 500 high frequency data within a window of three hours in each trading day. This result extends previous findings obtained for EUR/USD exchange rates. We apply the forecasting capabilities of the model to implement an explicit trading strategy. Trading signals are model-based and not derived from chartist criteria. In-sample and out-of-sample tests indicate that the model performs better than a benchmark asymmetric GARCH process, and expose the existence of small arbitrage opportunities. We discuss how performance might be improved and why the trading strategy is potentially interesting for hedging volatility risk in S&P index-based products.
    Date: 2012–02
    URL: http://d.repec.org/n?u=RePEc:arx:papers:1202.2447&r=rmg
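Entry 10 benchmarks the proposed model against an asymmetric GARCH process. As an illustration of what such a benchmark involves, here is the variance recursion of one common asymmetric specification, GJR-GARCH(1,1), with made-up parameter values; the paper's own scaling model and trading rules are not reproduced.

```python
import numpy as np

def gjr_garch_var(r, omega=1e-6, alpha=0.05, gamma=0.08, beta=0.9):
    """GJR-GARCH(1,1) variance filter: negative shocks raise variance more.

    s[t] = omega + (alpha + gamma * 1{r[t-1] < 0}) * r[t-1]^2 + beta * s[t-1]
    """
    r = np.asarray(r, float)
    s = np.empty_like(r)
    s[0] = r.var()
    for t in range(1, len(r)):
        lev = gamma if r[t - 1] < 0 else 0.0  # leverage term for bad news
        s[t] = omega + (alpha + lev) * r[t - 1] ** 2 + beta * s[t - 1]
    return s

rng = np.random.default_rng(4)
r = rng.standard_normal(1000) * 0.01
print(gjr_garch_var(r)[-5:])  # last few one-step variance estimates
```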
  11. By: Balbás, Alejandro; Balbás, Beatriz; Heras, Antonio
    Abstract: The optimal reinsurance problem is a classic topic in actuarial mathematics. Recent approaches consider a coherent or expectation-bounded risk measure and minimize the global risk of the ceding company under adequate constraints. However, there is no consensus about the risk measure that the insurer must use, since every risk measure presents advantages and shortcomings when compared with others. This paper deals with a discrete probability space and analyzes the stability of the optimal reinsurance with respect to the risk measure that the insurer uses. We will demonstrate that there is a ‘‘stable optimal retention’’ that shows no sensitivity, insofar as it solves the optimal reinsurance problem for many risk measures, thus providing a very robust reinsurance plan. This stable optimal retention is a stop-loss contract, and it is easy to compute in practice. A fast linear-time algorithm will be given and a numerical example presented.
    Keywords: Optimal reinsurance; Risk measure; Sensitivity; Stable optimal retention; Stop-loss reinsurance
    Date: 2011–11
    URL: http://d.repec.org/n?u=RePEc:ner:carlos:info:hdl:10016/13079&r=rmg
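A minimal sketch of the stop-loss structure in entry 11 on a discrete probability space: the insurer retains min(X, d), cedes the excess, and one can scan retentions d to see how a risk measure such as CVaR of the retained position behaves. The loss distribution and premium loading are made-up; the paper's linear-time algorithm is not reproduced.

```python
import numpy as np

def cvar(losses, probs, alpha=0.95):
    """CVaR (expected shortfall) of a discrete loss distribution."""
    order = np.argsort(losses)
    losses, probs = losses[order], probs[order]
    cum = np.cumsum(probs)
    tail = cum > alpha
    w = np.where(tail, probs, 0.0)
    k = np.argmax(tail)
    w[k] = cum[k] - alpha  # crossing atom keeps only its mass beyond alpha
    return (w @ losses) / (1 - alpha)

# Discrete loss scenarios with probabilities (made-up numbers).
x = np.array([0., 10., 50., 200., 1000.])
p = np.array([0.5, 0.3, 0.15, 0.04, 0.01])
loading = 1.2  # reinsurance premium = 1.2 * expected ceded loss

for d in [50., 200., 1000.]:
    retained = np.minimum(x, d)               # stop-loss retention
    premium = loading * (p @ np.maximum(x - d, 0.0))
    total = retained + premium                # retained risk plus premium outflow
    print(d, round(cvar(total, p), 2))
```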
  12. By: Joanna Gray
    Abstract: The global financial crisis challenges scholars from many different disciplines to think about its causes, immediate consequences and long-term responses from their particular disciplinary standpoints. This paper focuses on one particular response in the landscape and architecture of financial regulation: the lesson that risk to the financial system as a whole, as opposed to its constituent parts and participants, needs to be better recognised and countered. One important policy response to the challenges posed by systemic risk has been the construction of an emergent macroprudential regulatory agenda that is now beginning to take root, in practical forms and institutional architecture, within the global, European and national spaces in which norms, standards and laws operate. It is argued here that this emerging agenda will challenge lawyers and legal systems more than is currently imagined.
    Date: 2011–10–15
    URL: http://d.repec.org/n?u=RePEc:erp:euirsc:p0296&r=rmg

General information on the NEP project can be found at https://nep.repec.org. For comments please write to the director of NEP, Marco Novarese at <director@nep.repec.org>. Put “NEP” in the subject, otherwise your mail may be rejected.
NEP’s infrastructure is sponsored by the School of Economics and Finance of Massey University in New Zealand.