NEP: New Economics Papers
on Risk Management
Issue of 2016‒05‒28
fifteen papers chosen by
By: | Mabelle Sayah (SAF - Laboratoire de Sciences Actuarielle et Financière - UCBL - Université Claude Bernard Lyon 1, ISFA - Institut de Science Financière et d'Assurances - PRES Université de Lyon, Faculté des Sciences - Université Saint-Joseph - USJ - Université Saint-Joseph de Beyrouth)
Abstract: | A bank's capital charge computation is a widely discussed topic, with new approaches emerging continuously. Each bank computes this figure using internal methodologies in order to reflect its capital adequacy; however, the Basel Committee recommends a more homogeneous model so that the situation of these financial institutions can be judged and different banks compared with one another. In this paper, we compare different numerical and econometric models with the sensitivity-based approach (SBA) implemented by the BCBS under Basel III in its February 2015 publication in order to compute the capital charge. We study the influence of having several currencies and maturities within the portfolio, and we try to identify the time horizon and confidence level implied by Basel III's approach through an application to bond portfolios. By implementing several approaches, we are able to find VaRs equivalent to the one computed by the SBA at a pre-defined confidence level (97.5%). However, the time horizon differs according to the chosen methodology and ranges from 1 month up to 1 year.
Keywords: | interest rate risk, ICA, Dynamic Nelson Siegel, bonds portfolio, PCA, Basel III, GARCH, capital charge, sensitivity based approach, trading book
Date: | 2016–02–01 |
URL: | http://d.repec.org/n?u=RePEc:hal:journl:hal-01217928&r=rmg |
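Illustrative of the model comparison above: a minimal sketch of a parametric 97.5% VaR from a GARCH(1,1) fit, assuming the Python `arch` and `scipy` packages are available; the return series and all parameter values are synthetic stand-ins, not the paper's data.

    # Sketch: 97.5% one-day parametric VaR via GARCH(1,1) with Student-t
    # innovations. Synthetic data; illustrative of one of the econometric
    # models the paper compares to the Basel III SBA charge.
    import numpy as np
    from arch import arch_model
    from scipy.stats import t as student_t

    rng = np.random.default_rng(0)
    returns = rng.standard_t(df=5, size=1500) * 0.2   # synthetic daily % returns

    res = arch_model(returns, vol="GARCH", p=1, q=1, dist="t").fit(disp="off")
    sigma = float(np.sqrt(res.forecast(horizon=1).variance.values[-1, 0]))

    nu = res.params["nu"]
    q = student_t.ppf(0.025, df=nu) * np.sqrt((nu - 2) / nu)  # standardized-t quantile
    var_97_5 = -(res.params["mu"] + q * sigma)
    print(f"97.5% 1-day VaR: {var_97_5:.3f}%")

Scaling such a one-day figure to longer horizons (square-root-of-time or direct multi-step forecasts) is what underlies the 1-month-to-1-year equivalences the authors report.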
By: | Dominique Guegan (CES - Centre d'économie de la Sorbonne - UP1 - Université Panthéon-Sorbonne - CNRS - Centre National de la Recherche Scientifique); Bertrand Hassani (CES - Centre d'économie de la Sorbonne - UP1 - Université Panthéon-Sorbonne - CNRS - Centre National de la Recherche Scientifique); Kehan Li (CES - Centre d'économie de la Sorbonne - UP1 - Université Panthéon-Sorbonne - CNRS - Centre National de la Recherche Scientifique) |
Abstract: | Finiteness of sample, as one major source of uncertainty, has been ignored by regulators and risk managers in domains such as portfolio management, credit risk modelling and finance (or insurance) regulatory capital calculations. To capture this uncertainty, we provide a robust confidence interval (CI) of the historical Value-at-Risk (hVaR) for different sample lengths. We compute this CI from a saddlepoint approximation of the distribution of the hVaR using a bisection search approach. We also suggest a Spectral Stress Value-at-Risk (SSVaR) measure based on the CI, as an alternative risk measure for both the financial and insurance industries. Finally, we perform a stress testing application of the SSVaR.
Keywords: | Uncertainty, Small sample, Value-at-Risk, Asymptotic normality approximation, Saddlepoint approximation, Bisection search approach, Spectral Stress VaR, Stress testing
Date: | 2016–04 |
URL: | http://d.repec.org/n?u=RePEc:hal:cesptp:halshs-01317391&r=rmg |
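The widening of the hVaR confidence interval as the sample shrinks can already be seen with the textbook asymptotic-normality CI for an empirical quantile; the paper's saddlepoint construction is sharper, and this sketch (standard normal P&L, arbitrary seed) is purely illustrative.

    # Sketch: asymptotic-normality CI for historical VaR (an empirical
    # quantile). Var(q_hat) ~ alpha*(1-alpha) / (n * f(q)^2).
    import numpy as np
    from scipy.stats import norm

    alpha = 0.01                                # tail level of the P&L quantile
    rng = np.random.default_rng(42)
    for n in (250, 1000, 5000):
        x = rng.standard_normal(n)              # stand-in P&L sample
        hvar = np.quantile(x, alpha)            # historical VaR estimate
        f_q = norm.pdf(norm.ppf(alpha))         # density at the true quantile
        se = np.sqrt(alpha * (1 - alpha) / n) / f_q
        lo, hi = hvar - 1.96 * se, hvar + 1.96 * se
        print(f"n={n:5d}  hVaR={hvar:+.3f}  95% CI=({lo:+.3f}, {hi:+.3f})")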
By: | Fr\'ed\'eric Vrins |
Abstract: | In this paper, we compare static and dynamic (reduced form) approaches for modeling wrong-way risk in the context of CVA. Although all these approaches potentially suffer from arbitrage problems, they are popular (respectively) in industry and academia, mainly for reasons of analytical tractability. We complete the stochastic intensity models with another dynamic approach, consisting in directly modeling the survival (Azéma supermartingale) process using the $\Phi$-martingale. Just like the other approaches, this method allows for automatic calibration to a given default probability curve. We derive analytically the positive exposures $V^+_t$ "conditional upon default" associated with prototypical market price processes of FRA and IRS in all cases. We further discuss the link between the "default" condition and change-of-measure techniques. The expectation of $V^+_t$ conditional upon $\tau=t$ is equal to the unconditional expectation of $V^+_t\zeta_t$. The process $\zeta$ is explicitly derived in the dynamic approaches: it is proven to be positive and to have unit expectation. However, it fails to be a martingale, so the Girsanov machinery cannot be used. Nevertheless, the expectation of $V^+_t\zeta_t$ can be computed explicitly, leading to analytical expected positive exposure profiles in the considered examples.
Date: | 2016–05 |
URL: | http://d.repec.org/n?u=RePEc:arx:papers:1605.05100&r=rmg |
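The unconditional expected positive exposure underlying the paper's profiles, $E[V^+_t]$, is easy to approximate by Monte Carlo. A minimal sketch for a stylized swap-like exposure follows; the dynamics and parameters are illustrative, not the paper's FRA/IRS price processes, and no wrong-way reweighting by $\zeta_t$ is applied.

    # Sketch: expected positive exposure E[V_t^+] by Monte Carlo for a
    # stylized IRS-like mark-to-market: Brownian noise damped to zero
    # at maturity. Wrong-way risk would reweight paths by zeta_t.
    import numpy as np

    rng = np.random.default_rng(1)
    T, n_steps, n_paths, vol = 5.0, 60, 100_000, 0.01
    t = np.linspace(0.0, T, n_steps + 1)
    dW = rng.standard_normal((n_paths, n_steps)) * np.sqrt(T / n_steps)
    W = np.concatenate([np.zeros((n_paths, 1)), np.cumsum(dW, axis=1)], axis=1)
    V = vol * W * (T - t) / T          # zero exposure at inception and maturity
    epe = np.maximum(V, 0.0).mean(axis=0)
    print("EPE at t = 1, 2, 3, 4 years:", np.round(epe[[12, 24, 36, 48]], 5))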
By: | Masafumi Nakano (Graduate School of Economics, University of Tokyo); Seisho Sato (Faculty of Economics, University of Tokyo); Akihiko Takahashi (Faculty of Economics, University of Tokyo); Soichiro Takahashi (Graduate School of Economics, University of Tokyo) |
Abstract: | This paper proposes a new method for constructing optimal portfolios using particle filtering, which substantially improves the performance of mean-variance portfolios through estimation of expected returns and return volatilities based on a Monte Carlo filter. In particular, we introduce state variables associated with expected returns as well as asymmetric volatilities in a state space framework and predict asset returns consistently with volatility changes over time. As a result, our estimated portfolios outperform not only mean-variance portfolios based on moving averages and variances of past returns, but also risk parity, minimum variance, and equally weighted portfolios, which do not depend on predictions of asset returns. Moreover, we construct portfolios with transaction costs and no-short-sale constraints, which may include Japanese and U.S. REITs in addition to domestic and international bonds and equities along with a riskless asset. Finally, performance evaluation based on accumulated returns, Sharpe ratios, Sortino ratios and maximum drawdowns confirms the validity of our method.
URL: | http://d.repec.org/n?u=RePEc:tky:jseres:2015cj276&r=rmg |
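A bootstrap particle (Monte Carlo) filter of the kind the paper builds on can be sketched in a few lines for a scalar latent expected return. This toy model (AR(1) state, Gaussian observation noise, all parameters invented) omits the paper's asymmetric-volatility states.

    # Sketch: bootstrap particle filter for a latent expected return
    #   mu_t = phi * mu_{t-1} + w_t,    r_t = mu_t + v_t.
    import numpy as np

    rng = np.random.default_rng(7)
    T, N = 500, 2000
    phi, sig_w, sig_v = 0.98, 0.02, 0.10

    mu = np.zeros(T)                              # simulate "true" latent mean
    for s in range(1, T):
        mu[s] = phi * mu[s - 1] + sig_w * rng.standard_normal()
    r = mu + sig_v * rng.standard_normal(T)       # observed returns

    particles = 0.1 * rng.standard_normal(N)
    est = np.zeros(T)
    for s in range(T):
        particles = phi * particles + sig_w * rng.standard_normal(N)  # propagate
        w = np.exp(-0.5 * ((r[s] - particles) / sig_v) ** 2)          # likelihood
        w /= w.sum()
        est[s] = w @ particles                                        # filtered mean
        particles = rng.choice(particles, size=N, p=w)                # resample
    print("RMSE of filtered expected return:",
          round(float(np.sqrt(np.mean((est - mu) ** 2))), 4))

Feeding such filtered means (and volatilities) into the mean-variance weights in place of rolling sample moments is the substitution the paper evaluates.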
By: | Farooquee, Arsalan Ali; Shrimali, Gireesh |
Abstract: | India's ambitious renewable energy target of 175 GW by 2022 will require significant foreign investment. A major issue facing foreign investment in India is offtaker risk: the risk that the public sector distribution companies (DISCOMs) are unable to make timely payments for the procurement of power. Ultimately, addressing this will require long-term financial structural fixes for DISCOMs, some of which are currently under consideration. In the short term, however, one solution is a government-supported payment security mechanism to build investor confidence. In this paper, we develop a framework to enable assessment of an existing payment security mechanism. We build our framework using elements of credit and financial guarantees: probability of default, exposure at default, and recovery after default. We apply the framework to estimate the size of a payment security mechanism involving a central aggregator during JNNSM Phase 2, Batch 1. We estimate this size to be INR 4160 million, or INR 5.55 million/MW: less than 10% of capital costs, but more than 2.5 times the size of a previously proposed facility. In other words, the existing facility did not provide adequate coverage of offtaker risk.
Keywords: | Credit risk, Probability of default, Renewable energy, Power purchase agreement |
JEL: | Q4 Q5 |
Date: | 2016–04 |
URL: | http://d.repec.org/n?u=RePEc:pra:mprapa:71241&r=rmg |
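The guarantee-sizing elements named in the abstract (probability of default, exposure at default, recovery) combine in the standard expected-loss identity. A back-of-envelope sketch with hypothetical inputs; none of these figures are taken from the paper.

    # Sketch: sizing a payment security facility as
    #   size = PD x EAD x (1 - recovery).  All inputs hypothetical.
    pd_discom = 0.10        # probability the DISCOM misses payments
    ead_per_mw = 100.0      # exposure at default, INR million per MW
    recovery = 0.30         # recovery after default
    capacity_mw = 500       # capacity covered by the facility

    size_per_mw = pd_discom * ead_per_mw * (1 - recovery)
    print(f"Facility per MW: INR {size_per_mw:.2f} million")
    print(f"Total facility : INR {size_per_mw * capacity_mw:,.0f} million")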
By: | Damien Ackerer; Damir Filipović
Abstract: | We introduce a novel class of credit risk models in which the drift of the survival process of a firm is a linear function of the factors. These models outperform the standard affine default intensity models in terms of analytical tractability. The prices of defaultable bonds and credit default swaps (CDS) are linear in the factors. The price of a CDS option can be uniformly approximated by polynomials in the factors. An empirical study illustrates the versatility of these models by fitting CDS spread time series. |
Date: | 2016–05 |
URL: | http://d.repec.org/n?u=RePEc:arx:papers:1605.07419&r=rmg |
By: | Charles W. Calomiris; Matthew Jaremski |
Abstract: | Economic theories posit that bank liability insurance is designed to serve the public interest by mitigating systemic risk in the banking system through liquidity risk reduction. Political theories see liability insurance as serving the private interests of banks, bank borrowers, and depositors, potentially at the expense of the public interest. Empirical evidence, both historical and contemporary, supports the private-interest approach, as liability insurance generally has been associated with increases, rather than decreases, in systemic risk. Exceptions to this rule are rare, and reflect design features that prevent moral hazard and adverse selection. Prudential regulation of insured banks has generally not been a very effective tool in limiting the systemic risk increases associated with liability insurance. This likely reflects purposeful failures in regulation; if liability insurance is motivated by private interests, then there would be little point in removing the subsidies it creates through strict regulation. That same logic explains why more effective policies for addressing systemic risk are not employed in place of liability insurance. The politics of liability insurance also should not be construed narrowly to encompass only the vested interests of bankers. Indeed, in many countries, it has been installed as a pass-through subsidy targeted to particular classes of bank borrowers.
JEL: | E44 G21 G28 |
Date: | 2016–05 |
URL: | http://d.repec.org/n?u=RePEc:nbr:nberwo:22223&r=rmg |
By: | Birgit Rudloff |
Abstract: | In incomplete financial markets not every contingent claim can be replicated by a self-financing strategy. The risk of the resulting shortfall can be measured by convex risk measures, recently introduced by Föllmer and Schied (2002). The dynamic optimization problem of finding a self-financing strategy that minimizes the convex risk of the shortfall can be split into a static optimization problem and a representation problem. It follows that the optimal strategy consists in superhedging the modified claim $\widetilde{\varphi}H$, where $H$ is the payoff of the claim and $\widetilde{\varphi}$ is the solution of the static optimization problem, the optimal randomized test. In this paper, we deduce necessary and sufficient optimality conditions for the static problem using convex duality methods. The solution of the static optimization problem turns out to be a randomized test with a typical $0$-$1$ structure.
Date: | 2016–04 |
URL: | http://d.repec.org/n?u=RePEc:arx:papers:1604.08070&r=rmg |
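Schematically, and in the spirit of Föllmer-Leukert-type efficient hedging rather than as a quotation of the paper, the static problem reads

\[
\min_{0\le\varphi\le 1}\ \rho\big((1-\varphi)H\big)
\qquad\text{subject to}\qquad
\sup_{Q\in\mathcal{Q}}\mathbb{E}_Q[\varphi H]\ \le\ \widetilde V_0,
\]

where $\rho$ is the convex risk measure applied to the shortfall, $\mathcal{Q}$ the set of equivalent martingale measures and $\widetilde V_0$ the available initial capital; the optimizer $\widetilde{\varphi}$ is the randomized test, and the dynamic part reduces to superhedging $\widetilde{\varphi}H$.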
By: | Ewa Marciniak; Zbigniew Palmowski |
Abstract: | This paper concerns the dual risk model, dual to the classical risk model of insurance, in which premiums are surplus-dependent. In such a model premiums are regarded as costs, while claims refer to profits. We calculate the mean of the cumulative discounted dividends paid until ruin when the barrier strategy is applied. We formulate the associated Hamilton-Jacobi-Bellman equation and identify sufficient conditions for a barrier strategy to be optimal. Some numerical examples are provided for the case where profits follow an exponential law.
Date: | 2016–05 |
URL: | http://d.repec.org/n?u=RePEc:arx:papers:1605.04584&r=rmg |
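For orientation: in the usual formulation of the dual model the surplus pays surplus-dependent costs continuously and collects profits as upward jumps, and the barrier value function satisfies a variational inequality of the following shape (a schematic sketch consistent with the abstract, not the paper's exact statement):

\[
\max\Big\{\,1-v'(x),\;\; -p(x)\,v'(x)+\lambda\int_0^\infty\big(v(x+y)-v(x)\big)\,F(\mathrm{d}y)-q\,v(x)\Big\}=0,
\]

with $p(x)$ the surplus-dependent cost (premium) rate, $\lambda$ and $F$ the intensity and law of the profit jumps, and $q$ the discount rate; smooth fit, $v'(b)=1$, pins down the candidate dividend barrier $b$.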
By: | Robinson Kruse (Rijksuniversiteit Groningen and CREATES); Christian Leschinski (Leibniz University Hannover); Michael Will (Leibniz University Hannover) |
Abstract: | This paper extends the popular Diebold-Mariano test to situations when the forecast error loss differential exhibits long memory. It is shown that this situation can arise frequently, since long memory can be transmitted from forecasts and the forecast objective to forecast error loss differentials. The nature of this transmission mainly depends on the (un)biasedness of the forecasts and whether the involved series share common long memory. Further results show that the conventional Diebold-Mariano test is invalidated under these circumstances. Robust statistics based on a memory and autocorrelation consistent estimator and an extended fixed-bandwidth approach are considered. The subsequent Monte Carlo study provides a novel comparison of these robust statistics. As empirical applications, we conduct forecast comparison tests for the realized volatility of the Standard and Poor's 500 index among recent extensions of the heterogeneous autoregressive model. While we find that forecasts improve significantly if jumps in the log-price process are considered separately from continuous components, improvements achieved by the inclusion of implied volatility turn out to be insignificant in most situations.
Keywords: | Equal Predictive Ability, Long Memory, Diebold-Mariano Test, Long-run Variance Estimation, Realized Volatility |
JEL: | C22 C52 C53 |
Date: | 2016–05–19 |
URL: | http://d.repec.org/n?u=RePEc:aah:create:2016-17&r=rmg |
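For reference, the conventional statistic whose breakdown the paper documents: a minimal sketch of the Diebold-Mariano test with a Bartlett (Newey-West) long-run variance, run on invented forecast errors.

    # Sketch: conventional Diebold-Mariano test with squared-error loss
    # and a Bartlett-kernel HAC long-run variance. Under long memory in
    # the loss differential this statistic is invalid, as the paper shows.
    import numpy as np

    def dm_stat(e1, e2, bandwidth=None):
        d = e1 ** 2 - e2 ** 2                  # loss differential
        n = d.size
        m = bandwidth or int(4 * (n / 100) ** (2 / 9))
        u = d - d.mean()
        lrv = u @ u / n                        # gamma_0
        for k in range(1, m + 1):
            lrv += 2 * (1 - k / (m + 1)) * (u[k:] @ u[:-k]) / n
        return d.mean() / np.sqrt(lrv / n)     # compare with N(0,1) quantiles

    rng = np.random.default_rng(3)
    e1, e2 = rng.standard_normal(1000), 1.1 * rng.standard_normal(1000)
    print("DM statistic:", round(float(dm_stat(e1, e2)), 3))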
By: | Takashi Shinzato |
Abstract: | In the present paper, the minimal investment risk for a portfolio optimization problem with imposed budget and investment concentration constraints is considered using replica analysis. Since the minimal investment risk is influenced by the investment concentration constraint (as well as the budget constraint), it is intuitive that the minimal investment risk for the problem with an investment concentration constraint should be larger than that without it (that is, with only the budget constraint). Moreover, a numerical experiment shows the effectiveness of our proposed analysis.
Date: | 2016–05 |
URL: | http://d.repec.org/n?u=RePEc:arx:papers:1605.06845&r=rmg |
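The monotonicity claimed above (adding a concentration constraint cannot lower the minimal risk) can be confirmed for a single sample with a numerical optimizer. The sketch below uses a budget constraint $\sum_i w_i = N$ and a concentration constraint $\sum_i w_i^2 \le \tau N$ in the spirit of the paper's conventions, but the data and $\tau$ are invented, and the replica result concerns the average over random returns rather than one draw.

    # Sketch: minimal "investment risk" w'Cw/2 under the budget constraint
    # sum(w) = N, with and without a concentration constraint
    # sum(w^2) <= tau*N. Synthetic returns; tau is illustrative.
    import numpy as np
    from scipy.optimize import minimize

    rng = np.random.default_rng(5)
    N, P = 50, 100
    X = rng.standard_normal((P, N))
    C = X.T @ X / P                          # sample risk matrix

    risk = lambda w: 0.5 * w @ C @ w
    budget = {"type": "eq", "fun": lambda w: w.sum() - N}
    conc = {"type": "ineq", "fun": lambda w: 1.5 * N - w @ w}   # tau = 1.5
    w0 = np.ones(N)                          # feasible start for both problems

    r_budget = minimize(risk, w0, constraints=[budget]).fun
    r_both = minimize(risk, w0, constraints=[budget, conc]).fun
    print(f"budget only           : {r_budget:.3f}")
    print(f"budget + concentration: {r_both:.3f}  (never smaller)")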
By: | Thomas Krause; T. Sondershaus; Lena Tonzer |
Abstract: | We construct a novel dataset to measure banks' business and geographical complexity, and we evaluate how these measures relate to banks' idiosyncratic and systemic riskiness. The sample covers stock-listed banks in the euro area from 2007 to 2014. Our results show that banks have increased their total number of subsidiaries while their business and geographical complexity have declined.
Keywords: | bank risk, complexity, globalization |
JEL: | G01 G20 G33 |
Date: | 2016–05 |
URL: | http://d.repec.org/n?u=RePEc:iwh:dispap:17-16&r=rmg |
By: | Hernández del Valle Gerardo; Juárez-Torres Miriam; Guerrero Santiago |
Abstract: | In this paper we extend the traditional GARCH(1,1) model by including a functional trend term in the conditional volatility of a time series. We derive the main properties of the model and apply it to all agricultural commodities in the Mexican CPI basket, as well as to the international prices of maize, wheat, pork, poultry and beef products over three different time periods involving changes in price regulation and behavior. The proposed model adequately fits the volatility process and, according to homoscedasticity tests, outperforms the ARCH(1) and GARCH(1,1) models, two of the most popular approaches used in the literature to analyze price volatility.
Keywords: | Agricultural prices; volatility; GARCH models. |
JEL: | C22 C51 E31 Q18 |
Date: | 2016–04 |
URL: | http://d.repec.org/n?u=RePEc:bdm:wpaper:2016-04&r=rmg |
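A minimal simulation makes the extension concrete: a GARCH(1,1) recursion with a deterministic functional trend g(t) added to the conditional variance. The additive placement and the sinusoidal trend are illustrative choices, not necessarily the paper's specification.

    # Sketch: GARCH(1,1) with a functional trend in conditional variance,
    #   sigma2_t = omega + g(t) + alpha*eps_{t-1}^2 + beta*sigma2_{t-1}.
    import numpy as np

    rng = np.random.default_rng(11)
    T = 1000
    omega, alpha, beta = 0.05, 0.08, 0.90
    g = lambda s: 0.02 * np.sin(2 * np.pi * s / 250)   # illustrative trend

    eps = np.zeros(T)
    sig2 = np.full(T, omega / (1 - alpha - beta))      # start at base level
    for s in range(1, T):
        sig2[s] = omega + g(s) + alpha * eps[s - 1] ** 2 + beta * sig2[s - 1]
        eps[s] = np.sqrt(sig2[s]) * rng.standard_normal()
    print(f"conditional variance: mean={sig2.mean():.3f}, max={sig2.max():.3f}")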
By: | Julien Blasco; Graciela Chichilnisky |
Abstract: | This article focuses on the work of O. Chanel and G. Chichilnisky (2013) on the flaws of expected utility theory when assessing the value of life. Expected utility is a fundamental tool in decision theory. However, it does not fit the experimental results when it comes to catastrophic outcomes (see, for example, Chichilnisky (2009) for more details). In the experiments conducted by Olivier Chanel in 1998 and 2009, subjects are asked to imagine they are presented with 1 billion identical pills. They are paid $220,000 to take and swallow one, knowing that one pill out of the billion is deadly. The objective of this article is to show that the risk aversion phenomenon cannot explain the experimental results. This is an additional reason why a new kind of utility function is necessary: the axioms proposed by Graciela Chichilnisky are briefly presented, and it is shown that they fit the experiments better than any risk-averse utility function.
Date: | 2015–08 |
URL: | http://d.repec.org/n?u=RePEc:arx:papers:1604.05672&r=rmg |
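The force of the experiment is quantitative: the expected loss from the deadly pill is tiny relative to the payment, so explaining refusal via risk aversion requires utilities implausibly sensitive in the tail. A back-of-envelope check, with a hypothetical monetized value of life that is not a figure from the paper:

    # Sketch: expected gain from accepting the pill offer. The monetized
    # value-of-life input is hypothetical.
    p_death = 1e-9
    payment = 220_000.0
    value_of_life = 10_000_000.0      # hypothetical, in dollars

    expected_gain = (1 - p_death) * payment - p_death * value_of_life
    print(f"Expected gain: ${expected_gain:,.2f}")   # about $219,999.99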
By: | Damien Ackerer; Damir Filipović; Sergio Pulido
Abstract: | We introduce a novel stochastic volatility model where the squared volatility of the asset return follows a Jacobi process. It contains the Heston model as a limit case. We show that the finite-dimensional distributions of the log price process admit a Gram-Charlier A expansion in closed-form. We use this to derive closed-form series representations for option prices whose payoff is a function of the underlying asset price trajectory at finitely many time points. This includes European call, put, and digital options, forward start options, and forward start options on the underlying return. We derive sharp analytical and numerical bounds on the truncation errors. We illustrate the performance by numerical examples, which show that our approach offers a viable alternative to Fourier transform techniques.
Date: | 2016–05 |
URL: | http://d.repec.org/n?u=RePEc:arx:papers:1605.07099&r=rmg |
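A minimal Euler sketch of the bounded variance process in question, using a Jacobi-type diffusion coefficient of the form associated with this model class; parameter values are invented, and the Heston limit corresponds to vmin -> 0, vmax -> infinity, where Q(v) -> v.

    # Sketch: Euler scheme for the Jacobi variance process
    #   dV = kappa*(theta - V) dt + sigma*sqrt(Q(V)) dW,
    #   Q(v) = (v - vmin)*(vmax - v) / (sqrt(vmax) - sqrt(vmin))**2.
    import numpy as np

    rng = np.random.default_rng(2)
    kappa, theta, sigma = 2.0, 0.04, 0.3
    vmin, vmax = 1e-4, 1.0
    Q = lambda v: (v - vmin) * (vmax - v) / (np.sqrt(vmax) - np.sqrt(vmin)) ** 2

    T, n, n_paths = 1.0, 252, 10_000
    dt = T / n
    V = np.full(n_paths, theta)
    for _ in range(n):
        dW = np.sqrt(dt) * rng.standard_normal(n_paths)
        V = V + kappa * (theta - V) * dt + sigma * np.sqrt(np.maximum(Q(V), 0.0)) * dW
        V = np.clip(V, vmin, vmax)     # keep the discretized paths in the band
    print(f"V_T: mean={V.mean():.4f}, min={V.min():.4f}, max={V.max():.4f}")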