New Economics Papers
on Risk Management
Issue of 2012–11–24
twelve papers chosen by



  1. The missing link: Unifying risk taking and time discounting By Thomas Epper; Helga Fehr-Duda
  2. Inside the labyrinth of Basel risk-weighted assets: how not to get lost By Francesco Cannata; Simone Casellina; Gregorio Guidi
  3. VAR for VaR: Measuring Tail Dependence Using Multivariate Regression Quantiles By Halbert White; Tae-Hwan Kim; Simone Manganelli
  4. Endogenous Risk in Monopolistic Competition By Vladislav Damjanovic
  5. Excessive Credit Growth and Countercyclical Capital Buffers in Basel III: Empirical Evidence from Central and East European Countries By Seidler, Jakub; Gersl, Adam
  6. Exponential GARCH Modeling with Realized Measures of Volatility By Peter Reinhard Hansen; Zhuo Huang
  7. Bayesian Credit Rating Assessment By Paola Cerchiello; Paolo Giudici
  8. Independent Factor Autoregressive Conditional Density Model By Alexios Ghalanos; Eduardo Rossi; Giovanni Urga
  9. Niveloids and Their Extensions:Risk Measures on Small Domains By Simone Cerreia-Vioglio; Fabio Maccheroni; Massimo Marinacci; Aldo Rustichini
  10. When Lower Risk Increases Profit: Competition and Control of a Central Counterparty By Jean-Sébastien Fontaine; Héctor Pérez Saiz; Joshua Slive
  11. The National Retirement Risk Index: An Update By Alicia H. Munnell; Anthony Webb; Francesca Golub-Sass
  12. Estimating the Parameters of Stochastic Volatility Models using Option Price Data By Stan Hurn; Ken Lindsay; Andrew McClelland

  1. By: Thomas Epper; Helga Fehr-Duda
    Abstract: Almost all important decisions in people’s lives entail risky and delayed consequences. Regardless of whether we make choices involving health, wealth, love or education, almost every choice involves costs and benefits that are uncertain and materialize over time. Because risk and delay often arise simultaneously, theories of decision making should be capable of explaining how behavior under risk and over time interacts. There is, in fact, a growing body of evidence indicating important interactions between behaviorally revealed risk tolerance and patience. Risk taking behavior is delay dependent, and time discounting is risk dependent. Here we show that the inherent uncertainty of future events conjointly with people’s proneness to weight probabilities nonlinearly generates a unifying framework for explaining time-dependent risk taking, risk-dependent time discounting, preferences for late resolution of uncertainty, and several other puzzling interaction effects between risk and time.
    Keywords: Risk taking, time discounting, probability weighting, decreasing impatience, increasing risk tolerance, preference for late resolution of uncertainty, preference for one-shot resolution of uncertainty
    JEL: D01 D81 D91
    Date: 2012–11
    URL: http://d.repec.org/n?u=RePEc:zur:econwp:096&r=rmg
  2. By: Francesco Cannata (Bank of Italy); Simone Casellina (Bank of Italy); Gregorio Guidi (Bank of Italy)
    Abstract: Many studies have questioned the reliability of banks’ calculations of risk-weighted assets (RWA) for prudential purposes. The significant divergences found at international level are taken as indicating excessive subjectivity in the current rules governing banks’ risk measurement and capital requirement calculations. This paper emphasises the need for appropriate metrics to compare banks’ riskiness under a risk-sensitive framework (either Basel 2 or Basel 3). The ratio of RWA to total assets – which is widely used for peer analyses – is a valuable starting point, but when analysis becomes more detailed it needs to be supplemented by other indicators. Focusing on credit risk, we propose an analytical methodology to disentangle the major factors in RWA differences and, using data from Italian banks (given the inadequate degree of detail of Pillar 3 reports), we show that a large part of the interbank dispersion is explained by the business mix of individual institutions as well as the use of different prudential approaches (standardised and IRB). In conclusion we propose a simple data template that international banks could use to apply the framework suggested.
    Keywords: Basel Accord, risk-weighted assets, banking supervision, credit risk
    JEL: G18 G21 G28
    Date: 2012–09
    URL: http://d.repec.org/n?u=RePEc:bdi:opques:qef_132_12&r=rmg
  3. By: Halbert White; Tae-Hwan Kim (School of Economics, Yonsei University); Simone Manganelli (European Central Bank, DG-Research)
    Abstract: This paper proposes methods for estimation and inference in multivariate, multi-quantile models. The theory can simultaneously accommodate models with multiple random variables, multiple confidence levels, and multiple lags of the associated quantiles. The proposed framework can be conveniently thought of as a vector autoregressive (VAR) extension to quantile models. We estimate a simple version of the model using market equity returns data to analyse spillovers in the values at risk (VaR) between a market index and financial institutions. We construct impulse-response functions for the quantiles of a sample of 230 financial institutions around the world and study how financial institution-specific and system-wide shocks are absorbed by the system. We show how our methodology can successfully identify both in-sample and out-of-sample the set of financial institutions whose risk is most sensitive to market-wide shocks in situations of financial distress, and can prove a valuable addition to the traditional toolkit of policy makers and supervisors.
    Keywords: Quantile impulse-responses, spillover, codependence, CAViaR
    JEL: C13 C14 C32
    Date: 2012–08
    URL: http://d.repec.org/n?u=RePEc:yon:wpaper:2012rwp-45&r=rmg
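The VAR extension of quantile models described above can be illustrated with a toy bivariate CAViaR-style recursion, in which each series' VaR depends on lagged absolute returns and lagged quantiles of both series. The function name and parameter values below are purely illustrative, not estimates from the paper:

```python
def mvmq_caviar_path(y, c, A, B, q0):
    """Bivariate quantile (VaR) paths for the recursion
       q_t = c + A |y_{t-1}| + B q_{t-1}   (2x2 toy version).
    y  : list of (y1_t, y2_t) return pairs
    c  : 2-vector of intercepts; A, B : 2x2 coefficient matrices
    q0 : 2-vector of starting quantiles.  Returns [q0, q1, ...]."""
    q = [tuple(q0)]
    for t in range(1, len(y) + 1):
        prev_y, prev_q = y[t - 1], q[-1]
        q.append(tuple(
            c[i]
            + sum(A[i][j] * abs(prev_y[j]) for j in range(2))   # lagged |returns|
            + sum(B[i][j] * prev_q[j] for j in range(2))        # lagged quantiles
            for i in range(2)
        ))
    return q
```

Quantile impulse-responses in this setting can be traced by perturbing one element of `y` and comparing the resulting quantile paths.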
  4. By: Vladislav Damjanovic
    Abstract: We consider a model of financial intermediation with a monopolistic competition market structure. A non-monotonic relationship is established between risk, measured as the probability of default, and the degree of competition.
    Keywords: Competition and Risk, Risk in DSGE models, Bank competition, Bank failure, Default correlation, Risk-shifting effect, Margin effect
    JEL: G21 G24 D43 E13 E43
    Date: 2012–10–24
    URL: http://d.repec.org/n?u=RePEc:san:cdmawp:1210&r=rmg
  5. By: Seidler, Jakub; Gersl, Adam
    Abstract: Excessive credit growth is often considered to be an indicator of future problems in the financial sector. This paper examines the issue of how best to determine whether the observed level of private sector credit is excessive in the context of the “countercyclical capital buffer”, a macroprudential tool proposed in the new regulatory framework of Basel III by the Basel Committee on Banking Supervision. An empirical analysis of selected Central and Eastern European countries, including the Czech Republic, provides alternative estimates of excessive private credit and shows that the HP filter calculation proposed by the Basel Committee is not necessarily a suitable indicator of excessive credit growth for converging countries.
    Keywords: credit growth; financial crisis; countercyclical capital buffer; Basel II
    JEL: G18 G01 G21
    Date: 2012
    URL: http://d.repec.org/n?u=RePEc:pra:mprapa:42541&r=rmg
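The Basel Committee's reference calculation the paper benchmarks against has two building blocks: a credit-to-GDP gap measured as the deviation of the ratio from an HP-filter trend (the guidance uses λ = 400,000 on quarterly data), and a linear mapping of that gap into a buffer add-on between gaps of 2 and 10 percentage points. The pure-Python sketch below illustrates both pieces with a dense solve; it is an assumption-laden illustration, not the paper's alternative estimators:

```python
def hp_trend(y, lam):
    """Hodrick-Prescott trend: solves (I + lam * D'D) tau = y, where D is the
    second-difference operator.  Dense Gaussian elimination; fine for short series."""
    n = len(y)
    A = [[1.0 if i == j else 0.0 for j in range(n)] for i in range(n)]
    for i in range(n - 2):
        d = [0.0] * n
        d[i], d[i + 1], d[i + 2] = 1.0, -2.0, 1.0      # one row of D
        for r in range(i, i + 3):
            for c in range(i, i + 3):
                A[r][c] += lam * d[r] * d[c]            # accumulate lam * D'D
    b = list(y)
    for col in range(n):                                # elimination with pivoting
        piv = max(range(col, n), key=lambda r: abs(A[r][col]))
        A[col], A[piv] = A[piv], A[col]
        b[col], b[piv] = b[piv], b[col]
        for r in range(col + 1, n):
            f = A[r][col] / A[col][col]
            for c in range(col, n):
                A[r][c] -= f * A[col][c]
            b[r] -= f * b[col]
    tau = [0.0] * n
    for r in range(n - 1, -1, -1):                      # back substitution
        tau[r] = (b[r] - sum(A[r][c] * tau[c] for c in range(r + 1, n))) / A[r][r]
    return tau

def buffer_rate(gap, low=2.0, high=10.0, max_rate=2.5):
    """Buffer add-on (% of RWA): 0 below a `low` pp gap, `max_rate` above
    `high`, linear in between, following the Basel guidance thresholds."""
    if gap <= low:
        return 0.0
    if gap >= high:
        return max_rate
    return max_rate * (gap - low) / (high - low)
```

The credit gap for quarter t would then be `ratio[t] - hp_trend(ratio[:t+1], 400_000.0)[-1]`, re-estimating the trend each quarter so that only past data enter (the "one-sided" filter the guidance prescribes).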
  6. By: Peter Reinhard Hansen (European University Institute and CREATES); Zhuo Huang (Peking University, National School of Development, China Center for Economic Research)
    Abstract: We introduce the Realized Exponential GARCH model that can utilize multiple realized volatility measures for the modeling of a return series. The model specifies the dynamic properties of both returns and realized measures, and is characterized by a flexible modeling of the dependence between returns and volatility. We apply the model to DJIA stocks and an exchange traded fund that tracks the S&P 500 index and find that specifications with multiple realized measures dominate those that rely on a single realized measure. The empirical analysis suggests some convenient simplifications and highlights the advantages of the new specification.
    Keywords: EGARCH, High Frequency Data, Realized Variance, Leverage Effect.
    JEL: C10 C22 C80
    Date: 2012–10–10
    URL: http://d.repec.org/n?u=RePEc:aah:create:2012-44&r=rmg
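The joint dynamics of returns and a realized measure can be sketched with a one-measure version of the Realized EGARCH recursion. The parameter values are illustrative and the measurement equation's leverage term is set to zero for brevity, so this is a simplified sketch of the model's structure, not the authors' full specification:

```python
import math

def realized_egarch_path(returns, realized, omega, beta, tau1, tau2, gamma,
                         xi, phi, h0):
    """Filter the conditional variance h_t given parameters:
       log h_t = omega + beta*log h_{t-1} + tau(z_{t-1}) + gamma*u_{t-1}
       log x_t = xi + phi*log h_t + u_t        (measurement equation)
    with tau(z) = tau1*z + tau2*(z**2 - 1) the leverage function,
    z_t the standardized return, and x_t the realized measure."""
    h = h0
    path = []
    for r, x in zip(returns, realized):
        path.append(h)
        z = r / math.sqrt(h)                         # standardized return
        u = math.log(x) - xi - phi * math.log(h)     # measurement residual
        h = math.exp(omega + beta * math.log(h)
                     + tau1 * z + tau2 * (z * z - 1) + gamma * u)
    return path
```

The role of the realized measure is visible in the `gamma * u` term: when the realized measure exceeds what the model predicted, the residual `u` is positive and next period's variance is revised upward.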
  7. By: Paola Cerchiello (Department of Economics and Management, University of Pavia); Paolo Giudici (Department of Economics and Management, University of Pavia)
    Abstract: In this contribution we aim at improving ordinal variable selection in the context of causal models. In this regard, we propose an approach that provides a formal inferential tool to compare the explanatory power of each covariate, and, therefore, to select an effective model for classification purposes. Our proposed model is Bayesian nonparametric, and, thus, keeps the amount of model specification to a minimum. We consider the case in which information from the covariates is at the ordinal level. A noticeable instance of this regards the situation in which ordinal variables result from rankings of companies that are to be evaluated according to different macro and micro economic aspects, leading to different ordinal covariates that correspond to different ratings, that entail different magnitudes of the probability of default. For each given covariate, we suggest to partition the statistical units in as many groups as the number of observed levels of the covariate. We then assume individual defaults to be homogeneous within each group and heterogeneous across groups. Our aim is to compare and, therefore, select, the partition structures resulting from the consideration of different explanatory covariates. The metric we choose for variable comparison is the calculation of the posterior probability of each partition. The application of our proposal to a European credit risk database shows that it performs well, leading to a coherent and easy-to-interpret method for variable selection and for averaging the estimated default probabilities.
    Date: 2012–11
    URL: http://d.repec.org/n?u=RePEc:pav:demwpp:019&r=rmg
  8. By: Alexios Ghalanos (Faculty of Finance, Cass Business School); Eduardo Rossi (Department of Economics and Management, University of Pavia); Giovanni Urga (Faculty of Finance, Cass Business School and University of Bergamo)
    Abstract: In this paper, we propose a novel Independent Factor Autoregressive Conditional Density (IFACD) model able to generate time-varying higher moments using an independent factor setup. Our proposed framework incorporates dynamic estimation of higher comovements and feasible portfolio representation within a non-elliptical multivariate distribution. We report an empirical application, using returns data from 14 MSCI equity index iShares for the period 1996 to 2011, and we show that the IFACD model provides superior VaR forecasts and portfolio allocations with respect to the CHICAGO and DCC models.
    Keywords: Independent Factor Model, GO-GARCH, Independent Component Analysis, Time-varying Co-moments
    JEL: C13 C16 C32 G11
    Date: 2012–11
    URL: http://d.repec.org/n?u=RePEc:pav:demwpp:021&r=rmg
  9. By: Simone Cerreia-Vioglio; Fabio Maccheroni; Massimo Marinacci; Aldo Rustichini
    Abstract: Given a functional defined on a nonempty subset of an Archimedean Riesz space with unit, necessary and sufficient conditions are obtained for the existence of a (convex or concave) niveloid that extends the functional to the entire space. In the language of mathematical finance, this problem is equivalent to verifying whether the policy adopted by a regulator is consistent with monetary risk measurement, when only partial information is available.
    Keywords: extension theorems, Daniell-Stone theorem, risk measures, variational preferences
    Date: 2012
    URL: http://d.repec.org/n?u=RePEc:igi:igierp:458&r=rmg
  10. By: Jean-Sébastien Fontaine; Héctor Pérez Saiz; Joshua Slive
    Abstract: We model the behavior of dealers in Over-the-Counter (OTC) derivatives markets where a small number of dealers trade with a continuum of heterogeneous clients (hedgers). Imperfect competition and (endogenous) default induce a familiar trade-off between competition and risk. Increasing the number of dealers servicing the market decreases the price paid by hedgers but lowers revenue for dealers, increasing the probability of a default. Restricting entry maximizes welfare when dealers’ efficiency is high relative to their market power. A Central Counter-Party (CCP) offering novation tilts the trade-off toward more competition. Free entry is optimal for all levels of dealers’ efficiency if the CCP can constrain risk-taking by its members. In this model, dealers can choose CCP rules to restrict entry and increase their benefits. Moreover, dealers impose binding risk constraints to increase revenues at the expense of the hedgers. In other words, dealers can use risk controls to commit to a lower degree of competition. These theoretical results provide one rationalization of ongoing efforts by regulators globally to promote fair and risk-based access to CCPs.
    Keywords: Financial markets; Financial stability; Financial system regulation and policies
    JEL: G10 G18
    Date: 2012
    URL: http://d.repec.org/n?u=RePEc:bca:bocawp:12-35&r=rmg
  11. By: Alicia H. Munnell; Anthony Webb; Francesca Golub-Sass
    Abstract: The release of the Federal Reserve’s 2010 Survey of Consumer Finances is a great opportunity to reassess Americans’ retirement preparedness as measured by the National Retirement Risk Index (NRRI). The NRRI shows the share of working households who are “at risk” of being unable to maintain their pre-retirement standard of living in retirement. The Index compares projected replacement rates – retirement income as a percentage of pre-retirement income – for today’s working households with target rates that would allow them to maintain their living standard and calculates the percentage at risk of falling short. The NRRI was originally constructed using the Federal Reserve’s 2004 Survey of Consumer Finances (SCF). The SCF is a triennial survey of a nationally representative sample of U.S. households, which collects detailed information on households’ assets, liabilities, and demographic characteristics. The 2007 SCF did not allow for a meaningful update, because stock market and housing prices plummeted right after the survey interviews were completed. Thus, the 2010 survey is the first opportunity to see how the financial crisis and ensuing recession have affected Americans’ readiness for retirement. The discussion proceeds as follows. The first section describes the nuts and bolts of constructing the NRRI and how the new SCF data were incorporated. The second section updates the NRRI using the 2010 SCF, showing that the percentage of households at risk increased by nine percentage points between the 2007 and 2010 surveys – from 44 percent to 53 percent. The third section identifies the impact of various factors on the change. The final section concludes that the NRRI confirms what we already know: today’s workers face a major retirement income challenge. Even if households work to age 65 and annuitize all their financial assets, including the receipts from reverse mortgages on their homes, more than half are at risk of being unable to maintain their standard of living in retirement.
    Date: 2012–11
    URL: http://d.repec.org/n?u=RePEc:crr:issbrf:ib2012-20&r=rmg
  12. By: Stan Hurn (QUT); Ken Lindsay (QUT); Andrew McClelland (QUT)
    Abstract: This paper describes a maximum likelihood method for estimating the parameters of Heston's model of stochastic volatility using data on an underlying market index and the prices of options written on that index. Parameters of the physical measure (associated with the index) and the parameters of the risk-neutral measure (associated with the options) are identified including the equity and volatility risk premia. The estimation is implemented using a particle filter. The computational load of this estimation method, which previously has been prohibitive, is managed by the effective use of parallel computing using Graphical Processing Units. A byproduct of this focus on easing the computational burden is the development of a simplification of the closed-form approximation used to price European options in Heston's model. The efficacy of the filter is demonstrated under simulation and an empirical investigation of the fit of the model to the S&P 500 Index is undertaken. All the parameters of the model are reliably estimated and, in contrast to previous work, the volatility premium is well estimated and found to be significant.
    Keywords: stochastic volatility, parameter estimation, maximum likelihood, particle filter
    JEL: C22 C52
    Date: 2012–10–18
    URL: http://d.repec.org/n?u=RePEc:qut:auncer:2012_91&r=rmg
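The particle-filter likelihood at the core of the estimation can be sketched for the index data alone with a bootstrap filter on an Euler-discretised Heston model. This toy sketch omits the option-pricing side and the GPU parallelisation entirely, and all parameter values in the usage are illustrative:

```python
import math
import random

def heston_pf_loglik(returns, mu, kappa, theta, sigma,
                     dt=1 / 252, n_particles=500, seed=0):
    """Bootstrap particle-filter log-likelihood for the Euler scheme
       v_{t+1} = v_t + kappa*(theta - v_t)*dt + sigma*sqrt(max(v_t,0)*dt)*eps
       r_t     = mu*dt + sqrt(max(v_t,0)*dt)*eta."""
    rng = random.Random(seed)
    v = [theta] * n_particles            # start particles at long-run variance
    loglik = 0.0
    for r in returns:
        # weight each particle by the Gaussian likelihood of the observed return
        w = []
        for vi in v:
            var = max(vi, 1e-12) * dt
            w.append(math.exp(-0.5 * (r - mu * dt) ** 2 / var)
                     / math.sqrt(2 * math.pi * var))
        loglik += math.log(max(sum(w) / n_particles, 1e-300))
        # multinomial resampling proportional to the weights
        total = sum(w)
        v = [v[_sample(w, total, rng)] for _ in range(n_particles)]
        # propagate variance particles one Euler step, truncated at zero
        v = [max(vi + kappa * (theta - vi) * dt
                 + sigma * math.sqrt(max(vi, 0.0) * dt) * rng.gauss(0, 1), 1e-12)
             for vi in v]
    return loglik

def _sample(w, total, rng):
    """Draw one particle index with probability proportional to its weight."""
    u = rng.random() * total
    acc = 0.0
    for i, wi in enumerate(w):
        acc += wi
        if acc >= u:
            return i
    return len(w) - 1
```

Maximum likelihood estimation would wrap this function in a numerical optimiser over `(mu, kappa, theta, sigma)` with a fixed seed, which is exactly where the computational burden the paper addresses comes from: every candidate parameter vector requires a full filtering pass.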

General information on the NEP project can be found at https://nep.repec.org. For comments please write to the director of NEP, Marco Novarese at <director@nep.repec.org>. Put “NEP” in the subject, otherwise your mail may be rejected.
NEP’s infrastructure is sponsored by the School of Economics and Finance of Massey University in New Zealand.