nep-rmg New Economics Papers
on Risk Management
Issue of 2016‒05‒21
23 papers chosen by



  1. Measuring expected time to default under stress conditions for corporate loans By Mariusz Górajski; Dobromił Serwa; Zuzanna Wośko
  2. A unified view of systemic risk: detecting SIFIs and forecasting the financial cycle via EWSs By Alessandro Spelta
  3. Regulation and Bankers’ Incentives By Fabiana Gómez; Jorge Ponce
  4. Method Development Aspects of Liquidity-Adjusted Value-at-Risk (LVaR) Technique for Commodities Portfolios By Mazin A. M. Al Janabi
  5. Value-at-Risk: The Effect of Autoregression in a Quantile Process By Khizar Qureshi
  6. A robust confidence interval of historical Value-at-Risk for small sample By Dominique Guegan; Bertrand K. Hassani; Kehan Li
  7. Point process models for extreme returns: Harnessing implied volatility By R Herrera; Adam Clements
  8. Spectrally-Corrected Estimation for High-Dimensional Markowitz Mean-Variance Optimization By Zhidong Bai; Hua Li; Michael McAleer; Wing-Keung Wong
  9. Bank Capital Structure and Financial Innovation: Antagonists or Two Sides of the Same Coin? By Lorenzo Sasso
  10. Capturing the co-benefits of disaster risk management on the private sector side By Rose,Adam
  11. Hedging Inflation with Individual US stocks: A long-run portfolio analysis By Georgios Bampinas; Theodore Panagiotidis
  12. Managerial Reputation, Risk-Taking, and Imperfect Capital Markets By Koji Asano
  13. Forecasting the Market Risk Premium with Artificial Neural Networks By Leoni Eleni Oikonomikou
  14. Credit risk interconnectedness: What does the market really know? By Abbassi, Puriya; Brownlees, Christian; Hans, Christina; Podlich, Natalia
  15. Foreign exchange risk premia: from traditional to state-space analyses By Nakmai, Siwat
  16. A new structural stochastic volatility model of asset pricing and its stylized facts By Radu T. Pruna; Maria Polukarov; Nicholas R. Jennings
  17. Flood insurance in England: an assessment of the current and newly proposed insurance scheme in the context of rising flood risk By Swenja Surminski; Jillian Eldridge
  18. Bankruptcy prediction for SMEs using relational data By TOBBACK, Ellen; MOEYERSOMS, Julie; STANKOVA, Marija; MARTENS, David
  19. Estimating Systematic Risk Under Extremely Adverse Market Conditions By Maarten van Oordt; Chen Zhou
  20. Traditional banks, shadow banks and the US credit boom: Credit origination versus financing By Unger, Robert
  21. Global Risk Aversion Spillover Dynamics and Investors' Attention Allocation By Ceylan, Özcan
  22. Optimal market making By Olivier Guéant
  23. A Note on The Evolution of Preferences By Oliver Enrique Pardo Reinoso

  1. By: Mariusz Górajski; Dobromił Serwa; Zuzanna Wośko
    Abstract: We present a new measure of extreme credit risk in the time domain, namely the conditional expected time to default (CETD). This measure has a clear interpretation and can be applied in a straightforward way to analyses of loan performance over time. In contrast to the probability of default, CETD provides direct information on the timing of a potential loan default under stress scenarios. We compute CETD with a novel method based on Markov probability transition matrices, a popular approach in the survival analysis literature. We apply the new measure to the analysis of changing credit risk in a large portfolio of corporate loans.
    Keywords: credit risk, time to default, value at risk, conditional ETD
    JEL: G21 G32 C13 C18
    Date: 2016
    URL: http://d.repec.org/n?u=RePEc:nbp:nbpmis:237&r=rmg
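A minimal sketch of the absorbing-chain arithmetic behind an expected time to default (hypothetical three-state migration matrix, not the paper's calibration): treating default as an absorbing state, the expected number of periods until absorption from each non-default state is given by the fundamental matrix, (I − Q)⁻¹·1, where Q is the transient-to-transient block of the transition matrix.

```python
import numpy as np

# Hypothetical one-period migration matrix over states
# [performing, watch, default]; default is absorbing.
P = np.array([
    [0.90, 0.08, 0.02],
    [0.20, 0.60, 0.20],
    [0.00, 0.00, 1.00],
])

Q = P[:2, :2]                      # transient-to-transient block
N = np.linalg.inv(np.eye(2) - Q)   # fundamental matrix (I - Q)^{-1}
t = N @ np.ones(2)                 # expected periods until default

# t = [20.0, 12.5]: a "performing" loan defaults after 20 periods
# on average, a "watch" loan after 12.5.
```

Stressing the matrix (raising downgrade and default probabilities) and recomputing t is what makes the measure conditional on a scenario.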
  2. By: Alessandro Spelta (Università Cattolica del Sacro Cuore; Dipartimento di Economia e Finanza, Università Cattolica del Sacro Cuore)
    Abstract: Following the definition of systemic risk by the Financial Stability Board, the International Monetary Fund and the Bank for International Settlements, this paper proposes a method able to simultaneously address the two dimensions in which this risk materializes: namely the cross-sectional and the time dimension. The method is based on the W-TOPHITS algorithm, which exploits the connectivity information of an evolving network and decomposes its tensor representation as the outer product of three vectors: borrowing, lending and time scores. These vectors can be interpreted as indices of the systemic importance of the borrowing and lending associated with each financial institution, and of the systemic importance associated with each period, coherently with the realization of the whole network in that period. The time score, being able to simultaneously consider the temporal distribution of the whole traded volume over time as well as the spatial distribution of the transactions between players in each period, turns out to be a useful Early Warning Signal of the financial crisis. The W-TOPHITS algorithm is tested on the e-MID interbank market dataset and on the BIS consolidated banking statistics, with the aim of discovering Systemically Important Financial Institutions and of showing how the time score is able to signal a change in the bipartite network of borrowers and lenders that heralds the fall in traded volume that occurred during the 2007/2009 financial crisis.
    Keywords: Systemic Risk, Tensor, Early Warning Signals, Evolving Networks.
    JEL: G01 G17 C63 C53
    Date: 2016–01
    URL: http://d.repec.org/n?u=RePEc:ctc:serie1:def036&r=rmg
  3. By: Fabiana Gómez (University of Bristol); Jorge Ponce (Banco Central del Uruguay and Departamento de Economía, Facultad de Ciencias Sociales, Universidad de la República)
    Abstract: We formally compare the effects of minimum capital requirements, capital buffers, liquidity requirements and loan loss provisions on the incentives of bankers to exert effort and take excessive risk. We find that these regulations affect the behavior of bankers differently. In the case of investment banks, the application of capital buffers and liquidity requirements makes it more difficult to achieve the first best solution. In the case of commercial banks, capital buffers, reserve requirements and traditional loan loss provisions for expected losses provide adequate incentives to bank managers, although the capital buffer is the most powerful instrument. Counter-cyclical (so-called dynamic) loan loss provisions may provide bank managers with incentives to gamble. The results inform policy makers in the ongoing debate about the harmonization of banking regulation and the implementation of Basel III.
    Keywords: Banking regulation, minimum capital requirement, capital buffer, liquidity requirement, (counter-cyclical) loan loss provision, commercial banks, investment banks, bankers’ incentives, effort, risk.
    JEL: G21 G28
    Date: 2015–11
    URL: http://d.repec.org/n?u=RePEc:ude:wpaper:0915&r=rmg
  4. By: Mazin A. M. Al Janabi (UAE University)
    Abstract: This paper reviews and examines the method development aspects of Al Janabi's (2012) theoretical foundations and optimization algorithms for the assessment of the Liquidity-Adjusted Value-at-Risk (LVaR) technique under adverse market conditions. The paper focuses on developing a robust theoretical foundation and modeling framework that tackles market/liquidity risk and economic-capital estimation at a portfolio level and within a multivariate context. The proposed optimization algorithm demonstrates that better investable commodities portfolios can be obtained than with the traditional Markowitz (1952) portfolio theory. In addition, the optimization algorithm shows that portfolio managers can obtain financially meaningful investable portfolios and reveals interesting market-microstructure patterns which cannot be attained with the classical Markowitz mean-variance approach. Advantages of the method include:
    • The developed algorithms can advance portfolio management in financial and commodities markets by testing for investable portfolios subject to meaningful financial constraints.
    • Investable commodities portfolios cannot be achieved via Markowitz's (1952) classical portfolio approach, as the empirical results indicate that investable commodities portfolios lie off the efficient frontier.
    • The proposed modeling technique can be used by risk managers and portfolio managers to assess appropriate asset allocations of different investable commodities portfolios under crisis market outlooks.
    Keywords: Commodity; Financial Engineering; Liquidity-Adjusted Value-at-Risk; Optimization; Portfolio Management
    JEL: C10 C13 G20
    URL: http://d.repec.org/n?u=RePEc:sek:iacpro:3605760&r=rmg
  5. By: Khizar Qureshi
    Abstract: Value-at-Risk (VaR) is an institutional measure of risk favored by financial regulators. VaR may be interpreted as a quantile of future portfolio values conditional on the information available, where the most common quantile used is 95%. Here we demonstrate Conditional Autoregressive Value at Risk (CAViaR), first introduced by Engle and Manganelli (2001). CAViaR suggests that negative/positive returns are not i.i.d. and that there is significant autocorrelation. The model is tested using data from 1986–1999 and 1999–2009 for GM, IBM, XOM, SPX, and then validated via the dynamic quantile test. Results suggest that the tails (upper/lower quantiles) of a distribution of returns behave differently from the core.
    Date: 2016–03
    URL: http://d.repec.org/n?u=RePEc:arx:papers:1605.04940&r=rmg
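The autoregressive-quantile idea can be sketched with the symmetric absolute value CAViaR recursion, q_t = b0 + b1·q_{t−1} + b2·|r_{t−1}|, on synthetic returns. The coefficients below are illustrative placeholders, not estimates from the paper (in practice they are fitted by quantile regression):

```python
import numpy as np

rng = np.random.default_rng(1)
returns = rng.normal(0.0, 0.02, size=500)   # synthetic daily returns

# Symmetric absolute value CAViaR recursion with illustrative,
# hand-picked coefficients: today's VaR inherits yesterday's VaR
# plus a shock term in yesterday's absolute return.
b0, b1, b2 = 0.002, 0.85, 0.15
q = np.empty_like(returns)
q[0] = -np.quantile(returns, 0.05)          # start at the empirical 5% VaR
for t in range(1, len(returns)):
    q[t] = b0 + b1 * q[t - 1] + b2 * abs(returns[t - 1])

# Hit rate: fraction of days on which the loss exceeds the VaR estimate
hits = float(np.mean(-returns > q))
```

A well-specified 95% VaR model would produce a hit rate near 5%; the dynamic quantile test mentioned in the abstract formalizes that check.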
  6. By: Dominique Guegan (Centre d'Economie de la Sorbonne); Bertrand K. Hassani (Grupo Santander et Centre d'Economie de la Sorbonne); Kehan Li (Centre d'Economie de la Sorbonne)
    Abstract: Finiteness of sample, one major source of uncertainty, has been ignored by regulators and risk managers in domains such as portfolio management, credit risk modelling and finance (or insurance) regulatory capital calculations. To capture this uncertainty, we provide a robust confidence interval (CI) of historical Value-at-Risk (hVaR) for different sample lengths. We compute this CI from a saddlepoint approximation of the distribution of hVaR using a bisection search approach. We also suggest a Spectral Stress Value-at-Risk (SSVaR) measure based on the CI, as an alternative risk measure for both the financial and insurance industries. Finally, we perform a stress testing application of the SSVaR.
    Keywords: Value-at-Risk; Small sample; Uncertainty; Asymptotic normality approximation; Saddlepoint approximation; Bisection search approach; Spectral Stress VaR; Stress testing
    JEL: C14 D81 G28 G32
    Date: 2016–04
    URL: http://d.repec.org/n?u=RePEc:mse:cesdoc:16034&r=rmg
  7. By: R Herrera; Adam Clements (QUT)
    Abstract: Forecasting the risk of extreme losses is an important issue in the management of financial risk. There has been a great deal of research examining how option implied volatilities (IV) can be used to forecast asset return volatility. However, the impact of IV in the context of predicting extreme risk has received relatively little attention. The role of IV is considered within a range of models, beginning with the traditional GARCH-based approach. Furthermore, a number of novel point process models for forecasting extreme risk are proposed in this paper. Univariate models where IV is included as an exogenous variable are considered, along with a novel bivariate approach where movements in IV are treated as another point process. It is found that, in the context of forecasting Value-at-Risk, the bivariate models produce the most accurate forecasts across a wide range of scenarios.
    Keywords: Implied volatility, Hawkes process, Peaks over threshold, Point process, Extreme events
    JEL: C14 C53
    Date: 2015–05–06
    URL: http://d.repec.org/n?u=RePEc:qut:auncer:2015_02&r=rmg
  8. By: Zhidong Bai (Northeast Normal University, China); Hua Li (Chang Chun University, China); Michael McAleer (National Tsing Hua University, Hsinchu, Taiwan; Erasmus University Rotterdam, the Netherlands; Complutense University of Madrid, Spain); Wing-Keung Wong (Hong Kong Baptist University, China, and Research Grants Council of Hong Kong, Hong Kong)
    Abstract: This paper considers the portfolio problem for high-dimensional data when the dimension and sample size are both large. We analyze the traditional Markowitz mean-variance (MV) portfolio using large-dimensional random matrix theory, and find that the spectral distribution of the sample covariance is the main factor causing the expected return of the traditional MV portfolio to overestimate that of the theoretical MV portfolio. We suggest a correction to the spectral structure of the sample covariance, yielding the spectrally-corrected sample covariance, and use it to improve the traditional MV portfolio into a spectrally-corrected one. In the expressions for the expected return and risk of the MV portfolio, the population covariance matrix always appears in a quadratic form, which guides the MV portfolio estimation. We derive the limiting behavior of this quadratic form with the spectrally-corrected sample covariance matrix, and explain its superior performance relative to the sample covariance as the dimension increases to infinity proportionally with the sample size. Moreover, the paper derives the limiting behavior of the expected return and risk of the spectrally-corrected MV portfolio, and illustrates its superior properties. In simulations, we compare the spectrally-corrected estimates with the traditional and bootstrap-corrected estimates, and show that the spectrally-corrected estimates perform best in both portfolio return and portfolio risk. We also compare the performance of the proposed estimator with different optimal portfolio estimates on real data from the S&P 500. The empirical findings are consistent with the theory developed in the paper.
    Keywords: Markowitz Mean-Variance Optimization; Optimal Return; Optimal Portfolio Allocation; Large Random Matrix; Bootstrap Method; Spectrally-corrected Covariance Matrix
    JEL: C13 C61 G11
    Date: 2016–04–11
    URL: http://d.repec.org/n?u=RePEc:tin:wpaper:20160025&r=rmg
  9. By: Lorenzo Sasso (National Research University Higher School of Economics)
    Abstract: This article examines the challenges to banking capital regulation posed by ongoing financial innovation through regulatory capital arbitrage. On the one hand, such practice undermines the quality of regulatory capital, eroding prudential capital standards, but most importantly it creates a distortion in the regulatory capital ratio measures, which prevents investors and regulators from identifying the bank’s real underlying risks. Opportunities for regulatory capital arbitrage arise as a consequence of the inherent mismatch of accounting goals, corporate law and prudential regulation – all interacting with the notion of capital for banks. On the other hand, financial innovation is the result of banks’ risk-management policies. In order to reduce the cost of capital and compliance, banks engage in derivatives, structured finance and hybrid instruments, altering the risk/return of their cash flow and the information released to the market for disclosure. In a way, regulation is the solution but also part of the problem. For this reason, new regulation strategies for banks need to be implemented. Systemic risk and balance-sheet risk need to be tackled respectively with macro- and micro-prudential regulation. This would involve an international harmonization of the accounting standards and individualised capital adequacy requirements for banks. The regulation has to be functional for the market under examination. The regulator should therefore consider the adoption of prudential filters to make static variables such as accounting rules, which are normally focused on evaluation, more dynamic to give banks some financial flexibility in their risk-management policies.
    Keywords: Regulatory capital arbitrage; hybrid financial instruments; capital adequacy requirements; micro-prudential regulation; risk management; fair value accounting; IAS 32, IAS 39.
    JEL: Z
    Date: 2016
    URL: http://d.repec.org/n?u=RePEc:hig:wpaper:66/law/2016&r=rmg
  10. By: Rose,Adam
    Abstract: In most countries, the private sector owns the vast majority of the buildings and a considerable portion of the infrastructure at risk. However, most investment in disaster risk management is made by the public sector, with the private sector lagging far behind. The situation represents missed opportunities for businesses to capture not only higher levels of the direct benefits of disaster risk management, but also a broader set of co-benefits to themselves and society as a whole. These co-benefits include ways of lowering production costs, improving the health of workers, and contributing to general economic stability. Ironically, many of these co-benefits are more tangible and immediate than ordinary disaster risk management benefits, which may not appear until a disaster has struck many years after the investment has been made. This study analyzes several important facets of private sector investment in disaster risk management, primarily from an economic perspective. It is intended as a first step toward promoting greater investment in disaster risk management by identifying potential co-benefits, explaining why they are not always pursued, and suggesting ways to integrate them into private sector decision-making. The latter includes government incentives, justified on the grounds that many private sector investments have extensive co-benefits, many of which pay dividends to society as a whole.
    Keywords: Debt Markets, Climate Change Economics, Economic Theory & Research, Climate Change Mitigation and Green House Gases, Environmental Economics & Policies
    Date: 2016–04–12
    URL: http://d.repec.org/n?u=RePEc:wbk:wbrwps:7634&r=rmg
  11. By: Georgios Bampinas (Department of Economics, University of Macedonia, Greece); Theodore Panagiotidis (Department of Economics, University of Macedonia, Greece; The Rimini Centre for Economic Analysis, Italy)
    Abstract: This paper examines whether individual stocks can act as inflation hedges. We focus on longer investment horizons and construct in- and out-of-sample portfolios based on the long-run relationship (cointegration) of stock prices with respect to consumer prices. Empirical evidence suggests that investors are better off holding a portfolio of stocks with higher long-run betas as part of an asset selection and allocation strategy. Stocks that outperform inflation tend to be drawn from the Energy and Industrial sectors. Finally, we observe that the companies' average inflation-hedging ability declined steadily over the past ten years, while the number of firms that hedge inflation has decreased considerably after the recent downturn of the US economy.
    Date: 2016–04
    URL: http://d.repec.org/n?u=RePEc:rim:rimwps:16-11&r=rmg
  12. By: Koji Asano (Graduate School of Economics, Osaka University)
    Abstract: This paper presents a model of portfolio management with reputation concerns in imperfect capital markets. Managers with financial constraints raise funds from investors and select a project that is characterized by the degree of risk. Managers differ in their ability to determine the probability of success. Based on past performance, all agents revise beliefs about managers' ability, and the beliefs affect the availability of funds in the future. This provides motivation for managers to build reputation by manipulating their performance through project selection. We show that the quality of investor protection changes fund flows, thereby influencing managers' project selection. Our model predicts that strong investor protection causes risk-taking behavior, whereas weak investor protection leads to risk-averse behavior.
    Keywords: reputation, investment decision, risk-taking, investor protection, pledgeability
    JEL: G31 G32
    Date: 2016–05
    URL: http://d.repec.org/n?u=RePEc:osk:wpaper:1612&r=rmg
  13. By: Leoni Eleni Oikonomikou (Georg-August University Göttingen)
    Abstract: This paper aims to forecast the Market Risk Premium (MRP) in the US stock market by applying machine learning techniques, namely the Multilayer Perceptron Network (MLP), the Elman Network (EN) and the Higher Order Neural Network (HONN). Furthermore, univariate ARMA and Exponential Smoothing models are also tested. The Market Risk Premium is defined as the historical differential between the return of the benchmark stock index and a short-term interest rate. Data are taken at daily frequency from January 2007 through December 2014. All these models outperform a naive benchmark model. The Elman network outperforms all the other models during the in-sample period, whereas the MLP network provides superior results in the out-of-sample period. The contribution of this paper to the existing literature is twofold. First, it is the first study that attempts to forecast the Market Risk Premium on a daily basis using Artificial Neural Networks (ANNs). Second, it is not based on a theoretical model but is mainly data driven. The chosen calculation approach fits quite well with the characteristics of ANNs. The forecasting model is tested with data from the US stock market. The proposed model-based forecasting method aims to capture patterns in the data that improve the forecasting accuracy of the Market Risk Premium in the tested market and indicates potential key metrics for investment and trading purposes over short time horizons.
    Keywords: nonlinear models; forecasting performance metrics; market risk premium; US equity market
    JEL: C45 C52 G15 G17
    Date: 2016–04–14
    URL: http://d.repec.org/n?u=RePEc:got:gotcrc:202&r=rmg
  14. By: Abbassi, Puriya; Brownlees, Christian; Hans, Christina; Podlich, Natalia
    Abstract: We analyze the relation between market-based credit risk interconnectedness among banks during the crisis and the associated balance sheet linkages via funding and securities holdings. For identification, we use a proprietary dataset that contains the funding positions of banks at the bank-to-bank level for 2006–13, in conjunction with investments of banks at the security level and the credit register from Germany. We find asymmetries both cross-sectionally and over time: when banks face difficulties in raising funding, interbank lending affects market-based bank interconnectedness. Moreover, banks with investments in securities related to troubled asset classes have a higher credit risk interconnectedness. Overall, our results suggest that market-based measures of interdependence can serve well as risk monitoring tools in the absence of disaggregated high-frequency bank fundamental data.
    Keywords: Credit Risk, Networks, CDS, Interbank Lending, Portfolio Distance
    JEL: C33 C53 E44 F36 G12 G14 G18 G21
    Date: 2016
    URL: http://d.repec.org/n?u=RePEc:zbw:bubdps:092016&r=rmg
  15. By: Nakmai, Siwat
    Abstract: This paper examines foreign exchange risk premia, moving from simple univariate regressions to the state-space method. The adjusted traditional regressions properly capture the existence and time-evolving nature of the risk premia. The state-space estimations, in turn, prove quite capable of examining the time variability of the unobservable risk premia. More precisely, the coefficients on the lagged estimated time series are significant, and the disturbances combined from the observation and transition equations of the state-space system (rational and premium errors, respectively) are statistically white noise. These two residuals are found to move in opposite directions, with their covariance approaching zero in the empirics. In addition, the projected foreign exchange risk premia are found to be significantly stationary in levels and relatively volatile over time, with some clustering. This volatility is, however, not dominant in the deviations of the forward prediction errors.
    Keywords: foreign exchange risk premia, univariate regressions, state-space modeling, Kalman filter
    JEL: C20 C32 F31
    Date: 2016–04–15
    URL: http://d.repec.org/n?u=RePEc:pra:mprapa:71237&r=rmg
  16. By: Radu T. Pruna; Maria Polukarov; Nicholas R. Jennings
    Abstract: Building on a prominent agent-based model, we present a new structural stochastic volatility asset pricing model of fundamentalists vs. chartists where the prices are determined based on excess demand. Specifically, this allows for modelling stochastic interactions between agents, based on a herding process corrected by a price misalignment, and incorporating strong noise components in the agents' demand. The model's parameters are estimated using the method of simulated moments, where the moments reflect the basic properties of the daily returns of a stock market index. In addition, for the first time we apply a (parametric) bootstrap method in a setting where the switching between strategies is modelled using a discrete choice approach. As we demonstrate, the resulting dynamics replicate a rich set of the stylized facts of the daily financial data including: heavy tails, volatility clustering, long memory in absolute returns, as well as the absence of autocorrelation in raw returns, volatility-volume correlations, aggregate Gaussianity, concave price impact and extreme price events.
    Date: 2016–04
    URL: http://d.repec.org/n?u=RePEc:arx:papers:1604.08824&r=rmg
  17. By: Swenja Surminski; Jillian Eldridge
    Abstract: Flooding is the largest natural disaster risk in England and it is expected to rise even further with a changing climate. Agreeing on how we pay for this now and in the future is a challenge, with competing drivers such as fairness, economic efficiency, political feasibility and public acceptance all playing their part. We investigate this in the context of recent efforts to reform the provision of flood insurance, which have been debated between government and industry over the last three years. Recognising the challenge of rising losses and increasing costs we are particularly interested in how the existing arrangement and the new flood insurance proposal (Flood Re) reflect on the need for physical risk reduction. By applying our analytical framework we find an absence of formal incentive mechanisms for risk reduction in the existing and proposed Flood Re scheme. We identify the barriers for applying insurance to risk reduction and point to some possible modifications in the Flood Re proposal to deliver a greater link between risk transfer and risk reduction. Our investigation offers some insights into the challenges of designing and implementing flood insurance schemes – a task that is currently being considered in a range of countries, including several developing countries, who hope to apply flood insurance as a tool to increase their climate resilience.
    Keywords: Flood insurance; flood risk; risk reduction
    JEL: G32
    Date: 2015–01–15
    URL: http://d.repec.org/n?u=RePEc:ehl:lserod:66256&r=rmg
  18. By: TOBBACK, Ellen; MOEYERSOMS, Julie; STANKOVA, Marija; MARTENS, David
    Abstract: Bankruptcy prediction has been a popular and challenging research area for decades. Most prediction models are built using traditional data such as financial figures, stock market data and firm-specific variables. We complement such dense data with fine-grained data by including information on the company's directors and managers in the prediction models. This information is used to build a network between Belgian enterprises, where two companies are related if they share or have shared a director or high-level manager. We start from two possibly related assumptions: (i) if a company is linked to many (or only) bankrupt firms, it will have a higher probability of becoming bankrupt, and (ii) the management has an influence on the performance of the company, and incompetent or fraudulent managers can lead a company into bankruptcy. The weighted-vote relational neighbour (wvRN) classifier is applied to the created network and transforms the relationships between companies into bankruptcy prediction scores, thereby assuming that a company is more likely to file for bankruptcy if one of the related companies in its network has failed. The more related companies have failed, the higher the predicted probability of bankruptcy. The relational model is then benchmarked against a base model that contains only structured data such as financial ratios. Finally, an ensemble model is built that combines the relational model's output scores with the structured data. We find that this ensemble model outperforms the base model when detecting the riskiest firms, especially when predicting two years ahead.
    Date: 2016–04
    URL: http://d.repec.org/n?u=RePEc:ant:wpaper:2016004&r=rmg
  19. By: Maarten van Oordt; Chen Zhou
    Abstract: This paper considers the problem of estimating a linear model between two heavy-tailed variables if the explanatory variable has an extremely low (or high) value. We propose an estimator for the model coefficient by exploiting the tail dependence between the two variables and prove its asymptotic properties. Simulations show that our estimation method yields a lower mean squared error than regressions conditional on tail observations. In an empirical application we illustrate the better performance of our approach relative to the conditional regression approach in projecting the losses of industry-specific stock portfolios in the event of a market crash.
    Keywords: Econometric and statistical methods, Financial markets
    JEL: C14 G01
    Date: 2016
    URL: http://d.repec.org/n?u=RePEc:bca:bocawp:16-22&r=rmg
  20. By: Unger, Robert
    Abstract: The US credit boom has been identified as one of the causes of the global financial crisis and the resulting debt overhang is seen as the primary reason for the weak economic recovery. Most of the existing literature links the credit boom to the emergence of the shadow banking system. This paper shows that the largest part of the shadow banking system merely transforms existing financial claims against ultimate borrowers that have been originated by traditional banks. Based on financial accounts data, it is estimated that, shortly before the onset of the financial crisis, just about 12% of loans to the non-financial private sector had been originated by shadow banks. Consequently, dampening credit creation by the traditional banking sector might be an additional policy instrument to reduce the build-up of systemic risk in the shadow banking system.
    Keywords: banks, credit boom, credit creation, financial crisis, shadow banks, systemic risk
    JEL: E40 E50 F30 G21 G23
    Date: 2016
    URL: http://d.repec.org/n?u=RePEc:zbw:bubdps:112016&r=rmg
  21. By: Ceylan, Özcan
    Abstract: This paper investigates market-wide risk aversion in an international setting. Particularly, this empirical study evaluates risk aversion spillover dynamics as an uncertainty transmission mechanism for the period 2000-2015 to reveal if there has been a significant change in these dynamics when markets are going through turbulent periods. As a plausible proxy for risk aversion, variance risk premium (VRP) is computed through the difference between expected variances under risk-neutral and physical measures for seven markets studied: United States, United Kingdom, Germany, France, Netherlands, Switzerland and Japan. Effects of a shock to U.S. VRP on the other markets' VRPs are evaluated through Generalized Forecast Error Variance Decomposition. Results show that risk aversion spillovers from U.S. to other markets are stronger while the U.S. is going through turbulent periods confirming the intuition that investors are more focused on incidents in the turbulent market. Markets become more connected in terms of sentiments when a country is unexpectedly hit by a major crisis, limiting diversification opportunities.
    Keywords: Investor sentiment, Risk aversion spillovers, Variance risk premium, Generalized forecast error variance decomposition, Investors' attention allocation, Financial crises.
    JEL: D8 F36 G14 G15
    Date: 2016–05–09
    URL: http://d.repec.org/n?u=RePEc:pra:mprapa:71320&r=rmg
  22. By: Olivier Guéant
    Abstract: Market makers provide liquidity to other market participants: they propose prices at which they stand ready to buy and sell a wide variety of assets. They face a complex optimization problem with static and dynamic components: they need indeed to propose bid and offer/ask prices in an optimal way for making money out of the difference between these two prices (their bid-ask spread), while mitigating the risk associated with price changes -- because they seldom buy and sell simultaneously, and therefore hold long or short inventories which expose them to market risk. In this paper, (i) we propose a general modeling framework which generalizes (and reconciles) the various modeling approaches proposed in the literature since the publication of the seminal paper "High-frequency trading in a limit order book" by Avellaneda and Stoikov, (ii) we prove new general results on the existence and the characterization of optimal market making strategies, (iii) we obtain new closed-form approximations for the optimal quotes, (iv) we extend the modeling framework to the case of multi-asset market making, and (v) we show how the model can be used in practice in the specific case of the corporate bond market and for two credit indices.
    Date: 2016–05
    URL: http://d.repec.org/n?u=RePEc:arx:papers:1605.01862&r=rmg
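The closed-form approximations this line of work builds on go back to the Avellaneda–Stoikov model the abstract cites; a minimal sketch (illustrative parameter values, not the paper's calibration or its new approximations): the market maker quotes around a reservation price shifted against inventory q, with a half-spread set by risk aversion gamma, volatility sigma and order-arrival decay k.

```python
import math

def as_quotes(s, q, t, T, gamma, sigma, k):
    """Avellaneda-Stoikov closed-form quote approximation (sketch).

    s: mid price, q: signed inventory, T - t: time remaining,
    gamma: risk aversion, sigma: volatility, k: arrival-rate decay.
    """
    tau = T - t
    # Reservation price: mid shifted against inventory to unwind risk.
    r = s - q * gamma * sigma ** 2 * tau
    # Total spread: inventory-risk term plus adverse-selection term.
    spread = gamma * sigma ** 2 * tau + (2.0 / gamma) * math.log(1.0 + gamma / k)
    return r - spread / 2.0, r + spread / 2.0   # (bid, ask)

# A long inventory (q = 3) skews both quotes downward to attract buyers.
bid, ask = as_quotes(s=100.0, q=3, t=0.0, T=1.0,
                     gamma=0.1, sigma=2.0, k=1.5)
```

With q = 0 the quotes are symmetric around the mid price; the spread itself does not depend on inventory, only the skew does.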
  23. By: Oliver Enrique Pardo Reinoso
    Abstract: This note checks the robustness of a surprising result in Dekel et al. (2007). The result states that strict Nash equilibria might cease to be evolutionarily stable when agents are able to observe the opponent’s preferences with a very low probability. This note shows that the result is driven by the assumption that there is no risk of the observed preferences being mistaken. In particular, when a player may observe a signal correlated with the opponent’s preferences, but the signal is noisy enough, all strict Nash equilibria are evolutionarily stable.
    Date: 2015–09–01
    URL: http://d.repec.org/n?u=RePEc:col:000495:014568&r=rmg

General information on the NEP project can be found at https://nep.repec.org. For comments please write to the director of NEP, Marco Novarese at <director@nep.repec.org>. Put “NEP” in the subject, otherwise your mail may be rejected.
NEP’s infrastructure is sponsored by the School of Economics and Finance of Massey University in New Zealand.