on Risk Management
Issue of 2019‒03‒25
23 papers chosen by
By: | Raphaël Oger (CGI - Centre Génie Industriel - IMT Mines Albi - IMT École nationale supérieure des Mines d'Albi-Carmaux); Frederick Benaben (CGI - Centre Génie Industriel - IMT Mines Albi - IMT École nationale supérieure des Mines d'Albi-Carmaux); Matthieu Lauras (CGI - Centre Génie Industriel - IMT Mines Albi - IMT École nationale supérieure des Mines d'Albi-Carmaux); Benoit Montreuil (Georgia Institute of Technology (Georgia Tech)) |
Abstract: | This paper is one pillar of a research project aimed at designing a decision support system for collaborative Supply Chain Risk Management among logistics network stakeholders. It presents the motivation behind this objective and a contribution towards it: a methodology to automatically deduce all the supply chain options enabled by a logistics network to fulfil demand. This methodology is introduced as part of a decision support automation framework for Supply Chain Risk and Opportunity Management among logistics network stakeholders. The methodology focuses on strategic and tactical supply chain decisions and on manufacturing stakeholders. |
Keywords: | Risk Management, Opportunity Management, Supply Chain Design, Logistics Network, Decision Support, Strategic Business Planning, Supply Chain Risk Management, Supply Chain |
Date: | 2018–06–11 |
URL: | http://d.repec.org/n?u=RePEc:hal:journl:hal-01884393&r=all |
By: | Ahelegbey, Daniel Felix; Giudici, Paolo; Hadji-Misheva, Branka |
Abstract: | This paper investigates how to improve statistical credit scoring of SMEs involved in P2P lending. The methodology discussed in the paper is a factor-network-based segmentation for credit score modeling. The approach first constructs a network of SMEs in which links emerge from the comovement of latent factors, which allows us to segment the heterogeneous population into clusters. We then build a credit score model for each cluster via lasso logistic regression. We compare our approach with the conventional logistic model by analyzing the credit scores of over 15,000 SMEs engaged in P2P lending services across Europe. The results reveal that credit risk modeling using our network-based segmentation achieves higher predictive performance than the conventional model. |
Keywords: | Credit Risk, Factor models, Fintech, Peer-to-Peer lending, Credit Scoring, Lasso, Segmentation |
JEL: | C38 G2 |
Date: | 2019–02–26 |
URL: | http://d.repec.org/n?u=RePEc:pra:mprapa:92633&r=all |
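The per-cluster scoring step described in the abstract can be sketched in a few lines. This is a hedged illustration, not the authors' pipeline: KMeans on synthetic factor loadings stands in for the paper's factor-network segmentation, and all data are simulated rather than real SME records.

```python
# Sketch: segment borrowers, then fit one lasso (L1) logistic scorecard
# per cluster. KMeans + simulated data are stand-in assumptions.
import numpy as np
from sklearn.cluster import KMeans
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
n, p = 600, 6
X = rng.normal(size=(n, p))            # borrower features (simulated)
loadings = rng.normal(size=(n, 2))     # latent-factor loadings (simulated)
y = (X[:, 0] + loadings[:, 0] + rng.normal(size=n) > 0).astype(int)

# Step 1: segment the heterogeneous population using the factor loadings.
clusters = KMeans(n_clusters=3, n_init=10, random_state=0).fit_predict(loadings)

# Step 2: one lasso-penalized logistic model per cluster.
models = {}
for c in np.unique(clusters):
    m = LogisticRegression(penalty="l1", solver="liblinear", C=0.5)
    m.fit(X[clusters == c], y[clusters == c])
    models[c] = m

# Score each borrower with its own cluster's model.
scores = np.array([models[c].predict_proba(X[i:i + 1])[0, 1]
                   for i, c in enumerate(clusters)])
```

The routing of each borrower to its cluster's scorecard is the essential mechanic; the paper's network construction from latent-factor comovement is not reproduced here.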
By: | Masoud Fekri; Babak Barazandeh |
Abstract: | Optimal capital allocation between different assets is an important financial problem, generally framed as the portfolio optimization problem. General models include the single-period and multi-period cases. The traditional Mean-Variance model introduced by Harry Markowitz has been the basis of many models used to solve the portfolio optimization problem. The overall goal is to achieve the highest return and lowest risk. In this paper, we present an optimal portfolio based on the Markowitz Mean-Variance-Skewness model with weight constraints for short-term investment opportunities in Iran's stock market. We use a neural-network-based predictor to predict stock returns and measure the risk of stocks based on the prediction errors of the neural network. We perform a series of experiments on our portfolio optimization model with real data from Iran's stock market indices, including the Bank, Insurance, Investment, Petroleum Products and Chemicals indices. Finally, 8 different portfolios with low, medium and high risk for different types of investors (risk-averse or risk-taking) are designed and analyzed using a genetic algorithm. |
Date: | 2019–02 |
URL: | http://d.repec.org/n?u=RePEc:arx:papers:1903.06632&r=all |
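The mean-variance core that this model extends can be sketched as a constrained optimization. A hedged sketch only: the expected returns, covariance matrix, risk-aversion parameter and per-asset weight caps below are illustrative assumptions, and the paper's skewness term, neural-network forecasts and genetic algorithm are omitted.

```python
# Sketch: Markowitz mean-variance optimization with weight constraints.
import numpy as np
from scipy.optimize import minimize

mu = np.array([0.10, 0.12, 0.07])            # expected returns (assumed)
cov = np.array([[0.040, 0.006, 0.002],
                [0.006, 0.050, 0.004],
                [0.002, 0.004, 0.020]])      # return covariance (assumed)
risk_aversion = 3.0

def objective(w):
    # Maximize mu.w - (lambda/2) w'Cov w  <=>  minimize its negative.
    return -(mu @ w - 0.5 * risk_aversion * w @ cov @ w)

constraints = [{"type": "eq", "fun": lambda w: w.sum() - 1.0}]  # fully invested
bounds = [(0.0, 0.6)] * 3                    # no shorting, 60% cap per asset

res = minimize(objective, x0=np.array([1 / 3, 1 / 3, 1 / 3]),
               bounds=bounds, constraints=constraints, method="SLSQP")
weights = res.x                              # optimal constrained weights
```

Adding the skewness term turns the quadratic program into a cubic one, which is one reason heuristic solvers such as the paper's genetic algorithm are attractive.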
By: | Çağın Ararat; Nurtai Meimanjanov |
Abstract: | Systemic risk is concerned with the instability of a financial system whose members are interdependent in the sense that the failure of a few institutions may trigger a chain of defaults throughout the system. Recently, several systemic risk measures have been proposed in the literature to determine capital requirements for the members subject to joint risk considerations. We address the problem of computing systemic risk measures for systems with sophisticated clearing mechanisms. In particular, we consider the Eisenberg-Noe network model and the Rogers-Veraart network model, where the former is extended to the case in which operating cash flows in the system are unrestricted in sign. We propose novel mixed-integer linear programming problems that can be used to compute clearing vectors for these models. Due to the binary variables in these problems, the corresponding (set-valued) systemic risk measures fail to have convex values in general. We associate nonconvex vector optimization problems to these systemic risk measures and solve them by a recent nonconvex variant of Benson's algorithm, which requires solving two types of scalar optimization problems. We provide a detailed analysis of the theoretical features of these problems for the extended Eisenberg-Noe and Rogers-Veraart models. We test the proposed formulations on computational examples and perform sensitivity analyses with respect to some model-specific and structural parameters. |
Date: | 2019–03 |
URL: | http://d.repec.org/n?u=RePEc:arx:papers:1903.08367&r=all |
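The Eisenberg-Noe clearing mechanism underlying this paper can be illustrated without the MILP machinery. The sketch below computes a clearing payment vector by the classic fixed-point iteration for a hypothetical 3-bank network with non-negative cash flows; the liabilities and cash flows are invented, and the MILP formulation, the sign-unrestricted extension and the Rogers-Veraart default costs are all outside this toy example.

```python
# Sketch: Eisenberg-Noe clearing vector by fixed-point (Picard) iteration.
import numpy as np

L = np.array([[0., 2., 1.],    # L[i, j]: nominal liability of bank i to bank j
              [1., 0., 2.],
              [1., 1., 0.]])   # (illustrative 3-bank network)
e = np.array([1.0, 0.5, 0.6])  # external operating cash flows, here >= 0

p_bar = L.sum(axis=1)          # total nominal obligations of each bank
Pi = np.where(p_bar[:, None] > 0, L / p_bar[:, None], 0.0)  # relative liabilities

p = p_bar.copy()               # start from full payment and iterate down
for _ in range(1000):
    # Each bank pays min(obligations, available cash + payments received).
    p_new = np.minimum(p_bar, np.maximum(0.0, e + Pi.T @ p))
    if np.max(np.abs(p_new - p)) < 1e-12:
        break
    p = p_new
# p is now the (greatest) clearing payment vector of the network
```

Starting from full payment and iterating downward yields the greatest clearing vector; the paper's MILP approach instead encodes the default indicator of each bank as a binary variable.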
By: | Kuwahara, Satoshi (Asian Development Bank Institute); Yoshino, Naoyuki (Asian Development Bank Institute); Sagara, Megumi (Asian Development Bank Institute); Taghizadeh-Hesary, Farhad (Asian Development Bank Institute) |
Abstract: | The credit risk database (CRD) makes it possible to mitigate the information asymmetry between small and medium-sized enterprises (SMEs) and financial institutions, and it contributes to improving SMEs’ access to finance by collecting a large number of financial statements on SME finances and establishing a robust statistical model. We use the CRD in Japan, confirm the situation there, and highlight the CRD’s contribution to evaluating the creditworthiness of SMEs. We also explain how to establish a CRD as financial infrastructure, while showing that the CRD and the scoring model based on it have maintained their quality owing to their operating system. We hope our experience contributes to the introduction of statistical credit risk databases composed of a large number of anonymous financial statements in other countries, and that the CRD helps to improve SMEs’ access to finance as financial infrastructure. |
Keywords: | credit risk database; CRD; creditworthiness; SMEs in Japan |
JEL: | G21 G28 G32 |
Date: | 2019–02–21 |
URL: | http://d.repec.org/n?u=RePEc:ris:adbiwp:0924&r=all |
By: | Craig, Ben; Giuzio, Margherita; Paterlini, Sandra |
Abstract: | Recent policy discussion includes the introduction of diversification requirements for the sovereign bond portfolios of European banks. In this paper, we evaluate the possible effects of these constraints on risk and diversification in the sovereign bond portfolios of the major European banks. First, we capture the dependence structure of European countries' sovereign risks and identify the common factors driving European sovereign CDS spreads by means of an independent component analysis. We then analyze the risk and diversification in the sovereign bond portfolios of the largest European banks and discuss the role of “home bias”, i.e., the tendency of banks to concentrate their sovereign bond holdings in their domicile country. Finally, we evaluate the effect of diversification requirements on the tail risk of sovereign bond portfolios and quantify the system-wide losses in the presence of fire sales. Under our assumptions about how banks respond to the new requirements, demanding that banks modify their holdings to increase their portfolio diversification may mitigate fire-sale externalities, but may be ineffective in reducing portfolio risk, including tail risk. |
Keywords: | bank regulation, diversification, home bias, sovereign-bank nexus, sovereign risk |
JEL: | G01 G11 G21 G28 |
Date: | 2019–03 |
URL: | http://d.repec.org/n?u=RePEc:srk:srkwps:201989&r=all |
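The factor-extraction step can be sketched with an off-the-shelf ICA. This is a hedged illustration only: the "CDS spread changes" below are simulated from two invented non-Gaussian sources and an assumed mixing matrix, not the paper's actual data or calibration.

```python
# Sketch: recovering common driving factors from (simulated) sovereign CDS
# spread changes with FastICA, in the spirit of the paper's first step.
import numpy as np
from sklearn.decomposition import FastICA

rng = np.random.default_rng(1)
T = 500
# Two independent, heavy-tailed sources, e.g. a common euro-area factor
# and a periphery factor (an assumption for illustration).
sources = np.column_stack([rng.laplace(size=T), rng.laplace(size=T)])
mixing = np.array([[1.0, 0.2],
                   [0.8, 0.9],
                   [0.3, 1.1],
                   [1.1, 0.1]])          # 4 "countries" loading on 2 factors
spreads = sources @ mixing.T + 0.05 * rng.normal(size=(T, 4))

ica = FastICA(n_components=2, random_state=0, max_iter=500)
recovered = ica.fit_transform(spreads)   # estimated common components, T x 2
```

ICA identifies the components only up to sign and permutation, which is why interpretation (which component is the "common" factor) requires inspecting the estimated loadings.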
By: | Radanliev, Petar; De Roure, David; Nicolescu, Razvan; Huth, Michael; Mantilla Montalvo, Rafael; Cannady, Stacy; Burnap, Peter |
Abstract: | This article focuses on the economic impact assessment of the Internet of Things (IoT) and its associated cyber risk vectors and vertices – a reinterpretation of IoT verticals. We adapt to IoT both the Cyber Value at Risk model, a well-established model for measuring the maximum possible loss over a given time period, and the MicroMort model, a widely used model for predicting uncertainty through units of mortality risk. The resulting new IoT MicroMort for calculating IoT risk is tested and validated with real data from BullGuard's IoT Scanner (over 310,000 scans) and the Gartner report on IoT connected devices. Two calculations are developed: the current state of IoT cyber risk and the future forecast of IoT cyber risk. Our work therefore advances the effort to integrate cyber risk impact assessments and offers a better understanding of economic impact assessment for IoT cyber risk. |
Keywords: | IoT cyber risk, IoT risk analysis, IoT cyber insurance, IoT MicroMort, Cyber value-at-risk |
JEL: | C1 C10 C15 C18 O3 O30 O31 O32 O33 O38 O39 |
Date: | 2018–09 |
URL: | http://d.repec.org/n?u=RePEc:pra:mprapa:92567&r=all |
By: | Hanene Ben Salah (IMAG - Institut Montpelliérain Alexander Grothendieck - UM - Université de Montpellier - CNRS - Centre National de la Recherche Scientifique); Ali Gannoun (IMAG - Institut Montpelliérain Alexander Grothendieck - UM - Université de Montpellier - CNRS - Centre National de la Recherche Scientifique); Christian De Peretti (SAF - Laboratoire de Sciences Actuarielle et Financière - UCBL - Université Claude Bernard Lyon 1 - Université de Lyon); Mathieu Ribatet (IMAG - Institut Montpelliérain Alexander Grothendieck - UM - Université de Montpellier - CNRS - Centre National de la Recherche Scientifique); Abdelwahed Trabelsi (Laboratoire BESTMOD ISG Tunis - ISG Tunis) |
Abstract: | The DownSide Risk (DSR) model for portfolio optimization overcomes the drawbacks of the classical Mean-Variance model concerning the asymmetry of returns and investors' risk perception. This optimization model involves a positive definite matrix that is endogenous with respect to the portfolio weights, and hence yields a non-standard optimization problem. To bypass this hurdle, Athayde (2001) developed a recursive minimization procedure that ensures convergence to the solution. However, when only a finite number of observations is available, the portfolio frontier usually exhibits inflexion points that make the curve less smooth. To remove these points, Athayde (2003) proposed a mean kernel estimation of returns to obtain a smoother portfolio frontier, producing an effect similar to having an infinite number of observations. In spite of the originality of this approach, the proposed algorithm was not neatly written, and no application was presented in his paper. Ben Salah et al. (2015), taking advantage of the robustness of the median, replaced the mean estimator in Athayde's model with a nonparametric median estimator of the returns, and gave a tidy and comprehensive version of the former algorithm (of Athayde (2001, 2003)). In all the previous cases, the problem is computationally complex since at each iteration the returns (for each asset and for the portfolio) need to be re-estimated: because the kernel weights change each time, the portfolio is altered. In this paper, a new method to reduce the number of iterations is proposed. Its principle is to start by nonparametrically estimating all the returns for each asset; the returns of a given portfolio are then derived from these estimated asset returns. Using the DSR criterion and Athayde's algorithm, a smoother portfolio frontier is obtained whether or not short selling is allowed. The proposed approach is applied to the French and Brazilian stock markets. |
Keywords: | Semivariance, Nonparametric Median Estimation, Nonparametric Mean Estimation, DownSide Risk, Kernel Method |
Date: | 2018 |
URL: | http://d.repec.org/n?u=RePEc:hal:journl:hal-01299561&r=all |
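The DSR criterion itself, the semivariance of portfolio returns below a benchmark, is simple to compute. The sketch below uses a toy return matrix and weights (invented numbers) and does not reproduce the kernel smoothing or the recursive algorithm discussed in the abstract.

```python
# Sketch: semivariance (downside risk) of a portfolio below a benchmark.
import numpy as np

returns = np.array([[0.02, -0.01, 0.03],
                    [-0.03, 0.01, -0.02],
                    [0.01, 0.02, 0.00],
                    [-0.02, -0.03, 0.01]])  # T x N asset returns (toy data)
weights = np.array([0.5, 0.3, 0.2])
benchmark = 0.0                             # target return tau

port = returns @ weights                    # portfolio return each period
shortfall = np.minimum(port - benchmark, 0.0)   # only below-target deviations
semivariance = np.mean(shortfall ** 2)      # the DSR (semivariance) criterion
```

Unlike the variance, only periods in which the portfolio falls below the target contribute, which is what makes the criterion asymmetric and the optimization problem non-standard.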
By: | Mehmet Balcilar (Department of Economics, Eastern Mediterranean University, Famagusta, via Mersin 10, Northern Cyprus, Turkey; Department of Economics, University of Pretoria, Pretoria, 0002, South Africa; Montpellier Business School, Montpellier, France.); Rangan Gupta (Department of Economics, University of Pretoria, Pretoria, South Africa); Shixuan Wang (Department of Economics, University of Reading, Reading, RG6 6AA, United Kingdom); Mark E. Wohar (College of Business Administration, University of Nebraska at Omaha, 6708 Pine Street, Omaha, NE 68182, USA, and School of Business and Economics, Loughborough University, Leicestershire, LE11 3TU, UK.) |
Abstract: | In this paper, we analyze the predictability of movements in US Treasury bond premia due to oil price uncertainty over the monthly period 1953:06 to 2016:12. For our purpose, we use a higher-order nonparametric causality-in-quantiles framework, which, in turn, allows us to test for predictability over the entire conditional distribution of not only bond returns but also their volatility, while controlling for misspecification due to uncaptured nonlinearity and structural breaks, which we show to exist in our data. We find that oil uncertainty predicts (increases) not only US bond returns but also their volatility, with the effect on the latter being stronger. In addition, oil uncertainty tends to have a stronger impact on the shortest and longest maturities (2- and 5-year) and a relatively weaker impact on bonds with medium-term (3- and 4-year) maturities. Our results are robust to alternative measures of oil market uncertainty and bond market volatility. |
Keywords: | Oil Price Uncertainty, Bond Returns and Volatility, Higher-Order Nonparametric Causality-in-Quantiles Test |
JEL: | C22 G12 Q02 |
Date: | 2019–03 |
URL: | http://d.repec.org/n?u=RePEc:pre:wpaper:201919&r=all |
By: | Rongju Zhang; Mark Aarons; Gregoire Loeper |
Abstract: | We develop an optimal currency hedging strategy for fund managers who own foreign assets to choose the hedge tenors that maximize their FX carry returns within a liquidity risk constraint. The strategy assumes that the offshore assets are fully hedged with FX forwards. The chosen liquidity risk metric is Cash Flow at Risk (CFaR). The strategy involves time-dispersing the total nominal hedge value into future time buckets to maximize (minimize) the expected FX carry benefit (cost), given the constraint that the CFaRs in all the future time buckets do not breach a predetermined liquidity budget. We demonstrate the methodology via an illustrative example where shorter-dated forwards are assumed to deliver higher carry trade returns (motivated by the historical experience where AUD is the domestic currency and USD is the foreign currency). We also introduce a tenor-ranking method which is useful when this assumption fails. We show by Monte Carlo simulation and by backtesting that our hedging strategy successfully operates within the liquidity budget. We provide practical insights on when and why fund managers should choose short-dated or long-dated tenors. |
Date: | 2019–03 |
URL: | http://d.repec.org/n?u=RePEc:arx:papers:1903.06346&r=all |
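The liquidity constraint in this strategy, Cash Flow at Risk for a hedge bucket, can be estimated by simulation. A hedged sketch under stated assumptions: the lognormal FX dynamics, 10% volatility, flat forward and the notional below are illustrative choices, not the paper's calibration.

```python
# Sketch: Monte Carlo Cash Flow at Risk (CFaR) for one FX-forward bucket.
import numpy as np

rng = np.random.default_rng(42)
notional = 100.0           # foreign-currency nominal hedged in this bucket
spot = 1.0
forward = 1.0              # forward rate locked in for the bucket (assumed)
vol, horizon = 0.10, 0.25  # annualized FX vol; 3-month bucket (in years)

# Simulate the FX rate at settlement under lognormal dynamics.
fx_T = spot * np.exp(-0.5 * vol**2 * horizon
                     + vol * np.sqrt(horizon) * rng.normal(size=100_000))
cash_flow = notional * (forward - fx_T)   # settlement cash flow of the hedge

# CFaR at 95%: the cash outflow exceeded in only 5% of scenarios.
cfar_95 = -np.quantile(cash_flow, 0.05)
```

In the strategy described above, the tenor allocation would be chosen so that each bucket's CFaR stays inside the predetermined liquidity budget.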
By: | Buse, Rebekka; Schienle, Melanie; Urban, Jörg |
Abstract: | We study the impact of changes in regulations and policy interventions on systemic risk among European sovereigns, measured as volatility spillovers in the respective credit risk markets. Our unique intraday CDS dataset allows for precise measurement of the effectiveness of these events in a network setting. In particular, it allows us to discern interventions that entail significant changes in network cross-effects, with appropriate bootstrap confidence intervals. We show that it was mainly regulatory changes, namely the ban on trading naked sovereign CDS in 2012 and the new ISDA regulations in 2014, that were most effective in reducing systemic risk. In comparison, we find that the effect of policy interventions was minor and generally not sustainable. In particular, they only had a significant impact when implemented for the first time and when targeting more than one country. For the volatility spillover channels, we generally find balanced networks with no fragmentation over time. |
Keywords: | bootstrap spillover-measures, financial crises, financial stability and systemic risk in the Eurozone, high-frequency CDS, policy and regulation |
JEL: | G20 G01 G17 C32 C55 G28 |
Date: | 2019–03 |
URL: | http://d.repec.org/n?u=RePEc:srk:srkwps:201990&r=all |
By: | Gauthier de Maere d'Aertrycke (CEEME - Engie); Andreas Ehrenmann (CEEME - Engie); Daniel Ralph (Cambridge Judge Business School, University of Cambridge); Yves Smeers (Center for Operations Research and Econometrics, Universit´e catholique de Louvain) |
Keywords: | Capacity expansion, spot market, perfect or Cournot competition, risk aversion, risk trading, complete or incomplete risk market, coherent risk measure, risky capacity equilibria |
JEL: | C62 C72 L94 C73 G32 |
Date: | 2018–01 |
URL: | http://d.repec.org/n?u=RePEc:enp:wpaper:eprg1720&r=all |
By: | Nikki Kergozou; David Turner |
Abstract: | Macroeconomic forecasters typically forecast fewer recessions than the number experienced, which means economic growth tends to be over-predicted on average. Consequently, forecast errors are not normally distributed, making it difficult to convey the uncertainty and risks based on the historical forecast track record. To characterise this risk, recent OECD work constructed fan charts parameterised on historical forecast errors and the probability of a future downturn estimated from a probit model comprising a range of potential macroeconomic and financial early warning indicators. As the probability of a downturn increases the associated fan chart is wider, reflecting increased uncertainty, and more skewed to the downside, reflecting greater downside risks. This paper applies this methodology to New Zealand; although one important difference compared to other OECD economies is that the time span of macroeconomic data without major structural change is significantly shorter. Forecast errors for GDP by the OECD, Reserve Bank of New Zealand and New Zealand Treasury all appear to be non-normally distributed. Fan charts for GDP forecasts from the mid-year 2018 OECD Economic Outlook are symmetric due to the low probability of a downturn. Fan charts estimated for the period preceding the global financial crisis using currently-available data have a downwards skew. However, those estimated using data only available in the lead up to the crisis have many insignificant coefficients, likely due to the structural changes that have occurred in the New Zealand economy since the 1980s. |
Keywords: | downturn, economic forecasts, fan charts, New Zealand, risk, uncertainty |
JEL: | E58 E17 E65 E66 A E62 E63 |
Date: | 2019–03–08 |
URL: | http://d.repec.org/n?u=RePEc:oec:ecoaaa:1543-en&r=all |
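A skewed fan chart of the kind described above is often built from a two-piece normal distribution. The sketch below is a hedged illustration: the central forecast, the two dispersion parameters and their implied downside skew are invented numbers, not the OECD parameterisation or the probit link estimated in the paper.

```python
# Sketch: fan-chart quantile bands from a two-piece normal distribution,
# a common way to encode downside skew around a central GDP forecast.
from scipy.stats import norm

def two_piece_quantile(q, mode, s_lo, s_hi):
    """Quantile of a two-piece normal: sigma s_lo below the mode, s_hi above."""
    p_mode = s_lo / (s_lo + s_hi)          # CDF mass below the mode
    if q <= p_mode:
        return mode + s_lo * norm.ppf(q * (s_lo + s_hi) / (2 * s_lo))
    return mode + s_hi * norm.ppf((q - p_mode) * (s_lo + s_hi) / (2 * s_hi) + 0.5)

# Downside-skewed fan: wider dispersion below the central forecast.
mode, s_lo, s_hi = 2.5, 2.0, 1.0           # central GDP growth 2.5% (assumed)
bands = {q: two_piece_quantile(q, mode, s_lo, s_hi)
         for q in (0.05, 0.25, 0.50, 0.75, 0.95)}
```

As the estimated downturn probability rises, one would widen `s_lo` relative to `s_hi`, producing exactly the wider, downside-skewed fan the abstract describes.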
By: | Marek Capinski |
Abstract: | Real-life hedging in the Black-Scholes model must be imperfect and, if the stock's drift is higher than the risk-free rate, leads to a profit on average. Hence the option price is examined as a fair-game agreement between the parties, based on expected payoffs and a simple measure of risk. The resulting prices exhibit the volatility smile. |
Date: | 2019–03 |
URL: | http://d.repec.org/n?u=RePEc:arx:papers:1903.07875&r=all |
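For reference, the benchmark against which the paper's fair-game prices are compared is the standard Black-Scholes formula. The parameters below are illustrative; the paper's own risk-adjusted pricing rule is not reproduced here.

```python
# Sketch: the standard Black-Scholes price of a European call.
import math
from statistics import NormalDist

def bs_call(S, K, r, sigma, T):
    """Black-Scholes call price: S spot, K strike, r rate, sigma vol, T years."""
    d1 = (math.log(S / K) + (r + 0.5 * sigma**2) * T) / (sigma * math.sqrt(T))
    d2 = d1 - sigma * math.sqrt(T)
    N = NormalDist().cdf
    return S * N(d1) - K * math.exp(-r * T) * N(d2)

price = bs_call(S=100, K=100, r=0.02, sigma=0.2, T=1.0)
```

Under Black-Scholes the same sigma prices every strike; a strike-dependent implied sigma, as produced by the paper's approach, is the volatility smile.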
By: | Òscar Jordà; Moritz Schularick; Alan M. Taylor |
Abstract: | The risk premium puzzle is worse than you think. Using a new database for the U.S. and 15 other advanced economies from 1870 to the present that includes housing as well as equity returns (to capture the full risky capital portfolio of the representative agent), standard calculations using returns to total wealth and consumption show that housing returns in the long run are comparable to those of equities, and yet housing returns have lower volatility and lower covariance with consumption growth than equities. The same applies to a weighted total-wealth portfolio, and over a range of horizons. As a result, the implied risk aversion parameters for housing wealth and total wealth are even larger than those for equities, often by a factor of 2 or more. We find that more exotic models cannot resolve these even bigger puzzles, and we see little role for limited participation, idiosyncratic housing risk, transaction costs, or liquidity premiums. |
JEL: | E44 G12 G15 N20 |
Date: | 2019–03 |
URL: | http://d.repec.org/n?u=RePEc:nbr:nberwo:25653&r=all |
By: | Enzo Busseti |
Abstract: | The equity risk premium is a central component of every risk and return model in finance and a key input for estimating costs of equity and capital in both corporate finance and valuation. An article by Damodaran examines three broad approaches for estimating the equity risk premium. The first is survey-based: it consists of asking common investors or big players, such as pension fund managers, what premium they require to invest in equity. The second is to look at the premia earned historically by investing in stocks, as opposed to risk-free investments. The third tries to extrapolate a market consensus on the equity risk premium (the Implied Equity Risk Premium) by analysing equity prices on the market today. After introducing some basic concepts and models, I'll briefly explain the pluses and minuses of the first two methods, and analyse the third more deeply. In the end I'll show the results of my estimation of the ERP on real data, using variants of the Implied ERP (third) method. |
Date: | 2019–03 |
URL: | http://d.repec.org/n?u=RePEc:arx:papers:1903.07737&r=all |
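The implied-ERP idea can be sketched numerically: back out the discount rate that equates projected index cash flows to today's index level, then subtract the risk-free rate. All inputs below (index level, cash yield, growth rates) are invented round numbers, not Damodaran's inputs or the paper's estimates.

```python
# Sketch: implied equity risk premium via bisection on the discount rate.

def pv(rate, base_cf, growth, terminal_growth, years=5):
    """PV of `years` growing cash flows plus a Gordon terminal value."""
    total, cf = 0.0, base_cf
    for t in range(1, years + 1):
        cf *= (1 + growth)
        total += cf / (1 + rate) ** t
    terminal = cf * (1 + terminal_growth) / (rate - terminal_growth)
    return total + terminal / (1 + rate) ** years

index_level, base_cf = 3000.0, 120.0   # index points, trailing cash yield
growth, rf = 0.05, 0.03                # 5-year growth; terminal growth = rf

lo, hi = rf + 1e-6, 0.5                # bracket the implied return r
for _ in range(100):                   # bisection: pv() is decreasing in rate
    mid = 0.5 * (lo + hi)
    if pv(mid, base_cf, growth, rf) > index_level:
        lo = mid                       # PV too high -> implied rate is higher
    else:
        hi = mid
implied_erp = 0.5 * (lo + hi) - rf     # implied return minus risk-free rate
```

Because the present value is monotonically decreasing in the rate over the bracket, the bisection converges to the unique implied return.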
By: | Zura Kakushadze; Willie Yu |
Abstract: | We give an explicit algorithm and source code for constructing risk models based on machine learning techniques. The resultant covariance matrices are not factor models. Based on empirical backtests, we compare the performance of these machine learning risk models to other constructions, including statistical risk models, risk models based on fundamental industry classifications, and also those utilizing multilevel clustering based industry classifications. |
Date: | 2019–03 |
URL: | http://d.repec.org/n?u=RePEc:arx:papers:1903.06334&r=all |
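One of the "other constructions" the abstract mentions, a statistical risk model, can be sketched as a shrunk sample covariance of returns. This is a hedged baseline under assumptions: the returns are simulated with a single common factor, and Ledoit-Wolf shrinkage stands in for a generic statistical risk model, not the authors' machine-learning construction.

```python
# Sketch: a baseline statistical risk model via Ledoit-Wolf shrinkage of
# the sample covariance of (simulated) daily stock returns.
import numpy as np
from sklearn.covariance import LedoitWolf

rng = np.random.default_rng(7)
T, N = 250, 20                        # one year of daily returns, 20 stocks
common = rng.normal(size=(T, 1))      # one common factor (an assumption)
returns = 0.01 * (common @ rng.normal(size=(1, N)) + rng.normal(size=(T, N)))

lw = LedoitWolf().fit(returns)
cov = lw.covariance_                  # shrunk N x N risk-model covariance
```

Shrinkage toward a scaled identity keeps the matrix well-conditioned when T is not much larger than N, which is the practical problem any risk-model construction, statistical or machine-learned, has to address.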
By: | Koumou, Gilles Boevi (HEC Montreal, Canada Research Chair in Risk Management); Dionne, Georges (HEC Montreal, Canada Research Chair in Risk Management) |
Abstract: | This paper provides an axiomatic foundation of the measurement of diversification in a one-period portfolio theory under the assumption that the investor has complete information about the joint distribution of asset returns. Four categories of portfolio diversification measures can be distinguished: the law of large numbers diversification measures, the correlation diversification measures, the market portfolio diversification measures and the risk contribution diversification measures. We offer the first step towards a rigorous theory of correlation diversification measures. We propose a set of nine desirable axioms for this class of diversification measures, and name the measures satisfying these axioms coherent diversification measures that we distinguish from the notion of coherent risk measures. We provide the decision-theoretic foundations of our axioms by studying their compatibility with investors’ preference for diversification in two important decision theories under risk: the expected utility theory and Yaari’s dual theory. We explore whether useful methods of measuring portfolio diversification satisfy our axioms. We also investigate whether or not our axioms have forms of representation. |
Keywords: | Portfolio theory; portfolio diversification; preference for diversification; correlation diversification; expected utility theory; dual theory. |
JEL: | D81 G01 G11 |
Date: | 2019–03–12 |
URL: | http://d.repec.org/n?u=RePEc:ris:crcrmw:2019_002&r=all |
By: | Robert Cole (Reserve Bank of New Zealand) |
Abstract: | This Analytical Note looks at the level of, and trends in, insurers' solvency positions over the five years to 30 September 2018. The solvency position is the capital buffer in excess of minimum requirements, and it can be expressed as a dollar amount (solvency margin) or as a ratio of actual to required capital (solvency ratio). Approximately 60 New Zealand licensed insurers are subject to the Reserve Bank's solvency requirements, calculated in accordance with solvency standards, which apply generally, and insurer-specific licence conditions. The solvency calculations are based on the size and nature of the key financial risks that insurers face. The other (approximately 30) insurers are exempt from the Reserve Bank's solvency requirements and are instead subject to the solvency requirements of their home supervisor; for example, branches of Australian insurers are subject to the solvency requirements of the Australian Prudential Regulation Authority (APRA). For insurers subject to Reserve Bank solvency requirements, the aggregate solvency margin was $1.6 billion in excess of licence conditions at 30 September 2018. While this is the same level as at 30 September 2013, the aggregate solvency margin was higher during 2016 and 2017. Larger insurers have lower solvency ratios than smaller insurers; over the past five years, the solvency ratio for larger insurers has reduced while that for smaller insurers has increased. During the last five years, there have been some instances of insurers breaching solvency requirements. In most cases these breaches were resolved very quickly, usually by injecting additional capital; insurers that did not quickly resolve a solvency breach were closed to new business. Compared with common annual movements in the solvency ratio, some insurers, including some large insurers, have a low solvency ratio. Australian insurers with branches in New Zealand have higher solvency ratios under APRA requirements than the insurers subject to Reserve Bank solvency requirements, and the Australian solvency ratios have increased since 2013. |
Date: | 2019–03 |
URL: | http://d.repec.org/n?u=RePEc:nzb:nzbans:2019/01&r=all |
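The note's two metrics are simple arithmetic on an insurer's capital position. The figures below are illustrative assumptions, not data from any actual insurer.

```python
# Sketch: the two solvency metrics used in the note, on invented figures.
actual_capital = 130.0     # $m, actual solvency capital (assumed)
required_capital = 100.0   # $m, minimum required solvency capital (assumed)

solvency_margin = actual_capital - required_capital  # buffer in $m
solvency_ratio = actual_capital / required_capital   # actual / required
```

A margin of $30m and a ratio of 1.3 describe the same buffer in absolute and relative terms, which is why the note reports both: the margin scales with insurer size while the ratio does not.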
By: | Radanliev, Petar; De Roure, David; R.C. Nurse, Jason; Burnap, Pete; Anthi, Eirini; Ani, Uchenna; Maddox, La’Treall; Santos, Omar; Mantilla Montalvo, Rafael |
Abstract: | Definition of Internet of Things (IoT) Cyber Risk – Discussion on a Transformation Roadmap for Standardization of Regulations, Risk Maturity, Strategy Design and Impact Assessment |
Keywords: | Internet of Things; MicroMort model; Goal-Oriented Approach; transformation roadmap; Cyber risk regulations; empirical analysis; cyber risk self-assessment; cyber risk target state |
JEL: | L0 L5 L50 L52 L53 O2 O21 O3 O31 O32 O33 O38 |
Date: | 2019–03–05 |
URL: | http://d.repec.org/n?u=RePEc:pra:mprapa:92569&r=all |
By: | Aleš Černý |
Abstract: | We study dynamic optimal portfolio allocation for monotone mean-variance preferences in a general semimartingale model. Armed with new results in this area, we revisit the work of Cui, Li, Wang and Zhu (2012, MAFI) and fully characterize the circumstances under which one can set aside a non-negative cash flow while simultaneously improving the mean-variance efficiency of the left-over wealth. The paper analyzes, for the first time, the monotone hull of the Sharpe ratio and highlights its relevance to the problem at hand. |
Date: | 2019–03 |
URL: | http://d.repec.org/n?u=RePEc:arx:papers:1903.06912&r=all |
By: | Matteo Barigozzi; Marc Hallin; Stefano Soccorsi |
Abstract: | Ripple effects in financial markets associated with crashes, systemic risk and contagion are characterized by non-trivial lead-lag dynamics, which are crucial for understanding how crises spread and, therefore, central to risk management. In the spirit of Diebold and Yilmaz (2014), we investigate connectedness among financial firms via an analysis of impulse response functions of adjusted intraday log-ranges to market shocks, using network theory methods. Motivated by overwhelming evidence that the interdependence structure of financial markets varies over time, we base that analysis on the time-varying General Dynamic Factor Model proposed by Eichler et al. (2011), which extends to the locally stationary context the framework developed by Forni et al. (2000) under stationarity assumptions. The estimation methods in Eichler et al. (2011), however, present the major drawback of involving two-sided filters, which make it impossible to recover impulse response functions. We therefore introduce a novel approach extending to the time-varying context the one-sided method of Forni et al. (2017). The resulting estimators of time-varying impulse response functions are shown to be consistent, and hence can be used in the analysis of (time-varying) connectedness. Our empirical analysis of a large and strongly comoving panel of intraday price ranges of US stocks indicates that large increases in mid- to long-run connectedness are associated with the main financial turmoils. |
Keywords: | Dynamic factor models, volatility, financial crises, contagion, financial connectedness, high-dimensional time series, panel data, time-varying models, local stationarity |
JEL: | C32 C14 |
Date: | 2019 |
URL: | http://d.repec.org/n?u=RePEc:lan:wpaper:257939806&r=all |
By: | Minh Phi, Nguyet Thi (Asian Development Bank Institute); Hong Hoang, Hanh Thi (Asian Development Bank Institute); Taghizadeh-Hesary, Farhad (Asian Development Bank Institute); Yoshino, Naoyuki (Asian Development Bank Institute) |
Abstract: | In recent years, the Vietnamese economy has shown signs of financial distress, and small banks especially have experienced serious liquidity and solvency problems. Under the new policy of the State Bank of Vietnam, in order to ensure safe and effective banking operations, the Basel II accord will be applied across the whole banking system by 2018. This paper investigates the effects of implementing the Basel II capital requirement in Viet Nam on the bank lending rate and national output. The paper provides a theoretical framework as well as an empirical model, developing a Vector Error Correction Model (VECM) over the period 2018 to 2016 and employing three groups of indicators (macroeconomic, banking, and monetary). The main finding of the paper is that, at the bank level, a tightening of regulatory capital requirements does not induce a higher lending rate in the long run. Also, changes in micro-prudential capital requirements on banks have statistically significant spillovers on the GDP growth rate in the short term; yet their effects lessen significantly over a longer period. |
Keywords: | Basel II; regulatory capital requirements; bank capital; lending rate; aggregate growth |
JEL: | G21 G28 |
Date: | 2019–01–18 |
URL: | http://d.repec.org/n?u=RePEc:ris:adbiwp:0916&r=all |