nep-rmg New Economics Papers
on Risk Management
Issue of 2018‒01‒29
twenty-one papers chosen by



  1. Regulation of Islamic banks: Basel III capital framework and profit-sharing investment accounts By Kévin Spinassou; Leo Indra Wardhana
  2. Ensemble Learning or Deep Learning? Application to Default Risk Analysis By Shigeyuki Hamori; Minami Kawai; Takahiro Kume; Yuji Murakami; Chikara Watanabe
  3. Forecasting and risk management in the Vietnam Stock Exchange By Manh Ha Nguyen; Olivier Darné
  4. Dynamic Bank Capital Requirements By Tetiana Davydiuk
  5. Dynamic credit default swaps curves in a network topology By Xiu Xu; Wolfgang K. Härdle; Cathy Yi-Hsuan Chen
  6. “Multivariate count data generalized linear models: Three approaches based on the Sarmanov distribution” By Catalina Bolancé; Raluca Vernic
  7. Banks’ Capital Surplus and the Impact of Additional Capital Requirements By Simona Malovana
  8. How Bad Is a Bad Loan? Distinguishing Inherent Credit Risk from Inefficient Lending (Does the Capital Market Price This Difference?) By Joseph Hughes; Choon-Geol Moon
  9. The European Deposit Insurance Scheme: Assessing risk absorption via SYMBOL By Lucia Alessi; Giuseppina Cannas; Sara Maccaferri; Marco Petracco Giudici
  10. Risks in China’s financial system By Song, Zheng (Michael); Xiong, Wei
  11. Momentum and Crash Sensitivity By Ruenzi, Stefan; Weigert, Florian
  12. Exchange Rate Co-movements, Hedging and Volatility Spillovers in New EU Forex Markets By Evzen Kocenda; Michala Moravcova
  13. Sales forecasting and risk management under uncertainty in the media industry By Víctor Gallego; Pablo Angulo; Pablo Suárez-García; David Gómez-Ullate
  14. Feasible Shared Destiny Risk Distributions By Thibault Gajdos; John A Weymark; Claudio Zoli
  15. Model Uncertainty in Accelerated Degradation Testing Analysis By Le Liu; Xiao-Yang Li; Enrico Zio; Rui Kang; Tong-Min Jiang
  16. An approximation formula for normal implied volatility under general local stochastic volatility models By Yasaman Karami; Kenichiro Shiraya
  17. Does Smooth Ambiguity Matter for Asset Pricing? By A. Ronald Gallant; Mohammad Jahan-Parvar; Hening Liu
  18. Demographic Modeling Via 3-dimensional Markov Chains By Juan Jose Viquez; Alexander Campos; Jorge Loria; Luis Alfredo Mendoza; Jorge Aurelio Viquez
  19. Identifying "Default Thresholds" in Consumer Liabilities Using High Frequency Data By Don Schlagenhauf; Carlos Garriga
  20. The Influence of Seed Selection on the Solvency II Ratio By Quinn Culver; Dennis Heitmann; Christian Weiß
  21. Towards an integration of systems engineering and project management processes for a decision aiding purpose By Majda Lachhab; Cédrick Béler; Erlyn Solano-Charris; Thierry Coudert

  1. By: Kévin Spinassou (LC2S - Laboratoire Caribéen de Sciences Sociales - CNRS - Centre National de la Recherche Scientifique - Université des Antilles (Pôle Martinique) - UA - Université des Antilles); Leo Indra Wardhana (Universitas Gadjah Mada)
    Abstract: This paper theoretically examines the impact of capital requirements on Islamic banks. Given the widespread use of profit-sharing investment accounts (PSIA) in Islamic banking and the recent implementation of the Basel III capital framework, we develop a simple model where banks are able to offer PSIA contracts under a regulation applying risk-weighted capital ratios and leverage ratio restrictions. We find that banks with high or low returns on assets prefer "conventional" banking, while banks with intermediate returns on assets operate as Islamic banks, by selecting PSIA instead of deposits. We further highlight that capital requirements tend to increase this incentive to opt for Islamic banking, especially when Islamic banks benefit from a less competitive environment and from a locally tailored capital regulation.
    Keywords: Basel III, profit-sharing investment accounts, Islamic finance, bank capital regulation, IFSB
    Date: 2018–01–02
    URL: http://d.repec.org/n?u=RePEc:hal:wpaper:hal-01674376&r=rmg
  2. By: Shigeyuki Hamori (Graduate School of Economics, Kobe University); Minami Kawai (Department of Economics, Kobe University); Takahiro Kume (Department of Economics, Kobe University); Yuji Murakami (Department of Economics, Kobe University); Chikara Watanabe (Department of Economics, Kobe University)
    Abstract: Proper credit risk management is essential for lending institutions, as substantial losses can be incurred when borrowers default. Consequently, statistical methods that can measure and analyze credit risk objectively are becoming increasingly important. This study analyzed default payment data from Taiwan and compared the prediction accuracy and classification ability of three ensemble learning methods—specifically, Bagging, Random Forest, and Boosting—with those of various neural network methods, each with a different activation function. The results indicate that Boosting has a high prediction accuracy, whereas that of Bagging and Random Forest is relatively low. They also indicate that the prediction accuracy and classification performance of Boosting are better than those of deep neural networks, Bagging, and Random Forest.
    Keywords: credit risk; ensemble learning; deep learning; bagging; random forest; boosting; deep neural network.
    Date: 2018–01
    URL: http://d.repec.org/n?u=RePEc:koe:wpaper:1802&r=rmg
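    A minimal Python sketch of the kind of classifier comparison this abstract describes, using scikit-learn on synthetic data in place of the Taiwan default-payment data set; the features, labels and hyperparameters are placeholders, not the authors' setup.
      import numpy as np
      from sklearn.model_selection import train_test_split
      from sklearn.ensemble import BaggingClassifier, RandomForestClassifier, GradientBoostingClassifier
      from sklearn.neural_network import MLPClassifier
      from sklearn.metrics import accuracy_score, roc_auc_score

      rng = np.random.default_rng(0)
      X = rng.normal(size=(5000, 23))                # 23 borrower features, as in the Taiwan data
      y = (X[:, 0] + 0.5 * X[:, 1] + rng.normal(size=5000) > 1.5).astype(int)  # synthetic default flag
      X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)

      models = {
          "bagging": BaggingClassifier(n_estimators=200, random_state=0),
          "random_forest": RandomForestClassifier(n_estimators=200, random_state=0),
          "boosting": GradientBoostingClassifier(n_estimators=200, random_state=0),
          "deep_nn": MLPClassifier(hidden_layer_sizes=(64, 32, 16), max_iter=500, random_state=0),
      }
      for name, model in models.items():
          model.fit(X_tr, y_tr)
          proba = model.predict_proba(X_te)[:, 1]
          print(name, accuracy_score(y_te, model.predict(X_te)), roc_auc_score(y_te, proba))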
  3. By: Manh Ha Nguyen (LEMNA - Laboratoire d'économie et de management de Nantes Atlantique - UN - Université de Nantes); Olivier Darné (LEMNA - Laboratoire d'économie et de management de Nantes Atlantique - UN - Université de Nantes)
    Abstract: This paper analyzes volatility models and their risk forecasting abilities in the presence of jumps for the Vietnam Stock Exchange (VSE). We apply GARCH-type models, which capture short and long memory and the leverage effect, estimated from both raw and filtered returns. The data sample covers two VSE indexes, the VN index and the HNX index, provided by the Ho Chi Minh City Stock Exchange (HOSE) and the Hanoi Stock Exchange (HNX), respectively, over the period 2007–2015. The empirical results reveal that the FIAPARCH model is the most suitable model for both the VN index and the HNX index.
    Keywords: Vietnam Stock Exchange, volatility, GARCH models, Value-at-Risk
    Date: 2018–01–09
    URL: http://d.repec.org/n?u=RePEc:hal:wpaper:halshs-01679456&r=rmg
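    For illustration, a one-day Value-at-Risk forecast from a GARCH(1,1)-t model fitted with the Python `arch` package; FIAPARCH itself is not available there, so this is a simpler stand-in, and the return series is synthetic rather than the VN or HNX index.
      import numpy as np
      from scipy.stats import t
      from arch import arch_model

      rng = np.random.default_rng(1)
      returns = rng.standard_t(df=5, size=2000)                  # toy daily returns, in percent

      res = arch_model(returns, vol="Garch", p=1, q=1, dist="t").fit(disp="off")
      fc = res.forecast(horizon=1)
      mu = fc.mean.values[-1, 0]
      sigma = np.sqrt(fc.variance.values[-1, 0])
      nu = res.params["nu"]
      q01 = t.ppf(0.01, nu) * np.sqrt((nu - 2) / nu)             # 1% quantile of the standardized t
      print("one-day 99% VaR:", mu + sigma * q01)                # a negative number marks the loss threshold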
  4. By: Tetiana Davydiuk (Wharton School, University of Pennsylvania)
    Abstract: The Basel III Accord requires countercyclical capital buffers to protect the banking system against potential losses associated with excessive credit growth and buildups of systemic risk. In this paper, I provide a rationale for time-varying capital requirements in a dynamic general equilibrium setting. An optimal policy trades off reduced inefficient lending with reduced liquidity provision. Quantitatively, I find that the optimal Ramsey policy requires a capital ratio that mostly varies between 4% and 6% and depends on economic growth, bank supply of credit, and asset prices. Specifically, a one standard deviation increase in the bank credit-to-GDP ratio (GDP) translates into a 0.1% (0.7%) increase in capital requirements, while each standard deviation increase in the liquidity premium leads to a 0.1% decrease. The welfare gain from implementing this dynamic policy is large when compared to the gain from having an optimal fixed capital requirement.
    Date: 2017
    URL: http://d.repec.org/n?u=RePEc:red:sed017:1328&r=rmg
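    A back-of-the-envelope reading of the quoted elasticities as a countercyclical capital rule; the 5% anchor and the use of z-scores are assumptions made only to keep the example inside the 4-6% range mentioned above.
      # Capital requirement responds to standardized (z-score) state variables:
      # +0.1pp per std. dev. of the credit-to-GDP ratio, +0.7pp per std. dev. of GDP,
      # -0.1pp per std. dev. of the liquidity premium, around an assumed 5% anchor.
      def capital_requirement(z_credit_to_gdp, z_gdp, z_liquidity_premium, anchor=0.05):
          return (anchor
                  + 0.001 * z_credit_to_gdp
                  + 0.007 * z_gdp
                  - 0.001 * z_liquidity_premium)

      print(capital_requirement(z_credit_to_gdp=1.0, z_gdp=0.5, z_liquidity_premium=-0.5))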
  5. By: Xiu Xu; Wolfgang K. Härdle; Cathy Yi-Hsuan Chen
    Abstract: Systemically important banks are connected and have dynamic dependencies in their default probabilities. Extracting default factors from cross-sectional credit default swap (CDS) curves allows us to analyze the shape and the dynamics of default probabilities. Extending the Dynamic Nelson-Siegel (DNS) model, we propose a network DNS model to analyze the interconnectedness of default factors in a dynamic fashion and to forecast CDS curves. The extracted level factors, representing long-term default risk, demonstrate 85.5% total connectedness, while the slope and curvature factors document 79.72% and 62.94% total connectedness for short-term and medium-term default risk, respectively. The issues of default spillover and systemic risk should be weighed by market participants with longer credit exposures and by regulators with a mission to stabilize financial markets. US banks contribute more to long-run default spillover before 2012, whereas European banks are the major default transmitters during and after the European debt crisis, in both the long run and the short run. The outperformance of the network DNS model indicates that predicting the CDS curve requires network information.
    Keywords: CDS, network, default risk, variance decomposition, risk management
    JEL: C32 C51 G17
    Date: 2016–08
    URL: http://d.repec.org/n?u=RePEc:hum:wpaper:sfb649dp2016-059&r=rmg
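    A minimal cross-sectional Nelson-Siegel fit for a single day of one bank's CDS curve, recovering the level, slope and curvature factors that the network DNS model then connects across banks; the maturities, spreads and decay parameter below are illustrative, not taken from the paper.
      import numpy as np

      maturities = np.array([1, 2, 3, 5, 7, 10], dtype=float)        # years
      spreads = np.array([45, 60, 72, 90, 100, 110], dtype=float)     # CDS spreads in basis points
      lam = 0.6                                                       # assumed Nelson-Siegel decay parameter

      x = lam * maturities
      slope_loading = (1 - np.exp(-x)) / x
      curvature_loading = slope_loading - np.exp(-x)
      loadings = np.column_stack([np.ones_like(x), slope_loading, curvature_loading])

      level, slope, curvature = np.linalg.lstsq(loadings, spreads, rcond=None)[0]
      print(level, slope, curvature)    # factors whose dynamics and spillovers the paper then models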
  6. By: Catalina Bolancé (Research group–IREA, Av. Diagonal 696, 08034 Barcelona, Spain); Raluca Vernic (Faculty of Mathematics and Informatics, Ovidius University of Constanta, Bd Mamaia 124, 900527 Constanta, Romania)
    Abstract: Starting from the question: “What is the accident risk of an insured?”, this paper considers a multivariate approach by taking into account three types of accident risks and the possible dependence between them. Driven by a real data set, we propose three trivariate Sarmanov distributions with generalized linear models (GLMs) for the marginals and incorporate various individual characteristics of the policyholders by means of explanatory variables. Since the data set was collected over a long time period (10 years), we also add each individual’s exposure to risk. To estimate the parameters of the three Sarmanov distributions, we analyze a pseudo-maximum-likelihood method. Finally, the three models are compared numerically with the simpler trivariate Negative Binomial GLM.
    Keywords: Multivariate counting distribution, Sarmanov distribution, Negative Binomial distribution, Generalized Linear Model, ML estimation algorithm
    JEL: C51 G22
    Date: 2017–10
    URL: http://d.repec.org/n?u=RePEc:ira:wpaper:201718&r=rmg
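    A sketch of the simpler benchmark named in the abstract: independent negative binomial GLMs for three claim-count types with years of exposure as an offset, via statsmodels. The data frame and covariates are invented, and the Sarmanov dependence layer the paper adds on top of these marginals is not reproduced.
      import numpy as np
      import pandas as pd
      import statsmodels.api as sm

      rng = np.random.default_rng(2)
      n = 1000
      df = pd.DataFrame({
          "age": rng.integers(18, 80, n),
          "urban": rng.integers(0, 2, n),
          "exposure": rng.uniform(1, 10, n),            # years each policyholder is observed
          "counts_motor": rng.poisson(0.3, n),
          "counts_home": rng.poisson(0.1, n),
          "counts_health": rng.poisson(0.2, n),
      })
      X = sm.add_constant(df[["age", "urban"]].astype(float))

      for col in ["counts_motor", "counts_home", "counts_health"]:
          fit = sm.GLM(df[col], X, family=sm.families.NegativeBinomial(alpha=1.0),
                       exposure=df["exposure"]).fit()
          print(col, fit.params.values)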
  7. By: Simona Malovana (Institute of Economic Studies, Faculty of Social Sciences, Charles University in Prague, Smetanovo nabrezi 6, 111 01 Prague 1, Czech Republic; Czech National Bank, Na Prikope 28, 115 03 Prague 1, Czech Republic)
    Abstract: Banks in the Czech Republic maintain their regulatory capital ratios well above the level required by their regulator. This paper discusses the main reasons for this capital surplus and analyses the impact of additional capital requirements stemming from capital buffers and Pillar 2 add-ons on the capital ratios of banks holding such extra capital. The results provide evidence that banks shrink their capital surplus in response to higher capital requirements. A substantial portion of this adjustment seems to be delivered through changes in average risk weights. For this and other reasons, it is desirable to regularly assess whether the evolution and current level of risk weights give rise to any risk of underestimating the necessary level of capital.
    Keywords: Banks, capital requirements, capital surplus, panel data, partial adjustment model
    JEL: G21 G28 G32
    Date: 2017–12
    URL: http://d.repec.org/n?u=RePEc:fau:wpaper:wp2017_28&r=rmg
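    A stylized partial adjustment regression in the spirit of the paper: the change in a bank's capital ratio responds to the gap between the (moving) requirement and last period's ratio, so the slope estimates the adjustment speed and the intercept that speed times the desired surplus. All series are simulated.
      import numpy as np
      import statsmodels.api as sm

      rng = np.random.default_rng(3)
      T, speed, surplus = 200, 0.3, 0.04
      requirement = 0.08 + 0.0005 * rng.standard_normal(T).cumsum()   # slowly moving requirement
      capital = np.empty(T)
      capital[0] = 0.13
      for t in range(1, T):
          target = requirement[t] + surplus
          capital[t] = capital[t - 1] + speed * (target - capital[t - 1]) + 0.002 * rng.standard_normal()

      d_capital = np.diff(capital)
      gap = requirement[1:] - capital[:-1]
      res = sm.OLS(d_capital, sm.add_constant(gap)).fit()
      print(res.params)     # approximately [speed * surplus, speed]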
  8. By: Joseph Hughes (Rutgers University); Choon-Geol Moon (Hanyang University)
    Abstract: We develop a novel technique to decompose banks’ ratio of nonperforming loans to total loans into three components: first, a minimum ratio that represents best-practice lending given the volume and composition of a bank’s loans, the average contractual interest rate charged on these loans, and market conditions such as the average GDP growth rate and market concentration; second, a ratio that represents the bank’s proficiency at loan making, computed as the difference between the bank’s observed ratio of nonperforming loans, adjusted for statistical noise, and the best-practice minimum ratio; and third, statistical noise. The best-practice ratio of nonperforming loans, the ratio a bank would experience if it were fully efficient at credit-risk evaluation and loan monitoring, represents the inherent credit risk of the loan portfolio and is estimated by a stochastic frontier technique. We apply the technique to 2013 data on top-tier U.S. bank holding companies, which we divide into five size groups. The largest banks, with consolidated assets exceeding $250 billion, experience the highest ratio of nonperformance among the five groups. Moreover, the inherent credit risk of their lending is the highest among the five groups. On the other hand, their inefficiency at lending is one of the lowest among the five. Thus, the high ratio of nonperformance of the largest financial institutions appears to result from lending to riskier borrowers, not inefficiency at lending. Small community banks under $1 billion also exhibit higher inherent credit risk than all other size groups except the largest banks. In contrast, their loan-making inefficiency is the highest among the five size groups. Restricting the sample to publicly traded bank holding companies and gauging financial performance by market value, we find the ratio of nonperforming loans to total loans is on average negatively related to financial performance except at the largest banks. When nonperformance, adjusted for statistical noise, is decomposed into inherent credit risk and lending inefficiency, taking more inherent credit risk enhances market value at many more large banks, while lending inefficiency is negatively related to market value at all banks. Market discipline appears to reward riskier lending at large banks and discourage lending inefficiency at all banks.
    Keywords: commercial banking, credit risk, nonperforming loans, lending efficiency
    JEL: G21 L25 C58
    Date: 2018–01–16
    URL: http://d.repec.org/n?u=RePEc:rut:rutres:201802&r=rmg
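    A simplified corrected-OLS stand-in for the stochastic frontier decomposition described above: regress the nonperforming-loan ratio on loan-mix and market controls, shift the fitted line down to the best performer to obtain a "best-practice" ratio, and treat the remainder as lending inefficiency. Data are simulated, and the paper's actual estimator is a stochastic frontier, not this shortcut.
      import numpy as np

      rng = np.random.default_rng(4)
      n = 500
      X = np.column_stack([np.ones(n),
                           rng.uniform(0, 1, n),      # share of riskier loan categories
                           rng.uniform(2, 8, n)])     # average contractual rate, percent
      inefficiency = rng.exponential(0.5, n)
      npl_ratio = X @ np.array([0.5, 3.0, 0.4]) + inefficiency + 0.1 * rng.standard_normal(n)

      beta_hat = np.linalg.lstsq(X, npl_ratio, rcond=None)[0]
      residuals = npl_ratio - X @ beta_hat
      best_practice = X @ beta_hat + residuals.min()   # frontier shifted down to the best observation
      estimated_inefficiency = npl_ratio - best_practice
      print(estimated_inefficiency[:5])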
  9. By: Lucia Alessi (European Commission - JRC); Giuseppina Cannas (European Commission - JRC); Sara Maccaferri (European Commission - JRC); Marco Petracco Giudici (European Commission - JRC)
    Abstract: In November 2015, the European Commission adopted a legislative proposal to set up a European Deposit Insurance Scheme (EDIS), a single deposit insurance system for all bank deposits in the Banking Union. The JRC was requested to quantitatively assess the effectiveness of introducing a single deposit insurance scheme and to compare it with alternative options for the set-up of such insurance at the European level. The JRC compared national Deposit Guarantee Schemes and EDIS based on their respective ability to cover insured deposits in the face of a banking crisis. The analyses are based on the results of SYMBOL model simulations of bank failures.
    Keywords: banking regulation; banking crisis; deposit insurance
    JEL: C15 G01 G21 G28
    Date: 2017–12
    URL: http://d.repec.org/n?u=RePEc:jrs:wpaper:201712&r=rmg
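    A toy Monte Carlo in the spirit of the SYMBOL exercise: simulate correlated asset losses for a set of banks, record failures where losses exceed capital, and check whether a deposit insurance fund of a given size covers the insured deposits of failed banks. All balance-sheet figures are invented; the actual SYMBOL model calibrates bank-level default probabilities from supervisory data.
      import numpy as np

      rng = np.random.default_rng(5)
      n_banks, n_sims = 50, 10_000
      assets = rng.uniform(1, 100, n_banks)                 # bn EUR
      capital = 0.08 * assets
      insured_deposits = 0.4 * assets
      fund_size = 0.008 * insured_deposits.sum()            # fund at 0.8% of covered deposits

      common = rng.standard_normal((n_sims, 1))             # systematic factor shared by all banks
      idio = rng.standard_normal((n_sims, n_banks))
      loss_rate = 0.04 * np.maximum(0.5 * common + np.sqrt(0.75) * idio, 0)
      losses = loss_rate * assets

      failed = losses > capital
      payout = (failed * insured_deposits).sum(axis=1)
      print("probability the fund is exhausted:", (payout > fund_size).mean())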
  10. By: Song, Zheng (Michael); Xiong, Wei
    Abstract: Motivated by growing concerns about the risks and instability of China’s financial system, this article reviews several commonly perceived financial risks and discusses their roots in China’s politico-economic institutions. We emphasize the need to evaluate these risks within China’s unique economic and financial systems, in which the state and non-state sectors coexist and the financial system serves as a key tool of the government to fund its economic policies. Overall, we argue that: (1) a financial crisis is unlikely to happen in the near future, and (2) the ultimate risk lies with China’s economic growth, as a vicious circle of distortions in the financial system lowers the efficiency of capital allocation, slows economic growth, and will eventually exacerbate financial risks in the long run.
    JEL: E02 G01
    Date: 2018–01–17
    URL: http://d.repec.org/n?u=RePEc:bof:bofitp:2018_001&r=rmg
  11. By: Ruenzi, Stefan; Weigert, Florian
    Abstract: This paper proposes a risk-based explanation of the momentum anomaly in equity markets. Regressing the momentum strategy return on the return of a self-financing portfolio going long (short) in stocks with high (low) crash sensitivity in the USA from 1963 to 2012 reduces the momentum effect from a highly statistically significant 11.94% to an insignificant 1.84%. We find additional supportive out-of-sample evidence for our risk-based momentum explanation in a sample of 23 international equity markets.
    Keywords: Asset pricing, asymmetric dependence, copulas, crash sensitivity, momentum, tail risk
    JEL: C12 G01 G11 G12 G17
    Date: 2017–12
    URL: http://d.repec.org/n?u=RePEc:usg:sfwpfi:2018:01&r=rmg
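    The spanning regression behind the figures quoted above, in sketch form: monthly momentum returns are regressed on the return of a long-short crash-sensitivity portfolio, and the annualized intercept is read as the unexplained momentum premium. Both return series here are simulated stand-ins.
      import numpy as np
      import statsmodels.api as sm

      rng = np.random.default_rng(6)
      n_months = 600
      crash_factor = 0.01 * rng.standard_normal(n_months)        # long high-, short low-crash-sensitivity stocks
      momentum = 0.002 + 1.2 * crash_factor + 0.02 * rng.standard_normal(n_months)

      res = sm.OLS(momentum, sm.add_constant(crash_factor)).fit()
      print("annualized alpha:", 12 * res.params[0], "t-stat:", res.tvalues[0])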
  12. By: Evzen Kocenda (Institute of Economic Studies, Faculty of Social Sciences, Charles University in Prague, Smetanovo nabrezi 6, 111 01 Prague 1, Czech Republic; CESifo, Munich; IOS, Regensburg); Michala Moravcova (Institute of Economic Studies, Faculty of Social Sciences, Charles University in Prague, Smetanovo nabrezi 6, 111 01 Prague 1, Czech Republic)
    Abstract: We analyze time-varying exchange rate co-movements and volatility spillovers between the Czech koruna, the Polish zloty, the Hungarian forint and the dollar/euro from 1999 to 2016. We apply the dynamic conditional correlations (DCC) model and the Diebold-Yilmaz spillover index to examine the periods prior to and during the GFC, plus during and after the EU debt crisis. We find declining conditional correlations between the new EU exchange rates prior to both crises. During the GFC and the European debt crisis, the correlations reach their lowest levels and increase afterwards. Based on the DCC model results we calculate portfolio weights and hedge ratios. We show that during both crises portfolio diversification benefits increase, but hedging costs rise as well. Based on the spillover index we document that during calm periods most of the volatility is due to each currency’s own history. However, during distress periods volatility spillovers among currencies increase substantially.
    Keywords: Exchange rate, New EU forex markets, volatility, DCC model, volatility spillover index, EU debt crisis, global financial crisis
    JEL: C52 F31 F36 G15 P59
    Date: 2017–11
    URL: http://d.repec.org/n?u=RePEc:fau:wpaper:wp2017_27&r=rmg
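    The portfolio weights and hedge ratios mentioned above follow standard formulas (Kroner-Ng weights and Kroner-Sultan hedge ratios) applied to the conditional variances and covariances produced by the DCC model; the numbers below are placeholders for one day's estimates.
      import numpy as np

      def portfolio_weight(h11, h22, h12):
          """Weight of asset 1 in a two-asset portfolio, clipped to [0, 1] (Kroner-Ng)."""
          return float(np.clip((h22 - h12) / (h11 - 2 * h12 + h22), 0.0, 1.0))

      def hedge_ratio(h12, h22):
          """Units of asset 2 to short per long unit of asset 1 (Kroner-Sultan)."""
          return h12 / h22

      # Example conditional variances/covariance, e.g. CZK/EUR (asset 1) versus PLN/EUR (asset 2):
      h11, h22, h12 = 0.30, 0.45, 0.20
      print(portfolio_weight(h11, h22, h12), hedge_ratio(h12, h22))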
  13. By: Víctor Gallego; Pablo Angulo; Pablo Suárez-García; David Gómez-Ullate
    Abstract: In this work we propose a data-driven modelling approach for managing the advertising investments of a firm. First, we propose an application of dynamic linear models to the prediction of an economic variable, such as global sales, which can use information from the environment and the company’s investment levels in different channels. Once a robust and precise model is built, we propose a risk metric that can help the firm manage its advertising plans, leading to a robust, risk-aware optimization of its revenue.
    Date: 2018–01
    URL: http://d.repec.org/n?u=RePEc:arx:papers:1801.03050&r=rmg
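    A minimal dynamic linear model of the kind described: a local-level state for baseline sales plus a regression effect of advertising spend, estimated by Kalman filtering through statsmodels. The series are simulated, and the paper's risk metric would then be computed from the predictive distribution shown at the end.
      import numpy as np
      import statsmodels.api as sm

      rng = np.random.default_rng(7)
      T = 200
      ad_spend = rng.uniform(0, 1, T)
      level = 10 + np.cumsum(0.1 * rng.standard_normal(T))
      sales = level + 2.0 * ad_spend + 0.5 * rng.standard_normal(T)

      model = sm.tsa.UnobservedComponents(sales, level="local level", exog=ad_spend)
      res = model.fit(disp=False)
      forecast = res.get_forecast(steps=4, exog=np.full((4, 1), 0.5))
      print(forecast.predicted_mean)
      print(forecast.conf_int(alpha=0.05))      # a simple predictive band to feed a risk metric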
  14. By: Thibault Gajdos (CNRS, Aix-Marseille University); John A Weymark (Vanderbilt University); Claudio Zoli (University of Verona)
    Abstract: Social risk equity is concerned with the comparative evaluation of social risk distributions, which are probability distributions over the potential sets of fatalities. In the approach to the evaluation of social risk equity introduced by Gajdos, Weymark, and Zoli (Shared destinies and the measurement of social risk equity, Annals of Operations Research 176:409-424, 2010), the only information about such a distribution that is used in the evaluation is that contained in a shared destiny risk matrix whose entry in the kth row and ith column is the probability that person i dies in a group containing k individuals. Such a matrix is admissible if it satisfies a set of restrictions implied by its definition. It is feasible if it can be generated by a social risk distribution. It is shown that admissibility is equivalent to feasibility. Admissibility is much easier to verify directly than feasibility, so this result provides a simple way to identify which matrices to consider when the objective is to socially rank the feasible shared destiny risk matrices.
    Keywords: social risk evaluation, social risk equity, public risk, shared destinies
    JEL: D6 D8
    Date: 2018–01–16
    URL: http://d.repec.org/n?u=RePEc:van:wpaper:vuecon-sub-18-00002&r=rmg
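    The feasibility direction of the result is easy to make concrete: given a social risk distribution (a probability for each potential set of fatalities), the shared destiny risk matrix is built by summing, for each person i and group size k, the probabilities of the fatality sets of size k that contain i. The toy distribution below is invented.
      import numpy as np

      n = 3
      # Social risk distribution over potential sets of fatalities (probabilities sum to 1).
      distribution = {
          frozenset(): 0.5,
          frozenset({0}): 0.2,
          frozenset({1, 2}): 0.2,
          frozenset({0, 1, 2}): 0.1,
      }

      matrix = np.zeros((n, n))      # row k-1, column i: P(person i dies in a group of size k)
      for fatalities, prob in distribution.items():
          k = len(fatalities)
          for i in fatalities:
              matrix[k - 1, i] += prob
      print(matrix)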
  15. By: Le Liu; Xiao-Yang Li; Enrico Zio (LGI - Laboratoire Génie Industriel - EA 2606 - CentraleSupélec); Rui Kang; Tong-Min Jiang
    Abstract: In accelerated degradation testing (ADT), test data from higher than normal stress conditions are used to find stochastic models of degradation, e.g., Wiener process, Gamma process, and inverse Gaussian process models. In general, the selection of the degradation model is made with reference to one specific product, and no consideration is given to model uncertainty. In this paper, we address this issue and apply the Bayesian model averaging (BMA) method to constant-stress ADT. For illustration, stress relaxation ADT data are analyzed. We also carry out a simulation study to compare the s-credibility intervals of single models and BMA. The results show that degradation model uncertainty has significant effects on the p-quantile lifetime at the use conditions, especially for extreme quantiles. BMA can capture this uncertainty well and compute compromise s-credibility intervals with the highest coverage probability at each quantile.
    Keywords: Accelerated aging, Bayesian methods, degradation, stochastic processes, uncertainty
    Date: 2017–09
    URL: http://d.repec.org/n?u=RePEc:hal:journl:hal-01652218&r=rmg
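    A sketch of averaging over candidate degradation-increment models: normal increments (Wiener process) versus gamma increments (Gamma process), combined with BIC-based weights as a common large-sample approximation to BMA posterior model probabilities. The increments are simulated, not the stress relaxation ADT data used in the paper.
      import numpy as np
      from scipy import stats

      rng = np.random.default_rng(8)
      increments = rng.gamma(shape=2.0, scale=0.05, size=200)     # equally spaced degradation increments

      def bic(loglik, n_params, n_obs):
          return n_params * np.log(n_obs) - 2 * loglik

      mu, sd = increments.mean(), increments.std()
      ll_wiener = stats.norm.logpdf(increments, mu, sd).sum()              # Wiener process -> normal increments
      a, loc, scale = stats.gamma.fit(increments, floc=0)
      ll_gamma = stats.gamma.logpdf(increments, a, loc, scale).sum()       # Gamma process -> gamma increments

      bics = np.array([bic(ll_wiener, 2, len(increments)), bic(ll_gamma, 2, len(increments))])
      weights = np.exp(-0.5 * (bics - bics.min()))
      weights /= weights.sum()
      print(dict(zip(["wiener", "gamma"], weights)))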
  16. By: Yasaman Karami (Graduate School of Economics, University of Tokyo); Kenichiro Shiraya (Graduate School of Economics, University of Tokyo)
    Abstract: We approximate normal implied volatilities by means of an asymptotic expansion method. The contribution of this paper is twofold: first, to our knowledge, this paper is the first to provide a unified approximation method for the normal implied volatility under general local stochastic volatility models. Second, we compare our method with Monte Carlo simulations using parameters calibrated to actual market data and confirm its accuracy.
    URL: http://d.repec.org/n?u=RePEc:cfi:fseres:cf427&r=rmg
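    For reference, the quantity the expansion approximates can be computed by numerically inverting the Bachelier (normal) call price; the paper's contribution is a closed-form approximation that avoids this inversion under general local stochastic volatility dynamics. The option inputs below are arbitrary.
      import numpy as np
      from scipy.optimize import brentq
      from scipy.stats import norm

      def bachelier_call(F, K, T, sigma_n):
          d = (F - K) / (sigma_n * np.sqrt(T))
          return (F - K) * norm.cdf(d) + sigma_n * np.sqrt(T) * norm.pdf(d)

      def normal_implied_vol(price, F, K, T):
          # Root-find the normal volatility that reproduces the observed price.
          return brentq(lambda s: bachelier_call(F, K, T, s) - price, 1e-10, 1.0)

      price = bachelier_call(F=0.02, K=0.025, T=1.0, sigma_n=0.006)
      print(normal_implied_vol(price, F=0.02, K=0.025, T=1.0))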
  17. By: A. Ronald Gallant; Mohammad Jahan-Parvar; Hening Liu
    Abstract: We use the Bayesian method introduced by Gallant and McCulloch (2009) to estimate consumption-based asset pricing models featuring smooth ambiguity preferences. We rely on semi-nonparametric estimation of a flexible auxiliary model in our structural estimation. Based on the market and aggregate consumption data, our estimation provides statistical support for asset pricing models with smooth ambiguity. Statistical model comparison shows that models with ambiguity, learning and time-varying volatility are preferred to the long-run risk model. We analyze asset pricing implications of the estimated models.
    Keywords: Ambiguity ; Bayesian estimation ; equity premium ; Markov-switching ; long-run risk
    JEL: C61 D81 G11 G12
    Date: 2018–01
    URL: http://d.repec.org/n?u=RePEc:fip:fedgif:1221&r=rmg
  18. By: Juan Jose Viquez; Alexander Campos; Jorge Loria; Luis Alfredo Mendoza; Jorge Aurelio Viquez
    Abstract: This article presents a new model for demographic simulation which can be used to forecast and estimate the number of people in pension funds (contributors and retirees) as well as workers in a public institution. Furthermore, the model makes it possible to quantify the financial flows coming from future populations, such as salaries, contributions, salary supplements, and employer contributions to savings/pensions, among others. The implementation of this probabilistic model will be of great value in the actuarial toolbox, increasing the reliability of estimations and allowing deeper demographic and financial analysis given the reach of the model. We introduce the mathematical model, its first moments, and how to adjust the required probabilities, and conclude with an example where the model was applied to a public institution with real data.
    Date: 2017–12
    URL: http://d.repec.org/n?u=RePEc:arx:papers:1801.04841&r=rmg
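    A toy one-dimensional slice of the kind of projection described: a population vector over the states active, retired and dead is pushed forward with an annual transition matrix. The paper's chains are three-dimensional (tracking additional attributes such as age and seniority), and the probabilities below are invented.
      import numpy as np

      states = ["active", "retired", "dead"]
      P = np.array([
          [0.93, 0.05, 0.02],    # active  -> active / retired / dead
          [0.00, 0.96, 0.04],    # retired -> retired / dead
          [0.00, 0.00, 1.00],    # dead is absorbing
      ])

      population = np.array([10_000.0, 2_000.0, 0.0])
      for year in range(1, 11):
          population = population @ P
          print(year, dict(zip(states, np.round(population, 1))))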
  19. By: Don Schlagenhauf (Federal Reserve Bank of St. Louis); Carlos Garriga (Federal Reserve Bank of St. Louis)
    Abstract: The concept of a "default threshold" captures the notion of a level of debt that is not sustainable and results in default. This paper constructs different measures based on the dynamics of the ratio of monthly debt payments to after-tax income. A preliminary examination using data from the Consumer Credit Panel suggests that some of these measures have predictive content when compared to alternative measures based on FICO scores. A quantitative model of default behavior is constructed to replicate the dynamic patterns observed in the data.
    Date: 2017
    URL: http://d.repec.org/n?u=RePEc:red:sed017:1305&r=rmg
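    The basic ingredient of these measures is simple to compute: the monthly debt-payment to after-tax-income ratio and a flag for crossing a candidate threshold. The 40% cutoff and the figures below are purely illustrative; the paper derives its thresholds from Consumer Credit Panel dynamics.
      import pandas as pd

      df = pd.DataFrame({
          "monthly_debt_payment": [900, 950, 1100, 1300, 1250],
          "after_tax_income":     [3000, 3000, 2800, 2600, 2600],
      })
      df["payment_to_income"] = df["monthly_debt_payment"] / df["after_tax_income"]
      df["above_threshold"] = df["payment_to_income"] > 0.40     # candidate default threshold
      print(df)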
  20. By: Quinn Culver; Dennis Heitmann; Christian Weiß
    Abstract: This article contains the first published example of a real economic balance sheet where the Solvency II ratio substantially depends on the seed selected for the random number generator (RNG) used. The theoretical background and the main quality criteria for RNGs are explained in detail. To serve as a gauge for RNGs, a definition of true randomness is given. Quality tests that RNGs should pass in order to generate stable results when used in risk management under Solvency II are described.
    Date: 2018–01
    URL: http://d.repec.org/n?u=RePEc:arx:papers:1801.05409&r=rmg
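    A minimal illustration of the seed effect the paper documents: when the Solvency Capital Requirement is a Monte Carlo estimate of the 99.5% loss quantile, the Solvency II ratio (own funds over SCR) moves visibly with the seed unless the scenario count is large. All figures below are invented.
      import numpy as np

      own_funds = 120.0
      n_scenarios = 10_000

      def solvency_ratio(seed):
          rng = np.random.default_rng(seed)
          losses = rng.lognormal(mean=3.5, sigma=0.8, size=n_scenarios)   # simulated one-year losses
          scr = np.quantile(losses, 0.995)                                # 99.5% Value-at-Risk
          return own_funds / scr

      print([round(solvency_ratio(seed), 3) for seed in range(5)])        # the ratio shifts with the seed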
  21. By: Majda Lachhab (LGP - Laboratoire Génie de Production - Ecole Nationale d'Ingénieurs de Tarbes); Cédrick Béler (LGP - Laboratoire Génie de Production - Ecole Nationale d'Ingénieurs de Tarbes); Erlyn Solano-Charris (Universidad de la Sabana); Thierry Coudert (LGP - Laboratoire Génie de Production - Ecole Nationale d'Ingénieurs de Tarbes)
    Abstract: This article proposes an integrated process that combines systems engineering processes with project management processes. These processes are defined according to industrial standard processes existing in the literature. The main idea is to define a common information model enabling the federation of all the points of view of the different actors with regard to systems engineering, project time management, project cost management and project risk management. The resulting integrated project graph encompasses all the scenarios established after defining the coupling points between those processes. The definition of the graph also draws on available knowledge and capitalized experience resulting from experience feedback on previous projects. Scenario selection is then optimized using a decision-aiding tool that builds a panel of Pareto-optimal solutions, taking into account uncertainties on project objectives (cost and duration). This tool also enables the decision-maker to select a scenario according to an acceptable level of risk. The integrated process, the optimization tool based on ant colony optimization (ACO) and the decision-making method are described in the paper.
    Keywords: Project management, Systems engineering, Process integration, Risk, Uncertainty, Ant colony optimization, Decision aiding
    Date: 2017–07–09
    URL: http://d.repec.org/n?u=RePEc:hal:journl:hal-01658004&r=rmg
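    One small piece of the decision-aiding step, in sketch form: filtering candidate project scenarios down to the Pareto-optimal set on expected cost and expected duration. The ant colony search that generates the candidates in the paper is not reproduced, and the scenario list is invented.
      from typing import List, Tuple

      def pareto_front(scenarios: List[Tuple[str, float, float]]) -> List[Tuple[str, float, float]]:
          """Keep scenarios not dominated on (cost, duration); lower is better on both."""
          front = []
          for name, cost, duration in scenarios:
              dominated = any(c <= cost and d <= duration and (c < cost or d < duration)
                              for _, c, d in scenarios)
              if not dominated:
                  front.append((name, cost, duration))
          return front

      scenarios = [("A", 100.0, 12.0), ("B", 90.0, 14.0), ("C", 105.0, 13.0), ("D", 92.0, 14.5)]
      print(pareto_front(scenarios))     # C and D are dominated, leaving A and B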

General information on the NEP project can be found at https://nep.repec.org. For comments please write to the director of NEP, Marco Novarese at <director@nep.repec.org>. Put “NEP” in the subject, otherwise your mail may be rejected.
NEP’s infrastructure is sponsored by the School of Economics and Finance of Massey University in New Zealand.