on Computational Economics
Issue of 2005‒05‒07
eleven papers chosen by
By: | Sophia Levtchenkova; Jeff Petchey |
Abstract: | In this paper, a simulation model is constructed that would enable policy makers in transitional economies to estimate regionally disparate capital needs and direct capital expenditures to improve standards of public capital or services such as education and health. The model allows policy makers to achieve some degree of ‘equalisation’ in the regional distribution of publicly supplied capital, allowing citizens greater equality of access to services regardless of location. After developing an appropriate input database, the model is applied to South Africa. The results show that South Africa would need to commit about 2 percent of GDP to supplementary public capital expenditures if it is to make substantial inroads into attaining some form of public infrastructure equalisation over time. |
Keywords: | transitional economies, capital expenditure, PIM, equalisation, infrastructure backlogs, public expenditure |
Date: | 2004–11–01 |
URL: | http://d.repec.org/n?u=RePEc:ays:ispwps:paper0414&r=cmp |
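The paper's keywords mention the perpetual inventory method (PIM), the standard way capital-stock series of this kind are constructed. A minimal sketch of that accumulation rule; the depreciation rate and investment figures below are invented for illustration, not taken from the paper or from South African data.

```python
# Perpetual inventory method (PIM) sketch: the capital stock is rolled
# forward as K_t = (1 - d) * K_{t-1} + I_t, where d is the depreciation
# rate and I_t is gross investment. All numbers here are hypothetical.

def capital_stock(investments, depreciation_rate, initial_stock=0.0):
    """Return the capital-stock path implied by an investment series."""
    stock = initial_stock
    path = []
    for inv in investments:
        stock = (1.0 - depreciation_rate) * stock + inv
        path.append(stock)
    return path

# A region investing 10 units a year at 5% depreciation converges
# toward a steady-state stock of 10 / 0.05 = 200.
path = capital_stock([10.0] * 200, depreciation_rate=0.05)
print(round(path[-1], 1))
```

Regional capital "backlogs" of the kind the model equalises can then be read off as gaps between each region's PIM stock and a target standard.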
By: | James Alm (Andrew Young School of Policy Studies, Georgia State University); Sally Wallace (Andrew Young School of Policy Studies, Georgia State University) |
Abstract: | This staff paper analyzes Jamaica’s “system” of payroll taxes and contributions, focusing mainly on the tax and contribution side rather than on the benefit aspects of the contribution programs. The administration of each of these payroll programs is discussed, and the effects of the entire system are also analyzed. Much of the analysis is based on microsimulation models developed in the course of this tax reform project. |
Keywords: | Jamaica, Payroll Taxes |
Date: | 2004–12–01 |
URL: | http://d.repec.org/n?u=RePEc:ays:ispwps:paper0431&r=cmp |
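The microsimulation approach the abstract refers to applies tax rules record by record to individual wage data and then aggregates. A stylised sketch with an entirely hypothetical flat-rate payroll tax and wage ceiling (the actual Jamaican rates and programs are in the paper, not reproduced here):

```python
# Toy payroll-tax microsimulation: a flat rate on earnings up to a wage
# ceiling, applied to individual records. Rate and ceiling are invented.

def payroll_tax(wage, rate=0.05, ceiling=100_000.0):
    """Tax due on one wage record: flat rate on earnings up to the ceiling."""
    return rate * min(wage, ceiling)

def simulate(wages, rate=0.05, ceiling=100_000.0):
    taxes = [payroll_tax(w, rate, ceiling) for w in wages]
    revenue = sum(taxes)
    # Effective rates fall above the ceiling -- the kind of system-wide
    # distributional effect a microsimulation model makes visible.
    effective = [t / w for t, w in zip(taxes, wages)]
    return revenue, effective

wages = [20_000.0, 80_000.0, 250_000.0]
revenue, effective = simulate(wages)
print(revenue)  # 1000 + 4000 + 5000 = 10000
```

Stacking several such programs per record is what turns this into a model of the whole contribution "system".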
By: | Beers, Wim C.M. van; Kleijnen, Jack P.C. (Tilburg University, Center for Economic Research)
Abstract: | This paper proposes a novel method to select an experimental design for interpolation in random simulation, especially discrete event simulation. (Though the paper focuses on Kriging, this design approach may also apply to other types of metamodels such as linear regression models.) Assuming that simulation requires much computer time, it is important to select a design with a small number of observations (or simulation runs). The proposed method is therefore sequential. Its novelty is that it accounts for the specific input/output behavior (or response function) of the particular simulation at hand; i.e., the method is customized or application-driven. A tool for this customization is bootstrapping, which enables the estimation of the variances of predictions for inputs not yet simulated. The new method is tested through two classic simulation models: example 1 estimates the expected steady-state waiting time of the M/M/1 queueing model; example 2 estimates the mean costs of a terminating (s, S) inventory simulation. For these simulations the novel design indeed gives better results than Latin Hypercube Sampling (LHS) with a prefixed sample of the same size. |
JEL: | C0 C1 C9 C15 C44 |
Date: | 2005 |
URL: | http://d.repec.org/n?u=RePEc:dgr:kubcen:200555&r=cmp |
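The paper's example 1 estimates the expected steady-state waiting time of an M/M/1 queue. A bare-bones version of that test simulation, using the Lindley recursion and checked against the analytic value ρ/(μ − λ); the run length, rates, and seed are arbitrary choices, not the paper's settings.

```python
# M/M/1 waiting-time simulation via the Lindley recursion:
# W_{n+1} = max(0, W_n + S_n - A_{n+1}), with exponential service and
# interarrival times. The analytic steady-state mean is rho/(mu - lambda).
import random

def mm1_mean_wait(arrival_rate, service_rate, n_customers, seed=1):
    rng = random.Random(seed)
    wait, total = 0.0, 0.0
    for _ in range(n_customers):
        service = rng.expovariate(service_rate)
        interarrival = rng.expovariate(arrival_rate)
        wait = max(0.0, wait + service - interarrival)
        total += wait
    return total / n_customers

est = mm1_mean_wait(arrival_rate=0.5, service_rate=1.0, n_customers=200_000)
exact = 0.5 / (1.0 * (1.0 - 0.5))  # rho / (mu - lambda) = 1.0
print(est, exact)
```

Each (arrival rate, run length) pair is one "design point"; the paper's contribution is choosing such points sequentially, guided by bootstrapped prediction variances, rather than fixing them in advance as LHS does.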
By: | Grigoriev, Alexander; Uetz, Marc (METEOR)
Abstract: | We consider a scheduling problem where a set of jobs is distributed over parallel machines. The processing time of any job depends on the usage of a scarce renewable resource, e.g., personnel. An amount of k units of that resource can be allocated to the jobs at any time, and the more of that resource is allocated to a job, the smaller its processing time. The dependence of processing times on the amount of resources is linear for any job. The objective is to find a resource allocation and a schedule that minimizes the makespan. Utilizing an integer quadratic programming relaxation, we show how to obtain a (3+ε)-approximation algorithm for that problem, for any ε > 0. This generalizes and improves upon previous results. Our approach relies on a fully polynomial time approximation scheme to solve the quadratic programming relaxation. This result is interesting in itself, because the underlying quadratic program is NP-hard to solve in general. We also briefly discuss variants of the problem and derive lower bounds. |
Keywords: | operations research and management science |
Date: | 2005 |
URL: | http://d.repec.org/n?u=RePEc:dgr:umamet:2005014&r=cmp |
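To make the problem concrete, here is a brute-force solver for a tiny instance. Note one simplification: the resource is treated as a fixed total budget split across jobs, whereas in the paper it is renewable (k units available at every point in time). This toy merely illustrates the makespan/resource trade-off, not the paper's (3+ε)-approximation; all numbers are invented.

```python
# Toy instance: jobs with processing times linear in the resource they
# receive, assigned to parallel machines. Brute force over assignments
# and (non-renewable, simplified) resource allocations.
from itertools import product

def makespan(assignment, alloc, jobs, n_machines):
    """Maximum machine load for a given assignment and allocation."""
    loads = [0.0] * n_machines
    for machine, units, (base, saving) in zip(assignment, alloc, jobs):
        loads[machine] += max(base - saving * units, 0.0)
    return max(loads)

def best_schedule(jobs, n_machines, resource_units):
    best = float("inf")
    for assignment in product(range(n_machines), repeat=len(jobs)):
        for alloc in product(range(resource_units + 1), repeat=len(jobs)):
            if sum(alloc) <= resource_units:
                best = min(best, makespan(assignment, alloc, jobs, n_machines))
    return best

# Each job is (base processing time, time saved per resource unit).
jobs = [(4.0, 1.0), (3.0, 0.5), (5.0, 2.0)]
best = best_schedule(jobs, n_machines=2, resource_units=2)
print(best)
```

Here the optimum spends both resource units on the job with the steepest time saving, balancing the two machine loads at 4.0.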
By: | Diao, Xinshen; Diaz-Bonilla, Eugenio; Robinson, Sherman; Orden, David |
Abstract: | "This paper accomplishes two objectives. The first is to provide simulation results from a computable general equilibrium (CGE) model that have helped focus the debate about the potential effects of agricultural trade liberalization on developing countries. The aggregate numbers show modest net positive effects over a medium-term period (five years out). First, when developed countries fully remove their subsidies and trade barriers, welfare and GDP of the developing countries rise, as do value added in agricultural production and agro-industries, and agricultural exports. The focal point estimates we provide are increases in welfare and GDP of $10 billion and $15 billion, respectively, while agricultural value added increases by $23 billion and agricultural exports by $37 billion. Second, when developing countries also eliminate their subsidies and trade barriers, there is an additional net gain in aggregated developing country welfare and GDP, which now increase by nearly $20 billion and $38 billion. Thus, developing countries gain from developed country liberalization, but there are also gains from reform of their own policies. Our results suggest a fairly even balance between these sources of gains. The second and equally important contribution of the paper is to describe the heterogeneity among developing countries in terms of their agricultural resources, and to disaggregate the simulated results among 40 developing countries or regions. The basic model includes the innovation of assuming there is unemployed labor in developing countries, so growth in agricultural production has a modest "multiplier" effect. The basic model also allows for a slight positive effect of increased trade on productivity; the focal results cited above include this impact. Effects are distinguished between elimination of subsidies and trade barriers by the US, the EU, Japan and Korea, and all developed countries simultaneously. Effects on different developing countries and regions differ due to differences in the subsidy and trade barrier instruments utilized by the developed countries, the commodities affected, and the trade patterns and volumes evident in the initial baseline data." Authors' Abstract |
Keywords: | Computable general equilibrium (CGE), simulation models |
Date: | 2005 |
URL: | http://d.repec.org/n?u=RePEc:fpr:mtiddp:84&r=cmp |
By: | José Apesteguía (Departamento de Economía-UPNA); Miguel A. Ballester (Departamento de Economía-UPNA) |
Abstract: | Kalai, Rubinstein, and Spiegler (2002) propose the rationalization of choice functions that violate the “independence of irrelevant alternatives” axiom through a collection (book) of linear orders (rationales). In this paper we present an algorithm which, for any choice function, gives (i) the minimal number of rationales that rationalizes the choice function, (ii) the composition of such rationales, and (iii) information on how choice problems are related to rationales. As in the classical case, this renders the information given by a choice function completely equivalent to that given by a minimal book of rationales. We also study the structure of several choice procedures that are prominent in the literature. |
Keywords: | Rationalization, Independence of irrelevant alternatives, Order partition, Computational effort. |
URL: | http://d.repec.org/n?u=RePEc:nav:ecupna:'0501'&r=cmp |
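The object the paper computes, the minimal "book" of linear orders rationalizing a choice function, can be found by brute force on tiny instances: the chosen element from each menu must maximise at least one order in the book. The search below is exhaustive, not the paper's algorithm, and the example choice function is invented.

```python
# Minimal book of rationales, brute force: find the fewest linear orders
# such that each choice problem's selection maximises some order in the
# book (Kalai-Rubinstein-Spiegler rationalization by multiple rationales).
from itertools import permutations, combinations

def maximiser(order, menu):
    """Element of the menu ranked highest by the linear order (a tuple)."""
    return min(menu, key=order.index)

def minimal_book(alternatives, choice):
    """choice maps frozenset menus to the chosen alternative."""
    orders = list(permutations(alternatives))
    for k in range(1, len(orders) + 1):
        for book in combinations(orders, k):
            if all(any(maximiser(o, menu) == c for o in book)
                   for menu, c in choice.items()):
                return book
    return None

# A choice function violating IIA: x chosen from {x, y, z} but y from {x, y},
# so no single linear order suffices and the minimal book has two rationales.
choice = {
    frozenset({"x", "y", "z"}): "x",
    frozenset({"x", "y"}): "y",
}
book = minimal_book(("x", "y", "z"), choice)
print(len(book))
```

The factorial blow-up in `orders` is exactly why an efficient algorithm, as the paper provides, matters beyond toy examples.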
By: | Robert J. Shiller |
Abstract: | The life-cycle accounts proposal for Social Security reform has been justified by its proponents using a number of different arguments, but these arguments generally involve the assumption of a high likelihood of good returns on the accounts. A simulation is undertaken to estimate the probability distribution of returns in the accounts based on long-term historical experience. U.S. stock market, bond market and money market data for 1871–2004 are used for the analysis. Assuming that future returns behave like historical data, it is found that a baseline personal account portfolio after offset will be negative 32% of the time on the retirement date. The median internal rate of return in this case is 3.4 percent, just above the amount necessary for holders of the accounts to break even. However, the U.S. stock market has been unusually successful historically by world standards, so it is better to adjust the historical data to reduce the assumed average stock market return for the simulation. When this is done so that the return matches the median stock market return of 15 countries over 1900–2000 as reported by Dimson et al. [2002], the baseline personal account is found to be negative 71% of the time on the date of retirement and the median internal rate of return is 2.6 percent. |
JEL: | H55 |
Date: | 2005–05 |
URL: | http://d.repec.org/n?u=RePEc:nbr:nberwo:11300&r=cmp |
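The simulation logic can be sketched simply: contributions compound at random market returns while the government "offset" accrues at a fixed rate, and the account is a loss whenever final assets fall short of the offset. The sketch below draws returns from a normal distribution with assumed parameters, not from the historical and country-adjusted data Shiller actually uses, so its loss share will not match the paper's 32% or 71%.

```python
# Stylised life-cycle account simulation: annual contributions earn a
# random market return; the offset balance grows at a fixed 3% rate.
# Return mean/sd, horizon, and contribution are illustrative assumptions.
import random

def account_outcome(rng, years=44, contribution=1_000.0,
                    mean=0.06, sd=0.18, offset_rate=0.03):
    assets = offset = 0.0
    for _ in range(years):
        assets = (assets + contribution) * (1.0 + rng.gauss(mean, sd))
        offset = (offset + contribution) * (1.0 + offset_rate)
    return assets - offset  # negative => the account lost money net of offset

rng = random.Random(0)
trials = [account_outcome(rng) for _ in range(2_000)]
share_negative = sum(t < 0 for t in trials) / len(trials)
print(share_negative)
```

Lowering `mean`, as Shiller does when matching the Dimson et al. international median, pushes `share_negative` up sharply.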
By: | Nicola Bruti-Liberati (School of Finance and Economics, University of Technology, Sydney); Filippo Martini (Faculty of Information Technology, University of Technology, Sydney); Massimo Piccardi (Faculty of Information Technology, University of Technology, Sydney); Eckhard Platen (School of Finance and Economics, University of Technology, Sydney) |
Abstract: | Monte Carlo simulation of weak approximations of stochastic differential equations constitutes an intensive computational task. In applications such as finance, achieving the "real time" execution that is often required demands highly efficient implementations of the multi-point distributed random number generator underlying the simulations. In this paper a fast and flexible dedicated hardware solution on a field programmable gate array is presented. A comparative performance analysis between a software-only and the proposed hardware solution demonstrates that the hardware solution is bottleneck-free, retains the flexibility of the software solution and significantly increases the computational efficiency. Moreover, simulations in applications such as economics, insurance, physics, population dynamics, epidemiology, structural mechanics, chemistry and biotechnology can benefit from the obtained speedup. |
Keywords: | random number generators; random bit generators; hardware implementation; field programmable gate arrays (FPGAs); Monte Carlo simulation; weak Taylor schemes; multi-point distributed random variables |
JEL: | G10 G13 |
Date: | 2005–04–01 |
URL: | http://d.repec.org/n?u=RePEc:uts:rpaper:156&r=cmp |
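The "multi-point distributed" random numbers in question are simple discrete variables that weak schemes may use in place of Gaussian increments, e.g. a two-point variable taking ±√Δt with probability 1/2 each; only the low-order moments matter for weak convergence. A software sketch of that idea in a weak Euler scheme for geometric Brownian motion (parameters and sample sizes are illustrative; the paper's contribution is generating such variables in FPGA hardware):

```python
# Weak Euler scheme for dX = mu*X dt + sigma*X dW, with the Gaussian
# increment replaced by a two-point distributed variable +/- sqrt(dt).
# The sample mean of X_T should approximate exp(mu*T).
import math
import random

def weak_euler_mean(mu, sigma, x0, T, steps, paths, seed=7):
    rng = random.Random(seed)
    dt = T / steps
    sqrt_dt = math.sqrt(dt)
    total = 0.0
    for _ in range(paths):
        x = x0
        for _ in range(steps):
            # Two-point distributed increment: +sqrt(dt) or -sqrt(dt), p = 1/2.
            dw = sqrt_dt if rng.random() < 0.5 else -sqrt_dt
            x += mu * x * dt + sigma * x * dw
        total += x
    return total / paths

est = weak_euler_mean(mu=0.05, sigma=0.2, x0=1.0, T=1.0, steps=50, paths=20_000)
print(est, math.exp(0.05))
```

Because each increment needs only one random bit, the generator reduces to a fast random bit stream, which is what makes a hardware implementation attractive.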
By: | Hokky Situngkir (Bandung Fe Institute); Yohanes Surya (Surya Research International) |
Abstract: | The paper revisits the investment simulation based on strategies exhibited by the Generalized (m,2)-Zipf law to present an interesting characterization of the wildness in financial time series. Investigation of the dominant strategies on each specific time series shows that the words that dominate at larger time scales are longer than those that dominate at smaller time scales, and vice versa. Moreover, defining wildness in terms of persistence over short-term trends and a memory represented by a particular word length, we can see how the wild historical fluctuations in the time series data are captured by the Zipf strategies. |
Keywords: | Generalized (m,2)-Zipf law, time series, fluctuations, investment. |
JEL: | G |
Date: | 2005–04–29 |
URL: | http://d.repec.org/n?u=RePEc:wpa:wuwpfi:0504022&r=cmp |
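The (m,2)-Zipf machinery in miniature: map returns to a two-letter alphabet (up/down), slice the symbol string into words of length m, and rank the words by frequency; the top-ranked word is the "dominant strategy" at that scale. The synthetic Gaussian return series and the choice m = 3 below are assumptions for illustration, not the paper's data.

```python
# Rank-frequency analysis of length-m "words" over a binary (2-letter)
# coding of returns, the ingredient of a Generalized (m,2)-Zipf analysis.
import random
from collections import Counter

def zipf_ranking(returns, m):
    symbols = "".join("u" if r >= 0 else "d" for r in returns)
    # Sliding window of length-m words over the symbol string.
    words = [symbols[i:i + m] for i in range(len(symbols) - m + 1)]
    return Counter(words).most_common()  # [(word, count), ...] by rank

rng = random.Random(42)
returns = [rng.gauss(0.0005, 0.01) for _ in range(5_000)]
ranking = zipf_ranking(returns, m=3)
dominant_word, freq = ranking[0]
print(dominant_word, freq, len(ranking))
```

Repeating this at several aggregation scales (daily, weekly, ...) is what lets one compare the lengths of the words that dominate at each scale.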
By: | Sutthisit Jamdee (Kent State University); Cornelis A. Los (Kent State University) |
Abstract: | This paper demonstrates, by simple simulation, the impact of observed financial market persistence, or long-term memory, on European option valuation. Many empirical researchers have observed non-Fickian degrees of persistence or long memory in the financial markets, different from the Fickian neutral independence (i.i.d.) of returns innovations assumed by Black-Scholes' geometric Brownian motion. Moreover, Elliott and van der Hoek (2003) provide a theoretical framework for incorporating these findings into the Black-Scholes risk-neutral valuation framework. This paper provides the first graphical demonstration of why and how such long-term memory phenomena change European option values, and thereby provides a basis for informed long-term memory arbitrage. By using a simple mono-fractal Fractional Brownian motion, it is easy to incorporate the various degrees of persistence into the Black-Scholes pricing formula. Long memory options are of considerable importance in corporate remuneration packages, since stock options are written on a company's own shares for long expiration periods. It makes a significant difference in the valuation whether an option is 'blue' or 'red.' For a proper valuation of such stock options, the degrees of persistence of the companies' share markets must be precisely measured and properly incorporated in the warrant valuation, otherwise substantial pricing errors may result. |
Keywords: | Options, Long Memory, Persistence, Hurst Exponent, Identification, Simulation, Executive Remuneration |
JEL: | G14 G15 G33 C14 |
Date: | 2005–05–03 |
URL: | http://d.repec.org/n?u=RePEc:wpa:wuwpfi:0505003&r=cmp |
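One simple way persistence enters the valuation: under a mono-fractal fractional Brownian motion with Hurst exponent H, return variance over horizon T scales as σ²T^(2H) rather than σ²T, so the Black-Scholes formula can be evaluated with σT^H in place of σ√T (H = 1/2 recovers standard Black-Scholes). This sketch follows the spirit of the paper's approach; the specific substitution and all numbers are illustrative assumptions, not the authors' exact formulas.

```python
# Black-Scholes call price with the total volatility sigma*sqrt(T)
# replaced by sigma * T**H, reflecting fBm variance scaling T^(2H).
import math

def norm_cdf(x):
    return 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))

def call_price_fbm(spot, strike, rate, sigma, T, hurst):
    vol = sigma * T ** hurst  # total volatility; hurst=0.5 gives sigma*sqrt(T)
    d1 = (math.log(spot / strike) + rate * T) / vol + 0.5 * vol
    d2 = d1 - vol
    return spot * norm_cdf(d1) - strike * math.exp(-rate * T) * norm_cdf(d2)

# For a 10-year at-the-money option, persistence (H > 1/2) raises the value.
bs = call_price_fbm(100, 100, 0.03, 0.2, 10.0, hurst=0.5)
persistent = call_price_fbm(100, 100, 0.03, 0.2, 10.0, hurst=0.6)
print(round(bs, 2), round(persistent, 2))
```

The gap between the two prices grows with T, which is why the effect matters most for long-dated executive stock options.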
By: | John Bower (Oxford Institute for Energy Studies); Nawal Kamel (Oxford Institute for Energy Studies) |
Abstract: | Keynes proposed that a ‘Commod Control’ agency be created after the Second World War to stabilise spot prices of key internationally traded commodities by systematically buying and selling physical buffer stocks. In this paper, the creation of a new Global Commodity Insurer (GCI) is discussed that would operate an international Commodity Price Insurance (CPI) scheme with the objective of protecting national government revenues, spending and investment against the adverse impact of short-term deviations in commodity prices, and especially oil prices, from their long-run equilibrium level. Crude oil is the core commodity in this scheme because energy represents 50% of world commodity exports, and oil price shocks have historically had a significant macroeconomic impact. In effect, the GCI would develop a new international market, which is currently missing, designed to protect governments against the risk of declines in their fiscal revenue, and increases in the level of claims on that income, especially from social programmes, brought about by short-term commodity price shocks. GCI would take advantage of the rapid growth of trading in derivative securities in the global capital market since the 1980s by selling CPI insurance contracts tailored to the specific commodity price exposure faced by each national government, and offsetting the resulting price risk with a portfolio of derivative contracts of five-year or longer maturities, supplied by banks, insurers, reinsurers, investment institutions, and commodity trading companies with investment grade credit ratings. The difference between the CPI and a buffer stock or export/import control scheme is that it would mitigate the macroeconomic shocks posed by commodity price volatility, but not attempt to control commodity prices.
The cost of the CPI scheme is estimated by simulating 5-year commodity price paths using a standard log price mean reverting model parameterised from an econometric analysis of commodity price time series. |
Keywords: | Commodity Price Insurance |
JEL: | F3 Q43 |
Date: | 2005–04–29 |
URL: | http://d.repec.org/n?u=RePEc:wpa:wuwpot:0504012&r=cmp |
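The cost estimate rests on simulated five-year commodity price paths from a "standard log price mean reverting model". A minimal version of such a model: the log price follows an Ornstein-Uhlenbeck process pulled back toward its long-run level. The reversion speed, volatility, and long-run price below are illustrative assumptions, not the paper's econometric estimates.

```python
# Log-price mean reversion (Ornstein-Uhlenbeck in logs), discretised:
# dx = speed * (x_bar - x) dt + sigma dW, with price P = exp(x).
import math
import random

def simulate_price_path(p0, p_bar, speed, sigma, years, steps_per_year, rng):
    dt = 1.0 / steps_per_year
    x, x_bar = math.log(p0), math.log(p_bar)
    path = [p0]
    for _ in range(years * steps_per_year):
        x += speed * (x_bar - x) * dt + sigma * math.sqrt(dt) * rng.gauss(0, 1)
        path.append(math.exp(x))
    return path

rng = random.Random(3)
# A price starting well above its long-run level tends to drift back toward it.
path = simulate_price_path(p0=50.0, p_bar=25.0, speed=0.5,
                           sigma=0.3, years=5, steps_per_year=12, rng=rng)
print(path[0], round(path[-1], 2))
```

Pricing a CPI contract then amounts to averaging the insurance payout over many such simulated paths.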