on Information and Communication Technologies |
By: | Tobias Kretschmer (London School of Economics) |
Abstract: | In this paper, we study the dynamics of the market for Database Management Systems (DBMS), which is commonly assumed to possess network effects and where there is still some viable competition in our study period, 2000 – 2004. Specifically, we make use of a unique and detailed dataset on several thousand UK firms to study individual organizations’ incentives to adopt a particular technology. We find that there are significant internal complement effects – in other words, using an operating system and a DBMS from the same vendor seems to confer some complementarities. We also find evidence for complementarities between enterprise resource planning systems (ERP) and DBMS and find that as ERP are frequently specific and customized, DBMS are unlikely to be changed once they have been customized to an ERP. We also find that organizations have an increasing tendency to use multiple DBMS on one site, which contradicts the notion that different DBMS are near-perfect substitutes. |
Keywords: | Database software, indirect network effects, technology adoption, microdata |
JEL: | L86 O33 |
Date: | 2005–10 |
URL: | http://d.repec.org/n?u=RePEc:net:wpaper:0517&r=ict |
By: | Evangelos Katsamakas (Graduate School of Business, Fordham University); Mingdi Xin (Stern School of Business, New York University) |
Abstract: | The emergence of open source and Linux has burdened IT managers with the challenge of whether, when, and in what applications to adopt open source software in their firms. We characterize the conditions under which enterprises adopt open source software. We show that adoption depends crucially on network effects, the fit of software with the range of applications used by each firm, and the IT capabilities of a firm. Our model predicts that most firms will adopt a heterogeneous IT architecture that consists of open source and proprietary software. The equilibrium adoption is often socially inefficient. This is the first paper in the open source literature to model the enterprise adoption of open source. |
Keywords: | Open source software, Linux, IT management, IT architecture, IT capabilities, technology adoption. |
Date: | 2005–10 |
URL: | http://d.repec.org/n?u=RePEc:net:wpaper:0529&r=ict |
By: | Oksana Loginova (Department of Economics, University of Missouri-Columbia); X. Henry Wang (Department of Economics, University of Missouri-Columbia); Haibin Lu |
Abstract: | In this paper we use a mechanism design approach to find the optimal file-sharing mechanism in a peer-to-peer network. This mechanism improves upon existing incentive schemes. In particular, we show that the peer-approved scheme is never optimal and the service-quality scheme is optimal only under certain circumstances. Moreover, we find that the optimal mechanism can be implemented by a mixture of peer-approved and service-quality schemes. |
Keywords: | peer-to-peer networks, mechanism design. |
JEL: | D82 C7 |
Date: | 2006–07–19 |
URL: | http://d.repec.org/n?u=RePEc:umc:wpaper:0608a&r=ict |
By: | Matthew T. Clements (University of Texas); Hiroshi Ohashi (University of Tokyo) |
Abstract: | This paper examines the importance of indirect network effects in the U.S. video game market between 1994 and 2002. The diffusion of game systems is analyzed by the interaction between console adoption decisions and software supply decisions. Estimation results suggest that introductory pricing is an effective practice at the beginning of the product cycle, and expanding software variety becomes more effective later. The paper also finds a degree of inertia in the software market that does not exist in the hardware market. This observation implies that software providers continue to exploit the installed base of hardware users after hardware demand has slowed. |
Keywords: | indirect network effects; penetration pricing; software variety |
JEL: | C23 L68 M21 |
Date: | 2004–10 |
URL: | http://d.repec.org/n?u=RePEc:net:wpaper:0401&r=ict |
By: | Nicholas Economides (Stern School of Business, NYU); Evangelos Katsamakas (Fordham University) |
Abstract: | The paper analyzes and compares the investment incentives of platform and application developers for Linux and Windows. We find that the level of investment in applications is larger when the operating system is open source rather than proprietary. The comparison of the levels of investment in the operating systems depends, among others, on reputation effects and the number of developers. The paper also develops a short case study comparing Windows and Linux and identifies new directions for open source software research. |
JEL: | L10 L86 L31 |
Date: | 2005–10 |
URL: | http://d.repec.org/n?u=RePEc:net:wpaper:0507&r=ict |
By: | Nataly Gantman (Tel Aviv University); Yossi Spiegel (Tel Aviv University) |
Abstract: | Programmers can distribute new software to online users either for a fee as shareware or bundle it with advertising banners and distribute it for free as adware. In this paper we study the programmers’ choice between these two modes of distribution in the context of a model that takes explicit account of the strategic interaction between programmers who develop software, firms that advertise their products through ad banners, and consumers who buy software and consumer products. Adware allows advertisers to send targeted information to specific consumers and may therefore improve their purchasing decisions. At the same time, adware also raises privacy concerns. We study the effect of programmers’ choice between shareware and adware on consumers’ welfare through its effect on the beneficial information that consumers receive about consumer products on the one hand and their loss of privacy on the other hand. We also examine the implications of improvements in the technology of ad banners and the desirability of bans on the use of adware. |
Keywords: | adware, shareware, advertising, privacy, ad banners |
JEL: | L12 L13 M37 |
Date: | 2004–10 |
URL: | http://d.repec.org/n?u=RePEc:net:wpaper:0402&r=ict |
By: | Nicholas Economides (Stern School of Business, NYU); Evangelos Katsamakas (Fordham University) |
Abstract: | Technology platforms, such as Microsoft Windows, are the hubs of technology industries. We develop a framework to characterize the optimal two-sided pricing strategy of a platform firm, that is, the pricing strategy towards the direct users of the platform as well as towards firms offering applications that are complementary to the platform. We compare industry structures based on a proprietary platform (such as Windows) with those based on an open-source platform (such as Linux) and analyze the structure of competition and industry implications in terms of pricing, sales, profitability, and social welfare. We find that, when the platform is proprietary, the equilibrium prices for the platform, the applications, and the platform access fee for applications may be below marginal cost, and we characterize demand conditions that lead to this. The proprietary applications sector of an industry based on an open source platform may be more profitable than the total profits of a proprietary platform industry. When users have a strong preference for application variety, the total profits of the proprietary industry are larger than the total profits of an industry based on an open source platform. The variety of applications is larger when the platform is open source. When a system based on an open source platform with an independent proprietary application competes with a proprietary system, the proprietary system is likely to dominate the open source platform industry both in terms of market share and profitability. This may explain the dominance of Microsoft in the market for PC operating systems. |
Date: | 2005–10 |
URL: | http://d.repec.org/n?u=RePEc:net:wpaper:0506&r=ict |
By: | Mark Ginsburg (University of Arizona) |
Abstract: | Performing a search on the World Wide Web (WWW) and traversing the resulting links is an adventure in which one encounters both credible and incredible web pages. Search engines, such as Google, rely on macroscopic Web topology patterns and even highly ranked ‘authoritative’ web sites may be a mixture of informed and uninformed opinions. Without credibility heuristics to guide the user in a maze of facts, assertions, and inferences, the Web remains an ineffective knowledge delivery platform. This report presents the design and implementation of a modular extension to the popular Google search engine, MEDQUAL, which provisions both URL and content-based heuristic credibility rules to reorder raw Google rankings in the medical domain. MEDQUAL, a software system written in Java, starts with a bootstrap configuration file which loads in basic heuristics in XML format. It then provides a subscription mechanism so users can join birds of feather specialty groups, for example Pediatrics, in order to load specialized heuristics as well. The platform features a coordination mechanism whereby information seekers can effectively become secondary authors, contributing by consensus vote additional credibility heuristics. MEDQUAL uses standard XML namespace conventions to divide opinion groups so that competing groups can be supported simultaneously. The net effect is a merger of basic and supplied heuristics so that the system continues to adapt and improve itself over time to changing web content, changing opinions, and new opinion groups. The key goal of leveraging the intelligence of a large-scale and diffuse WWW user community is met and we conclude by discussing our plans to develop MEDQUAL further and evaluate it. |
Keywords: | Credibility, Web Credibility, Heuristics, Medical Informatics, Authoritative, Opinions, XML, Java |
Date: | 2004–10 |
URL: | http://d.repec.org/n?u=RePEc:net:wpaper:0413&r=ict |
By: | Hoon Hian Teck (School of Economics and Social Sciences, Singapore Management University); Edmund S. Phelps (Columbia University) |
Abstract: | It seems to be taken for granted by many commentators that the sharp decline in prices of computers, telecommunications equipment and software resulting from the technological improvements in the information and communications technology (ICT)-producing sector is good for jobs and is a major driving force behind the non-inflationary employment miracle and booming stock market in the latter half of the nineties in the U.S. and their recurrence since 2004. We show that, in our model, a technical improvement in the ICT-producing sector by itself cannot explain a simultaneous increase in employment and a rise in firms’ valuation (or Tobin’s Q ratio). There are two cases. If the elasticity of equipment price (pI) with respect to the ICT-producing sector’s productivity is less than one, labor’s value marginal productivity increases, thus pulling up the demand wage and expanding employment. However, the increased output, by adding to the capital stock and thus driving down future capital rentals, causes a decline in firms’ valuation, q per unit, even though Tobin’s Q (= q/pI) is up. If the elasticity is greater than one, equipment prices fall so dramatically that labor’s value marginal productivity declines; employment in the ICT-using sector expands proportionately more than the increase in capital stock, thus raising future capital rentals, so both firms’ valuation and Tobin’s Q rise; but then the real demand wage falls and employment contracts. The key to generating a booming stock market alongside employment expansion is to hypothesize that when technical improvement in the ICT-producing sector occurs, the market forms an expectation of future productivity gains to be reaped in the ICT-using sector. 
Then we can explain not only the stock market boom and associated rise in investment spending and employment in the period 1995-2000 but also the subsequent decline in employment, in Tobin’s Q and in investment spending in 2001, with consumption holding up well as productivity gains in the ICT-using sector were realized. An anticipation of a future TFP improvement in the ICT-using sector can once more play the role of raising the stock market. |
Keywords: | Business asset valuation, Tobin’s Q, investment spending,employment |
JEL: | E13 E22 E23 E24 O33 |
Date: | 2006–02 |
URL: | http://d.repec.org/n?u=RePEc:siu:wpaper:07-2006&r=ict |
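The two elasticity cases in the abstract above can be restated compactly. Writing q for per-unit firm valuation and pI for the equipment price, and introducing ε (not in the paper’s stated notation) for the absolute elasticity of pI with respect to ICT-sector productivity A:

```latex
Q = \frac{q}{p_I},
\qquad
\varepsilon \equiv \left|\frac{\partial \ln p_I}{\partial \ln A}\right|,
\qquad
\begin{cases}
\varepsilon < 1: & \text{employment and } Q \text{ rise, } q \text{ falls},\\[2pt]
\varepsilon > 1: & q \text{ and } Q \text{ rise, employment contracts}.
\end{cases}
```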
By: | Yuanhua Feng (Department of Mathematics and Statistics, University of Konstanz) |
Abstract: | This paper proposes a semiparametric approach by introducing a smooth scale function into the standard GARCH model so that conditional heteroskedasticity and scale change in a financial time series can be modelled simultaneously. An estimation procedure combining kernel estimation of the scale function and maximum likelihood estimation of the GARCH parameters is proposed. Asymptotic properties of the kernel estimator are investigated in detail. An iterative plug-in algorithm is developed for selecting the bandwidth. Practical performance of the proposal is illustrated by simulation. The proposal is applied to the daily S&P 500 and DAX 100 returns. It is shown that there are simultaneously significant conditional heteroskedasticity and scale change in these series. |
Keywords: | Semiparametric GARCH, conditional heteroskedasticity, scale change, nonparametric regression with dependence, bandwidth selection |
JEL: | C22 C14 |
URL: | http://d.repec.org/n?u=RePEc:knz:cofedp:0212&r=ict |
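The two-step idea in the abstract above — kernel-estimate a smooth scale function, then fit GARCH to the rescaled series — can be sketched as follows. This is a minimal illustration on simulated data with a fixed bandwidth; the paper’s actual estimator, its asymptotics, and the iterative plug-in bandwidth selection are substantially more involved, and the GARCH maximum-likelihood step is omitted here.

```python
import numpy as np

def kernel_scale(y, bandwidth=0.1):
    """Nadaraya-Watson estimate of a smooth scale function s(t/n)
    from squared (demeaned) returns, using a Gaussian kernel in
    rescaled time t/n."""
    n = len(y)
    t = np.arange(n) / n
    y2 = (y - y.mean()) ** 2
    scale2 = np.empty(n)
    for i in range(n):
        w = np.exp(-0.5 * ((t - t[i]) / bandwidth) ** 2)
        scale2[i] = np.sum(w * y2) / np.sum(w)
    return np.sqrt(scale2)

rng = np.random.default_rng(0)
n = 2000
t = np.arange(n) / n
true_scale = 1.0 + 0.5 * np.sin(2 * np.pi * t)   # slowly varying scale
returns = true_scale * rng.standard_normal(n)    # iid innovations for simplicity, no GARCH

s_hat = kernel_scale(returns)
standardized = returns / s_hat   # this rescaled series would be passed to the GARCH MLE step

# after rescaling, the series should have roughly unit variance throughout
print(round(float(standardized.std()), 2))
```

In the paper’s setting the innovations themselves follow a GARCH process, so the second step fits GARCH parameters to `standardized` by maximum likelihood; here the point is only the scale-removal step.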
By: | Prasenjit Mitra (Pennsylvania State University); Sandeep Purao (Pennsylvania State University); John W. Bagby (Pennsylvania State University); Karthikeyan Umapathy (Pennsylvania State University); Sharoda Paul (Pennsylvania State University) |
Abstract: | There is an evolution in the process used by standards-development organizations (SDOs) and this is changing the prevailing standards development activity (SDA) for information and communications technology (ICT). The process is progressing from traditional SDA modes, typically involving the selection from many candidate, existing alternative components, into the crafting of standards that include a substantial design component (SSDC), or “anticipatory” standards. SSDC require increasingly important roles from organizational players as well as SDOs. Few theoretical frameworks exist to understand these emerging processes. This project conducted archival analysis of SDO documents for a selected subset of web-services (WS) standards taken from publicly available sources including minutes of meetings, proposals, drafts and recommendations. This working paper provides a deeper understanding of SDAs, the roles played by different organizational participants and the compliance with SDO due process requirements emerging from public policy constraints, recent legislation and standards accreditation requirements. This research is influenced by a recent theoretical framework that suggests viewing the new standards-setting processes as a complex interplay among three forces: sense-making, design, and negotiation (DSN). The DSN model provides the framework for measuring SDO progress and therefore understanding future generations of standards development processes. The empirically grounded results are a useful foundation for other SDO modeling efforts. |
Keywords: | antitrust, design, intellectual property rights, negotiation, sense-making, standardization, standards development organizations |
Date: | 2005–10 |
URL: | http://d.repec.org/n?u=RePEc:net:wpaper:0518&r=ict |
By: | Nicholas Economides (Stern School of Business, NYU) |
Abstract: | This paper discusses the economics of the Internet backbone. I discuss competition on the Internet backbone as well as relevant competition policy issues. In particular, I show how public protocols, ease of entry, very fast network expansion, connections by the same Internet Service Provider (“ISP”) to multiple backbones (ISP multi-homing), and connections by the same large web site to multiple ISPs (customer multi-homing) enhance price competition and make it very unlikely that any firm providing Internet backbone connectivity would find it profitable to degrade or sever interconnection with other backbones in an attempt to monopolize the Internet backbone. |
Keywords: | Internet, network effects, Internet backbone, competition, monopoly, MCI, WorldCom |
JEL: | L12 L13 C63 D42 D43 |
Date: | 2004–10 |
URL: | http://d.repec.org/n?u=RePEc:net:wpaper:0423&r=ict |
By: | Klaus Abberger (IFO Munich) |
Abstract: | The procedures of estimating prediction intervals for ARMA processes can be divided into model based methods and empirical methods. Model based methods require knowledge of the model and the underlying innovation distribution. Empirical methods are based on the sample forecast errors. In this paper we apply nonparametric quantile regression to the empirical forecast errors using lead time as regressor. With this method there is no need for a distribution assumption. But for the data pattern in this case a double kernel method which allows smoothing in two directions is required. An estimation algorithm is presented and applied to some simulation examples. |
Keywords: | Forecasting, Prediction intervals, Non-normal distributions, Nonparametric estimation, Quantile regression |
URL: | http://d.repec.org/n?u=RePEc:knz:cofedp:0202&r=ict |
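The approach described above — nonparametric quantile regression of empirical forecast errors on lead time, with no distributional assumption — can be sketched with a simple kernel-weighted quantile estimator. This is a minimal sketch on simulated data; the paper’s double-kernel method additionally smooths in the error direction, which this one-directional version does not.

```python
import numpy as np

def kernel_quantile(lead_times, errors, h, tau, grid):
    """Kernel-weighted sample quantile of forecast errors by lead time.

    At each grid point g, observations are weighted by a Gaussian kernel
    in the lead-time direction, and the tau-quantile of the resulting
    weighted empirical distribution of errors is returned."""
    order = np.argsort(errors)
    e_sorted = errors[order]
    lt_sorted = lead_times[order]
    out = []
    for g in grid:
        w = np.exp(-0.5 * ((lt_sorted - g) / h) ** 2)
        cw = np.cumsum(w) / np.sum(w)         # weighted empirical CDF over sorted errors
        out.append(e_sorted[np.searchsorted(cw, tau)])
    return np.array(out)

rng = np.random.default_rng(1)
leads = rng.integers(1, 11, size=5000)               # lead times 1..10
errors = np.sqrt(leads) * rng.standard_normal(5000)  # error spread grows with lead time

grid = np.arange(1, 11)
lo = kernel_quantile(leads, errors, h=1.0, tau=0.05, grid=grid)
hi = kernel_quantile(leads, errors, h=1.0, tau=0.95, grid=grid)

# the 90% prediction interval should widen with lead time
print((hi - lo).round(1))
```

The pair (lo, hi) at each lead time gives a distribution-free 90% prediction interval of the kind the paper constructs from sample forecast errors.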
By: | Douglas Sicker (University of Colorado at Boulder); Tom Lookabaugh (University of Colorado at Boulder) |
Abstract: | Voice over Internet Protocol (VoIP) will transform many aspects of traditional telephony service including technology, the business models and the regulatory constructs that govern such service. This transformation is generating a host of technical, business, social and policy problems. The Federal Communications Commission (FCC) could attempt to mandate obligations or specific solutions to the policy issues around VoIP, but is instead looking first to industry initiatives focused on key functionality that users have come to expect of telecommunications services. High among these desired functionalities is access to emergency services that allow a user to summon fire, medical or law enforcement agencies. Such services were traditionally required (and subsequently implemented) through state and federal regulations. Reproducing emergency services in the VoIP space has proven to be a considerable task, if for no other reason than the wide and diverse variety of VoIP implementations and implementers. Despite this difficulty, emergency service capability is a critical social concern, making it particularly important for the industry to propose viable solutions for promoting VoIP emergency services before regulators are compelled to mandate a solution, an outcome that often suffers compromises both through demands on expertise that may be better represented in industry and through the mechanisms of political influence and regulatory capture. While technical and business communities have, in fact, made considerable progress in this area, significant uncertainty and deployment problems still exist. The question we ask is: can an industry-based certification and labeling process credibly address social and policy expectations regarding emergency services and VoIP, thus avoiding the need for government regulation at this critical time? We hypothesize that it can. 
To establish this, we developed just such a model for VoIP emergency service compliance through industry certification and device labeling. The intent of this model is to support a wide range of emergency service implementations while providing the user some validation that the service will operate as anticipated. To do this we first examine possible technical implementations for emergency services for VoIP. Next, we summarize the theory of certification as self-regulation and examine several relevant examples. Finally, we synthesize a specific model for certification of VoIP emergency services. We believe that the model we describe provides both short-term and long-term opportunities. In the short term, an industry-driven effort to solve the important current problem of emergency services in VoIP, if properly structured and overseen as we suggest, should be both effective and efficient. In the long term, such a process can serve as a model for the application of self-regulation to social policy goals in telecommunications, an attractive tool to have as telecommunications becomes increasingly diverse and heterogeneous. |
Date: | 2004–10 |
URL: | http://d.repec.org/n?u=RePEc:net:wpaper:0419&r=ict |
By: | Michael D. Smith (Carnegie Mellon University); Rahul Telang (Carnegie Mellon University) |
Abstract: | Improving the information retrieval (IR) performance of peer-to-peer networks is an important and challenging problem. Recently, the computer science literature has attempted to address this problem by improving IR search algorithms. However, in peer-to-peer networks, IR performance is determined by both technology and user behavior, and very little attention has been paid in the literature to improving IR performance through incentives to change user behavior. We address this gap by combining the club goods economics literature and the IR literature to propose a next generation file sharing architecture. Using the popular Gnutella 0.6 architecture as context, we conceptualize a Gnutella ultrapeer and its local network of leaf nodes as a “club” (in economic terms). We specify an information retrieval-based utility model for a peer to determine which clubs to join, for a club to manage its membership, and for a club to determine to which other clubs they should connect. We simulate the performance of our model using a unique real-world dataset collected from the Gnutella 0.6 network. These simulations show that our club model accomplishes both performance goals. First, peers are self-organized into communities of interest — in our club model peers are 85% more likely to be able to obtain content from their local club than they are in the current Gnutella 0.6 architecture. Second, peers have increased incentives to share content — our model shows that peers who share can increase their recall performance by nearly five times over the performance offered to free-riders. We also show that the benefits provided by our club model outweigh the added protocol overhead imposed on the network for the most valuable peers. |
Date: | 2004–10 |
URL: | http://d.repec.org/n?u=RePEc:net:wpaper:0412&r=ict |
By: | Chris O’Donnell (University of Queensland); Robert G. Chambers (Dept of Agricultural and Resource Economics, University of Maryland, College Park); John Quiggin (Department of Economics, University of Queensland) |
Abstract: | In a stochastic decision environment, differences in information can lead rational decision makers facing the same stochastic technology and the same markets to make different production choices. Efficiency and productivity measurement in such a setting can be seriously and systematically biased by the manner in which the stochastic technology is represented. For example, conventional production frontiers implicitly impose the restriction that information differences have no effect on the way risk-neutral decision makers utilize the same input bundle. The result is that rational and efficient ex ante production choices can be mistakenly characterized as inefficient -- informational differences are mistaken for differences in technical efficiency. This paper uses simulation methods to illustrate the type and magnitude of empirical errors that can emerge in efficiency analysis as a result of overly restrictive representations of production technologies. |
JEL: | D81 |
URL: | http://d.repec.org/n?u=RePEc:rsm:riskun:r06_2&r=ict |
By: | Onsel Emre (University of Chicago); Ali Hortacsu (University of Chicago and NBER); Chad Syverson (University of Chicago and NBER) |
Abstract: | While a fast-growing body of research has looked at how the advent and diffusion of e-commerce has affected prices, much less work has investigated e-commerce’s impact on the number and type of firms operating in an industry. This paper theoretically and empirically takes up the question of which producers most benefit and most suffer as consumers switch to purchasing products online. We specify a general industry model involving consumers with differing search costs buying products from heterogeneous-type producers. We interpret e-commerce as having created reductions in consumers’ search costs. We show how such shifts in the search cost distribution reallocate market shares from an industry’s low-type producers to its high-type businesses. We test the model using data for two industries in which e-commerce has arguably decreased consumers’ search costs considerably: travel agencies and bookstores. We find evidence in both industries of the market share shifts predicted by the model. Interestingly, while both industries experienced similar changes, the specific mechanisms through which e-commerce induced them were different. For travel agencies, the shifts reflected aggregate changes driven by airlines’ reductions in agent commissions as consumers started buying tickets online. For bookstores, on the other hand, industry-wide declines in small bookstores reflected aggregated market-specific impacts, evidenced by the fact that more small-store exit occurred in those local markets where consumers’ use of e-commerce channels grew fastest. |
Date: | 2005–10 |
URL: | http://d.repec.org/n?u=RePEc:net:wpaper:0524&r=ict |
By: | Nicholas Economides (Stern School of Business, New York University); V. Brian Viard (Graduate School of Business, Stanford University) |
Abstract: | We discuss the case of a monopolist of a base good in the presence of a complementary good provided either by it or by another firm. We assess and calibrate the extent of the influence on the profits from the base good that is created by the existence of the complementary good, i.e., the extent of the network effect. We establish an equivalence between a model of a base and a complementary good and a reduced-form model of the base good in which network effects are assumed in the consumers’ utility functions as a surrogate for the presence of direct or indirect network effects, such as complementary goods produced by other firms. We also assess and calibrate the influence on profits of the intensity of network effects and quality improvements in both goods. We evaluate the incentive that a monopolist of the base good has to improve its quality rather than that of the complementary good under different market structures. Finally, based on our results, we discuss a possible explanation of the fact that Microsoft Office has a significantly higher price than Microsoft Windows although both products have comparable market shares. |
Keywords: | calibration; monopoly; network effects; complementary goods; software; Microsoft |
JEL: | L12 L13 C63 D42 D43 |
Date: | 2005–11 |
URL: | http://d.repec.org/n?u=RePEc:net:wpaper:0531&r=ict |
By: | Yooki Park (University of California, Berkeley); Suzanne Scotchmer (University of California, Berkeley) |
Abstract: | Digital products such as movies, music and computer software are protected both by self-help measures such as encryption and copy controls, and by the legal right to prevent copying. We explore how digital rights management and other technical protections affect the pricing of content, and consequently, why content users, content vendors, and antitrust authorities might have different views on what technical capabilities should be deployed. We discuss the potential for “collusion through technology.” |
Keywords: | technical protections, DRM, antitrust, trusted systems |
JEL: | L13 L14 L15 K21 O33 |
Date: | 2004–09–30 |
URL: | http://d.repec.org/n?u=RePEc:net:wpaper:0409&r=ict |
By: | Anindya Ghose (NYU, Stern School of Business); Arun Sundararajan (NYU, Stern School of Business) |
Abstract: | We present a framework for measuring software quality using pricing and demand data, and empirical estimates that quantify the extent of quality degradation associated with software versioning. Using a 7-month, 108-product panel of software sales from Amazon.com, we document the extent to which quality varies across different software versions, estimating quality degradation that ranges from as little as 8% to as much as 56% below that of the corresponding flagship version. Consistent with prescriptions from the theory of vertical differentiation, we also find that an increase in the total number of versions is associated with an increase in the difference in quality between the highest and lowest quality versions, and a decrease in the quality difference between “neighboring” versions. We compare our estimates with those derived from two sets of subjective measures of quality, based on CNET editorial ratings and Amazon.com user reviews, and discuss competing interpretations of the significant differences that emerge from this comparison. As the first empirical study of software versioning that is based on both subjective and econometrically estimated measures of quality, this paper provides a framework for testing a wide variety of results in IS that are based on related models of vertical differentiation, and its findings have important implications for studies that treat web-based user ratings as cardinal data. |
Date: | 2005–10 |
URL: | http://d.repec.org/n?u=RePEc:net:wpaper:0514&r=ict |
By: | Yossi Spiegel (Tel Aviv University) |
Abstract: | This paper examines the incentives of programmers to contribute to open source software projects on a voluntary basis. In particular, the paper looks at how this incentive changes as (i) performance becomes more visible to the relevant audience, (ii) effort has a stronger impact on performance, and (iii) performance becomes more informative about talent. In all three cases, the effect is shown to depend on whether we start from a stable interior equilibrium or an unstable interior equilibrium. |
Date: | 2005–10 |
URL: | http://d.repec.org/n?u=RePEc:net:wpaper:0523&r=ict |
By: | Katja Seim (Graduate School of Business, Stanford University); V. Brian Viard (Graduate School of Business, Stanford University) |
Abstract: | We test the effect of entry on the tariff choices of incumbent cellular firms. We relate the change in the breadth of calling plans between 1996, when incumbents enjoyed a duopoly market, and 1998, when incumbents faced increased competition from personal communications services (PCS) firms. Entry by PCS competitors differed across geographic markets due to the number of licenses left undeveloped as a result of the bankruptcy of some of the auctions’ winning bidders and due to variation across markets in the time required to build a sufficiently large network of wireless infrastructure. We find that incumbents increase tariff variety in markets with more entrants and that this effect is not explained by demographic heterogeneity or cost differences in maintaining calling plans across markets. We also find that incumbents are more likely to upgrade their technology from the old analog technology to the new digital technology in markets with more entry, suggesting that entry also has indirect effects on tariff choice via firms’ technology adoption decisions. |
Keywords: | entry, market structure, cellular, price discrimination, nonlinear pricing, telecommunications |
JEL: | L11 L13 L25 L96 |
Date: | 2004–11 |
URL: | http://d.repec.org/n?u=RePEc:net:wpaper:0313&r=ict |
By: | Gilbert Desmarais |
Abstract: | An international PEB seminar on “Information and Communications Technology and Educational Property Management” was held in Montreal, Canada, from 31 October to 3 November 2004. The aim of this seminar was to examine how information and communications technology (ICT) can be incorporated into educational property management by investigating three issues: how ICT can make educational spaces more functional and comfortable in a sustainable development perspective, how it can improve the security and protection of facilities and, lastly, how it can optimise their technical and administrative management. The participants had the opportunity to see the theories presented in each field illustrated concretely by visiting innovative institutions in Montreal and its suburbs. A brief summary of these visits is provided below. |
Keywords: | Canada, technology, Quebec, management |
Date: | 2005–02 |
URL: | http://d.repec.org/n?u=RePEc:oec:eduaaa:2005/1-en&r=ict |
By: | Aurora García-Gallego (Universitat Jaume I (Castellón, Spain)); Nikolaos Georgantzís (Universitat Jaume I (Castellón, Spain)); Pedro Pereira (Autoridade da Concorrência (Portugal)); José C. Pernías-Cerrillo (Universitat Jaume I (Castellón, Spain)) |
Abstract: | This paper analyzes the impact on consumer prices of the size and biases of price comparison search engines. We develop several theoretical predictions, in the context of a model related to Burdett and Judd (1983) and Varian (1980), and test them experimentally. The data supports the model’s predictions regarding the impact of the number of firms and the type of bias of the search engine. The data does not support the model’s predictions regarding the impact of the size of the search engine. We identified several data patterns and developed an econometric model for the price distributions. Variables accounting for risk attitudes significantly improved the explanatory power of the econometric model. |
Keywords: | Search engines, incomplete information, biased information, price levels, experiments |
JEL: | D43 D83 L13 |
Date: | 2004–10 |
URL: | http://d.repec.org/n?u=RePEc:net:wpaper:0403&r=ict |
By: | Tobias Kretschmer (London School of Economics); Katrin Muehlfeld (London School of Economics) |
Abstract: | The success of the CD has (partly) been attributed to the ability of Sony, Philips and Matsushita to cooperate in the run-up to the DAD conference in 1981, where the technological standard was set. We model the situation leading up to the conference in a simple game with technological progress and the possibility of prelaunching a technology. We identify players’ trade-offs between prelaunching (which ends technological progress) and continued development (which involves the risk of being pre-empted). Contrasting outcomes with complete and incomplete information, we find that there appeared to be considerable uncertainty about rivals’ technological progress. |
Date: | 2004–10 |
URL: | http://d.repec.org/n?u=RePEc:net:wpaper:0414&r=ict |
By: | Takatoshi Ito; Yuko Hashimoto |
Abstract: | This paper examines intra-day patterns of the exchange rate behavior, using the “firm” bid-ask quotes and transactions of USD-JPY and Euro-USD recorded in the electronic broking system of the spot foreign exchange markets. The U-shape of intra-day activities (deals and price changes) and return volatility is confirmed for Tokyo and London participants, but not for New York participants. Activities and volatility do not increase toward the end of business hours in the New York market, even on Fridays (ahead of weekend hours of non-trading). It is found that there exists a high positive correlation between volatility and activities and a negative correlation between volatility and the bid-ask spread. A negative correlation is observed between the number of deals and the width of bid-ask spread during business hours. |
JEL: | F31 F33 G15 |
Date: | 2006–08 |
URL: | http://d.repec.org/n?u=RePEc:nbr:nberwo:12413&r=ict |