nep-sog New Economics Papers
on Sociology of Economics
Issue of 2020‒01‒06
two papers chosen by
Jonas Holmström
Axventure AB

  1. Geographical Concentration and Editorial Favoritism within the Field of Laboratory Experimental Economics By Cloos, Janis; Greiff, Matthias; Rusch, Hannes
  2. Push button replication: Is impact evaluation evidence for international development verifiable? By Wood, Benjamin; Müller, Rui; Brown, Annette Nicole

  1. By: Cloos, Janis (Clausthal University of Technology); Greiff, Matthias (Clausthal University of Technology); Rusch, Hannes (General Economics 1 (Micro))
    Abstract: We examine geographical concentration, scientific quality, and editorial favoritism in the field of experimental economics. We use a novel data set containing all original research papers (N=583) that exclusively used laboratory experiments for data generation and were published in the American Economic Review, Experimental Economics or the Journal of the European Economic Association between 1998 and 2018. The development of geographical concentration is examined using data on authors' affiliations at the time of the respective publication. Results show that research output produced by US-affiliated economists increased more slowly than overall research output, leading to a decrease in geographical concentration. Several proxies for scientific quality indicate that experiments conducted in Europe are of higher quality than experiments conducted in North America: European experiments rely on a larger total number of participants, use more participants per treatment, and receive more citations than experiments conducted in North America. Examining laboratory experiments published in the AER more closely, we find that papers authored by economists with US affiliations receive significantly fewer citations in the first 5 and 10 years after publication than papers by authors from the rest of the world.
    Keywords: laboratory experiments, favoritism, geographical concentration, methodological standards, network effects
    JEL: A11 A14 C90 I23
    Date: 2019–12–16
    URL: http://d.repec.org/n?u=RePEc:unm:umagsb:2019029&r=all
  2. By: Wood, Benjamin; Müller, Rui; Brown, Annette Nicole
    Abstract: Objective: In past years, research audit exercises conducted across several fields of study have found a high prevalence of published empirical research that cannot be reproduced using the original dataset and software code (replication files). The failure to reproduce arises either because the original authors refuse to make the replication files available or because third-party researchers are unable to produce the published results using the provided files. Both causes create a credibility challenge for empirical research, as the published findings are then not verifiable. In recent years, increasing numbers of journals, funders, and academics have embraced research transparency, which should reduce the prevalence of failures to reproduce. This study reports the results of a research audit exercise, known as the push button replication (PBR) project, which tested a sample of studies published in 2014 that use similar empirical methods but span a variety of academic fields.
    Methods: To draw our sample of articles, we used the 3ie Impact Evaluation Repository to identify the ten journals that published the most impact evaluations (experimental and quasi-experimental intervention studies) from low- and middle-income countries from 2010 through 2012. This set includes health, economics, and development journals. We then selected all articles in these journals published in 2014 that met the same inclusion criteria. We developed and piloted a detailed protocol for conducting push button replication and determining the level of comparability of the replication findings to the original. To ensure all materials and processes for the PBR project were transparent, we established a project site on the Open Science Framework. We divided the sample of articles across several researchers, who followed the protocol to request data and conduct the replications.
    Results: Of the 109 articles in our sample, only 27 are push button replicable, meaning the provided code run on the provided dataset produces comparable findings for the key results in the published article. The authors of 59 of the articles refused to provide replication files. Thirty of these 59 articles were published in journals that had replication file requirements in 2014, meaning these articles are non-compliant with their journal requirements. For the remaining 23 articles, we confirmed that three had proprietary data, we received incomplete replication files for 15, and we found minor differences in the replication results for five. We found open data for only 14 of the articles in our sample.
    Date: 2018–06–19
    URL: http://d.repec.org/n?u=RePEc:osf:osfxxx:n7a4d&r=all
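    Editor's note: as a quick illustration of how the outcome counts reported in the abstract above fit together (all figures are taken from the abstract; the category labels are shorthand chosen here, not the authors'), a minimal Python tally might look like this:

        # Outcome counts reported in the PBR abstract (sample of 109 articles).
        outcomes = {
            "push_button_replicable": 27,  # provided code and data reproduce the key results
            "authors_refused_files": 59,   # 30 of these were non-compliant with journal policy
            "proprietary_data": 3,
            "incomplete_files": 15,
            "minor_differences": 5,
        }

        total = sum(outcomes.values())
        assert total == 109, f"counts should sum to the sample size, got {total}"

        # Share of articles whose published results could be reproduced as provided.
        print(f"Push button replicable: {outcomes['push_button_replicable'] / total:.1%}")

    Running the snippet confirms that the reported categories account for the full sample of 109 articles and that roughly a quarter (about 25%) were push button replicable.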

This nep-sog issue is ©2020 by Jonas Holmström. It is provided as is without any express or implied warranty. It may be freely redistributed in whole or in part for any purpose. If distributed in part, please include this notice.
General information on the NEP project can be found at http://nep.repec.org. For comments, please write to the director of NEP, Marco Novarese, at <director@nep.repec.org>. Put “NEP” in the subject line; otherwise your mail may be rejected.
NEP’s infrastructure is sponsored by the School of Economics and Finance of Massey University in New Zealand.