nep-sog New Economics Papers
on Sociology of Economics
Issue of 2011‒10‒09
four papers chosen by
Jonas Holmström
Swedish School of Economics and Business Administration

  1. How Do Editors Select Papers, and How Good are They at Doing It? By Robert Hofmeister; Matthias Krapf
  2. Ein Ranking von Hochschulen und (Bundes-)Ländern am Beispiel der Betriebswirtschaftslehre (A Ranking of Universities and (Federal) States, Using Business Administration as an Example) By Müller, Harry; Dilger, Alexander
  3. The Use of Bibliometrics to Measure Research Performance in Education Sciences By Andrea Diem; Stefan C. Wolter
  4. The end of the "European paradox" By Neus Herranz; Javier Ruiz-Castillo

  1. By: Robert Hofmeister (Department of Economics, University of Konstanz, Germany); Matthias Krapf (Chair for International Personnel Management, University of Vienna, Austria)
    Abstract: Using data on the B.E. Journals that rank articles into four quality tiers, this paper examines the accuracy of the research evaluation process in economics. We find that submissions by authors with strong publication records and authors affiliated with highly ranked institutions are significantly more likely to be published in higher tiers. Citation success as measured by RePEc statistics also depends heavily on the overall research records of the authors. Finally, and most importantly, we measure how successful the B.E. Journals’ editors and their reviewers have been at assigning articles to quality tiers. While, on average, they are able to distinguish more influential from less influential manuscripts, we also observe many assignments that are not compatible with the belief that research quality is reflected by the number of citations.
    Keywords: Peer Review, Research Evaluation, Citations, Journal Quality
    JEL: A10 A14
    Date: 2011–09–29
    URL: http://d.repec.org/n?u=RePEc:knz:dpteco:1137&r=sog
  2. By: Müller, Harry; Dilger, Alexander
    Abstract: The analysis and comparison of the research output of business departments continue to attract considerable attention. For universities, as providers of research and teaching, evaluations and rankings create more clarity about their own output and reduce the information asymmetries in the markets in which universities interact with prospective students and with users of research. The best-known research ranking of business departments was published by the newspaper Handelsblatt in 2009; its methodology, however, is problematic both with respect to the underlying database and to the aggregation methods used. Taking up this critique, we design a citation-based ranking built on Google Scholar that is intended to reflect the current research output of business departments in Germany, Austria, and Switzerland. Three alternative criteria for constructing the ranking are proposed, and an econometric estimation is then used to identify possible factors influencing the placements. Finally, the data are aggregated at the level of the (federal) states in order to compare these jurisdictions by the business research produced within them. (An illustrative sketch of such a department-level citation aggregation appears after the paper listing below.)
    JEL: I23 I20 A11
    Date: 2011
    URL: http://d.repec.org/n?u=RePEc:zbw:umiodp:82011&r=sog
  3. By: Andrea Diem (Swiss Coordination Centre for Research in Education (SCCRE), Aarau); Stefan C. Wolter (Swiss Coordination Centre for Research in Education (SCCRE), Aarau, and University of Bern, CESifo and IZA)
    Abstract: This paper uses bibliometric data to investigate the research performance of Swiss professors in the field of education sciences. The analyses are based on two separate databases: Web of Science and Google Scholar. A comparison of the various indicators used to measure research performance (quantity of publications and citation impact) from the two data sources shows that all of them are positively, and in most cases strongly, correlated with one another. At the same time, there is evidence that significant individual factors explaining the large variance in research performance can be identified only if the Web of Science is used as the benchmark of research performance. However, the Web of Science inclusion policy has certain shortcomings that put some authors at a disadvantage. Both citation databases therefore pose problems when used to benchmark individual research performance: Web of Science adopts a selective approach, but some of the criteria it employs are problematic, while Google Scholar is so inclusive that it is virtually impossible to identify explanatory variables for the major individual differences in research performance that do exist.
    Keywords: bibliometrics, education sciences, research performance, scientometric methods, science research
    JEL: I23 I29
    Date: 2011–09
    URL: http://d.repec.org/n?u=RePEc:iso:educat:0066&r=sog
  4. By: Neus Herranz; Javier Ruiz-Castillo
    Abstract: This paper evaluates the European Paradox, according to which Europe plays a leading world role in terms of scientific excellence, measured by the number of publications, but lacks the entrepreneurial capacity of the U.S. to transform this excellent performance into innovation, growth, and jobs. Citation distributions for the U.S., the European Union (EU), and the rest of the world are evaluated using a pair of high- and low-impact indicators, as well as the mean citation rate. The dataset consists of 3.6 million articles published in 1998-2002 with a common five-year citation window. The analysis is carried out at a low aggregation level: the 219 sub-fields identified with the Web of Science categories distinguished by Thomson Scientific. The problems posed by international co-authorship and by the multiple assignment of articles to sub-fields are solved following a multiplicative strategy (an illustrative counting sketch appears after the paper listing below). First, we find that, although the EU has more publications than the U.S. in 113 out of 219 sub-fields, the U.S. is ahead of the EU in 189 and 163 sub-fields according to the high- and low-impact indicators, respectively. Second, we verify that the U.S./EU gap is usually greater under the high-impact indicator than under the mean citation rate.
    Date: 2011–09
    URL: http://d.repec.org/n?u=RePEc:cte:werepe:we1127&r=sog
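
To give a concrete, if simplified, picture of the department-level citation aggregation described in item 2, here is a minimal Python sketch. The abstract does not specify the three ranking criteria used by Müller and Dilger, so the criteria below (total citations, citations per author, and a count of highly cited papers with an arbitrary cutoff) are hypothetical stand-ins, and the input records are invented rather than taken from Google Scholar.

    from collections import defaultdict

    # Hypothetical Google-Scholar-style records: (department, author, citations of one paper).
    # Names and numbers are invented for illustration only.
    records = [
        ("Dept A", "Author 1", 120), ("Dept A", "Author 1", 15), ("Dept A", "Author 2", 40),
        ("Dept B", "Author 3", 60),  ("Dept B", "Author 4", 55), ("Dept B", "Author 4", 90),
    ]

    totals = defaultdict(int)      # criterion 1 (assumed): total citations per department
    authors = defaultdict(set)     # used for criterion 2 (assumed): citations per author
    top_papers = defaultdict(int)  # criterion 3 (assumed): papers with at least 50 citations

    for dept, author, cites in records:
        totals[dept] += cites
        authors[dept].add(author)
        if cites >= 50:
            top_papers[dept] += 1

    for dept in sorted(totals):
        per_author = totals[dept] / len(authors[dept])
        print(dept, "| total citations:", totals[dept],
              "| citations per author:", round(per_author, 1),
              "| papers with >= 50 citations:", top_papers[dept])

Each criterion can produce a different ordering of the same departments, which is precisely why the choice of aggregation method matters for such rankings.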
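
To illustrate the counting problem mentioned in item 4, here is a minimal Python sketch of a multiplicative assignment of articles to (country, sub-field) cells. The abstract only names the "multiplicative strategy"; the reading assumed here is that an article co-authored across several countries and assigned to several sub-fields is counted in full once in every (country, sub-field) pair, and the articles and citation counts below are invented for demonstration.

    from collections import defaultdict

    # Hypothetical articles: (author countries, journal sub-fields, citations in a five-year window).
    articles = [
        ({"US"},       {"Economics"},               40),
        ({"US", "EU"}, {"Economics", "Statistics"}, 10),
        ({"EU"},       {"Statistics"},               3),
    ]

    counts = defaultdict(int)      # whole article counts per (country, sub-field) cell
    citations = defaultdict(list)  # citation distribution per (country, sub-field) cell

    for countries, subfields, cites in articles:
        for country in countries:      # multiplicative: a full count in every author country...
            for field in subfields:    # ...and in every sub-field the article is assigned to
                counts[(country, field)] += 1
                citations[(country, field)].append(cites)

    for cell in sorted(citations):
        cites = citations[cell]
        print(cell, "| articles:", counts[cell],
              "| mean citation rate:", round(sum(cites) / len(cites), 1))

On the resulting per-cell citation distributions one can then compute the mean citation rate, as above, or high- and low-impact indicators such as the share of articles above or below chosen citation thresholds (the specific indicators used in the paper are not defined in the abstract).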

This nep-sog issue is ©2011 by Jonas Holmström. It is provided as is without any express or implied warranty. It may be freely redistributed in whole or in part for any purpose. If distributed in part, please include this notice.
General information on the NEP project can be found at http://nep.repec.org. For comments please write to the director of NEP, Marco Novarese at <director@nep.repec.org>. Put “NEP” in the subject; otherwise your mail may be rejected.
NEP’s infrastructure is sponsored by the School of Economics and Finance of Massey University in New Zealand.