Reproducibility in Management Science

Miloš Fišar, Ben Greiner*, Christoph Huber, Elena Katok, Ali I. Ozkes, The Management Science Reproducibility Collaboration, Tom Grad, Paul Hünermund, Giacomo Marchesini, Michel Van der Borgh

*Corresponding author of this work

Publication: Contribution to journal › Journal article › Research › Peer review


Abstract

With the help of more than 700 reviewers, we assess the reproducibility of nearly 500 articles published in the journal Management Science before and after the introduction of a new Data and Code Disclosure policy in 2019. When considering only articles for which data accessibility and hardware and software requirements were not an obstacle for reviewers, the results of more than 95% of articles under the new disclosure policy could be fully or largely computationally reproduced. However, for 29% of articles, at least part of the data set was not accessible to the reviewer. Considering all articles in our sample reduces the share of reproduced articles to 68%. These figures represent a significant increase compared with the period before the introduction of the disclosure policy, when only 12% of articles voluntarily provided replication materials, of which 55% could be (largely) reproduced. Substantial heterogeneity in reproducibility rates across different fields is mainly driven by differences in data set accessibility. Other reasons for unsuccessful reproduction attempts include missing code, unresolvable code errors, weak or missing documentation, as well as software and hardware requirements and code complexity. Our findings highlight the importance of journal code and data disclosure policies and suggest potential avenues for enhancing their effectiveness.
Original language: English
Journal: Management Science
Volume: 70
Issue number: 3
Pages (from-to): 1343-1356
Number of pages: 14
ISSN: 0025-1909
DOI
Status: Published - March 2024

Bibliographic note

Published online: 22 December 2023.

Over 700 people have contributed to this work. Only the cited authors and the CBS-affiliated researchers are listed here. See the full list of contributors via the DOI and the supplemental materials.

Keywords

  • Reproducibility
  • Replication
  • Crowd Science
