Motives, Gender, and Experience: Performance Effects in Crowdsourcing Contests

Jonas Heite, Karin Hoisl*, Rainer Widmann

*Corresponding author for this work

Research output: Contribution to journal › Journal article › Research › peer-review

Abstract

Our study examines how individual characteristics—economic versus achievement-based motives, gender, and experience—moderate the "performance revision effect" in tournament-based crowdsourcing competitions. This effect refers to a phenomenon in which contestants reduce their effort when competing against significantly higher-ability opponents. Using data from Topcoder, a leading crowdsourcing platform, we conducted a quasi-experimental study of 1,677 coders in 38 single-round matches. Our regression discontinuity design exploits Topcoder's skill-based divisions to assess contestants' responses to differing opponent abilities. The results confirm the performance revision effect, revealing an average performance decline of 20% when contestants face higher-ability opponents. Moreover, female and more experienced participants respond more strongly to the performance revision effect than their male and less experienced peers. Our findings contribute to the crowdsourcing literature by highlighting the boundary conditions of the performance revision effect and by quantifying the performance implications of contest design for different contestants, allowing platform operators to make data-driven cost-benefit decisions about contest design to mitigate performance losses.
Original language: English
Journal: Strategy Science
Number of pages: 24
ISSN: 2333-2050
DOIs
Publication status: Published - 8 Sept 2025

Bibliographical note

Epub ahead of print. Published online: 08 September 2025.

Keywords

  • Innovation management
  • Information technology
  • Technology strategy
