Crowdsourcing Research Questions in Science

Susanne Beck*, Tiare Brasseur, Marion Poetz, Henry Sauermann

*Corresponding author for this work

Research output: Contribution to journal › Journal article › Research › peer-review



Scientists are increasingly crossing the boundaries of the professional system by involving the general public (the crowd) directly in their research. However, this crowd involvement tends to be confined to empirical work and it is not clear whether and how crowds can also be involved in conceptual stages such as formulating the questions that research is trying to address. Drawing on five different “paradigms” of crowdsourcing and related mechanisms, we first discuss potential merits of involving crowds in the formulation of research questions (RQs). We then analyze data from two crowdsourcing projects in the medical sciences to describe key features of RQs generated by crowd members and compare the quality of crowd contributions to that of RQs generated in the conventional scientific process. We find that the majority of crowd contributions are problem restatements that can be useful to assess problem importance but provide little guidance regarding potential causes or solutions. At the same time, crowd-generated research questions frequently cross disciplinary boundaries by combining elements from different fields within and especially outside medicine. Using evaluations by professional scientists, we find that the average crowd contribution has lower novelty and potential scientific impact than professional research questions, but comparable practical impact. Crowd contributions outperform professional RQs once we apply selection mechanisms at the level of individual contributors or across contributors. Our findings advance research on crowd and citizen science, crowdsourcing and distributed knowledge production, as well as the organization of science. We also inform ongoing policy debates around the involvement of citizens in research in general, and agenda setting in particular.
Original language: English
Article number: 104491
Journal: Research Policy
Issue number: 4
Number of pages: 21
Publication status: Published - May 2022


Keywords

  • Crowd science
  • Citizen science
  • Crowdsourcing
  • Problem solving
  • Problem finding
  • Agenda setting
  • Organization of science
