What’s the Problem? How Crowdsourcing Contributes to Identifying Scientific Research Questions

Research output: Contribution to conference › Paper › Research › peer-review

Abstract

An increasing number of research projects successfully involve the general public (the crowd) in tasks such as collecting observational data or classifying images to answer scientists’ research questions. Although such crowd science projects have generated great hopes among scientists and policy makers, it is not clear whether the crowd can also meaningfully contribute to other stages of the research process, in particular the identification of the research questions that should be studied. We first develop a conceptual framework that ties different aspects of “good” research questions to different types of knowledge. We then discuss potential strengths and weaknesses of the crowd, compared to professional scientists, in developing research questions, while also considering important heterogeneity among crowd members. Data from a series of online and field experiments have been gathered and are currently being analyzed to test individual- and crowd-level hypotheses about the underlying mechanisms that influence a crowd’s performance in generating research questions. Our results aim to advance the literature on crowd and citizen science as well as the broader literature on crowdsourcing and the organization of open and distributed knowledge production. Our findings have important implications for scientists and policy makers.
Original language: English
Publication date: 2019
Number of pages: 36
Publication status: Published - 2019
Event: DRUID19 Conference - Copenhagen Business School, Frederiksberg, Denmark
Duration: 19 Jun 2019 - 21 Jun 2019
Conference number: 41
https://conference.druid.dk/Druid/?confId=59

Conference

Conference: DRUID19 Conference
Number: 41
Location: Copenhagen Business School
Country: Denmark
City: Frederiksberg
Period: 19/06/2019 - 21/06/2019
Internet address: https://conference.druid.dk/Druid/?confId=59

Bibliographical note

CBS Library does not have access to the material

Keywords

  • Crowd science
  • Open science
  • Scientific knowledge production
  • Experimental design
  • Problem finding
  • Crowdsourcing

Cite this

Beck, S., Brasseur, T-M., Poetz, M., & Sauermann, H. (2019). What’s the Problem? How Crowdsourcing Contributes to Identifying Scientific Research Questions. Paper presented at DRUID19 Conference, Frederiksberg, Denmark.
@conference{080c4a7734814b729d64cd2169a0ca5c,
title = "What’s the Problem? How Crowdsourcing Contributes to Identifying Scientific Research Questions",
abstract = "An increasing number of research projects successfully involve the general public (the crowd) in tasks such as collecting observational data or classifying images to answer scientists’ research questions. Although such crowd science projects have generated great hopes among scientists and policy makers, it is not clear whether the crowd can also meaningfully contribute to other stages of the research process, in particular the identification of the research questions that should be studied. We first develop a conceptual framework that ties different aspects of “good” research questions to different types of knowledge. We then discuss potential strengths and weaknesses of the crowd, compared to professional scientists, in developing research questions, while also considering important heterogeneity among crowd members. Data from a series of online and field experiments have been gathered and are currently being analyzed to test individual- and crowd-level hypotheses about the underlying mechanisms that influence a crowd’s performance in generating research questions. Our results aim to advance the literature on crowd and citizen science as well as the broader literature on crowdsourcing and the organization of open and distributed knowledge production. Our findings have important implications for scientists and policy makers.",
keywords = "Crowd science, Open science, Scientific knowledge production, Experimental design, Problem finding, Crowdsourcing",
author = "Susanne Beck and Tiare-Maria Brasseur and Marion Poetz and Henry Sauermann",
note = "CBS Library does not have access to the material; Conference date: 19-06-2019 Through 21-06-2019",
year = "2019",
language = "English",
url = "https://conference.druid.dk/Druid/?confId=59",

}

Beck, S, Brasseur, T-M, Poetz, M & Sauermann, H 2019, 'What’s the Problem? How Crowdsourcing Contributes to Identifying Scientific Research Questions', paper presented at DRUID19 Conference, Frederiksberg, Denmark, 19/06/2019 - 21/06/2019.

What’s the Problem? How Crowdsourcing Contributes to Identifying Scientific Research Questions. / Beck, Susanne; Brasseur, Tiare-Maria; Poetz, Marion; Sauermann, Henry.

2019. Paper presented at DRUID19 Conference, Frederiksberg, Denmark.


TY - CONF

T1 - What’s the Problem?

T2 - How Crowdsourcing Contributes to Identifying Scientific Research Questions

AU - Beck, Susanne

AU - Brasseur, Tiare-Maria

AU - Poetz, Marion

AU - Sauermann, Henry

N1 - CBS Library does not have access to the material

PY - 2019

Y1 - 2019

N2 - An increasing number of research projects successfully involve the general public (the crowd) in tasks such as collecting observational data or classifying images to answer scientists’ research questions. Although such crowd science projects have generated great hopes among scientists and policy makers, it is not clear whether the crowd can also meaningfully contribute to other stages of the research process, in particular the identification of the research questions that should be studied. We first develop a conceptual framework that ties different aspects of “good” research questions to different types of knowledge. We then discuss potential strengths and weaknesses of the crowd, compared to professional scientists, in developing research questions, while also considering important heterogeneity among crowd members. Data from a series of online and field experiments have been gathered and are currently being analyzed to test individual- and crowd-level hypotheses about the underlying mechanisms that influence a crowd’s performance in generating research questions. Our results aim to advance the literature on crowd and citizen science as well as the broader literature on crowdsourcing and the organization of open and distributed knowledge production. Our findings have important implications for scientists and policy makers.

AB - An increasing number of research projects successfully involve the general public (the crowd) in tasks such as collecting observational data or classifying images to answer scientists’ research questions. Although such crowd science projects have generated great hopes among scientists and policy makers, it is not clear whether the crowd can also meaningfully contribute to other stages of the research process, in particular the identification of the research questions that should be studied. We first develop a conceptual framework that ties different aspects of “good” research questions to different types of knowledge. We then discuss potential strengths and weaknesses of the crowd, compared to professional scientists, in developing research questions, while also considering important heterogeneity among crowd members. Data from a series of online and field experiments have been gathered and are currently being analyzed to test individual- and crowd-level hypotheses about the underlying mechanisms that influence a crowd’s performance in generating research questions. Our results aim to advance the literature on crowd and citizen science as well as the broader literature on crowdsourcing and the organization of open and distributed knowledge production. Our findings have important implications for scientists and policy makers.

KW - Crowd science

KW - Open science

KW - Scientific knowledge production

KW - Experimental design

KW - Problem finding

KW - Crowdsourcing


M3 - Paper

ER -

Beck S, Brasseur T-M, Poetz M, Sauermann H. What’s the Problem? How Crowdsourcing Contributes to Identifying Scientific Research Questions. 2019. Paper presented at DRUID19 Conference, Frederiksberg, Denmark.