Crowdsourcing in medical research: concepts and applications

Authors : Joseph D. Tucker, Suzanne Day, Weiming Tang, Barry Bayus

Crowdsourcing shifts medical research from a closed environment to an open collaboration between the public and researchers. We define crowdsourcing as an approach to problem solving in which an organization has a large group attempt to solve all or part of a problem and then shares the solutions.

Crowdsourcing allows large groups of individuals to participate in medical research through innovation challenges, hackathons, and related activities. The purpose of this literature review is to examine the definition, concepts, and applications of crowdsourcing in medicine.

This multi-disciplinary review defines crowdsourcing for medicine, identifies conceptual antecedents (collective intelligence and open source models), and explores implications of the approach. Several critiques of crowdsourcing are also examined.

Although several crowdsourcing definitions exist, there are two essential elements: (1) having a large group of individuals, with and without relevant skills, propose potential solutions; and (2) sharing the solutions through implementation or open-access materials.

The public can be a central force in contributing to formative, pre-clinical, and clinical research. A growing evidence base suggests that crowdsourcing in medicine can result in high-quality outcomes, broad community engagement, and more open science.

URL : Crowdsourcing in medical research: concepts and applications

DOI : https://doi.org/10.7717/peerj.6762

Bibliothèques numériques et crowdsourcing : expérimentations autour de Numalire, projet de numérisation à la demande par crowdfunding

Author : Mathieu Andro

Instead of outsourcing certain tasks to service providers relying on countries where labour is cheap, libraries around the world are increasingly turning to crowds of Internet users, making their relationship with their users more collaborative.

After a conceptual chapter on the consequences of this new economic model for society and for libraries, an overview of projects is presented in the areas of digitization on demand, collaborative OCR correction, notably in the form of games (gamification), and folksonomy.

This overview leads to a state of the art of crowdsourcing applied to digitization and digital libraries, and to analyses in the field of information and communication sciences.

Finally, conceptual contributions and original experiments are presented, mainly around the Numalire project of crowdfunded digitization on demand.

URL : Bibliothèques numériques et crowdsourcing : expérimentations autour de Numalire, projet de numérisation à la demande par crowdfunding

Alternative location : http://prodinra.inra.fr/record/373583

Using Crowdsourcing to Evaluate Published Scientific Literature: Methods and Example


“Systematically evaluating scientific literature is a time-consuming endeavor that requires hours of coding and rating. Here, we describe a method to distribute these tasks across a large group through online crowdsourcing. Using Amazon’s Mechanical Turk, crowdsourced workers (microworkers) completed four groups of tasks to evaluate the question, “Do nutrition-obesity studies with conclusions concordant with popular opinion receive more attention in the scientific community than do those that are discordant?”

1) Microworkers who passed a qualification test (19% passed) evaluated abstracts to determine if they were about human studies investigating nutrition and obesity. Agreement between the first two raters’ conclusions was moderate (κ = 0.586), with consensus being reached in 96% of abstracts.

2) Microworkers iteratively synthesized free-text answers describing the studied foods into one coherent term. Approximately 84% of foods were agreed upon, with only 4% and 8% of ratings failing manual review in different steps.

3) Microworkers were asked to rate the perceived obesogenicity of the synthesized food terms. Over 99% of responses were complete and usable, and the opinions of the microworkers qualitatively matched the authors’ expert expectations (e.g., sugar-sweetened beverages were thought to cause obesity and fruits and vegetables were thought to prevent obesity).

4) Microworkers extracted citation counts for each paper through Google Scholar. Microworkers reached consensus or unanimous agreement for all successful searches.

To answer the example question, data were aggregated and analyzed, and showed no significant association between popular opinion and the attention a paper received as measured by Scimago Journal Rank and citation counts. Direct microworker costs totaled $221.75 (estimated cost at minimum wage: $312.61). We discuss important points to consider to ensure good quality control and appropriate pay for microworkers. With good reliability and low cost, crowdsourcing has potential to evaluate published literature in a cost-effective, quick, and reliable manner using existing, easily accessible resources.”

URL : Using Crowdsourcing to Evaluate Published Scientific Literature: Methods and Example

DOI : https://doi.org/10.1371/journal.pone.0100647
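
The abstract-screening step above reports inter-rater agreement as Cohen's kappa (κ = 0.586). As a minimal sketch of how that statistic is computed for two raters, assuming simple binary relevant/not-relevant labels, the Python snippet below is illustrative only and does not use data from the study:

```python
# Minimal sketch of Cohen's kappa for two raters on the same items.
# The example labels are invented for illustration, not taken from the paper.
from collections import Counter

def cohens_kappa(rater_a, rater_b):
    """Cohen's kappa: chance-corrected agreement between two raters."""
    assert len(rater_a) == len(rater_b)
    n = len(rater_a)
    # Observed agreement: fraction of items both raters labelled identically.
    p_o = sum(a == b for a, b in zip(rater_a, rater_b)) / n
    # Expected chance agreement from each rater's marginal label frequencies.
    freq_a, freq_b = Counter(rater_a), Counter(rater_b)
    labels = set(rater_a) | set(rater_b)
    p_e = sum((freq_a[l] / n) * (freq_b[l] / n) for l in labels)
    return (p_o - p_e) / (1 - p_e)

# Two hypothetical microworkers screening ten abstracts: 1 = relevant, 0 = not.
rater_1 = [1, 0, 1, 1, 0, 1, 0, 0, 1, 1]
rater_2 = [1, 0, 1, 0, 0, 1, 0, 1, 1, 1]
print(f"kappa = {cohens_kappa(rater_1, rater_2):.3f}")
```

Running this illustrative example prints kappa = 0.583, i.e. moderate agreement on the same scale as the value reported in the abstract.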