Why it matters
Citizen science is proving there is strength – and accuracy – in numbers.
Systematic reviews are a cornerstone of evidence-based medicine, but collecting and critically analysing all relevant research for a systematic review can take thousands of hours. That task is growing as the scientific literature expands, leading researchers to seek ways to make systematic reviews faster and more efficient.
The first of THIS Institute’s three learning reports on citizen science – published in May 2018 – highlighted how crowdsourcing can bring the expert voices of everyday people into research, even if they don’t consider themselves researchers. This second report in the series focuses on crowdsourcing’s potential to help conduct systematic reviews, and provides a practical overview of the associated tools, opportunities, and challenges.
What we did
Our study included a review of relevant literature in academic databases, desk research (including websites), and interviews with six experts who have conducted systematic reviews using crowdsourcing.
What we found
- Early evidence shows that citizen science can make the systematic review process more timely, efficient, and relevant.
- In one crowdsourced systematic review, 15,000 screening decisions were made in just 100 hours.
- With the right quality control measures, crowdsourced systematic reviews can meet the ‘gold standard’ of traditional systematic reviews.
- Participant drop-out rates can be high, but providing training, well-defined tasks, feedback, and rewards can encourage sustained participation in crowdsourced systematic reviews.
- Interest in crowdsourcing for research is growing, and new tools and platforms to facilitate the process continue to be developed.