
Crowdsourcing in systematic reviews

Background

Systematic reviews are an important research activity. By summarising all relevant evidence, they tell us what is currently known about a particular topic, and what needs further research.

But systematic reviews are also time-consuming. A small research team can take well over a year to screen, review and synthesise potentially thousands of citations.

In light of these challenges, crowdsourcing has emerged as a way to do systematic reviews more quickly, while involving new communities in the research process. In crowdsourced reviews, large numbers of people are recruited to do small tasks or activities. Citation screening is one such task, where members of the ‘crowd’ can help screen publications from a literature search to see which ones are relevant.

By recruiting large groups of people to help screen citations, this project sought to advance understanding of the best methods for crowdsourcing in systematic reviews, and to evaluate how the work of the ‘crowd’ compares to that of traditional researchers.

Approach

While most research into crowdsourcing in systematic reviews has focused on the crowd’s ability to screen for randomised controlled trials, our study focused on crowdsourced citation screening for more complex reviews that bring together a range of study designs. Toward that aim, we experimented with a number of different ways of using crowdsourcing in systematic reviews.

Using online research participation platforms, we explored how screening decisions on titles and abstracts made by a non-expert crowd of volunteers compared with the decisions made by our in-house review team, as part of a complex systematic review. We also explored whether crowdsourced screening may benefit from alternative strategies – for example, screening for exclusion rather than inclusion.
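
For illustration only, the sketch below shows one way a ‘screening for exclusion’ rule could work: a citation is dropped only when several crowd contributors independently vote to exclude it, and everything else is retained for the research team to check. The function name, vote format and threshold are assumptions made for this example, not the project’s actual workflow.

```python
from collections import Counter

def screen_for_exclusion(votes_by_citation, exclude_threshold=3):
    """Split citations into excluded and retained sets based on crowd votes.

    votes_by_citation: dict mapping citation id -> list of 'include'/'exclude' votes.
    Returns (excluded_ids, retained_ids).
    """
    excluded, retained = [], []
    for citation_id, votes in votes_by_citation.items():
        counts = Counter(votes)
        # Exclude only when enough contributors agree the citation is irrelevant
        # and nobody has voted to include it; otherwise keep it in the review.
        if counts["exclude"] >= exclude_threshold and counts["include"] == 0:
            excluded.append(citation_id)
        else:
            retained.append(citation_id)
    return excluded, retained


if __name__ == "__main__":
    # Hypothetical votes from three crowd contributors per citation.
    votes = {
        "cit-001": ["exclude", "exclude", "exclude"],
        "cit-002": ["exclude", "include", "exclude"],
        "cit-003": ["include", "include"],
    }
    print(screen_for_exclusion(votes))  # cit-001 excluded; cit-002, cit-003 retained
```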

Some of the systematic reviews conducted as part of this project will be used to inform other THIS Institute projects.

Results

Seventy-eight crowd contributors completed a screening task. The crowd correctly identified 84% of the studies included in the systematic review (sensitivity) and 99% of the excluded studies (specificity). The crowd completed the task in 33 hours, compared with 410 hours for the review team. The task was replicated with 85 new contributors and a revised crowd screening process; in the replication, the crowd’s ability to identify studies for inclusion rose to 96%. Crowd contributors reported positive experiences of the task.
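
As a rough illustration (not the project’s analysis code), accuracy figures like the sensitivity and specificity above could be computed from paired crowd and review-team decisions along the following lines; the example data are made up.

```python
def screening_accuracy(crowd, reference):
    """Return (sensitivity, specificity) of crowd decisions against a reference standard.

    crowd, reference: lists of booleans, True = include, False = exclude.
    Sensitivity: share of reference includes the crowd also included.
    Specificity: share of reference excludes the crowd also excluded.
    """
    true_includes = sum(c and r for c, r in zip(crowd, reference))
    true_excludes = sum((not c) and (not r) for c, r in zip(crowd, reference))
    ref_includes = sum(reference)
    ref_excludes = len(reference) - ref_includes
    sensitivity = true_includes / ref_includes if ref_includes else float("nan")
    specificity = true_excludes / ref_excludes if ref_excludes else float("nan")
    return sensitivity, specificity


if __name__ == "__main__":
    # Hypothetical decisions on ten citations (review team = reference standard).
    reference = [True, True, True, False, False, False, False, False, False, False]
    crowd     = [True, True, False, False, False, False, False, False, False, True]
    sens, spec = screening_accuracy(crowd, reference)
    print(f"Sensitivity: {sens:.0%}, Specificity: {spec:.0%}")
```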

It is feasible to recruit and train a crowd to perform accurate topic-based citation screening for a mixed-studies systematic review. Given long review production times, crowd screening may allow reviews to be conducted more quickly, with minimal loss of citation screening accuracy. Further questions remain about the approach, including time and cost savings, the resources required, and screening for inclusion.

Funding and ethics

This study is funded by the Health Foundation’s grant to The Healthcare Improvement Studies Institute (THIS Institute). It is independently led by THIS Institute.
