Systematic reviews are an important research activity. By summarising all relevant evidence, they tell us what is currently known about a particular topic, and what needs further research.
But systematic reviews are also time-consuming. A small research team can take well over a year to screen, review and synthesise potentially thousands of citations.
In light of these challenges, crowdsourcing has emerged as a way to do systematic reviews more quickly, while involving new communities in the research process. In crowdsourced reviews, large numbers of people are recruited to do small tasks or activities. Citation screening is one such task, where members of the ‘crowd’ can help screen publications from a literature search to see which ones are relevant.
By recruiting large groups of people to help screen citations, this project seeks to advance understanding of the best methods for crowdsourcing in systematic reviews, and to evaluate how the work of the ‘crowd’ compares to that of traditional researchers.
While most research into crowdsourcing in systematic reviews has focused on the crowd’s ability to screen for randomised controlled trials, our study is focused on crowdsourced citation screening for more complicated reviews that bring together all sorts of study designs. Toward that aim, we are experimenting with a number of different ways of using crowdsourcing in systematic reviews.
Using online research participation platforms, we are conducting a series of pilots to explore:
Some of the systematic reviews conducted as part of this project will be used to inform other THIS Institute projects.