
Usability and acceptability of four systematic review automation software packages: a mixed method design

Abstract

Aim

New software packages help to improve the efficiency of conducting a systematic review by automating key steps of the review process. The aim of this study was to gather qualitative data on the usability and acceptability of four systematic review automation software packages (Covidence, SRA-Helper for EndNote, Rayyan and RobotAnalyst) for the citation screening step of a systematic review.

Methods

We recruited three volunteer systematic reviewers and asked them to use allocated software packages during citation screening. They then completed a 12-item online questionnaire which was tailored to capture data for the software packages used.

Findings

All four software packages were reported to be easy or very easy to learn and use. SRA-Helper for EndNote was most favoured by participants for screening citations and Covidence for resolving conflicts. Overall, participants reported that SRA-Helper for EndNote would be their software package of choice, primarily due to its efficiency.

Conclusion

This study identified a number of considerations that systematic reviewers can use as a basis for deciding which software to use for the citation screening and dispute resolution steps of a systematic review.


Background

Systematic reviews are the foundation of evidence-based practice. Yet, despite advancements in the automation of some of the steps of systematic reviews [1, 2], conducting a systematic review remains a largely manual process that requires considerable expertise, time and financial resources. Software packages have recently become available to help improve the efficiency of some of the steps of the systematic review process, including literature searching, de-duplication of search results, screening citations and resolving conflicts. Each software package has its strengths and limitations. This study aimed, firstly, to assess the usability (ease of use and learnability) and acceptability (whether the software is sufficient to serve the purpose for which it is intended) of four commonly used systematic review automation software packages: Covidence, SRA-Helper for EndNote, Rayyan and RobotAnalyst; and secondly, to identify the key advantages and disadvantages of each, as perceived by users.

Methods

Three volunteer systematic reviewers, who were commencing a systematic review (882 total citation records to screen), were recruited via email invitation from the study authors (GC and EB). The systematic reviewers were asked to use allocated software packages for the title/abstract screening and dispute resolution steps of the systematic review and to complete an online questionnaire tailored to capture data for each software package used. One participant was assigned to use and review Covidence and SRA-Helper for EndNote (BJ), one was assigned to Rayyan and RobotAnalyst (FI), and the final reviewer was assigned to use and review all four packages: Covidence, SRA-Helper for EndNote, Rayyan and RobotAnalyst (AMS). Each software package was used to screen 220 or 221 references, for a total of 882 references screened.

Covidence (www.covidence.org) is a web-based screening and data extraction tool; it is one of the tools recommended by the Cochrane Collaboration [3]. Covidence allows authors to import and screen citations and full-text articles, resolve conflicts, extract data using customisable forms and export results in various formats.

SRA-Helper for EndNote (https://github.com/CREBP/EndNoteHelper) is a downloadable automation script which works as an add-on to EndNote; it is part of the Systematic Review Accelerator (SRA). SRA-Helper for EndNote allows users to map keyboard keys (e.g. 1, 2, 3) to folders (e.g. include, exclude, background). When the user highlights a reference (e.g. Smith 1998) and presses the mapped key (e.g. 1), the reference automatically moves to the corresponding folder (e.g. include).
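The following is a minimal conceptual sketch of this keypress-to-folder behaviour, written in Python purely for illustration; it is not the SRA-Helper for EndNote implementation, and the key bindings, folder names and record simply reuse the examples above.

```python
# Conceptual sketch only: models the idea of mapping keyboard keys to
# screening folders, as described above. Not the actual SRA-Helper script.
KEY_TO_FOLDER = {"1": "include", "2": "exclude", "3": "background"}

def screen(reference, key, folders):
    """File the highlighted reference into the folder mapped to the pressed key."""
    folder = KEY_TO_FOLDER.get(key)
    if folder is None:
        raise ValueError(f"No folder is mapped to key {key!r}")
    folders[folder].append(reference)

folders = {name: [] for name in KEY_TO_FOLDER.values()}
screen("Smith 1998", "1", folders)  # pressing '1' files the record under 'include'
print(folders)  # {'include': ['Smith 1998'], 'exclude': [], 'background': []}
```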

Rayyan (https://rayyan.qcri.org) is a web-based application which allows multiple authors to create and collaborate on systematic reviews. Throughout the citation screening process, Rayyan offers suggestions for article inclusion based on the authors’ prior selections [4].

RobotAnalyst (www.nactem.ac.uk/robotanalyst) is a web-based application, developed to support the citation screening phase of systematic reviews. RobotAnalyst prioritises references by relevancy predictions and updates the predictive model after the author makes each screening selection.
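To make the prediction and prioritisation behaviour described for Rayyan and RobotAnalyst concrete, the sketch below shows a generic 'screen, retrain, re-rank' loop. It is an illustration only: a TF-IDF and logistic regression classifier stands in for the tools' real relevance models, and none of the function names reflect either tool's actual code or API.

```python
# Illustrative sketch of a "screen one record, retrain, re-rank" loop of the
# kind described above; not RobotAnalyst's or Rayyan's actual model or API.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression

def rank_unscreened(screened_texts, screened_labels, unscreened_texts):
    """Order unscreened titles/abstracts by predicted relevance, most likely first.

    screened_labels must contain at least one include (1) and one exclude (0).
    """
    vectoriser = TfidfVectorizer()
    model = LogisticRegression()
    model.fit(vectoriser.fit_transform(screened_texts), screened_labels)
    scores = model.predict_proba(vectoriser.transform(unscreened_texts))[:, 1]
    return [text for _, text in sorted(zip(scores, unscreened_texts), reverse=True)]

# After each new screening decision, the decided record moves from the
# unscreened pool to the screened pool and the ranking is recomputed,
# mirroring the continual model updates both tools describe.
```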

To identify the advantages and disadvantages of each software package, we used a 12-item questionnaire which included 3 Likert-scale questions and 9 free text questions (Additional file 1: Table S1). Questions 1–8 focused on the usability and acceptability of each software package; these questions were repeated for each software package being assessed. Questions 9–11 were comparator questions, which assessed the user’s preference for one software tool over another when screening citations and resolving conflicts. We used Qualtrics (www.qualtrics.com) to disseminate the questionnaires and collect the data.

Results

The three systematic reviewers answered all of the questions presented in the qualitative questionnaire.

We summed the participants’ quantitative responses for each software package, as displayed in Table 1; because each package was rated by two participants on 5-point scales, the lowest possible score per question was 2 points and the highest 10 points. All four software packages were reported to be easy or very easy to learn and use (Table 1). Covidence had the highest rating for general usability, scoring 9 out of a possible 10 points. This may be due to its ‘straightforward process and simple layout’ and to its being available online without the need for a download, ‘making it versatile/accessible when out of office’. SRA-Helper for EndNote was rated fastest for response time (10/10), as ‘it’s not dependent on internet connection’, and RobotAnalyst the slowest (5/10). When scores for ease of learning, usability and response time were combined, SRA-Helper for EndNote scored highest (28/30).

Table 1 Sum of quantitative responses for each software package
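For transparency, the arithmetic behind these totals is sketched below with hypothetical individual ratings; only the combined totals reported in the text (e.g. 28/30 for SRA-Helper for EndNote) come from the study itself, and the per-participant scores shown are invented for illustration.

```python
# Worked example of the scoring described above: each package was rated by
# two participants on three 5-point Likert questions, so each question sums
# to between 2 and 10 and the combined total to between 6 and 30.
def combined_score(ratings_by_question):
    """Sum two participants' 1-5 ratings per question, then total across questions."""
    per_question = {q: sum(r) for q, r in ratings_by_question.items()}
    return per_question, sum(per_question.values())

# Hypothetical ratings consistent with a 28/30 total:
example = {"learnability": (5, 5), "usability": (4, 5), "response time": (5, 4)}
print(combined_score(example))
# ({'learnability': 10, 'usability': 9, 'response time': 9}, 28)
```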

SRA-Helper for EndNote was most favoured by participants for screening citations and Covidence for resolving conflicts (Table 2). Overall, participants reported that in conducting future systematic reviews, SRA-Helper for EndNote would be their software package of choice, primarily due to its efficiency (Table 2).

Table 2 Comparison between software packages

The key advantages considered relevant by the systematic reviewers included: visibility of citations already screened versus those yet to be screened (e.g. in the form of a countdown or summary of progress); the ability to highlight key terms for inclusion and exclusion; the ability to use keyboard shortcuts; a fast software response time (e.g. between the decision to include and the reference moving off the screen into the ‘included’ folder); the ability to add notes or labels to references; guidance of decisions for subsequent references on the basis of prior decisions; and ease of learning how to use the tool and intuitiveness of the layout.

The key disadvantages considered relevant by the systematic reviewers included: glitches in the software (e.g. crashes); a slow software response time; inability to see progress (references screened versus those remaining to be screened); inability to reverse a decision once a reference had been included or excluded; the requirement to download or install software; inability to highlight inclusion/exclusion terms; unreliable predictions; and a poor layout (e.g. decision buttons placed too close together). The advantages and disadvantages of using each automation software package to screen citations and resolve conflicts are summarised in Table 3.

Table 3 Summary of the reported advantages and disadvantages of each software package

Discussion

Overall, the systematic reviewers found all four of the software tools easy to learn and use. SRA-Helper for EndNote was strongly preferred (28/30 points), with RobotAnalyst, Covidence and Rayyan scoring lower but similarly (22, 23 and 24/30, respectively). The strong preference for SRA-Helper for EndNote may be explained by its ease of learning and its very quick response time, which is due to it being a desktop (rather than web-based) tool.

Among the key characteristics considered relevant were display of screening progress, ability to revise decisions, ability to highlight inclusion/exclusion terms, ability to use keyboard shortcuts rather than the mouse, software response time and reliability (i.e. no bugs or crashes). Users, particularly those newer to systematic reviews, also cited the intuitiveness of the layout and ease of learning how to use the tool as important.

It is worth emphasising that what is considered an advantage or disadvantage will vary between systematic reviewers: one may prefer a web-based tool that is somewhat slower to respond but available anywhere without installation, whilst another may prefer a locally installed tool with a faster response time, accepting the need to download and install it. However, whilst individual preferences will vary, our aim here was to identify what those key considerations are, in order to help systematic reviewers (particularly those new to these tools) make their own decisions about which to use.

Although the sample of systematic reviewers included in the present assessment is small (n = 3), it deliberately included both novice systematic reviewers and an experienced systematic reviewer. We are therefore confident that the considerations they raised are likely to be relevant to the wider systematic review community, although this can only be formally assessed with a larger sample in future research.

Ongoing and future developments in this area, including automated screening and text mining, would help to further increase efficiency and reduce the human effort required [1, 2].

Conclusion

The results of this qualitative study highlight multiple advantages and disadvantages of automation software packages for screening in systematic reviews. The outcomes show that SRA-Helper for EndNote was the preferred software due to its fast response time, user-friendly setup, intuitive layout and the fact that it does not rely on an internet connection. This study identified a number of relevant considerations for systematic review software packages, which individual systematic reviewers can take into account when performing the citation screening and dispute resolution steps of a systematic review.

Availability of data and materials

All data generated or analysed during this study are included in this published article and its supplementary information files.

Abbreviations

SRA: Systematic Review Accelerator

References

  1. Tsafnat G, Glasziou P, Karystianis G, Coiera E. Automated screening of research studies for systematic reviews using study characteristics. Syst Rev. 2018;7:64.


  2. O’Mara-Eves A, Thomas J, McNaught J, Miwa M, Ananiadou S. Using text mining for study identification in systematic reviews: a systematic review of current approaches. Syst Rev. 2015;4:5.


  3. Covidence. Cochrane Community. https://community.cochrane.org/help/tools-and-software/covidence. Accessed 3 Jul 2018.

  4. Ouzzani M, Hammady H, Fedorowicz Z, Elmagarmid A. Rayyan-a web and mobile app for systematic reviews. Syst Rev. 2016;5:210.



Acknowledgements

Not applicable

Funding

GC, FI and BJ received no funding with respect to this study. AMS and EB are supported by an NHMRC grant, APP1044904 (Centre for Research Excellence in Minimising Antibiotic Resistance for Acute Respiratory Infection, CREMARA).

Author information


Contributions

GC developed the interview schedule and analysed the data. AMS, FI and BJ tested the software and completed the questionnaires. EB conceived and oversaw the study. All authors read, contributed to and approved the final manuscript.

Corresponding author

Correspondence to Gina Cleo.

Ethics declarations

Ethics approval and consent to participate

As this research involved staff participating by virtue of their professional role, ethics approval was not sought.

Consent for publication

Not applicable

Competing interests

SRA-Helper for EndNote was developed by our research centre (Centre for Research in Evidence-Based Practice). The authors declare that they have no other competing interests.

Additional information

Publisher’s Note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

Additional file

Additional file 1:

Table S1. Qualitative questionnaire schedule. (DOCX 16 kb)

Rights and permissions

Open Access This article is distributed under the terms of the Creative Commons Attribution 4.0 International License (http://creativecommons.org/licenses/by/4.0/), which permits unrestricted use, distribution, and reproduction in any medium, provided you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons license, and indicate if changes were made. The Creative Commons Public Domain Dedication waiver (http://creativecommons.org/publicdomain/zero/1.0/) applies to the data made available in this article, unless otherwise stated.


About this article


Cite this article

Cleo, G., Scott, A.M., Islam, F. et al. Usability and acceptability of four systematic review automation software packages: a mixed method design. Syst Rev 8, 145 (2019). https://0-doi-org.brum.beds.ac.uk/10.1186/s13643-019-1069-6

