The trials methodological research agenda: results from a priority setting exercise

Abstract

Background

Research into the methods used in the design, conduct, analysis, and reporting of clinical trials is essential to ensure that effective methods are available and that clinical decisions made using results from trials are based on the best available evidence, which is reliable and robust.

Methods

An on-line Delphi survey of 48 UK Clinical Research Collaboration registered Clinical Trials Units (CTUs) was undertaken. During round one, CTU Directors were asked to identify important topics that require methodological research. During round two, their opinion about the level of importance of each topic was recorded, and during round three, they were asked to review the group’s scores and revise their own previous opinion if appropriate. Direct reminders were sent to maximise the number of responses at each round. Results were summarised using descriptive statistics.

Results

Forty-one (85%) CTU Directors responded to at least one round of the Delphi process: 25 (52%) responded in round one, 32 (67%) in round two, and 24 (50%) in round three. Only 12 (25%) responded to all three rounds and 18 (38%) responded to both rounds two and three. Consensus was achieved amongst CTU Directors that the top three priorities for trials methodological research were ‘Research into methods to boost recruitment in trials’ (considered the highest priority), ‘Methods to minimise attrition’ and ‘Choosing appropriate outcomes to measure’. Fifty other topics were included in the list of priorities, and consensus was reached that two topics, ‘Radiotherapy study designs’ and ‘Low carbon trials’, were not priorities.

Conclusions

This priority setting exercise has identified the research topics felt to be most important to the key stakeholder group of Directors of UKCRC registered CTUs. The use of robust methodology to identify these priorities will help ensure that this work informs the trials methodological research agenda, with a focus on topics that will have most impact and relevance.

Background

Clinical trials are studies conducted in patients to evaluate the potential benefits and harms of interventions such as drugs and non-pharmaceutical treatments. Clinical trials provide evidence to inform and improve patient care, and the randomised trial is considered to be at the upper level of the hierarchy of evidence (see reference [1] for example) for effectiveness studies. Methodological research in the field of clinical trials can lead to the development and refinement of methodology, as well as providing valuable insights that improve the quality of the design, conduct, analysis, and reporting of clinical trials.

The United Kingdom Clinical Research Collaboration (UKCRC) has a network of registered Clinical Trials Units (CTUs) which have provided evidence to an international panel of experts of their capability to coordinate multi-centre clinical trials (that is, having overall responsibility for the design, development, recruitment, data management, publicity and analysis of a portfolio of trials), and of robust systems to ensure the conduct and delivery of clinical trials to the highest quality standards [2]. These CTUs are key stakeholders in the UK clinical trials research community, through their involvement with all aspects of clinical trial design, conduct, analysis and reporting, across a wide range of clinical areas that encompass an array of trial designs. They are frequently faced with methodological challenges and many are actively undertaking research to address those challenges.

The joint National Institute for Health Research (NIHR) and Medical Research Council (MRC) programme for methodology research has the dual aims of developing a high-quality methodology research portfolio and establishing the United Kingdom (UK) as a world leader in methodological innovation in health research. Led by the MRC, the NIHR-MRC partnership provided £16 million in 2008 to establish a network of regional centres dedicated to trials methodology research (Hubs for Trials Methodology Research; HTMR). Located across the UK, the national HTMR network supports the research community by providing scientific leadership in methodology research and investigating new avenues of methodological research (http://www.methodologyhubs.mrc.ac.uk).

One HTMR project, led by the North West HTMR, planned to identify priorities for trials methodology research which would inform the trials methodology research agenda and allow planned research to be placed into context. This manuscript summarises a Delphi priority setting exercise involving the Directors of UKCRC registered CTUs. The Delphi approach allows the anonymous opinions of stakeholders to be obtained in a way that gives equal influence to all who participate, and avoids any individual participant being unduly influenced by the opinions of other participants [3].

Methods

The Delphi process was used to develop consensus amongst the Directors of UKCRC registered CTUs. All 48 Directors were approached through direct email contact and asked to participate in each round of the process, regardless of participation in previous rounds. A hyperlink to a web-based data collection tool was provided in the email at each round. A unique identification number was assigned to each Director so that non-responders could be identified and sent direct reminder emails, and so that responses could be linked between rounds. Details of the methods were described in a protocol, and a summary information sheet was provided to Directors. Due to the on-line format of the survey, completion of the questionnaire was regarded as consent to participate.

During round one of the Delphi process, CTU Directors were asked to provide information about their professional experience and to list any topics that they felt were important priorities for trials methodology research. Two members of the research team independently reviewed and categorised the list of topics from this round; disagreements were resolved through discussion and involvement of a third member of the research team. Topics were split into two separate lists to take forward to round two: the primary list consisted of topics identified by more than one Director, and the secondary list consisted of topics identified by a single Director.

During round two, all 48 CTU Directors were provided with the primary list (completion compulsory) and the secondary list (completion optional) of topics and asked to assign a score between 1 and 9 to indicate their opinion about each topic as a research priority. This used a similar scale to that proposed by the GRADE group [4], with scores of 1 to 3 indicating that the topic was not important, scores of 4 to 6 indicating that the topic was important but not critical, and scores of 7 to 9 indicating that the topic was critical.
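
As a concrete illustration of this banding, the short sketch below (written in Python purely for illustration; the study does not report which software was used, and the function and topic names are hypothetical) maps a 1 to 9 score to the three categories described above.

```python
# Minimal sketch of the GRADE-style banding described above.
# Hypothetical names; not taken from the study's own analysis.

def band(score: int) -> str:
    """Map a 1-9 Delphi score to its importance category."""
    if not 1 <= score <= 9:
        raise ValueError("Delphi scores must be between 1 and 9")
    if score <= 3:
        return "not important"
    if score <= 6:
        return "important but not critical"
    return "critical"

# Example: one Director's round-two scores for three illustrative topics.
scores = {"Recruitment methods": 9, "Low carbon trials": 2, "Missing data": 5}
print({topic: band(s) for topic, s in scores.items()})
```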

During round three, CTU Directors were presented with a similar web-based format to round two, but with additional information displaying the number of responders in round two, the percentage of Directors assigning each score to each topic in round two, and, where relevant, the Director’s own score from the previous round (Figure 1). As establishing consensus was the aim, each Director was asked to consider the responses from the group of Directors and to provide a revised score if they considered it appropriate.

Figure 1. Part of the on-line data collection form used in round three.

A previously suggested approach [5] was used to classify the consensus status of each topic at the end of rounds two and three (Table 1). This approach was specified before the start of round two. Results from rounds two and three are presented as percentages of responders, ranked in descending order of the critical category (scores 7 to 9).
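
To show how this summary and classification could be computed, the sketch below (again in Python, and illustrative only) calculates the percentage of responders in each band for each topic, applies consensus rules, and ranks topics by the percentage scoring them as critical. The thresholds used here (at least 70% scoring 7 to 9 and fewer than 15% scoring 1 to 3 for ‘consensus in’, and the reverse for ‘consensus out’) and the example scores are assumptions for illustration; the definitions actually applied are those set out in Table 1.

```python
# Illustrative summary and consensus classification for one Delphi round.
# Thresholds and scores below are assumptions; the study's definitions are in Table 1.

from typing import Dict, List

def summarise(scores: List[int]) -> Dict[str, float]:
    """Percentage of responders in each scoring band for one topic."""
    n = len(scores)
    return {
        "critical": 100 * sum(7 <= s <= 9 for s in scores) / n,
        "important": 100 * sum(4 <= s <= 6 for s in scores) / n,
        "not_important": 100 * sum(1 <= s <= 3 for s in scores) / n,
    }

def classify(summary: Dict[str, float]) -> str:
    """Assign a consensus status using assumed (illustrative) thresholds."""
    if summary["critical"] >= 70 and summary["not_important"] < 15:
        return "consensus in"
    if summary["not_important"] >= 70 and summary["critical"] < 15:
        return "consensus out"
    return "no consensus"

# Hypothetical round-three scores (one score per responding Director).
round_three = {
    "Methods to boost recruitment": [9, 8, 9, 7, 8, 9, 7, 5, 9, 8],
    "Radiotherapy study designs": [2, 1, 3, 2, 5, 1, 2, 3, 2, 6],
}

results = {topic: summarise(s) for topic, s in round_three.items()}
# Rank topics in descending order of the percentage scored as critical.
for topic in sorted(results, key=lambda t: results[t]["critical"], reverse=True):
    pct = results[topic]["critical"]
    print(f"{topic}: {pct:.0f}% critical -> {classify(results[topic])}")
```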

Table 1 Definition of consensus

Results

All 48 UKCRC registered CTU Directors were contacted between July 2011 and July 2012 and invited to participate in each of three rounds of the Delphi process. Forty-one CTUs (85%) responded to at least one round: 25 (52%) responded in round one, 32 (67%) in round two, and 24 (50%) in round three. Twelve (25%) responded to all three rounds and 18 (38%) responded to both rounds two and three. The pattern of responses across each round is shown in Figure 2.

Figure 2. Response pattern and characteristics of 48 Clinical Trials Unit (CTU) Directors at each round of the Delphi process.

The seven CTUs (15%) which did not respond in any round did not appear to be atypical compared to responding CTUs. They covered a range of geographical areas and clinical specialties which were already largely represented by the responders.

The 25 CTU Directors responding to round one of the survey had a median of 20 years’ experience working in clinical trials (range: 10 to 34 years; interquartile range: 15 to 22 years), and the majority had a statistical (11; 44%) or clinical (8; 32%) background. During round one, 55 topics were identified as important: 28 had been mentioned by more than one Director (primary list) and 27 were each identified by only one Director (secondary list).

For the primary list (topics identified by more than one Director in round one), consensus was not reached for any topic during round two (Table 2). However, at the end of round three there were three topics that emerged as agreed priorities (consensus in) for methodology research: (i) methods to boost recruitment, selected as critical by 83% of responders; (ii) methods to minimise attrition, selected as critical by 71% of responders; and (iii) choosing appropriate outcomes to measure, selected as critical by 71% of responders. In addition, consensus was reached that ‘radiotherapy study designs’ should not be a priority for methodology research (consensus out), selected as not important by 75% of responders.

Table 2 List of topics identified as priorities by more than one Clinical Trials Unit (CTU) Director from round one (primary list) ordered by round three priority ranks

From the secondary list (topics identified by only one Director in round one) there was consensus amongst CTU Directors that ‘low carbon trials’, selected as not important by 78% of responders, should not be a research priority (Table 3).

Table 3 List of topics identified as priorities by one director from round one (secondary list) ordered by round three priority ranks

Discussion

The Delphi process has been applied to identify topics of importance and to establish consensus on trials methodological research priorities. ‘Methods to boost recruitment in trials’ was considered the highest priority, closely followed by ‘Methods to minimise attrition’ and ‘Choosing appropriate outcomes to measure’. This group of Directors of UKCRC registered CTUs also reached consensus that methodology research into ‘radiotherapy study designs’ and ‘low carbon trials’ should not be priority areas. Whilst this work is informative in providing broad topic areas to guide future research, specific research questions within a particular topic have not been prioritised. A further exercise using similar methodology could be undertaken to identify and prioritise specific research questions.

Examples from the literature where consensus methods have been applied in clinical research include identifying gaps and prioritising trials in colorectal cancer [6], defining national research priorities in bone metastases [7], and work undertaken by the James Lind Alliance to determine important questions about treatments where uncertainty remains [8]. However, to our knowledge no priority setting exercise using rigorous consensus methods has previously been undertaken to guide methodological research in clinical trials. Within the growing literature on the elicitation of opinion and the development of core outcome sets in clinical trials, the range of consensus methods that have been applied includes expert panel meetings [9], Delphi surveys [10], Nominal Group Techniques [11], focus groups [12], individual interviews [13] and individual questionnaires [14]. The practical difficulties of getting the CTU Directors together for a face-to-face meeting, and of ensuring that each participant had the opportunity to contribute equally to the consensus process, made the on-line Delphi approach the preferred method for this particular priority setting exercise. Further rounds of the Delphi process might have identified additional consensus priorities, but this would have created a further burden for CTU Directors, who are frequently targeted for on-line surveys.

Directors of UKCRC registered CTUs were selected for this exercise as they have relevant and established expertise in designing and running clinical trials, and are knowledgeable about the current landscape of methodological research. Ensuring that trials methodological research focuses on topics that meet the needs of this group is vital: all too often, methodology is developed without considering the end-user or the demand for such research, limiting the scope for practical implementation. It is also worth noting that five out of eight HTMR Directors are also CTU Directors and participated in the Delphi process. Further work with additional stakeholder groups, such as clinical trial investigators, other methodologists, trial funders and policymakers, would be worthwhile and may show different preferences for methodological research priorities, albeit with an anticipated divergence in perspectives.

Overall, 41 (85%) of the registered CTUs participated in at least one of the three completed rounds. Whilst this is a good response rate compared to previous on-line surveys undertaken with this stakeholder group, only 12 (25%) CTUs responded to all three rounds and only 18 (38%) responded to both rounds two and three. The seven CTUs that did not respond to any round covered a range of geographical areas and clinical specialties that were largely represented by the responding CTUs. Several reminders were sent using direct email contact, and it seems reasonable to assume that lack of time was the main reason for non-response, rather than substantially different or opposing views.

Of course, as has been noted by others, ‘the output from consensus approaches is rarely an end in itself’ [3], but this prioritised list of research topics provides a robust guide for trials methodology researchers and could help to focus resources more appropriately. Furthermore, the three identified priority areas are represented across six working groups that have been established by the MRC-funded HTMR network (Evidence Synthesis, Stratified Medicine, Outcomes, Recruitment, Trial Conduct, Adaptive Designs) to undertake projects on specific issues in methodology, including the provision of workshops, development of guidance papers, and liaison with stakeholders. Results from this work have been shared with the joint MRC-NIHR methodology research programme panel which funds investigator-led and needs-led research on methods development to underpin a number of areas including clinical trials.

Conclusions

Research into methods to boost recruitment in trials, methods to minimise attrition, and methods for choosing appropriate outcomes to measure are priority topics for methodological research.

Abbreviations

CTU: Clinical Trials Unit
HTMR: Hubs for Trials Methodology Research
MAMS: Multi-arm Multi-stage
MRC: Medical Research Council
NIHR: National Institute for Health Research
UKCRC: UK Clinical Research Collaboration

References

  1. Phillips B, Ball C, Sackett D, Badenoch D, Straus S, Haynes B: Oxford Centre for Evidence-Based Medicine - levels of evidence (March 2009). Available from: http://www.cebm.net/?O=1025

  2. UKCRC Registered Clinical Trials Units Network. Available from: http://www.ukcrc-ctu.org.uk/

  3. Jones J, Hunter D: Consensus methods for medical and health services research. BMJ. 1995, 311: 376-380. doi:10.1136/bmj.311.7001.376

  4. GRADE Working Group (20 June 2013). Available from: http://www.gradeworkinggroup.org

  5. Williamson PR, Altman DG, Blazeby JM, Clarke M, Devane D, Gargon E: Developing core outcome sets for clinical trials: issues to consider. Trials. 2012, 13: 132. doi:10.1186/1745-6215-13-132

  6. Dear RF, Barratt AL, Evans A, Simes J, Newsom J, Kent D: Identifying and prioritising gaps in colorectal cancer trials research in Australia. Med J Aust. 2012, 197 (9): 507-511. doi:10.5694/mja12.10623

  7. Wu J, Bezjak A, Chow E, Cross P, Genest P, Grant N: A consensus development approach to define national research priorities in bone metastases: proceedings from NCIC CTG workshop. Clin Oncol. 2003, 15 (8): 496-499. doi:10.1016/j.clon.2003.07.004

  8. James Lind Alliance Guidebook. Available from: http://www.jlaguidebook.org/ (last accessed 20 January 2014)

  9. Reddel HK, Taylor DR, Bateman ED, Boulet LP, Boushey HA, Busse WW: An official American Thoracic Society/European Respiratory Society statement: asthma control and exacerbations: standardizing endpoints for clinical asthma trials and clinical practice. Am J Respir Crit Care Med. 2009, 180 (1): 59-99. doi:10.1164/rccm.200801-060ST

  10. Schmitt J, Langan S, Stamm T, Williams HC, the Harmonizing Outcome Measurements in Eczema (HOME) Delphi panel: Core outcome domains for controlled trials and clinical recordkeeping in eczema: international multiperspective Delphi consensus process. J Invest Dermatol. 2011, 131 (3): 623-630. doi:10.1038/jid.2010.303

  11. Tugwell P, Boers M, Brooks P, Simon L, Strand V, Idzerda L: OMERACT: an international initiative to improve outcome measurement in rheumatology. Trials. 2007, 8: 38. doi:10.1186/1745-6215-8-38

  12. Sanderson T, Morris M, Calnan M, Richards P, Hewlett S: What outcomes from pharmacologic treatments are important to people with rheumatoid arthritis? Creating the basis of a patient core set. Arthritis Care Res. 2010, 62 (5): 640-646. doi:10.1002/acr.20034

  13. Kirwan J, Heiberg T, Hewlett S, Hughes R, Kvien T, Ahlmen M: Outcomes from the patient perspective workshop at OMERACT 6. J Rheumatol. 2003, 30 (4): 868-872.

  14. Rosenbaum SE, Glenton C, Nylund HK, Oxman AD: User testing and stakeholder feedback contributed to the development of understandable and useful summary of findings tables for Cochrane reviews. J Clin Epidemiol. 2010, 63 (6): 607-619. doi:10.1016/j.jclinepi.2009.12.013

Acknowledgements

We thank James Oyee and Andy Mather for help with developing the web-based survey. We thank Tom Walley, Bridget Young and Kerry Woolfall for their input on an early version of the protocol. CTS, HH and PRW are supported by Medical Research Council funding from the North West Hub for Trials Methodology Research at the University of Liverpool (G0800792). MC is supported by Medical Research Council funding from the All Ireland Hub for Trials Methodology Research (G0901530), at the Queen’s University of Belfast. JB is supported by Medical Research Council funding from the ConDuCT Hub for Trials Methodology Research (G0800800), at the University of Bristol. The funders played no role in the design of the study; in the collection, analysis, and interpretation of data; in the writing of the manuscript; or in the decision to submit the manuscript for publication.

Grant support

Medical Research Council (grant numbers G0800792, G0901530, G0800800).

Author information

Corresponding author

Correspondence to Catrin Tudur Smith.

Additional information

Competing interests

All authors are engaged in methodology research, and might use the results of this survey when seeking funding for such research. The authors declare that they have no competing interests.

Authors’ contributions

CTS contributed to the design of the study and the on-line data collection tool, coordinated the data collection, coded text responses, summarised results and wrote the manuscript. HH contributed to the design of the study and the on-line data collection tool, and contributed to manuscript writing. MC and JB contributed to the design of the study and the on-line data collection tool, coded text responses, and contributed to manuscript writing. PRW proposed the idea of a priority setting exercise, contributed to the design of the study and the on-line data collection tool, coded text responses, and contributed to manuscript writing. All authors read and approved the final manuscript.

Rights and permissions

This article is published under license to BioMed Central Ltd. This is an Open Access article distributed under the terms of the Creative Commons Attribution License (http://creativecommons.org/licenses/by/2.0), which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited. The Creative Commons Public Domain Dedication waiver (http://creativecommons.org/publicdomain/zero/1.0/) applies to the data made available in this article, unless otherwise stated.

About this article

Cite this article

Tudur Smith, C., Hickey, H., Clarke, M. et al. The trials methodological research agenda: results from a priority setting exercise. Trials 15, 32 (2014). https://doi.org/10.1186/1745-6215-15-32
