Designing high-quality implementation research: development, application, feasibility and preliminary evaluation of the implementation science research development (ImpRes) tool and guide

Abstract

Background

Designing implementation research can be a complex and daunting task, especially for applied health researchers who have not received specialist training in implementation science. We developed the Implementation Science Research Development (ImpRes) tool and supplementary guide to address this challenge and provide researchers with a systematic approach to designing implementation research.

Methods

A multi-method and multi-stage approach was employed. An international, multidisciplinary expert panel engaged in an iterative brainstorming and consensus-building process to generate the core domains of the ImpRes tool, representing core implementation science principles and concepts that researchers should consider when designing implementation research. Simultaneously, an iterative process of reviewing the literature and seeking expert input informed the development and content of the tool. Once consensus had been reached, specialist expert input was sought on involving and engaging patients/service users and on economic evaluation. ImpRes was then applied to 15 implementation and improvement science projects across the National Institute for Health Research (NIHR) Collaboration for Leadership in Applied Health Research and Care (CLAHRC) South London, a research organisation in London, UK. Researchers who applied the ImpRes tool completed an 11-item questionnaire evaluating its structure, content and usefulness.

Results

Consensus was reached on ten implementation science domains to be considered when designing implementation research. These include implementation theories, frameworks and models, determinants of implementation, implementation strategies, implementation outcomes and unintended consequences. Researchers who used the ImpRes tool found it useful for identifying project areas where implementation science is lacking (median 5/5, IQR 4–5) and for improving the quality of implementation research (median 4/5, IQR 4–5) and agreed that it contained the key components that should be considered when designing implementation research (median 4/5, IQR 4–4). Qualitative feedback from researchers who applied the ImpRes tool indicated that a supplementary guide was needed to facilitate use of the tool.

Conclusions

We have developed a feasible and acceptable tool, and supplementary guide, to facilitate consideration and incorporation of core principles and concepts of implementation science in applied health implementation research. Future research is needed to establish whether application of the tool and guide has an effect on the quality of implementation research.

Background

Evidence-based and cost-effective interventions consistently fail to be implemented into routine practice and policy [1], and even when such interventions are implemented, this is an effortful, unpredictable and typically slow process [2]. As a result, despite increasing pressure to improve the safety and quality of financially overstretched healthcare services, patients fail to receive optimal care and healthcare organizations fail to benefit from cost saving opportunities [1]. Over the past few years, increased and focused efforts to close the evidence-to-practice gap have resulted in further recognition of the importance of implementation science as a conceptual and methodological approach to translate evidence into routine practice [1, 3, 4].

However, despite the rapid growth of implementation science, designing implementation research remains a complex and daunting task for health researchers who have not completed specialist training in implementation science. Crable et al. [5] operationalized Proctor et al.’s recommended ten key ingredients for writing implementation research grant proposals [6] to quantitatively evaluate the quality of proposals on ten criteria. Thirty pilot grant applications submitted to a call for implementation and improvement science projects at an academic medical center were assessed. Crable et al. reported that most proposals assessed performed poorly on most of the ten criteria [5]. For example, 67% of proposals failed to identify and describe the implementation strategies to be used and/or incorrectly described the intervention as an implementation strategy. Furthermore, 70% of proposals failed to describe implementation or improvement science related outcomes and/or failed to link outcomes to the proposed study aims and/or the unit of analysis was inappropriate for the proposed study.

The challenge of designing implementation research is exacerbated by the fact that implementation research cuts across diverse scientific fields, resulting in the inevitable difficulty of identifying, appraising and synthesising relevant literature to inform design decisions. A recently published editorial in Implementation Science highlighted the need for capacity building initiatives in the research and practice of implementation to fulfil the demand for expertise in implementation science [7].

Concerns regarding the lack of guidance for designing implementation research have been raised and, to a certain extent, are being addressed. For example, Waltz et al. highlighted that guidance regarding how best to select implementation strategies is lacking [8]. Similarly, the lack of guidance on how to select appropriate implementation theories and frameworks has been highlighted, and efforts to develop such guidance are currently underway [9].

Whilst these efforts are worthwhile and necessary to advance the science of implementation, a tool consolidating design guidance, to the best of our knowledge, does not currently exist. As a result, healthcare researchers without access to specialist implementation science expertise are tasked with identifying and assimilating design guidance and recommendations reported across a wide range of journals when designing implementation research, or research with substantial implementation components. This is a challenging task, not always successfully accomplished—as evidenced by the aforementioned literature.

To address this challenge, we report the development, application, feasibility and preliminary evaluation of the Implementation Science Research Development (ImpRes) tool and supplementary guide to provide health researchers with a step-by-step approach to designing high-quality implementation research. Specifically, we aimed to (1) identify the core principles and concepts that research teams should consider when designing high-quality implementation research and (2) identify and synthesize key methodological/conceptual literature containing guidance and recommendations for designing and evaluating implementation research. Building on these aims, we sought to develop, apply and evaluate a tool that guides researchers through the key principles and research design considerations of implementation science when designing studies. The ImpRes tool and guide aim to enable research teams to design high-quality implementation research and, as a result, more effectively implement evidence-based interventions into routine practice, thereby reducing research waste and improving health outcomes.

Methods

A multi-stage, multi-method approach was used to develop and evaluate the ImpRes tool and subsequently to develop its supplementary guide (see Fig. 1).

Fig. 1 Development of the ImpRes tool and supplementary guide

Stage 1: development of the ImpRes tool (July 2015–May 2016)

Development of the ImpRes tool began in July 2015. Development of the tool was motivated firstly by the need to evaluate the degree to which the core principles and concepts of implementation science were embedded into research projects conducted within a research organisation—the National Institute for Health Research (NIHR) Collaboration for Leadership in Applied Health Research and Care (CLAHRC) South London; and secondly by the need to provide implementation research support to research teams as required. After an initial review of the literature and consultation with experts in the field, it was evident that a tool (or framework) allowing this form of evaluation did not exist.

The NIHR CLAHRC South London is a collaborative partnership between two universities (King's College London and St George's, University of London), four geographically surrounding health service organisations (Guy's and St Thomas' NHS Foundation Trust, King's College Hospital NHS Foundation Trust, St George's University Hospitals NHS Foundation Trust and South London and Maudsley NHS Foundation Trust) and two other collaborating organisations (the NHS England-funded Health Innovation Network (HIN), south London's academic health science network; and King's Health Partners, the academic health sciences centre in south-east London). NIHR CLAHRC South London was established in 2014 as one of 13 such research organisations supporting the conduct and application of applied health research across England; it has a catchment area of approximately 3.5 million people living in and/or using health services within south London, England. NIHR CLAHRC South London conducts research spanning a diversity of healthcare areas, including clinical (e.g. maternity and women's health and psychosis) and public health. In addition to collaborations within London, a significant body of the research conducted by NIHR CLAHRC South London researchers involves national and international collaborators. For example, researchers in our 'diabetes theme' are leading a hybrid effectiveness-implementation study running across UK and US treatment sites [10]. Similarly, a wide-ranging portfolio of hybrid effectiveness-implementation and implementation research is being conducted across the research infrastructure, in many cases with international collaborators [11], as well as methodological implementation science research [12]. The NIHR CLAHRC South London includes the Centre for Implementation Science (CIS), which consists of a multidisciplinary group of implementation and improvement scientists, statisticians, health economists and behavioural and social science experts [13].

Development of the ImpRes tool involved an iterative process of:

1a. Expert brainstorming and consensus-building sessions (July 2015–December 2015)

The CIS research team, as well as international experts in the fields of implementation science and other healthcare disciplines who were members of the CIS Scientific Advisory Panel (SAP), participated in three brainstorming sessions which informed the initial development and content of the ImpRes tool. Brainstorming sessions took place at, and were the focus of, the CIS SAP face-to-face meetings over a 6-month period (July 2015–December 2015); additional input and feedback was sought from the CIS SAP virtually (via email). Content suggestions were collated by LH. Subsequently, the CIS research team participated in a consensus-building session which involved reviewing and considering the content suggestions for inclusion in the ImpRes tool.

The CIS SAP is an international multidisciplinary panel including clinicians and academics with expertise in implementation science, improvement science, social science, health policy, biostatistics, health economics, health services research and patient and public involvement. A full list of members of the CIS SAP is available in the ImpRes guide [14]. The CIS SAP convenes on a quarterly basis and offers critical appraisal of, and advice on, CIS research and strategy.

1b. Identification of key methodological/conceptual literature containing guidance and recommendations for designing and evaluating implementation research (July 2015–March 2018)

Concurrently with the consensus-building brainstorming sessions, and informed by the emerging ImpRes domains, a review of the literature was undertaken to identify key methodological/conceptual articles and reports containing guidance and recommendations relating to the design and evaluation of implementation research. Searching via PubMed and Google, articles and reports containing design guidance relating to the domains of ImpRes were identified and screened for relevance. Google was searched to identify gray literature and relevant content that would not otherwise have been identified through PubMed. For example, in addition to peer-reviewed publications, searching via Google allowed for the identification of websites, reports, webinars and blogs providing implementation research design guidance. The search was conducted between July 2015 and March 2018, with no date restrictions. Basic search terms reflected the emerging core ImpRes domains (e.g. 'implementation outcomes' and 'implementation strategies') and were used to identify relevant papers in PubMed and Google. Further, we also searched for key articles by prominent authors/research groups leading work on specific domains of implementation science, including guidelines and recommendations relating to implementation outcomes (Enola Proctor) and implementation strategies (Byron Powell). Furthermore, identification of key literature was informed by the expert knowledge of the CIS SAP and the CIS implementation research team. In this way, the ImpRes tool represents a consolidation and unification of key implementation science constructs, informed by experts and key implementation science literature.

1c. Additional specialist input into specific ImpRes domains (January 2016–March 2016)

Once the core domains of the ImpRes tool were established (i.e. through consensus amongst the CIS research team and SAP), additional specialist expert input into the content of specific domains of the ImpRes tool was sought. Specifically, the lead researcher (LH) met with NIHR CLAHRC South London specialists in patient and public involvement, and specialists in health economics (AH, a member of the research team), to review and co-design the content of the ‘patient and public involvement and engagement’ and ‘economic evaluation’ ImpRes domains, respectively. Specialist input was sought as PPI and economic evaluation are specialist fields, both of which are considered important facets of implementation research [15, 16].

1d. Pilot testing and refinement (December 2015–May 2016)

The ImpRes tool was piloted by an experienced health services researcher without implementation science expertise (i.e. the intended target audience of the ImpRes tool). The ImpRes tool was completed independently by the researcher, using a research project that the researcher was leading on. Following completion, the lead researcher (LH) sought feedback on the ImpRes tool.

Stage 2: application of the ImpRes tool (June 2016–August 2018)

The ImpRes tool was applied to 15 implementation and improvement science research projects prospectively (i.e. at project design stage) or retrospectively (i.e. after all project design decisions had been made and/or the project had been completed), across the NIHR CLAHRC South London and partner healthcare organizations. At the time of application, the ImpRes tool was completed by researchers at varying stages of developing implementation science expertise following the NIHR CLAHRC South London’s launch in 2014. The research projects focused on a wide range of healthcare areas and ranged from hybrid type 1 effectiveness-implementation studies to pure implementation research (see ‘Results’ section, Table 3).

To suit the needs and preferences of researchers, a pragmatic and flexible approach was used to apply the ImpRes tool. For example, some researchers felt confident applying the ImpRes tool without the guidance of an implementation scientist (LH), whereas others lacked confidence in applying the tool independently and welcomed expert guidance. To suit the needs of all, researchers were given the option to either (1) complete the tool independently and then participate in a one-to-one feedback session after the implementation scientist had reviewed the completed ImpRes tool, or (2) complete the tool with direct facilitation provided by the implementation scientist. Facilitation consisted of explaining the rationale for developing the ImpRes tool, its aims and an overview of the ten domains (via one-to-one or group presentation sessions). Researchers applying the ImpRes tool were asked to complete ten sections corresponding to the ten domains of ImpRes. A number of domains required researchers to provide written responses in the form of a paragraph (e.g. in the 'Implementation Strategies' domain, researchers were asked to describe the implementation strategies that they intended to use/had used). Other sections involved completion of a checklist (e.g. the 'Implementation Outcomes' and 'Patient and Public Involvement and Engagement' domains are formatted as checklists). Researchers were asked to complete ImpRes as best they were able. Although researchers were not asked to document the amount of time taken to complete the ImpRes tool, we are aware that many researchers invested a considerable amount of time completing ImpRes, especially when it was applied prospectively (i.e. at project design stage).

Stage 3: evaluation of the ImpRes tool (June 2016–August 2018)

3a: Questionnaire (June 2016–August 2018)

After completing the ImpRes tool, participating researchers completed a short questionnaire indicating their level of agreement with 11 statements relating to the structure, content and usefulness of the ImpRes tool. Responses were provided on 5-point Likert scales, ranging from 1 (strongly disagree) to 5 (strongly agree). For example, participants responded to the following statements relating to the structure and usefulness of the ImpRes tool, respectively: 'The ImpRes tool is easy to understand' and 'The ImpRes tool is useful for identifying project areas where implementation science is lacking.' Participants were also encouraged to provide free-text comments and critique. One questionnaire per ImpRes tool application was completed.
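As an aside for readers wishing to reproduce this style of summary, the short Python sketch below shows how 5-point Likert responses can be summarised per item as a median and interquartile range, mirroring the median (IQR) reporting used in the 'Results' section. The item wording and ratings shown are hypothetical illustrations, not the study data.

import statistics

# Hypothetical 5-point Likert ratings (1 = strongly disagree ... 5 = strongly agree),
# one list of researcher responses per questionnaire item; values are illustrative only.
responses = {
    "The ImpRes tool is easy to understand": [4, 5, 4, 3, 5, 4, 4],
    "The ImpRes tool is useful for identifying project areas where implementation science is lacking": [5, 4, 5, 5, 4, 5, 5],
}

for item, ratings in responses.items():
    # statistics.quantiles with n=4 returns the three quartile cut points (Q1, Q2, Q3)
    q1, median, q3 = statistics.quantiles(ratings, n=4)
    print(f"{item}: median {median:g}/5, IQR {q1:g}-{q3:g}")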

3b: Download figures (April 2018–September 2018)

The ImpRes tool and supplementary guide were made freely available on the King’s Improvement Science (KIS) website [14] in April 2018 and the Implementation Science Exchange website [17] in May 2018. Since then, monthly download figures have been collected for both the ImpRes tool and guide on the KIS website (download figures from the Implementation Science Exchange website are not available to report as they are not currently collected by the website owners).

Stage 4: development of the ImpRes supplementary guide (January 2017–April 2018)

During application of the ImpRes tool, and through informal feedback provided by researchers who had applied the tool, it became apparent that, in order to maximize the potential benefits, usability and scalability of the ImpRes tool, a detailed guide to supplement its use was required. The guide provides the rationale for the inclusion of the ImpRes tool domains, offers guidance regarding the application of the ImpRes tool and directs researchers to further literature and specialist resources.

Results

The process to develop and evaluate the ImpRes tool and supplementary guide is illustrated in Fig. 1. In what follows, we present the results corresponding to each stage of development and evaluation as described in the methods section and as depicted in Fig. 1. Results reporting the development of the ImpRes tool (stage 1) and guide (stage 4) correspond to the study aims of (1) identifying the core principles and concepts that research teams should consider when designing high-quality implementation research and (2) identifying and synthesizing key methodological/conceptual literature containing guidance and recommendations for designing and evaluating implementation research. Results reporting the application (stage 2) and evaluation of the ImpRes tool (stage 3) correspond to the aim of applying and evaluating a tool that guides researchers through the key principles and research design considerations of implementation science when designing studies.

Overview of the ImpRes tool

The ImpRes tool contains ten domains that experts agreed, based on current evidence, cover the core principles and methods of implementation science that researchers should consider when designing implementation research (see Fig. 2).

Fig. 2 Domains of the ImpRes tool

Stage 1: development of the ImpRes tool

1a: Expert consensus-building brainstorming sessions

The ImpRes tool is organized into ten domains; each domain and the rationale underpinning its inclusion is presented in Table 1.

Table 1 ImpRes domains and rationale underpinning inclusion

1b: Identification of key methodological/conceptual literature containing implementation research design guidance and recommendations

Concurrently with the consensus-building brainstorming sessions and informed by the emerging ImpRes domains (presented above), key methodological and conceptual articles and reports (both peer-reviewed and non-peer-reviewed) were identified in relation to each of the ten ImpRes domains. These are presented in Table 2.

Table 2 Key methodological/conceptual articles, reports and resources identified that influenced the content of the ImpRes tool and/or guide

1c: Additional specialist input into specific ImpRes domains

After consulting with experts in the fields of PPI and health economics, several refinements were made to the ImpRes tool. For example, following consultation with an expert in PPI in health services research, responses in this domain were tabulated, rather than asking researchers to describe planned involvement and engagement activities in free text, to ensure that researchers considered and described involvement and engagement opportunities at different stages of the research cycle (i.e. from identifying and prioritizing research topics to evaluating impact) and to distinguish between different levels of involvement (i.e. from consultation to collaboration). Furthermore, four additional questions were included in the 'Patient and Public Involvement' section of the ImpRes tool, including, but not limited to, whether the patients/service users that researchers intended to involve in their research have formal research training.

1d: Pilot testing and refinement

The tool did not undergo any significant refinements after piloting. However, in addition to the refinements based on specialist expert input (described above), a number of refinements were made to the ImpRes tool in response to feedback received from the CIS SAP at subsequent meetings (March 2016 onwards) and discussions amongst the CIS research team. For example, to improve the usability of the tool, a number of sections were tabulated (e.g. the implementation outcomes, service and patient outcomes, and economic evaluation domains).

Stage 2: application of the ImpRes tool

The ImpRes tool was applied to 15 implementation and improvement science projects, either prospectively or retrospectively. The ImpRes tool was fully completed for 14 projects and partially completed for one project. The ImpRes tool was applied independently by researchers (i.e. without expert guidance) to 14 projects; ImpRes was applied with the guidance of an implementation scientist (LH) to the remaining project. The tool developers are part of the research organization in which the tool was applied and evaluated. As this is a large research organization, the relationship between the tool developers and the researchers who applied and evaluated the ImpRes tool was variable; in the majority of cases, the application of the ImpRes tool was the first occasion on which the tool developers and the lead researcher met. The research projects covered a diverse range of clinical areas, including diabetes, mental health, dementia and maternity and women's health. Two hybrid type 1, five hybrid type 2, three hybrid type 3 and five implementation research projects, as defined by Curran et al. [22], were included. Additional details of each project can be found in Table 3.

Table 3 Application of the ImpRes tool

Stage 3: evaluation of the ImpRes tool

3a. Questionnaire

Thirteen out of 15 questionnaires were completed. Evaluative feedback from researchers on the structure, content and usefulness of the ImpRes tool is reported in Table 4. In all but one case (the statement 'The ImpRes tool is too long'), the median ratings for statements relating to the structure, content and usefulness of the ImpRes tool were above the scale mid-point (3), indicating that researchers who applied the ImpRes tool viewed its structure, content and usefulness favorably.

Table 4 Evaluation of the ImpRes tool

Free-text comments provided by researchers after completing the ImpRes tool

Five of the 13 researchers who applied the ImpRes tool (to 15 individual projects) provided free-text comments. Suggestions for improvement included expanding the glossary at the end of the ImpRes tool: 'Helpful to provide definitions of "implementation/project team" and "stakeholders" distinction not clear. Also, helpful to provide definitions of "research project" and "improvement project"' (Researcher 3). One researcher suggested that the layout of the ImpRes tool needed modification: 'I think the layout will need some tweaking to be a bit more user friendly. I felt that I needed help to clarify where which responses should go where' (Researcher 8). Whilst some researchers thought that the ImpRes tool was too long, one researcher commented that ImpRes was necessarily long: 'The length [of the ImpRes tool] and level of detail are necessary to capture the complexity of the issues explored' (Researcher 15). Furthermore, acknowledging that the ability to successfully conduct and complete an implementation project depends on the skills, expertise and experiences of the research team, one researcher suggested including an additional question: 'There is maybe a question missing about the skill set of the project teams' (Researcher 8).

3b: Download figures

Figure 3 displays the download figures for the ImpRes tool and guide over a 6-month period (April–September 2018). In total, the ImpRes tool and guide were downloaded 2687 times. The ImpRes guide was downloaded 1215 times in June 2018, representing a substantial increase in comparison to other months. We are aware that the ImpRes tool and guide were presented as part of a workshop at a large international conference and received social media attention (Twitter).

Fig. 3 Download figures for the ImpRes tool and guide

Stage 4: development of the ImpRes supplementary guide

The evaluation data collected on the ImpRes tool (described in stage 3: evaluation of the ImpRes tool) led to the development, and informed the content, of the ImpRes supplementary guide. The guide is intended to be used in conjunction with the ImpRes tool. The guide aims to facilitate the use of the ImpRes tool, highlight the importance of implementation science in optimizing the successful implementation of evidence-based interventions into clinical practice, define terminology commonly used in the implementation science literature and direct researchers to relevant literature and online resources that can be used to design implementation research. Throughout the guide, several features designed to support researchers are presented. These include a ‘jargon buster’ and ‘useful resources’ feature. Each feature, rationale for inclusion and feature examples are presented in Table 5. Together with the ImpRes tool, the ImpRes supplementary guide is free to download via the King’s Improvement Science website [14] and the Implementation Science Exchange website [17].

Table 5 ImpRes guide features, rationale for inclusion and feature examples

Discussion

We developed the ImpRes tool and supplementary guide to help researchers design high-quality implementation research. Development of the tool and guide was informed by a consensus-building brainstorming process, involving an international multidisciplinary expert panel, and by the identification of key methodological and conceptual literature containing design guidance and recommendations. The ImpRes tool contains ten core domains, representing core implementation science principles and concepts that should be reviewed and considered when designing implementation research. Whilst we believe that all components of ImpRes are worthy of consideration by research teams when designing implementation research, we recognize that not all sections will be applicable to, or feasible to explore in, every implementation study. Rather, each ImpRes domain should be considered strategically, in the context of the research aims and objectives, to determine its applicability, importance and feasibility. For example, implementation research as defined by Curran et al. [22] focuses on the adoption or uptake of a clinical intervention and typically assumes intervention effectiveness; gathering information on the clinical intervention's impact on relevant outcomes is therefore typically not undertaken, and ImpRes' 'Service and Patient Outcomes' domain would not be applicable to such research. To the best of our knowledge, the ImpRes tool and supplementary guide is the only currently available comprehensive research design instrument which synthesizes guidance and recommendations for designing high-quality implementation research.

We envisage that the ImpRes tool and supplementary guide will increase users' confidence and ability to design high-quality implementation research. The results of the present study support this vision; researchers applying the ImpRes tool found it useful for identifying areas where implementation science was lacking in research projects and for identifying how the methodology of projects could be improved. We anticipate that research teams who use the ImpRes tool will be more likely to consider and integrate core principles and concepts of implementation science when designing and evaluating the implementation of evidence-based health interventions. We hope that, in turn, this will contribute to more successful implementation and evaluation of evidence-based interventions.

Strengths and limitations

Strengths of our research include the application of the ImpRes tool to a number of implementation and improvement science research projects, by researchers with varying levels of implementation and improvement science expertise and across a diverse range of healthcare areas, including physical and mental health. Our initial evaluation of the ImpRes tool found that the tool is acceptable and feasible to apply. Researchers who applied the ImpRes tool viewed its structure, content and usefulness very favorably; all but one structure evaluation item ('the ImpRes tool is too long') had a median score ≥ 4/5 and all usefulness evaluation items had a median score ≥ 4/5. Our initial dissemination efforts (via the King's Improvement Science and Implementation Science Exchange websites) [14, 17] suggest that the ImpRes tool and guide fill an important capacity building gap; over a 6-month period, the ImpRes tool and guide have been downloaded over 2600 times.

These findings, however, must be interpreted with some caution. ImpRes was evaluated by researchers with varying levels of implementation and improvement science expertise, and it could be argued that those with limited implementation science expertise are not best placed to evaluate the content, structure and usefulness of the ImpRes tool. That said, the acceptability and adoption of the ImpRes tool and guide depend on the views of this large cohort of researchers, the primary intended end users, who have expertise in applied health research but lack specialist implementation science training. Although our evaluation data to date indicate that experienced applied health researchers, with varying levels of implementation expertise, believe that ImpRes is likely to lead to better designed implementation studies, more robust evidence is needed to assess whether the ImpRes tool and guide result in better designed implementation research and improved effectiveness of interventions. Whilst we plan to investigate the impact of using the ImpRes tool and guide on the quality of implementation research in the future, here we report the preliminary findings and our reflections on the ImpRes tool and guide.

Furthermore, we identified a number of barriers that are likely to affect adoption, implementation and sustainment of the ImpRes tool, including a lack of awareness amongst applied health researchers of the role of implementation science in improving the implementation of evidence-based interventions. Moreover, ImpRes was applied and evaluated within a single research organization: NIHR CLAHRC South London. Future evaluation studies should explore the generalizability and scalability, as well as the barriers and enablers, of the ImpRes tool and guide in other research settings (e.g. outside of large, multi-million-pound funded research organizations such as NIHR CLAHRCs, and in research cutting across health and social care settings). The researchers who developed the ImpRes tool also developed and distributed the questionnaire that was used for the preliminary evaluation and analyzed the questionnaire data returned. Although a point was made of welcoming participants' comments and critiques to help us modify/refine the content of the ImpRes tool, we are aware that a social desirability bias may be present in the evaluation data.

Reflections on implementing the ImpRes tool

Barriers to implementing the ImpRes tool

Applying the ImpRes tool retrospectively (i.e. to research projects that had already been completed or were underway) proved to be an unmotivating task for researchers, which is likely to explain the reluctance and lack of engagement of a number of researchers/research themes who were approached but declined to apply the ImpRes tool to their projects. This is not unexpected considering that research projects already completed would not directly benefit from applying the ImpRes tool, i.e. it was not possible to amend and improve the design of implementation or improvement science research when the research was already underway or had been completed. Like many applied health researchers, several researchers lacked awareness of/familiarity with implementation science. This is likely to have resulted in some reluctance to commit the time needed to complete the ImpRes tool. Whilst we believe that the content of the ImpRes tool and guide is appropriate across all types of implementation research, the presentation of information may need to be revised, perhaps shortened, to better suit practitioner-researchers based outside of an academic research organization. Without education to establish awareness of the importance of implementation science in implementing evidence-based interventions, this barrier is likely to prevent the adoption, implementation and sustainability of the ImpRes tool. Furthermore, the ImpRes tool was perceived by some as a research-heavy tool targeted more at large-scale academic research than at more applied, small-scale, pragmatic projects (e.g. improvement/spread projects). Such perceptions were expressed predominantly by those who applied the ImpRes tool to projects being conducted outside of a university setting (i.e. the Health Innovation Network) rather than by those who applied the tool to projects being conducted within a university setting (i.e. King's College London). Again, this is likely to have resulted in some reluctance to commit the time needed to complete the ImpRes tool.

Facilitators to implementing the ImpRes tool

Prospective application of the ImpRes tool to projects that were in the design phase, and amenable to design change, and to grant applications that were being drafted for submission, perhaps unsurprisingly proved to be a factor motivating its application; the benefits of applying ImpRes were clear and immediate. Researchers who were actively encouraged by principal investigators (PIs) to apply the ImpRes tool were far more likely and motivated to do so than researchers who were approached on an individual basis and who had not received direct encouragement from PIs. Furthermore, researchers who were aware of and interested in implementation science and implementation research, yet felt they lacked the knowledge and skills to design implementation research, welcomed the ImpRes tool, together with the support and facilitation of the lead researcher (LH), to structure the process of designing implementation research.

Planned ImpRes research and development

The implementation science literature is rapidly advancing. To ensure the ImpRes tool and guide include the most up-to-date guidance for designing and evaluating implementation research, we plan to review and update the tool and guide annually; the next review will be in April 2019. We are currently designing a formal international expert Delphi study, drawing upon a wider international expert panel, to formally content validate and refine the ImpRes tool and supplementary guide. Alongside this, we are currently developing quantitative scoring criteria for each of the ImpRes domains. The scoring criteria will provide a systematic and transparent rating system that will allow us to empirically determine whether applying the ImpRes tool and guide improves the quality of implementation research. We hope the scoring criteria will also be of benefit to multiple implementation research stakeholders (e.g. researchers, funders and decision-makers). For example, we envisage the scoring system being of use to funders and decision-makers wishing to evaluate the quality of implementation research proposals, allowing differentiation between lower- and higher-quality proposals. Additionally, we hope that the scoring criteria will support researchers to improve the quality of their implementation research proposals by identifying project areas that require improvement. Furthermore, researchers who have implementation science expertise may also benefit from using the ImpRes tool and guide. The initial intention of the ImpRes tool and guide was to support researchers who have no or limited expertise in implementation science research; however, we acknowledge, and plan to evaluate, the potential benefits of applying the ImpRes tool and guide for applied health researchers who have implementation science expertise.

Conclusion

We have developed a new and promising educational tool and supplementary guide to help overcome the many challenges that applied health researchers face when attempting to design implementation research. We believe that adopting the ImpRes tool and guide will improve the quality of implementation research, in turn advancing the field and leading to optimized implementation of evidence-based interventions and ultimately improved service and patient outcomes.

Abbreviations

CIS SAP: Centre for Implementation Science Scientific Advisory Panel
CIS: Centre for Implementation Science
CLAHRC: Collaboration for Leadership in Applied Health Research and Care
ImpRes tool: Implementation Science Research Development tool
IQR: Interquartile range
NIHR: National Institute for Health Research
PPI: Patient and public involvement

References

  1. Grimshaw JM, Eccles MP, Lavis JN, Hill SJ, Squires JE. Knowledge translation of research findings. Implement Sci. 2012;7:50.

  2. Morris ZS, Wooding S, Grant J. The answer is 17 years, what is the question: understanding time lags in translational research. J R Soc Med. 2011;104(12):510–20.

  3. Eccles MP, Armstrong D, Baker R, Cleary K, Davies H, Davies S, et al. An implementation research agenda. Implement Sci. 2009;4:18.

  4. Bero LA, Grilli R, Grimshaw JM, Harvey E, Oxman AD, Thomson MA. Closing the gap between research and practice: an overview of systematic reviews of interventions to promote the implementation of research findings. BMJ. 1998;317(7156):465–8.

  5. Crable EL, Biancarelli D, Walkey AJ, Allen CG, Proctor EK, Drainoni ML. Standardizing an approach to the evaluation of implementation science proposals. Implement Sci. 2018;13(1):71.

  6. Proctor EK, Powell BJ, Baumann AA, Hamilton AM, Santens RL. Writing implementation research grant proposals: ten key ingredients. Implement Sci. 2012;7:96.

  7. Straus SE, Sales A, Wensing M, Michie S, Kent B, Foy R. Education and training for implementation science: our interest in manuscripts describing education and training materials. Implement Sci. 2015;10:136.

  8. Waltz TJ, Powell BJ, Chinman MJ, Smith JL, Matthieu MM, Proctor EK, et al. Expert recommendations for implementing change (ERIC): protocol for a mixed methods study. Implement Sci. 2014;9:39.

  9. Birken SA, Powell BJ, Shea CM, Haines ER, Alexis Kirk M, Leeman J, et al. Criteria for selecting implementation science theories and frameworks: results from an international survey. Implement Sci. 2017;12(1):124.

  10. The National Institute for Health Research (NIHR) Collaboration for Leadership in Applied Health Research and Care (CLAHRC) South London. http://www.clahrc-southlondon.nihr.ac.uk/diabetes/research-projects/helping-people-type-1-diabetes-avoid-severe-hypoglycaemia#section2. Accessed 20 Mar 2019.

  11. The National Institute for Health Research (NIHR) Collaboration for Leadership in Applied Health Research and Care (CLAHRC) South London. http://www.clahrc-southlondon.nihr.ac.uk/centre-implementation-science/our-research/emilia-project-assessing-who-mental-global-action-program. Accessed 20 Mar 2019.

  12. Khadjesari Z, Vitoratou S, Sevdalis N, Hull L. Implementation outcome assessment instruments used in physical healthcare settings and their measurement properties: a systematic review protocol. BMJ Open. 2017;7(10):e017972.

  13. The National Institute for Health Research (NIHR) Collaboration for Leadership in Applied Health Research and Care (CLAHRC) South London. http://www.clahrc-southlondon.nihr.ac.uk/. Accessed 3 Nov 2018.

  14. King’s Improvement Science. http://www.kingsimprovementscience.org/ImpRes. Accessed 3 Nov 2018.

  15. Hoomans T, Severens JL. Economic evaluation of implementation strategies in health care. Implement Sci. 2014;9:168.

  16. Gray-Burrows KA, Willis TA, Foy R, Rathfelder M, Bland P, Chin A, et al. Role of patient and public involvement in implementation research: a consensus study. BMJ Qual Saf. 2018;27(10):858–64.

  17. Implementation Science Exchange. https://impsci.tracs.unc.edu. Accessed 29 Oct 2018.

  18. Glasgow RE, Lichtenstein E, Marcus AC. Why don’t we see more translation of health promotion research to practice? Rethinking the efficacy-to-effectiveness transition. Am J Public Health. 2003;93(8):1261–7.

  19. Tunis SR, Stryer DB, Clancy CM. Practical clinical trials: increasing the value of clinical research for decision making in clinical and health policy. JAMA. 2003;290(12):1624–32.

  20. Wells KB. Treatment research at the crossroads: the scientific interface of clinical trials and effectiveness research. Am J Psychiatry. 1999;156(1):5–10.

  21. Proctor EK, Landsverk J, Aarons G, Chambers D, Glisson C, Mittman B. Implementation research in mental health services: an emerging science with conceptual, methodological, and training challenges. Admin Pol Ment Health. 2009;36(1):24–34.

  22. Curran GM, Bauer M, Mittman B, Pyne JM, Stetler C. Effectiveness-implementation hybrid designs: combining elements of clinical effectiveness and implementation research to enhance public health impact. Med Care. 2012;50(3):217–26.

  23. Foy R, Ovretveit J, Shekelle PG, Pronovost PJ, Taylor SL, Dy S, et al. The role of theory in research to develop and evaluate the implementation of patient safety practices. BMJ Qual Saf. 2011;20(5):453–9.

  24. Cane J, O’Connor D, Michie S. Validation of the theoretical domains framework for use in behaviour change and implementation research. Implement Sci. 2012;7:37.

  25. Glasgow RE, Vogt TM, Boles SM. Evaluating the public health impact of health promotion interventions: the RE-AIM framework. Am J Public Health. 1999;89(9):1322–7.

  26. Damschroder LJ, Aron DC, Keith RE, Kirsh SR, Alexander JA, Lowery JC. Fostering implementation of health services research findings into practice: a consolidated framework for advancing implementation science. Implement Sci. 2009;4:50.

  27. Skolarus TA, Lehmann T, Tabak RG, Harris J, Lecy J, Sales AE. Assessing citation networks for dissemination and implementation research frameworks. Implement Sci. 2017;12(1):97.

  28. Pfadenhauer LM, Gerhardus A, Mozygemba K, Lysdahl KB, Booth A, Hofmann B, et al. Making sense of complexity in context and implementation: the context and implementation of complex interventions (CICI) framework. Implement Sci. 2017;12(1):21.

  29. Davidoff F. Understanding contexts: how explanatory theories can help. Implement Sci. 2019;14(1):23.

  30. May CR, Johnson M, Finch T. Implementation, context and complexity. Implement Sci. 2016;11(1):141.

  31. Proctor EK, Powell BJ, McMillen JC. Implementation strategies: recommendations for specifying and reporting. Implement Sci. 2013;8:139.

  32. Powell BJ, Waltz TJ, Chinman MJ, Damschroder LJ, Smith JL, Matthieu MM, et al. A refined compilation of implementation strategies: results from the expert recommendations for implementing change (ERIC) project. Implement Sci. 2015;10:21.

  33. Waltz TJ, Powell BJ, Matthieu MM, Damschroder LJ, Chinman MJ, Smith JL, et al. Use of concept mapping to characterize relationships among implementation strategies and assess their feasibility and importance: results from the expert recommendations for implementing change (ERIC) study. Implement Sci. 2015;10:109.

  34. Proctor E, Silmere H, Raghavan R, Hovmand P, Aarons G, Bunger A, et al. Outcomes for implementation research: conceptual distinctions, measurement challenges, and research agenda. Admin Pol Ment Health. 2011;38(2):65–76.

  35. Thompson C, Pulleyblank R, Parrott S, Essex H. The cost-effectiveness of quality improvement projects: a conceptual framework, checklist and online tool for considering the costs and consequences of implementation-based quality improvement. J Eval Clin Pract. 2016;22(1):26–30.

  36. Drummond M. Methods for the Economic Evaluation of Health Care Programmes. 3rd ed. New York: Oxford University Press Inc.; 2005.

  37. National Institute for Health Research. NIHR Annual Report 2015/2016. https://www.nihr.ac.uk/about-us/documents/NIHR-Annual-Report-2015-16.pdf. Accessed 2 Nov 2018.

  38. Rycroft-Malone J, Wilkinson J, Burton CR, Harvey G, McCormack B, Graham I, et al. Collaborative action around implementation in collaborations for leadership in applied Health Research and care: towards a programme theory. J Health Serv Res Policy. 2013;18(3 Suppl):13–26.

  39. Burton C, Rycroft-Malone J. An untapped resource: patient and public involvement in implementation comment on “knowledge mobilization in healthcare organizations: a view from the resource-based view of the firm”. Int J Health Policy Manag. 2015;4(12):845–7.

  40. Brown CH, Curran G, Palinkas LA, Aarons GA, Wells KB, Jones L, et al. An overview of research and evaluation designs for dissemination and implementation. Annu Rev Public Health. 2017;38:1–22.

  41. National Implementation Research Network. Implementation Stages. http://nirn.fpg.unc.edu/learnimplementation/implementation-stages. Accessed 3 Nov 2018.

  42. Eccles M, Grimshaw J, Walker A, Johnston M, Pitts N. Changing the behavior of healthcare professionals: the use of theory in promoting the uptake of research findings. J Clin Epidemiol. 2005;58(2):107–12.

  43. Nilsen P. Making sense of implementation theories, models and frameworks. Implement Sci. 2015;10:53.

  44. Tabak RG, Khoong EC, Chambers DA, Brownson RC. Bridging research and practice: models for dissemination and implementation research. Am J Prev Med. 2012;43(3):337–50.

  45. The Consolidated Framework for Implementation Research. https://cfirguide.org/. Accessed 29 Oct 2018.

  46. Reach Effectiveness Adoption Implementation Maintenance (RE-AIM) framework. http://re-aim.org. Accessed 29 Oct 2018.

  47. Normalisation Process Theory. http://www.normalizationprocess.org/. Accessed 12 May 2019.

  48. Chaudoir SR, Dugan AG, Barr CH. Measuring factors affecting implementation of health innovations: a systematic review of structural, organizational, provider, patient, and innovation level measures. Implement Sci. 2013;8:22.

  49. Craig P, Dieppe P, Macintyre S, Michie S, Nazareth I, Petticrew M. Developing and evaluating complex interventions: the new Medical Research Council guidance. BMJ (Clinical research ed). 2008;337:a1655.

  50. Flottorp SA, Oxman AD, Krause J, Musila NR, Wensing M, Godycki-Cwirko M, et al. A checklist for identifying determinants of practice: a systematic review and synthesis of frameworks and taxonomies of factors that prevent or enable improvements in healthcare professional practice. Implement Sci. 2013;8:35.

  51. Health Foundation. Webinar: Quality improvement - the role of context and how to manage it. https://www.health.org.uk/webinar-quality-improvement-rolecontext-and-how-manage-it. Accessed 2 Nov 2018.

  52. Bates P, Roberts G, Fulop N, Øvretveit J, Dixon-Woods M. Perspectives on context: A selection of essays considering the role of context in successful quality improvement. https://www.health.org.uk/sites/health/files/PerspectivesOnContext_fullversion.pdf. Accessed 2 Nov 2018.

  53. Powell BJ, Beidas RS, Lewis CC, Aarons GA, McMillen JC, Proctor EK, et al. Methods to improve the selection and tailoring of implementation strategies. J Behav Health Serv Res. 2017;44(2):177–94.

  54. Clinton-McHarg T, Yoong SL, Tzelepis F, Regan T, Fielding A, Skelton E, et al. Psychometric properties of implementation measures for public health and community settings and mapping of constructs against the consolidated framework for implementation research: a systematic review. Implement Sci. 2016;11(1):148.

  55. Lewis CC, Fischer S, Weiner BJ, Stanick C, Kim M, Martinez RG. Outcomes for implementation science: an enhanced systematic review of instruments using evidence-based rating criteria. Implement Sci. 2015;10:155.

  56. The Society for Implementation Research Collaboration (SIRC). https://societyforimplementationresearchcollaboration.org/. Accessed 2 Nov 2018.

  57. Grid-Enabled Measures (GEM) Database. https://www.Public/Public/Home.aspx. Accessed 2 Nov 2018.

  58. National Institutes of Health, National Cancer Institute. Webinar: Advanced Topics for Implementation Science Research: Measure Development and Evaluation. https://www.youtube.com/watch?v=dGXVhRQXiz4. Accessed 2 Nov 2018.

  59. Mason J, Freemantle N, Nazareth I, Eccles M, Haines A, Drummond M. When is it cost-effective to change the behavior of health professionals? JAMA. 2001;286(23):2988–92.

  60. Wensing M. Finding common ground between health economics and implementation science. https://0-blogs-biomedcentral-com.brum.beds.ac.uk/on-health/2014/12/18/theory-and-practice-finding-common-ground-between-healtheconomics-and-implementation-science/. Accessed 3 Nov 2018.

  61. Metz A, Boaz A. Where are the stakeholders in implementation science? http://nirn.fpg.unc.edu/where-are-stakeholders-implementationscience. Accessed 2 Nov 2018.

  62. Callard F, Rose D, Wykes T. Close to the bench as well as at the bedside: involving service users in all phases of translational research. Health Expect. 2012;15(4):389–400.

  63. Ocloo J, Matthews R. From tokenism to empowerment: progressing patient and public involvement in healthcare improvement. BMJ Qual Saf. 2016; 25(8):626–32.

  64. National Institute for Health Research (NIHR). Going the extra mile: Improving the nation’s health and wellbeing through public involvement in research. https://www.nihr.ac.uk/patients-andpublic/documents/Going-the-Extra-Mile.pdf. Accessed 2 Nov 2018.

  65. National Institute for Health Research (NIHR) INVOLVE. http://www.invo.org.uk/. Accessed 2 Nov 2018.

  66. Hayes H, Buckland S, Tarpey M. Briefing notes for researchers: public involvement in NHS, public health and social care research. http://www.invo.org.uk/resource-centre/resource-for-researchers/. Accessed 3 Nov 2018.

  67. National Institute for Health Research (NIHR) INVOLVE. Jargon Buster. http://www.invo.org.uk/resource-centre/jargon-buster/. Accessed 3 Nov 2018.

  68. Merton R. The unanticipated consequences of purposive social action. Am Sociol Rev. 1936;1:894–904.

  69. The Office of the National Coordinator for Health Information Technology (ONC). Module I: Introduction to Unintended Consequences. https://www.healthit.gov/unintended-consequences/content/module-i-introduction-unintended-consequences.html. Accessed 03 November 2018.

  70. McKibbon KA, Lokker C, Wilczynski NL, Ciliska D, Dobbins M, Davis DA, et al. A cross-sectional study of the number and frequency of terms used to refer to knowledge translation in a body of health literature in 2006: a tower of babel? Implement Sci. 2010;5:16.

  71. Peters DH, Adam T, Alonge O, Agyepong IA, Tran N. Implementation research: what it is and how to do it. BMJ (Clinical research ed). 2013;347:f6753.

  72. The Consolidated Framework for Implementation Research Website. Implementation Technique Selection tool. https://cfirguide.org/choosingstrategies/. Accessed 11 May 2019.

Acknowledgements

The authors would like to thank the researchers across the NIHR CLAHRC South London and partner organizations who applied and evaluated the ImpRes tool.

Funding

IB, AH, LH and NS’ research is part funded by the National Institute for Health Research (NIHR) via the ‘Collaboration for Leadership in Applied Health Research and Care South London’ (CLAHRC South London) at King’s College Hospital National Health Service (NHS) Foundation Trust, London, UK. LG, LH and ZK are funded by King’s Improvement Science, which is part of the NIHR CLAHRC South London and comprises a specialist team of improvement scientists and senior researchers based at King’s College London. King’s Improvement Science is funded by King’s Health Partners (Guy’s and St Thomas’ NHS Foundation Trust, King’s College Hospital NHS Foundation Trust, King’s College London and South London and Maudsley NHS Foundation Trust), Guy’s and St Thomas’ Charity, the Maudsley Charity and the Health Foundation. NS is also funded by the South London and Maudsley NHS Foundation Trust. The views expressed are those of the authors and not necessarily those of NHS, NIHR or the Department of Health. IB is also supported by the NIHR Biomedical Research Centre at South London and Maudsley NHS Foundation Trust.

The above funding bodies were not involved in the design of the study and collection, analysis and interpretation of data or in writing the manuscript.

Availability of data and materials

The datasets used and/or analyzed, relating to the evaluation questionnaire only, during the current study are available from the corresponding author on reasonable request. Completed ImpRes tools are not available.

Author information

Contributions

All authors made substantial contributions to the conception and design of the study. LH and ZK made substantial contributions to the acquisition of data. LH analysed and interpreted the data. LH drafted the manuscript and all authors have been involved in revising it critically for important intellectual content. All authors have given final approval of the version to be published. Each author has participated sufficiently in the work to take public responsibility for appropriate portions of the content and has agreed to be accountable for all aspects of the work.

Corresponding author

Correspondence to Louise Hull.

Ethics declarations

Ethics approval and consent to participate

This study was performed as an education evaluation and was therefore exempt from ethical committee review. Individual-level or organizational-level written consent (by email) was provided for projects to be described and evaluation data to be incorporated within this manuscript.

Consent for publication

Not applicable; no identifying information on any individual’s data is presented in this manuscript.

Competing interests

NS is the Director of London Safety and Training Solutions Ltd., which provides quality and safety training and advisory services on a consultancy basis to healthcare organizations globally. NS is an Associate Editor of Implementation Science. All other authors declare that they have no competing interests.

Publisher’s Note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

Rights and permissions

Open Access This article is distributed under the terms of the Creative Commons Attribution 4.0 International License (http://creativecommons.org/licenses/by/4.0/), which permits unrestricted use, distribution, and reproduction in any medium, provided you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons license, and indicate if changes were made. The Creative Commons Public Domain Dedication waiver (http://creativecommons.org/publicdomain/zero/1.0/) applies to the data made available in this article, unless otherwise stated.

About this article

Cite this article

Hull, L., Goulding, L., Khadjesari, Z. et al. Designing high-quality implementation research: development, application, feasibility and preliminary evaluation of the implementation science research development (ImpRes) tool and guide. Implementation Sci 14, 80 (2019). https://0-doi-org.brum.beds.ac.uk/10.1186/s13012-019-0897-z
