
PERSPECTIVE article

Front. Public Health, 22 January 2019
Sec. Public Health Education and Promotion
This article is part of the Research Topic Methods and Applications in Implementation Science.

Enhancing the Impact of Implementation Strategies in Healthcare: A Research Agenda

Byron J. Powell1,2,3*, Maria E. Fernandez4, Nathaniel J. Williams5, Gregory A. Aarons6, Rinad S. Beidas7,8,9, Cara C. Lewis10, Sheena M. McHugh11 and Bryan J. Weiner12
  • 1Department of Health Policy and Management, Gillings School of Global Public Health, University of North Carolina at Chapel Hill, Chapel Hill, NC, United States
  • 2Cecil G. Sheps Center for Health Services Research, University of North Carolina at Chapel Hill, Chapel Hill, NC, United States
  • 3Frank Porter Graham Child Development Institute, University of North Carolina at Chapel Hill, Chapel Hill, NC, United States
  • 4Center for Health Promotion and Prevention Research, School of Public Health, University of Texas Health Science Center at Houston, Houston, TX, United States
  • 5School of Social Work, Boise State University, Boise, ID, United States
  • 6Department of Psychiatry, University of California, San Diego, La Jolla, CA, United States
  • 7Department of Psychiatry, Center for Mental Health, Perelman School of Medicine, University of Pennsylvania, Philadelphia, PA, United States
  • 8Department of Medical Ethics and Health Policy, Perelman School of Medicine, University of Pennsylvania, Philadelphia, PA, United States
  • 9Leonard Davis Institute of Health Economics, University of Pennsylvania, Philadelphia, PA, United States
  • 10MacColl Center for Healthcare Innovation, Kaiser Permanente Washington Health Research Institute, Seattle, WA, United States
  • 11School of Public Health, University College Cork, Cork, Ireland
  • 12Department of Global Health, Department of Health Services, University of Washington, Seattle, WA, United States

The field of implementation science was developed to better understand the factors that facilitate or impede implementation and generate evidence for implementation strategies. In this article, we briefly review progress in implementation science and suggest five priorities for enhancing the impact of implementation strategies. Specifically, we suggest the need to: (1) enhance methods for designing and tailoring implementation strategies; (2) specify and test mechanisms of change; (3) conduct more effectiveness research on discrete, multifaceted, and tailored implementation strategies; (4) increase economic evaluations of implementation strategies; and (5) improve the tracking and reporting of implementation strategies. We believe that pursuing these priorities will advance implementation science by helping us to understand when, where, why, and how implementation strategies improve implementation effectiveness and subsequent health outcomes.

Introduction

Nearly 20 years ago, Grol and Grimshaw (1) asserted that evidence-based practice must be complemented by evidence-based implementation. The past two decades have been marked by significant progress, as the field of implementation science has worked to develop a better understanding of implementation barriers and facilitators (i.e., determinants) and generate evidence for implementation strategies (2). In this article, we briefly review progress in implementation science and suggest five priorities for enhancing the impact of implementation strategies. We draw primarily upon the healthcare, behavioral health, and social services literature. While we hope the proposed priorities are applicable to studies conducted in a wide range of contexts, we welcome discussion regarding potential applications and enhancements for contexts outside of healthcare, such as community and public health settings (3) that often involve different types of stakeholders, interventions, and implementation strategies.

Implementation strategies are methods or techniques used to improve adoption, implementation, sustainment, and scale-up of interventions (4, 5). These strategies vary in complexity, from discrete or single component strategies (6, 7) such as computerized reminders (8) or audit and feedback (9) to multifaceted implementation strategies that combine two or more discrete strategies, some of which have been branded and tested using rigorous designs [e.g., (10, 11)]. Implementation strategies can target a range of stakeholders (12) and multilevel contextual factors across different phases of implementation (13–16). For example, strategies can address patient (17), provider (18), organizational (19), community (20, 21), policy and financing (22), or multilevel (23) factors.

Several taxonomies describe and organize the types of strategies available (6, 7, 24–26). Similarly, taxonomies of behavior change techniques (27) and methods (28) describe components of strategies at a more granular level. Both types of taxonomies promote a common language, inform implementation strategy development and evaluation by facilitating consideration of various “building blocks” or components of multifaceted and multilevel strategies, and improve the quality of reporting in research and practice.

The evidence base for implementation strategies is steadily developing. Initially, single-component, narrowly focused strategies that were effective in earlier studies were selected in subsequent studies despite differences between the clinical problems and contexts in which they were deployed (29). That approach was based on the assumption that strategies would be effective independent of the implementation problems being addressed (29). This “magic bullet” approach has led to limited success (30), prompting recognition that strategies should be selected or developed based upon a thorough understanding of context, including the causes of quality and implementation gaps, an assessment of implementation determinants, and an understanding of the mechanisms and processes needed to address them (29).

Evidence syntheses for discrete, multifaceted, and tailored implementation strategies have been conducted. The Cochrane Collaboration's Effective Practice and Organization of Care (EPOC) group has been a leader in this regard, with 132 systematic reviews of strategies such as educational meetings (31), audit and feedback (9), printed educational materials (32), and local opinion leaders (33). Grimshaw et al. (34) note that while median absolute effect sizes across implementation strategies are similar (see Table 1), the variation in observed effects within each strategy category suggests that effects may vary based upon whether or not they address determinants (barriers and facilitators). Indeed, determinants at multiple levels and phases may signal the need for multifaceted and tailored strategies that address key determinants (13).

Table 1. Evidence for common implementation strategies targeting professional behavior change.

While the use of multifaceted and tailored implementation strategies is intuitive and has considerable face validity (29), the evidence regarding their superiority to single-component strategies has been mixed (37, 39, 40). A review of 25 systematic reviews (39) found “no compelling evidence that multifaceted interventions are more effective than single-component interventions” (p. 20). Grimshaw et al. (34) provide one possible explanation, emphasizing that the general lack of an a priori rationale for the selection of components (i.e., discrete strategies) in multifaceted implementation strategies makes it difficult to determine how these decisions were made. They may have been selected thoughtfully to address prospectively identified determinants through theoretically- or empirically-derived change mechanisms, or they may simply be the manifestation of a “kitchen sink” approach. Wensing et al. (41) offer a complementary perspective, noting that definitions of discrete and multifaceted strategies are problematic. A discrete strategy such as outreach visits may include instruction, motivation, planning of improvement, and technical assistance; thus, it may not be accurate to characterize it as a single-component strategy. Conversely, a multifaceted strategy including educational workshops, educational materials, and webinars may only address provider knowledge and fail to address other important implementation barriers. They propose that multifaceted strategies that truly target multiple relevant implementation determinants could be more effective than single-component strategies (41).

A systematic review of 32 studies testing strategies tailored to address determinants concluded that tailored approaches to implementation were more effective than no strategy or a strategy not tailored to determinants; however, the methods used to identify and prioritize determinants and select implementation strategies were often not well-described, and no specific method has been proven superior (37). The lack of systematic methods to guide this process is problematic, as evidenced by a review of 20 studies that found that implementation strategies were often poorly conceived, with mismatches between strategies and determinants (e.g., barriers were identified at the team or organizational level, but strategies were not focused on structures and processes at those levels) (42). A multi-national program of research was undertaken to improve the methods of tailoring implementation strategies (43), but tailored strategies had little impact on primary and secondary outcomes (40). Questions remain about the best methods to develop tailored implementation strategies.

Five priorities need to be addressed to increase the public health impact of implementation strategies: (1) enhance methods for designing and tailoring; (2) specify and test mechanisms of change; (3) conduct more effectiveness research on discrete, multifaceted, and tailored strategies; (4) increase economic evaluations; and (5) improve tracking and reporting. Table 2 provides examples of studies that have pursued each priority with rigor.

Table 2. Five priorities for research on implementation strategies.

Enhance Methods for Designing and Tailoring Implementation Strategies

Implementation strategies are too often designed in an unsystematic manner and fail to address key contextual determinants (13–16). Stakeholders may rely upon inertia (i.e., “we've always done things this way”), adopt one-size-fits-all approaches, or follow what Martin Eccles has called the ISLAGIATT principle (i.e., “it seemed like a good idea at the time”) (53). Consequently, strategies are not always well-matched to the contexts in which they are deployed, including the interventions to be implemented, settings, stakeholder preferences, and implementation determinants (37, 42, 54). More rational, systematic approaches to identifying and prioritizing barriers and linking strategies to overcome them are needed (37, 42, 55–57). A number of methods have been suggested. Colquhoun and colleagues (56) found 15 articles with replicable methods for designing strategies to change healthcare professionals' behavior, and Powell et al. (55) proposed Intervention Mapping (58), concept mapping (59), conjoint analysis (60), and system dynamics modeling (61) as methods to aid the design, selection, and tailoring of strategies. These methods share common steps (identification of barriers, linking barriers to strategy component selection, use of theory, and user engagement) and have the potential to make the process of designing and tailoring implementation strategies more rigorous (55, 56). For example, Intervention Mapping is a step-by-step approach that begins with a detailed, participatory needs assessment and then identifies implementers, implementation behaviors, and their determinants, ultimately selecting behavior change methods and implementation strategies that influence those determinants.

Some work has been done to compare different methods for assessing determinants (62); however, several questions remain. How can determinants be accurately and efficiently assessed (ideally leveraging implementation frameworks)? Can perceived and actual determinants be differentiated? What are the best methods for prioritizing determinants that need to be proactively addressed? When should determinant assessment take place, given that new challenges are likely to emerge during the course of implementation? Who should be involved in this process? Each of these questions has resource implications. Similarly, questions remain about efficiently linking prioritized determinants to effective and pragmatic implementation strategies. How can causal theory be leveraged or developed to guide the selection of implementation strategies? Can pragmatic tools be developed to systematically link strategies to determinants? Approaches to designing and tailoring implementation strategies should be tested to determine whether they improve implementation and clinical outcomes (55, 56). Given that clinical problems, clinical and public health interventions, settings, individuals, and contextual factors are highly heterogeneous, there is much to gain from developing generalizable processes for designing and tailoring strategies.
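To make the linking step concrete, here is a minimal sketch, assuming a hand-built lookup table from prioritized determinants to candidate discrete strategies; the determinant labels and strategy names are hypothetical illustrations, not a validated matching tool.

```python
# Hypothetical sketch of determinant-to-strategy linking; labels and the
# mapping itself are illustrative assumptions, not a validated instrument.
from dataclasses import dataclass


@dataclass
class Determinant:
    label: str     # e.g., a barrier surfaced in a needs assessment
    priority: int  # 1 = highest priority to address

# Illustrative many-to-many mapping (loosely inspired by taxonomy entries).
CANDIDATE_STRATEGIES = {
    "low provider knowledge": ["educational meetings", "educational materials"],
    "weak leadership support": ["leadership coaching", "organizational climate assessment"],
    "no performance data": ["audit and feedback", "clinical reminders"],
}


def propose_strategies(determinants: list[Determinant]) -> list[tuple[str, list[str]]]:
    """Return candidate strategies for each determinant, highest priority first."""
    ranked = sorted(determinants, key=lambda d: d.priority)
    return [(d.label, CANDIDATE_STRATEGIES.get(d.label, ["<no match: needs design work>"]))
            for d in ranked]


if __name__ == "__main__":
    needs = [Determinant("no performance data", 2),
             Determinant("low provider knowledge", 1)]
    for label, options in propose_strategies(needs):
        print(f"{label}: {options}")
```

Even a toy structure like this makes the design rationale auditable: every selected strategy can be traced back to a prioritized determinant rather than to the ISLAGIATT principle.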

Specify and Test Mechanisms of Change

Studies of implementation strategies should increasingly focus on establishing the processes and mechanisms by which strategies exert their effects rather than simply establishing whether or not they were effective (29, 63, 64). The National Institutes of Health (64) provides this guidance:

Wherever possible, studies of dissemination or implementation strategies should build knowledge both on the overall effectiveness of the strategies, as well as “how and why” they work. Data on mechanisms of action, moderators, and mediators of dissemination and implementation strategies will greatly aid decision-making on which strategies work for which interventions, in which settings, and for which populations.

Unfortunately, mechanisms are rarely even mentioned, much less tested (63, 65, 66). Williams (63) emphasizes the need for trials that test a wider range of multilevel mediators of implementation strategies, stronger theoretical links between strategies and hypothesized mediators, improved design and analysis of multilevel mediation models in randomized trials, and an increasing focus on identifying the implementation strategies and behavior change techniques that contribute most to improvement. Developing a more nuanced understanding of mechanisms will require researchers to thoroughly assess the context of implementation and describe the causal pathways by which strategies exert their effects, moving beyond a broad identification of determinants to articulate mediators, moderators, preconditions, and proximal and distal outcomes (67). Examples of this type of approach and guidance for their development can be found in Lewis et al. (67), Weiner et al. (23), Bartholomew et al. (58), and Highfield et al. (44). Additionally, drawing more heavily upon theory (66, 68, 69), using research designs that maximize the ability to make causal inferences (70, 71), leveraging methods that capture and reflect the complexity of implementation such as systems science (61, 72, 73) and mixed methods (74–76) approaches, and adhering to methods standards for studies of complex interventions (77) will help to sharpen our understanding of how implementation strategies engage hypothesized mechanisms. Work to link implementation strategies and behavior change techniques to hypothesized mechanisms is underway (67, 78), which promises to improve our understanding of how, when, where, and why implementation strategies are effective.
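As a minimal illustration of what testing a mechanism can involve, the sketch below estimates a single-mediator, product-of-coefficients (a × b) indirect effect on simulated data; the variable names, effect sizes, and single-level model are assumptions for illustration, and real trials would require multilevel models and formal inference (e.g., bootstrapped confidence intervals).

```python
# Minimal single-mediator sketch: strategy -> mediator -> outcome.
# All data are simulated; names and effects are invented for illustration.
import numpy as np

rng = np.random.default_rng(0)
n = 500
strategy = rng.integers(0, 2, n)                  # randomized strategy (0/1)
mediator = 0.5 * strategy + rng.normal(size=n)    # e.g., implementation climate
outcome = 0.4 * mediator + 0.1 * strategy + rng.normal(size=n)  # e.g., fidelity


def ols(y, X):
    """Least-squares coefficients with an intercept column prepended."""
    X = np.column_stack([np.ones(len(y)), X])
    return np.linalg.lstsq(X, y, rcond=None)[0]

a = ols(mediator, strategy)[1]                    # strategy -> mediator path
coefs = ols(outcome, np.column_stack([mediator, strategy]))
b, c_prime = coefs[1], coefs[2]                   # mediator path, direct effect

print(f"indirect effect (a*b) = {a * b:.3f}, direct effect (c') = {c_prime:.3f}")
```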

Conduct More Effectiveness Research on Discrete, Multifaceted, and Tailored Implementation Strategies

There is a need for more and better effectiveness research on discrete, multifaceted, and tailored implementation strategies using a wider range of innovative designs (70, 79–82). First, while a number of discrete implementation strategies have been described (6, 7, 24, 25) and tested (38), there are gaps in our understanding of how to optimize these strategies. There are over 140 randomized trials of audit and feedback, but Ivers et al. (83) conclude that there is much to learn about when it will work best and why, and how to design reliable and effective audit and feedback strategies across different settings and providers. Audit and feedback illustrates how complex even a single discrete strategy can be. The ICEBeRG group (69) pointed out that varying just five modifiable elements of audit and feedback (content, intensity, method of delivery, duration, and context) produces 288 potential combinations. These variations matter (84), and there is a need for tests of audit and feedback and other discrete implementation strategies with clearly described components that are theoretically and empirically derived and well-operationalized. The results of these studies could inform the use of discrete strategies and their inclusion in multifaceted strategies.
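To make the combinatorial point concrete, the snippet below enumerates variants of the five modifiable elements; the option counts per element are hypothetical (one factorization that yields the cited 288), not the levels specified by the ICEBeRG group.

```python
# Enumerate hypothetical option counts for the five modifiable audit-and-feedback
# elements; the counts below are illustrative assumptions (one factorization
# consistent with 288), not figures taken from the ICEBeRG paper.
from itertools import product
from math import prod

elements = {
    "content": 2,
    "intensity": 4,
    "method of delivery": 4,
    "duration": 3,
    "context": 3,
}

combinations = list(product(*(range(k) for k in elements.values())))
assert len(combinations) == prod(elements.values()) == 288
print(f"{len(combinations)} distinct audit-and-feedback variants")
```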

Second, there is a need for trials that give insight into the sequencing of multifaceted strategies and what to do if the first strategy fails (39). These strategies could be compared to discrete/single-component implementation strategies or multifaceted strategies of varying complexity and intensity with well-defined components that are theoretically aligned with implementation determinants. These strategies could be tested using MOST, SMART, or other variants of factorial designs that can evaluate the relative impact of various components of multifaceted strategies and inform their sequencing (70, 85).
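As a rough sketch of the SMART logic referenced above, the simulation below assigns hypothetical clinics to a first-stage strategy and re-randomizes non-responders to an augmented or alternative strategy; the strategy labels and response rate are invented for illustration.

```python
# Hypothetical SMART assignment sketch: clinics start on a first-stage strategy;
# non-responders are re-randomized to augment or switch. The strategy names and
# the 40% response rate are invented assumptions.
import random

random.seed(1)


def smart_assign(n_clinics: int):
    """Simulate two-stage adaptive assignment for a set of clinics."""
    histories = []
    for _ in range(n_clinics):
        first = random.choice(["facilitation", "audit and feedback"])
        responded = random.random() < 0.4  # assumed first-stage response rate
        if responded:
            second = "continue " + first
        else:
            second = random.choice(["augment with coaching", "switch strategies"])
        histories.append((first, responded, second))
    return histories


for first, responded, second in smart_assign(5):
    print(f"stage 1: {first:18s} responded={responded!s:5s} stage 2: {second}")
```

Analyzing the resulting assignment histories is what lets a SMART compare not just individual strategies but adaptive sequences of strategies.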

Finally, tests of strategies that are prospectively tailored to different implementation contexts to address specific implementers, implementation behaviors, or determinants are needed (37). This work could involve comparisons between tailored and non-tailored multifaceted implementation strategies (86), as well as tests of established and innovative methods that could inform the identification, selection, and tailoring of implementation strategies (55, 56).

Increase Economic Evaluations of Implementation Strategies

Few studies include economic evaluations of implementation strategies (87, 88). For example, in a systematic review of 235 implementation studies, only 10% provided information about implementation costs (87). The dearth of economic evaluations severely limits our ability to understand which strategies might be feasible for different contexts, as some decision makers might underestimate the resources required to implement and sustain EBPs, while others might overestimate them and preemptively rule out EBPs that could benefit their communities (89). Incorporating economic analyses into studies of implementation strategies would provide decision makers with more complete information to guide strategy selection, and would encourage researchers to be more judicious and pragmatic in their design and selection of implementation strategies, narrowing attention to the strategies and mechanisms hypothesized to be most essential. If methods for designing and tailoring strategies can be improved such that complex multifaceted strategies are proven superior to single-component or less complex multifaceted strategies (39), and tailored strategies are proven superior to more standard multifaceted strategies (37, 40, 43, 55), economic evaluations will be instrumental in demonstrating whether improvements in implementation are worth the added costs. Practical tools for integrating economic evaluations within implementation studies have been developed, such as the Costs of Implementing New Strategies (COINS) method (89), which was developed to address the need for standardized methods for analyzing cost data in implementation research that extend beyond the cost of the clinical intervention itself (90). For example, the original COINS study presented a head-to-head trial of two implementation approaches; although one approach was significantly more costly, the implementation outcomes achieved were superior enough to warrant the additional resources (91). Increasing the number and relevance of economic evaluations will require the development of a common framework that promotes comparability across studies (88).
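A common summary measure in such evaluations is the incremental cost-effectiveness ratio (ICER): the added cost per added unit of effect of one strategy over another. The sketch below computes it for invented figures; the numbers are illustrative and are not drawn from the COINS study.

```python
# Incremental cost-effectiveness ratio (ICER) for two hypothetical implementation
# strategies; all cost and effect figures are invented for illustration.
def icer(cost_a: float, effect_a: float, cost_b: float, effect_b: float) -> float:
    """Added cost per added unit of effect for strategy B relative to A."""
    delta_effect = effect_b - effect_a
    if delta_effect == 0:
        raise ValueError("Strategies are equally effective; the ICER is undefined.")
    return (cost_b - cost_a) / delta_effect

# e.g., strategy A: $40,000 for 20 fidelity points; strategy B: $65,000 for 32
print(f"ICER = ${icer(40_000, 20, 65_000, 32):,.0f} per fidelity point gained")
```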

Improve Tracking and Reporting of Implementation Strategies

Developing a robust evidence base for implementation strategies will require that their use be contemporaneously tracked and that they be reported in the literature with sufficient detail (92). It is often difficult to ascertain which implementation strategies were used and how they might be replicated. Part of the challenge is the iterative nature of implementation. Even if strategies are meticulously described in a study protocol or trial registry, it is often unrealistic to expect that they will not need to be altered as determinants emerge across implementation phases (13, 93, 94). These changes are likely to occur within and between implementing sites in both research studies and applied efforts (50, 51), and without rigorous methods for tracking implementation strategy use, efforts to understand which strategies were used and whether or not they were effective are stymied. Even when strategies are reported in study protocols or empirical articles, there are numerous problems with their description, including inconsistent labeling; lack of operational definitions; poor description and absence of manuals to guide their use; and lack of a clear theoretical, empirical, or pragmatic justification for how the strategies were developed and applied (4). Poor reporting clouds the interpretation of results, precludes replication in research and practice, and limits our ability to synthesize findings across studies (4, 92). Findings from systematic reviews illustrate this problem. For example, Nadeem et al.'s (95) review of learning collaboratives concluded that “reporting on specific components of the collaborative was imprecise across articles, rendering it impossible to identify active quality improvement collaborative ingredients linked to improved care.”

A number of reporting guidelines could be leveraged to improve descriptions of strategies (4, 96–100). Proctor et al. (4) recommend that researchers name and define strategies in ways that are consistent with the published literature, and carefully operationalize each strategy by specifying its: (1) actor(s), (2) action(s), (3) action target(s), (4) temporality, (5) dose, (6) implementation outcomes affected, and (7) theoretical, empirical, or pragmatic justification. Specifying strategies in this way has the potential to increase our understanding not only of which strategies are most effective but, more importantly, of the processes and mechanisms by which they exert their effects (29, 67). Additional options that provide structured reporting recommendations include the Workgroup for Intervention Development and Evaluation Research (WIDER) recommendations (99, 100), the Simplified Framework (96) and its extension [AIMD; (97)], and the Template for Intervention Description and Replication (TIDieR) checklist (98). Though not specific to the reporting of implementation strategies, the Standards for Reporting Implementation Studies (101) and the Neta et al. (102) reporting framework emphasize how critical it is to report on the multilevel context of implementation. The use of any of the existing guidelines would enhance the clarity of strategy description. We believe that developing approaches to tracking implementation strategies (50, 51), and assessing the extent to which they are pragmatic (e.g., acceptable, compatible, easy, and useful) for both research and applied efforts, is a high priority. Further, efficient ways of linking empirical studies with study protocols to gauge the degree to which strategies have been adapted or tailored over the course of an implementation effort would be helpful. Failing to improve the quality of reporting will negate other advances in this area by hindering replication.
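One way to operationalize this recommendation is to capture the seven Proctor et al. (4) dimensions in a structured record that can double as a tracking instrument; the sketch below is a hypothetical encoding, with example values invented for illustration.

```python
# Sketch of a tracking record built around Proctor et al.'s seven specification
# dimensions; the field values below are hypothetical examples.
from dataclasses import dataclass, field, asdict


@dataclass
class StrategySpecification:
    name: str
    actor: str                  # who enacts the strategy
    action: str                 # what they do
    action_target: str          # toward what or whom
    temporality: str            # when, and at what implementation phase
    dose: str                   # frequency and intensity
    outcomes_affected: list[str] = field(default_factory=list)
    justification: str = ""     # theoretical, empirical, or pragmatic rationale


example = StrategySpecification(
    name="audit and feedback",
    actor="quality improvement coordinator",
    action="deliver performance reports with peer comparisons",
    action_target="primary care clinicians' prescribing behavior",
    temporality="monthly during the first year of implementation",
    dose="12 reports; ~15-minute review each",
    outcomes_affected=["fidelity", "adoption"],
    justification="prospectively identified gap in performance data",
)
print(asdict(example))
```

Logging such records contemporaneously, and diffing them against the study protocol, would make adaptations to strategies visible rather than lost to memory.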

Conclusion

Implementation science has advanced considerably, yielding a more robust understanding of implementation strategies. Several resources can inform the use of implementation strategies, including established taxonomies of implementation strategies (6, 7, 24, 25) and behavior change techniques (27, 28), repositories of systematic reviews (38, 103, 104), methods for selecting and tailoring implementation strategies (40, 55, 56), and reporting guidelines that promote replicability (4, 98–100). Nevertheless, questions remain, and further effectiveness research and methodological development are needed to ensure that evidence is effectively translated into public health impact. Advancing these priorities will lead to a better understanding of when, where, why, and how implementation strategies exert their effects (29, 63).

Author Contributions

BP conceptualized the paper and wrote the first draft of the manuscript. All other authors contributed to the writing and approved the final manuscript.

Funding

BP was supported by grants and contracts from the NIH, including K01MH113806, R25MH104660, UL1TR002489, R01MH106510, R01MH103310, P30AI050410, and R25MH080916. NW was supported by P50MH113840 from the NIMH. RB was supported by grants from the NIMH through R21MH109878 and P50MH113840. CL was supported by R01MH106510 and R01MH103310 from the NIMH. SM was supported by a Fulbright-Health Research Board Impact Award.

Conflict of Interest Statement

The authors declare that the research was conducted in the absence of any commercial or financial relationships that could be construed as a potential conflict of interest.

References

1. Grol R, Grimshaw JM. Evidence-based implementation of evidence-based medicine. Jt Comm J Qual Improv. (1999) 25:503–13. doi: 10.1016/S1070-3241(16)30464-3


2. Eccles MP, Mittman BS. Welcome to implementation science. Implement Sci. (2006) 1:1–3. doi: 10.1186/1748-5908-1-1


3. Vinson CA, Stamatkis KA, Kerner JF. Dissemination and implementation research in community and public health settings. In: Brownson RC, Colditz GA, Proctor EK, editors, Dissemination and Implementation Research in Health: Translating Science to Practice. New York, NY: Oxford University Press (2018). pp. 355–70.


4. Proctor EK, Powell BJ, McMillen JC. Implementation strategies: recommendations for specifying and reporting. Implement Sci. (2013) 8:1–11. doi: 10.1186/1748-5908-8-139


5. Powell BJ, Garcia K, Fernandez ME. Implementation strategies. In: Chambers D, Vinson C, Norton W, editors. Optimizing the Cancer Control Continuum: Advancing Implementation Research. New York NY: Oxford University Press (2019). pp. 98–120.


6. Powell BJ, McMillen JC, Proctor EK, Carpenter CR, Griffey RT, Bunger AC, et al. A compilation of strategies for implementing clinical innovations in health and mental health. Med Care Res Rev. (2012) 69:123–57. doi: 10.1177/1077558711430690


7. Powell BJ, Waltz TJ, Chinman MJ, Damschroder LJ, Smith JL, Matthieu MM, et al. A refined compilation of implementation strategies: results from the Expert Recommendations for Implementing Change (ERIC) project. Implement Sci. (2015) 10:1–14. doi: 10.1186/s13012-015-0209-1


8. Shojania KG, Jennings A, Mayhew A, Ramsay CR, Eccles MP, Grimshaw JM. The effects of on-screen, point of care computer reminders on processes and outcomes of care. Cochrane Database Syst Rev. (2009) CD001096:1–68. doi: 10.1002/14651858.CD001096.pub2


9. Ivers N, Jamtvedt G, Flottorp S, Young JM, Odgaard-Jensen J, French SD, et al. Audit and feedback: effects on professional practice and healthcare outcomes. Cochrane Database Syst Rev. (2012) CD000259:1–227. doi: 10.1002/14651858.CD000259


10. Glisson C, Schoenwald S, Hemmelgarn A, Green P, Dukes D, Armstrong KS, et al. Randomized trial of MST and ARC in a two-level evidence-based treatment implementation strategy. J Consult Clin Psychol. (2010) 78:537–50. doi: 10.1037/a0019160


11. Aarons GA, Ehrhart MG, Farahnak LR, Hurlburt MS. Leadership and organizational change for implementation (LOCI): a randomized mixed method pilot study of a leadership and organization development intervention for evidence-based practice implementation. Implement Sci. (2015) 10:1–12. doi: 10.1186/s13012-014-0192-y


12. Chambers DA, Azrin ST. Partnership: a fundamental component of dissemination and implementation research. Psychiatr Serv. (2013) 64:509–11. doi: 10.1176/appi.ps.201300032


13. Aarons GA, Hurlburt M, Horwitz SM. Advancing a conceptual model of evidence-based practice implementation in public service sectors. Adm Policy Ment Health Ment Health Serv Res. (2011) 38:4–23. doi: 10.1007/s10488-010-0327-7


14. Damschroder LJ, Aron DC, Keith RE, Kirsh SR, Alexander JA, Lowery JC. Fostering implementation of health services research findings into practice: a consolidated framework for advancing implementation science. Implement Sci. (2009) 4:1–15. doi: 10.1186/1748-5908-4-50


15. Flottorp SA, Oxman AD, Krause J, Musila NR, Wensing M, Godycki-Cwirko M, et al. A checklist for identifying determinants of practice: a systematic review and synthesis of frameworks and taxonomies of factors that prevent or enable improvements in healthcare professional practice. Implement Sci. (2013) 8:1–11. doi: 10.1186/1748-5908-8-35


16. Benjamin Wolk C, Powell BJ, Beidas RS. Contextual Influences and Strategies for Dissemination and Implementation in Mental Health. New York, NY: Oxford Handbooks Online (2015).


17. Gagliardi AR, Légaré F, Brouwers MC, Webster F, Badley E, Straus S. Patient-mediated knowledge translation (PKT) interventions for clinical encounters: a systematic review. Implement Sci. (2016) 11:1–13. doi: 10.1186/s13012-016-0389-3


18. Flanagan ME, Ramanujam R, Doebbeling BN. The effect of provider- and workflow-focused strategies for guideline implementation on provider acceptance. Implement Sci. (2009) 4:1–10. doi: 10.1186/1748-5908-4-71


19. Wensing M, Laurant M, Ouwens M, Wollersheim H. Organizational implementation strategies for change. In: Grol R, Wensing M, Eccles M, Davis D, editors. Improving Patient Care: The Implementation of Change in Health Care. Chichester, West Sussex: Wiley-Blackwell (2013). pp. 240–253.


20. Rabin BA, Glasgow RE, Kerner JF, Klump MP, Brownson RC. Dissemination and implementation research on community-based cancer prevention: a systematic review. Am J Prev Med. (2010) 38:443–56. doi: 10.1016/j.amepre.2009.12.035


21. Chinman M, Acosta J, Ebener P, Malone PS, Slaughter ME. Can implementation support help community-based settings better deliver evidence-based sexual health promotion programs? A randomized trial of Getting To Outcomes®. Implement Sci. (2016) 11:78. doi: 10.1186/s13012-016-0446-y


22. Wensing M, Eccles M, Grol R. Economic and policy strategies for implementation of change. In: Grol R, Wensing M, Eccles M, Davis D, editors. Improving Patient Care: The Implementation of Change in Health Care. Chichester, West Sussex: Wiley-Blackwell (2013). pp. 269–277.


23. Weiner BJ, Lewis MA, Clauser SB, Stitzenberg KB. In search of synergy: strategies for combining interventions at multiple levels. JNCI Monogr. (2012) 44:34–41. doi: 10.1093/jncimonographs/lgs001


24. Cochrane Effective Practice and Organisation of Care Group. Data Collection Checklist. (2002). Available online at: http://epoc.cochrane.org/sites/epoc.cochrane.org/files/uploads/datacollectionchecklist.pdf

25. Mazza D, Bairstow P, Buchan H, Chakraborty SP, Van Hecke O, Grech C, et al. Refining a taxonomy for guideline implementation: results of an exercise in abstract classification. Implement Sci. (2013) 8:1–10. doi: 10.1186/1748-5908-8-32


26. Effective Practice and Organisation of Care (EPOC). EPOC Taxonomy. (2015) Available online at: https://epoc.cochrane.org/epoc-taxonomy

27. Michie S, Richardson M, Johnston M, Abraham C, Francis J, Hardeman W, et al. The behavior change technique taxonomy (v1) of 93 hierarchically clustered techniques: building an international consensus for the reporting of behavior change interventions. Ann Behav Med. (2013) 46:81–95. doi: 10.1007/s12160-013-9486-6


28. Kok G, Gottlieb NH, Peters GY, Mullen PD, Parcel GS, Ruiter RAC, et al. A taxonomy of behaviour change methods: an intervention mapping approach. Health Psychol Rev. (2016) 10:297–312. doi: 10.1080/17437199.2015.1077155


29. Mittman BS. Implementation science in health care. In: Brownson RC, Colditz GA, Proctor EK, editors. Dissemination and Implementation Research in Health: Translating Science to Practice. New York, NY: Oxford University Press (2018). pp. 400–418.

30. Oxman AD, Thomson MA, Davis DA, Haynes B. No magic bullets: a systematic review of 102 trials of interventions to improve professional practice. Can Med Assoc J. (1995) 153:1424–31.


31. Forsetlund L, Bjørndal A, Rashidian A, Jamtvedt G, O'Brien MA, Wolf F, et al. Continuing education meetings and workshops: effects on professional practice and health care outcomes. Cochrane Database Syst Rev. (2009) 2:CD003030. doi: 10.1002/14651858.CD003030.pub2.


32. Farmer AP, Légaré F, Turcot L, Grimshaw JM, Harvey E, McGowan J, Wolf FM. Printed educational materials: effects on professional practice and health care outcomes. Cochrane Database Syst Rev. (2011) 3:CD004398. doi: 10.1002/14651858.CD004398.pub2


33. Flodgren G, Parmelli E, Doumit G, Gattellari M, O'Brien MA, Grimshaw J, et al. Local opinion leaders: effects on professional practice and health care outcomes. Cochrane Database Syst Rev. (2011) 8:CD000125. doi: 10.1002/14651858.CD000125.pub4


34. Grimshaw JM, Eccles MP, Lavis JN, Hill SJ, Squires JE. Knowledge translation of research findings. Implement Sci. (2012) 7:1–17. doi: 10.1186/1748-5908-7-50


35. Giguère A, Légaré F, Grimshaw J, Turcotte S, Fiander M, Grudniewicz A, et al. Printed educational materials: effects on professional practice and healthcare outcomes. Cochrane Database Syst Rev. (2012) 10:CD004398. doi: 10.1002/14651858.CD004398.pub3


36. O'Brien MA, Rogers S, Jamtvedt G, Oxman AD, Odgaard-Jensen J, Kristoffersen DT, et al. Educational outreach visits: effects on professional practice and health care outcomes. Cochrane Database Syst Rev. (2007) 4:CD000409. doi: 10.1002/14651858.CD000409.pub2


37. Baker R, Camosso-Stefinovic J, Gillies C, Shaw EJ, Cheater F, Flottorp S, et al. Tailored interventions to address determinants of practice. Cochrane Database Syst Rev. (2015) 4:1–118. doi: 10.1002/14651858.CD005470.pub3


38. Cochrane Collaboration. Cochrane Effective Practice Organisation of Care Group. (2013) Available online at: http://epoc.cochrane.org

39. Squires JE, Sullivan K, Eccles MP, Worswick J, Grimshaw JM. Are multifaceted interventions more effective than single component interventions in changing healthcare professionals' behaviours? An overview of systematic reviews. Implement Sci. (2014) 9:152. doi: 10.1186/s13012-014-0152-6


40. Wensing M. The Tailored Implementation in Chronic Diseases (TICD) project: introduction and main findings. Implement Sci. (2017) 12:1–4. doi: 10.1186/s13012-016-0536-x


41. Wensing M, Bosch M, Grol R. Selecting, tailoring, and implementing knowledge translation interventions. In: Straus S, Tetroe J, Graham ID, editors. Knowledge Translation in Health Care: Moving From Evidence to Practice. Oxford, UK: Wiley-Blackwell. pp. 94–113.

42. Bosch M, van der Weijden T, Wensing M, Grol R. Tailoring quality improvement interventions to identified barriers: a multiple case analysis. J Eval Clin Pract. (2007) 13:161–8. doi: 10.1111/j.1365-2753.2006.00660.x


43. Wensing M, Oxman A, Baker R, Godycki-Cwirko M, Flottorp S, Szecsenyi J, et al. Tailored Implementation for Chronic Diseases (TICD): a project protocol. Implement Sci. (2011) 6:1–8. doi: 10.1186/1748-5908-6-103


44. Highfield L, Valeria MA, Fernandez ME, Bartholomew Eldridge K. Development of an implementation intervention using intervention mapping to increase mammography among low income women. Front Public Health (2018) 6:300. doi: 10.3389/fpubh.2018.00300


45. Williams NJ, Glisson C, Hemmelgarn A, Green P. Mechanisms of change in the ARC organizational strategy: increasing mental health clinicians' EBP adoption through improved organizational culture and capacity. Adm Policy Ment Health (2017) 44:269–83. doi: 10.1007/s10488-016-0742-5


46. Gude WT, Roos-Blom M, van der Veer SN, de Jonge E, Peek N, Dongelmans DA, et al. Electronic audit and feedback intervention with action implementation toolbox to improve pain management in intensive care: protocol for a laboratory experiment and cluster randomised trial. Implement Sci. (2017) 12:1–12. doi: 10.1186/s13012-017-0594-8


47. Kilbourne AM, Almirall D, Eisenberg D, Waxmonsky J, Goodrich DE, Fortney JC, et al. Protocol: adaptive Implementation of Effective Programs Trial (ADEPT): cluster randomized SMART trial comparing a standard versus enhanced implementation strategy to improve outcomes of a mood disorders program. Implement Sci. (2014) 9:1–14. doi: 10.1186/s13012-014-0132-x


48. Tailored Implementation for Chronic Diseases (2017). Available online at: https://www.biomedcentral.com/collections/TICD

49. Hoomans T, Severens JL. Economic evaluation of implementation strategies in health care. Implement Sci. (2014) 9:1–6. doi: 10.1186/s13012-014-0168-y


50. Bunger AC, Powell BJ, Robertson HA, MacDowell H, Birken SA, Shea C. Tracking implementation strategies: a description of a practical approach and early findings. Health Res Policy Syst. (2017) 15:1–12. doi: 10.1186/s12961-017-0175-y


51. Boyd MR, Powell BJ, Endicott D, Lewis CC. A method for tracking implementation strategies: an exemplar implementing measurement-based care in community behavioral health clinics. Behav Ther. (2018) 49:525–37. doi: 10.1016/j.beth.2017.11.012


52. Bunger AC, Hanson RF, Doogan NJ, Powell BJ, Cao Y, Dunn J. Can learning collaboratives support implementation by rewiring professional networks? Adm Policy Ment Health Ment Health Serv Res. (2016) 43:79–92. doi: 10.1007/s10488-014-0621-x


53. Michie S, Atkins L, Gainforth HL. Changing behaviour to improve clinical practice and policy. In: Dias P, Gonçalves A, Azevedo A, Lobo F, editors. Novos Desafios, Novas Competências: Contributos Atuais da Psicologia. Braga: Axioma - Publicações da Faculdade de Filosofia. pp. 41–60.

54. Powell BJ, Proctor EK. Learning from implementation as usual in children's mental health. Implement Sci. (2016) 11:26–27. doi: 10.1186/1748-5908-8-92


55. Powell BJ, Beidas RS, Lewis CC, Aarons GA, McMillen JC, Proctor EK, et al. Methods to improve the selection and tailoring of implementation strategies. J Behav Health Serv Res. (2017) 44:177–94. doi: 10.1007/s11414-015-9475-6


56. Colquhoun HL, Squires JE, Kolehmainen N, Grimshaw JM. Methods for designing interventions to change healthcare professionals' behaviour: a systematic review. Implement Sci. (2017) 12:1–11. doi: 10.1186/s13012-017-0560-5


57. Grol R, Bosch M, Wensing M. Development and selection of strategies for improving patient care. In: Grol R, Wensing M, Eccles M, Davis D, editors. Improving Patient Care: The Implementation of Change in Health Care. Chichester, West Sussex: Wiley-Blackwell (2013). pp. 165–184.

58. Bartholomew Eldridge LK, Markham CM, Ruiter RAC, Fernández ME, Kok G, Parcel GS. Planning Health Promotion Programs: An Intervention Mapping Approach. 4th edition. San Francisco, CA: Jossey-Bass, Inc. (2016).


59. Kane M, Trochim WMK. Concept Mapping for Planning and Evaluation. Thousand Oaks, CA: Sage (2007).


60. Farley K, Thompson C, Hanbury A, Chambers D. Exploring the feasibility of conjoint analysis as a tool for prioritizing innovations for implementation. Implement Sci. (2013) 8:1–9. doi: 10.1186/1748-5908-8-56


61. Hovmand PS. Community Based System Dynamics. New York, NY: Springer (2014).


62. Krause J, Van Lieshout J, Klomp R, Huntink E, Aakhus E, Flottorp S, et al. Identifying determinants of care for tailoring implementation in chronic diseases: an evaluation of different methods. Implement Sci. (2014) 9:102. doi: 10.1186/s13012-014-0102-3


63. Williams NJ. Multilevel mechanisms of implementation strategies in mental health: integrating theory, research, and practice. Adm Policy Ment Health Ment Health Serv Res. (2016) 43:783–98. doi: 10.1007/s10488-015-0693-2


64. National Institutes of Health. Dissemination and Implementation Research in Health (R01). Bethesda, MD: National Institutes of Health (2016). Available online at: http://grants.nih.gov/grants/guide/pa-files/PAR-16-238.html

65. Edmondson D, Falzon L, Sundquist KJ, Julian J, Meli L, Sumner JA, et al. A systematic review of the inclusion of mechanisms of action in NIH-funded intervention trials to improve medication adherence. Behav Res Ther. (2018) 101:12–9. doi: 10.1016/j.brat.2017.10.001


66. Williams NJ, Beidas RS. The state of implementation science in child psychology and psychiatry: a review and suggestions to advance the field. J Child Psychol Psychiatry (2018). doi: 10.1111/jcpp.12960. [Epub ahead of print].


67. Lewis CC, Klasnja P, Powell BJ, Lyon AR, Tuzzio L, Jones S, et al. From classification to causality: advancing understanding of mechanisms of change in implementation science. Front Public Health (2018) 6:1–6. doi: 10.3389/fpubh.2018.00136


68. Grol R, Bosch MC, Hulscher MEJL, Eccles MP, Wensing M. Planning and studying improvement in patient care: the use of theoretical perspectives. Milbank Q. (2007) 85:93–138. doi: 10.1111/j.1468-0009.2007.00478.x


69. The Improved Clinical Effectiveness through Behavioural Research Group (ICEBeRG). Designing theoretically-informed implementation interventions. Implement Sci. (2006) 1:1–8. doi: 10.1186/1748-5908-1-4


70. Brown CH, Curran G, Palinkas LA, Aarons GA, Wells KB, Jones L, et al. An overview of research and evaluation designs for dissemination and implementation. Annu Rev Public Health (2017) 38:1–22. doi: 10.1146/annurev-publhealth-031816-044215


71. Brown CH, Have TRT, Jo B, Dagne G, Wyman PA, Muthen B, et al. Adaptive designs for randomized trials in public health. Annu Rev Public Health (2009) 30:1–25. doi: 10.1146/annurev.publhealth.031308.100223


72. Burke JG, Lich KH, Neal JW, Meissner HI, Yonas M, Mabry PL. Enhancing dissemination and implementation research using systems science methods. Int J Behav Med. (2015) 22:283–91. doi: 10.1007/s12529-014-9417-3


73. Zimmerman L, Lounsbury D, Rosen C, Kimerling R, Trafton J, Lindley S. Participatory system dynamics modeling: increasing engagement and precision to improve implementation planning in systems. Adm Policy Ment Health Ment Health Serv Res. (2016) 43:834–49. doi: 10.1007/s10488-016-0754-1


74. Palinkas LA, Aarons GA, Horwitz S, Chamberlain P, Hurlburt M, Landsverk J. Mixed methods designs in implementation research. Adm Policy Ment Health Ment Health Serv Res. (2011) 38:44–53. doi: 10.1007/s10488-010-0314-z


75. Aarons GA, Fettes DL, Sommerfeld DH, Palinkas LA. Mixed methods for implementation research: application to evidence-based practice implementation and staff turnover in community-based organizations providing child welfare services. Child Maltreat. (2012) 17:67–79. doi: 10.1177/1077559511426908


76. Alexander JA, Hearld LR. Methods and metrics challenges of delivery-systems research. Implement Sci. (2012) 7:15. doi: 10.1186/1748-5908-7-15


77. Patient Centered Outcomes Research Institute. Standards for Studies of Complex Interventions. Washington, DC: Patient-Centered Outcomes Research Institute (2018). Available online at: https://www.pcori.org/research-results/about-our-research/research-methodology/pcori-methodology-standards?utm_source=Funding+awards%2C+GAO+Board+deadline&utm_campaign=Funding+awards%2C+GAO+Board+deadline&utm_medium=email#Complex

78. Michie S, Carey RN, Johnston M, Rothman AJ, de Bruin M, Kelly MP, et al. From theory-inspired to theory-based interventions: a protocol for developing and testing a methodology for linking behaviour change techniques to theoretical mechanisms of action. Ann Behav Med. (2016) 52:501–12. doi: 10.1007/s12160-016-9816-6


79. Institute of Medicine. Initial National Priorities for Comparative Effectiveness Research. Washington, DC: The National Academies Press (2009).

80. Eccles MP, Armstrong D, Baker R, Cleary K, Davies H, Davies S, et al. An implementation research agenda. Implement Sci. (2009) 4:1–7. doi: 10.1186/1748-5908-4-18


81. Newman K, Van Eerd D, Powell BJ, Urquhart R, Cornelissen E, Chan V, et al. Identifying priorities in knowledge translation from the perspective of trainees: results from an online survey. Implement Sci. (2015) 10:1–4. doi: 10.1186/s13012-015-0282-5


82. Mazzucca S, Tabak RG, Pilar M, Ramsey AT, Baumann AA, Kryzer E, et al. Variation in research designs used to test the effectiveness of dissemination and implementation strategies: a review. Front Public Health (2018) 6:1–10. doi: 10.3389/fpubh.2018.00032


83. Ivers NM, Sales A, Colquhoun H, Michie S, Foy R, Francis JJ, et al. No more ‘business as usual’ with audit and feedback interventions: towards an agenda for a reinvigorated intervention. Implement Sci. (2014) 9:14. doi: 10.1186/1748-5908-9-14


84. Hysong SJ. Audit and feedback features impact effectiveness on care quality. Med Care (2009) 47:356–63. doi: 10.1097/MLR.0b013e3181893f6b


85. Collins LM, Murphy SA, Strecher V. The multiphase optimization strategy (MOST) and the sequential multiple assignment randomized trial (SMART): new methods for more potent eHealth interventions. Am J Prev Med. (2007) 32:S112–8. doi: 10.1016/j.amepre.2007.01.022


86. Lewis CC, Scott K, Marti CN, Marriott BR, Kroenke K, Putz JW, et al. Implementing measurement-based care (iMBC) for depression in community mental health: a dynamic cluster randomized trial study protocol. Implement Sci. (2015) 10:1–14. doi: 10.1186/s13012-015-0313-2


87. Vale L, Thomas R, MacLennan G, Grimshaw J. Systematic review of economic evaluations and cost analyses of guideline implementation strategies. Eur J Health Econ. (2007) 8:111–21. doi: 10.1007/s10198-007-0043-8


88. Raghavan R. The role of economic evaluation in dissemination and implementation research. In: Brownson RC, Colditz GA, Proctor EK, editors Dissemination and Implementation Research in Health: Translating Science to Practice. New York, NY: Oxford University Press (2018). p. 89–106.


89. Saldana L, Chamberlain P, Bradford WD, Campbell M, Landsverk J. The cost of implementing new strategies (COINS): a method for mapping implementation resources using the stages of implementation completion. Child Youth Serv Rev. (2014) 39:177–82. doi: 10.1016/j.childyouth.2013.10.006


90. Ritzwoller DP, Sukhanova A, Gaglio B, Glasgow RE. Costing behavioral interventions: a practical guide to enhance translation. Ann Behav Med. (2009) 37:218–27. doi: 10.1007/s12160-009-9088-5


91. Brown CH, Chamberlain P, Saldana L, Padgett C, Wang W, Cruden G. Evaluation of two implementation strategies in 51 child county public service systems in two states: results of a cluster randomized head-to-head implementation trial. Implement Sci. (2014) 9:1–15. doi: 10.1186/s13012-014-0134-8


92. Michie S, Fixsen DL, Grimshaw JM, Eccles MP. Specifying and reporting complex behaviour change interventions: the need for a scientific method. Implement Sci. (2009) 4:1–6. doi: 10.1186/1748-5908-4-40


93. Dunbar J, Hernan A, Janus E, Davis-Lameloise N, Asproloupos D, O'Reilly S, et al. Implementation salvage experiences from the Melbourne diabetes prevention study. BMC Public Health (2012) 12:806. doi: 10.1186/1471-2458-12-806.


94. Hoagwood KE, Chaffin M, Chamberlain P, Bickman L, Mittman B. Implementation salvage strategies: maximizing methodological flexibility in children's mental health research. In: 4th Annual NIH Conference on the Science of Dissemination and Implementation. Washington, DC (2011).


95. Nadeem E, Olin S, Hoagwood KE, Horwitz SM. Understanding the components of quality improvement collaboratives: a systematic literature review. Milbank Q (2013) 91:354–94. doi: 10.1111/milq.12016


96. Colquhoun H, Leeman J, Michie S, Lokker C, Bragge P, Hempel S, et al. Towards a common terminology: a simplified framework of interventions to promote and integrate evidence into health practices, systems, and policies. Implement Sci. (2014) 9:51. doi: 10.1186/1748-5908-9-51


97. Bragge P, Grimshaw JM, Lokker C, Colquhoun H, The AIMD Writing/Working Group. AIMD - a validated, simplified framework of interventions to promote and integrate evidence into health practices, systems, and policies. BMC Med Res Methodol. (2017) 17:38. doi: 10.1186/s12874-017-0314-8


98. Hoffmann TC, Glasziou PP, Boutron I, Milne R, Perera R, Moher D, et al. Better reporting of interventions: template for intervention description and replication (TIDieR) checklist and guide. BMJ (2014) 348:g1687. doi: 10.1136/bmj.g1687


99. Workgroup for Intervention Development and Evaluation Research. WIDER Recommendations to Improve Reporting of the Content of Behaviour Change Interventions (2008). Available online at: https://implementationscience.biomedcentral.com/articles/10.1186/1748-5908-7-70

100. Albrecht L, Archibald M, Arseneau D, Scott SD. Development of a checklist to assess the quality of reporting of knowledge translation interventions using the Workgroup for Intervention Development and Evaluation Research (WIDER) recommendations. Implement Sci. (2013) 8:52. doi: 10.1186/1748-5908-8-52


101. Pinnock H, Epiphaniou E, Sheikh A, Griffiths C, Eldridge S, Craig P, et al. Developing standards for reporting implementation studies of complex interventions (StaRI): a systematic review and e-Delphi. Implement Sci. (2015) 10:1–9. doi: 10.1186/s13012-015-0235-z


102. Neta G, Glasgow RE, Carpenter CR, Grimshaw JM, Rabin BA, Fernandez ME, et al. A framework for enhancing the value of research for dissemination and implementation. Am J Public Health (2015) 105:49–57. doi: 10.2105/AJPH.2014.302206


103. McMaster University. Health Systems Evidence (2012). Available online at: http://www.mcmasterhealthforum.org/healthsystemsevidence-en

104. Rx for Change. Interventions Database (2011). Available online at: https://www.cadth.ca/rx-change

Keywords: implementation strategies, implementation science, designing and tailoring, mechanisms, effectiveness research, economic evaluation, reporting guidelines

Citation: Powell BJ, Fernandez ME, Williams NJ, Aarons GA, Beidas RS, Lewis CC, McHugh SM and Weiner BJ (2019) Enhancing the Impact of Implementation Strategies in Healthcare: A Research Agenda. Front. Public Health 7:3. doi: 10.3389/fpubh.2019.00003

Received: 16 October 2018; Accepted: 04 January 2019;
Published: 22 January 2019.

Edited by:

Mary Evelyn Northridge, New York University, United States

Reviewed by:

Deborah Paone, Independent Researcher, Minneapolis, MN, United States
Christopher Mierow Maylahn, New York State Department of Health, United States

Copyright © 2019 Powell, Fernandez, Williams, Aarons, Beidas, Lewis, McHugh and Weiner. This is an open-access article distributed under the terms of the Creative Commons Attribution License (CC BY). The use, distribution or reproduction in other forums is permitted, provided the original author(s) and the copyright owner(s) are credited and that the original publication in this journal is cited, in accordance with accepted academic practice. No use, distribution or reproduction is permitted which does not comply with these terms.

*Correspondence: Byron J. Powell, bjpowell@unc.edu

Disclaimer: All claims expressed in this article are solely those of the authors and do not necessarily represent those of their affiliated organizations, or those of the publisher, the editors and the reviewers. Any product that may be evaluated in this article or claim that may be made by its manufacturer is not guaranteed or endorsed by the publisher.