
Training needs and supports for evidence-based decision making among the public health workforce in the United States

Abstract

Background

Preparing the public health workforce to practice evidence-based decision making (EBDM) is necessary to effectively impact health outcomes. Few studies report on training needs in EBDM at the national level in the United States. We report competency gaps to practice EBDM based on four U.S. national surveys we conducted with the state and local public health workforce between 2008 and 2013.

Methods

We compared self-reported data from four U.S. national online surveys on EBDM conducted between 2008 and 2013. Participants rated the importance of each EBDM competency and then rated how available that competency was to them when needed, each on a Likert scale. We calculated a gap score by subtracting availability scores from importance scores. We compared mean gaps across surveys and used independent samples t tests and Cohen’s d values to compare state level gaps. In addition, participants in the 2013 state health department survey selected and ranked three items that “would most encourage you to utilize EBDM in your work” and items that “would be most useful to you in applying EBDM in your work”. We calculated the percentage of participants who ranked each item among their top three.

Results

The largest competency gaps were consistent across all four surveys: economic evaluation, communicating research to policymakers, evaluation designs, and adapting interventions. Participants from the 2013 state level survey reported significantly larger mean importance and availability scores (p < 0.001, d = 1.00 and p < 0.001, d = 0.78, respectively) and smaller mean gaps (p < 0.01, d = 0.19) compared to the 2008 survey. Participants most often selected “leaders prioritizing EBDM” (67.9%) among top ways to encourage EBDM use. “EBDM training for specific areas” was most commonly ranked as important in applying EBDM (64.3%).

Conclusion

Perceived importance and availability of EBDM competencies may be increasing as supports for EBDM continue to grow through trends in funding, training, and resources. However, more capacity building is needed overall, with specific attention to the largest competency gaps. Possible next steps to sustain EBDM efforts include working with public health departments both to situate trainings that boost competency in these areas and to continue improving organizational practices (e.g., leadership prioritization of EBDM).


Background

Applying an evidence-based framework to public health practice is essential for effective program and policy planning, implementation, and evaluation, and it has the potential to improve population health outcomes [1],[2]. An evidence-based public health approach requires practitioners to utilize the best available research evidence, use data and information sources systematically, apply appropriate planning frameworks for programs or policies, engage community members in assessment and decision making processes, conduct sound evaluation of programs, and disseminate findings to key stakeholders [3]-[5]. Using this framework to guide policy or program decisions in public health is referred to as evidence-based decision making (EBDM) [2],[4].

To engage in EBDM, one needs a specified mix of knowledge, understanding, and skills, or “competencies” [6], for EBDM processes. Main competencies needed for EBDM include action planning, adapting interventions, communicating research to policymakers, economic evaluation, designing evaluations (including qualitative and quantitative approaches), and prioritizing program and policy options [4]. Several of these key competencies are reflected in the Council on Linkages (COL) core competencies as being important for public health workforce development and education [7]. For example, in Financial Planning and Management Skills (COL’s domain 7), understanding and using cost-benefit and cost-utility analysis for program decision making mirrors the EBDM competency to incorporate economic evaluation into public health program decision making [7],[8]. Similarly, EBDM competencies are consistent with the Public Health Accreditation Board (PHAB) standards for local and state public health agencies [9]. For example, public health agencies are charged to “identify and use the best available evidence for making informed public health practice decisions” (standard 10.1) and to “promote understanding and use of research results, evaluations, and evidence-based practices with appropriate audiences” (standard 10.2) [9]. While there have been studies examining the use of some of these competencies in general [10],[11], few studies have specifically examined the perceived competencies needed to engage in EBDM among the public health workforce.

The recognition of the importance of these competencies has led to the development of a variety of tools, resources, and trainings to support the application of EBDM. For example, publicly available online tools such as the Community Guide [12], Cancer P.L.A.N.E.T. [13], Research to Reality [14], and the Community Toolbox [15] provide toolkits and resources to enhance these competencies. In addition, national agencies provide technical assistance (such as that offered through the National Association of Chronic Disease Directors (NACDD) State Technical Assistance & Review program) and recommendations for local and state agencies to more effectively implement EBDM [16]. Also, in-person trainings are offered to public health professionals to build capacity to practice core EBDM competencies through a series of stepped modules [17]-[19]. For example, a module in economic evaluation prepares attendees to incorporate cost-benefit and cost-utility analysis into decision making [19]. These evidence-based resources and supports reflect the increasing awareness and perceived value of EBDM in academia, public health practice, and funding organizations. However, little is known regarding the impact these have had on the perceived competency to engage in EBDM.

Identifying training needs for EBDM competencies is crucial to understand where more competency-based efforts to build capacity could be targeted through training, use of analytic tools, and other resources. However, representative, national level data on EBDM competencies are lacking. To address this gap, we compare perceptions of EBDM competencies reported in four U.S. national surveys we conducted with the public health workforce at the local and state level.

Methods

We conducted four U.S. national online surveys of the state and local public health workforce. The main objective of the current study is to compare and contrast perceptions of EBDM competencies across these surveys. Participants were state and local public health department employees. Participants in all four surveys answered questions on ten competencies needed for EBDM. The four national sets of participants included in the current study are as follows. The first group comprises 441 U.S. state health department chronic disease prevention staff who completed a 2008 survey that assessed barriers to practicing EBDM at the state level. The second group comprises 904 U.S. state health department chronic disease prevention staff who completed a 2013 survey that examined perceptions of state level support to practice EBDM. The third group comprises 517 local health department directors who completed a 2012 survey, and the fourth group comprises 332 local health department program managers, identified by the third group, who completed a 2013 survey. The third and fourth groups received the same survey, which sought to examine organizational support and barriers for EBDM at the local level. We describe the survey methods in the paragraphs that follow.

State health department 2008 survey

In 2008, we administered an online survey to state level staff working in chronic disease prevention in all 50 states, the District of Columbia, and U.S. territories. The 74-item survey was derived from previous work [20],[21] and included questions about use of the Community Guide and other resources, the application of evidence-based interventions, personal chronic disease-related health behaviors of the participants, and other demographic information. The survey underwent cognitive response testing (n = 12) with experts in chronic disease prevention [22],[23], and their feedback was incorporated into the final survey. The survey took participants approximately 15 minutes to complete.

All NACDD members were contacted via an email explaining the survey’s purpose. A week later, participants received another email with a link to the online survey. Research staff completed follow-up via telephone and email until the survey closed in August 2008. As an incentive, participants were entered into a drawing to win one of numerous $20 gift cards or 25,000 airline miles. Of the 469 total participants (65% response rate), we included in this study the 441 state health department staff working in the 50 U.S. states and the District of Columbia. The Saint Louis University Institutional Review Board reviewed and granted human subjects approval. Further details on the sampling methods and data collection are reported elsewhere [24].

State health department 2013 survey

We administered an online survey in spring of 2013 to staff working in chronic disease prevention at the state level in all U.S. states and U.S. territories. The 68-item survey tool was based on our previous work in the 2008 survey mentioned above [5],[24]-[26] and included participant characteristics, use of evidence-based interventions, presence of organizational supports for EBDM, and access to evidence resources. The survey was revised based on cognitive response testing interviews with chronic disease prevention experts who previously worked in state health departments (n = 11) and test-retest reliability testing with 75 current state health department employees in chronic disease prevention. The survey took participants approximately 15 minutes to complete.

Research staff identified a list of program managers and staff working in chronic disease prevention through state health department websites, NACDD contact lists, and contact lists from partnering organizations. Contacts from the list were sent email invitations with a survey link in Qualtrics [27]. Participants were offered an optional $20 gift card to Amazon.com upon completion. Of the 923 total participants (76% response rate), we included in this study the 904 state health department staff working in the 50 U.S. states and the District of Columbia. The Institutional Review Board of Washington University in St. Louis reviewed and granted human subjects approval. Further details on the sampling methods and data collection are reported elsewhere [28].

Local public health 2012-2013 surveys

Between December of 2012 and February of 2013, we conducted two surveys with local health departments: the first with local health department directors and the second with program managers. For both groups, we used a 66-item survey tool that was similar to the tool used in the 2013 state level survey and was based on previous work [8],[17],[21],[24],[29]. In addition, the tool included items based on a public health systems logic model and related frameworks [30]-[33] and assessed views on barriers to and management practices for EBDM, among other items.

We selected a random sample (stratified by jurisdictional population size) of 1,067 local health departments from the National Association of County and City Health Officials database. Project staff sent survey links in Qualtrics [27] to the director of each randomly selected health department. There were 517 (54%) directors who completed the survey and are included in this study. The survey asked the participating directors to identify program managers within their local health department in chronic disease, environmental health, and infectious disease. We then sent the same online survey to each identified program manager directly, of which 332 completed the survey (67% response rate) and are included in this study. Participants were offered a $20 gift card to Amazon.com upon completion. The Institutional Review Board of Washington University in St. Louis reviewed and granted human subjects approval. Further details on the sampling methods and data collection are reported elsewhere [34]-[36].

EBDM competency and supports measures

Each of the surveys described above asked participants to rate the perceived importance and availability of ten specific EBDM competency items. Participants first rated the “importance of each of the skills to you” and then rated “how available each skill is to you when you need it either in your own skill set or among others’ in your agency”. Both were rated on 11-point Likert scales (0 = not important to 10 = very important; 0 = not available to 10 = very available). A total of eight EBDM competency items were identical or highly similar (same basic concept with only minor wording differences) across the surveys. These eight EBDM competency items are included in this study: action planning, adapting interventions, communicating research to policymakers, economic evaluation, evaluation designs, prioritization of program and policy options, qualitative evaluation, and quantitative evaluation.

In the 2013 state health department survey, participants also ranked three incentives that “would most encourage you to utilize EBDM in your work” and three training modalities that “would be most useful to you in applying EBDM in your work” from provided response option lists. Participants could also type in and rank “other” responses.

Analysis

To summarize participant characteristics, we conducted descriptive analyses of participants in each of the four survey groups. Following previously utilized methods [29], we created a competency gap score by subtracting the Likert scale rated availability of each EBDM competency from its rated importance. For example, a score of 9 for importance minus a score of 6 for availability equaled a gap score of 3. We calculated 95% confidence intervals for the mean scores of rated importance, availability, and corresponding gaps. To compare state level participants’ competency gaps, we conducted independent samples t tests. Large sample sizes are prone to produce statistically significant results for small differences that may or may not have practical significance [37]. To address this, we calculated Cohen’s d values for mean differences observed in t tests. Cohen’s d is calculated as the mean difference between samples divided by the pooled standard deviation of both samples [38]. Cohen’s d values give additional detail regarding the magnitude and direction of observed differences. Cohen suggests the following effect size ranges for d: 0.2 small, 0.5 medium, and 0.8 large [38],[39]. We aggregated mean gaps by survey and created bar graphs to assess broader trends across all surveys. For each EBDM support rank item, we calculated the percentage of participants who ranked that item among their top three items. We restricted analysis to participants practicing in the 50 U.S. states and the District of Columbia, which excluded 28 participants from the 2008 state level survey and 19 from the 2013 state level survey.
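To make these calculations concrete, the following is a minimal sketch in Python using hypothetical ratings rather than the study data (this is not the authors’ analysis code); it computes gap scores, an independent samples t test, and Cohen’s d as defined above.

    import numpy as np
    from scipy import stats

    def gap_scores(importance, availability):
        """Gap score = rated importance minus rated availability (both 0-10 Likert ratings)."""
        return np.asarray(importance, dtype=float) - np.asarray(availability, dtype=float)

    def cohens_d(x, y):
        """Cohen's d: difference in sample means divided by the pooled standard deviation."""
        x, y = np.asarray(x, dtype=float), np.asarray(y, dtype=float)
        nx, ny = len(x), len(y)
        pooled_sd = np.sqrt(((nx - 1) * x.var(ddof=1) + (ny - 1) * y.var(ddof=1)) / (nx + ny - 2))
        return (x.mean() - y.mean()) / pooled_sd

    # Hypothetical ratings for one competency (illustration only, not the survey data)
    gaps_2008 = gap_scores([9, 8, 10, 9], [6, 5, 8, 7])   # e.g., 9 - 6 = 3
    gaps_2013 = gap_scores([10, 9, 9, 10], [8, 7, 8, 7])

    t_stat, p_value = stats.ttest_ind(gaps_2013, gaps_2008)  # independent samples t test
    d = cohens_d(gaps_2013, gaps_2008)  # magnitude read against the 0.2/0.5/0.8 benchmarks

In this sketch a negative d would indicate smaller mean gaps in 2013 than in 2008; the 0.2, 0.5, and 0.8 benchmarks follow Cohen [38].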

Results

Participant characteristics

Table 1 shows participant characteristics across the four respondent groups. There were higher percentages of females in the two state level surveys (approximately 80%) than in the local public health workforce surveys (60-65%). On average, participants in the local public health workforce surveys reported more years in their current position (7.7 years for local program managers and 8.5 years for local directors) than those from the state level surveys (4.6 years for 2008 and 4.9 years for 2013 state level participants). Participants from the local directors survey had the highest mean years involved in public health (20.4 years). Local public health program managers had the lowest proportion of the workforce reporting a Master of Public Health degree (18.1%).

Table 1 Participant characteristics from four national surveys of state and local public health department professionals

Gap analyses

Competencies with the largest training gaps were consistent across all four survey groups: economic evaluation, communicating research to policymakers, evaluation designs, and adapting interventions (Table 2). Table 3 shows mean importance, availability, and gap scores with corresponding 95% confidence intervals, p values from independent samples t tests, and Cohen’s d values for the 2008 and 2013 state level workforce surveys. Importance and availability scores were significantly greater for state level staff in 2013 than in 2008 (p < 0.001, d = 1.00 and p < 0.001, d = 0.78, respectively). Participants from the 2013 state level survey reported significantly smaller mean gaps compared to those in 2008 (p < 0.01, d = 0.19). In addition, 2013 state level staff had significantly lower gaps in communicating research to policymakers (p < 0.001, d = 0.25), quantitative evaluation (p < 0.001, d = 0.23), and action planning to achieve goals and objectives (p < 0.001, d = 0.11) compared to state level staff in 2008. The overall mean importance score (8.8) and availability score (6.1) for the 2008 state level staff were the lowest among all four groups. The other three groups (2013 state level staff and both sets of local level participants) had similar mean importance and availability scores (mean importance range: 9.5 to 9.8; mean availability range: 7.2 to 7.4) (data not shown). Figure 1 shows that overall reported gaps for EBDM competencies were smaller among local public health staff than among state level staff.

Table 2 Largest competency gaps in evidence-based decision making reported in four national public health workforce surveys
Table 3 Five-year comparison of state health department staff self-rated a EBDM competency importance, availability, and calculated gaps
Figure 1

Evidence-based decision making competency gaps in four national public health workforce surveys. Staff in state and local public health departments were asked to rate importance and availability of competencies in evidence-based decision making (EBDM). Gaps in EBDM competencies were calculated by subtracting rated availability from importance and then aggregated for each survey.

EBDM supports

Figure 2 shows items ranked by 2013 state health department staff as most useful for the application of EBDM. EBDM training for specific areas (64.3%), summaries of research evidence (48.6%), and help with EBDM processes (40.4%) were the three most commonly ranked items. A small proportion (1.9%) ranked an “other” item, using the supplied text box to describe an additional item not listed that would be useful for applying EBDM. Examples of “other” text are “guidance from funder” and “adequate staffing for project planning, implementation and evaluation.”

Figure 2

Elements perceived by 2013 state health department staff as most useful in applying EBDM (n = 904). In 2013, state health department staff in all U.S. states were asked to rank the top three items that would be most useful for applying evidence-based decision making in their work. The percentage of participants who ranked each item among their top three was calculated.

Items to encourage use of EBDM that were most often ranked in the top three by the 2013 state level staff were agency leaders prioritizing EBDM (67.9%), easy access to EBDM data resources (63.0%), and direct supervisors prioritizing EBDM (46.8%) (Figure 3). Four percent of participants typed in and ranked “other” items that would encourage EBDM use. Examples of “other” ranked items were “time,” “resources,” and “statutory requirements”.

Figure 3

Items ranked most important to encourage EBDM use by 2013 state health department staff (n = 904). In 2013, a national survey was conducted with state health department staff working in chronic disease. Participants were asked to rank the top three items that would be most important to encourage the use of evidence-based decision making. The percentage of participants who ranked each item among their top three was calculated.

Discussion

The public health workforce faces new and evolving challenges in preventing and controlling disease to promote population health. This evolution requires an updated workforce and creates the need for specific training and skill sets, not only for the topic areas in which people work but also for the processes of applying the best, and integrating the newest, evidence. Overall, our findings suggest that competency to practice EBDM may be growing at the state public health department level and that gaps in some skill areas (communicating research to policymakers, quantitative evaluation, and action planning) may be narrowing. However, the largest gaps remain in how to use economic evaluation data, adapt interventions, and design evaluations, and in communicating research to policymakers (despite improved scores in this area). Exact interpretation of the findings is difficult, as the possible reasons for narrowing gaps are likely many, complex, and compounded. Below, we discuss factors contributing to possible improvements, followed by training needs indicated by the largest reported gaps.

Growing supports for EBDM

The continued and growing trend of requiring or encouraging EBDM processes in various public health funding streams may be one contributing factor to narrowing gaps and growing perceived availability of EBDM competencies. Many CDC funding announcements require grantees to use evidence-based processes through the implementation of evidence-based interventions and, in some cases, provide a menu of interventions to select from [40]-[43]. For example, the CDC Colorectal Cancer Screening Program requires grantees to implement evidence-based interventions to promote and increase colorectal cancer screening among underserved population groups, highlighting the need for application of evidence-based processes within health agencies [44]. Similarly, seeking PHAB accreditation may help to align activities of health departments with the requirements of funders. EBDM processes, along with the standardized certification measures required for accreditation, may increase effective use of allocated funds and possibly increase the likelihood of receiving further funding [45].

Along with funding trends, the set of online tools and toolkits (e.g., the Community Guide [12], Cancer P.L.A.N.E.T. [13], Research to Reality [14]) to aid and encourage EBDM processes has expanded in the last five years, though these tools may not cover all competencies needed for EBDM. In addition, evidence-based public health courses that specifically target EBDM competencies are now offered by NACDD to two states each year, as well as through an annual national course open to applicants working in chronic disease prevention in all U.S. states [46]. Similarly, other projects and schools of public health conduct comparable training with state and local public health agencies [17],[19],[47]. The Physical Activity and Public Health trainings held each September may also drive EBDM processes for applying evidence-based policies and programs specifically in the area of physical activity and obesity [48].

Related to increased training opportunities, another possible EBDM driving force may be the growing number of public health degree programs and/or accreditation standards [49]. The Association of Schools and Programs of Public Health currently lists over 80 certified member schools and programs of public health, representing 39 U.S. states [50]. This, coupled with more evidence-based trainings, may increase opportunities for collaborations between universities and public health agencies to exchange both science and practice evidence and offer increasing chances for exposure to EBDM processes. Academic partnerships to exchange practice and science-based evidence and ensure EBDM competencies may, however, be limited to agencies with geographic access to such universities. Consideration should be given to creating alternative models for long-distance partnering, such as video-conferencing and other communication technologies. The value of these collaborations has been recognized by PHAB’s accreditation standards, which require health departments to maintain academic partnerships as a means to “contribute to and apply the evidence base of public health” [9]. Even with growing numbers of undergraduate and graduate programs, the majority of the public health workforce lacks formal public health training [51],[52]. Our findings show that between 18% (2013 local level program managers) and 30% (2013 state level staff) of the workforce reported Master of Public Health degrees.

Training needs for top competency gaps

Despite funders’ requirements and encouragement of evidence-based approaches, public health department employees continue to report competency gaps in several key areas (economic evaluation, communicating research to policymakers, adapting interventions, and evaluation designs). While not all of the identified gaps in EBDM competencies can be addressed through training programs, many can. Thus, enhancing training capacity for these consistently reported gaps is necessary.

We found that competency to adapt interventions was among the largest gaps reported for all respondent groups and the gap did not narrow between 2008 and 2013 for the state level staff. While supplied intervention menus may be evidence-based (e.g. CDC’s Colorectal Cancer Screening Program), many health departments may be uncertain how to adapt them to ensure they have the expected impact given their population and context. This coincides with recent evaluation efforts of the National Comprehensive Cancer Control Program which found 78% of states reported the need for technical assistance in adapting evidence-based cancer interventions [53] and suggests more training is needed to build this EBDM competency.

Similarly, we found that economic evaluation had the largest reported gap between importance and availability among all four respondent groups. This parallels Wilcox and colleagues’ [54] recent study, which found that 66% of their sample of state, territorial, and local government public health professionals in chronic disease reported needing training in health economics. Understanding how to compare programs and policies based on economic costs and benefits is integral to making the best decisions in public health. However, those formally trained in health economics often have sparse knowledge of the unique demands of public health and are often lured outside the non-profit/governmental sphere or focus on medical care economics [55],[56], making training specifically focused in this area challenging. Efforts to increase the number of health economists with public health programming knowledge may increase opportunities for agency-economist collaborations [57]. One such effort is CDC’s Prevention Effectiveness Fellowship program, which trains post-doctoral candidates to assess policies and programs based on public health impact [58]. Similarly, increasing partnership opportunities between public health departments and persons with public health policy knowledge may help to improve competency in communicating with policymakers, which is imperative for effective feedback and framing of evidence-based policy [59].

In addition to increasing capacity for competency specific training, more attention to the tailoring of EBDM training and resources may be important as some competencies may simply be more difficult to master. Research in developing evidence-based competencies proposes three tiers (beginner, intermediate, and advanced) of difficulty level for EBDM competencies [8]. Of the top four consistent gaps found in this study, three (economic evaluation, communicating research to policymakers, and evaluation designs) are considered advanced and one intermediate (adapting interventions). While the national surveys in the current study largely featured those in a generalist or management position, large portions of the workforce practice at entry levels [6]. Improved training in EBDM and resource tailoring to diverse practice levels and backgrounds of the workforce, while adhering to adult learning frameworks, may help to further narrow EBDM competency gaps [6],[30],[60].

Organizational opportunities to build capacity

While growing supports for EBDM competencies are suggested in the current study, it should be noted that EBDM as a process faces more than just the challenge of competency achievement or training. Organizational factors such as high staff turnover and the ability to sustain and make time for ongoing training for newly hired staff are commonly reported challenges to EBDM practice among public health agencies [61],[62]. Evidence-based public health courses often last several days, requiring employees to reorganize busy schedules to accommodate training [17],[61]. In addition, 2013 state level staff highly ranked “the prioritization of EBDM by superior staff” as a means of encouragement to use EBDM. This finding points to the importance of administrative practices that are likely to facilitate EBDM. For example, actively seeking and incorporating employee input into decision making processes and supervisory decisions to prioritize EBDM with provision of time to learn and engage in EBDM processes are administrative practices identified as means to create work environments conducive for EBDM [25],[63].

Limitations

Several limitations are worth noting. The current study provides only self-reported data from cross-sectional surveys. In addition, sampling methods and response rates varied across the surveys, which may suggest response bias. For example, program managers identified by local health department directors for the 2013 local level survey may have been identified based on general responsiveness to emailed surveys, knowledge, or other unknown and varying factors. In addition, the two state level surveys included those working mainly in chronic disease prevention, whereas the two local level surveys represented broader areas of public health practice (e.g. directors with overarching responsibilities, environmental health, and communicable disease prevention and control). This may limit our findings, as we were not able to compare our local level data to state level data representing broader areas of practice outside of chronic disease prevention. Next steps for research should explore variations in capacity for EBDM across program areas (e.g., chronic disease compared with maternal and child health) and levels of public health practice (e.g., state compared with local). In addition, this study did not explicitly examine health department characteristics, such as size and population jurisdiction, which have been shown to be associated with varying levels of department performance [35],[64]. Further exploration of these characteristics is warranted given they are likely to influence capacity for EBDM and levels of competency. However, results for the largest reported EBDM competency gaps were consistent across all surveys. In the state surveys, it is difficult to ascertain the reasons for differences found in competency gaps over time. It is possible that training and funding expectations have increased awareness of the importance of EBDM but not necessarily the skills and competencies needed, in which case the reduction in the gap between importance and availability may partly reflect socially desirable responses. Despite these limitations, this study offers an important national snapshot of EBDM competencies and their similarity across the state and local levels, with corresponding implications for capacity development.

Conclusions

Gaps in EBDM competencies may be narrowing as awareness and prioritization of skill development continue to grow through trends in funding, training, and resources for EBDM. However, across multiple sectors of the public health workforce, more capacity building is needed overall, with specific attention to the largest gaps in economic evaluation, communicating research to policymakers, adapting interventions to different settings and populations, and evaluation designs. Possible next steps to sustain EBDM efforts include working with public health departments both to situate trainings that boost competency in these areas and to continue improving organizational practices.

Authors’ contributions

Conceptualization and design: RCB, EAB, EAD, KD, PA. Survey instrument development: RCB, EAD, KD, PA, SS, EAB. Data collection: EAD, KD, RF, SS, PA. Data management: RF, SS, RRJ. Data analyses: RRJ, RF. Manuscript revisions: All. All authors read and approved the final manuscript.

References

1. Brownson RC: Evidence-Based Public Health. 2nd edition. 2011, Oxford University Press, Oxford; New York.
2. Brownson RC, Fielding JE, Maylahn CM: Evidence-based decision making to improve public health practice. Front Public Health Serv Syst Res. 2013, 2 (2): article 2.
3. Brownson RC, Fielding JE, Maylahn CM: Evidence-based public health: a fundamental concept for public health practice. Annu Rev Public Health. 2009, 30: 175-201. 10.1146/annurev.publhealth.031308.100134.
4. Brownson RC, Gurney JG, Land GH: Evidence-based decision making in public health. J Public Health Manag Pract. 1999, 5: 86-97. 10.1097/00124784-199909000-00012.
5. Kohatsu ND, Robinson JG, Torner JC: Evidence-based public health: an evolving concept. Am J Prev Med. 2004, 27: 417-421.
6. Koo D, Miner K: Outcome-based workforce development and education in public health. Annu Rev Public Health. 2010, 31: 253-269. 10.1146/annurev.publhealth.012809.103705.
7. Stewart KE, Halverson PK, Rose AV, Walker SK: Public health workforce training: application of the council on linkages’ core competencies. J Public Health Manag Pract. 2010, 16: 465-469. 10.1097/PHH.0b013e3181ce4f0b.
8. Brownson RC, Ballew P, Kittur ND, Elliott MB, Haire-Joshu D, Krebill H, Kreuter MW: Developing competencies for training practitioners in evidence-based cancer control. J Cancer Educ. 2009, 24: 186-193. 10.1080/08858190902876395.
9. Public Health Accreditation Board: PHAB Standards and Measures Version 1.0. Alexandria, VA; 2011. http://www.phaboard.org/wp-content/uploads/PHAB-Standards-and-Measures-Version-1.0.pdf
10. National Association of County and City Health Officials: 2010 National Profile of Local Health Departments. Washington (DC); 2011. http://naccho.org/topics/infrastructure/profile/resources/2010report/upload/2010_Profile_main_report-web.pdf
11. Association of State and Territorial Health Officials: ASTHO Profile of State Public Health, Volume Two. Arlington, VA; 2011. http://www.astho.org/Profile/Volume-Two/
12. The Guide to Community Preventive Services. http://www.thecommunityguide.org/index.html
13. Cancer Control P.L.A.N.E.T. (Plan, Link, Act, Network with Evidence-based Tools). http://cancercontrolplanet.cancer.gov/
14. Research to Reality. https://researchtoreality.cancer.gov/
15. The Community Toolbox. http://ctb.ku.edu/en
16. State Technical Assistance and Review (STAR) Program. http://www.chronicdisease.org/?page=STARFAQ
17. Dreisinger M, Leet TL, Baker EA, Gillespie KN, Haas B, Brownson RC: Improving the public health workforce: evaluation of a training course to enhance evidence-based decision making. J Public Health Manag Pract. 2008, 14: 138-143. 10.1097/01.PHH.0000311891.73078.50.
18. Gibbert WS, Keating SM, Jacobs JA, Dodson E, Baker E, Diem G, Giles W, Gillespie KN, Grabauskas V, Shatchkute A, Brownson RC: Training the workforce in evidence-based public health: an evaluation of impact among US and international practitioners. Prev Chronic Dis. 2013, 10: E148. 10.5888/pcd10.130120.
19. O’Neall MA, Brownson RC: Teaching evidence-based public health to public health practitioners. Ann Epidemiol. 2005, 15: 540-544. 10.1016/j.annepidem.2004.09.001.
20. Brownson RC, Ballew P, Dieffenderfer B, Haire-Joshu D, Heath GW, Kreuter MW, Myers BA: Evidence-based interventions to promote physical activity: what contributes to dissemination by state health departments. Am J Prev Med. 2007, 33: S66-S73. 10.1016/j.amepre.2007.03.011.
21. Brownson RC, Ballew P, Brown KL, Elliott MB, Haire-Joshu D, Heath GW, Kreuter MW: The effect of disseminating evidence-based interventions that promote physical activity to health departments. Am J Public Health. 2007, 97: 1900-1907. 10.2105/AJPH.2006.090399.
22. Jobe JB, Mingay DJ: Cognitive laboratory approach to designing questionnaires for surveys of the elderly. Public Health Rep. 1990, 105: 518-524.
23. Jobe JB, Mingay DJ: Cognitive research improves questionnaires. Am J Public Health. 1989, 79: 1053-1055. 10.2105/AJPH.79.8.1053.
24. Jacobs JA, Dodson EA, Baker EA, Deshpande AD, Brownson RC: Barriers to evidence-based decision making in public health: a national survey of chronic disease practitioners. Public Health Rep. 2010, 125: 736-742.
25. Brownson RC, Allen P, Duggan K, Stamatakis KA, Erwin PC: Fostering more-effective public health by identifying administrative evidence-based practices: a review of the literature. Am J Prev Med. 2012, 43: 309-319. 10.1016/j.amepre.2012.06.006.
26. Stamatakis KA, McQueen A, Filler C, Boland E, Dreisinger M, Brownson RC, Luke DA: Measurement properties of a novel survey to assess stages of organizational readiness for evidence-based interventions in community chronic disease prevention settings. Implement Sci. 2012, 7: 65. 10.1186/1748-5908-7-65.
27. Qualtrics: Survey Research Suite. http://www.qualtrics.com/
28. Allen P, Sequeira S, Jacob RR, Hino AA, Stamatakis KA, Harris JK, Elliott L, Kerner JF, Jones E, Dobbins M, Baker EA, Brownson RC: Promoting state health department evidence-based cancer and chronic disease prevention: a multi-phase dissemination study with a cluster randomized trial component. Implement Sci. 2013, 8: 141. 10.1186/1748-5908-8-141.
29. Jacobs JA, Clayton PF, Dove C, Funchess T, Jones E, Perveen G, Skidmore B, Sutton V, Worthington S, Baker EA, Deshpande AD, Brownson RC: A survey tool for measuring evidence-based decision making capacity in public health agencies. BMC Health Serv Res. 2012, 12: 57. 10.1186/1472-6963-12-57.
30. Mays GP, Smith SA, Ingram RC, Racster LJ, Lamberth CD, Lovely ES: Public health delivery systems: evidence, uncertainty, and emerging research needs. Am J Prev Med. 2009, 36: 256-265. 10.1016/j.amepre.2008.11.008.
31. Hajat A, Cilenti D, Harrison LM, MacDonald PD, Pavletic D, Mays GP, Baker EL: What predicts local public health agency performance improvement? A pilot study in North Carolina. J Public Health Manag Pract. 2009, 15: E22-E33. 10.1097/01.PHH.0000346022.14426.84.
32. Handler A, Issel M, Turnock B: A conceptual framework to measure performance of the public health system. Am J Public Health. 2001, 91: 1235-1239. 10.2105/AJPH.91.8.1235.
33. Tilburt JC: Evidence-based medicine beyond the bedside: keeping an eye on context. J Eval Clin Pract. 2008, 14: 721-725. 10.1111/j.1365-2753.2008.00948.x.
34. Reis RS, Duggan K, Allen P, Stamatakis KA, Erwin PC, Brownson RC: Developing a tool to assess administrative evidence-based practices in local health departments. Front Public Health Serv Syst Res. 2014, 3: 2.
35. Brownson RC, Reis RS, Allen P, Duggan K, Fields R, Stamatakis KA, Erwin PC: Understanding administrative evidence-based practices: findings from a survey of local health department leaders. Am J Prev Med. 2014, 46: 49-57. 10.1016/j.amepre.2013.08.013.
36. Erwin PC, Harris JK, Smith C, Leep CJ, Duggan K, Brownson RC: Evidence-based public health practice among program managers in local public health departments. J Public Health Manag Pract. 2014, 20: 472-480. 10.1097/PHH.0000000000000027.
37. Lin M, Lucas HC, Shmueli G: Research commentary - too big to fail: large samples and the p-value problem. Inf Syst Res. 2013, 24: 906-917. 10.1287/isre.2013.0480.
38. Cohen J: A power primer. Psychol Bull. 1992, 112: 155. 10.1037/0033-2909.112.1.155.
39. Field A: Discovering Statistics Using IBM SPSS Statistics. 4th edition. 2013, Sage Publications Ltd, London.
40. Cancer Prevention and Control Programs for State, Territorial and Tribal Organizations: CDC-RFA-DP12-1205. 2012, Centers for Disease Control and Prevention, Atlanta, GA.
41. State Public Health Actions to Prevent and Control Diabetes, Heart Disease, Obesity and Associated Risk Factors and Promote School Health: CDC-RFA-DP13-1305. 2013, Centers for Disease Control and Prevention, Atlanta, GA.
42. Public Prevention Health Fund: Community Transformation Grant: CDC-RFA-DP11-1103PPHF11. 2011, Centers for Disease Control and Prevention, Atlanta, GA.
43. Prevention and Public Health Fund Coordinated Chronic Disease Prevention and Health Promotion Program: CDC-RFA-DP09-9010301PPHF11. 2011, Centers for Disease Control and Prevention, Atlanta, GA.
44. Hannon PA, Maxwell AE, Escoffery C, Vu T, Kohn M, Leeman J, Carvalho ML, Pfeiffer DJ, Dwyer A, Fernandez ME, Vernon SW, Liang L, DeGroff A: Colorectal cancer control program grantees’ use of evidence-based interventions. Am J Prev Med. 2013, 45: 644-648. 10.1016/j.amepre.2013.06.010.
45. Riley WJ, Bender K, Lownik E: Public health department accreditation implementation: transforming public health department performance. Am J Public Health. 2012, 102: 237-242. 10.2105/AJPH.2011.300375.
46. National Association of Chronic Disease Directors: NACDD opportunities improve capacity in evidence-based decision making. http://www.chronicdisease.org/?EvidenceBased
47. Maylahn C, Bohn C, Hammer M, Waltz EC: Strengthening epidemiologic competencies among local health professionals in New York: teaching evidence-based public health. Public Health Rep. 2008, 123 (Suppl 1): 35-43.
48. Physical Activity and Public Health Course. http://www.sph.sc.edu/paph/about.htm
49. Evaschwick CJ: Educating the public health workforce. Front Public Health. 2013, 1: 1-3.
50. The Association of Schools and Programs of Public Health (ASPPH). www.aspph.org
51. Tilson H, Gebbie KM: The public health workforce. Annu Rev Public Health. 2004, 25: 341-356. 10.1146/annurev.publhealth.25.102802.124357.
52. Gebbie K, Rosenstock L, Hernandez LM: Who Will Keep the Public Healthy? Educating Public Health Professionals for the 21st Century. 2003, The National Academies Press, Washington (DC).
53. Steele CB, Rose JM, Chovnick G, Townsend JS, Stockmyer CK, Fonseka J, Richardson LC: Use of evidence-based practices and resources among comprehensive cancer control programs. J Public Health Manag Pract. 2014, Epub ahead of print.
54. Wilcox LS, Majestic EA, Ayele M, Strasser S, Weaver SR: National survey of training needs reported by public health professionals in chronic disease programs in state, territorial, and local governments. J Public Health Manag Pract. 2013, Epub ahead of print.
55. Halpin HA, Hankins SW, Scutchfield FD: Broadening the role of the health economist to include public health research: a commentary. Am J Prev Med. 2009, 36: 276-277. 10.1016/j.amepre.2008.12.003.
56. Cawley J, Morrisey MA: The earnings of U.S. health economists. J Health Econ. 2007, 26: 358-372. 10.1016/j.jhealeco.2006.10.008.
57. Ammerman AS, Farrelly MA, Cavallo DN, Ickes SB, Hoerger TJ: Health economics in public health. Am J Prev Med. 2009, 36: 273-275. 10.1016/j.amepre.2008.11.005.
58. Centers for Disease Control and Prevention: CDC Steven M. Teutsch Prevention Effectiveness Fellowship. http://www.cdc.gov/pef/
59. Brownson RC, Chriqui JF, Stamatakis KA: Understanding evidence-based public health policy. Am J Public Health. 2009, 99: 1576-1583. 10.2105/AJPH.2008.156224.
60. Bryan RL, Kreuter MW, Brownson RC: Integrating adult learning principles into training for public health practice. Health Promot Pract. 2009, 10: 557-563. 10.1177/1524839907308117.
61. Baker EA, Brownson RC, Dreisinger M, McIntosh LD, Karamehic-Muratovic A: Examining the role of training in evidence-based public health: a qualitative study. Health Promot Pract. 2009, 10: 342-348. 10.1177/1524839909336649.
62. Dodson EA, Baker EA, Brownson RC: Use of evidence-based interventions in state health departments: a qualitative assessment of barriers and solutions. J Public Health Manag Pract. 2010, 16 (6): E9-E15. 10.1097/PHH.0b013e3181d1f1e2.
63. Allen P, Brownson RC, Duggan K, Stamatakis KA, Erwin PC: The makings of an evidence-based local health department: identifying administrative and management practices. Front Public Health Serv Syst Res. 2012, 1: 2.
64. Erwin PC: The performance of local health departments: a review of the literature. J Public Health Manag Pract. 2008, 14: E9-E18. 10.1097/01.PHH.0000311903.34067.89.


Acknowledgements

The current study and the 2013 survey of state health departments were funded by the National Cancer Institute of the National Institutes of Health under Award Number R01CA160327 and are part of a larger study whose second phase is a group randomized controlled trial registered with clinicaltrials.gov (NCT01978054). The 2008 survey of state health department staff was funded through CDC grant 5R18DP001139-02 (Improving Public Health Practice through Translation Research) and contract U48/DP000060 (Prevention Research Centers Program). The 2012-2013 surveys of local public health department staff were funded by the Robert Wood Johnson Foundation grant 69964. The Evidence-Based Public Health training program was supported in part by the National Association of Chronic Disease Directors contract number 482012. This article is a product of a Prevention Research Center and was also supported by Cooperative Agreement Number U48/DP001903 from the Centers for Disease Control and Prevention. The findings and conclusions in this article are those of the authors and do not necessarily represent the official positions of the National Institutes of Health or the Centers for Disease Control and Prevention.

We thank Carolyn Leep at the National Association of County and City Health Officials for her help with the local public health survey development and sampling. We appreciate the collaboration with Dr. Ellen Jones, Consultant; John Patton, Director of Communications and Marketing; and John W. Robitscher, MPH, Chief Executive Officer, at the National Association of Chronic Disease Directors on the state level surveys and sampling. We also thank Dr. Jenine K. Harris, Assistant Professor, Brown School, Washington University in St. Louis, and Dr. Katherine A. Stamatakis, Associate Professor, Department of Epidemiology and Prevention Research Center in St. Louis, College for Public Health and Social Justice, Saint Louis University, for their assistance with survey development. We are grateful to Carson Smith, MPH, and Lindsay Elliott, MPH/MSW 2015 candidate, Graduate Research Scholars at the Prevention Research Center in St. Louis, for data collection. In addition, we thank Julie A. Jacobs, MPH, Research Consultant, Prevention Research Center in St. Louis, for her early work with the 2008 state level data. We thank Visiting Scholars Dr. Rodrigo Reis and Dr. Adriano Akira Hino, from the Department of Physical Education, Federal University of Parana, Curitiba, Parana, Brazil, and the School of Health and Biosciences, Pontifícia Universidade Católica do Paraná, Curitiba, Parana, Brazil, for reliability test-retest analyses. We also appreciate the administrative support of Linda Dix, Mary Adams, and Pamela Roa Hipp at the Prevention Research Center in St. Louis, Brown School, Washington University in St. Louis.

Author information


Corresponding author

Correspondence to Rebekah R Jacob.

Additional information

Competing interests

The authors declare that they have no competing interests.


Rights and permissions

This article is published under license to BioMed Central Ltd. This is an Open Access article distributed under the terms of the Creative Commons Attribution License (http://creativecommons.org/licenses/by/4.0), which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly credited. The Creative Commons Public Domain Dedication waiver (http://creativecommons.org/publicdomain/zero/1.0/) applies to the data made available in this article, unless otherwise stated.


About this article


Cite this article

Jacob, R.R., Baker, E.A., Allen, P. et al. Training needs and supports for evidence-based decision making among the public health workforce in the United States. BMC Health Serv Res 14, 564 (2014). https://doi.org/10.1186/s12913-014-0564-7
