Abstract
Background: There is increased awareness that, to minimize variation in clinician practice and improve quality, performance reporting should be implemented at the provider level. This optimizes physician engagement and creates a sense of professional responsibility for quality and performance measurement at the individual and organizational levels.
Methods: Individual provider level reporting was implemented within a provincial health region involving 56 clinicians (general surgeons, surgical oncologists, urologists and pathologists). The 2 surgical pathology indicators chosen were colorectal cancer (CRC) lymph node retrieval rate and pT2 prostate cancer margin positivity rate. Surgical resections for all prostate and colorectal cancer performed between Jan. 1, 2011, and Mar. 30, 2012, were included. We used a pre- and postsurvey design to obtain physician perceptions and focus groups with program leadership to determine organizational impact.
Results: Survey results showed that respondents felt the data provided in the reports were valid (67%), consistent with expectations (70%), maintained confidentiality (80%) and were not used in a punitive manner (77%). During the study period the pT2 prostate margin positivity rate decreased from 57.1% to 27.5%. For the CRC lymph node retrieval rate indicator, high baseline performance was maintained.
Conclusion: We developed a robust process for providing physicians with confidential, individualized surgical and pathology quality indicator reports. Our results reinforce the importance of individual physician feedback as a strategy for improving and sustaining quality in surgical and diagnostic oncology.
Clinical practice guidelines provide an evidence-based foundation for the delivery of high-quality patient care, monitoring of patient outcomes and minimizing the degree of variation in clinical practices.1 As quality improvement processes are implemented, it is increasingly appreciated that in order to minimize variation in individual practice, performance reporting must be implemented at the provider level. This optimizes physician engagement and also creates a sense of professional responsibility for quality and performance measurement.2
Cancer Care Ontario (CCO), the provincial government’s cancer advisor, is responsible for the organization of cancer care services, creation of treatment guidelines and standards, and monitoring of cancer system performance. The Ontario Cancer Plan,3 authored by CCO, includes a commitment to improving patient outcomes through high-quality care. One way CCO accomplishes this is by giving detailed information to providers to inform evidence-based quality improvement. To support this, CCO has developed sophisticated and robust information management and information technology processes that collect and analyze data from across the provincial cancer system.
Currently, CCO quality monitoring focuses on system level indicators, providing aggregate quality data for groups of providers. The timely availability of quality data has been used at the regional and hospital level to facilitate dialogue regarding performance as compared with established standards and guidelines and has helped to guide quality improvement.4 Several systematic reviews were used to guide this study. Davis and colleagues5 revealed that physicians have limited ability to self-assess and that external means of assessment are useful in promoting optimal practice. Reviews conducted by Grimshaw and colleagues6,7 and O’Brien and colleagues8 demonstrated that the most effective strategies for changing clinician practice were those used in combination with educational outreach, the use of reminders, local opinion leaders and timely feedback. A systematic review on the effects of audit and feedback9 concluded that feedback is more effective when baseline performance is low, when reports are inclusive of specific targets and when it is provided on a consistent basis. Specific to performance information, Lemire and colleagues10 state that dissemination of performance data is not in itself sufficient to improve quality and that success depends on the cumulative impact of a multifaceted approach. The importance of providing physicians with timely information about their practices is consistent with the Institute for Healthcare Improvement framework for engaging physicians in quality and safety, which encourages the nonpunitive use of data to allow for reflection on practice and promotes both system and individual responsibility for quality.11
The purpose of our study was to determine whether providing contemporaneous quality data at the provider level is more likely than system level data to result in behavioural change and improved quality. The specific goals of this study were to develop a process for providing confidential individualized surgical and pathology quality indicator reports to physicians, to determine the level of physician acceptance and satisfaction with the reports and to determine the impact of individualized reports on physicians’ practices in terms of quality in cancer surgery and pathological assessment.
Methods
Sample
All 4 hospitals within a regional health authority in Ontario were included in the project, with the physician sample comprising all surgeons performing prostate or colorectal cancer (CRC) resections (n = 35) and all pathologists involved in the pathological interpretation of prostate or CRC surgical specimens (n = 19). Surgical resections for all prostate and CRC surgeries performed between Jan. 1, 2011, and Mar. 30, 2012, were included. As this evaluation was undertaken for the purposes of program review quality assurance, ethics review was not required, as stipulated under Article 2.5 of the Canadian Tri-Council Policy Statement on the Ethical Conduct for Research Involving Humans.12
Identification of indicators for reporting
The 2 surgical-pathology indicators chosen were lymph node retrieval rate in CRC and pT2 prostate cancer (organ confined disease) margin positivity rate (R1 margin status). Both indicators are supported by evidence-based guidelines developed and promoted by CCO, including surgical and pathology recommendations to enhance a standardized approach to surgical and pathology practice.13,14 They are also the 2 indicators with which surgeons and pathologists have the most familiarity based on previously released aggregate regional reports. Finally, as they are important data points needed to make decisions about adjuvant therapy, they are good, immediate surrogates for long-term quality, such as disease-free and overall survival.
The CRC lymph node retrieval rate quality indicator is defined as the proportion of CRC resection reports in which removal of at least 12 lymph nodes is documented in the synoptic pathology report, based on the provincial target of 90% of all colon cancer resections having 12 or more nodes examined and reported. This target is consistent with the Commission on Cancer quality measure and endorsed by the National Quality Forum.15
The quality indicator for prostate surgery is the percentage of organ confined (pT2) radical prostatectomy cases where the margin was reported as positive, with a provincial target established at less than 25%. This target was established by the content expert members of the Prostate Cancer Surgery and Pathology Expert Panel as a result of extensive review of the literature.13
The indicator results were collected by querying a centralized database of cancer pathology synoptic reports. This Pathology Synoptic Reporting (ePath) database contained within the Ontario Cancer Registry enables access to population-based synoptic pathology reports and generation of reports based on unique identifiers, which facilitates timely access to information depicting overall clinical practice and the ability to audit individual case and provider level information.16
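Conceptually, both indicators reduce to simple proportions computed over synoptic-report records. The following minimal Python sketch illustrates this calculation; the record structure and field names are hypothetical illustrations, not the actual ePath schema:

```python
# Hypothetical case records; field names are illustrative, not the ePath schema.
crc_cases = [
    {"nodes_examined": 15},
    {"nodes_examined": 9},
    {"nodes_examined": 22},
]
pt2_cases = [
    {"margin_positive": False},
    {"margin_positive": True},
    {"margin_positive": False},
    {"margin_positive": False},
]

def crc_node_retrieval_rate(cases, threshold=12):
    """Proportion of CRC resections with >= threshold lymph nodes examined
    (provincial target: 90% of resections meeting the 12-node threshold)."""
    return sum(c["nodes_examined"] >= threshold for c in cases) / len(cases)

def pt2_margin_positivity_rate(cases):
    """Proportion of organ-confined (pT2) prostatectomy specimens with a
    positive resection margin (provincial target: below 25%)."""
    return sum(c["margin_positive"] for c in cases) / len(cases)

print(f"CRC node retrieval rate: {crc_node_retrieval_rate(crc_cases):.0%}")
print(f"pT2 margin positivity: {pt2_margin_positivity_rate(pt2_cases):.0%}")
```

In practice these proportions were generated by querying ePath rather than computed locally; the sketch is only meant to make the indicator definitions concrete.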
Creation of infrastructure
Prior to the launch of the project, each hospital signed a memorandum of understanding that outlined the requirement for the establishment of a surgical pathology oncology quality of care committee (SPOQCC). Each SPOQCC was a subcommittee of the hospital’s quality of care committee or medical advisory committee, depending on the governance structure within the particular hospital. To enable open, honest dialogue and ensure anonymity of the reports, the SPOQCCs functioned under the umbrella of the Quality of Care Information Protection Act (QCIPA). Specific to Ontario, the intent of QCIPA is to support open dialogue regarding quality of care without concern that the information may be used for disciplinary action, legal action or other proceedings.17 While each hospital had autonomous discretion with respect to membership, the recommendation was to include the designated CCO regional lead for surgery; the CCO regional lead for pathology and laboratory medicine; and the individual hospitals’ chief of surgery, chief of pathology, chief of staff and representatives from the departments of pathology and surgery, specifically the divisions of general surgery and urology; and the oncology program directors.
Compilation of hospital report and dissemination to SPOQCC
Cancer Care Ontario provided the regional surgical and pathology leads with a report generated from ePath, specifically the pathology and operative reports for surgical specimens in which fewer than 12 lymph nodes were identified (in patients with CRC) or in which a histologically positive margin was identified (in the case of pT2 prostate cancer). In order to maintain the anonymity and confidentiality of the information, 1 individual (M.B.) who was authorized by the regional vice president of the Cancer Program and who signed a confidentiality agreement was tasked with creating an aggregate report with all physician and patient identifiers removed.
Case review and validation
Suboptimal cases (e.g., those requiring further review) were defined as any pT2 radical prostatectomy specimens having a positive resection margin or colorectal specimens having fewer than 12 nodes within the examined specimen. The admitting history, operative note and pathology report were then collected from the participating hospital. Prior to validation, the reports were redacted of any identifiers (i.e., personal health information, hospital, physician).
The pathology regional lead (D.D.) first reviewed the pathology reports from suboptimal cases to ensure the specimens were handled and reported according to established guidelines and standards. If the pathology lead could not verify that the specimen had been dealt with appropriately, the case was deemed to have a pathology-related deficiency and was not considered for surgical review.
Suboptimal cases that passed the pathology validation step were then forwarded to the surgical regional lead (C.M.) for review. Exclusion criteria for cases removed from the surgical analysis were as follows: colon or rectal resections done for locally recurrent disease (in which the natural lymph node draining basin may have already been removed); palliative resections14 (wherein a radical resection was not indicated or appropriate); rectal resections done after long-course neoadjuvant radiation18–20 (as low lymph node yield is commonly observed in this setting); and resections done on an emergent basis, such as those required for perforation or acute obstruction (as the lack of a preoperative diagnosis or the importance of dealing with the acute critical clinical situation may have precluded a definitive radical resection).
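The surgical validation step above amounts to filtering flagged cases against a fixed set of exclusion criteria. A minimal sketch follows; the flag names and case records are hypothetical, used for illustration only, and do not reflect the actual review forms:

```python
# Hypothetical exclusion flags on a flagged case; not the actual review fields.
SURGICAL_EXCLUSIONS = (
    "locally_recurrent",           # nodal draining basin may already be removed
    "palliative",                  # radical resection not indicated
    "post_neoadjuvant_radiation",  # low node yield expected after long-course RT
    "emergent",                    # perforation/obstruction precluded radical resection
)

def eligible_for_surgical_review(case):
    """Keep a flagged case only if none of the exclusion criteria apply."""
    return not any(case.get(flag, False) for flag in SURGICAL_EXCLUSIONS)

flagged = [
    {"id": 1, "palliative": True},
    {"id": 2},
    {"id": 3, "emergent": True},
]
print([c["id"] for c in flagged if eligible_for_surgical_review(c)])  # [2]
```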
This validation step was essential to produce credible individualized physician quality reports; for example, 22 of 48 flagged suboptimal cases were excluded from the surgeons’ scores because of pathology-related issues or for appropriate clinical reasons. We believe that failure to remove these cases would have negatively impacted the degree of physician confidence and satisfaction with the information contained in the reports.
During the course of the project, workload requirements (e.g., no. of cases reviewed and amount of time required) were documented by each of the regional leads to determine the workload impacts of the validation process.
SPOQCC review
Once all suboptimal cases had been processed and reported appropriately, a hospital-specific aggregate report was generated and forwarded to the respective SPOQCCs for review.
The data contained in the reports were hospital-specific and anonymous; no identifiers (patient or physician) were included. Additional aggregate information was provided to allow for comparison of results to those of other hospitals in the region and to provincial results.
The regional leads presented and reviewed these reports with the SPOQCCs. Once the report was accepted by the committees, individualized reports were then prepared and forwarded to the pathologists and surgeons (Fig. 1).
Generation of individualized reports
The individualized report included the total number of cases completed; for cases of colon and rectal cancer, it included the number and percentage of cases that had fewer than 12 lymph nodes harvested, and for prostatectomy specimens, it included the number and percentage of specimens that had a positive resection margin. The report contained data that enabled the physician to compare his/her performance with regional data (therefore allowing comparison with immediate peers) and with the provincial averages. The reports were provided by registered mail or electronically, as per the stated preference of the individual physicians (Fig. 2).
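As a rough illustration of the report contents described above, an individualized summary line might combine a physician's own counts with the regional and provincial comparators. The function, names and figures below are hypothetical, not the actual report template:

```python
# Hypothetical individualized report line; names and figures are illustrative.
def build_report_line(name, cases_total, suboptimal, regional_rate, provincial_rate):
    """Summarize a physician's indicator result next to peer comparators."""
    own_rate = suboptimal / cases_total
    return (
        f"{name}: {suboptimal}/{cases_total} cases below target ({own_rate:.0%}); "
        f"region {regional_rate:.0%}, province {provincial_rate:.0%}"
    )

print(build_report_line("Surgeon A", 20, 3, 0.12, 0.10))
# Surgeon A: 3/20 cases below target (15%); region 12%, province 10%
```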
Stakeholder engagement
In order to optimize physicians’ understanding of the project, a multipronged stakeholder engagement strategy was used. Prior to implementation, the regional leads for surgical oncology and pathology made presentations to physician groups (e.g., surgeons, pathologists, urologists) at each of the hospital sites. In addition, presentations were made at all of the medical advisory committees, quality of care committees and senior leadership committees. The intent of the presentations was to provide information about the project (e.g., intent, process) and address any concerns regarding confidentiality.
Assessment of physicians’ perceptions
In addition to the education sessions, all physicians (n = 54) were invited to complete a preimplementation survey designed to obtain information regarding their knowledge of the project, perceptions regarding the value added and the degree of confidentiality or anonymity of the reports. The 11-item survey asked participants to rate their level of agreement using a 5-point Likert scale (1 = strongly disagree and 5 = strongly agree). For physicians with email contact information, all communication regarding the survey was sent via an email that included the link to the web-based survey. For all others, communication was sent by fax to the clinicians’ offices and provided a secure fax number for returning completed paper surveys. Reminder notices were sent out every 2 weeks to those who had not yet returned the survey.21 A similar process was used for the postimplementation physician survey conducted in March 2013, which focused on determining the physicians’ actual experiences of receiving individualized quality reports.
The postimplementation evaluation strategy also involved conducting semistructured focus groups with the members of the 4 distinct SPOQCCs to obtain their unique perspectives on the benefits and challenges of individual physician level reporting, lessons learned and strategies for future implementation.
Assessment of organizational impacts
To determine the degree of organizational impact, focus groups were conducted with the members of the 3 site-specific SPOQCCs at the end of the pilot project. To facilitate optimal participation, the focus groups were scheduled to occur at the end of the regularly scheduled SPOQCC meeting, included all SPOQCC members in attendance, and were facilitated by a member of the CCO project team who participated via teleconference. As the regional leads were also the champions for the pilot project, they did not participate in the focus groups to allow for optimal engagement of the SPOQCC members and to maintain the confidentiality of comments shared. A semistructured interview process was used to guide the discussion. The sessions were audiotaped, with the permission of the participants, and additional notes were taken by the facilitator to capture comments and suggestions. Conventional content analysis22,23 was used for coding and identification of common themes.
Results
A total of 437 cancer resections (78 prostate, 241 colon and 118 rectal cases) were included in the review process. Fifty cases (11%) were suboptimal in terms of margin status (pT2 prostate) or lymph node yield (colon and rectum). Sixty percent of all suboptimal cases were associated with prostate cancer resections, followed by rectal cases (22%) and colon cases (18%). Twenty-one cases (42%) were excluded from physician level reports based on previously determined clinical exclusion criteria or data error issues (e.g., wrong case assigned to physician). Of the remaining 29 suboptimal cases eligible for inclusion in physician level reporting, 7 (25%) were pathologist-related and 22 (75%) were surgeon-related. Five rounds of individual physician level reports, each covering one quarter of data, were disseminated to the relevant physicians. Workload measurement reports revealed that the time required to review cases was less than 5 minutes in most cases, with only 4 surgical case reviews requiring 11–15 minutes each.
Impact on practice patterns over time
Over the course of the study, there was a noted improvement in the regional pT2 prostate margin positivity rate, which decreased from 57.1% to 27.5%. Hospital B demonstrated a dramatic improvement, with a decrease from 66.7% to 25%, thus meeting the provincial target (Fig. 3). Hospitals C and D (combined because the urology and pathology staff were the same for the 2 hospitals) showed no significant change in the percent of involved margins in pT2 prostate cancer resections over the course of the pilot project. As a percentage of hospital-specific urology staff, the urological surgical members of these 2 hospitals had the poorest attendance at engagement sessions held before and during the pilot project (Fig. 3).
For the CRC lymph node retrieval rate indicator, baseline performance was consistently high, with a regional rate of 96%. This was maintained and improved over the course of the study, with 2 sites achieving a 100% retrieval rate (Fig. 4).
Physician satisfaction
Survey response rates (56%) were similar pre- and postimplementation. Postimplementation respondents included colorectal surgeons (60%), pathologists (30%) and urologists (10%). The low response rate from urologists was disappointing, as 60% of all suboptimal cases were associated with prostate cancer surgeries. When comparing surveys, the preimplementation (anticipated) and postimplementation (actual experience) results were similar. Postimplementation results (based on combined responses of “strongly agree” and “agree”) show that respondents felt that the data provided in the individualized reports were valid (67%) and consistent with expectations (70%), that their confidentiality was maintained (80%) and that the information was not used in a punitive manner (77%). Notwithstanding the latter observation, only 47% indicated they were confident that the data would not be used in a punitive manner in the future.
In terms of relevance to personal and organizational performance, respondents to the postimplementation survey reported that having access to the personalized report could improve the quality of their practices (77%), improve organizational performance (87%) and improve patient outcomes (83%). Somewhat paradoxically, while the peer comparison data were felt to be useful (77%), only 48% of respondents indicated they would voluntarily provide the reports as part of professional peer review processes. Specific to the issue of validation, 40% were aware that the information was validated before generation of individual reports; 90% indicated that this process was important, very important or essential to the overall process. Preferred frequency of report generation was twice a year (41%), once a year (38%) and quarterly (21%), with hardcopy (53%) as the preferred method for receiving the reports, followed by secure email (37%) and Internet access (10%; Tables 1 and 2).
Organizational impact
Three focus groups were conducted with the site-specific SPOQCCs, including a total of 11 participants (75% of total SPOQCC membership), 9 of whom were physicians and 2 of whom were program administrative leaders. Two main themes emerged from the focus groups: process and quality. Overall, the participants stated that the SPOQCC structure and processes worked well. The recommended membership provided sufficient content expertise, and the presentation of data that were deidentified of physician information enabled dialogue focused on overall quality rather than individual provider performance. Scheduling the SPOQCC meetings proved to be a significant barrier to the timeliness of providing reports to physicians, as the established process was to provide anonymized data to the SPOQCCs before sending out individual reports to the physicians. The participants also valued the SPOQCCs as an opportunity for focused discussions on quality at the regional and organizational levels. The presentation of results was deemed useful in determining areas of strength and areas for improvement, with dialogue focusing on how best to use the data to inform decision-making regarding strategies to sustain or improve performance. The SPOQCC forums were viewed as a mechanism for creating a quality culture that is supportive of open dialogue regarding the link between practice, quality and patient outcomes.
Discussion
In recent years there has been increasing focus on quality and performance indicators, as evident in programs and initiatives such as the Cancer System Quality Index (CSQI), the Quality Oncology Practice Initiative (QOPI), and the National Clinical Audit and Patient Outcomes Programme (NCAPOP).
The CSQI, produced by the Cancer Quality Council of Ontario, is an annual, system-wide monitor that tracks the quality and consistency of 25 key cancer system performance indicators.24 The CSQI report contains aggregate level data only, with comparisons against pre-established, consensus-based targets. The QOPI is a voluntary quality improvement program developed by the American Society for Clinical Oncology. Twice a year, participating practices undergo a retrospective review of patient charts, with quality measures calculated and reported back to the physicians for the purpose of performance improvement. Studies on practices enrolled in the QOPI initiative have shown positive results in terms of performance improvements.25–27 In the United Kingdom, the NCAPOP is a national clinical audit project that collects data supplied by clinicians on compliance with evidence-based standards and provides benchmarked reports to help participants identify necessary improvements.28
These programs focus on aggregate data only (CSQI) or data that are voluntarily provided by physicians (QOPI, NCAPOP). The benefits of the approach taken in this project are as follows.
The use of a centralized, population-based data repository (ePath) as the data source for the reports, rather than data voluntarily submitted by individual physicians, provides a strong foundation for the validity of the data used.
The presentation of results at the regional, hospital and provider levels allows for analysis of variation in practice patterns within and across each level.
The reports were not a single, stand-alone strategy. The creation of a central forum (SPOQCC) for the specific purpose of focusing on quality allowed for a more multifaceted approach to creating a culture of quality beyond the generation of reports.
The reports are used to guide decision-making at the regional, organizational and individual physician levels regarding practice and outcomes.
The results of our study reinforce the importance of individual feedback as a valuable strategy for improving and sustaining quality in surgical and diagnostic oncology. The feedback and any subsequent remediation process must have a goal of overall quality improvement for all members and be conducted in the true spirit of quality improvement rather than being deemed as punitive in nature and intent. The provision of timely feedback is congruent with the rapid-learning health care initiative where real-time clinical data are used to support a focus on quality improvement and patient safety.29 The original goals of our study were achieved, in that we were able to develop a robust process for providing valid, confidential individualized surgical and pathology quality indicator reports to physicians.
Limitations
The limitations of this study include the small sample (56 physicians). The small number of prostatectomy cases over the course of the entire pilot project (n = 63) and the number of cases within each quarter limited the analyses that could be conducted owing to insufficient statistical power. Although significant change management strategies were applied before the launch of the pilot project, the very poor response rate from the urologist group limits the application of findings to the area with the greatest need for improvement. Consideration for targeted follow-up with nonrespondents may have enhanced participation. Based on the results and experience in this region, there will be specific strategies, including education sessions and presentations by known content experts and author(s) of the relevant practice guideline(s), to engage the urologist community when expanding into other regions.
Conclusion
While implementation of this pilot project was facilitated by the small size of our region, the processes and principles can be applied to other jurisdictions, with process redesign considerations made to address the volume of cases and number of physicians involved. Based on the success of this pilot project, expansion of physician level reporting to other regions in Ontario is currently underway in regions with larger surgical volumes and greater numbers of physicians. This expansion will enable further evaluation of the effectiveness of the processes implemented here, the impacts on physician satisfaction and quality of practice and opportunities for process redesign.
Footnotes
Competing interests: None declared.
Contributors: C. McFayden, S. Lankshear and A. Hunter designed the study. C. McFayden, D. Divaris and M. Berry acquired the data, which C. McFayden, S. Lankshear, D. Divaris, A. Hunter, J. Srigley and J. Irish analyzed. C. McFayden and S. Lankshear wrote the article, which all authors reviewed and approved for publication.
- Accepted July 8, 2014.