Early recognition and response to increases in surgical site infections using optimized statistical process control charts-the Early 2RIS Trial: a multicenter cluster randomized controlled trial with stepped wedge design

Deverick J Anderson, Iulian Ilieş, Katherine Foy, Nicole Nehls, James C Benneyan, Yuliya Lokhnygina, Arthur W Baker

Abstract

Background: Surgical site infections (SSIs) cause significant patient suffering. Surveillance and feedback of SSI rates is an evidence-based strategy to reduce SSIs, but traditional surveillance methods are slow and prone to bias. The objective of this cluster randomized controlled trial (RCT) is to determine whether SSI surveillance and feedback using optimized statistical process control (SPC) charts lead to a reduction in SSI rates compared to traditional surveillance and feedback.

Methods: The Early 2RIS Trial is a prospective, multicenter cluster RCT using a stepped wedge design. The trial will be performed in 105 clusters across 29 hospitals in the Duke Infection Control Outreach Network (DICON) over 4 years, from March 2016 through February 2020. Year one represents a baseline period; thereafter, 8-9 clusters will be randomized to intervention every 3 months over the subsequent 3 years using a stepped wedge randomization design. All patients who undergo one of 13 targeted procedures at study hospitals will be included in the analysis; these procedures are grouped into one of six procedural clusters: cardiac, orthopedic, gastrointestinal, OB-GYN, vascular, and spinal. All clusters will undergo traditional surveillance for SSIs; once randomized to intervention, clusters will also undergo surveillance and feedback using optimized SPC charts. Feedback on surveillance data will be provided to all clusters, regardless of allocation or type of surveillance. The primary endpoint is the difference in SSI rates between clusters receiving the SPC intervention and those receiving traditional surveillance and feedback alone.
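As a rough illustration of the randomization described above, the sketch below assigns 105 clusters to 12 quarterly crossover waves of 8-9 clusters each. This is a hypothetical Python sketch: the function name, the fixed seed, and the equal-as-possible wave sizes are illustrative assumptions, not the trial's actual randomization procedure, which may be stratified or constrained in ways not described in this abstract.

```python
# Hypothetical sketch of a stepped wedge randomization schedule; not the trial's
# actual randomization procedure.
import random

def stepped_wedge_schedule(n_clusters=105, n_waves=12, seed=2016):
    """Randomly assign clusters to one of n_waves quarterly crossover waves,
    keeping wave sizes as equal as possible (8-9 clusters per wave)."""
    rng = random.Random(seed)
    clusters = list(range(1, n_clusters + 1))
    rng.shuffle(clusters)
    base, extra = divmod(n_clusters, n_waves)  # 105 // 12 = 8, with 9 waves taking one extra cluster
    schedule, start = {}, 0
    for wave in range(1, n_waves + 1):
        size = base + (1 if wave <= extra else 0)
        for cluster in clusters[start:start + size]:
            schedule[cluster] = wave           # wave at which this cluster crosses to intervention
        start += size
    return schedule

if __name__ == "__main__":
    sched = stepped_wedge_schedule()
    sizes = {w: sum(1 for v in sched.values() if v == w) for w in range(1, 13)}
    print(sizes)  # waves 1-9 contain 9 clusters, waves 10-12 contain 8 (105 total)
```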

Discussion: The traditional approach to SSI surveillance and feedback has several major deficiencies because SSIs are rare events. First, traditional statistical methods require aggregation of measurements over time, which delays analysis until enough data accumulate. Second, traditional statistical tests and the resulting p values are difficult to interpret. Third, analyses based on average SSI rates during predefined time periods have limited ability to rapidly identify important, real-time trends. Thus, standard analytic methods that compare average SSI rates between arbitrarily designated time intervals may not identify an important SSI rate increase in time unless the "signal" is very strong. Therefore, novel strategies for early identification and investigation of SSI rate increases are needed to decrease SSI rates. While SPC charts are used throughout industry and healthcare to monitor and improve processes, including surveillance of other types of healthcare-associated infections, they have not been evaluated as a tool for SSI surveillance and feedback in a randomized trial.
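To make the contrast concrete, the sketch below implements a generic one-sided moving-average chart for SSI proportions of the kind shown in Fig. 1: the observed rate over a sliding window of recent months is compared against an upper control limit derived from a fixed baseline rate (network-wide or local). The window length `w`, limit width `k`, baseline rate, and data are illustrative placeholders; the trial uses empirically optimized chart parameters that are not reproduced here.

```python
# Illustrative one-sided moving-average chart for SSI proportions; parameters and
# data are placeholders, not the trial's optimized chart settings.
import math

def ma_chart_signals(ssi_counts, procedure_counts, baseline_rate, w=6, k=3.0):
    """Flag months where the w-month moving-average SSI rate exceeds an upper
    control limit computed from a fixed baseline rate."""
    signals = []
    for t in range(len(ssi_counts)):
        lo = max(0, t - w + 1)
        events = sum(ssi_counts[lo:t + 1])       # SSIs in the current window
        n = sum(procedure_counts[lo:t + 1])      # procedures in the current window
        if n == 0:
            continue
        ma_rate = events / n
        ucl = baseline_rate + k * math.sqrt(baseline_rate * (1 - baseline_rate) / n)
        if ma_rate > ucl:
            signals.append((t, round(ma_rate, 4), round(ucl, 4)))
    return signals

# Example: 24 months of synthetic counts with a sustained rise in the final year.
ssis = [1, 0, 2, 1, 0, 1, 1, 0, 1, 2, 1, 0, 1, 1, 2, 3, 4, 3, 5, 4, 4, 5, 3, 4]
procs = [120] * 24
print(ma_chart_signals(ssis, procs, baseline_rate=0.012))  # signals from month 18 onward
```

In this toy example the chart flags the increase as soon as the windowed rate crosses the limit, rather than waiting for a period-over-period comparison at the end of a reporting interval.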

Trial registration: ClinicalTrials.gov NCT03075813, registered March 9, 2017.

Keywords: Feedback; Outbreak detection; Statistical process control; Surgical site infection; Surveillance.

Figures

Fig. 1 Example SPC charts used for SSI surveillance in the Early 2RIS Trial. Chart A (top) is a moving average (MA) chart using DICON baseline rates; Chart B (bottom) is an MA chart using local hospital rates
Fig. 2 Schematic of the stepped wedge design for the Early Recognition and Response to Increases in Surgical Site Infection (Early 2RIS) Trial. Gray = control, during which hospitals will receive traditional SSI surveillance, including biannual data reports; any signals identified in biannual reports or detected by local personnel will undergo further investigation. White = intervention, during which hospital clusters will receive feedback from both traditional surveillance and signals generated by applying optimized SPC methods to SSI surveillance data
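As a hypothetical companion to Fig. 2, the helper below expands a wave assignment (such as the one produced by the schedule sketch after the Methods section) into the cluster-by-quarter exposure grid the figure depicts, with 0 marking control periods (gray) and 1 marking intervention periods (white). The one-year baseline and quarterly crossover structure follow the Methods; the function name and data shapes are assumptions for illustration only.

```python
# Hypothetical helper: expand a wave assignment into the cluster-by-quarter
# exposure grid depicted in Fig. 2 (0 = control/gray, 1 = intervention/white).
def exposure_grid(schedule, n_waves=12, baseline_quarters=4):
    """schedule maps cluster id -> crossover wave (1..n_waves); each wave begins
    one quarter after the previous one, following a 4-quarter baseline year."""
    n_quarters = baseline_quarters + n_waves      # 16 quarters = 4 study years
    grid = {}
    for cluster, wave in schedule.items():
        crossover = baseline_quarters + wave - 1  # first quarter index on intervention
        grid[cluster] = [0] * crossover + [1] * (n_quarters - crossover)
    return grid

# Example with two hypothetical clusters at the earliest and latest crossover waves:
print(exposure_grid({"hospital_A_cardiac": 1, "hospital_B_spinal": 12}))
# hospital_A_cardiac: 4 control quarters, then 12 quarters on intervention;
# hospital_B_spinal: 15 control quarters, then 1 quarter on intervention.
```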


Source: PubMed
