TY - JOUR
T1 - Reliability and validity of multicentre surveillance of surgical site infections after colorectal surgery
AU - Verberk, Janneke D.M.
AU - van Rooden, Stephanie M.
AU - Hetem, David J.
AU - Wunderink, Herman F.
AU - Vlek, Anne L.M.
AU - Meijer, Corianne
AU - van Ravensbergen, Eva A.H.
AU - Huijskens, Elisabeth G.W.
AU - Vainio, Saara J.
AU - Bonten, Marc J.M.
AU - van Mourik, Maaike S.M.
N1 - Funding Information:
This work was supported by the Regional Healthcare Network Antibiotic Resistance Utrecht with a subsidy of the Dutch Ministry of Health, Welfare and Sport (grant number 327643).
Publisher Copyright:
© 2022, The Author(s).
PY - 2022/1/21
Y1 - 2022/1/21
AB - BACKGROUND: Surveillance is the cornerstone of surgical site infection prevention programs. The validity of the data collection and awareness of vulnerability to inter-rater variation are crucial for correct interpretation and use of surveillance data. The aim of this study was to investigate the reliability and validity of surgical site infection (SSI) surveillance after colorectal surgery in the Netherlands. METHODS: In this multicentre prospective observational study, seven Dutch hospitals performed SSI surveillance after colorectal surgeries performed in 2018 and/or 2019. During surveillance, a local case assessment was performed to calculate the overall percentage agreement between raters within hospitals. Additionally, two case-vignette assessments were performed to estimate intra-rater and inter-rater reliability by calculating weighted Cohen's Kappa and Fleiss' Kappa coefficients. To estimate validity, answers to the two case-vignette questionnaires were compared with those of an external medical panel. RESULTS: A total of 1111 colorectal surgeries were included in this study, with an overall SSI incidence of 8.8% (n = 98). The local case assessment showed that the overall percentage agreement between raters within a hospital was good (mean 95%, range 90-100%). Cohen's Kappa estimates for the intra-rater reliability of case-vignette review varied from 0.73 to 1.00, indicating substantial to perfect agreement. The inter-rater reliability within hospitals showed more variation, with Kappa estimates ranging from 0.61 to 0.94. In total, 87.9% of the answers given by the raters were in accordance with the medical panel. CONCLUSIONS: This study showed that raters were consistent in their SSI ascertainment (good reliability), but improvements can be made regarding accuracy (moderate validity). Accuracy of surveillance may be improved by providing regular training, adapting definitions to reduce subjectivity, and supporting surveillance through automation.
KW - Colorectal surgery
KW - Epidemiology
KW - Infection prevention
KW - Inter-rater reliability
KW - Surgical site infection
KW - Surveillance
KW - Reproducibility of Results
KW - Prospective Studies
KW - Humans
KW - Middle Aged
KW - Colorectal Surgery/statistics & numerical data
KW - Male
KW - Surgical Wound Infection/epidemiology
KW - Epidemiological Monitoring
KW - Aged, 80 and over
KW - Adult
KW - Female
KW - Aged
KW - Netherlands/epidemiology
UR - http://www.scopus.com/inward/record.url?scp=85123396023&partnerID=8YFLogxK
DO - 10.1186/s13756-022-01050-w
M3 - Article
C2 - 35063009
SN - 2047-2994
VL - 11
SP - 1
EP - 9
JO - Antimicrobial Resistance and Infection Control
JF - Antimicrobial Resistance and Infection Control
IS - 1
M1 - 10
ER -