Spontaneous blastocyst collapse during in vitro embryo development has been suggested as a novel marker of embryo quality. The aim of this study was therefore to carry out a retrospective multicentre analysis investigating the correlation between blastocyst collapse and pregnancy outcome. In total, 1297 intracytoplasmic sperm injection (ICSI)/in vitro fertilization (IVF) fresh cycles with an elective single blastocyst transfer (eSET) were included in this study. Embryos were cultured individually in 6.0% CO2, 5.0% O2, 89.0% N2, using single-step medium (G-TL™, Vitrolife, Sweden) or sequential medium (Cook™, Cook Medical, Australia) and selected for transfer using standard morphological criteria. Using time-lapse monitoring (TLM), blastocysts were analyzed by measuring the maximum volume reduction and were defined as having collapsed if there was ≥50% volume reduction between the expanded blastocyst and the collapse event. Following embryo replacement, each blastocyst was retrospectively allocated to one of two groups (collapsed or not collapsed). In total, 259 blastocysts (19.9%) collapsed once or more during development and the remaining 1038 (80.1%) either contracted minimally or did not collapse. A significantly higher ongoing pregnancy rate (OPR) of 51.9% (95% CI 48.9–59.9%) was observed when blastocysts that had not collapsed were replaced, compared with 37.5% (95% CI 31.6–43.4%) in cycles in which collapsed blastocysts were transferred. This study suggests that human blastocysts that collapse spontaneously during development are less likely to implant and generate a pregnancy than embryos that do not. Although this is a retrospective study, the results demonstrate the utility of collapse episodes as a new marker for embryo selection following eSET at the blastocyst stage.
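The ≥50% volume-reduction criterion described above can be sketched as a simple classifier over a time-lapse series of volume measurements. This is a minimal illustration only; the function name and data layout are assumptions, not the study's actual software:

```python
def classify_collapse(volumes, threshold=0.5):
    """Classify a blastocyst as 'collapsed' if any measured volume drops
    by >= `threshold` (as a fraction) relative to the maximum expanded
    volume observed earlier in the time-lapse series."""
    max_seen = float("-inf")
    for v in volumes:
        if max_seen > 0 and (max_seen - v) / max_seen >= threshold:
            return "collapsed"
        max_seen = max(max_seen, v)
    return "not collapsed"

# Illustrative series: volume expands to 1.0, then contracts to 0.45
# (a 55% reduction, exceeding the 50% threshold).
print(classify_collapse([0.6, 0.8, 1.0, 0.45]))  # collapsed
print(classify_collapse([0.6, 0.8, 1.0, 0.9]))   # not collapsed
```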
In the Netherlands, the bulk of the Miocene to lowest Pliocene sedimentary succession is currently assigned to a single lithostratigraphical unit, the Breda Formation. Although the formation was introduced over 40 years ago, the definition of its lower and upper boundaries is still problematic. Well-log correlations show that the improved lectostratotype for the Breda Formation in well Groote Heide partly overlaps with the additional reference section of the older Veldhoven Formation in the nearby well Broekhuizenvorst. The distinction between the Breda Formation and the overlying Oosterhout Formation, which was mainly based on quantitative differences in glauconite and molluscs, gives rise to ongoing discussion, in particular because of the varying glauconite concentrations that occur within both formations. In addition, the Breda Formation lacks a regional-scale stratigraphic framework relating its various regionally to locally defined shallow-marine to continental members.
In order to resolve these issues, we performed renewed analyses of material from several archived cores. The results of archived and new dinocyst analyses were combined with lithological descriptions and wire-line log correlations of multiple wells, including the wells Groote Heide and Broekhuizenvorst. In this process, the updated dinocyst zonation of Munsterman & Brinkhuis (2004), recalibrated to the Geological Time Scale of Ogg et al. (2016), was used. To establish regionally consistent lithostratigraphic boundaries, additional data were used along a transect across the Roer Valley Graben running from its central part (well St-Michielsgestel-1) towards the southern rift shoulders (well Goirle-1). Along this transect, chronostratigraphic and lithostratigraphic analyses were integrated with well-log correlation and the analysis of seismic reflection data to constrain geometrical/structural relationships as well.
The results led to the differentiation of two distinct seismic sequences bounded by three recognisable unconformities: the Early Miocene Unconformity (EMU), the Mid-Miocene Unconformity (MMU) and the Late Miocene Unconformity (LMU). The major regional hiatus, referred to as the Mid-Miocene Unconformity, lies within the present Breda Formation and compels subdivision of this unit into two formations, viz. the newly established Groote Heide Formation and the younger Diessen Formation. Pending further studies, the former Breda Formation will be temporarily raised in rank to the newly established Hilvarenbeek Subgroup, which comprises both the Groote Heide and Diessen formations. Whereas these two sequences were already locally defined, a third sequence overlying the LMU comprises two newly defined lithostratigraphical units, the Goirle and Tilburg members, positioned in this study at the base of the Oosterhout Formation. Besides their distinctive lithological characteristics, the Goirle and Tilburg members stand out in seismic reflection profiles because of their distinct seismic facies.
By using an integrated, multidisciplinary and regional approach, an improved southern North Sea framework and a more comprehensive lithostratigraphic subdivision of the Neogene successions are proposed for the Netherlands, making (cross-border) correlations more straightforward in the future.
Especially in his later years, Twain became an outspoken critic of American nationalism and American and European colonialism. The Spanish-American War and atrocities in the Philippines led him to begin making public comments about imperialism. He was a member of the leading anti-imperialism society, and polemics like King Leopold’s Soliloquy were widely distributed and read. Twain’s increasingly bitter and satiric comments about imperialism lost him some readers but gained him the respect of many around the world.
To characterize nontuberculous mycobacteria (NTM) associated with case clusters at 3 medical facilities.
Retrospective cohort study using molecular typing of patient and water isolates.
Veterans Affairs Medical Centers (VAMCs).
Isolation and identification of NTM from clinical and water samples using culture, MALDI-TOF, and gene population sequencing to determine species and genetic relatedness. Clinical data were abstracted from electronic health records.
An identical strain of Mycobacterium conceptionense was isolated from 41 patients at VA Medical Centers (VAMCs A, B, and D), and from VAMC A’s ICU ice machine. Isolates were initially identified as other NTM species within the M. fortuitum clade; sequencing analyses revealed that they were identical M. conceptionense strains. Overall, 7 patients (17%) met the criteria for pulmonary or nonpulmonary infection with NTM, and 13 of 41 (32%) were treated with effective antimicrobials regardless of infection or colonization status. Separately, an M. mucogenicum patient strain from VAMC A matched a strain isolated from a VAMC B ICU ice machine. VAMC C, in a different state, had a 4-patient cluster with Mycobacterium porcinum; these strains were identical to those isolated from sink-water samples at that facility.
NTM from hospital water systems are found in hospitalized patients, often during workup for other infections, making attribution of NTM infection problematic. Variable NTM identification methods and changing taxonomy create challenges for epidemiologic investigation and linkage to environmental sources.
Disturbed sleep and activity are prominent features of bipolar disorder type I (BP-I). However, the relationship of sleep and activity characteristics to brain structure and behavior in euthymic BP-I patients and their non-BP-I relatives is unknown. Additionally, underlying genetic relationships between these traits have not been investigated.
Relationships between sleep and activity phenotypes, assessed using actigraphy, with structural neuroimaging (brain) and cognitive and temperament (behavior) phenotypes were investigated in 558 euthymic individuals from multi-generational pedigrees including at least one member with BP-I. Genetic correlations between actigraphy-brain and actigraphy-behavior associations were assessed, and bivariate linkage analysis was conducted for trait pairs with evidence of shared genetic influences.
More physical activity and longer awake time were significantly associated with increased brain volumes and cortical thickness, better performance on neurocognitive measures of long-term memory and executive function, and less extreme scores on measures of temperament (impulsivity, cyclothymia). These associations did not differ between BP-I patients and their non-BP-I relatives. For nine activity-brain or activity-behavior pairs there was evidence for shared genetic influence (genetic correlations); of these pairs, a suggestive bivariate quantitative trait locus on chromosome 7 for wake duration and verbal working memory was identified.
Our findings indicate that increased physical activity and more adequate sleep are associated with increased brain size, better cognitive function and more stable temperament in BP-I patients and their non-BP-I relatives. Additionally, we found evidence for pleiotropy of several actigraphy-behavior and actigraphy-brain phenotypes, suggesting a shared genetic basis for these traits.
Presenteeism, or working while ill, by healthcare personnel (HCP) experiencing influenza-like illness (ILI) puts patients and coworkers at risk. However, hospital policies and practices may not consistently facilitate HCP staying home when ill.
Objective and methods:
We conducted a mixed-methods survey in March 2018 of Emerging Infections Network infectious diseases physicians, describing institutional experiences with and policies for HCP working with ILI.
Of 715 physicians, 367 (51%) responded. Of 367, 135 (37%) were unaware of institutional policies. Of the remaining 232 respondents, 206 (89%) reported institutional policies regarding work restrictions for HCP with influenza or ILI, but only 145 (63%) said these were communicated at least annually. More than half of respondents (124, 53%) reported that adherence to work restrictions was not monitored or enforced. Work restrictions were most often not perceived to be enforced for physicians-in-training and attending physicians. Nearly all (223, 96%) reported that their facility tracked laboratory-confirmed influenza (LCI) in patients; 85 (37%) reported tracking ILI. For employees, 109 (47%) reported tracking of LCI and 53 (23%) reported tracking ILI. For independent physicians, not employed by the facility, 30 (13%) reported tracking LCI and 11 (5%) ILI.
More than one-third of respondents were unaware of whether their institutions had policies to prevent HCP with ILI from working; among those with knowledge of institutional policies, dissemination, monitoring, and enforcement were highly variable. Improving communication about work-restriction policies, as well as monitoring and enforcement, may help prevent the spread of infections from HCP to patients.
To assess the impact of a newly developed Central-Line Insertion Site Assessment (CLISA) score on the incidence of local inflammation or infection for CLABSI prevention.
A pre- and postintervention, quasi-experimental quality improvement study.
Setting and participants:
Adult inpatients with central venous catheters (CVCs) hospitalized in an intensive care unit or oncology ward at a large academic medical center.
We evaluated the impact of the CLISA score on the incidence of insertion-site inflammation and infection (CLISA score of 2 or 3) between the baseline period (June 2014–January 2015) and the intervention period (April 2015–October 2017) using interrupted time-series and generalized linear mixed-effects multivariable analyses. Separate analyses were run for days to line removal after identification of a CLISA score of 2 or 3. CLISA score interrater reliability and photo quiz results were also evaluated.
Among 6,957 CVCs assessed 40,846 times, the percentage of lines with a CLISA score of 2 or 3 decreased by 78.2% between the baseline and intervention periods (from 22.0% to 4.7%), with a significant immediate decrease in the time-series analysis (P < .001). In the multivariable regression, the intervention was associated with a lower percentage of lines with a CLISA score of 2 or 3 after adjusting for age, gender, CVC body location, and hospital unit (odds ratio, 0.15; 95% confidence interval, 0.06–0.34; P < .001), and days to removal of lines with a CLISA score of 2 or 3 were 3.19 days faster after the intervention (P < .001). Also, line dwell time decreased 37.1%, from a mean of 14 days (standard deviation [SD], 10.6) to 8.8 days (SD, 9.0) (P < .001), and device utilization ratios decreased 9%, from 0.64 (SD, 0.08) to 0.58 (SD, 0.06) (P = .039).
The CLISA score creates a common language for assessing line infection risk and successfully promotes high compliance with best practices in timely line removal.
The diet of most adults is low in fish and, therefore, provides limited quantities of the long-chain, omega-3 fatty acids (LCn-3FAs), eicosapentaenoic and docosahexaenoic acids (EPA, DHA). Since these compounds serve important roles in the brain, we sought to determine if healthy adults with low-LCn-3FA consumption would exhibit improvements in neuropsychological performance and parallel changes in brain morphology following repletion through fish oil supplementation.
In a randomized, controlled trial, 271 mid-life adults (30–54 years of age; 118 men, 153 women) consuming ≤300 mg/day of LCn-3FAs received 18 weeks of supplementation with fish oil capsules (1400 mg/day of EPA and DHA) or matching placebo. All participants completed a neuropsychological test battery examining four cognitive domains: psychomotor speed, executive function, learning/episodic memory, and fluid intelligence. A subset of 122 underwent neuroimaging before and after supplementation to measure whole-brain and subcortical tissue volumes.
Capsule adherence was over 95%, participant blinding was verified, and red blood cell EPA and DHA levels increased as expected. Supplementation did not affect performance in any of the four cognitive domains. Exploratory analyses revealed that, compared with placebo, fish oil supplementation improved executive function in participants with low baseline DHA levels. No changes were observed in any indicator of brain morphology.
In healthy mid-life adults reporting low dietary intake, supplementation with LCn-3FAs at a moderate dose for a moderate duration did not affect neuropsychological performance or brain morphology. Whether salutary effects occur in individuals with particularly low DHA exposure requires further study.
Frascati international research criteria for HIV-associated neurocognitive disorders (HAND) are controversial; some investigators have argued that Frascati criteria are too liberal, resulting in a high false positive rate. Meyer et al. recommended more conservative revisions to HAND criteria, including exploring other commonly used methodologies for neurocognitive impairment (NCI) in HIV including the global deficit score (GDS). This study compares NCI classifications by Frascati, Meyer, and GDS methods, in relation to neuroimaging markers of brain integrity in HIV.
Two hundred forty-one people living with HIV (PLWH) without current substance use disorder or severe (confounding) comorbid conditions underwent comprehensive neurocognitive testing, brain structural magnetic resonance imaging, and magnetic resonance spectroscopy. Participants were classified using Frascati versus Meyer criteria as concordant unimpaired [Frascati(Un)/Meyer(Un)], concordant impaired [Frascati(Imp)/Meyer(Imp)], or discordant [Frascati(Imp)/Meyer(Un)], i.e., impaired by Frascati criteria but unimpaired by Meyer criteria. To compare the GDS with the Meyer criteria, the same groupings were constructed using GDS criteria instead of Frascati criteria.
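The three-way grouping above is a concordance classification over two binary impairment calls. A minimal sketch (function name is an assumption; the study reports only three groups because the Meyer criteria are more conservative, so the fourth combination does not arise in its data):

```python
def concordance_group(frascati_impaired: bool, meyer_impaired: bool) -> str:
    """Label a participant by agreement between two impairment criteria."""
    if frascati_impaired and meyer_impaired:
        return "concordant impaired"
    if not frascati_impaired and not meyer_impaired:
        return "concordant unimpaired"
    if frascati_impaired and not meyer_impaired:
        return "discordant: Frascati(Imp)/Meyer(Un)"
    # Not observed in the study, since Meyer criteria are stricter.
    return "discordant: Meyer(Imp)/Frascati(Un)"

print(concordance_group(True, False))  # discordant: Frascati(Imp)/Meyer(Un)
```

The same function applies unchanged to the GDS-versus-Meyer comparison by passing a GDS impairment flag in place of the Frascati one.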
When examining Frascati versus Meyer criteria, discordant Frascati(Imp)/Meyer(Un) individuals had less cortical gray matter, greater sulcal cerebrospinal fluid volume, and greater evidence of neuroinflammation (i.e., choline) than concordant Frascati(Un)/Meyer(Un) individuals. GDS versus Meyer comparisons indicated that discordant GDS(Imp)/Meyer(Un) individuals had less cortical gray matter and lower levels of energy metabolism (i.e., creatine) than concordant GDS(Un)/Meyer(Un) individuals. In both sets of analyses, the discordant group did not differ from the concordant impaired group on any neuroimaging measure.
The Meyer criteria failed to capture a substantial portion of PLWH with brain abnormalities. These findings support continued use of Frascati or GDS criteria to detect HIV-associated CNS dysfunction.
We conducted active surveillance of acute respiratory viral infections (ARIs) among residents and healthcare personnel (HCP) at a long-term care facility during the 2015–2016 respiratory illness season. ARIs were observed among both HCP and patients, highlighting the importance of including HCP in surveillance programs.
The Interplay of Genes and Environment across Multiple Studies (IGEMS) is a consortium of 18 twin studies from 5 different countries (Sweden, Denmark, Finland, United States, and Australia) established to explore the nature of gene–environment (GE) interplay in functioning across the adult lifespan. Fifteen of the studies are longitudinal, with follow-up as long as 59 years after baseline. The combined data from over 76,000 participants aged 14–103 at intake (including over 10,000 monozygotic and over 17,000 dizygotic twin pairs) support two primary research emphases: (1) investigation of models of GE interplay of early life adversity, and social factors at micro and macro environmental levels and with diverse outcomes, including mortality, physical functioning and psychological functioning; and (2) improved understanding of risk and protective factors for dementia by incorporating unmeasured and measured genetic factors with a wide range of exposures measured in young adulthood, midlife and later life.
Clinical Enterobacteriaceae isolates with a colistin minimum inhibitory concentration (MIC) ≥4 mg/L from a United States hospital were screened for the mcr-1 gene using real-time polymerase chain reaction (RT-PCR) and confirmed by whole-genome sequencing. Four colistin-resistant Escherichia coli isolates contained mcr-1. Two isolates belonged to the same sequence type (ST-632). All subjects had prior international travel and antimicrobial exposure.
A nationwide survey indicated that screening for asymptomatic carriers of C. difficile is an uncommon practice in US healthcare settings. A better understanding of the role of asymptomatic carriage in C. difficile transmission, and of the measures available to reduce that risk, is needed to inform best practices regarding the management of carriers.
To ascertain opinions regarding etiology and preventability of hospital-onset bacteremia and fungemia (HOB) and perspectives on HOB as a potential outcome measure reflecting quality of infection prevention and hospital care.
Hospital epidemiologists and infection preventionist members of the Society for Healthcare Epidemiology of America (SHEA) Research Network.
A web-based, multiple-choice survey was administered via the SHEA Research Network to 133 hospitals.
A total of 89 surveys were completed (67% response rate). Overall, 60% of respondents defined HOB as a positive blood culture on or after hospital day 3. Central line-associated bloodstream infections and intra-abdominal infections were perceived as the most frequent etiologies. Moreover, 61% thought that most HOB events are preventable, and 54% viewed HOB as a measure reflecting a hospital’s quality of care. Also, 29% of respondents’ hospitals already collect HOB data for internal purposes. Given a choice to publicly report central-line–associated bloodstream infections (CLABSIs) and/or HOB, 57% favored reporting either HOB alone (22%) or in addition to CLABSI (35%) and 34% favored CLABSI alone.
Among the majority of SHEA Research Network respondents, HOB is perceived as preventable, reflective of quality of care, and potentially acceptable as a publicly reported quality metric. Further studies on HOB are needed, including validation as a quality measure, assessment of risk adjustment, and formation of evidence-based bundles and toolkits to facilitate measurement and improvement of HOB rates.
Objectives: Studies of neurocognitively elite older adults, termed SuperAgers, have identified clinical predictors and neurobiological indicators of resilience against age-related neurocognitive decline. Despite rising rates of older persons living with HIV (PLWH), SuperAging (SA) in PLWH remains undefined. We aimed to establish neuropsychological criteria for SA in PLWH and examined clinically relevant correlates of SA. Methods: 734 PLWH and 123 HIV-uninfected participants between 50 and 64 years of age underwent neuropsychological and neuromedical evaluations. SA was defined as demographically corrected (i.e., sex, race/ethnicity, education) global neurocognitive performance within the normal range for 25-year-olds. Remaining participants were labeled cognitively normal (CN) or impaired (CI) based on actual age. Chi-square and analysis of variance tests examined HIV group differences on neurocognitive status and demographics. Within PLWH, neurocognitive status differences were tested on HIV disease characteristics, medical comorbidities, and everyday functioning. Multinomial logistic regression explored independent predictors of neurocognitive status. Results: Neurocognitive status rates and demographic characteristics differed between PLWH (SA=17%; CN=38%; CI=45%) and HIV-uninfected participants (SA=35%; CN=55%; CI=11%). In PLWH, neurocognitive groups were comparable on demographic and HIV disease characteristics. Younger age, higher verbal IQ, absence of diabetes, fewer depressive symptoms, and lifetime cannabis use disorder increased the likelihood of SA. SA participants reported greater independence in everyday functioning, employment, and health-related quality of life than non-SA participants. Conclusions: Despite the combined neurological risk of aging and HIV, youthful neurocognitive performance is possible for older PLWH.
SA relates to improved real-world functioning and may be better explained by cognitive reserve and maintenance of cardiometabolic and mental health than HIV disease severity. Future research investigating biomarker and lifestyle (e.g., physical activity) correlates of SA may help identify modifiable neuroprotective factors against HIV-related neurobiological aging. (JINS, 2019, 25, 507–519)
OBJECTIVES/SPECIFIC AIMS: (1) To evaluate the association of patient and clinical factors with adherence to adjuvant hormone therapy (HT). (2) To examine the association of HT-related symptoms and the extent of remediation with early discontinuation of hormone therapy. METHODS/STUDY POPULATION: Retrospective cohort study of risk factors for interruption and early discontinuation of adjuvant hormone therapy in hormone receptor-positive nonmetastatic breast cancer patients diagnosed between 2009 and 2015. This study will include incident hormone receptor-positive breast cancer patients who initiated their HT and were followed at Tufts MC until Dec 31, 2016. The primary data source is electronic medical records (EMRs). RESULTS/ANTICIPATED RESULTS: The primary outcome of this study is early discontinuation of HT, defined as the first treatment gap of greater than or equal to 180 days following the initiation of HT. Treatment interruption, defined as any patient- or provider-initiated treatment gap of ≥2 weeks, will be examined as the secondary endpoint. Any HT-related symptoms occurring during a follow-up interval will be captured and categorized into five major types (i.e., vasomotor, neuropsychological, gastrointestinal, gynecological, and musculoskeletal symptoms). Onset and duration of an HT-related symptom will be recorded. Severity of the symptoms will also be rated by clinical oncologists. Remediations in response to HT-related symptoms will be collected and categorized into two groups (pharmacological or non-pharmacological) and by whether they were patient- or provider-initiated. Response to a remediation is defined as complete relief, partial relief, no relief, or worsening symptoms. Response to a treatment change (i.e., HT switch or hold) was collected separately but using the same criteria.
Analyses will be performed on the associations of patient and clinical factors with rates of nonadherence (unplanned treatment interruption and/or early discontinuation) to hormone therapy. We will also explore whether patients with elevated symptoms and/or incomplete remediation have earlier discontinuation of hormone therapy. DISCUSSION/SIGNIFICANCE OF IMPACT: Through formal chart review, we will establish a dataset that contains highly detailed information about treatment-emergent symptoms and remediations, which will enable us to quantitatively assess the impact of these treatment factors on adherence to hormone therapy for breast cancer. The in-depth analysis of risk factors associated with nonadherence to hormone therapy will inform the development of interventions to improve cancer outcomes.
To evaluate and improve the involvement of stakeholders in community-based natural resource management, we developed a stakeholder collaboration index. We compared the stakeholders of five Kenyan conservancies by conducting 10 focus group meetings with conservancy management committees and wildlife game scouts. We used the nominal group technique to identify and rank perceptions of the conservancies’ strengths, weaknesses, opportunities and threats. The resulting 455 responses were categorized into ecological, institutional or socio-economic themes of ecosystem management. Collaboration index scores ranged from low (0.33) to high (0.95) collaboration, on a scale of 0–1, with a mean of 0.61. Managers and game scouts had varying perceptions of the conservancies, but they agreed about the major strengths and threats to conservation. The index highlighted shared perspectives between managers and scouts, which could be used as opportunities for increased stakeholder involvement in collaborative management. The stakeholder collaboration index is a potentially useful tool for improving the management of environmental conservation programmes.
To characterize healthcare provider diagnostic testing practices for identifying Clostridioides (Clostridium) difficile infection (CDI) and asymptomatic carriage in children.
An 11-question survey was sent by e-mail or facsimile to all pediatric infectious diseases (PID) members of the Infectious Diseases Society of America’s Emerging Infections Network (EIN).
Among 345 eligible respondents who had ever responded to an EIN survey, 196 (57%) responded; 162 of these (83%) were aware of their institutional policies for CDI testing and management. Also, 159 (98%) respondents knew their institution’s C. difficile testing method: 99 (62%) utilize NAAT without toxin testing and 60 (38%) utilize toxin testing, either as a single test or a multistep algorithm. Of 153 respondents, 10 (7%) reported that formed stools were tested for C. difficile at their institution, and 76 of 151 (50%) reported that their institution does not restrict C. difficile testing in infants and young children. The frequency of symptom- and age-based testing restrictions did not vary between institutions utilizing NAAT alone compared to those utilizing toxin testing for C. difficile diagnosis. Of 143 respondents, 26 (16%) permit testing of neonatal intensive care unit patients and 12 of 26 (46%) treat CDI with antibiotics in this patient population.
These data suggest that there are opportunities to improve CDI diagnostic stewardship practices in children, including among hospitals using NAATs alone for CDI diagnosis in children.
To describe the epidemiology of surgical site infections (SSIs) after pediatric ambulatory surgery.
Observational cohort study with 60 days follow-up after surgery.
The study took place in 3 ambulatory surgical facilities (ASFs) and 1 hospital-based facility in a single pediatric healthcare network.
Children <18 years undergoing ambulatory surgery were included in the study. Of 19,777 eligible surgical encounters, 8,502 patients were enrolled.
Data were collected through parental interviews and from chart reviews. We assessed 2 outcomes: (1) National Healthcare Safety Network (NHSN)–defined SSI and (2) evidence of possible infection using a definition developed for this study.
We identified 21 NHSN SSIs, for a rate of 2.5 SSIs per 1,000 surgical encounters: 2.9 per 1,000 at the hospital-based facility and 1.6 per 1,000 at the ASFs. After restricting the analysis to procedures performed at both facility types and adjusting for patient demographics, there was no difference in the risk of NHSN SSI between the 2 types of facilities (odds ratio, 0.7; 95% confidence interval, 0.2–2.3). Within 60 days after surgery, 404 surgical patients had some or strong evidence of possible infection obtained from parental interview and/or chart review (rate, 48 per 1,000 surgical encounters). Of 306 cases identified through parental interviews, 176 (57%) did not have chart documentation. In our multivariable analysis, older age and black race were associated with a reduced risk of possible infection.
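The per-1,000 rates above follow directly from event counts and the enrolled-encounter denominator. A minimal sketch (the function name is illustrative; the denominator of 8,502 enrolled encounters is taken from the abstract):

```python
def rate_per_1000(events, encounters):
    """Crude event rate per 1,000 surgical encounters."""
    return 1000 * events / encounters

# 21 NHSN-defined SSIs among 8,502 enrolled surgical encounters
print(round(rate_per_1000(21, 8502), 1))  # 2.5
# 404 encounters with some or strong evidence of possible infection
print(round(rate_per_1000(404, 8502)))    # 48
```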
The rate of NHSN-defined SSI after pediatric ambulatory surgery was low, although a substantial additional burden of infectious morbidity related to surgery might not have been captured by standard surveillance strategies and definitions.