The Pinnacle3 Auto-Planning (AP) package is an automated inverse planning tool employing a multi-sequence optimisation algorithm. The optimisation aims to improve overall plan quality but may also produce higher modulation, increasing plan complexity and challenging the delivery capability of the linear accelerator.
Methods and materials:
Thirty patients previously treated with intensity-modulated radiotherapy (IMRT) to the prostate with or without pelvic lymph node irradiation were replanned with locally developed AP techniques for step-and-shoot IMRT (AP-IMRT) and volumetric-modulated arc therapy (AP-VMAT). Each case was also planned with VMAT using conventional inverse planning. The patient cohort was separated into two groups, those with a single primary target volume (PTV) and those with dual PTVs of differing prescription dose levels. Plan complexity was assessed using the modulation complexity score.
Plans produced with AP provided equivalent or better dose coverage to target volumes whilst effectively reducing organ at risk (OAR) doses. For IMRT plans, the use of AP resulted in a mean reduction in bladder V50Gy by 4·2 and 4·7 % (p ≤ 0·01) and V40Gy by 4·8 and 11·3 % (p < 0·01) in the single and dual dose level cohorts, respectively. For the rectum, V70Gy, V60Gy and V40Gy were all reduced in the dual dose level AP-VMAT plans by an average of 2·0, 2·7 and 7·3 % (p < 0·01), respectively. A small increase in plan complexity was observed only in dual dose level AP plans.
The automated nature of AP led to high quality treatment plans with improvement in OAR sparing and minimised the variation in achievable dose planning metrics when compared to the conventional inverse planning approach.
United States dentists prescribe 10% of all outpatient antibiotics. Assessing appropriateness of antibiotic prescribing has been challenging due to a lack of guidelines for oral infections. In 2019, the American Dental Association (ADA) published clinical practice guidelines (CPG) on the management of acute oral infections. Our objective was to describe baseline national antibiotic prescribing for acute oral infections prior to the release of the ADA CPG and to identify patient-level variables associated with an antibiotic prescription.
We performed an analysis of national VA data from January 1, 2017, to December 31, 2017. We identified cases of acute oral infections using International Classification of Disease, Tenth Revision, Clinical Modification (ICD-10-CM) codes. Antibiotics prescribed by a dentist within ±7 days of a visit were included. Multivariable logistic regression identified patient-level variables associated with an antibiotic prescription.
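As an illustrative sketch only (not the authors' code), an analysis of this shape can be fit with statsmodels; the input file and column names below are hypothetical:

```python
# Hypothetical sketch: multivariable logistic regression of antibiotic
# receipt on patient-level variables. File and column names are invented.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

visits = pd.read_csv("oral_infection_visits.csv")

model = smf.logit(
    "antibiotic_rx ~ age + cardiac_risk + prosthetic_joint + endodontic_proc",
    data=visits,
).fit()

# Report adjusted odds ratios with 95% confidence intervals
or_table = np.exp(model.conf_int())
or_table["OR"] = np.exp(model.params)
print(or_table)
```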
Of the 470,039 VA dental visits with oral infections coded, 12% of patient visits with irreversible pulpitis, 17% with apical periodontitis, and 28% with acute apical abscess received antibiotics. Although the median days’ supply was 7, prolonged use of antibiotics was frequent (≥8 days, 42%–49%). Patients with high-risk cardiac conditions, prosthetic joints, and endodontic, implant, and oral and maxillofacial surgery dental procedures were more likely to receive antibiotics.
Most treatments of irreversible pulpitis and apical periodontitis cases were concordant with new ADA guidelines. However, in cases where antibiotics were prescribed, prolonged antibiotic courses >7 days were frequent. These findings demonstrate opportunities for the new ADA guidelines to standardize and improve dental prescribing practices.
ABSTRACT IMPACT: This research will promote understanding of the role of the Type-I Interferon signaling pathway during embryo implantation, potentially leading to a new diagnostic or treatment target in early pregnancy failure. OBJECTIVES/GOALS: Studies suggest interferon signaling regulation is tightly balanced between physiologic and pathophysiologic growth in early pregnancy. We propose to determine the impact of interferon-mediated inflammation on embryo implantation and early pregnancy failure in normal conditions and chronic inflammatory diseases in a novel mixed-mouse model. METHODS/STUDY POPULATION: To probe the role of type-I interferons (IFNs) in implantation, we will utilize a mouse model and non-surgically transfer both Ifnar1-/- and Ifnar1-/+ embryos into an immune-competent pseudopregnant wild-type female recipient. This will allow analysis of a litter with distinct genotypes within the same, immune-competent, uterine environment. Type-I IFN stimulation will be systemically induced with Poly-(I:C) at various time points around implantation. A similar approach will be used in mouse models of chronic inflammatory disease states associated with early pregnancy loss (e.g. systemic lupus erythematosus). With this model, we will be able to control for deficiencies in maternal immune response to specifically determine the embryonic response to inflammation during implantation and development. RESULTS/ANTICIPATED RESULTS: We anticipate the Ifnar1-/+ embryos - those able to respond to Type-I IFN - and their surrounding implantation sites will exhibit more maternal-fetal barrier dysfunction in the form of impaired trophoblast fusion, improper formation of the microvascular architecture, and increased permeability of the maternal-fetal barrier, compared to embryos unable to respond to IFN. We will also conduct similar analyses in mouse models of chronic inflammatory diseases. We hypothesize that these mice have baseline endometrial inflammation that stimulates the IFN pathway in IFN-capable embryos, producing breakdown of the maternal-fetal barrier. In these mice, we predict Ifnar1-/- embryos will show improved molecular outcomes when compared to Ifnar1-/+ embryos, and thus improved associated pregnancy outcomes. DISCUSSION/SIGNIFICANCE OF FINDINGS: This work can provide insight into the immunological mechanisms that govern embryo implantation and early placentation. This could provide more targeted means of managing and intervening in early pregnancy failure and/or disease states that are commonly associated with poor reproductive outcomes.
Due to shortages of N95 respirators during the coronavirus disease 2019 (COVID-19) pandemic, it is necessary to estimate the number of N95s required for healthcare workers (HCWs) to inform manufacturing targets and resource allocation.
We developed a model to determine the number of N95 respirators needed for HCWs both in a single acute-care hospital and the United States.
For an acute-care hospital with 400 all-cause monthly admissions, the number of N95 respirators needed to manage COVID-19 patients admitted during a month ranges from 113 (95% interpercentile range [IPR], 50–229) if 0.5% of admissions are COVID-19 patients to 22,101 (95% IPR, 5,904–25,881) if 100% of admissions are COVID-19 patients (assuming single use per respirator, and 10 encounters between HCWs and each COVID-19 patient per day). The number of N95s needed decreases to a range of 22 (95% IPR, 10–43) to 4,445 (95% IPR, 1,975–8,684) if each N95 is used for 5 patient encounters. Varying monthly all-cause admissions to 2,000 requires 6,645–13,404 respirators with a 60% COVID-19 admission prevalence, 10 HCW–patient encounters, and reusing N95s 5–10 times. Nationally, the number of N95 respirators needed over the course of the pandemic ranges from 86 million (95% IPR, 37.1–200.6 million) to 1.6 billion (95% IPR, 0.7–3.6 billion) as 5%–90% of the population is exposed (single-use). This number ranges from 17.4 million (95% IPR, 7.3–41 million) to 312.3 million (95% IPR, 131.5–737.3 million) using each respirator for 5 encounters.
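As a simplified, deterministic sketch of the arithmetic underlying these estimates (the published model is stochastic, which is why interpercentile ranges are reported; the length of stay used here is an assumed value, so outputs will not match the reported figures exactly):

```python
# Deterministic sketch only; los_days is an assumed parameter, not taken
# from the published model, which samples its inputs from distributions.
def n95_needed(admissions, covid_fraction, encounters_per_day=10,
               los_days=7, uses_per_n95=1):
    """Rough monthly N95 demand for one acute-care hospital."""
    covid_patients = admissions * covid_fraction
    encounters = covid_patients * encounters_per_day * los_days
    return encounters / uses_per_n95

print(n95_needed(400, 0.005))                # low COVID-19 prevalence, single use
print(n95_needed(400, 1.0, uses_per_n95=5))  # all-COVID admissions, 5 uses per N95
```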
We quantified the number of N95 respirators needed for a given acute-care hospital and nationally during the COVID-19 pandemic under varying conditions.
Insomnia is a common, distressing, and impairing psychological outcome experienced by informal caregivers (ICs) of patients with cancer. Cognitive behavioral therapy for insomnia (CBT-I) and acupuncture both have known benefits for patients with cancer, but such benefits have yet to be evaluated among ICs. The purpose of the present study was to evaluate the feasibility, acceptability and preliminary effects of CBT-I and acupuncture among ICs with moderate or greater levels of insomnia.
Participants were randomized to eight sessions of CBT-I or ten sessions of acupuncture.
Results highlighted the challenges of identifying interested and eligible ICs, as well as the impact of perceptions of the intervention on retention and, likely, ultimately on outcomes.
Significance of the results
Findings suggest preliminary support for non-pharmacological interventions to treat insomnia in ICs and emphasize the importance of matching treatment modality to the preferences and needs of ICs.
An inflammation-induced imbalance in the kynurenine pathway (KP) has been reported in major depressive disorder but the utility of these metabolites as predictive or therapeutic biomarkers of behavioral activation (BA) therapy is unknown.
Serum samples were provided by 56 depressed individuals before BA therapy and 29 of these individuals also provided samples after 10 weeks of therapy to measure cytokines and KP metabolites. The PROMIS Depression Scale (PROMIS-D) and the Sheehan Disability Scale were administered weekly and the Beck depression inventory was administered pre- and post-therapy. Data were analyzed with linear mixed-effect, general linear, and logistic regression models. The primary outcome for the biomarker analyses was the ratio of kynurenic acid to quinolinic acid (KynA/QA).
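As a minimal sketch of one of the models named above, assuming long-format data with one row per participant-week (variable names are hypothetical, not the authors'):

```python
# Hypothetical sketch: linear mixed-effects model of weekly PROMIS-D scores
# with a random intercept per participant.
import pandas as pd
import statsmodels.formula.api as smf

df = pd.read_csv("ba_weekly_scores.csv")  # columns: subject, week, promis_d

fit = smf.mixedlm("promis_d ~ week", data=df, groups=df["subject"]).fit()
print(fit.summary())  # the fixed effect of week captures average symptom change
```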
BA decreased depression and disability scores (p's < 0.001, Cohen's d's > 0.5). KynA/QA significantly increased at post-therapy relative to baseline (p < 0.001, d = 2.2), an effect driven by a decrease in QA post-therapy (p < 0.001, uncorrected, d = 3.39). A trend towards a decrease in the ratio of kynurenine to tryptophan (KYN/TRP) was also observed (p = 0.054, uncorrected, d = 0.78). The change in KynA/QA was nominally associated with the magnitude of change in PROMIS-D scores (p = 0.074, Cohen's f2 = 0.054). Baseline KynA/QA did not predict response to BA therapy.
The current findings, together with previous research showing that electroconvulsive therapy, escitalopram, and ketamine decrease concentrations of the neurotoxin QA, raise the possibility that a common therapeutic mechanism underlies diverse forms of antidepressant treatment, but future controlled studies are needed to test this hypothesis.
Sense of place describes both affective and cognitive — emotional and intellectual — connections to place. Affective outcomes, tied to arts and humanities education, can facilitate these connections. Yet little research explores the effects of environmental science, arts and humanities (eSAH) curricula on place relationships. Additionally, most research on sense of place focuses on repeated visits to a place over time rather than on short-term experiences such as a field trip. Finally, digital technology is a growing trend across science education, but little research investigates its use in field-based contexts. Our research begins to address these gaps. This article describes an eSAH field trip for middle and high school learners. Using conventional content analysis, we present pilot data from two high school field trips. Our findings illuminate a framework for understanding active and passive place relationships in the context of short-term interdisciplinary field learning experiences.
Conventional longitudinal behavioral genetic models estimate the relative contribution of genetic and environmental factors to stability and change of traits and behaviors. Longitudinal models rarely explain the processes that generate observed differences between genetically and socially related individuals. We propose that exchanges between individuals and their environments (i.e., phenotype–environment effects) can explain the emergence of observed differences over time. Phenotype–environment models, however, would require violation of the independence assumption of standard behavioral genetic models; that is, uncorrelated genetic and environmental factors. We review how specification of phenotype–environment effects contributes to understanding observed changes in genetic variability over time and longitudinal correlations among nonshared environmental factors. We then provide an example using 30 days of positive and negative affect scores from an all-female sample of twins. Results demonstrate that the phenotype–environment effects explain how heritability estimates fluctuate as well as how nonshared environmental factors persist over time. We discuss possible mechanisms underlying change in gene–environment correlation over time, the advantages and challenges of including gene–environment correlation in longitudinal twin models, and recommendations for future research.
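A toy simulation, not the authors' model, can make the core idea concrete: if today's phenotype feeds into tomorrow's nonshared environment, gene-environment correlation accumulates and the apparent heritability drifts across the 30 daily assessments.

```python
# Toy illustration of a phenotype-environment effect in MZ twin data.
# All parameters are invented for demonstration purposes.
import numpy as np

rng = np.random.default_rng(0)
n_pairs, n_days, b = 2000, 30, 0.4   # b: phenotype-to-environment transmission

a = rng.normal(size=n_pairs)          # additive genetic factor, shared by MZ co-twins
e1 = rng.normal(size=n_pairs)         # nonshared environment, twin 1
e2 = rng.normal(size=n_pairs)         # nonshared environment, twin 2
h2 = []
for day in range(n_days):
    p1, p2 = a + e1, a + e2           # phenotypes of the two co-twins
    h2.append(np.var(a) / np.var(p1)) # crude heritability proxy
    # phenotype-environment effect: today's affect shapes tomorrow's environment
    e1 = b * p1 + rng.normal(scale=np.sqrt(1 - b**2), size=n_pairs)
    e2 = b * p2 + rng.normal(scale=np.sqrt(1 - b**2), size=n_pairs)

print(round(h2[0], 2), round(h2[-1], 2))  # heritability proxy drifts over days
```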
Background: Surgical site infections (SSIs) among cardiothoracic (CT) patients are associated with high rates of morbidity and mortality. Data are limited regarding SSI incidence among pediatric patients undergoing primary reparative procedures for congenital cardiac disease, and published evidence on targeted interventions to prevent pediatric CT-surgery SSI is lacking. We aimed to establish standard metrics for measuring CT-surgery SSI incidence and to implement bundled interventions for SSI prevention. Methods: A dedicated CT-surgery SSI prevention workgroup was established, consisting of hospital leadership, CT surgeons, cardiac critical care unit staff, anesthesia, perfusion, environmental services, instrument sterile processing, risk management, infection prevention, and antibiotic stewardship. We created a standard definition for CT-surgery SSI and calculated retrospective SSI rates over a 24-month period (2017–2019). The outcome measured was the incidence of CT-surgery SSI per 100 primary cardiac procedures with delayed (>3 days after primary surgery) or non-delayed chest closure. The difference in proportion of SSI was reported separately for delayed and non-delayed closure; statistical significance was tested using Fisher's exact test. We identified many potential improvement opportunities, including gaps in SSI surveillance, poor compliance with daily bathing, inconsistent perioperative antimicrobial prophylaxis, lack of a controlled environment for bedside chest closures, and lapses in environmental cleaning. These issues informed the enhanced SSI prevention bundle, which included sterility education for operating room (OR) staff. Protocols for care of cardiac patients with delayed chest closures focused on universal daily and preoperative chlorhexidine baths. In addition, the bundle incorporated stringent environmental cleaning interventions, including scheduled decluttering of patient rooms and clinical spaces, terminal cleaning of patient rooms before patients return from the OR, and use of adjunctive ultraviolet light for the daily cleaning of operating rooms and of patient rooms at discharge. Results: A surveillance definition of microbiological growth from a clinical sample obtained within 30 days of a primary cardiac procedure sufficiently captured all CT-surgery SSIs. Of 551 CT-surgery procedures prior to intervention, 91 (17%) had delayed final operative closure. Prior to the intervention, 16 SSIs were identified from July 2017 to May 2019, a rate of 2.90 per 100 procedures; the rate was higher among patients with delayed chest closure at 6.59 per 100 procedures (6 SSIs/91 procedures) versus 2.17 per 100 procedures among those with primary chest closure (10 SSIs/460 procedures; P = 0.034). Gram-positive organisms, including coagulase-negative staphylococci, were most frequently identified as the causative organisms for SSIs. Compliance with the bundled intervention, rolled out over a 2-month period, was associated with an immediate decrease in the number of SSIs for primary and delayed chest closures (6 SSIs in 185 procedures) in the initial quarters (August–December 2019) of the post-intervention period. However, this decrease was not reflected in the overall rate (3.24 per 100 procedures) because fewer procedures were performed. Data collection to measure sustainability is ongoing. Conclusions: Bundled interventions targeting skin antisepsis and environmental cleaning may be associated with a decrease in SSIs among pediatric CT-surgery patients.
Ongoing surveillance is required to determine sustainability of these interventions.
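The reported delayed- versus non-delayed-closure comparison can be checked directly from the counts given above:

```python
# 6 SSIs among 91 delayed-closure procedures vs 10 SSIs among 460
# non-delayed procedures, compared with a two-sided Fisher's exact test.
from scipy.stats import fisher_exact

table = [[6, 91 - 6],
         [10, 460 - 10]]
odds_ratio, p_value = fisher_exact(table)
print(p_value)  # expected near the reported P = 0.034
```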
Background: Clinically diagnosed ventilator-associated pneumonia (VAP) is common in the long-term acute-care hospital (LTACH) setting and may contribute to adverse ventilator-associated events (VAEs). Pseudomonas aeruginosa is a common causative organism of VAP. We evaluated the impact of respiratory P. aeruginosa colonization and bacterial community dominance, both diagnosed and undiagnosed, on subsequent P. aeruginosa VAP and VAE events during long-term acute care. Methods: We enrolled 83 patients on LTACH admission for ventilator weaning, performed longitudinal sampling of endotracheal aspirates followed by 16S rRNA gene sequencing (Illumina HiSeq) and bacterial community profiling (QIIME2). Statistical analysis was performed with R and Stan; mixed-effects models were fit to relate the abundance of respiratory P. aeruginosa on admission to clinically diagnosed VAP and VAE events. Results: Of the 83 patients included, 12 were diagnosed with P. aeruginosa pneumonia during the 14 days prior to LTACH admission (known P. aeruginosa), and 22 additional patients received anti–P. aeruginosa antibiotics within 48 hours of admission (suspected P. aeruginosa); 49 patients had no known or suspected P. aeruginosa (unknown P. aeruginosa). Among the known P. aeruginosa group, all 12 patients had P. aeruginosa detectable by 16S sequencing, with elevated admission P. aeruginosa proportional abundance (median, 0.97; IQR, 0.33–1). Among the suspected P. aeruginosa group, all 22 patients had P. aeruginosa detectable by 16S sequencing, with a wide range of admission P. aeruginosa proportional abundance (median, 0.0088; IQR, 0.00012–0.31). Of the 49 patients in the unknown group, 47 also had detectable respiratory P. aeruginosa, and many had high P. aeruginosa proportional abundance at admission (median, 0.014; IQR, 0.00025–0.52). Incident P. aeruginosa VAP was observed within 30 days in 4 of the known P. aeruginosa patients (33.3%), 5 of the suspected P. aeruginosa patients (22.7%), and 8 of the unknown P. aeruginosa patients (16.3%). VAE was observed within 30 days in 1 of the known P. aeruginosa patients (8.3%), 2 of the suspected P. aeruginosa patients (9.1%), and 1 of the unknown P. aeruginosa patients (2%). Admission P. aeruginosa abundance was positively associated with VAP and VAE risk in all groups, but the association only achieved statistical significance in the unknown group (type S error <0.002 for 30-day VAP and <0.011 for 30-day VAE). Conclusions: We identified a high prevalence of unrecognized respiratory P. aeruginosa colonization among patients admitted to LTACH for weaning from mechanical ventilation. The admission P. aeruginosa proportional abundance was strongly associated with increased risk of incident P. aeruginosa VAP among these patients.
Background: Contaminated surfaces within patient rooms and on shared equipment are a major driver of healthcare-acquired infections (HAIs). The emergence in the New York City metropolitan area of Candida auris, a multidrug-resistant fungus with extended environmental viability, has made a standardized assessment of cleaning protocols even more urgent for our multihospital academic health system. We therefore sought to create an environmental surveillance protocol to detect C. auris and to assess patient room contamination after discharge cleaning by different chemicals and methods, including touch-free application using an electrostatic sprayer. Surfaces disinfected using touch-free methods may not appear disinfected when assessed by fluorescent tracer dye or ATP bioluminescent assay. Methods: We focused on surfaces within the patient zone that are touched by the patient or healthcare personnel prior to contact with the patient. Our protocol sampled the over-bed table, call button, oxygen meter, privacy curtain, and bed frame using nylon-flocked swabs dipped in nonbacteriostatic sterile saline. We swabbed a 36-cm2 surface area on each sample location shortly after the room was disinfected, immediately inoculated the swab on a blood agar 5% TSA plate, and then incubated the plate for 24 hours at 36°C. Contamination with common environmental bacteria was calculated as CFU per plate over the swabbed surface area, and a cutoff of 2.5 CFU/cm2 was used to determine whether a surface passed inspection. Limited data exist on acceptable microbial limits for healthcare settings, but the aforementioned cutoff has been used in food preparation. Results: Over a year-long period, terminal cleaning had an overall failure rate of 6.5% across 413 swabbed surfaces. We used the protocol to compare the normal application of either peracetic acid/hydrogen peroxide or bleach using microfiber cloths to a new method using sodium dichloroisocyanurate (NaDCC) applied with microfiber cloths and electrostatic sprayers. The normal protocol had a failure rate of 9%, and NaDCC had a failure rate of 2.5%. The oxygen meter had the highest normal-method failure rate (18.2%), whereas the curtain had the highest NaDCC-method failure rate (11%). In addition, we swabbed 7 rooms previously occupied by C. auris–colonized patients for C. auris contamination of environmental surfaces, including the mobile medical equipment of the 4 patient care units that contained these rooms. We did not find any C. auris, and data collection continues. Conclusions: A systematic environmental surveillance system is critical for healthcare systems to assess touch-free disinfection and identify multidrug-resistant organism (MDRO) contamination of surfaces.
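The pass/fail rule described above reduces to a one-line check: with a 36-cm2 swab area and a 2.5 CFU/cm2 cutoff, any plate with more than 90 CFU fails.

```python
SWAB_AREA_CM2 = 36
CUTOFF_CFU_PER_CM2 = 2.5

def surface_passes(cfu_on_plate: float) -> bool:
    """True if contamination (CFU per cm2 of swabbed surface) is at or below cutoff."""
    return cfu_on_plate / SWAB_AREA_CM2 <= CUTOFF_CFU_PER_CM2

print(surface_passes(90), surface_passes(91))  # True False
```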
Background: Central-line–associated bloodstream infection (CLABSI) rates have steadily decreased as evidence-based prevention bundles were implemented. Bone marrow transplant (BMT) patients are at increased risk for CLABSI due to immunosuppression, prolonged central-line utilization, and frequent central-line accesses. We assessed the impact of an enhanced prevention bundle on BMT nonmucosal barrier injury CLABSI rates. Methods: The University of Iowa Hospitals & Clinics is an 811-bed academic medical center that houses the only BMT program in Iowa. During October 2018, we added 3 interventions to the ongoing CLABSI prevention bundle in our BMT inpatient unit: (1) a standardized 2-person dressing change team, (2) enhanced-quality daily chlorhexidine treatments, and (3) staff and patient line-care stewardship. The bundle included training of nurse champions to execute a team approach to changing central-line dressings. A standard process description and supplies are contained in a cart. In addition, 2 sets of sterile hands and a second person to monitor for breaches in sterile procedure are available. Site disinfection with chlorhexidine scrub and dry time are monitored. Training on quality chlorhexidine bathing includes evaluation of the preferred product, application per the product instructions for use, and protection of the central-line site with a waterproof shoulder-length glove. In addition to routine BMT education, staff and patients are instructed on device stewardship during dressing changes. CLABSIs are monitored using NHSN definitions. We performed an interrupted time-series analysis to determine the impact of our enhanced prevention bundle on CLABSI rates in the BMT unit. We used monthly CLABSI rates from January 2017 until the intervention (October 2018) as the baseline. Because the BMT unit changed locations in December 2018, we included both time points in our analysis. For a sensitivity analysis, we assessed the impact of the enhanced prevention bundle in a hematology-oncology unit (March 2019) that did not change locations. Results: During the period preceding bundle implementation, the CLABSI rate was 2.2 per 1,000 central-line days. After the intervention, the rate decreased to 0.6 CLABSI per 1,000 central-line days (P = .03). The move in unit location did not have a significant impact on CLABSI rates (P = .85). CLABSI rates also decreased from 1.6 per 1,000 central-line days to 0 per 1,000 central-line days (P < .01) in the hematology-oncology unit. Conclusions: An enhanced CLABSI prevention bundle was associated with significant decreases in CLABSI rates in 2 high-risk units. Novel infection prevention bundle elements should be considered for special populations when all other evidence-based recommendations have been implemented.
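As a hedged sketch of the interrupted time-series analysis (the published model may differ in form and covariates; the file and column names here are hypothetical), a segmented regression on monthly rates might look like:

```python
# Hypothetical segmented-regression sketch for an interrupted time series.
# month_index: months since Jan 2017; post: 1 after Oct 2018; moved: 1 after
# the Dec 2018 unit relocation. A Poisson model on counts with a
# central-line-day offset would be a reasonable alternative.
import pandas as pd
import statsmodels.formula.api as smf

its = pd.read_csv("bmt_monthly_clabsi.csv")

fit = smf.ols("rate ~ month_index + post + post:month_index + moved",
              data=its).fit()
print(fit.summary())  # 'post' estimates the level change at the intervention
```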
Background: Healthcare exposure results in significant microbiome disruption, particularly in the setting of critical illness, which may contribute to risk for healthcare-associated infections (HAIs). Patients admitted to long-term acute-care hospitals (LTACHs) have extensive prior healthcare exposure and critical illness; significant microbiome disruption has been previously documented among LTACH patients. We compared the predictive value of 3 respiratory tract microbiome disruption indices (MDIs)—bacterial community diversity, dominance, and absolute abundance—as they relate to risk for ventilator-associated pneumonia (VAP) and adverse ventilator-associated events (VAE), which commonly complicate LTACH care. Methods: We enrolled 83 subjects on admission to an academic LTACH for ventilator weaning and performed longitudinal sampling of endotracheal aspirates, followed by 16S rRNA gene sequencing (Illumina HiSeq), bacterial community profiling (QIIME2) for diversity, and 16S rRNA quantitative PCR (qPCR) for total bacterial abundance. Statistical analyses were performed with R and Stan software. Mixed-effects models were fit to relate the admission MDIs to subsequent clinically diagnosed VAP and VAE. Results: Of the 83 patients, 19 had been diagnosed with pneumonia during the 14 days prior to LTACH admission (ie, “recent past VAP”); 23 additional patients were receiving antibiotics consistent with empiric VAP therapy within 48 hours of admission (ie, “empiric VAP therapy”); and 41 patients had no evidence of VAP at admission (ie, “no suspected VAP”). We detected no statistically significant differences in admission Shannon diversity, maximum amplicon sequence variant (ASV)–level proportional abundance, or 16S qPCR across the variables of interest. In isolation, all 3 admission MDIs showed poor predictive performance, though Shannon diversity performed better than maximum ASV abundance. Predictive models that combined (1) bacterial diversity or abundance with (2) recent prior VAP diagnosis and (3) concurrent antibiotic exposure best predicted 14-day VAP (type S error < 0.05) and 30-day VAP (type S error < 0.003). In this cohort, VAE risk was paradoxically associated with higher admission Shannon diversity and lower admission maximum ASV abundance. Conclusions: In isolation, respiratory tract MDIs obtained at LTACH admission showed poor predictive performance for subsequent VAP and VAE. But diversity and abundance models incorporating recent VAP history and admission antibiotic exposure performed well in predicting 14-day and 30-day VAP.
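For illustration, two of the three indices can be computed from a vector of ASV read counts for a single endotracheal-aspirate sample (the counts below are invented; the third index, absolute abundance, comes from qPCR rather than from proportions):

```python
import numpy as np

def shannon_diversity(counts):
    """Shannon index from raw read counts."""
    p = np.asarray(counts, dtype=float)
    p = p[p > 0] / p.sum()
    return -np.sum(p * np.log(p))

def max_asv_abundance(counts):
    """Dominance: proportional abundance of the single most abundant ASV."""
    c = np.asarray(counts, dtype=float)
    return c.max() / c.sum()

sample = [5000, 300, 120, 40, 7]  # hypothetical ASV read counts
print(shannon_diversity(sample), max_asv_abundance(sample))
```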
This paper describes a collaborative approach to professional learning that provided an opportunity for refreshed practices and growth in the capacity to support students with various learning needs across several schools belonging to the Association of Independent Schools in the Australian Capital Territory. An action research approach to professional learning for school staff was facilitated with the participating schools in 2018/2019, centred on the Nationally Consistent Collection of Data on School Students with Disability.
In response to advancing clinical practice guidelines regarding concussion management, service members, like athletes, complete a baseline assessment prior to participating in high-risk activities. While several studies have established test stability in athletes, no investigation to date has examined the stability of baseline assessment scores in military cadets. The objective of this study was to assess the test–retest reliability of a baseline concussion test battery in cadets at U.S. Service Academies.
All cadets participating in the Concussion Assessment, Research, and Education (CARE) Consortium investigation completed a standard baseline battery that included memory, balance, symptom, and neurocognitive assessments. Annual baseline testing was completed during the first 3 years of the study. A two-way mixed-model analysis of variance (intraclass correlation coefficient, ICC(3,1)) and kappa statistics were used to assess the stability of the metrics at 1-year and 2-year test intervals.
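For reference, ICC(3,1) can be computed from an n-subjects by k-sessions score matrix with the standard two-way mixed-model formula; the scores below are hypothetical:

```python
import numpy as np

def icc_3_1(scores):
    """ICC(3,1): (MS_subjects - MS_error) / (MS_subjects + (k - 1) * MS_error)."""
    x = np.asarray(scores, dtype=float)
    n, k = x.shape
    grand = x.mean()
    ss_rows = k * np.sum((x.mean(axis=1) - grand) ** 2)   # between-subject SS
    ss_cols = n * np.sum((x.mean(axis=0) - grand) ** 2)   # between-session SS
    ss_err = np.sum((x - grand) ** 2) - ss_rows - ss_cols
    ms_rows = ss_rows / (n - 1)
    ms_err = ss_err / ((n - 1) * (k - 1))
    return (ms_rows - ms_err) / (ms_rows + (k - 1) * ms_err)

# Hypothetical baseline scores for 4 cadets tested in 2 consecutive years
print(icc_3_1([[25, 23], [30, 28], [18, 22], [27, 26]]))
```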
ICC values for the 1-year test interval ranged from 0.28 to 0.67 and from 0.15 to 0.57 for the 2-year interval. Kappa values ranged from 0.16 to 0.21 for the 1-year interval and from 0.29 to 0.31 for the 2-year test interval. Across all measures, the observed effects were small, ranging from 0.01 to 0.44.
This investigation noted less-than-optimal reliability for the most common concussion baseline assessments. While none of the assessments met or exceeded the accepted clinical threshold, the effect sizes were relatively small, suggesting an overlap in performance from year to year. As such, baseline assessments beyond the initial evaluation in cadets are not essential but could aid concussion diagnosis.
Catheter-associated urinary tract infections (CAUTIs) occur frequently in pediatric inpatients, and they are associated with increased morbidity and cost. Few studies have investigated ambulatory CAUTIs, despite at-risk children utilizing home urinary catheterization. This retrospective cohort and case-control study determined incidence, risk factors, and outcomes of pediatric patients with ambulatory CAUTI.
Broad electronic queries identified potential patients with ambulatory urinary catheters, and direct chart review confirmed catheters and adjudicated whether ambulatory CAUTI occurred. CAUTI definitions included clean intermittent catheterization (CIC). Our matched case-control analysis assessed risk factors.
Five urban, academic medical centers, part of the New York City Clinical Data Research Network.
Potential patients were aged <22 years and were seen between October 2010 and September 2015.
In total, 3,598 eligible patients were identified; 359 of these used ambulatory catheterization (representing 186,616 ambulatory catheter days). Of these, 63 patients (18%) experienced 95 ambulatory CAUTIs. The overall ambulatory CAUTI incidence was 0.51 infections per 1,000 catheter days (1.35 for indwelling catheters and 0.47 for CIC; incidence rate ratio, 2.88). Patients with nonprivate medical insurance (odds ratio, 2.5; 95% confidence interval, 1.1–6.3) were significantly more likely to have ambulatory CAUTIs in bivariate models but not in multivariable models. Also, 45% of ambulatory CAUTIs resulted in hospitalization (median duration, 3 days); 5% resulted in intensive care admission; 47% underwent imaging; and 88% were treated with antibiotics.
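The headline incidence figures follow directly from the counts reported above:

```python
cautis, catheter_days = 95, 186_616
print(round(1000 * cautis / catheter_days, 2))  # 0.51 per 1,000 catheter days

# Incidence rate ratio, indwelling vs CIC: 1.35 / 0.47 gives 2.87 from the
# rounded rates (the abstract reports 2.88, presumably from unrounded rates).
print(round(1.35 / 0.47, 2))
```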
Pediatric ambulatory CAUTIs occur in 18% of patients with catheters; they are associated with morbidity and healthcare utilization. Ambulatory indwelling catheter CAUTI incidence exceeded national inpatient incidence. Future quality improvement research to reduce these harmful infections is warranted.
Few data exist on renal follow-up of neonates after cardiovascular surgery, and there are no guidelines for long-term renal follow-up. Our objectives were to assess renal function follow-up practice after neonatal cardiac surgery, evaluate factors that predict follow-up serum creatinine measurements, including acute kidney injury following surgery, and evaluate the estimated glomerular filtration rate during follow-up using routinely collected laboratory values.
Two-centre retrospective cohort study of children 5–7 years of age with a history of neonatal cardiac surgery. Univariable and multivariable analyses were performed to determine factors associated with post-discharge creatinine measurements. Glomerular filtration rate was estimated for each creatinine measurement using a height-independent equation.
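The abstract does not name the equation; as a sketch, assuming a Pottel-style height-independent estimate of the form eGFR = 107.3 / (SCr / Q), where Q is the age- and sex-specific median serum creatinine from published reference tables:

```python
def egfr_height_independent(scr_mg_dl: float, q_mg_dl: float) -> float:
    """Assumed Pottel-style estimate, in mL/min/1.73 m2.

    q_mg_dl must come from published age/sex reference tables; the value in
    the example below is illustrative only.
    """
    return 107.3 / (scr_mg_dl / q_mg_dl)

print(round(egfr_height_independent(scr_mg_dl=0.5, q_mg_dl=0.45), 1))
```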
Seventeen of 55 children (30%) did not have any creatinine measured between discharge after surgery and the end of study follow-up, which occurred at a median of 6 years after discharge. Of the 38 children who had their kidney function checked, 15 (40%) had all of their creatinine measurements drawn only in the context of a hospitalisation or emergency department visit. Acute kidney injury following surgery did not predict the presence of follow-up creatinine measurements.
A large proportion of neonates undergoing congenital heart repair did not have a follow-up creatinine measured in the first years following surgery. In those that did have a creatinine measured, there did not appear to be any identified pattern of follow-up. A follow-up system for children who are discharged from cardiac surgery is needed to identify children with or at risk of chronic kidney disease.
The coronavirus disease 2019 (COVID-19) pandemic introduced challenges to the use of simulation, including limited personal protective equipment and restricted time and personnel. Our use of video for in situ simulation aimed to circumvent these challenges and assist in the development of a protocol for protected intubation and simultaneously educate emergency department (ED) staff. We video-recorded a COVID-19 respiratory failure in situ simulation event, which was shared by a facilitator both virtually and in the ED. The facilitator led discussions and debriefs. We followed this with in situ run-throughs in which staff walked through the steps of the simulation in the ED, handling medications and equipment and becoming comfortable with use of isolation rooms. This application of in situ simulation allowed one simulation event to reach a wide audience, while allowing participants to respect social distancing, and resulted in the education of this audience and successful crowdsourcing for a protocol amidst a pandemic.