Major depressive disorder (MDD) is the leading cause of disability globally, with moderate heritability and well-established socio-environmental risk factors. Genetic studies have been mostly restricted to European settings, with polygenic scores (PGS) demonstrating low portability across diverse global populations.
Methods
This study examines genetic architecture, polygenic prediction, and socio-environmental correlates of MDD in a family-based sample of 10 032 individuals from Nepal with array genotyping data. We used genome-based restricted maximum likelihood to estimate heritability, applied S-LDXR to estimate the cross-ancestry genetic correlation between Nepalese and European samples, and modeled PGS trained on a GWAS meta-analysis of European and East Asian ancestry samples.
Results
We estimated the narrow-sense heritability of lifetime MDD in Nepal to be 0.26 (95% CI 0.18–0.34, p = 8.5 × 10⁻⁶). Our analysis was underpowered to estimate the cross-ancestry genetic correlation (rg = 0.26, 95% CI −0.29 to 0.81). MDD risk was associated with older age (beta = 0.071, 95% CI 0.06–0.08), female sex (beta = 0.160, 95% CI 0.15–0.17), and childhood exposure to potentially traumatic events (beta = 0.050, 95% CI 0.03–0.07), while neither the depression PGS (beta = 0.004, 95% CI −0.004 to 0.01) nor its interaction with childhood trauma (beta = 0.007, 95% CI −0.01 to 0.03) was strongly associated with MDD.
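To make the shape of the reported association model concrete, the following is a minimal sketch in Python, assuming a flat data set with hypothetical columns (mdd, pgs, trauma, age, female); the actual study used family-based models that additionally account for relatedness, which this sketch does not capture.

```python
# Minimal sketch only: regress lifetime MDD on the depression PGS, childhood
# trauma, their interaction, and age/sex. Column names are hypothetical and the
# family structure of the real sample is ignored here.
import pandas as pd
import statsmodels.formula.api as smf

df = pd.read_csv("phenotypes_with_pgs.csv")  # hypothetical input file

model = smf.logit("mdd ~ pgs * trauma + age + female", data=df).fit()
print(model.summary())   # main effects plus the pgs:trauma interaction term
print(model.conf_int())  # 95% confidence intervals for each coefficient
```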
Conclusions
Estimates of lifetime MDD heritability in this Nepalese sample were similar to previous European ancestry samples, but PGS trained on European data did not predict MDD in this sample. This may be due to differences in ancestry-linked causal variants, differences in depression phenotyping between the training and target data, or setting-specific environmental factors that modulate genetic effects. Additional research among under-represented global populations will ensure equitable translation of genomic findings.
It has been posited that alcohol use may confound the association between greater concussion history and poorer neurobehavioral functioning. However, while greater alcohol use is positively correlated with neurobehavioral difficulties, the association between alcohol use and concussion history is not well understood. Therefore, this study investigated the cross-sectional and longitudinal associations of cumulative concussion history, years of contact sport participation, and health-related/psychological factors with alcohol use in former professional football players across multiple decades.
Participants and Methods:
Former professional American football players completed general health questionnaires in 2001 and 2019, including demographic information, football history, concussion/medical history, and health-related/psychological functioning. Alcohol use frequency and amount were reported for three timepoints: during professional career (collected retrospectively in 2001), 2001, and 2019. For the professional career and 2001 timepoints, alcohol use frequency categories were none, 1-2, 3-4, and 5-7 days/week, while amount categories were none, 1-2, 3-5, 6-7, and 8+ drinks/occasion. For 2019, frequency categories were never, monthly or less, 2-4 times/month, 2-3 times/week, and >4 times/week, while amount categories were none, 1-2, 3-4, 5-6, 7-9, and 10+ drinks/occasion. Scores on a screening measure for Alcohol Use Disorder (CAGE) were also available for the professional career and 2001 timepoints. Concussion history was recorded in 2001 and binned into five groups: 0, 1-2, 3-5, 6-9, and 10+. Depression and pain interference were assessed via PROMIS measures at all timepoints. Sleep disturbance was assessed via a separate instrument in 2001 and with PROMIS Sleep Disturbance in 2019. Spearman’s rho correlations tested associations of concussion history and years of sport participation with alcohol use across timepoints, and whether poor health functioning (depression, pain interference, sleep disturbance) in 2001 and 2019 was associated with alcohol use both within and between timepoints.
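As an illustration of the correlation analysis just described, the sketch below computes a single Spearman's rho in Python; the data file and column names are hypothetical, not the study's actual code.

```python
# Illustrative sketch: Spearman's rho between binned concussion history and
# 2001 alcohol amount/occasion. File and column names are hypothetical.
import pandas as pd
from scipy.stats import spearmanr

df = pd.read_csv("former_player_survey.csv")

rho, p = spearmanr(df["concussion_group"], df["alcohol_amount_2001"],
                   nan_policy="omit")  # drop observations with missing responses
print(f"rho = {rho:.3f}, p = {p:.3f}")
```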
Results:
Among the 351 participants (Mage=47.86 [SD=10.18] in 2001), there were no significant associations between concussion history or years of contact sport participation with CAGE scores or alcohol use frequency/amount during professional career, 2001, or 2019 (rhos = -.072 to .067, ps>.05). In 2001, greater depressive symptomology and sleep disturbance were related to higher CAGE scores (rho=.209, p<.001; rho=.176, p<.001, respectively), while greater depressive symptomology, pain interference, and sleep disturbance were related to higher alcohol use frequency (rho=.176, p=.002; rho=.109, p=.045; rho=.132, p=.013, respectively) and amount/occasion (rho=.215, p<.001; rho=.127, p=.020; rho=.153, p=.004, respectively). In 2019, depressive symptomology, pain interference, and sleep disturbance were not related to alcohol use (rhos = -.047 to .087, ps>.05). Between timepoints, more sleep disturbance in 2001 was associated with higher alcohol amount/occasion in 2019 (rho=.115, p=.036).
Conclusions:
Increased alcohol intake has been theorized to be a consequence of greater concussion history and, as such, is thought to confound associations between concussion history and neurobehavioral function later in life. Our findings indicate that concussion history and years of contact sport participation were not significantly associated with alcohol use cross-sectionally or longitudinally, regardless of how alcohol use was characterized. While higher levels of depression, pain interference, and sleep disturbance in 2001 were related to greater alcohol use in 2001, they were not associated cross-sectionally in 2019. Results support the need to concurrently address health-related and psychological factors in the implementation of alcohol use interventions for former NFL players, particularly earlier in the sport discontinuation timeline.
We recently reported on the radio-frequency attenuation length of cold polar ice at Summit Station, Greenland, based on bi-static radar measurements of radio-frequency bedrock echo strengths taken during the summer of 2021. Those data also allow studies of (a) the relative contributions of coherent (such as discrete internal conducting layers with sub-centimeter transverse scale) vs incoherent (e.g. bulk volumetric) scattering, (b) the magnitude of internal layer reflection coefficients, (c) limits on signal propagation velocity asymmetries (‘birefringence’) and (d) limits on in-ice signal dispersion over a bandwidth of ~100 MHz. We find that (1) attenuation lengths approach 1 km in our band, (2) after averaging 10 000 echo triggers, reflected signals observable over the thermal floor (to depths of ~1500 m) are consistent with being entirely coherent, (3) internal layer reflectivities are approximately −60 to −70 dB, (4) birefringent effects for vertically propagating signals are smaller by an order of magnitude relative to South Pole and (5) within our experimental limits, glacial ice is non-dispersive over the frequency band relevant for neutrino detection experiments.
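The coherence argument above rests on how averaging behaves: a phase-stable echo survives voltage averaging of many triggers, while incoherent contributions average down roughly as the square root of the number of triggers. The toy sketch below, with made-up signal and noise amplitudes, illustrates that behaviour; it is not the analysis used in the measurement.

```python
# Toy illustration of coherent averaging: a weak, phase-stable echo is preserved
# when 10,000 triggers are averaged, while per-trigger noise is suppressed by
# roughly sqrt(N). Amplitudes here are arbitrary.
import numpy as np

rng = np.random.default_rng(0)
n_triggers, n_samples = 10_000, 1024
t = np.arange(n_samples)
echo = 0.05 * np.sin(2 * np.pi * t / 64)                       # weak coherent echo

traces = echo + rng.normal(0.0, 1.0, (n_triggers, n_samples))  # thermal noise per trigger
avg = traces.mean(axis=0)

print(np.std(traces[0] - echo))  # noise RMS in a single trigger (~1.0)
print(np.std(avg - echo))        # residual noise after averaging (~0.01)
```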
The Eighth World Congress of Pediatric Cardiology and Cardiac Surgery (WCPCCS) will be held in Washington DC, USA, from Saturday, 26 August, 2023 to Friday, 1 September, 2023, inclusive. The Eighth World Congress of Pediatric Cardiology and Cardiac Surgery will be the largest and most comprehensive scientific meeting dedicated to paediatric and congenital cardiac care ever held. At the time of the writing of this manuscript, The Eighth World Congress of Pediatric Cardiology and Cardiac Surgery has 5,037 registered attendees (and rising) from 117 countries, a truly diverse and international faculty of over 925 individuals from 89 countries, over 2,000 individual abstracts and poster presenters from 101 countries, and a Best Abstract Competition featuring 153 oral abstracts from 34 countries. For information about the Eighth World Congress of Pediatric Cardiology and Cardiac Surgery, please visit the following website: www.WCPCCS2023.org. The purpose of this manuscript is to review the activities related to global health and advocacy that will occur at the Eighth World Congress of Pediatric Cardiology and Cardiac Surgery.
Acknowledging the need for urgent change, we wanted to take the opportunity to bring a common voice to the global community and issue the Washington DC WCPCCS Call to Action on Addressing the Global Burden of Pediatric and Congenital Heart Diseases. A copy of this Washington DC WCPCCS Call to Action is provided in the Appendix of this manuscript. This Washington DC WCPCCS Call to Action is an initiative aimed at increasing awareness of the global burden, promoting the development of sustainable care systems, and improving access to high quality and equitable healthcare for children with heart disease as well as adults with congenital heart disease worldwide.
The U.S. Department of Agriculture–Agricultural Research Service (USDA-ARS) has been a leader in weed science research covering topics ranging from the development and use of integrated weed management (IWM) tactics to basic mechanistic studies, including biotic resistance of desirable plant communities and herbicide resistance. ARS weed scientists have worked in agricultural and natural ecosystems, including agronomic and horticultural crops, pastures, forests, wild lands, aquatic habitats, wetlands, and riparian areas. Through strong partnerships with academia, state agencies, private industry, and numerous federal programs, ARS weed scientists have made contributions to discoveries in the newest fields of robotics and genetics, as well as the traditional and fundamental subjects of weed–crop competition and physiology and integration of weed control tactics and practices. Weed science at ARS is often overshadowed by other research topics; thus, few are aware of the long history of ARS weed science and its important contributions. This review is the result of a symposium held at the Weed Science Society of America’s 62nd Annual Meeting in 2022 that included 10 separate presentations in a virtual Weed Science Webinar Series. The overarching themes of management tactics (IWM, biological control, and automation), basic mechanisms (competition, invasive plant genetics, and herbicide resistance), and ecosystem impacts (invasive plant spread, climate change, conservation, and restoration) represent core ARS weed science research that is dynamic and efficacious and has been a significant component of the agency’s national and international efforts. This review highlights current studies and future directions that exemplify the science and collaborative relationships both within and outside ARS. Given the constraints of weeds and invasive plants on all aspects of food, feed, and fiber systems, there is an acknowledged need to face new challenges, including agriculture and natural resources sustainability, economic resilience and reliability, and societal health and well-being.
Background: Sex differences in treatment response to intravenous thrombolysis (IVT) are poorly characterized. We compared sex-disaggregated outcomes in patients receiving IVT for acute ischemic stroke in the Alteplase compared to Tenecteplase (AcT) trial, a Canadian multicentre, randomised trial. Methods: In this post-hoc analysis, the primary outcome was excellent functional outcome (modified Rankin Scale [mRS] score 0-1) at 90 days. Secondary and safety outcomes included return to baseline function, successful reperfusion (eTICI≥2b), death and symptomatic intracerebral hemorrhage. Results: Of 1577 patients, there were 755 women and 822 men (median age 77 [68-86] and 70 [59-79] years, respectively). There were no differences in rates of mRS 0-1 (aRR 0.95 [0.86-1.06]), return to baseline function (aRR 0.94 [0.84-1.06]), reperfusion (aRR 0.98 [0.80-1.19]) and death (aRR 0.91 [0.79-1.18]). There was no effect modification by treatment type on the association between sex and outcomes. The probability of excellent functional outcome decreased with increasing onset-to-needle time. This relation did not vary by sex (p for interaction = 0.42). Conclusions: The AcT trial demonstrated comparable functional, safety and angiographic outcomes by sex. This effect did not differ between alteplase and tenecteplase. The pragmatic enrolment and broad national participation in AcT provide reassurance that there do not appear to be sex differences in outcomes amongst Canadians receiving IVT.
In Paper I, we presented an overview of the Southern-sky MWA Rapid Two-metre (SMART) survey, including the survey design and search pipeline. While the combination of MWA’s large field-of-view and the voltage capture system brings a survey speed of ${\sim} 450\, {\textrm{deg}}^{2}\,\textrm{h}^{-1}$, the progression of the survey relies on the availability of the compact configuration of the Phase II array. Over the past few years, by taking advantage of multiple windows of opportunity when the compact configuration was available, we have advanced the survey to 75% of the planned sky coverage. To date, about 10% of the data collected have been processed for a first-pass search, where 10 min of each observation is processed for dispersion measures out to 250 ${\textrm{pc cm}}^{-3}$, to realise a shallow survey that is largely sensitive to long-period pulsars. The ongoing analysis has led to two new pulsar discoveries, as well as an independent discovery and a rediscovery of a previously incorrectly characterised pulsar, all from ${\sim} 3\% $ of the data for which candidate scrutiny is completed. In this sequel to Paper I, we describe the strategies for further detailed follow-up including improved sky localisation and convergence to a timing solution, and illustrate them using example pulsar discoveries. The processing has also led to re-detection of 120 pulsars in the SMART observing band, bringing the total number of pulsars detected to date with the MWA to 180, and these are used to assess the search sensitivity of current processing pipelines. The planned second-pass (deep survey) processing is expected to yield a three-fold increase in sensitivity for long-period pulsars, and a substantial improvement to millisecond pulsars by adopting optimal de-dispersion plans. The SMART survey will complement the highly successful Parkes High Time Resolution Universe survey at 1.2–1.5 GHz, and inform future large survey efforts such as those planned with the low-frequency Square Kilometre Array (SKA-Low).
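To give a sense of why searching out to DM = 250 pc cm⁻³ at these frequencies is computationally demanding, the sketch below evaluates the standard cold-plasma dispersion delay across the survey band; the 140–170 MHz band edges are taken from the companion overview and the calculation is purely illustrative.

```python
# Illustrative calculation of dispersive smearing across the 140-170 MHz band
# for DM = 250 pc cm^-3 (the first-pass search limit quoted above), using the
# standard dispersion constant ~4.1488e3 s MHz^2 pc^-1 cm^3.
K_DM = 4.1488e3  # s MHz^2 pc^-1 cm^3

def band_delay_s(dm, f_lo_mhz, f_hi_mhz):
    """Delay of the lowest frequency relative to the highest, in seconds."""
    return K_DM * dm * (f_lo_mhz**-2 - f_hi_mhz**-2)

print(band_delay_s(250.0, 140.0, 170.0))  # ~17 s of smearing without de-dispersion
```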
We present an overview of the Southern-sky MWA Rapid Two-metre (SMART) pulsar survey that exploits the Murchison Widefield Array’s large field of view and voltage-capture system to survey the sky south of 30$^{\circ}$ in declination for pulsars and fast transients in the 140–170 MHz band. The survey is enabled by the advent of the Phase II MWA’s compact configuration, which greatly reduces beam-forming and processing costs, thereby making an all-sky survey of this magnitude tractable with the MWA. Even with the long dwell times employed for the survey (4800 s), data collection can be completed in $<$100 h of telescope time, while still retaining the ability to reach a limiting sensitivity of $\sim$2–3 mJy (at 150 MHz, near zenith), which is effectively 3–5 times deeper than the previous-generation low-frequency southern-sky pulsar survey, completed in the 1990s. Each observation is processed to generate $\sim$5000–8000 tied-array beams that tessellate the full $\sim 610\, {\textrm{deg}^{2}}$ field of view (at 155 MHz), which are then processed to search for pulsars. The voltage-capture recording of the survey also allows a multitude of post hoc processing options including the reprocessing of data for higher time resolution and even exploring image-based techniques for pulsar candidate identification. Due to the substantial computational cost of pulsar searches at low frequencies, the survey data processing is undertaken in multiple passes: in the first pass, a shallow survey is performed, where 10 min of each observation is processed, reaching about one-third of the full-search sensitivity. Here we present the system overview including details of ongoing processing and initial results. Further details including first pulsar discoveries and a census of low-frequency detections are presented in a companion paper. Future plans include deeper searches to reach the full sensitivity and acceleration searches to target binary and millisecond pulsars. Our simulation analysis forecasts $\sim$300 new pulsars upon the completion of full processing. The SMART survey will also generate a complete digital record of the low-frequency sky, which will serve as a valuable reference for future pulsar searches planned with the low-frequency Square Kilometre Array.
Recent research has shown that risk and reward are positively correlated in many environments, and that people have internalized this association as a “risk-reward heuristic”: when making choices based on incomplete information, people infer probabilities from payoffs and vice-versa, and these inferences shape their decisions. We extend this work by examining people’s expectations about another fundamental trade-off — that between monetary reward and delay. In 2 experiments (total N = 670), we adapted a paradigm previously used to demonstrate the risk-reward heuristic. We presented participants with intertemporal choice tasks in which either the delayed reward or the length of the delay was obscured. Participants inferred larger rewards for longer stated delays, and longer delays for larger stated rewards; these inferences also predicted people’s willingness to take the delayed option. In exploratory analyses, we found that older participants inferred longer delays and smaller rewards than did younger ones. All of these results replicated in 2 large-scale pre-registered studies with participants from a different population (total N = 2138). Our results suggest that people expect intertemporal choice tasks to offer a trade-off between delay and reward, and differ in their expectations about this trade-off. This “delay-reward heuristic” offers a new perspective on existing models of intertemporal choice and provides new insights into unexplained and systematic individual differences in the willingness to delay gratification.
We report the detection of NH3(3,3) maser emission, together with 25 GHz and 278.3 GHz Class I CH3OH maser emission, associated with the outflow of the Extended Green Object G19.01–0.03 in sub-arcsecond resolution Atacama Large Millimeter/submillimeter Array (ALMA) and Karl G. Jansky Very Large Array (VLA) observations. For masers associated with the outer outflow lobes (>12.5″ from the central massive young stellar object; MYSO), the spatial distribution of the NH3(3,3) masers is statistically indistinguishable from that of previously known 44 GHz Class I CH3OH masers, strengthening the connection of NH3(3,3) masers to outflow shocks. In sub-arcsecond resolution VLA observations, we resolve the 6.7 GHz Class II CH3OH maser emission towards the MYSO into a partial, inclined ring, with a velocity gradient consistent with the rotationally supported circumstellar disc traced by thermal gas emission.
The purpose of this investigation was to expand upon the limited existing research examining the test–retest reliability, cross-sectional validity and longitudinal validity of a sample of bioelectrical impedance analysis (BIA) devices as compared with a laboratory four-compartment (4C) model. Seventy-three healthy participants aged 19–50 years were assessed by each of fifteen BIA devices, with resulting body fat percentage estimates compared with a 4C model utilising air displacement plethysmography, dual-energy X-ray absorptiometry and bioimpedance spectroscopy. A subset of thirty-seven participants returned for a second visit 12–16 weeks later and were included in an analysis of longitudinal validity. The sample of devices included fourteen consumer-grade and one research-grade model in a variety of configurations: hand-to-hand, foot-to-foot and bilateral hand-to-foot (octapolar). BIA devices demonstrated high reliability, with precision error ranging from 0·0 to 0·49 %. Cross-sectional validity varied, with constant error relative to the 4C model ranging from −3·5 (sd 4·1) % to 11·7 (sd 4·7) %, standard error of the estimate values of 3·1–7·5 % and Lin’s concordance correlation coefficients (CCC) of 0·48–0·94. For longitudinal validity, constant error ranged from −0·4 (sd 2·1) % to 1·3 (sd 2·7) %, with standard error of the estimate values of 1·7–2·6 % and Lin’s CCC of 0·37–0·78. While performance varied widely across the sample investigated, select models of BIA devices (particularly octapolar and select foot-to-foot devices) may hold potential utility for the tracking of body composition over time, particularly in contexts in which the purchase or use of a research-grade device is infeasible.
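For concreteness, the sketch below computes the three validity metrics named above (constant error, standard error of the estimate, and Lin's CCC) for one device against the 4C criterion, assuming two arrays of body-fat percentages; it illustrates the metrics themselves and is not the study's analysis code.

```python
# Illustrative computation of constant error, SEE, and Lin's CCC for one BIA
# device versus a 4C-model criterion. Inputs are hypothetical body-fat % arrays.
import numpy as np

def validity_metrics(bia, criterion):
    bia, criterion = np.asarray(bia, float), np.asarray(criterion, float)
    ce = bia.mean() - criterion.mean()                    # constant error (%)
    slope, intercept = np.polyfit(bia, criterion, 1)      # simple linear fit
    resid = criterion - (slope * bia + intercept)
    see = np.sqrt(np.sum(resid**2) / (len(bia) - 2))      # standard error of the estimate
    sxy = np.cov(bia, criterion, ddof=1)[0, 1]
    ccc = 2 * sxy / (bia.var(ddof=1) + criterion.var(ddof=1) + ce**2)  # Lin's CCC
    return ce, see, ccc
```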
We developed an agent-based model using a trial emulation approach to quantify effect measure modification of spillover effects of pre-exposure prophylaxis (PrEP) for HIV among men who have sex with men (MSM) in the Atlanta-Sandy Springs-Roswell metropolitan area, Georgia. PrEP may affect not only the individual to whom it is prescribed, but also their partners and beyond; this is known as spillover. We simulated a two-stage randomised trial with eligible components (≥3 agents with ≥1 HIV+ agent) first randomised to intervention or control (no PrEP). Within intervention components, agents were randomised to PrEP with coverage of 70%, providing insight into a high PrEP coverage strategy. We evaluated effect modification by component-level characteristics and estimated spillover effects on HIV incidence using an extension of randomisation-based estimators. We observed an attenuation of the spillover effect when agents were in components with a higher prevalence of either drug use or bridging potential (if an agent acts as a mediator between ≥2 connected groups of agents). The estimated spillover effects were larger in magnitude among components with either higher HIV prevalence or greater density (number of existing partnerships compared to all possible partnerships). Consideration of effect modification is important when evaluating the spillover of PrEP among MSM.
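The two-stage randomisation described above can be sketched as follows; the component membership, the 50/50 component-level allocation, and all names are illustrative assumptions, and the agent-based HIV transmission model itself is not shown.

```python
# Minimal sketch of a two-stage randomisation: components to intervention or
# control (50/50 assumed here), then agents within intervention components to
# PrEP at 70% coverage.
import random

def two_stage_randomise(components, coverage=0.70, seed=0):
    """components: list of lists of agent ids; returns a dict of agent id -> arm."""
    rng = random.Random(seed)
    assignment = {}
    for comp in components:
        if rng.random() < 0.5:                        # stage 1: component-level arm
            n_prep = round(coverage * len(comp))      # stage 2: agent-level PrEP
            on_prep = set(rng.sample(comp, n_prep))
            for agent in comp:
                assignment[agent] = "prep" if agent in on_prep else "intervention_no_prep"
        else:
            for agent in comp:
                assignment[agent] = "control"
    return assignment

print(two_stage_randomise([[1, 2, 3, 4], [5, 6, 7]]))
```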
Non-archosaur archosauromorphs are a paraphyletic group of diapsid reptiles that were important members of global Middle and Late Triassic continental ecosystems. Included in this group are the azendohsaurids, a clade of allokotosaurians (kuehneosaurids and Azendohsauridae + Trilophosauridae) that retain the plesiomorphic archosauromorph postcranial body plan but evolved disparate cranial features that converge on later dinosaurian anatomy, including sauropodomorph-like marginal dentition and ceratopsian-like postorbital horns. Here we describe a new malerisaurine azendohsaurid from two monodominant bonebeds in the Blue Mesa Member, Chinle Formation (Late Triassic, ca. 218–220 Ma); the first occurs at Petrified Forest National Park and preserves a minimum of eight individuals of varying sizes, and the second occurs near St. Johns, Arizona. Puercosuchus traverorum n. gen. n. sp. is a carnivorous malerisaurine that is closely related to Malerisaurus robinsonae from the Maleri Formation of India and to Malerisaurus langstoni from the Dockum Group of western Texas. Dentigerous elements from Puercosuchus traverorum n. gen. n. sp. confirm that some Late Triassic tooth morphotypes thought to represent early dinosaurs cannot be differentiated from, and likely pertain to, Puercosuchus-like malerisaurine taxa. These bonebeds from northern Arizona support the hypothesis that non-archosauriform archosauromorphs were locally diverse near the middle Norian and experienced an extinction event prior to the end-Triassic mass extinction, coincident with the Adamanian-Revueltian boundary recognized at Petrified Forest National Park. The relatively late age of this early-diverging taxon (Norian) suggests that the diversity of azendohsaurids is underrepresented in Middle and Late Triassic fossil records around the world.
Placement of fertilizer in the seed furrow to increase nutrient availability is a common practice in row-crop production. While in-furrow application of fertilizer is widely utilized in the production of winter wheat (Triticum aestivum L.), there is a lack of work evaluating the new formulations and nutrient combinations that are available. The objective of this study was to quantify the effects of in-furrow fertilizer products, and combinations of products, on winter wheat grain yield and nitrogen and mineral concentrations. Trials were conducted across five site-years in central Oklahoma using 11 fertilizer formulations placed in-furrow at the time of planting. In locations where soil test phosphorus (STP) or potassium levels were above sufficiency, the use of in-furrow fertilizers did not improve yield over the control. Inconsistency of response was noted at locations where STP levels were below the critical threshold. While one location showed no response to the addition of P regardless of source, two other locations had significant yield responses to three or more P-containing fertilizers. The addition of both sulphur and zinc resulted in increased yield over the base product at one low-STP location. Nutrient concentrations were also influenced in nutrient-limited soils; however, no trends in response were present. Based upon the results of this study, the application of in-furrow fertilizer has the potential to increase winter wheat grain yield and nutrient concentration when soil nutrients are limiting. As expected, the addition of fertilizer when soil test levels were at or above sufficiency did not increase grain yield.
Objective:
To describe the genomic analysis and epidemiologic response related to a slow and prolonged methicillin-resistant Staphylococcus aureus (MRSA) outbreak.
Design:
Prospective observational study.
Setting:
Neonatal intensive care unit (NICU).
Methods:
We conducted an epidemiologic investigation of a NICU MRSA outbreak involving serial baby and staff screening to identify opportunities for decolonization. Whole-genome sequencing was performed on MRSA isolates.
Results:
A NICU with excellent hand hygiene compliance and longstanding minimal healthcare-associated infections experienced an MRSA outbreak involving 15 babies and 6 healthcare personnel (HCP). In total, 12 cases occurred slowly over a 1-year period (mean, 30.7 days apart) followed by 3 additional cases 7 months later. Multiple progressive infection prevention interventions were implemented, including contact precautions and cohorting of MRSA-positive babies, hand hygiene observers, enhanced environmental cleaning, screening of babies and staff, and decolonization of carriers. Only decolonization of HCP found to be persistent carriers of MRSA was successful in stopping transmission and ending the outbreak. Genomic analyses identified bidirectional transmission between babies and HCP during the outbreak.
Conclusions:
In comparison to fast outbreaks, “slow and sustained” outbreaks may be more common in units with strong existing infection prevention practices, where a series of breaches must align to result in a case. We identified a slow outbreak that persisted among staff and babies and was stopped only by identifying and decolonizing persistent MRSA carriage among staff. A repeated decolonization regimen was successful in allowing previously persistent carriers to safely continue work duties.
Over the last 25 years, radiowave detection of neutrino-generated signals, using cold polar ice as the neutrino target, has emerged as perhaps the most promising technique for detection of extragalactic ultra-high energy neutrinos (corresponding to neutrino energies in excess of 0.01 Joules, or $10^{17}$ electron volts). During the summer of 2021 and in tandem with the initial deployment of the Radio Neutrino Observatory in Greenland (RNO-G), we conducted radioglaciological measurements at Summit Station, Greenland to refine our understanding of the ice target. We report the result of one such measurement, the radio-frequency electric field attenuation length $L_\alpha$. We find an approximately linear dependence of $L_\alpha$ on frequency with the best fit of the average field attenuation for the upper 1500 m of ice: $\langle L_\alpha \rangle = ( ( 1154 \pm 121) - ( 0.81 \pm 0.14) \, ( \nu /{\rm MHz}) ) \,{\rm m}$ for frequencies ν ∈ [145 − 350] MHz.
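As a quick numerical illustration of the fit quoted above, the central values imply an average attenuation length of roughly 992 m at 200 MHz; the frequency here is chosen arbitrarily and the quoted uncertainties are ignored.

```python
# Evaluate the central-value fit <L_alpha> = (1154 - 0.81 * nu/MHz) m for the
# upper 1500 m of ice; uncertainties on both coefficients are ignored here.
def mean_attenuation_length_m(freq_mhz):
    return 1154.0 - 0.81 * freq_mhz  # valid over roughly 145-350 MHz

print(mean_attenuation_length_m(200.0))  # ~992 m at 200 MHz
```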
Bloodstream infections (BSIs) are a frequent cause of morbidity in patients with acute myeloid leukemia (AML), due in part to the presence of central venous access devices (CVADs) required to deliver therapy.
Objective:
To determine the differential risk of bacterial BSI during neutropenia by CVAD type in pediatric patients with AML.
Methods:
We performed a secondary analysis in a cohort of 560 pediatric patients (1,828 chemotherapy courses) receiving frontline AML chemotherapy at 17 US centers. The exposure was CVAD type at course start: tunneled externalized catheter (TEC), peripherally inserted central catheter (PICC), or totally implanted catheter (TIC). The primary outcome was course-specific incident bacterial BSI; secondary outcomes included mucosal barrier injury (MBI)-BSI and non-MBI BSI. Poisson regression was used to compute adjusted rate ratios comparing BSI occurrence during neutropenia by line type, controlling for demographic, clinical, and hospital-level characteristics.
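A minimal sketch of the Poisson rate-ratio comparison described above is given below, assuming a course-level data set with hypothetical column names; the actual analysis additionally adjusted for demographic, clinical, and hospital-level characteristics, which are omitted here.

```python
# Illustrative sketch: Poisson regression of course-level BSI counts on CVAD type
# (TEC as reference) with an offset for neutropenic days. Column names are
# hypothetical; covariate adjustment from the actual analysis is omitted.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

df = pd.read_csv("aml_courses.csv")  # hypothetical course-level dataset

model = smf.poisson(
    "bsi_count ~ C(cvad_type, Treatment(reference='TEC'))",
    data=df,
    offset=np.log(df["neutropenic_days"]),
).fit()
print(np.exp(model.params))      # incidence rate ratios for PICC and TIC vs TEC
print(np.exp(model.conf_int()))  # 95% CIs on the rate-ratio scale
```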
Results:
The rate of BSI did not differ by CVAD type: 11 BSIs per 1,000 neutropenic days for TECs, 13.7 for PICCs, and 10.7 for TICs. After adjustment, there was no statistically significant association between CVAD type and BSI: PICC incident rate ratio [IRR] = 1.00 (95% confidence interval [CI], 0.75–1.32) and TIC IRR = 0.83 (95% CI, 0.49–1.41) compared to TEC. When MBI and non-MBI were examined separately, results were similar.
Conclusions:
In this large, multicenter cohort of pediatric AML patients, we found no difference in the rate of BSI during neutropenia by CVAD type. This may be due to a risk-profile for BSI that is unique to AML patients.