Syncope is common among pediatric patients and is rarely pathologic. The mechanisms for symptoms during exercise are less well understood than the resting mechanisms. Additionally, inert gas rebreathing analysis, a non-invasive examination of haemodynamics including cardiac output, has not previously been studied in youth with neurocardiogenic syncope.
Methods:
This was a retrospective (2017–2023), single-center cohort study in pediatric patients ≤ 21 years with prior peri-exertional syncope evaluated with echocardiography and cardiopulmonary exercise testing with inert gas rebreathing analysis performed on the same day. Patients with and without symptoms during or immediately following exercise were noted.
Results:
Of the 101 patients (15.2 ± 2.3 years; 31% male), there were 22 patients with symptoms during exercise testing or recovery. Resting echocardiography stroke volume correlated with resting (r = 0.53, p < 0.0001) and peak stroke volume (r = 0.32, p = 0.009) by inert gas rebreathing and with peak oxygen pulse (r = 0.61, p < 0.0001). Patients with syncopal symptoms peri-exercise had lower left ventricular end-diastolic volume (Z-score −1.2 ± 1.3 vs. −0.36 ± 1.3, p = 0.01) and end-systolic volume (Z-score −1.0 ± 1.4 vs. −0.1 ± 1.1, p = 0.001) by echocardiography, lower percent predicted peak oxygen pulse during exercise (95.5 ± 14.0 vs. 104.6 ± 18.5%, p = 0.04), and slower post-exercise heart rate recovery (31.0 ± 12.7 vs. 37.8 ± 13.2 bpm, p = 0.03).
Discussion:
Among youth with a history of peri-exertional syncope, those who become syncopal with exercise testing have lower left ventricular volumes at rest, decreased peak oxygen pulse, and slower heart rate recovery after exercise than those who remain asymptomatic. Peak oxygen pulse and resting stroke volume on inert gas rebreathing are associated with stroke volume on echocardiogram.
Complications following the Fontan procedure include prolonged pleural drainage and readmission for effusions. To address these complications, a post-Fontan management pathway was implemented with primary goals of reducing chest tube duration/reinsertion rates and decreasing hospital length of stay and readmissions.
Methods:
Fontan patients were identified by retrospective chart review (2017–2019) to obtain baseline data for chest tube duration/reinsertion rates, hospital length of stay, and readmission rates for effusion. A post-Fontan management pathway was implemented (2020–2021) utilising post-operative vasopressin, nasal cannula oxygen until chest tube removal, and a discharge regimen of three-times-daily diuretics, sildenafil, and afterload-reducing medications. Patients were followed to evaluate primary outcomes.
Results:
The pre- and post-pathway groups were similar in single ventricle morphology, demographics, and pre-operative haemodynamics. Forty-three and 36 patients were included in the pre- and post-pathway cohorts, respectively. There were statistically significant reductions in chest tube duration (8 vs. 5 days, p ≤ 0.001), chest tube output on post-operative day 4 (20.4 vs. 9.9 mL/kg/day, p = 0.003), and hospital readmission rates for effusion (13 [30%] vs. 3 [8%], p = 0.02) compared to baseline. There was a reduction in hospital length of stay (11 vs. 9.5 days) that did not reach statistical significance (p = 0.052). When combining average cost savings for the Fontan hospitalisations, readmissions for effusion, and cardiac catheterisations within 6 months of Fontan completion, there was a $325,144 total cost savings for the 36 patients treated after pathway implementation.
Conclusion:
Implementation of a post-Fontan management pathway resulted in significant reductions in chest tube duration, chest tube output, and readmission rates for effusion in the perioperative period.
The widespread significance of tobacco in Mesoamerica is documented in historical and ethnographic sources, yet recovery of the organic remains of this plant from archaeological contexts is rare. Here, the authors present evidence for the ritual use of tobacco at Cotzumalhuapa, Guatemala, during the Late Classic period (AD 650–950). Detection of nicotine in residue analysis of three cylindrical ceramic vases recovered from cache deposits near the El Baúl acropolis suggests that these vessels contained tobacco infusions or other liquid preparations. These results suggest an ancient ritual practice involving tobacco for which there was previously no physical evidence in Mesoamerica.
Helium or neopentane can be used as a surrogate gas fill for deuterium (D2) or deuterium–tritium (DT) in laser-plasma interaction studies. Surrogates are convenient for avoiding flammability hazards or the integration of cryogenics into an experiment. To test the degree of equivalency between deuterium and helium, experiments were conducted in the Pecos target chamber at Sandia National Laboratories. Observables such as laser propagation and signatures of laser-plasma instabilities (LPI) were recorded for multiple laser and target configurations. It was found that some observables can differ significantly despite the apparent similarity of the gases with respect to molecular charge and weight. While the qualitative behaviour of the interaction may well be studied by finding a suitable compromise of laser absorption, electron density, and LPI cross sections, a quantitative investigation of expected values for deuterium fills at high laser intensities is not likely to succeed with surrogate gases.
Obesity is associated with adverse effects on brain health, including increased risk for neurodegenerative diseases. Changes in cerebral metabolism may underlie or precede structural and functional brain changes. While bariatric surgery is known to be effective in inducing weight loss and improving obesity-related medical comorbidities, few studies have examined whether it may be able to improve brain metabolism. In the present study, we examined change in cerebral metabolite concentrations in participants with obesity who underwent bariatric surgery.
Participants and Methods:
Thirty-five patients with obesity (BMI > 35 kg/m²) were recruited from a bariatric surgery candidate nutrition class. They completed single-voxel proton (¹H) magnetic resonance spectroscopy at baseline (pre-surgery) and within one year post-surgery. Spectra were obtained from a large medial frontal brain region. Tissue-corrected absolute concentrations for metabolites including choline-containing compounds (Cho), myo-inositol (mI), N-acetylaspartate (NAA), creatine (Cr), and glutamate and glutamine (Glx) were determined using Osprey. Paired t-tests were used to examine within-subject change in metabolite concentrations, and correlations were used to relate these changes to other health-related outcomes, including weight loss and glycemic control.
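As a rough illustration of the within-subject analysis described above, the sketch below applies a paired t-test and a Pearson correlation to simulated pre-/post-surgery values; the metabolite concentrations, effect sizes, and weight-loss figures are placeholders, not the study's data.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(42)
n = 35  # participants scanned before and after surgery

# Hypothetical choline (Cho) concentrations, pre- and post-surgery (arbitrary units).
cho_pre = rng.normal(loc=1.60, scale=0.20, size=n)
cho_post = cho_pre - rng.normal(loc=0.10, scale=0.15, size=n)

# Paired t-test for within-subject change, with a paired Cohen's d.
diff = cho_post - cho_pre
t_stat, p_val = stats.ttest_rel(cho_post, cho_pre)
cohens_d = diff.mean() / diff.std(ddof=1)

# Relate the change to a hypothetical outcome, e.g. percent total weight loss.
weight_loss_pct = rng.normal(loc=25.0, scale=8.0, size=n)
r, p_corr = stats.pearsonr(cho_pre - cho_post, weight_loss_pct)

print(f"Paired t({n - 1}) = {t_stat:.2f}, p = {p_val:.3g}, d = {cohens_d:.2f}")
print(f"Correlation of Cho reduction with weight loss: r = {r:.2f}, p = {p_corr:.3g}")
```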
Results:
Bariatric surgery was associated with a reduction in cerebral Cho (t(34) = −3.79, p < 0.001, d = −0.64) and mI (t(34) = −2.81, p < 0.01, d = −0.47) concentrations. There were no significant changes in NAA, Glx, or Cr concentrations. Reductions in Cho were associated with greater weight loss (r = 0.40, p < 0.05), and reductions in mI were associated with greater reductions in HbA1c (r = 0.44, p < 0.05).
Conclusions:
Participants who underwent bariatric surgery exhibited reductions in cerebral Cho and mI concentrations, which were associated with improvements in weight loss and glycemic control. Given that elevated levels of Cho and mI have been implicated in neuroinflammation, reduction in these metabolites after bariatric surgery may reflect amelioration of obesity-related neuroinflammatory processes. As such, our results provide evidence that bariatric surgery may improve brain health and metabolism in individuals with obesity.
Turfgrass managers apply nonselective herbicides to control winter annual weeds during dormancy of warm-season turfgrass. Zoysiagrass subcanopies, however, retain green leaves and stems during winter dormancy, especially in warmer climates. The partially green zoysiagrass often deters the use of nonselective herbicides due to variable injury concerns in transition and southern climatic zones. This study evaluated zoysiagrass response to glyphosate and glufosinate applied at four different growing degree day (GDD)-based application timings during postdormancy transition in different locations, including Blacksburg, VA; Starkville, MS; and Virginia Beach, VA, in 2018 and 2019. GDD was calculated using a 5 C base temperature with accumulation beginning January 1 each year, and targeted application timings were 125, 200, 275, and 350 GDD5C. Zoysiagrass injury response to glyphosate and glufosinate was consistent across a broad growing region from northern Mississippi to coastal Virginia, but it varied by application timing. Glyphosate application at 125 and 200 GDD5C can be used safely for weed control during the postdormancy period of zoysiagrass, while glufosinate caused unacceptable turf injury regardless of application timing. Glyphosate and glufosinate exhibited a stepwise increase to maximum injury with increasing targeted GDD5C application timings. Glyphosate applied at 125 or 200 GDD5C did not injure zoysiagrass above a threshold of 30%, whereas glufosinate caused greater than 30% injury for 28 and 29 d when applied at 125 and 200 GDD5C, respectively. Likewise, glyphosate application at 125 or 200 GDD5C did not affect the zoysiagrass green cover area under the progress curve per day, whereas later applications reduced it. Glyphosate and glufosinate caused greater injury to zoysiagrass when applied at greater cumulative heat units and this was attributed to increasing turfgrass green leaf density, because heat unit accumulation is positively correlated with green leaf density. Accumulated heat unit-based application timing will allow practitioners to apply nonselective herbicides with reduced injury concerns.
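For readers who want to reproduce the heat-unit timings above, here is a minimal sketch of GDD5C accumulation. It assumes the simple averaging method (daily mean temperature minus the 5 C base, truncated at zero) summed from January 1; the study does not state which daily GDD formula was used, so treat the exact calculation as an assumption.

```python
def daily_gdd(t_max_c: float, t_min_c: float, base_c: float = 5.0) -> float:
    """Growing degree days for one day, simple averaging method."""
    return max(0.0, (t_max_c + t_min_c) / 2.0 - base_c)

def day_threshold_reached(daily_temps, threshold: float, base_c: float = 5.0):
    """Return the day of year (counting from January 1) when cumulative GDD crosses a target.

    daily_temps: iterable of (t_max_c, t_min_c) pairs ordered from January 1 onward.
    """
    total = 0.0
    for day_of_year, (t_max, t_min) in enumerate(daily_temps, start=1):
        total += daily_gdd(t_max, t_min, base_c)
        if total >= threshold:
            return day_of_year
    return None  # threshold not reached within the supplied record

# Example with made-up temperatures: find when the 200 GDD5C application window opens.
temps = [(8.0, -2.0)] * 60 + [(14.0, 3.0)] * 60  # two cool months, then warming
print(day_threshold_reached(temps, threshold=200.0))
```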
Deep Springs Valley (DSV) is a hydrologically isolated valley between the White and Inyo mountains that is commonly excluded from regional paleohydrology and paleoclimatology. Previous studies showed that uplift of Deep Springs ridge (informal name) by the Deep Springs fault defeated streams crossing DSV and hydrologically isolated the valley sometime after eruption of the Pleistocene Bishop Tuff (0.772 Ma). Here, we present tephrochronology and clast counts that reaffirm interruption of the Pliocene–Pleistocene hydrology and formation of DSV during the Pleistocene. Paleontology and infrared stimulated luminescence (IRSL) dates indicate a freshwater lake inundated Deep Springs Valley from ca. 83–61 ka, or during Late Pleistocene Marine Isotope Stages 5a (MIS 5a; ca. 82 ka peak) and 4 (MIS 4; ca. 71–57 ka). The age of pluvial Deep Springs Lake coincides with pluvial lakes in Owens Valley and Columbus Salt Marsh and documents greater effective precipitation in southwestern North America during MIS 5a and MIS 4. In addition, we hypothesize that Deep Springs Lake was a balanced-fill lake that overflowed into Eureka Valley via the Soldier Pass wind gap during MIS 5a and MIS 4. DSV hydrology has implications for dispersal and endemism of the Deep Springs black toad (Anaxyrus exsul).
The United Nations (UN) established an umbrella of organizations to manage distinct clusters of humanitarian aid. The World Health Organization (WHO) oversees the health cluster, giving it responsibility for global, national, and local medical responses to natural disasters. However, this centralized structure insufficiently engages local players, impeding robust local implementation. The Gorkha earthquake struck Nepal on April 25, 2015, becoming Nepal’s most severe natural disaster since the 1934 Nepal-Bihar earthquake. In coordinated response, 2 organizations, Empower Nepali Girls and International Neurosurgical Children’s Association, used a hybrid approach integrating continuous communication with local recipients. Each organization mobilized its principal resource strengths—material medical supplies or human capital—thereby efficiently deploying resources to maximize the impact of the medical response. In addition to efficient resource use, this approach facilitates dynamic medical responses from highly mobile organizations. Importantly, in addition to future earthquakes in Nepal, this medical response strategy is easily scalable to other natural disaster contexts and other medical relief organizations. Preemptively identifying partner organizations with complementary strengths, continuous engagement with recipient populations, and creating disaster- and region-specific response teams may represent viable variations of the WHO cluster model with greater efficacy in local implementation of treatment in acute disaster scenarios.
The 1967 Outer Space Treaty reserved outer space for ‘peaceful purposes’, yet recent decades have witnessed growing competition and calls for new multilateral rules, including a proposed ban on the deployment of weapons in space. These diplomatic initiatives have stalled in the face of concerted opposition from the United States. To explain this outcome, we characterise US diplomacy as a form of ‘antipreneurship’, a type of strategic norm-focused competition designed to preserve the prevailing normative status quo in the face of entrepreneurial efforts. We substantially refine and extend existing accounts of antipreneurship by theorising three dominant forms of antipreneurial agency – rhetorical, procedural, and behavioural – and describing the mechanisms and scope conditions through which they operate. We then trace the development of US resistance to proposed restraints on space weapons from 2000 to the present. Drawing on hundreds of official documents, we show how successive US administrations have employed a range of interlayered diplomatic strategies and tactics to preserve the permissive international legal framework governing outer space and protect US national security priorities. Our study illustrates the specific techniques and impacts of resistance in a domain of growing strategic importance, with implications for further refining understandings of norm competition in other issue areas.
The Ratio-Bias phenomenon, observed by psychologist Seymour Epstein and colleagues, is a systematic manifestation of irrationality. When offered a choice between two lotteries, individuals consistently choose the lottery with the greater number of potential successes, even when it offers a smaller probability of success. In the current study, we conduct experiments to confirm this phenomenon and to test for the existence of the Bias as distinct from general irrationality. Moreover, we examine the effect of introducing a monetary incentive of varying size (depending on the treatment) on the extent of irrational choices within this framework. We confirm the existence of the Bias. Moreover, we find that the existence of an incentive significantly reduces the extent of irrationality exhibited, and that this effect is roughly linear in response to changes in the size of the incentive within the magnitudes investigated.
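A stylised example of the kind of choice involved may help; the jelly-bean-style numbers below are illustrative only and are not taken from the study's experimental stimuli.

```python
# Two lotteries: B offers more winning tickets but a lower probability of success.
lotteries = {
    "A: 1 winning ticket out of 10": 1 / 10,     # P(win) = 0.10
    "B: 9 winning tickets out of 100": 9 / 100,  # P(win) = 0.09
}

best = max(lotteries, key=lotteries.get)
print(f"Rational choice (higher win probability): {best}")
# The ratio bias predicts that many respondents nevertheless pick lottery B,
# anchoring on the absolute number of potential successes (9 vs. 1).
```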
To characterize antifungal prescribing patterns, including the indication for antifungal use, in hospitalized children across the United States.
Design:
We analyzed antifungal prescribing data from 32 hospitals that participated in the SHARPS Antibiotic Resistance, Prescribing, and Efficacy among Children (SHARPEC) study, a cross-sectional point-prevalence survey conducted between June 2016 and December 2017.
Methods:
Inpatients aged <18 years with an active systemic antifungal order were included in the analysis. We classified antifungal prescribing by indication (ie, prophylaxis, empiric, targeted), and we compared the proportion of patients in each category based on patient and antifungal characteristics.
Results:
Among 34,927 surveyed patients, 2,095 (6%) received at least 1 systemic antifungal and there were 2,207 antifungal prescriptions. Most patients had an underlying oncology or bone marrow transplant diagnosis (57%) or were premature (13%). The most prescribed antifungal was fluconazole (48%) and the most common indication for antifungal use was prophylaxis (64%). Of 2,095 patients receiving antifungals, 79 (4%) were prescribed >1 antifungal, most often as targeted therapy (48%). The antifungal prescribing rate ranged from 13.6 to 131.2 antifungals per 1,000 patients across hospitals (P < .001).
Conclusions:
Most antifungal use in hospitalized children was for prophylaxis, and the rate of antifungal prescribing varied significantly across hospitals. Potential targets for antifungal stewardship efforts include high-risk, high-utilization populations, such as oncology and bone marrow transplant patients, and specific patterns of utilization, including prophylactic and combination antifungal therapy.
Monoclonal antibody therapeutics to treat coronavirus disease (COVID-19) have been authorized by the US Food and Drug Administration under Emergency Use Authorization (EUA). Many barriers exist when deploying a novel therapeutic during an ongoing pandemic, and it is critical to assess what is needed to incorporate monoclonal antibody infusions into pandemic response activities. We examined the monoclonal antibody infusion site process during the COVID-19 pandemic and conducted a descriptive analysis using data from 3 sites at medical centers in the United States supported by the National Disaster Medical System. Monoclonal antibody implementation success factors included engagement with local medical providers, therapy batch preparation, placing the infusion center in proximity to emergency services, and creating procedures resilient to EUA changes. Infusion process challenges included confirming patient severe acute respiratory syndrome coronavirus 2 (SARS-CoV-2) positivity, strained staff, scheduling, and pharmacy coordination. Infusion sites are effective when integrated into pre-existing pandemic response ecosystems and can be implemented with limited staff and physical resources.
Seed retention, and ultimately seed shatter, are extremely important for the efficacy of harvest weed seed control (HWSC) and are likely influenced by various agroecological and environmental factors. Field studies investigated seed-shattering phenology of 22 weed species across three soybean [Glycine max (L.) Merr.]-producing regions in the United States. We further evaluated the potential drivers of seed shatter in terms of weather conditions, growing degree days, and plant biomass. Based on the results, weather conditions had no consistent impact on weed seed shatter. However, greater individual weed plant biomass was positively correlated with delayed seed shatter at harvest. This work demonstrates that HWSC can potentially reduce weed seedbank inputs of plants that have escaped early-season management practices and retained seed through harvest. However, smaller individuals within the same population that shatter seed before harvest risk escaping both early-season management and HWSC.
We surveyed pediatric antimicrobial stewardship program (ASP) site leaders within the Sharing Antimicrobial Reports for Pediatric Stewardship collaborative regarding discharge stewardship practices. Among 67 sites, 13 (19%) reported ASP review of discharge antimicrobial prescriptions. These findings highlight discharge stewardship as a potential opportunity for improvement during the hospital-to-home transition.
Do the processes states use to select judges for peak courts influence gender diversity? Scholars have debated whether concentrating appointment power in a single individual or diffusing appointment power across many individuals best promotes gender diversification. Others have claimed that the precise structure of the process matters less than fundamental changes in the process. We clarify these theoretical mechanisms, derive testable implications concerning the appointment of the first woman to a state’s highest court, and then develop a matched-pair research design within a Rosenbaum permutation approach to observational studies. Using a global sample beginning in 1970, we find that constitutional change to the judicial selection process decreases the time until the appointment of the first woman justice. These results support claims that point to institutional disruptions as critical drivers of gender diversity in important political posts.
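To make the inferential logic concrete, here is a generic sketch of a matched-pair permutation (sign-flip) test; it is not the authors' exact Rosenbaum-style analysis, and the pairings and time-to-appointment values below are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical matched pairs: years until the first woman justice in a jurisdiction
# that changed its judicial selection process vs. its matched control.
pairs = np.array([(8, 14), (5, 9), (12, 11), (6, 15), (10, 18), (7, 13)], dtype=float)
diffs = pairs[:, 0] - pairs[:, 1]   # negative = treated unit appointed sooner
observed = diffs.mean()

# Under the sharp null, treatment labels are exchangeable within each pair,
# so each difference is equally likely to be +d or -d.
n_perm = 100_000
signs = rng.choice([-1.0, 1.0], size=(n_perm, diffs.size))
perm_means = (signs * diffs).mean(axis=1)
p_one_sided = np.mean(perm_means <= observed)

print(f"Observed mean difference: {observed:.2f} years; one-sided p = {p_one_sided:.4f}")
```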
To develop a pediatric research agenda focused on pediatric healthcare-associated infections and antimicrobial stewardship topics that will yield the highest impact on child health.
Participants:
The study included 26 geographically diverse adult and pediatric infectious diseases clinicians with expertise in healthcare-associated infection prevention and/or antimicrobial stewardship (topic identification and ranking of priorities), as well as members of the Division of Healthcare Quality and Promotion at the Centers for Disease Control and Prevention (topic identification).
Methods:
Using a modified Delphi approach, expert recommendations were generated through an iterative process for identifying pediatric research priorities in healthcare-associated infection prevention and antimicrobial stewardship. The multistep, 7-month process included a literature review, interactive teleconferences, web-based surveys, and 2 in-person meetings.
Results:
A final list of 12 high-priority research topics was generated across the 2 domains. High-priority healthcare-associated infection topics included judicious testing for Clostridioides difficile infection, chlorhexidine (CHG) bathing, measuring and preventing hospital-onset bloodstream infection rates, surgical site infection prevention, and surveillance and prevention of multidrug-resistant gram-negative rod infections. Antimicrobial stewardship topics included β-lactam allergy de-labeling, judicious use of perioperative antibiotics, intravenous to oral conversion of antimicrobial therapy, developing a patient-level “harm index” for antibiotic exposure, and benchmarking and/or peer comparison of antibiotic use for common inpatient conditions.
Conclusions:
We identified 6 healthcare-associated infection topics and 6 antimicrobial stewardship topics as potentially high-impact targets for pediatric research.
Potential effectiveness of harvest weed seed control (HWSC) systems depends upon seed shatter of the target weed species at crop maturity, enabling its collection and processing at crop harvest. However, seed retention likely is influenced by agroecological and environmental factors. In 2016 and 2017, we assessed seed-shatter phenology in 13 economically important broadleaf weed species in soybean [Glycine max (L.) Merr.] from crop physiological maturity to 4 wk after physiological maturity at multiple sites spread across 14 states in the southern, northern, and mid-Atlantic United States. Greater proportions of seeds were retained by weeds in southern latitudes and shatter rate increased at northern latitudes. Amaranthus spp. seed shatter was low (0% to 2%), whereas shatter varied widely in common ragweed (Ambrosia artemisiifolia L.) (2% to 90%) over the weeks following soybean physiological maturity. Overall, the broadleaf species studied shattered less than 10% of their seeds by soybean harvest. Our results suggest that some of the broadleaf species with greater seed retention rates in the weeks following soybean physiological maturity may be good candidates for HWSC.
Seed shatter is an important weediness trait on which the efficacy of harvest weed seed control (HWSC) depends. The level of seed shatter in a species is likely influenced by agroecological and environmental factors. In 2016 and 2017, we assessed seed shatter of eight economically important grass weed species in soybean [Glycine max (L.) Merr.] from crop physiological maturity to 4 wk after maturity at multiple sites spread across 11 states in the southern, northern, and mid-Atlantic United States. From soybean maturity to 4 wk after maturity, cumulative percent seed shatter was lowest in the southern U.S. regions and increased moving north through the states. At soybean maturity, the percent of seed shatter ranged from 1% to 70%. That range had shifted to 5% to 100% (mean: 42%) by 25 d after soybean maturity. There were considerable differences in seed-shatter onset and rate of progression between sites and years in some species that could impact their susceptibility to HWSC. Our results suggest that many summer annual grass species are likely not ideal candidates for HWSC, although HWSC could substantially reduce their seed output during certain years.
To characterize the current state of antifungal stewardship practices and perceptions of antifungal use among pediatric antimicrobial stewardship programs (ASPs).
Design:
We developed and distributed an electronic survey, which included 17 closed-ended questions about institutional antifungal stewardship practices and perceptions, among pediatric ASPs.
Participants:
ASP physicians and pharmacists of 74 hospitals participating in the multicenter Sharing Antimicrobial Reports for Pediatric Stewardship (SHARPS) Collaborative.
Results:
We sent surveys to 74 hospitals and received 68 unique responses, for a response rate of 92%. Overall, 63 of the 68 respondent ASPs (93%) reported that they conduct 1 or more antifungal stewardship activities. Of these 68 hospital ASPs, 43 (63%) perform prospective audit and feedback (PAF) of antifungals. The most common reasons reported for not performing PAF of antifungals were not enough time or resources (19 of 25, 76%) and minimal institutional antifungal use (6 of 25, 24%). Also, 52 hospitals (76%) require preauthorization for 1 or more antifungal agents. The most commonly restricted antifungals were isavuconazole (42 of 52 hospitals, 80%) and posaconazole (39 of 52 hospitals, 75%). Furthermore, 33 ASPs (48%) agreed or strongly agreed that antifungals are inappropriately used at their institution, and only 25 of 68 ASPs (37%) felt very confident making recommendations about antifungals.
Conclusions:
Most pediatric ASPs steward antifungals, but the strategies employed are highly variable across surveyed institutions. Although nearly half of respondents identified inappropriate antifungal use as a problem at their institution, most ASPs do not feel confident making recommendations about antifungals. Future studies are needed to determine the rate of inappropriate antifungal use and the best antifungal stewardship strategies.