Racial and ethnic variations in antibiotic utilization are well documented in outpatient settings, but little is known about inpatient settings. Our objective was to describe national inpatient antibiotic utilization among children by race and ethnicity.
Methods:
This study included hospital visit data from the Pediatric Health Information System between 01/01/2022 and 12/31/2022 for patients <20 years. Primary outcomes were the percentage of hospitalization encounters that received an antibiotic and antibiotic days of therapy (DOT) per 1000 patient days. Mixed-effect regression models were used to determine the association of race-ethnicity with outcomes, adjusting for covariates.
Results:
There were 846,530 hospitalizations: 45.2% of children were Non-Hispanic (NH) White, 27.1% were Hispanic, 19.2% were NH Black, 4.5% were NH Other, 3.5% were NH Asian, 0.3% were NH Native Hawaiian/Other Pacific Islander (NHPI), and 0.2% were NH American Indian. Adjusting for covariates, NH Black children had lower odds of receiving antibiotics compared to NH White children (aOR 0.96, 95%CI 0.94–0.97), while NH NHPI children had higher odds of receiving antibiotics (aOR 1.16, 95%CI 1.05–1.29). Children who were Hispanic, NH Asian, NH American Indian, or NH Other received fewer antibiotic DOT compared to NH White children, while NH NHPI children received more antibiotic DOT.
Conclusions:
Antibiotic utilization in children’s hospitals differs by race and ethnicity. Hospitals should assess policies and practices that may contribute to disparities in treatment; antibiotic stewardship programs may play an important role in promoting inpatient pharmacoequity. Additional research is needed to examine individual diagnoses, clinical outcomes, and drivers of variation.
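The days-of-therapy outcome used in this study is straightforward to compute; a minimal sketch with made-up encounter data (the actual analysis applied mixed-effect regression models to hospital-level data) might look like:

```python
# Illustrative calculation of antibiotic days of therapy (DOT)
# per 1000 patient days, using hypothetical encounter data.

encounters = [
    # (antibiotic_days, length_of_stay_days)
    (3, 5),
    (0, 2),
    (7, 10),
    (1, 3),
]

total_dot = sum(abx for abx, _ in encounters)            # 11 antibiotic days
total_patient_days = sum(los for _, los in encounters)   # 20 patient days

dot_per_1000 = total_dot * 1000 / total_patient_days
print(round(dot_per_1000, 1))  # 550.0
```

Each antibiotic administered on a given day counts as one day of therapy, so a patient on two agents simultaneously accrues two DOT per calendar day.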
Pragmatic trials aim to speed translation to practice by integrating study procedures in routine care settings. This study evaluated implementation outcomes related to clinician and patient recruitment and participation in a trial of community paramedicine (CP) and presents successes and challenges of maintaining pragmatic study features.
Methods:
Adults in the pre-hospital setting, emergency department (ED), or hospital being considered for referral to the ED/hospital or continued hospitalization for intermediate-level care were randomized 1:1 to CP care or usual care. Referral and enrollment data were tracked administratively, and patient characteristics were abstracted from the electronic health record (EHR). Enrolled patients completed baseline surveys, and a subset of intervention patients were interviewed. All CPs and a sample of clinicians and administrators were invited to complete a survey and interview.
Results:
Between January 2022 and February 2023, 240 enrolled patients (42% rural) completed surveys, and 22 completed an interview; 63 staff completed surveys and 20 completed an interview. Ninety-three clinicians in 27 departments made at least one referral. Factors related to referrals included program awareness and understanding the CP practice scope. Most patients were enrolled in the hospital, but characteristics were similar to the primary care population and included older and medically complex patients. Challenges to achieving representativeness included limited EHR infrastructure, constraints related to patient consenting, and clinician concerns about patient randomization disrupting preferred care.
Conclusion:
Future pragmatic trials in busy clinical settings may benefit from regulatory policies and EHR capabilities that allow for real-world study conduct and representative participation. Trial registration: NCT05232799.
To understand the relationship between adolescents’ unhealthy snacking behaviour during their school journey and their perceived and objective measures of food outlet availability in the school neighbourhood.
Design:
A cross-sectional survey enquired about socio-demographic information, school transport modes, perceived presence of food outlets in the school neighbourhood and unhealthy food purchase and consumption on the school journey. A geographical information system analysis of the food outlets within 500 m and 1000 m school buffers was undertaken. Data were analysed using generalised linear mixed modelling.
Setting:
All twelve secondary schools in Dunedin, Aotearoa New Zealand, March 2020–June 2022.
Participants:
Adolescents aged 13–18 years (n 725) who reported being familiar with their school neighbourhood.
Results:
Perceived availability of food outlets in the school neighbourhood was inversely correlated with distance to the closest food outlet from school and positively correlated with food outlet density within 500 m and 1000 m school buffers. Adolescents’ purchase and consumption of unhealthy snacks and drinks during the school journey were associated with perceived availability of food outlets and with shorter distance to the closest food outlet from school. Mixed transport users, girls and those living in high-deprivation neighbourhoods had higher odds of purchasing and consuming unhealthy snacks and drinks during the school journey than active transport users, boys and those living in low-deprivation neighbourhoods, respectively.
Conclusions:
Adolescents’ perceptions of the food environment and close access to food outlets in the school neighbourhood may influence their food purchase and consumption behaviours during the school journey.
Cognitive training is a non-pharmacological intervention aimed at improving cognitive function across a single or multiple domains. Although the underlying mechanisms of cognitive training and transfer effects are not well-characterized, cognitive training has been thought to facilitate neural plasticity to enhance cognitive performance. Indeed, the Scaffolding Theory of Aging and Cognition (STAC) proposes that cognitive training may enhance the ability to engage in compensatory scaffolding to meet task demands and maintain cognitive performance. We therefore evaluated the effects of cognitive training on working memory performance in older adults without dementia. This study will help begin to elucidate non-pharmacological intervention effects on compensatory scaffolding in older adults.
Participants and Methods:
48 participants were recruited for a Phase III randomized clinical trial (Augmenting Cognitive Training in Older Adults [ACT]; NIH R01AG054077) conducted at the University of Florida and University of Arizona. Participants across sites were randomly assigned to complete cognitive training (n=25) or an education training control condition (n=23). Cognitive training and the education training control condition were each completed during 60 sessions over 12 weeks for 40 hours total. The education training control condition involved viewing educational videos produced by the National Geographic Channel. Cognitive training was completed using the Posit Science Brain HQ training program, which included 8 cognitive training paradigms targeting attention/processing speed and working memory. All participants also completed demographic questionnaires, cognitive testing, and an fMRI 2-back task at baseline and at 12 weeks following cognitive training.
Results:
Repeated measures analysis of covariance (ANCOVA), adjusted for training adherence, transcranial direct current stimulation (tDCS) condition, age, sex, years of education, and Wechsler Test of Adult Reading (WTAR) raw score, revealed a significant 2-back by training group interaction (F[1,40]=6.201, p=.017, η2=.134). Examination of simple main effects revealed baseline differences in 2-back performance (F[1,40]=.568, p=.455, η2=.014). After controlling for baseline performance, training group differences in 2-back performance were no longer statistically significant (F[1,40]=1.382, p=.247, η2=.034).
Conclusions:
After adjusting for baseline performance differences, there were no significant training group differences in 2-back performance, suggesting that randomization was not sufficient to ensure adequate distribution of participants across groups. Results may indicate that cognitive training alone is not sufficient for significant improvement in working memory performance on a near-transfer task. Additional improvement may occur with the next phase of this clinical trial, such that tDCS augments the effects of cognitive training and results in enhanced compensatory scaffolding even within this high-performing cohort. Limitations of the study include a highly educated sample with higher literacy levels and a small sample size that was not powered for transfer-effects analyses. Future analyses will include evaluation of the combined intervention effects of cognitive training and tDCS on n-back performance in a larger sample of older adults without dementia.
Cognitive training has shown promise for improving cognition in older adults. Aging involves a variety of neuroanatomical changes that may affect response to cognitive training. White matter hyperintensities (WMH) are one common age-related brain change, visible on T2-weighted and Fluid Attenuated Inversion Recovery (FLAIR) MRI. WMH are associated with older age, are suggestive of cerebral small vessel disease, and reflect decreased white matter integrity. Higher WMH load is associated with a reduced threshold for clinical expression of cognitive impairment and dementia. The effects of WMH on response to cognitive training interventions are relatively unknown. The current study assessed (a) proximal cognitive training performance following a 3-month randomized controlled trial and (b) the contribution of baseline whole-brain WMH load, defined as total lesion volume (TLV), to pre-post proximal training change.
Participants and Methods:
Sixty-two healthy older adults ages 65-84 completed either adaptive cognitive training (CT; n=31) or educational training control (ET; n=31) interventions. Participants assigned to CT completed 20 hours of attention/processing speed training and 20 hours of working memory training delivered through commercially-available Posit Science BrainHQ. ET participants completed 40 hours of educational videos. All participants also underwent sham or active transcranial direct current stimulation (tDCS) as an adjunctive intervention, although not a variable of interest in the current study. Multimodal MRI scans were acquired during the baseline visit. T1- and T2-weighted FLAIR images were processed using the Lesion Segmentation Tool (LST) for SPM12. The Lesion Prediction Algorithm of LST automatically segmented brain tissue and calculated lesion maps. A lesion threshold of 0.30 was applied to calculate TLV. A log transformation was applied to TLV to normalize the distribution of WMH. Repeated-measures analysis of covariance (RM-ANCOVA) assessed pre/post change in proximal composite (Total Training Composite) and sub-composite (Processing Speed Training Composite, Working Memory Training Composite) measures in the CT group compared to their ET counterparts, controlling for age, sex, years of education and tDCS group. Linear regression assessed the effect of TLV on post-intervention proximal composite and sub-composite, controlling for baseline performance, intervention assignment, age, sex, years of education, multisite scanner differences, estimated total intracranial volume, and binarized cardiovascular disease risk.
Results:
RM-ANCOVA revealed two-way group*time interactions such that those assigned cognitive training demonstrated greater improvement on proximal composite (Total Training Composite) and sub-composite (Processing Speed Training Composite, Working Memory Training Composite) measures compared to their ET counterparts. Multiple linear regression showed that higher baseline TLV was associated with smaller pre-post change on the Processing Speed Training sub-composite (β = −0.19, p = 0.04) but not on the other composite measures.
Conclusions:
These findings demonstrate the utility of cognitive training for improving post-intervention proximal performance in older adults. Additionally, pre-post proximal change appears to be more sensitive to white matter hyperintensity load for processing speed training than for working memory training. These data suggest that TLV may be an important factor to consider when planning processing speed-based cognitive training interventions for remediation of cognitive decline in older adults.
Interventions using a cognitive training paradigm called the Useful Field of View (UFOV) task have shown to be efficacious in slowing cognitive decline. However, no studies have looked at the engagement of functional networks during UFOV task completion. The current study aimed to (a) assess if regions activated during the UFOV fMRI task were functionally connected and related to task performance (henceforth called the UFOV network), (b) compare connectivity of the UFOV network to 7 resting-state functional connectivity networks in predicting proximal (UFOV) and near-transfer (Double Decision) performance, and (c) explore the impact of network segregation between higher-order networks and UFOV performance.
Participants and Methods:
336 healthy older adults (mean age=71.6) completed the UFOV fMRI task in a Siemens 3T scanner. UFOV fMRI accuracy was calculated as the number of correct responses divided by 56 total trials. Double Decision performance was calculated as the average presentation time of correct responses in log ms, with lower scores equating to better processing speed. Structural and functional MRI images were processed using the default pre-processing pipeline within the CONN toolbox. The Artifact Rejection Toolbox was set at a motion threshold of 0.9mm and participants were excluded if more than 50% of volumes were flagged as outliers. To assess connectivity of regions associated with the UFOV task, we created 10 spherical regions of interest (ROIs) a priori using the WFU PickAtlas in SPM12. These include the bilateral pars triangularis, supplementary motor area, and inferior temporal gyri, as well as the left pars opercularis, left middle occipital gyrus, right precentral gyrus and right superior parietal lobule. We used a weighted ROI-to-ROI connectivity analysis to model task-based within-network functional connectivity of the UFOV network, and its relationship to UFOV accuracy. We then used weighted ROI-to-ROI connectivity analysis to compare the efficacy of the UFOV network versus 7 resting-state networks in predicting UFOV fMRI task performance and Double Decision performance. Finally, we calculated network segregation among higher order resting state networks to assess its relationship with UFOV accuracy. All functional connectivity analyses were corrected at a false discovery threshold (FDR) at p<0.05.
Results:
ROI-to-ROI analysis showed significant within-network functional connectivity among the 10 a priori ROIs (UFOV network) during task completion (all pFDR<.05). After controlling for covariates, greater within-network connectivity of the UFOV network was associated with better UFOV fMRI performance (pFDR=.008). Among the 7 resting-state networks, greater within-network connectivity of the cingulo-opercular network (CON; pFDR<.001) and the frontoparietal control network (FPCN; pFDR=.014) was associated with higher accuracy on the UFOV fMRI task. Furthermore, greater within-network connectivity of only the UFOV network was associated with performance on the Double Decision task (pFDR=.034). Finally, we assessed the relationship between higher-order network segregation and UFOV accuracy. After controlling for covariates, no significant relationships between network segregation and UFOV performance remained (all p-uncorrected>0.05).
Conclusions:
To date, this is the first study to assess task-based functional connectivity during completion of the UFOV task. We observed that coherence within 10 a priori ROIs significantly predicted UFOV performance. Additionally, enhanced within-network connectivity of the UFOV network predicted better performance on the Double Decision task, while conventional resting-state networks did not. These findings provide potential targets to optimize efficacy of UFOV interventions.
Cognitive training using a visual speed-of-processing task, called the Useful Field of View (UFOV) task, reduced dementia risk and reduced decline in activities of daily living at a 10-year follow-up in older adults. However, there is variability in the level of cognitive gains after cognitive training across studies. One potential explanation for this variability could be moderating factors. Prior studies suggest variables moderating cognitive training gains share features of the training task. Learning trials of the Hopkins Verbal Learning Test-Revised (HVLT-R) and Brief Visuospatial Memory Test-Revised (BVMT-R) recruit similar cognitive abilities and have overlapping neural correlates with the UFOV task and speed-of-processing/working memory tasks and therefore could serve as potential moderators. Exploring moderating factors of cognitive training gains may boost the efficacy of interventions, improve rigor in the cognitive training literature, and eventually help provide tailored treatment recommendations. This study explored the association between the HVLT-R and BVMT-R learning and the UFOV task, and assessed the moderation of HVLT-R and BVMT-R learning on UFOV improvement after a 3-month speed-of-processing/attention and working memory cognitive training intervention in cognitively healthy older adults.
Participants and Methods:
75 healthy older adults (M age = 71.11, SD = 4.61) were recruited as part of a larger clinical trial through the Universities of Florida and Arizona. Participants were randomized into a cognitive training (n=36) or education control (n=39) group and underwent a 40-hour, 12-week intervention. Cognitive training intervention consisted of practicing 4 attention/speed-of-processing (including the UFOV task) and 4 working memory tasks. Education control intervention consisted of watching 40-minute educational videos. The HVLT-R and BVMT-R were administered at the pre-intervention timepoint as part of a larger neurocognitive battery. The learning ratio was calculated as (trial 3 total − trial 1 total) / (12 − trial 1 total). UFOV performance was measured at pre- and post-intervention time points via the POSIT Brain HQ Double Decision Assessment. Multiple linear regressions predicted baseline Double Decision performance from HVLT-R and BVMT-R learning ratios controlling for study site, age, sex, and education. A repeated measures moderation analysis assessed the moderation of HVLT-R and BVMT-R learning ratio on Double Decision change from pre- to post-intervention for cognitive training and education control groups.
Results:
Baseline Double Decision performance significantly associated with BVMT-R learning ratio (β=-.303, p=.008), but not HVLT-R learning ratio (β=-.142, p=.238). BVMT-R learning ratio moderated gains in Double Decision performance (p<.01); for each unit increase in BVMT-R learning ratio, there was a .6173 unit decrease in training gains. The HVLT-R learning ratio did not moderate gains in Double Decision performance (p>.05). There were no significant moderations in the education control group.
Conclusions:
Better visuospatial learning was associated with faster Double Decision performance at baseline. Those with poorer visuospatial learning improved most on the Double Decision task after training, suggesting that healthy older adults who perform below expectations may show the greatest training gains. Future cognitive training research studying visual speed-of-processing interventions should account for differing levels of visuospatial learning at baseline, as this could impact the magnitude of training outcomes.
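The learning-ratio metric used in this and related analyses (the gain from trial 1 to trial 3, normalized by the maximum possible gain) can be sketched in a few lines; the function and example values below are illustrative, not the trial's analysis code:

```python
def learning_ratio(trial1: int, trial3: int, max_score: int = 12) -> float:
    """Learning ratio: (trial 3 - trial 1) / (max_score - trial 1),
    i.e., the gain actually achieved as a fraction of the gain available.
    HVLT-R and BVMT-R each have a maximum score of 12 per trial."""
    if trial1 == max_score:  # already at ceiling on trial 1; no gain possible
        return 0.0
    return (trial3 - trial1) / (max_score - trial1)

# Example: 6 items recalled on trial 1, 10 on trial 3
print(learning_ratio(6, 10))  # 4 of the 6 available items gained: ~0.667
```

Normalizing by the available gain rather than the raw gain keeps participants who start near ceiling comparable to those who start low.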
Aging is associated with disruptions in functional connectivity within the default mode (DMN), frontoparietal control (FPCN), and cingulo-opercular (CON) resting-state networks. Greater within-network connectivity predicts better cognitive performance in older adults. Therefore, strengthening network connectivity, through targeted intervention strategies, may help prevent age-related cognitive decline or progression to dementia. Small studies have demonstrated synergistic effects of combining transcranial direct current stimulation (tDCS) and cognitive training (CT) on strengthening network connectivity; however, this association has yet to be rigorously tested on a large scale. The current study leverages longitudinal data from the first-ever Phase III clinical trial for tDCS to examine the efficacy of an adjunctive tDCS and CT intervention on modulating network connectivity in older adults.
Participants and Methods:
This sample included 209 older adults (mean age = 71.6) from the Augmenting Cognitive Training in Older Adults multisite trial. Participants completed 40 hours of CT over 12 weeks, which included 8 attention, processing speed, and working memory tasks. Participants were randomized into active or sham stimulation groups, and tDCS was administered during CT daily for two weeks then weekly for 10 weeks. For both stimulation groups, two electrodes in saline-soaked 5x7 cm2 sponges were placed at F3 (cathode) and F4 (anode) using the 10-20 measurement system. The active group received 2mA of current for 20 minutes. The sham group received 2mA for 30 seconds, then no current for the remaining 20 minutes.
Participants underwent resting-state fMRI at baseline and post-intervention. CONN toolbox was used to preprocess imaging data and conduct region of interest (ROI-ROI) connectivity analyses. The Artifact Detection Toolbox, using intermediate settings, identified outlier volumes. Two participants were excluded for having greater than 50% of volumes flagged as outliers. ROI-ROI analyses modeled the interaction between tDCS group (active versus sham) and occasion (baseline connectivity versus postintervention connectivity) for the DMN, FPCN, and CON controlling for age, sex, education, site, and adherence.
Results:
Compared to sham, the active group demonstrated ROI-ROI increases in functional connectivity within the DMN following intervention (left temporal to right temporal [T(202) = 2.78, pFDR < 0.05] and left temporal to right dorsal medial prefrontal cortex [T(202) = 2.74, pFDR < 0.05]). In contrast, compared to sham, the active group demonstrated ROI-ROI decreases in functional connectivity within the FPCN following intervention (left dorsal prefrontal cortex to left temporal [T(202) = -2.96, pFDR < 0.05] and left dorsal prefrontal cortex to left lateral prefrontal cortex [T(202) = -2.77, pFDR < 0.05]). There were no significant interactions detected for CON regions.
Conclusions:
These findings (a) demonstrate the feasibility of modulating network connectivity using tDCS and CT and (b) provide important information regarding the pattern of connectivity changes occurring at these intervention parameters in older adults. Importantly, the active stimulation group showed increases in connectivity within the DMN (a network particularly vulnerable to aging and implicated in Alzheimer’s disease) but decreases in connectivity between left frontal and temporal FPCN regions. Future analyses from this trial will evaluate the association between these changes in connectivity and cognitive performance post-intervention and at a one-year timepoint.
Nonpathological aging has been linked to decline in both verbal and visuospatial memory abilities in older adults. Disruptions in resting-state functional connectivity within well-characterized, higher-order cognitive brain networks have also been coupled with poorer memory functioning in healthy older adults and in older adults with dementia. However, there is a paucity of research on the association between higher-order functional connectivity and verbal and visuospatial memory performance in the older adult population. The current study examines the association between resting-state functional connectivity within the cingulo-opercular network (CON), frontoparietal control network (FPCN), and default mode network (DMN) and verbal and visuospatial learning and memory in a large sample of healthy older adults. We hypothesized that greater within-network CON and FPCN functional connectivity would be associated with better immediate verbal and visuospatial memory recall. Additionally, we predicted that within-network DMN functional connectivity would be associated with improvements in delayed verbal and visuospatial memory recall. This study provides insight into whether within-network CON, FPCN, or DMN functional connectivity is associated with verbal and visuospatial memory abilities in later life.
Participants and Methods:
330 healthy older adults between 65 and 89 years old (mean age = 71.6 ± 5.2) were recruited at the University of Florida (n = 222) and the University of Arizona (n = 108). Participants underwent resting-state fMRI and completed verbal memory (Hopkins Verbal Learning Test - Revised [HVLT-R]) and visuospatial memory (Brief Visuospatial Memory Test - Revised [BVMT-R]) measures. Immediate (total) and delayed recall scores on the HVLT-R and BVMT-R were calculated using each test manual’s scoring criteria. Learning ratios on the HVLT-R and BVMT-R were quantified by dividing the number of stimuli (verbal or visuospatial) learned between the first and third trials by the number of stimuli not recalled after the first learning trial. CONN Toolbox was used to extract average within-network connectivity values for CON, FPCN, and DMN. Hierarchical regressions were conducted, controlling for sex, race, ethnicity, years of education, number of invalid scans, and scanner site.
Results:
Greater CON connectivity was significantly associated with better HVLT-R immediate (total) recall (β = 0.16, p = 0.01), HVLT-R learning ratio (β = 0.16, p = 0.01), BVMT-R immediate (total) recall (β = 0.14, p = 0.02), and BVMT-R delayed recall performance (β = 0.15, p = 0.01). Greater FPCN connectivity was associated with better BVMT-R learning ratio (β = 0.13, p = 0.04). HVLT-R delayed recall performance was not associated with connectivity in any network, and DMN connectivity was not significantly related to any measure.
Conclusions:
Connectivity within CON demonstrated a robust relationship with multiple components of memory function across both verbal and visuospatial domains. In contrast, FPCN only evidenced a relationship with visuospatial learning, and DMN was not significantly associated with memory measures. These data suggest that CON may be a valuable target in longitudinal studies of age-related memory changes, and also a possible target in future non-invasive interventions to attenuate memory decline in older adults.
We present detailed characterization of laser-driven fusion and neutron production ($\sim {10}^5$/second) using 8 mJ, 40 fs laser pulses on a thin (<1 μm) D${}_2$O liquid sheet. At relativistic intensity ($\sim 5\times {10}^{18}$ W/cm${}^2$) and high repetition rate (1 kHz), the system produces deuterium–deuterium (D-D) fusion, allowing for consistent neutron generation. Evidence of D-D fusion neutron production is verified by a measurement suite with three independent detection systems: an EJ-309 organic scintillator with pulse-shape discrimination, a ${}^3\mathrm{He}$ proportional counter and a set of 36 bubble detectors. Time-of-flight analysis of the scintillator data shows the energy of the produced neutrons to be consistent with 2.45 MeV. Particle-in-cell simulations using the WarpX code support significant neutron production from D-D fusion events in the laser–target interaction region. This high-repetition-rate laser-driven neutron source could provide a low-cost, on-demand test bed for radiation hardening and imaging applications.
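The time-of-flight identification of 2.45 MeV D-D neutrons can be illustrated with a back-of-the-envelope calculation; the 1 m baseline below is a hypothetical value, not the experiment's detector geometry:

```python
import math

M_N = 939.565e6    # neutron rest mass energy, eV
C = 299_792_458.0  # speed of light, m/s

def neutron_energy_ev(flight_path_m: float, tof_s: float) -> float:
    """Non-relativistic kinetic energy (eV) inferred from time of flight.
    Adequate at 2.45 MeV, where v/c is only ~0.07."""
    v = flight_path_m / tof_s
    return 0.5 * M_N * (v / C) ** 2

# A 2.45 MeV D-D neutron over a hypothetical 1 m baseline:
v = C * math.sqrt(2 * 2.45e6 / M_N)   # ~2.2e7 m/s
tof = 1.0 / v                          # ~46 ns flight time
print(f"{neutron_energy_ev(1.0, tof) / 1e6:.2f} MeV")  # 2.45 MeV
```

A ~46 ns arrival delay per metre of flight path is what separates the 2.45 MeV neutron signal from the prompt gamma flash in the scintillator trace.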
Recent evidence from case reports suggests that a ketogenic diet may be effective for bipolar disorder. However, no clinical trials have been conducted to date.
Aims
To assess the recruitment and feasibility of a ketogenic diet intervention in bipolar disorder.
Method
Euthymic individuals with bipolar disorder were recruited to a 6–8 week trial of a modified ketogenic diet, and a range of clinical, economic and functional outcome measures were assessed. Study registration number: ISRCTN61613198.
Results
Of 27 recruited participants, 26 commenced and 20 completed the modified ketogenic diet for 6–8 weeks. The outcomes data-set was 95% complete for daily ketone measures, 95% complete for daily glucose measures and 95% complete for daily ecological momentary assessment of symptoms during the intervention period. Mean daily blood ketone readings were 1.3 mmol/L (s.d. = 0.77, median = 1.1) during the intervention period, and 91% of all readings indicated ketosis, suggesting a high degree of adherence to the diet. Over 91% of daily blood glucose readings were within normal range, with 9% indicating mild hypoglycaemia. Eleven minor adverse events were recorded, including fatigue, constipation, drowsiness and hunger. One serious adverse event was reported (euglycaemic ketoacidosis in a participant taking SGLT2-inhibitor medication).
Conclusions
The recruitment and retention of euthymic individuals with bipolar disorder to a 6–8 week ketogenic diet intervention was feasible, with high completion rates for outcome measures. The majority of participants reached and maintained ketosis, and adverse events were generally mild and modifiable. A future randomised controlled trial is now warranted.
We recently reported on the radio-frequency attenuation length of cold polar ice at Summit Station, Greenland, based on bi-static radar measurements of radio-frequency bedrock echo strengths taken during the summer of 2021. Those data also allow studies of (a) the relative contributions of coherent (such as discrete internal conducting layers with sub-centimeter transverse scale) vs incoherent (e.g. bulk volumetric) scattering, (b) the magnitude of internal layer reflection coefficients, (c) limits on signal propagation velocity asymmetries (‘birefringence’) and (d) limits on signal dispersion in-ice over a bandwidth of ~100 MHz. We find that (1) attenuation lengths approach 1 km in our band, (2) after averaging 10 000 echo triggers, reflected signals observable over the thermal floor (to depths of ~1500 m) are consistent with being entirely coherent, (3) internal layer reflectivities are ≈ −60 to −70 dB, (4) birefringent effects for vertically propagating signals are smaller by an order of magnitude relative to South Pole and (5) within our experimental limits, glacial ice is non-dispersive over the frequency band relevant for neutrino detection experiments.
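The internal-layer reflectivities quoted above are power ratios expressed in decibels; the standard conversion (a generic sketch, not the paper's analysis code) is:

```python
import math

def db_to_power_ratio(db: float) -> float:
    """Convert a power reflectivity in dB to a linear power ratio."""
    return 10 ** (db / 10)

def power_ratio_to_db(ratio: float) -> float:
    """Convert a linear power ratio to dB."""
    return 10 * math.log10(ratio)

# A -60 dB internal layer reflects one millionth of the incident power:
print(f"{db_to_power_ratio(-60):.0e}")   # 1e-06
print(f"{power_ratio_to_db(1e-7):.1f}")  # -70.0
```

So a −60 to −70 dB layer returns between 10⁻⁶ and 10⁻⁷ of the incident power, which is why averaging many echo triggers is needed to pull the reflections above the thermal noise floor.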
The U.S. Department of Agriculture–Agricultural Research Service (USDA-ARS) has been a leader in weed science research covering topics ranging from the development and use of integrated weed management (IWM) tactics to basic mechanistic studies, including biotic resistance of desirable plant communities and herbicide resistance. ARS weed scientists have worked in agricultural and natural ecosystems, including agronomic and horticultural crops, pastures, forests, wild lands, aquatic habitats, wetlands, and riparian areas. Through strong partnerships with academia, state agencies, private industry, and numerous federal programs, ARS weed scientists have made contributions to discoveries in the newest fields of robotics and genetics, as well as the traditional and fundamental subjects of weed–crop competition and physiology and integration of weed control tactics and practices. Weed science at ARS is often overshadowed by other research topics; thus, few are aware of the long history of ARS weed science and its important contributions. This review is the result of a symposium held at the Weed Science Society of America’s 62nd Annual Meeting in 2022 that included 10 separate presentations in a virtual Weed Science Webinar Series. The overarching themes of management tactics (IWM, biological control, and automation), basic mechanisms (competition, invasive plant genetics, and herbicide resistance), and ecosystem impacts (invasive plant spread, climate change, conservation, and restoration) represent core ARS weed science research that is dynamic and efficacious and has been a significant component of the agency’s national and international efforts. This review highlights current studies and future directions that exemplify the science and collaborative relationships both within and outside ARS. 
Given the constraints of weeds and invasive plants on all aspects of food, feed, and fiber systems, there is an acknowledged need to face new challenges, including agriculture and natural resources sustainability, economic resilience and reliability, and societal health and well-being.
We performed a preimplementation assessment of workflows, resources, needs, and antibiotic prescribing practices of trainees and practicing dentists to inform the development of an antibiotic-stewardship clinical decision-support tool (CDST) for dentists.
Methods:
We used a technology implementation framework to conduct the preimplementation assessment via surveys and focus groups of students, residents, and faculty members. Using Likert scales, the survey assessed baseline knowledge and confidence in dental providers’ antibiotic prescribing. The focus groups gathered information on existing workflows, resources, and needs for end users for our CDST.
Results:
Of 355 dental providers recruited to take the survey, 213 (60%) responded: 151 students, 27 residents, and 35 faculty. The average confidence in antibiotic prescribing decisions was 3.2 ± 1.0 on a scale of 1 to 5 (ie, moderate). Dental students were less confident about prescribing antibiotics than residents and faculty (P < .01). However, antibiotic prescribing knowledge was no different between dental students, residents, and faculty. The mean likelihood of prescribing an antibiotic when it was not needed was 2.7 ± 0.6 on a scale of 1 to 5 (unlikely to maybe) and was not meaningfully different across subgroups (P = .10). We had 10 participants across 3 focus groups: 7 students, 2 residents, and 1 faculty member. Four major themes emerged, which indicated that dentists: (1) make antibiotic prescribing decisions based on anecdotal experiences; (2) defer to physicians’ recommendations; (3) have limited access to evidence-based resources; and (4) want CDST for antibiotic prescribing.
Conclusions:
Dentists’ confidence in antibiotic prescribing increased with training level, but knowledge did not. Trainees and practicing dentists would benefit from a CDST to improve the appropriateness of antibiotic prescribing.
Effective management of the introduced invasive grass common reed [Phragmites australis (Cav.) Trin. ex Steud.] requires the ability to differentiate between the introduced and native subspecies found in North America. While genetic tools are useful for discriminating between the subspecies, morphological identification is a useful complementary approach that is low to zero cost and does not require specialized equipment or technical expertise. The objective of our study was to identify the best morphological traits for rapid and simple identification of native and introduced P. australis. A suite of 22 morphological traits were measured in 21 introduced and 27 native P. australis populations identified by genetic barcoding across southern Ontario, Canada. Traits were compared between the subspecies to identify measurements that offered reliable, diagnostic separation. Overall, 21 of the 22 traits differed between the subspecies, with four offering complete separation: the retention of leaf sheaths on dead stems; a categorical assessment of stem color; the base height of the ligule, excluding the hairy fringe; and a combined measurement of leaf length and lower glume length. Additionally, round fungal spots on the stem occurred only on the native subspecies and never on the sampled introduced populations. The high degree of variation observed in traits within and between the subspecies cautions against a “common wisdom” approach to identification or automatic interpretation of intermediate traits as indicative of aberrant populations or hybridization. As an alternative, we have compiled the five best traits into a checklist of simple and reliable measurements to identify native and introduced P. australis. This guide will be most applicable for samples collected in the late summer and fall in the Great Lakes region but can also inform best practices for morphological identification in other regions as well.
Neonates and infants who undergo congenital cardiac surgery frequently have difficulty with feeding. The factors that predispose these patients to require a gastrostomy tube have not been well defined. We aimed to report the incidence and describe hospital outcomes and characteristics in neonates and infants undergoing congenital cardiac surgery who required gastrostomy tube placement.
Materials and method:
A retrospective review was performed on patients undergoing congenital cardiac surgery between October 2015 and December 2020. Patients were identified by International Classification of Diseases 10th Revision codes, utilising the performance improvement database Vizient® Clinical Data Base, and stratified by age at admission: neonates (<1 month) and infants (1–12 months). Outcomes were compared and comparative analysis performed between admissions with and without gastrostomy tube placement.
Results:
There were 11,793 admissions, 3519 (29.8%) neonates and 8274 (70.2%) infants. We found an increased incidence of gastrostomy tube placement in neonates as compared to infants following congenital cardiac surgery (23.1% versus 6.0%, p < 0.001). Outcomes were similar in neonates and infants, with increased length of stay and cost in those requiring a gastrostomy tube. Gastrostomy tube placement was more likely in neonates and infants with upper airway anomalies, congenital abnormalities, hospital infections, and genetic abnormalities.
Discussion:
Age at hospitalisation for congenital cardiac surgery is a definable risk factor for gastrostomy tube requirement. Additional factors contribute to gastrostomy tube placement and should be used when counselling families regarding the potential requirement of a gastrostomy tube.
With the exponential growth in investment attention to brain health—solutions spanning brain wellness to mental health to neurological disorders—tech giants, payers, and biotechnology companies have been making forays into this field to identify technology solutions and pharmaceutical amplifiers. So far, their investments have had mixed results. The concept of open innovation (OI) was first coined by Henry Chesbrough to describe the paradigm by which enterprises allow free flow of ideas, products, and services from the outside to the inside and vice versa in order to remain competitive, particularly in rapidly evolving fields where there is abundant, relevant knowledge outside the traditional walls of the enterprise. In this article, we advocate for further exploration and advancement of OI in brain health.
To describe pediatric outpatient visits and antibiotic prescribing during the coronavirus disease 2019 (COVID-19) pandemic.
Design:
An observational, retrospective study from January 2019 to October 2021, using the pre-pandemic period as a control.
Setting:
Outpatient clinics, including 27 family medicine clinics, 27 pediatric clinics, and 26 urgent or prompt care clinics.
Patients:
Children aged 0–19 years receiving care in an outpatient setting.
Methods:
Data were extracted from the electronic health record. The COVID-19 era was defined as April 1, 2020, to October 31, 2021. Virtual visits were identified by coded encounter or visit type variables. Visit diagnoses were assigned using a 3-tier classification system based on appropriateness of antibiotic prescribing and a subanalysis of respiratory visits was performed to compare changes in the COVID-19 era compared to baseline.
Results:
Through October 2021, we detected an overall sustained reduction of 18.2% in antibiotic prescribing to children. Disproportionate changes occurred in the percentages of antibiotic visits in respiratory visits for children by age, race or ethnicity, practice setting, and prescriber type. Virtual visits were minimal during the study period but did not result in higher rates of antibiotic visits or in-person follow-up visits.
Conclusions:
These findings suggest that reductions in antibiotic prescribing have been sustained despite increases in outpatient visits. However, additional studies are warranted to better understand disproportionate rates of antibiotic visits.
Over the last 25 years, radiowave detection of neutrino-generated signals, using cold polar ice as the neutrino target, has emerged as perhaps the most promising technique for detection of extragalactic ultra-high energy neutrinos (corresponding to neutrino energies in excess of 0.01 Joules, or $10^{17}$ electron volts). During the summer of 2021 and in tandem with the initial deployment of the Radio Neutrino Observatory in Greenland (RNO-G), we conducted radioglaciological measurements at Summit Station, Greenland to refine our understanding of the ice target. We report the result of one such measurement, the radio-frequency electric field attenuation length $L_\alpha$. We find an approximately linear dependence of $L_\alpha$ on frequency with the best fit of the average field attenuation for the upper 1500 m of ice: $\langle L_\alpha \rangle = ( ( 1154 \pm 121) - ( 0.81 \pm 0.14) \, ( \nu /{\rm MHz}) ) \,{\rm m}$ for frequencies ν ∈ [145, 350] MHz.
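The best-fit relation above can be evaluated directly; the following is a minimal sketch using only the central values of the fit (uncertainties omitted), with the validity range taken from the quoted frequency band:

```python
def attenuation_length_m(freq_mhz: float) -> float:
    """Central-value average field attenuation length (meters) for the
    upper 1500 m of ice at Summit Station, from the linear best fit
    <L_alpha>(nu) = (1154 - 0.81 * (nu / MHz)) m."""
    if not 145.0 <= freq_mhz <= 350.0:
        raise ValueError("fit reported only for 145-350 MHz")
    return 1154.0 - 0.81 * freq_mhz

# Example: at 200 MHz the central value is 1154 - 0.81*200 ≈ 992 m,
# consistent with attenuation lengths approaching ~1 km in band.
print(round(attenuation_length_m(200.0), 1))
```

Note that the attenuation length decreases with frequency across the band, from roughly 1037 m at 145 MHz to roughly 871 m at 350 MHz (central values).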