We recently reported on the radio-frequency attenuation length of cold polar ice at Summit Station, Greenland, based on bi-static radar measurements of radio-frequency bedrock echo strengths taken during the summer of 2021. Those data also allow studies of (a) the relative contributions of coherent (such as discrete internal conducting layers with sub-centimeter transverse scale) vs. incoherent (e.g. bulk volumetric) scattering, (b) the magnitude of internal layer reflection coefficients, (c) limits on signal propagation velocity asymmetries (‘birefringence’) and (d) limits on in-ice signal dispersion over a bandwidth of ~100 MHz. We find that (1) attenuation lengths approach 1 km in our band, (2) after averaging 10 000 echo triggers, reflected signals observable over the thermal floor (to depths of ~1500 m) are consistent with being entirely coherent, (3) internal layer reflectivities are ≈ −60 to −70 dB, (4) birefringent effects for vertically propagating signals are smaller by an order of magnitude relative to South Pole and (5) within our experimental limits, glacial ice is non-dispersive over the frequency band relevant for neutrino detection experiments.
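For a sense of scale, here is a minimal sketch (assuming the quoted reflectivities are power ratios and the background is idealized zero-mean thermal noise) of the averaging gain and dB-to-amplitude conversions implied by these numbers:

```python
import numpy as np

# Coherently averaging N triggered waveforms leaves the in-phase signal
# unchanged while zero-mean thermal noise amplitude falls as 1/sqrt(N),
# so the power SNR grows by a factor of N, i.e. 10*log10(N) dB.
N = 10_000
print(f"Averaging gain for N = {N}: {10 * np.log10(N):.0f} dB")  # 40 dB

# If a -60 to -70 dB reflection coefficient is a power ratio, the
# corresponding field-amplitude ratio is 10**(dB/20).
for r_db in (-60, -70):
    print(f"{r_db} dB -> amplitude ratio {10 ** (r_db / 20):.1e}")
```

An incoherent contribution, whose phase varies from trigger to trigger, instead averages down like noise, which is what makes the 10 000-trigger average a discriminator between the two scattering classes.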
s-Frames and i-frames do not represent two opposed types of intervention. Rather, they are interpretive lenses for focusing on specific aspects of interventions, all of which include individual and structural dimensions. It makes no sense to prioritize either system change or individual change, because each requires the other.
New technologies and disruptions related to coronavirus disease 2019 (COVID-19) have led to the expansion of decentralized approaches to clinical trials. Remote tools and methods hold promise for increasing trial efficiency and reducing burdens and barriers by facilitating participation outside of traditional clinical settings and taking studies directly to participants. The Trial Innovation Network, established in 2016 by the National Center for Advancing Translational Sciences to address critical roadblocks in clinical research and accelerate the translational research process, has consulted on over 400 research study proposals to date. Its recommendations for decentralized approaches have included eConsent, participant-informed study design, remote intervention, study task reminders, social media recruitment, and return of results for participants. Some clinical trial elements have worked well when decentralized, while others, including remote recruitment and patient monitoring, need further refinement and assessment to determine their value. Partially decentralized, or “hybrid,” trials offer a first step to optimizing remote methods. Decentralized processes demonstrate potential to improve urban-rural diversity, but their impact on inclusion of racially and ethnically marginalized populations requires further study. To optimize inclusive participation in decentralized clinical trials, efforts must be made to build trust among marginalized communities and to ensure access to remote technology.
On February 24, 2022, Russia invaded Ukraine, resulting in Europe’s largest refugee crisis since World War II. More than six million Ukrainians fled the country—half of these to Poland—and one-third of the population was internally displaced.
Border points became bottlenecks where fatalities were reported—people risked their lives in long queues and subzero temperatures.
This presentation focuses on experiential information obtained during a 17-week deployment of an EMT Type 1, both at border points (fixed) and in northwestern Ukraine (mobile). Quantitative and qualitative data were obtained after deployment via an online survey of 75 medical, logistical, and interpreter volunteers.
Initial teams experienced extremely fluid demands and numerous challenges: security, team adherence to COVID-19 protocols, behavioral issues among less experienced volunteers, and collaboration with new governmental and non-governmental partners to achieve objectives.
1. Deployment to a conflict setting requires adherence to the Incident Command System, with daily security briefings and structured handover between teams at the beginning of each deployment.
2. Strict adherence to well-defined protocols for the prevention and management of emerging infectious risks such as COVID-19 is necessary, along with contingency plans to isolate infected team members.
3. There is a need for standardized pre-deployment vetting, training and orientation of all volunteers—particularly team leaders.
4. Identification of international partners should start pre-deployment and remain a continuous process during deployment.
OBJECTIVES/GOALS: Glioblastomas (GBMs) are heterogeneous, treatment-resistant tumors that are driven by populations of cancer stem cells (CSCs). In this study, we perform an epigenetics-focused functional genomics screen in GBM organoids and identify WDR5 as an essential epigenetic regulator in the SOX2-enriched, therapy-resistant cancer stem cell niche. METHODS/STUDY POPULATION: Despite their importance for tumor growth, few molecular mechanisms critical for CSC population maintenance have been exploited for therapeutic development. We developed a spatially resolved loss-of-function screen in GBM patient-derived organoids to identify essential epigenetic regulators in the SOX2-enriched, therapy-resistant niche. Our niche-specific screens identified WDR5, an H3K4 histone methyltransferase responsible for activating specific gene expression, as indispensable for GBM CSC growth and survival. RESULTS/ANTICIPATED RESULTS: In GBM CSC models, WDR5 inhibitors blocked WRAD complex assembly and reduced H3K4 trimethylation and expression of genes involved in CSC-relevant oncogenic pathways. H3K4me3 peaks lost with WDR5 inhibitor treatment occurred disproportionately on POU transcription factor motifs, which are required for stem cell maintenance and include the POU5F1(OCT4)::SOX2 motif. We incorporated a SOX2/OCT4 motif-driven GFP reporter system into our CSC models and found that WDR5 inhibitor treatment resulted in dose-dependent silencing of stem cell reporter activity. Further, WDR5 inhibitor treatment altered the stem cell state, disrupting CSC in vitro growth and self-renewal as well as in vivo tumor growth. DISCUSSION/SIGNIFICANCE: Our results unveil the role of WDR5 in maintaining the CSC state in GBM and provide a rationale for therapeutic development of WDR5 inhibitors for GBM and other advanced cancers. This conceptual and experimental framework can be applied to many cancers to unmask unique microenvironmental biology and inform rationally designed combination therapies.
OBJECTIVES/GOALS: The objective of this study was to use NIH RePORTER (Research Portfolio Online Reporting Tools) to analyze K99 funding trends and determine whether the time from R00 to a first R01 or R21 award correlates with the future success of an early-stage NIH-funded investigator. METHODS/STUDY POPULATION: All award data were collected from NIH RePORTER. All K99 awards and funding data in this study were limited to All Clinical Departments (ACD). All researchers (n = 1,148) and awards (n = 2,022) were identified through a K99 search from FY 2007 to FY 2022 across ACD. Historic trends in K99 awards and funding from NIH Fiscal Year (FY) 2007 to FY 2022 were investigated. An R00 dataset was generated from NIH RePORTER, and K99-to-R00 achievement statistics from FY 2007 to FY 2022 were investigated. NIH annual datafiles for FY 2007 to FY 2021 were aggregated to generate a master datafile of all R01 (n = 395,505) and R21 (n = 61,766) awards. R01 and R21 award data were linked to the researchers previously identified through the K99 search, and the analysis focused on the connection between K99/R00 awardees and subsequent R01 or R21 awards. RESULTS/ANTICIPATED RESULTS: From FY 2008 to FY 2022, the number of K99 awards per year increased 123.4%, from 94 to 210. Over the same period, after correcting for inflation, the NIH K99 budget increased 127.0% while the NIH program-level budget increased 17.3%. For researchers who achieved their first R01 or R21 0–3 years versus 3–6 years after the start of their R00, average funding per year since the start of the R00 phase was $467,425 versus $290,604, respectively (p < 0.001). In summary, NIH investment in the K99 award pathway has substantially outpaced the NIH program-level budget increase, and there is a strong relationship between average funding per year since the start of the R00 phase and the time from R00 to R01 or R21. DISCUSSION/SIGNIFICANCE: Our study offers additional evidence of the Matthew effect in science, where previous success generates future success. This analysis may be useful to clinical departments as they make decisions about selecting new, and retaining current, biomedical scientists for independent research positions.
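The linkage step described in the methods can be illustrated with a hypothetical pandas sketch; the file names and columns (PI_ID, ACTIVITY, FY) are placeholders, not the actual NIH RePORTER schema:

```python
import pandas as pd

# Aggregate hypothetical NIH annual award files (FY 2007-FY 2021) into one
# master table, then link R01/R21 records back to K99 awardees by PI ID.
k99 = pd.read_csv("k99_awards_fy2007_2022.csv")           # placeholder file
awards = pd.concat(
    pd.read_csv(f"nih_awards_fy{fy}.csv") for fy in range(2007, 2022)
)
r01_r21 = awards[awards["ACTIVITY"].isin(["R01", "R21"])]

# Join on a PI identifier to find each K99 awardee's subsequent awards.
linked = r01_r21.merge(k99[["PI_ID"]].drop_duplicates(), on="PI_ID")

# Example summary: fiscal year of each awardee's first R01/R21.
print(linked.groupby("PI_ID")["FY"].min().describe())

# Sanity check on the reported growth in annual K99 awards:
print(f"(210 - 94) / 94 = {100 * (210 - 94) / 94:.1f}%")  # 123.4%
```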
Gaps in the implementation of effective interventions affect nearly all cancer prevention and control strategies in the US, including in Massachusetts. To close these implementation gaps, evidence-based interventions must be rapidly and equitably implemented in settings serving racially, ethnically, socioeconomically, and geographically diverse populations. This paper provides a brief overview of The Implementation Science Center for Cancer Control Equity (ISCCCE) and describes how we have operationalized our commitment to a robust community-engaged center that aims to close these gaps. We describe how ISCCCE is organized and how the principles of community-engaged research are embedded across all components of the center. These principles have been intentionally integrated throughout all structures and processes, and we have developed evaluation strategies to assess whether the quality of our partnerships reflects them. ISCCCE is a comprehensive community-engaged infrastructure for studying efficient, pragmatic, and equity-focused implementation and adaptation strategies for cancer prevention in historically and currently disadvantaged communities, with built-in methods to evaluate the quality of community engagement. This engaged research center is designed to maximize the impact and relevance of implementation research on cancer control in community health centers.
Youth experiencing socioeconomic deprivation may be exposed to disadvantage in multiple contexts (e.g., neighborhood, family, and school). To date, however, we know little about the underlying structure of socioeconomic disadvantage, including whether the 'active ingredients' driving its robust effects are specific to one context (e.g., neighborhood) or whether the various contexts increment one another as predictors of youth outcomes.
The present study addressed this gap by examining the underlying structure of socioeconomic disadvantage across neighborhoods, families, and schools, as well as whether the various forms of disadvantage jointly predicted youth psychopathology and cognitive performance. Participants were 1,030 school-aged twin pairs from a subsample of the Michigan State University Twin Registry enriched for neighborhood disadvantage.
Two correlated factors underlay the indicators of disadvantage. Proximal disadvantage comprised familial indicators, whereas contextual disadvantage represented deprivation in the broader school and neighborhood contexts. Results from exhaustive modeling analyses indicated that proximal and contextual disadvantage incremented one another as predictors of childhood externalizing problems, disordered eating, and reading difficulties, but not internalizing symptoms.
Disadvantage within the family and disadvantage in the broader context appear to represent distinct constructs with additive influence, carrying unique implications for multiple behavioral outcomes during middle childhood.
Many patients with Fontan physiology are unable to achieve the minimum criteria for peak effort during cardiopulmonary exercise testing. The purpose of this study was to determine the influence of physical activity and other clinical predictors on achieving peak exercise criteria, signified by a respiratory exchange ratio ≥ 1.1, in youth with Fontan physiology.
We performed a secondary analysis of a cross-sectional study of 8–18-year-olds with single-ventricle physiology post-Fontan palliation who underwent cardiopulmonary exercise testing (James cycle protocol) and completed a past-year physical activity survey. Bivariate associations were assessed by Wilcoxon rank-sum test and simple regression. A conditional inference forest algorithm was used to classify participants achieving a respiratory exchange ratio ≥ 1.1 and to predict peak respiratory exchange ratio.
Of the n = 43 participants, 65% were male, mean age was 14.0 ± 2.4 years, and 67.4% (n = 29) achieved a respiratory exchange ratio ≥ 1.1. Although several cardiopulmonary exercise test variables showed statistically significant bivariate associations with achieving a respiratory exchange ratio ≥ 1.1, classification performance was poor (area under the precision-recall curve = 0.55). All variables together explained 21.4% of the variance in peak respiratory exchange ratio, with peak oxygen pulse being the most informative.
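The conditional inference forest used in the study is typically fit in R (e.g. partykit::cforest); the sketch below is only an illustration of the precision-recall evaluation, substituting scikit-learn's RandomForestClassifier on simulated data:

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import average_precision_score
from sklearn.model_selection import cross_val_predict

# Simulated stand-in data: 43 participants, 6 predictors, and a binary
# label approximating the observed 67.4% rate of RER >= 1.1.
rng = np.random.default_rng(0)
X = rng.normal(size=(43, 6))
y = rng.random(43) < 0.674

# Cross-validated class probabilities from an ordinary random forest
# (not a conditional inference forest, which scikit-learn lacks).
clf = RandomForestClassifier(n_estimators=500, random_state=0)
proba = cross_val_predict(clf, X, y, cv=5, method="predict_proba")[:, 1]

# Area under the precision-recall curve; the study reported 0.55.
print(f"AUPRC: {average_precision_score(y, proba):.2f}")
```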
Demographic, physical activity, and cardiopulmonary exercise test measures could not classify meeting peak exercise criteria (respiratory exchange ratio ≥ 1.1) with satisfactory accuracy. Correlations between respiratory exchange ratio and oxygen pulse suggest that the augmentation of stroke volume with exercise may affect the Fontan patient’s ability to sustain high-intensity exercise.
Preschool psychiatric symptoms significantly increase the risk for long-term negative outcomes. Transdiagnostic hierarchical approaches that capture general (‘p’) and specific psychopathology dimensions are promising for understanding risk and predicting outcomes, but their predictive utility in young children is not well established. We delineated a hierarchical structure of preschool psychopathology dimensions and tested their ability to predict psychiatric disorders and functional impairment in preadolescence.
Data for 1253 preschool children (mean age = 4.17, s.d. = 0.81) were drawn from three longitudinal studies using a similar methodology (one community sample, two psychopathology-enriched samples) and followed up into preadolescence, yielding a large and diverse sample. Exploratory factor models derived a hierarchical structure of general and specific factors using symptoms from the Preschool Age Psychiatric Assessment interview. Longitudinal analyses examined the prospective associations of preschool p and specific factors with preadolescent psychiatric disorders and functional impairment.
A hierarchical dimensional structure with a p factor at the top and up to six specific factors (distress, fear, separation anxiety, social anxiety, inattention-hyperactivity, oppositionality) emerged at preschool age. The p factor predicted all preadolescent disorders (ΔR² = 0.04–0.15) and functional impairment (ΔR² = 0.01–0.07) to a significantly greater extent than preschool psychiatric diagnoses and functioning. Specific dimensions provided additional predictive power for the majority of preadolescent outcomes (disorders: ΔR² = 0.06–0.15; functional impairment: ΔR² = 0.05–0.12).
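The incremental-prediction (ΔR²) logic can be sketched on simulated data; the variable names, effect sizes, and the plain-OLS setup below are illustrative only, not the study's actual models:

```python
import numpy as np
import statsmodels.api as sm

# Simulate a baseline predictor set (a stand-in for preschool diagnoses),
# a general 'p' factor, six specific factors, and a continuous outcome.
rng = np.random.default_rng(1)
n = 1253
baseline = rng.normal(size=(n, 2))
p_factor = rng.normal(size=(n, 1))
specific = rng.normal(size=(n, 6))
outcome = 0.4 * p_factor[:, 0] + 0.2 * specific[:, 0] + rng.normal(size=n)

def r2(X):
    """R-squared of an OLS fit of the outcome on predictors X."""
    return sm.OLS(outcome, sm.add_constant(X)).fit().rsquared

# Delta R^2: the gain from adding each block to the nested model below it.
r2_base = r2(baseline)
r2_p = r2(np.hstack([baseline, p_factor]))
r2_full = r2(np.hstack([baseline, p_factor, specific]))
print(f"delta R^2, p factor over baseline:  {r2_p - r2_base:.3f}")
print(f"delta R^2, specific factors over p: {r2_full - r2_p:.3f}")
```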
Both general and specific dimensions of preschool psychopathology are useful for predicting clinical and functional outcomes almost a decade later. These findings highlight the value of transdiagnostic dimensions for predicting prognosis and as potential targets for early intervention and prevention.
Surgeons may offer different treatments for similar conditions on the basis of their compensation mechanism. This study examined differences in surgical practices between salaried and fee-for-service (FFS) surgeons for two common degenerative spine conditions.
The study assessed the practices of 63 spine surgeons (39 FFS and 24 salaried) across eight Canadian provinces who performed surgery for two lumbar conditions: stable spinal stenosis and degenerative spondylolisthesis. It comprised a multicenter, ambispective review of consecutive spine surgery patients enrolled in the Canadian Spine Outcomes and Research Network registry between October 2012 and July 2018. The primary outcome was the difference in the types of procedures performed between the two groups. Secondary study variables included surgical characteristics, baseline patient factors, and patient-reported outcomes.
For stable spinal stenosis (n = 2234), salaried surgeons performed significantly fewer uninstrumented fusions (p < 0.05) than FFS surgeons. For degenerative spondylolisthesis (n = 1292), salaried surgeons performed significantly more instrumentation plus interbody fusions (p < 0.05). There were no statistically significant differences in patient-reported outcomes between the two groups.
Surgeon compensation was associated with different approaches to stable lumbar spinal stenosis and degenerative lumbar spondylolisthesis: salaried surgeons chose a more conservative approach to spinal stenosis and a more aggressive approach to degenerative spondylolisthesis. Because these differences ran in opposite directions, remuneration is likely only a minor determinant of the variation in spinal surgery practice in Canada. Further research is needed to elucidate which variables, other than patient demographics and financial incentives, influence surgical decision-making.
Deficits in visuospatial attention, known as neglect, are common following brain injury, but underdiagnosed and poorly treated, resulting in long-term cognitive disability. In clinical settings, neglect is often assessed using simple pen-and-paper tests. While convenient, these cannot characterise the full spectrum of neglect. This protocol reports a research programme that compares traditional neglect assessments with a novel virtual reality attention assessment platform: The Attention Atlas (AA).
The AA was codesigned by researchers and clinicians to meet the clinical need for improved neglect assessment. It uses a visual search paradigm to map attended space in three dimensions, providing near real-time feedback to clinicians on system-level behavioural performance, and seeks to identify the optimal parameters that best distinguish neglect from non-neglect and characterise the spectrum of neglect. A series of experiments will address procedural, scientific, patient, and clinical feasibility domains.
Analyses focus on descriptive measures of reaction time, accuracy of target localisation, and a histogram-based raycast attentional mapping analysis, which measures the individual’s orientation in space and inter- and intra-individual variation in visuospatial attention. We will compare neglect and control data using parametric between-subjects analyses. We present example individual-level results produced in near real-time during visual search.
The development and validation of the AA is part of a new generation of translational neuroscience that exploits the latest advances in technology and brain science, including technology repurposed from the consumer gaming market. This approach to rehabilitation has the potential for highly accurate, highly engaging, personalised care.
Background: Carbapenem-resistant Acinetobacter baumannii (CRAB) is primarily associated with hospital-acquired infections and is an urgent public health threat due to its ability to contaminate the environment and cause severe disease. In 2019, Illinois began pilot surveillance for CRAB, requiring select laboratories to submit specimens for molecular characterization. On July 17, 2020, the Chicago Department of Public Health (CDPH) was notified of an increase in CRAB infections in a 20-bed ICU at an acute-care hospital in Chicago (hospital A) during the initial COVID-19 surge. We summarize the outbreak investigation findings and infection control recommendations. Methods: Clinical cultures were collected from patients in hospital A, and CRAB-positive isolates were sent to the Wisconsin State Laboratory of Hygiene for resistance-mechanism and antibiotic-susceptibility testing. On-site assessments and remote follow-ups were conducted by CDPH infection preventionists to evaluate infection control practices, including environmental cleaning, hand hygiene compliance, and use of personal protective equipment (PPE). The Illinois Department of Public Health and CDPH summarized the testing results, facilitated a containment response, and provided recommendations for infection control. Results: From March 18, 2020, to September 30, 2021, 56 patients with CRAB infections were identified at hospital A, and 33 (59%) of these cases were pan-nonsusceptible. The most common specimen source was sputum (n = 30, 54%), followed by blood (n = 13, 23%), urine (n = 6, 11%), and other (n = 7, 13%). Among isolates with mechanism testing (n = 54), 45 (83%) were positive for OXA-24/40 and 9 (17%) for OXA-23. Of the 56 CRAB-positive patients, 28 (50%) were previously positive for SARS-CoV-2; to date, 25 (45%) have been discharged and 31 (55%) have died. Two on-site visits and 7 remote-assistance sessions were conducted as part of the investigation. In response to increased COVID-19 hospitalizations, hospital A moved to crisis-capacity PPE use and encountered staffing shortages, which compromised infection control measures. The cleaning agent in use (a quaternary ammonium disinfectant) was also found to be ineffective against CRAB and required long contact times. Conclusions: In response to the CRAB outbreak at hospital A, CDPH recommended that the hospital stop crisis-capacity protocols for PPE, conduct admission screening and point-prevalence testing for CRAB, implement a hand hygiene campaign, and use an EPA-registered List K product for environmental cleaning. These recommendations were implemented in May 2021, and no CRAB cases have been reported since July 2021. To reduce CRAB transmission during the pandemic, facility leadership must commit resources to educating staff on effective infection control practices, including conventional use of PPE, appropriate cleaning agents, and improved hand hygiene.