This chapter explores Plutarch’s presentation of greatness as it equates with leadership ability and outcomes. He expressly values civic participation and leadership that aims to secure and promote the welfare of the community. After presenting some basic information concerning his theories of education, especially ethical education, the chapter focuses on the innate components of greatness and the appropriate means of developing this inborn talent when training individuals to wield power in an effective and responsible fashion. A comparative analysis then sets forth the similarities and differences between the psychological/behavioral makeup of Plutarch’s ideal leader and recent influential work in leadership theory by Daniel Goleman, James MacGregor Burns, and Bernard M. Bass. The significant degree of correspondence that emerges leads to a discussion of the literary techniques Plutarch employs to place the salient aspects of great leadership (and its opposite) in sharper relief, including his developed use of synkrisis and the Socratic paradigm, as well as the representation of performative acts of leadership.
OBJECTIVES/GOALS: Clonal hematopoiesis of indeterminate potential (CHIP) is a common age-related condition that confers an increased risk of blood cancer, cardiovascular disease, and overall mortality. Larger proportions of blood cells with the CHIP mutation (clones) lead to worse outcomes. The goal of this study was to characterize CHIP clonal behavior over time. METHODS/STUDY POPULATION: While DNA biobanks can identify large cohorts of individuals with CHIP, they typically contain blood from only a single timepoint, limiting the ability to infer how CHIP clones change over time. In this preliminary study, we utilized multi-timepoint blood samples from 101 individuals with CHIP in Vanderbilt’s biobank (BioVU) to characterize clonal behavior over time. Using a CHIP gene-specific sequencing pipeline, we characterized each individual’s CHIP mutation(s) and how the fraction of cells with the CHIP mutation expanded or shrank over time. By Spring 2023, we will also include ~300 additional individuals with CHIP in this study. RESULTS/ANTICIPATED RESULTS: CHIP mutations occurred 48% of the time in DNMT3A and 23% of the time in TET2, consistent with previous studies. 21% of individuals had more than one CHIP mutation. The mean interval between the two timepoints was 5.2 years (SD=2.9). Surprisingly, we observed both clonal expansion and clonal reduction across timepoints, with 30% of DNMT3A, 0.6% of TET2, and 46% of JAK2 clones shrinking over time. The fastest average expansion was seen in TET2 clones (2% growth/year) and the slowest in DNMT3A clones (0.4% growth/year), but there was substantial variation between individuals. In DNMT3A clones, no differences were observed among loss-of-function mutations, missense mutations, and DNMT3A R882 hotspot mutations. Clonal competition was observed in individuals with multiple driver mutations. DISCUSSION/SIGNIFICANCE: We used multi-timepoint blood samples to quantify the change in CHIP cell fraction over time on a per-individual basis and observed novel clonal behavior and competition. Understanding the factors that influence the rate of CHIP progression can lead to personalized disease risk assessment for individuals with CHIP.
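As a rough illustration of the quantity being tracked above, the sketch below estimates a per-individual clone growth rate from variant allele fractions (VAF) measured at two timepoints; the column names and example records are hypothetical stand-ins, not the study’s actual pipeline.

```python
# Illustrative sketch (not the study's pipeline): estimating per-individual
# CHIP clone growth from variant allele fraction (VAF) at two timepoints.
# Column names and the example records are hypothetical.
import pandas as pd

samples = pd.DataFrame({
    "subject_id": ["A", "B"],
    "gene":       ["DNMT3A", "TET2"],
    "vaf_t1":     [0.08, 0.05],   # VAF at first blood draw
    "vaf_t2":     [0.10, 0.12],   # VAF at second blood draw
    "years_between_draws": [5.0, 4.0],
})

# Linear growth rate in VAF percentage points per year; negative values
# indicate clonal reduction rather than expansion.
samples["growth_per_year"] = (
    (samples["vaf_t2"] - samples["vaf_t1"]) / samples["years_between_draws"]
) * 100

print(samples[["subject_id", "gene", "growth_per_year"]])
```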
Neurocognitive testing may advance the goal of predicting near-term suicide risk. The current study examined whether performance on a Go/No-go (GNG) task, and computational modeling to extract latent cognitive variables, could enhance prediction of suicide attempts within the next 90 days among individuals at high risk for suicide.
136 Veterans at high risk for suicide previously completed a computer-based GNG task requiring rapid responding (Go) to target stimuli while withholding responses (No-go) to infrequent foil stimuli; behavioral variables included false alarms to foils (failure to inhibit) and missed responses to targets. We conducted a secondary analysis of these data, with outcomes defined as actual suicide attempt (ASA), other suicide-related event (OtherSE) such as interrupted/aborted attempt or preparatory behavior, or neither (noSE), within 90 days after GNG testing, to examine whether GNG variables could improve ASA prediction over standard clinical variables. A computational model (the linear ballistic accumulator, LBA) was also applied to elucidate cognitive mechanisms underlying group differences.
On GNG, increased miss rate selectively predicted ASA, while increased false alarm rate predicted OtherSE (without ASA) within the 90-day follow-up window. In LBA modeling, ASA (but not OtherSE) was associated with decreases in decisional efficiency to targets, suggesting differences in the evidence accumulation process were specifically associated with upcoming ASA.
These findings suggest that GNG may improve prediction of near-term suicide risk, with distinct behavioral patterns in those who will attempt suicide within the next 90 days. Computational modeling suggests qualitative differences in cognition in individuals at near-term risk of suicide attempt.
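To make the modeling step more concrete, here is a minimal sketch of a two-accumulator linear ballistic accumulator, showing how a lower drift rate toward Go targets (the "decisional efficiency" referred to above) can produce more missed responses under a response deadline; all parameter values are illustrative, not those estimated in the study.

```python
# Minimal sketch of a two-accumulator linear ballistic accumulator (LBA),
# illustrating how a lower drift rate toward Go targets produces more missed
# responses under a deadline. Parameter values are illustrative only.
import numpy as np

rng = np.random.default_rng(0)

def simulate_go_trials(v_go, v_nogo=1.0, A=0.5, b=1.0, s=0.3, t0=0.2,
                       deadline=1.0, n=10_000):
    """Return miss rate: proportion of Go trials with no Go response by the deadline."""
    start = rng.uniform(0, A, size=(n, 2))               # start points
    drift = rng.normal([v_go, v_nogo], s, size=(n, 2))   # per-trial drift rates
    drift = np.clip(drift, 1e-6, None)                   # keep drifts positive
    finish = t0 + (b - start) / drift                    # time each accumulator reaches b
    winner = finish.argmin(axis=1)                       # 0 = Go response, 1 = No-go
    rt = finish.min(axis=1)
    missed = (winner != 0) | (rt > deadline)             # no Go response, or too slow
    return missed.mean()

print("miss rate, higher drift:", simulate_go_trials(v_go=3.0))
print("miss rate, lower drift: ", simulate_go_trials(v_go=1.5))
```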
Background: Early postoperative and acute prosthetic joint infection (PJI) may be managed with debridement, antibiotics, and implant retention (DAIR). Among patients with nonstaphylococcal PJI, an initial 4–6-week course of intravenous or highly bioavailable oral antibiotics is recommended in the Infectious Diseases Society of America (IDSA) guidelines, with disagreement among committee members on the need for subsequent chronic oral antimicrobial suppression (CAS). We aimed to characterize patients with nonstaphylococcal PJI who received CAS and to compare them to those who did not receive CAS. Methods: This retrospective cohort study included patients admitted to Veterans’ Affairs (VA) hospitals from 2003 to 2017 who had a PJI caused by nonstaphylococcal bacteria, underwent DAIR, and received 4–6 weeks of antimicrobial treatment. PJI was defined by Musculoskeletal Infection Society (MSIS) 2011 criteria. CAS was defined as at least 6 months of oral antibiotics following initial treatment of the PJI. Patients were followed for 5 years after debridement. We used χ2 tests and t tests to compare patients who received CAS with those who did not receive CAS. Results: Overall, 561 patients had a nonstaphylococcal PJI treated with DAIR, and 80.6% of patients received CAS. The most common organisms causing PJI were streptococci. We detected no significant differences between patients who received CAS and those who did not receive CAS, except that modified Acute Physiology and Chronic Health Evaluation (mAPACHE) scores were higher among patients who did not receive CAS (Table 1). Conclusion: Patients not on CAS were more severely ill (by mAPACHE) than those on CAS. Otherwise, the 2 groups were not different. This finding was contrary to our hypothesis that patients with multiple comorbidities or higher mAPACHE scores would be more likely to receive CAS. A future analysis will be conducted to assess treatment failure in both groups. We hope to find a specific cohort who may benefit from CAS and hope to deimplement CAS in others who may not benefit from it.
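For readers who want to reproduce the form of these bivariate comparisons, a minimal sketch follows, with a χ2 test for a categorical characteristic and a Welch t test for a continuous severity score; the counts and distributions are invented for illustration.

```python
# Sketch of the bivariate comparisons described above: a chi-square test for a
# categorical characteristic and a t test for a continuous one (e.g. mAPACHE),
# comparing patients who received chronic suppression (CAS) with those who did
# not. All numbers below are made up for illustration.
import numpy as np
from scipy import stats

# 2x2 table: rows = CAS / no CAS, columns = characteristic present / absent
table = np.array([[120, 332],   # CAS
                  [ 40,  69]])  # no CAS
chi2, p_cat, dof, expected = stats.chi2_contingency(table)

# Continuous severity score in each group (hypothetical values)
mapache_cas    = np.random.default_rng(1).normal(8.0, 3.0, 452)
mapache_no_cas = np.random.default_rng(2).normal(9.0, 3.0, 109)
t, p_cont = stats.ttest_ind(mapache_cas, mapache_no_cas, equal_var=False)

print(f"chi-square p = {p_cat:.3f}, t-test p = {p_cont:.3f}")
```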
Group Name: VHA Center for Antimicrobial Stewardship and Prevention of Antimicrobial Resistance (CASPAR) Background: Antimicrobial stewardship programs (ASPs) are advised to measure antimicrobial consumption as a metric for audit and feedback. However, most ASPs lack the tools necessary for appropriate risk adjustment and standardized data collection, which are critical for peer-program benchmarking. We created a system that automatically extracts antimicrobial use data and patient-level factors for risk adjustment, and a dashboard to present risk-adjusted benchmarking metrics for ASPs within the Veterans’ Health Administration (VHA). Methods: We built a system to extract patient-level data for antimicrobial use, procedures, demographics, and comorbidities for acute inpatient and long-term care units at all VHA hospitals utilizing the VHA’s Corporate Data Warehouse (CDW). We built baseline negative binomial regression models to perform risk adjustment based on patient- and unit-level factors using records dated between October 2016 and September 2018. These models were then leveraged both retrospectively and prospectively to calculate observed-to-expected ratios of antimicrobial use for each hospital and for specific units within each hospital. Data transformation and applications of risk-adjustment models were automatically performed within the CDW database server, followed by monthly scheduled data transfer from the CDW to the Microsoft Power BI server for interactive data visualization. Frontline antimicrobial stewards at 10 VHA hospitals participated in the project as pilot users. Results: Separate baseline risk-adjustment models to predict days of therapy (DOT) for all antibacterial agents were created for acute-care and long-term care units based on 15,941,972 patient days and 3,011,788 DOT between October 2016 and September 2018 at 134 VHA hospitals. The risk-adjustment models include month, unit types (eg, intensive care unit [ICU] vs non-ICU for acute care), specialty, age, gender, comorbidities (50 and 30 factors for acute care and long-term care, respectively), and preceding procedures (45 and 24 procedures for acute care and long-term care, respectively). We created additional models for each antimicrobial category based on National Healthcare Safety Network definitions. For each hospital, risk-adjusted benchmarking metrics and a monthly ranking within the VHA system were visualized and presented to end users through the dashboard (an example screenshot appears in Figure 1). Conclusions: Developing an automated surveillance system for antimicrobial consumption and risk-adjusted benchmarking using an electronic medical record data warehouse is feasible and can potentially provide valuable tools for ASPs, especially at hospitals with no or limited local informatics expertise. Future efforts will evaluate the effectiveness of dashboards in these settings.
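A minimal sketch of the risk-adjustment logic described above follows: a negative binomial model for days of therapy (DOT) with patient-days as the exposure, followed by per-hospital observed-to-expected ratios. The covariates and synthetic data are illustrative stand-ins, not the actual CDW extract or model specification.

```python
# Sketch: negative binomial risk adjustment for antimicrobial days of therapy
# (DOT) with patient-days as exposure, then observed-to-expected (O/E) ratios
# per hospital. Covariates and synthetic data are hypothetical.
import numpy as np
import pandas as pd
import statsmodels.api as sm
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
n = 400  # hypothetical unit-months
df = pd.DataFrame({
    "hospital": rng.integers(0, 10, n),
    "month": rng.integers(1, 13, n),
    "icu": rng.integers(0, 2, n),
    "mean_age": rng.normal(68, 8, n),
    "comorbidity_score": rng.poisson(2, n),
    "patient_days": rng.integers(200, 900, n),
})
df["dot"] = rng.poisson(0.2 * df["patient_days"])  # observed days of therapy

# Baseline model: expected DOT given unit/patient mix, with patient-days as exposure.
model = smf.glm(
    "dot ~ C(month) + icu + mean_age + comorbidity_score",
    data=df,
    family=sm.families.NegativeBinomial(),
    offset=np.log(df["patient_days"]),
).fit()

# Observed-to-expected ratio per hospital; values > 1 flag higher-than-expected use.
df["expected_dot"] = model.fittedvalues
oe = df.groupby("hospital")[["dot", "expected_dot"]].sum()
oe["o_to_e"] = oe["dot"] / oe["expected_dot"]
print(oe.sort_values("o_to_e", ascending=False).head())
```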
Evidence from genetic, post-mortem and animal studies suggests that N-Methyl-D-Aspartate Receptor (NMDAR) hypofunction has an important role in the pathophysiology of psychosis. However, it is not known whether NMDAR activity is altered in the early stages of psychosis or whether this is linked to symptom severity. Our aim was to investigate NMDAR availability in first-episode psychosis (FEP) and to determine whether it is linked to symptom severity. The NMDAR hypofunction hypothesis of schizophrenia was initially proposed in the 1990s on the basis of observations that ketamine and phencyclidine (PCP) induced the full range of schizophrenia-like symptoms (positive, negative and cognitive) when given to healthy participants and that they worsen symptoms in patients with schizophrenia.
We recruited 40 volunteers, including 21 patients with schizophrenia from early intervention services in London (12 antipsychotic-free and 9 receiving antipsychotic medication) and 19 matched healthy controls. The uptake of an NMDAR selective ligand, [18F]GE179, was measured using positron emission tomography (PET) and indexed using the distribution volume ratio (DVR) and volume of distribution (VT, in millilitres per cubic centimetre) of [18F]GE179 in the hippocampus and additional exploratory regions (anterior cingulate cortex (ACC), thalamus, striatum and temporal lobe). Symptom severity was measured using the Positive and Negative Syndrome Scale (PANSS).
A total of 37 individuals were included in the analyses (mean [SD] age of controls, 26.7 [4.5] years; mean [SD] age of patients, 25.3 [4.9] years). There was a significant reduction in hippocampal DVR in the patients with schizophrenia relative to healthy controls (p = 0.02, Cohen's d = 0.81). Although the VT of [18F]GE179 was lower in absolute terms in patients, there was no significant effect of group on VT in the hippocampus (p = 0.15, Cohen's d = 0.49) or the exploratory brain regions. There was a negative association between hippocampal DVR and total PANSS symptoms (rho = –0.47, p = 0.04), depressive symptoms (rho = –0.67, p = 0.002), and general PANSS symptoms (rho = –0.74, p = 0.001).
These results indicate lower hippocampal NMDAR levels in schizophrenia relative to controls with a large effect size, and that lower NMDAR levels are associated with greater levels of symptom severity. These findings are consistent with the role of NMDAR hypofunction in the pathophysiology of schizophrenia; however, further work is required to test specificity and causal relationships.
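The following sketch reproduces the form of the key analyses (a group comparison of hippocampal DVR summarized as Cohen's d, and a Spearman correlation between DVR and symptom scores) on simulated values; it is not the study's analysis code or data.

```python
# Illustrative re-creation of the form of the analyses above: a group
# comparison of hippocampal DVR (effect size as Cohen's d) and a Spearman
# correlation between DVR and symptom scores. Values are simulated.
import numpy as np
from scipy import stats

rng = np.random.default_rng(42)
dvr_controls = rng.normal(1.10, 0.08, 18)    # hypothetical hippocampal DVR, controls
dvr_patients = rng.normal(1.03, 0.09, 19)    # hypothetical hippocampal DVR, patients
panss_total  = rng.normal(60, 15, 19)        # hypothetical PANSS totals for patients

t, p = stats.ttest_ind(dvr_controls, dvr_patients)

# Cohen's d with a pooled standard deviation
n1, n2 = len(dvr_controls), len(dvr_patients)
pooled_sd = np.sqrt(((n1 - 1) * dvr_controls.var(ddof=1) +
                     (n2 - 1) * dvr_patients.var(ddof=1)) / (n1 + n2 - 2))
d = (dvr_controls.mean() - dvr_patients.mean()) / pooled_sd

rho, p_rho = stats.spearmanr(dvr_patients, panss_total)
print(f"group difference: p = {p:.3f}, Cohen's d = {d:.2f}; Spearman rho = {rho:.2f}")
```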
To develop a fully automated algorithm using data from the Veterans’ Affairs (VA) electronic medical record (EMR) to identify deep-incisional surgical site infections (SSIs) after cardiac surgeries and total joint arthroplasties (TJAs) for use in research studies.
Retrospective cohort study.
This study was conducted in 11 VA hospitals.
Patients who underwent coronary artery bypass grafting or valve replacement between January 1, 2010, and March 31, 2018 (cardiac cohort) and patients who underwent total hip arthroplasty or total knee arthroplasty between January 1, 2007, and March 31, 2018 (TJA cohort).
Relevant clinical information and administrative code data were extracted from the EMR. The outcomes of interest were mediastinitis, endocarditis, or deep-incisional or organ-space SSI within 30 days after surgery. Multiple logistic regression analysis with a repeated regular bootstrap procedure was used to select variables and to assign points in the models. Sensitivities, specificities, positive predictive values (PPVs) and negative predictive values were calculated with comparison to outcomes collected by the Veterans’ Affairs Surgical Quality Improvement Program (VASQIP).
Overall, 49 (0.5%) of the 13,341 cardiac surgeries were classified as mediastinitis or endocarditis, and 83 (0.6%) of the 12,992 TJAs were classified as deep-incisional or organ-space SSIs. With at least 60% sensitivity, the PPVs of the SSI detection algorithms after cardiac surgeries and TJAs were 52.5% and 62.0%, respectively.
Considering the low prevalence rate of SSIs, our algorithms were successful in identifying a majority of patients with a true SSI while simultaneously reducing false-positive cases. As a next step, validation of these algorithms in different hospital systems with EMR will be needed.
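As a pointer to how such validation metrics are computed, the sketch below derives sensitivity, specificity, PPV, and NPV by comparing algorithm flags against reference-standard labels (e.g., VASQIP-reviewed outcomes); the arrays are toy data.

```python
# Sketch of the validation step described above: comparing algorithm flags
# against reference-standard SSI labels and reporting sensitivity,
# specificity, PPV, and NPV. The arrays here are toy data.
import numpy as np

reference = np.array([1, 1, 0, 0, 0, 1, 0, 0, 0, 0])  # 1 = true deep/organ-space SSI
algorithm = np.array([1, 0, 0, 1, 0, 1, 0, 0, 0, 0])  # 1 = flagged by the algorithm

tp = np.sum((algorithm == 1) & (reference == 1))
fp = np.sum((algorithm == 1) & (reference == 0))
fn = np.sum((algorithm == 0) & (reference == 1))
tn = np.sum((algorithm == 0) & (reference == 0))

sensitivity = tp / (tp + fn)
specificity = tn / (tn + fp)
ppv = tp / (tp + fp)
npv = tn / (tn + fn)
print(f"sens={sensitivity:.2f} spec={specificity:.2f} PPV={ppv:.2f} NPV={npv:.2f}")
```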
We evaluated the relationship between local MRSA prevalence rates and antibiotic use across 122 VHA hospitals in 2016. Higher hospital-level MRSA prevalence was associated with significantly higher rates of antibiotic use, even after adjusting for case mix and stewardship strategies. Benchmarking anti-MRSA antibiotic use may need to adjust for MRSA prevalence.
In this cohort of Escherichia coli and Klebsiella spp. hospital-onset bacteremia, isolated fluoroquinolone resistance had a larger relative impact on mortality than other phenotypic resistance patterns. This finding may support stewardship efforts targeting unnecessary fluoroquinolone use and increased attention from infection prevention and control departments.
Magnetic fields are observed on all scales in the Universe (see e.g. Kronberg 1994), but little is known about the origin and evolution of those fields with cosmic time. Seed fields of arbitrary source must be amplified to present-day values and distributed among cosmic structures. Therefore, the emergence of cosmic magnetic fields and corresponding dynamo processes (see e.g. Zel'dovich et al. 1983; Kulsrud et al. 1997) can only be jointly understood with the very basic processes of structure and galaxy formation (see e.g. Mo et al. 2010).
Using the MHD version of Gadget3 (Stasyszyn, Dolag & Beck 2013) and a model for the seeding of magnetic fields by supernovae (SN), we performed simulations of the evolution of the magnetic fields in galaxy clusters and studied their effects on the heat transport within the intracluster medium (ICM). This mechanism – where SN explosions during the assembly of galaxies provide magnetic seed fields – has been shown to reproduce the magnetic field in Milky Way-like galactic halos (Beck et al. 2013). The build-up of the magnetic field at redshifts before z = 5 and the correspondingly predicted rotation measure evolution are also in good agreement with current observations. Such magnetic fields, present at high redshift, are then transported out of the forming protogalaxies into the large-scale structure and pollute the ICM (in a similar fashion to the transport of metals). Here, complex velocity patterns driven by the formation of cosmic structures further amplify and distribute the magnetic fields. In galaxy clusters, the magnetic fields are therefore amplified to the observed μG level and produce the observed amplitude of rotation measures of several hundred rad/m2. We also demonstrate that heat conduction in such turbulent fields is on average equivalent to a suppression factor of around 1/20 of the classical Spitzer value and, in contrast to classical, isotropic heat transport, leads to temperature structures within the ICM compatible with observations (Arth et al. 2014).
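For orientation, a back-of-the-envelope sketch of the quoted suppression follows, using a commonly quoted approximation for the classical Spitzer conductivity; the exact prefactor should be treated as an assumption rather than the value used in the simulations.

```python
# Back-of-the-envelope sketch: classical Spitzer thermal conductivity for a
# hot ICM, reduced by the ~1/20 suppression factor quoted above. The Spitzer
# coefficient is the commonly quoted approximation
# kappa ~ 4.6e13 (T / 1e8 K)^(5/2) erg s^-1 cm^-1 K^-1 (Coulomb log ~ 40);
# treat the exact prefactor as an assumption.
def spitzer_kappa(T_kelvin):
    """Approximate classical Spitzer conductivity in erg s^-1 cm^-1 K^-1."""
    return 4.6e13 * (T_kelvin / 1e8) ** 2.5

T_icm = 8e7                                 # ~7 keV cluster temperature, in Kelvin
kappa_classical = spitzer_kappa(T_icm)
kappa_effective = kappa_classical / 20.0    # suppression factor from the simulations

print(f"classical: {kappa_classical:.2e}, suppressed: {kappa_effective:.2e} erg/s/cm/K")
```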
We present a model for the seeding and evolution of magnetic fields in galaxies by supernovae (SN). SN explosions during galaxy assembly provide seed fields, which are subsequently amplified by compression, shear flows and random motions. Our model explains the origin of μG magnetic fields within galactic structures. We implement our model in the MHD version of the cosmological simulation code Gadget-3 and couple it with a multi-phase description of the interstellar medium. We perform simulations of Milky Way-like galactic halo formation and analyze the distribution and strength of the magnetic field. We investigate the intrinsic rotation measure (RM) evolution and find RM values exceeding 1000 rad/m2 at high redshifts and RM values around 10 rad/m2 at the present day. We compare our simulations to a limited set of observational data points and find encouraging similarities. In our model, galactic magnetic fields are a natural consequence of the very basic processes of star formation and galaxy assembly.
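To connect the quoted RM values to the underlying quantity, the sketch below evaluates the standard rotation measure integral, RM = 0.812 ∫ n_e B_∥ dl (in rad m^-2, with n_e in cm^-3, B_∥ in μG and path length in pc), for an illustrative, hypothetical line of sight.

```python
# Sketch of a rotation measure (RM) estimate using the standard expression
# RM = 0.812 * integral(n_e * B_parallel dl), with n_e in cm^-3, B_parallel
# in microGauss, path length in pc, and RM in rad m^-2. The density and
# field values below are illustrative only.
import numpy as np

n_steps = 1000
path_pc = np.full(n_steps, 500_000 / n_steps)               # ~500 kpc line of sight
n_e     = np.full(n_steps, 1e-3)                            # electron density [cm^-3]
B_par   = np.random.default_rng(0).normal(0, 2.0, n_steps)  # tangled field [microG]

rm = 0.812 * np.sum(n_e * B_par * path_pc)                  # rad m^-2
print(f"RM ~ {rm:.0f} rad/m^2")
```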
We present Magneticum Pathfinder, a new set of hydrodynamical cosmological simulations covering a large range of cosmological scales. Among the important physical processes included in the simulations are the chemical and thermodynamical evolution of the diffuse gas as well as the evolution of stars and black holes and the corresponding feedback channels. In the high resolution boxes aimed at studies of galaxy formation and evolution, populations of both disk and spheroidal galaxies are self-consistently reproduced. These galaxy populations match the observed stellar mass function and show the same trends for disks and spheroids in the mass–size relation as observations from the SDSS. Additionally, we demonstrate that the simulated galaxies successfully reproduce the observed specific angular-momentum–mass relations for the two different morphological types of galaxies. In summary, the Magneticum Pathfinder simulations are a valuable tool for studying the assembly of cosmic and galactic structures in the universe.
In this paper, we review the production of radiocarbon and other radionuclides in extraterrestrial materials. This radioactivity can be produced by the effects of solar and galactic cosmic rays on solid material in space. In addition, direct implantation of 14C and other radionuclides can occur at the lunar surface. The level of 14C and other radionuclides in a meteorite can be used to determine its residence time on the Earth's surface, or “terrestrial age”. 14C provides the best tool for estimating terrestrial ages of meteorites collected in desert environments. Age control allows us to understand the time constraints on the processes by which meteorites are weathered, as well as mean storage times. Finally, we discuss the use of differences in the 14C/12C ratio between organic material and carbonates produced on other planetary objects and terrestrial material. These differences can be used to assess the importance of distinguishing primary material formed on the parent body from secondary alteration of meteoritic material after it lands on the Earth.
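As a worked example of the terrestrial-age idea mentioned above, the sketch below applies simple 14C decay, age = (half-life / ln 2) · ln(A_fall / A_measured); the assumed saturation activity at fall is a hypothetical value, not one tied to a specific meteorite class.

```python
# Worked sketch of a terrestrial-age estimate from 14C decay, as discussed
# above: age = (half-life / ln 2) * ln(A_fall / A_measured). The assumed
# saturation activity at the time of fall (A_fall) is a hypothetical value;
# real work uses values appropriate to the meteorite class.
import math

HALF_LIFE_14C = 5730.0   # years
A_fall = 50.0            # assumed saturated 14C activity at fall (dpm/kg), hypothetical
A_measured = 10.0        # measured 14C activity in the recovered meteorite (dpm/kg)

terrestrial_age = (HALF_LIFE_14C / math.log(2)) * math.log(A_fall / A_measured)
print(f"terrestrial age ~ {terrestrial_age:.0f} years")   # ~13,300 years
```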