Constraining patterns of growth using directly observable and quantifiable characteristics can reveal a wealth of information regarding the biology of the Ediacara biota—the oldest macroscopic, complex community-forming organisms in the fossil record. However, such analyses rely on individuals captured at an instant in time at various growth stages, so different interpretations can be derived from the same material. Here we leverage newly discovered and well-preserved Dickinsonia costata Sprigg, 1947 from South Australia, combined with hundreds of previously described specimens, to test competing hypotheses for the location of module addition. We find considerable variation in the relationship between the total number of modules and body size that cannot be explained solely by expansion and contraction of individuals. Patterns derived assuming new modules differentiated at the anterior result in numerous examples in which the oldest module(s) must decrease in size with overall growth, potentially falsifying this hypothesis. Observed polarity, as well as the consistent posterior location of defects and indentations, supports module formation at this end in D. costata. Regardless, changes in repeated units with growth share similarities with those regulated by morphogen gradients in metazoans today, suggesting that these genetic pathways were operating in Ediacaran animals.
Studying phenotypic and genetic characteristics of age at onset (AAO) and polarity at onset (PAO) in bipolar disorder can provide new insights into disease pathology and facilitate the development of screening tools.
To examine the genetic architecture of AAO and PAO and their association with bipolar disorder disease characteristics.
Genome-wide association studies (GWASs) and polygenic score (PGS) analyses of AAO (n = 12 977) and PAO (n = 6773) were conducted in patients with bipolar disorder from 34 cohorts and a replication sample (n = 2237). The association of onset with disease characteristics was investigated in two of these cohorts.
Earlier AAO was associated with a higher probability of psychotic symptoms, suicidality, lower educational attainment, not living together and fewer episodes. Depressive onset correlated with suicidality and manic onset correlated with delusions and manic episodes. Systematic differences in AAO between cohorts and continents of origin were observed. This was also reflected in single-nucleotide variant-based heritability estimates, with higher heritabilities for stricter onset definitions. Increased PGS for autism spectrum disorder (β = −0.34 years, s.e. = 0.08), major depression (β = −0.34 years, s.e. = 0.08), schizophrenia (β = −0.39 years, s.e. = 0.08), and educational attainment (β = −0.31 years, s.e. = 0.08) were associated with an earlier AAO. The AAO GWAS identified one significant locus, but this finding did not replicate. Neither GWAS nor PGS analyses yielded significant associations with PAO.
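The PGS effect sizes above are reported in years of onset per standard deviation of the score, which is the output of a regression of AAO on a standardized polygenic score. Below is a minimal sketch of that kind of analysis, not the consortium's pipeline; the DataFrame and its column names (aao, pgs, pc1–pc4) are hypothetical placeholders.

```python
# Minimal sketch of a PGS-on-AAO regression; column names are hypothetical.
import pandas as pd
import statsmodels.formula.api as smf

def pgs_effect(df: pd.DataFrame) -> tuple[float, float]:
    # Standardize the score so beta reads as "years of onset per SD of PGS".
    df = df.assign(pgs_std=(df["pgs"] - df["pgs"].mean()) / df["pgs"].std())
    # Adjust for ancestry principal components, as is standard in PGS analyses.
    fit = smf.ols("aao ~ pgs_std + pc1 + pc2 + pc3 + pc4", data=df).fit()
    return fit.params["pgs_std"], fit.bse["pgs_std"]  # beta and s.e.
```

A negative beta, as for the schizophrenia and major depression scores above, indicates that a higher polygenic load is associated with earlier onset.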
AAO and PAO are associated with indicators of bipolar disorder severity. Individuals with an earlier onset show an increased polygenic liability for a broad spectrum of psychiatric traits. Systematic differences in AAO across cohorts, continents and phenotype definitions introduce significant heterogeneity, affecting analyses.
Research was conducted using a functional malachite green colorimetric assay to evaluate acetyl-coenzyme A carboxylase (ACCase) activity previously identified as resistant to sethoxydim and to select aryloxyphenoxypropionate (FOP) herbicides, fenoxaprop and fluazifop. Two resistant southern crabgrass [Digitaria ciliaris (Retz.) Koeler] biotypes, R1 and R2, containing an Ile-1781-Leu amino acid substitution and previously identified as resistant to sethoxydim, pinoxaden, and fluazifop but not clethodim, were used as the resistant chloroplastic ACCase sources and compared with a known susceptible (S) ACCase. Dose-response studies with sethoxydim, clethodim, fluazifop-p-butyl, and pinoxaden (0.6 to 40 µM) were conducted to compare the ACCase–herbicide interactions of R1, R2, and S using the malachite green functional assay. Assay results indicated that the R biotypes required higher concentrations of ACCase-targeting herbicides to inhibit ACCase activity than S: IC50 values of all four herbicides for R biotypes were consistently an order of magnitude greater than those of S. No sequence differences in the carboxyltransferase domain were observed between R1 and R2; however, R2 IC50 values were greater across all herbicides. These results indicate that the malachite green functional assay is effective in evaluating ACCase activity of R and S biotypes in the presence of ACCase-targeting herbicides and can replace 14C-based radiometric functional assays.
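For readers unfamiliar with how IC50 values such as these are obtained, the following is a minimal sketch of fitting a four-parameter log-logistic curve to a dose-response series like the 0.6 to 40 µM range above. The data points are invented for illustration and are not the study's measurements.

```python
# Sketch of IC50 estimation from enzyme-activity dose-response data.
import numpy as np
from scipy.optimize import curve_fit

def log_logistic(dose, lower, upper, ic50, slope):
    """Four-parameter log-logistic: activity falls from `upper` to `lower`."""
    return lower + (upper - lower) / (1.0 + (dose / ic50) ** slope)

dose = np.array([0.6, 1.2, 2.5, 5.0, 10.0, 20.0, 40.0])        # uM (illustrative)
activity = np.array([98, 95, 80, 55, 30, 15, 8], dtype=float)  # % of untreated control

params, _ = curve_fit(log_logistic, dose, activity, p0=[0.0, 100.0, 5.0, 1.0])
print(f"IC50 = {params[2]:.1f} uM")  # dose giving half-maximal inhibition
```

A resistant biotype shifts the fitted curve rightward, so its IC50 lands an order of magnitude or more above the susceptible value, as reported above.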
Cannabis use has been associated with increased risk of psychiatric disorders. However, associations between adolescent cannabis use, depression and anxiety disorders are inconsistently reported in longitudinal samples.
To study associations of adolescent cannabis use with depression and anxiety disorders.
We used data from the Northern Finland Birth Cohort 1986, linked to nationwide registers, to study the association between adolescent cannabis use and depression and anxiety disorders until 33 years of age (until 2018).
We included 6325 participants (48.8% male) in the analyses; 352 (5.6%) participants reported cannabis use until 15–16 years of age. By the end of the follow-up, 583 (9.2%) participants were diagnosed with unipolar depression and 688 (10.9%) were diagnosed with anxiety disorder. Cannabis use in adolescence was associated with an increased risk of depression and anxiety disorders in crude models. After adjusting for parental psychiatric disorder, baseline emotional and behavioural problems, demographic factors and other substance use, using cannabis five or more times was associated with increased risk of anxiety disorders (hazard ratio 2.01, 95% CI 1.15–3.82), and using cannabis once (hazard ratio 1.93, 95% CI 1.30–2.87) or two to four times (hazard ratio 2.02, 95% CI 1.24–3.31) was associated with increased risk of depression.
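Hazard ratios of this kind are typically estimated with a Cox proportional hazards model. Below is a minimal, self-contained sketch using the lifelines package on synthetic data; the variable names stand in for the cohort's exposure and covariates and are assumptions, not the authors' code.

```python
# Illustrative Cox model on synthetic data; column names are hypothetical.
import numpy as np
import pandas as pd
from lifelines import CoxPHFitter

rng = np.random.default_rng(42)
n = 1000
df = pd.DataFrame({
    "followup_years": rng.exponential(12, n).clip(0.5, 18),  # time to event or censoring
    "event": rng.integers(0, 2, n),                          # 1 = diagnosis observed
    "cannabis_5plus": rng.integers(0, 2, n),                 # exposure indicator
    "male": rng.integers(0, 2, n),                           # example covariate
})

cph = CoxPHFitter()
# All columns other than duration/event enter as covariates, mirroring
# the adjusted models described in the abstract.
cph.fit(df, duration_col="followup_years", event_col="event")
print(cph.hazard_ratios_)  # exp(coef): adjusted hazard ratios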
Cannabis use in adolescence was associated with an increased risk of future depression and anxiety disorders. Further research is needed to clarify whether this association is causal, which could then inform public health messages about cannabis use in adolescence.
In the UK, acute mental healthcare is provided by in-patient wards and crisis resolution teams. Readmission to acute care following discharge is common. Acute day units (ADUs) are also provided in some areas.
To assess predictors of readmission to acute mental healthcare following discharge in England, including availability of ADUs.
We enrolled a national cohort of adults discharged from acute mental healthcare in the English National Health Service (NHS) between 2013 and 2015, determined the risk of readmission to either in-patient or crisis teams, and used multivariable, multilevel logistic models to evaluate predictors of readmission.
Of a total of 231 998 eligible individuals discharged from acute mental healthcare, 49 547 (21.4%) were readmitted within 6 months, with a median time to readmission of 34 days (interquartile range 10–88 days). Most variation in readmission (98%) was attributable to individual patient-level rather than provider (trust)-level effects (2.0%). Risk of readmission was not associated with local availability of ADUs (adjusted odds ratio 0.96, 95% CI 0.80–1.15). Statistically significant elevated risks were identified for participants who were female, older, single, from Black or mixed ethnic groups, or from more deprived areas. Clinical predictors included shorter index admission, psychosis and being an in-patient at baseline.
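The 98%/2% split between patient- and trust-level variation is conventionally derived from the variance partition coefficient (VPC) of a two-level random-intercept logistic model. A standard formulation (assumed here; the paper's exact specification may differ) is:

```latex
\operatorname{logit} \Pr(Y_{ij} = 1) = \mathbf{x}_{ij}^{\top}\boldsymbol{\beta} + u_j,
\qquad u_j \sim \mathcal{N}(0,\, \sigma_u^2),
\qquad \mathrm{VPC} = \frac{\sigma_u^2}{\sigma_u^2 + \pi^2/3}
```

Here u_j is the trust-level random intercept and π²/3 (≈3.29) is the patient-level residual variance on the logistic scale; a VPC of about 0.02 corresponds to the 2% provider-level share reported above.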
Relapse and readmission to acute mental healthcare are common following discharge and occur early. Readmission was not influenced significantly by trust-level variables including availability of ADUs. More support for relapse prevention and symptom management may be required following discharge from acute mental healthcare.
A goosegrass [Eleusine indica (L.) Gaertn.] population uncontrolled by paraquat (R) in a vegetable production field in St. Clair County, AL, was collected in summer 2019. Research was conducted to assess the level of resistance of the suspected resistant population compared with three populations with no suspected paraquat resistance (S1, S2, and S3). For the S populations, visual injury at all rating dates and biomass reduction at 28 d after treatment (DAT) increased exponentially with paraquat rate. S biotypes were injured more than R at 3 DAT, and biomass recovery at 28 DAT occurred only at rates <0.28 kg ha−1. For R, plant death or biomass reduction did not occur at any rate on any date. Paraquat rates that induced 50% or 90% injury or reduced biomass by 50% or 90% compared with the nontreated control (I50 or I90, respectively) were 10 to 124 times higher (I50) and 54 to 116 times higher (I90) for R than for S biotypes. These data confirm a paraquat-resistant E. indica biotype in Alabama, providing additional germplasm for the study of photosystem I electron-diverting (PSI-ED) resistance mechanisms.
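A note on why the I50 and I90 resistance ratios can differ: under the log-logistic dose-response model standard in such work (assumed here; the abstract does not state its model), I90 follows from I50 and the fitted slope b:

```latex
y(x) = \frac{d}{1 + (x / I_{50})^{b}}, \qquad
y(I_{90}) = 0.1\,d \;\Longrightarrow\; \left(\frac{I_{90}}{I_{50}}\right)^{b} = 9
\;\Longrightarrow\; I_{90} = I_{50} \cdot 9^{1/b}
```

Because 9^{1/b} grows as the slope b flattens, populations with shallower response curves show proportionally larger I90 values, so the R/S ratio at 90% response need not match the ratio at 50%.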
Analyses of macroscopic charcoal, sediment geochemistry (%C, %N, C/N, δ13C, δ15N), and fossil pollen were conducted on a sediment core recovered from Stella Lake, Nevada, establishing a 2000 year record of fire history and vegetation change for the Great Basin. Charcoal accumulation rates (CHAR) indicate that fire activity, which was minimal from the beginning of the first millennium to AD 750, increased slightly at the onset of the Medieval Climate Anomaly (MCA). Observed changes in catchment vegetation were driven by hydroclimate variability during the early MCA. Two notable increases in CHAR, which occurred during the Little Ice Age (LIA), were identified as major fire events within the catchment. Increased C/N, enriched δ15N, and depleted δ13C values correspond with these events, providing additional evidence for the occurrence of catchment-scale fire events during the late fifteenth and late sixteenth centuries. Shifts in the vegetation community composition and structure accompanied these fires, with Pinus and Picea decreasing in relative abundance and Poaceae increasing in relative abundance following the fire events. During the LIA, the vegetation change and lacustrine geochemical response was most directly influenced by the occurrence of catchment-scale fires, not regional hydroclimate.
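For context, CHAR is conventionally obtained by converting charcoal counts to concentrations and multiplying by the sediment accumulation rate; this is the standard paleofire formulation, assumed here rather than quoted from the paper:

```latex
\mathrm{CHAR}\;(\text{pieces cm}^{-2}\,\text{yr}^{-1})
= C\;(\text{pieces cm}^{-3}) \times s\;(\text{cm yr}^{-1})
```

where C is the charcoal concentration in a core sample and s the sediment accumulation rate derived from the age-depth model.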
The large-spirited, learned, and sharp-witted organizers and contributors to this collection of essays have come to understand that I have quite mixed feelings about Festschriften and have dodged them for as long as possible. Why? Well, there are two reasons. The first reason is that such performances risk becoming a classic example of what I have elsewhere called “the public transcript.” The form tends to suppress dissent in favor of praise and filters out the “backstage” chorus of criticism and parody that accompanies, and should accompany, any body of work in social science. The second reason is that such celebratory events tend to occur at the dusk of a scholar's career, and, simply by summing up a trajectory of thought, resemble an intellectual funeral. “Well, that's that; what on earth does what he wrote add up to?” Since I flatter myself that I may still have a few novel and interesting things to say, things that may change my epitaph, my inclination is to not show up at the premature wake.
The analysis presented here was motivated by an objective of describing the interactions between the physical and biological processes governing the responses of tidal wetlands to rising sea level and the ensuing equilibrium elevation. We define equilibrium here as meaning that the elevation of the vegetated surface relative to mean sea level (MSL) remains within the vertical range of tolerance of the vegetation on decadal time scales or longer. The equilibrium is dynamic, and constantly responding to short-term changes in hydrodynamics, sediment supply, and primary productivity. For equilibrium to occur, the magnitude of vertical accretion must be great enough to compensate for change in the rate of sea-level rise (SLR). SLR is defined here as meaning the local rate relative to a benchmark, typically a gauge. Equilibrium is not a given, and SLR can exceed the capacity of a wetland to accrete vertically.
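A minimal way to formalize this dynamic equilibrium (a sketch consistent with the definitions above, not necessarily the authors' model) is to write the elevation of the vegetated surface relative to MSL, E, as a mass balance:

```latex
\frac{dE}{dt} = A(E) - r_{\mathrm{SLR}}, \qquad
A(E^{*}) = r_{\mathrm{SLR}} \;\;\text{at equilibrium}
```

where A(E) is the elevation-dependent vertical accretion rate and r_SLR the local, gauge-relative rate of sea-level rise. A stable equilibrium elevation E* exists only while r_SLR remains within the range of accretion rates the wetland can sustain, which restates the caveat that SLR can exceed the capacity of a wetland to accrete vertically.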
For people in mental health crisis, acute day units (ADUs) provide daily structured sessions and peer support in non-residential settings, often as an addition or alternative to crisis resolution teams (CRTs). There is little recent evidence about outcomes for those using ADUs, particularly compared with those receiving CRT care alone.
We aimed to investigate readmission rates, satisfaction and well-being outcomes for people using ADUs and CRTs.
We conducted a cohort study comparing readmission to acute mental healthcare during a 6-month period for ADU and CRT participants. Secondary outcomes included satisfaction (Client Satisfaction Questionnaire), well-being (Short Warwick–Edinburgh Mental Well-being Scale) and depression (Center for Epidemiologic Studies Depression Scale).
We recruited 744 participants (ADU: n = 431, 58%; CRT: n = 312, 42%) across four National Health Service trusts/health regions. There was no statistically significant overall difference in readmissions: 21% of ADU participants and 23% of CRT participants were readmitted over 6 months (adjusted hazard ratio 0.78, 95% CI 0.54–1.14). However, readmission results varied substantially by setting. At follow-up, ADU participants had significantly higher Client Satisfaction Questionnaire scores (2.5, 95% CI 1.4–3.5, P < 0.001) and well-being scores (1.3, 95% CI 0.4–2.1, P = 0.004), and lower depression scores (−1.7, 95% CI −2.7 to −0.8, P < 0.001), than CRT participants.
Patients who accessed ADUs demonstrated better outcomes for satisfaction, well-being and depression, and no significant differences in risk of readmission, compared with those who only used CRTs. Given the positive outcomes for patients, and the fact that ADUs are inconsistently provided in the National Health Service, their value and place in the acute care pathway needs further consideration and research.
Among 353 healthcare personnel in a longitudinal cohort in 4 hospitals in Atlanta, Georgia (May–June 2020), 23 (6.5%) had severe acute respiratory syndrome coronavirus 2 (SARS-CoV-2) antibodies. Spending >50% of a typical shift at the bedside (OR, 3.4; 95% CI, 1.2–10.5) and Black race (OR, 8.4; 95% CI, 2.7–27.4) were associated with SARS-CoV-2 seropositivity.
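Odds ratios like these come from a 2 × 2 exposure-by-seropositivity table. A minimal sketch with made-up counts (not the study's data):

```python
# Odds ratio with a Woolf (log-scale) 95% CI from a 2x2 table.
import math

def odds_ratio_ci(a, b, c, d, z=1.96):
    """a/b = exposed seropositive/seronegative; c/d = unexposed seropositive/seronegative."""
    or_ = (a * d) / (b * c)
    se = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)  # standard error of log(OR)
    lo = math.exp(math.log(or_) - z * se)
    hi = math.exp(math.log(or_) + z * se)
    return or_, lo, hi

print(odds_ratio_ci(a=15, b=110, c=8, d=220))  # hypothetical counts
```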
How do bureaucracies remember? The conventional view is that institutional memory is static and singular, the sum of recorded files and learned procedures. There is a growing body of scholarship that suggests contemporary bureaucracies are failing at this core task. This Element argues that this diagnosis misses that memories are essentially dynamic stories. They reside with people and are thus dispersed across the array of actors that make up the differentiated polity. Drawing on four policy examples from four sectors (housing, energy, family violence and justice) in three countries (the UK, Australia and New Zealand), this Element argues that treating the way institutions remember as storytelling is both empirically salient and normatively desirable. It is concluded that the current conceptualisation of institutional memory needs to be recalibrated to fit the types of policy learning practices required by modern collaborative governance.
The ‘16Up’ study conducted at the QIMR Berghofer Medical Research Institute from January 2014 to December 2018 aimed to examine the physical and mental health of young Australian twins aged 16–18 years (N = 876; 371 twin pairs and 18 triplet sets). Measurements included online questionnaires covering physical and mental health as well as information and communication technology (ICT) use, actigraphy, sleep diaries and hair samples to determine cortisol concentrations. Study participants generally rated themselves as being in good physical (79%) and mental (73%) health and reported lower rates of psychological distress and exposure to alcohol, tobacco products or other substances than previously reported for this age group in the Australian population. Daily or near-daily online activity was almost universal among study participants, with no differences noted between males and females in terms of frequency or duration of internet access. Patterns of ICT use in this sample indicated that the respondents were more likely to use online information sources for researching physical health issues than for mental health or substance use issues, and that they generally reported partial levels of satisfaction with the mental health information they found online. This suggests that internet-based mental health resources can be readily accessed by adolescent Australians, and their computer literacy augurs well for future access to online health resources. In combination with other data collected as part of the ongoing Brisbane Longitudinal Twin Study, the 16Up project provides a valuable resource for the longitudinal investigation of genetic and environmental contributions to phenotypic variation in a variety of human traits.
Callery pear (Pyrus calleryana Decne.) is rapidly spreading in the United States, gaining attention in the last two decades as a serious invasive pest. Recommended control methods include foliar, basal bark, cut stump, and hack-and-squirt application of herbicides, but there are few published studies with replicated data on efficacy. Four readily available herbicidal active ingredients and a combination of two active ingredients were tested for control efficacy against P. calleryana in old-field areas and loblolly pine (Pinus taeda L.) understory. Basal bark applications (triclopyr, triclopyr + aminopyralid), foliar applications (glyphosate, imazapyr), and a soil application (hexazinone) effectively killed P. calleryana with the exception of hexazinone at one site, where rainfall may not have been optimal. Foliar application of glyphosate provided the most consistent control. Our results demonstrate efficacy of registered herbicide formulations for P. calleryana control in two geographic locations and two habitat types. The need for development of integrated pest management programs for P. calleryana is discussed.
POST goosegrass and other grassy weed control in bermudagrass is problematic. Fewer herbicides that can control goosegrass are available due to regulatory pressure and herbicide resistance, so alternative herbicide options that offer effective control are needed. Previous research demonstrates that topramezone controls goosegrass, crabgrass, and other weed species; however, injury to bermudagrass may be unacceptable. The objective of this research was to evaluate the safening potential of topramezone combinations with different additives on bermudagrass. Field trials were conducted at Auburn University during summer (2015 to 2018) and fall (2017 to 2018). Treatments included topramezone and methylated seed oil applied in combination with five different additives: triclopyr, green turf pigment, green turf paint, ammonium sulfate, and chelated iron. Bermudagrass bleaching and necrosis symptoms were visually rated, and normalized-difference vegetative index measurements and clipping yield data were also collected. Topramezone plus chelated iron and topramezone plus triclopyr reduced bleaching the most; however, topramezone plus triclopyr resulted in necrosis that outweighed the reduction in bleaching. Masking agents such as green turf paint and green turf pigment were ineffective in reducing injury when applied with topramezone. The combination of topramezone plus ammonium sulfate should be avoided because of the high level of necrosis it caused. Topramezone-associated bleaching symptoms were transient, lasting 7 to 14 d on average. These findings suggest that chelated iron added to topramezone and methylated seed oil mixtures acted as a safener on bermudagrass.
Background: Epidemiological studies have used administrative discharge diagnosis codes to identify methicillin-resistant and methicillin-sensitive Staphylococcus aureus (MRSA and MSSA) infections and trends, despite debate regarding the accuracy of codes for this purpose. We assessed the sensitivity and positive predictive value (PPV) of MRSA- and MSSA-specific diagnosis codes, as well as trends, characteristics, and outcomes of S. aureus hospitalizations by method of identification. Methods: Clinical microbiology results and discharge data from geographically diverse US hospitals participating in the Premier Healthcare Database from 2012 to 2017 were used to identify monthly rates of MRSA and MSSA. Adult inpatients (aged ≥18 years) with a positive MRSA or MSSA clinical culture and/or an MRSA- or MSSA-specific International Classification of Diseases, Ninth/Tenth Revision, Clinical Modification (ICD-9/10-CM) diagnosis code were included as S. aureus hospitalizations. Septicemia was defined as a positive blood culture or an MRSA or MSSA septicemia code. Sensitivity and PPV for codes were calculated for hospitalizations whose admission status was not listed as transfer; a positive clinical culture was considered true infection. Negative binomial regression models measured trends in rates of MRSA and MSSA per 1,000 hospital discharges. Results: We identified 168,634 MRSA and 148,776 MSSA hospitalizations in 256 hospitals; 17% of MRSA and 21% of MSSA hospitalizations involved septicemia. Less than half of all S. aureus hospitalizations (49% MRSA, 46% MSSA) and S. aureus septicemia hospitalizations (37% MRSA, 38% MSSA) had both a positive culture and a diagnosis code (Fig. 1). Sensitivity of MRSA codes in identifying positive cultures was 61% overall and 56% for septicemia; PPV was 62% overall and 53% for septicemia. MSSA codes had a sensitivity of 49% for identifying MSSA cultures and 52% for MSSA septicemia; PPV was 69% overall and 62% for septicemia. Despite low sensitivity, MRSA trends were similar for cultures and codes, whereas MSSA trends diverged (Fig. 2). For hospitalizations with septicemia, mortality was highest among those with a blood culture only (31.3%), compared with hospitalizations with both a septicemia code and a blood culture (16.6%) and those with a septicemia code only (14.7%). Conclusions: ICD diagnosis code sensitivity and PPV for identifying infections were consistently poor in recent years. Less than half of hospitalizations had concordant microbiology laboratory results and diagnosis codes, and rate and trend estimates for MSSA differed by method of identification. Using diagnosis codes to identify S. aureus infections may not be appropriate for descriptive epidemiology or for assessing trends, owing to significant misclassification.
Disclosures: Scott Fridkin reports that his spouse receives consulting fees from the vaccine industry.
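The sensitivity and PPV figures above reduce to simple ratios once hospitalizations are cross-tabulated by culture result (treated as truth) and diagnosis code (treated as the test). A minimal sketch, with counts invented to approximate the reported MRSA values:

```python
# Sensitivity and PPV of ICD codes against culture-confirmed infection.
# tp = code + culture, fp = code only, fn = culture only (illustrative counts).
def code_performance(tp: int, fp: int, fn: int) -> tuple[float, float]:
    sensitivity = tp / (tp + fn)  # share of culture-positive stays that were coded
    ppv = tp / (tp + fp)          # share of coded stays that were culture-positive
    return sensitivity, ppv

print(code_performance(tp=6100, fp=3700, fn=3900))  # ~0.61 sensitivity, ~0.62 PPV
```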
Background: The NHSN methods for central-line–associated bloodstream infection (CLABSI) surveillance do not account for the additive CLABSI risk of concurrent central lines. Past studies were small and only modestly risk adjusted but quantified the risk at roughly 2-fold. If the attributable risk is this high, facilities that serve high-acuity patients with medically indicated concurrent central-line use may disproportionately incur CMS payment penalties for having high CLABSI rates. We aimed to build evidence, through analysis of a multihospital CLABSI experience with improved risk adjustment, to inform NHSN CLABSI protocols so that they account for risks attributable to concurrent central lines. Methods: In a retrospective cohort of adult patients at 4 hospitals (range, 110–733 beds) from 2012 to 2017, we linked central-line data to patient encounter data (age, comorbidities, total parenteral nutrition, chemotherapy, CLABSI). Analysis was limited to patients with >2 central-line days and either a single central line or concurrence of no more than 2 central lines whose insertion and removal dates overlapped by >1 day. Propensity-score matching for likelihood of concurrence and conditional logistic regression modeling estimated the risk of CLABSI attributable to concurrence of >1 day. To evaluate time to CLABSI with Cox proportional hazards regression, we also analyzed unique central-line episodes, classified as low risk (ie, ports, dialysis central lines, or PICCs) or high risk (ie, temporary or nontunneled) and as single versus concurrent. Results: In total, 64,575 central lines were used in 50,254 encounters. Among these patients, 517 developed a CLABSI: 438 (85%) with a single central line and 74 (15%) with concurrence. Moreover, 4,657 (9%) patients had concurrence (range, 6%–14% by hospital); of these, 74 (2%) had CLABSI, compared with 71 of 7,864 propensity-matched controls (1%). Concurrence patients had a median of 17 NHSN central-line days and 21 total central-line days. In multivariate modeling, patients with more concurrence (>2/3 of central-line days concurrent) had a higher risk for CLABSI (adjusted risk ratio, 1.62; 95% CI, 1.1–2.3) compared with controls. In survival analysis, 14,610 concurrent central-line episodes were compared with 31,126 single low-risk central-line episodes; adjusting for comorbidity, total parenteral nutrition, and chemotherapy, the daily excess risk of CLABSI attributable to the concurrent central line was ~80% (hazard ratio 1.78 for 2 high-risk or 2 low-risk central lines; hazard ratio 1.80 for a mix of high- and low-risk central lines) (Fig. 1). Notably, the hazard ratio attributed to a single high-risk line compared with a low-risk line was 1.44 (95% CI, 1.13–1.84). Conclusions: Because a concurrent central line nearly doubles the risk of CLABSI compared with a single low-risk line, the CDC should modify NHSN methodology to better account for this risk.
Disclosures: Scott Fridkin reports that his spouse receives consulting fees from the vaccine industry.
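As a rough illustration of the propensity-score matching step described above, the sketch below estimates each patient's probability of concurrent central-line use and greedily matches treated patients to the nearest-propensity controls. The data layout, column names, and 1:1 matching without replacement are assumptions for illustration, not the authors' code.

```python
# Illustrative 1:1 greedy nearest-neighbor propensity matching.
import pandas as pd
from sklearn.linear_model import LogisticRegression

def match_controls(df: pd.DataFrame, covariates: list[str]) -> pd.DataFrame:
    # Propensity: probability of concurrent lines given patient covariates
    # (e.g., age, comorbidities, TPN, chemotherapy).
    ps_model = LogisticRegression(max_iter=1000).fit(df[covariates], df["concurrent"])
    df = df.assign(ps=ps_model.predict_proba(df[covariates])[:, 1])
    treated = df[df["concurrent"] == 1]
    controls = df[df["concurrent"] == 0].copy()
    matched = []
    for _, row in treated.iterrows():
        j = (controls["ps"] - row["ps"]).abs().idxmin()  # nearest-propensity control
        matched.append(j)
        controls = controls.drop(index=j)                # match without replacement
    return pd.concat([treated, df.loc[matched]])
```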
Background: Current NHSN denominator reporting for central-line–associated bloodstream infection (CLABSI) counts each patient day with n central lines as 1 central-line day. The NHSN does not directly adjust for the potential increased risk of CLABSI from concurrent central lines, but the current NHSN standardized infection ratio (SIR) methods may account for differences in concurrence by adjusting for location type. Objective: We examined differences in central-line concurrence by NHSN location type among CLABSI patients. Methods: In a retrospective cohort of adults with CLABSI at 4 hospitals from 2012 to 2017, we linked central-line data to encounter and CLABSI data. Central lines were considered concurrent if they overlapped for >1 day. We calculated the proportion of patients with concurrence at both the NHSN location and SIR group levels; risk ratios for concurrence between NHSN location types within each SIR group (ie, locations defined by SIR models as equal “risk”) were determined. Results: In total, 930 CLABSIs were identified from 19 NHSN-defined locations that map to 7 SIR groups. Most CLABSIs occurred in locations mapped to either of 2 SIR groups: wards (n = 227; 16% concurrence) and ICUs (n = 294; 33% concurrence). The ward group had 3 NHSN locations (median, 78 CLABSIs) with concurrence ranging from 8% (medical-surgical ward) to 20% (surgical ward). The ICU group had 6 NHSN locations (median, 47.5 CLABSIs) with concurrence ranging from 20% (neurosurgical ICU) to 39% (medical ICU). Despite these variations, no risk ratio differed statistically within any SIR group (Table 1). Conclusions: Among patients with CLABSIs, the frequency of concurrence varied up to 2-fold between location types within the current NHSN SIR groups, though not statistically significantly. Assessing whether this difference in magnitude persists among all patients with central lines is an important next step in refining risk adjustment methods to account for concurrent central-line use.
Disclosures: Scott Fridkin reports that his spouse receives consulting fees from the vaccine industry.
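The within-group comparisons above rest on risk ratios of concurrence between two locations. A minimal sketch of that calculation with a Katz log-based confidence interval; the counts are illustrative stand-ins consistent with the medians and percentages reported, not the study's data:

```python
# Risk ratio of concurrence between two NHSN locations in one SIR group.
import math

def risk_ratio_ci(x1, n1, x2, n2, z=1.96):
    """x/n = CLABSIs with concurrence / total CLABSIs in each location."""
    rr = (x1 / n1) / (x2 / n2)
    se = math.sqrt(1 / x1 - 1 / n1 + 1 / x2 - 1 / n2)  # SE of log(RR), Katz method
    lo = math.exp(math.log(rr) - z * se)
    hi = math.exp(math.log(rr) + z * se)
    return rr, lo, hi

# e.g., medical ICU (~39% of 48) vs neurosurgical ICU (~20% of 48):
print(risk_ratio_ci(x1=19, n1=48, x2=10, n2=48))  # CI spans 1, i.e., not significant
```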