Structural racism in the USA has roots that extend deep into healthcare and medical research, and it remains a key driver of illness and early death for Black, Indigenous, and People of Color (BIPOC). Furthermore, the persistence of racism within academic medicine compels an interrogation of education and research within this context. In the spirit of this interrogation, this article highlights a unique model of community-engaged education that integrates cultural humility. As an individual and institutional stance, cultural humility denotes lifelong learning and self-critique, the mitigation of power imbalances, and accountability. The integration of cultural humility emphasizes that when space is created for BIPOC communities to lead the way, education regarding healthcare and research can be effectively reimagined. Demonstrating this effectiveness, six community partners led the development and implementation of a five-module Structural Racism in Healthcare and Research course. Using a cohort model approach, the pilot course enrolled 12 community members and 12 researchers. The curriculum covered topics such as the history of racism in healthcare and research, and introduced participants to a cultural resilience framework. Evaluation results demonstrated a significant increase in participants’ knowledge and ability to identify and take action to address inequities related to racism in healthcare and research.
Bustards comprise a highly threatened family of birds and, being relatively fast, heavy fliers with very limited frontal visual fields, are particularly susceptible to mortality at powerlines. These infrastructures can also displace them from immediately adjacent habitat and act as barriers, fragmenting their ranges. With geographically ever-wider energy transmission and distribution grids, the powerline threat to bustards is constantly growing. Reviewing the published and unpublished literature up to January 2021, we found 2,774 records of bustard collision with powerlines, involving 14 species. Some studies associate powerline collisions with population declines. To avoid mortalities, the most effective solution is to bury the lines; otherwise, they should be either routed away from bustard-frequented areas or made redundant by local energy generation. When possible, new lines should run parallel to existing structures, and wires should preferably be as low and thick as possible, with minimal conductor obstruction of vertical airspace, although these measures require additional testing. A review of studies finds limited evidence that ‘bird flight diverters’ (BFDs; devices fitted to wires to induce evasive action) achieve significant reductions in mortality for some bustard species. Nevertheless, dynamic BFDs are preferable to static ones, as they are thought to perform more effectively. Rigorous evaluation of powerline mortalities and of the effectiveness of mitigation measures requires systematic carcass surveys with bias corrections. Whenever feasible, assessments of displacement and barrier effects should also be undertaken. Following the best-practice guidelines proposed with this review to monitor impacts and mitigation could help build a reliable body of evidence on the best ways to prevent bustard mortality at powerlines. Research should focus on validating mitigation measures and on quantifying, particularly for threatened bustards, the population-level effects of powerline grids at the national scale, to account for cumulative impacts and establish an equitable basis for compensation measures.
Variation exists in the timing of surgery for balanced complete atrioventricular septal defect repair. We sought to explore associations between timing of repair and resource utilisation and clinical outcomes in the first year of life.
Methods:
In this retrospective single-centre cohort study, we included patients who underwent complete atrioventricular septal defect repair between 2005 and 2019. Patients with left or right ventricular outflow tract obstruction and major non-cardiac comorbidities (except trisomy 21) were excluded. The primary outcome was days alive and out of the hospital in the first year of life.
Results:
Included were 79 infants, divided into tertiles based on age at surgery (1st = 46–137 days, 2nd = 140–176 days, 3rd = 178–316 days). There were no significant differences among age tertiles for days alive and out of the hospital in the first year of life by univariable analysis (tertile 1, median 351 days; tertile 2, 348 days; tertile 3, 354 days; p = 0.22). No patients died. Fewer post-operative ICU days were used in the oldest tertile relative to the youngest, but days of mechanical ventilation and hospitalisation were similar. Clinical outcomes after repair and resource utilisation in the first year of life were also similar, including unplanned cardiac reinterventions, outpatient cardiology clinic visits, and weight-for-age z-score at 1 year.
Conclusions:
Age at complete atrioventricular septal defect repair is not associated with important differences in clinical outcomes or resource utilisation in the first year of life.
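The abstract does not name the univariable test behind the tertile comparison; for a skewed outcome such as days out of the hospital, a Kruskal-Wallis test across the three tertiles is one plausible approach. The sketch below uses simulated numbers (only the medians echo the abstract) and is illustrative only:

```python
# Hypothetical sketch: comparing "days alive and out of hospital" across
# age-at-surgery tertiles. The abstract does not name the univariable test;
# Kruskal-Wallis is one plausible choice. All data below are simulated.
import numpy as np
from scipy.stats import kruskal

rng = np.random.default_rng(0)
# 79 infants split into tertiles of 26/26/27; outcomes capped at 365 days
tertile1 = rng.normal(351, 10, 26).clip(0, 365)
tertile2 = rng.normal(348, 10, 26).clip(0, 365)
tertile3 = rng.normal(354, 10, 27).clip(0, 365)

stat, p = kruskal(tertile1, tertile2, tertile3)
print(f"Kruskal-Wallis H = {stat:.2f}, p = {p:.3f}")
```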
We investigated the efficacy and complication profile of intranasal dexmedetomidine for transthoracic echocardiography sedation in patients with single ventricle physiology and shunt-dependent pulmonary blood flow during the high-risk interstage period.
Methods:
A single-centre, retrospective review identified interstage infants who received dexmedetomidine for echocardiography sedation. Baseline and procedural vitals were reported. Significant adverse events related to sedation were defined as an escalation in care or need for any additional/increased inotropic support to maintain pre-procedural haemodynamics. Minor adverse events were defined as changes from baseline haemodynamics that resolved without intervention. To assess whether sedation was adequate, echocardiogram reports were reviewed for completeness.
Results:
From September to December 2020, five interstage patients (age 29–69 days) were sedated with 3 mcg/kg intranasal dexmedetomidine. The median sedation onset time and duration were 24 minutes (range 12–43 minutes) and 60 minutes (range 33–60 minutes), respectively. Sedation was deemed adequate in all patients, as complete echocardiograms were accomplished without a rescue dose. When compared to baseline, three (60%) patients had a >10% reduction in heart rate, one (20%) patient had a >10% reduction in oxygen saturation, and one (20%) patient had a >30% decrease in blood pressure. Amongst all patients, no significant complications occurred, and haemodynamic changes from baseline did not require intervention or interruption of the study.
Conclusions:
Intranasal dexmedetomidine may be a reasonable option for echocardiography sedation in infants with shunt-dependent single ventricle heart disease, and further investigation is warranted to ensure efficacy and safety in an outpatient setting.
To describe the cumulative seroprevalence of severe acute respiratory syndrome coronavirus 2 (SARS-CoV-2) antibodies during the coronavirus disease 2019 (COVID-19) pandemic among employees of a large pediatric healthcare system.
Design, setting, and participants:
Prospective observational cohort study open to adult employees at the Children’s Hospital of Philadelphia, conducted April 20–December 17, 2020.
Methods:
Employees were recruited starting with high-risk exposure groups, utilizing e-mails, flyers, and announcements at virtual town hall meetings. At baseline, 1 month, 2 months, and 6 months, participants reported occupational and community exposures and gave a blood sample for SARS-CoV-2 antibody measurement by enzyme-linked immunosorbent assays (ELISAs). A post hoc Cox proportional hazards regression model was performed to identify factors associated with increased risk for seropositivity.
Results:
In total, 1,740 employees were enrolled. At 6 months, the cumulative seroprevalence was 5.3%, below the estimated community point seroprevalence. Seroprevalence was 5.8% among employees who provided direct patient care and 3.4% among those who did not. Most participants who were seropositive at baseline remained positive at follow-up assessments. In a post hoc analysis, direct patient care (hazard ratio [HR], 1.95; 95% confidence interval [CI], 1.03–3.68), Black race (HR, 2.70; 95% CI, 1.24–5.87), and exposure to a confirmed case in a nonhealthcare setting (HR, 4.32; 95% CI, 2.71–6.88) were associated with a statistically significant increased risk for seropositivity.
Conclusions:
Employee SARS-CoV-2 seroprevalence rates remained below the point-prevalence rates of the surrounding community. Provision of direct patient care, Black race, and exposure to a confirmed case in a nonhealthcare setting conferred increased risk. These data can inform occupational measures to maximize protection of employees in the workplace during future COVID-19 waves or other epidemics.
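As a rough illustration of the post hoc Cox proportional hazards analysis, the following sketch fits a time-to-seroconversion model with the third-party lifelines package. The covariate names and all data are invented for illustration; the study's actual model and variables may differ:

```python
# Hypothetical sketch of a Cox proportional hazards model for time to
# seroconversion. Uses the third-party `lifelines` package; simulated data
# only, with illustrative (assumed) variable names.
import numpy as np
import pandas as pd
from lifelines import CoxPHFitter

rng = np.random.default_rng(42)
n = 500
df = pd.DataFrame({
    "direct_patient_care": rng.integers(0, 2, n),   # binary exposure flags
    "community_exposure": rng.integers(0, 2, n),
    "followup_days": rng.integers(30, 240, n),      # observed follow-up time
    "seroconverted": rng.integers(0, 2, n),         # event indicator
})

cph = CoxPHFitter()
cph.fit(df, duration_col="followup_days", event_col="seroconverted")
cph.print_summary()  # hazard ratios with 95% CIs, as reported in the abstract
```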
This paper presents a compilation of atmospheric radiocarbon for the period 1950–2019, derived from atmospheric CO2 sampling and tree rings from clean-air sites. Following the approach taken by Hua et al. (2013), our revised and extended compilation consists of zonal, hemispheric and global radiocarbon (14C) data sets, with monthly data sets for 5 zones (Northern Hemisphere zones 1, 2, and 3, and Southern Hemisphere zones 3 and 1–2). Our new compilation includes smooth curves for zonal data sets that are more suitable for dating applications than the previous approach based on simple averaging. Our new radiocarbon dataset is intended to help facilitate the use of atmospheric bomb 14C in carbon cycle studies and to accommodate increasing demand for accurate dating of recent (post-1950) terrestrial samples.
Major depressive disorder (MDD) is characterised by a recurrent course and high comorbidity rates. A lifespan perspective may therefore provide important information regarding health outcomes. The aim of the present study is to examine mental disorders that preceded 12-month MDD diagnosis and the impact of these disorders on depression outcomes.
Methods
Data came from 29 cross-sectional community epidemiological surveys of adults in 27 countries (n = 80 190). The Composite International Diagnostic Interview (CIDI) was used to assess 12-month MDD and lifetime DSM-IV disorders with onset prior to the respondent's age at interview. Disorders were grouped into depressive distress disorders, non-depressive distress disorders, fear disorders and externalising disorders. Depression outcomes included 12-month suicidality, days out of role and impairment in role functioning.
Results
Among respondents with 12-month MDD, 94.9% (s.e. = 0.4) had at least one prior disorder (including previous MDD), and 64.6% (s.e. = 0.9) had at least one prior, non-MDD disorder. Previous non-depressive distress, fear and externalising disorders, but not depressive distress disorders, predicted higher impairment (OR = 1.4–1.6) and suicidality (OR = 1.5–2.5), after adjustment for sociodemographic variables. Further adjustment for MDD characteristics weakened, but did not eliminate, these associations. Associations were largely driven by current comorbidities, but both remitted and current externalising disorders predicted suicidality among respondents with 12-month MDD.
Conclusions
These results illustrate the importance of careful psychiatric history taking regarding current anxiety disorders and lifetime externalising disorders in individuals with MDD.
Dithiopyr and dinitroanilines are preemergence-applied, mitotic-inhibiting herbicides used to control goosegrass [Eleusine indica (L.) Gaertn.] in turfgrass. A suspected resistant E. indica population was collected from a golf course putting green and was evaluated for possible resistance to dithiopyr and prodiamine. After dose–response evaluation, the α-tubulin gene was sequenced for known target-site mutations that have been reported to confer resistance to mitotic-inhibiting herbicides. A mutation was discovered that resulted in an amino acid substitution at position 136 from leucine to phenylalanine (Leu-136-Phe). Previous research has indicated that Leu-136-Phe confers resistance to dinitroaniline herbicides. Regression models and I50 values indicate 54.1-, 4.7-, >100-, and >100-fold resistance to dithiopyr, prodiamine, pendimethalin, and oryzalin, respectively, compared with the susceptible population based on seedling emergence response, and 88.4-, 7.8-, >100-, and >100-fold resistance, respectively, based on biomass reduction response. This is the first report of lower resistance to prodiamine than to pendimethalin or oryzalin due to a target-site α-tubulin mutation, and the first report of a target-site α-tubulin mutation associated with dithiopyr resistance.
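Although the abstract does not specify the regression form, dose-response analyses of this kind are commonly fit with a log-logistic model, in which the I50 is a fitted parameter and the resistance factor is the ratio of resistant to susceptible I50 values. A minimal sketch with invented biomass data:

```python
# Hypothetical sketch of a dose-response fit: three-parameter log-logistic
# curves for susceptible and resistant populations, with the resistance
# factor computed as the I50 ratio. Data are simulated; the study's actual
# model form may differ.
import numpy as np
from scipy.optimize import curve_fit

def log_logistic(dose, upper, i50, slope):
    """Response declines from `upper` toward zero as dose increases."""
    return upper / (1.0 + (dose / i50) ** slope)

doses = np.array([0.01, 0.1, 1.0, 10.0, 100.0])
biomass_susceptible = np.array([98.0, 90.0, 50.0, 12.0, 3.0])
biomass_resistant = np.array([99.0, 97.0, 90.0, 55.0, 15.0])

p_s, _ = curve_fit(log_logistic, doses, biomass_susceptible, p0=[100, 1, 1])
p_r, _ = curve_fit(log_logistic, doses, biomass_resistant, p0=[100, 10, 1])

print(f"I50 susceptible = {p_s[1]:.2f}, I50 resistant = {p_r[1]:.2f}")
print(f"Resistance factor = {p_r[1] / p_s[1]:.1f}-fold")
```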
Studying phenotypic and genetic characteristics of age at onset (AAO) and polarity at onset (PAO) in bipolar disorder can provide new insights into disease pathology and facilitate the development of screening tools.
Aims
To examine the genetic architecture of AAO and PAO and their association with bipolar disorder disease characteristics.
Method
Genome-wide association studies (GWASs) and polygenic score (PGS) analyses of AAO (n = 12 977) and PAO (n = 6773) were conducted in patients with bipolar disorder from 34 cohorts and a replication sample (n = 2237). The association of onset with disease characteristics was investigated in two of these cohorts.
Results
Earlier AAO was associated with a higher probability of psychotic symptoms, suicidality, lower educational attainment, not living together and fewer episodes. Depressive onset correlated with suicidality and manic onset correlated with delusions and manic episodes. Systematic differences in AAO between cohorts and continents of origin were observed. This was also reflected in single-nucleotide variant-based heritability estimates, with higher heritabilities for stricter onset definitions. Increased PGS for autism spectrum disorder (β = −0.34 years, s.e. = 0.08), major depression (β = −0.34 years, s.e. = 0.08), schizophrenia (β = −0.39 years, s.e. = 0.08), and educational attainment (β = −0.31 years, s.e. = 0.08) were associated with an earlier AAO. The AAO GWAS identified one significant locus, but this finding did not replicate. Neither GWAS nor PGS analyses yielded significant associations with PAO.
Conclusions
AAO and PAO are associated with indicators of bipolar disorder severity. Individuals with an earlier onset show an increased polygenic liability for a broad spectrum of psychiatric traits. Systematic differences in AAO across cohorts, continents and phenotype definitions introduce significant heterogeneity, affecting analyses.
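The PGS associations reported above (e.g., β = −0.34 years per s.d. of PGS) amount to regressing age at onset on a standardized polygenic score. A minimal sketch with simulated data follows; a real analysis would also adjust for ancestry principal components and cohort:

```python
# Hypothetical sketch of a polygenic-score association test: ordinary least
# squares regression of age at onset (AAO) on a standardized PGS.
# Simulated data only; effect size chosen to echo the abstract.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(7)
n = 5000
pgs = rng.standard_normal(n)                    # standardized polygenic score
aao = 25 - 0.34 * pgs + rng.normal(0, 8, n)     # simulated age at onset

X = sm.add_constant(pgs)
fit = sm.OLS(aao, X).fit()
print(fit.summary())   # slope ~= -0.34 years per s.d. of PGS
```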
The analysis presented here was motivated by an objective of describing the interactions between the physical and biological processes governing the responses of tidal wetlands to rising sea level and the ensuing equilibrium elevation. We define equilibrium here to mean that the elevation of the vegetated surface relative to mean sea level (MSL) remains within the vertical range of tolerance of the vegetation on decadal time scales or longer. The equilibrium is dynamic, constantly responding to short-term changes in hydrodynamics, sediment supply, and primary productivity. For equilibrium to occur, the magnitude of vertical accretion must be great enough to compensate for change in the rate of sea-level rise (SLR), defined here as the local rate relative to a benchmark, typically a gauge. Equilibrium is not a given, and SLR can exceed the capacity of a wetland to accrete vertically.
To evaluate broad-spectrum intravenous antibiotic use before and after the implementation of a revised febrile neutropenia management algorithm in a population of adults with hematologic malignancies.
Design:
Quasi-experimental study.
Setting and population:
Patients admitted between 2014 and 2018 to the Adult Malignant Hematology service of an acute-care hospital in the United States.
Methods:
Aggregate data for the adult malignant hematology service were obtained for population-level antibiotic use: days of therapy (DOT), C. difficile infections, bacterial bloodstream infections, intensive care unit (ICU) length of stay, and in-hospital mortality. All rates are reported per 1,000 patient days before the implementation of a febrile neutropenia management algorithm (July 2014–May 2016) and after the intervention (June 2016–December 2018). These data were compared using interrupted time series analysis.
Results:
In total, 2,014 patients accounted for 6,788 encounters and 89,612 patient days during the study period. Broad-spectrum intravenous (IV) antibiotic use decreased by 5.7%, with immediate reductions in meropenem and vancomycin use of 22 (P = .02) and 15 (P = .001) DOT per 1,000 patient days, respectively. Bacterial bloodstream infection rates significantly increased following algorithm implementation. No differences were observed in the use of other antibiotics or in safety outcomes, including C. difficile infection, ICU length of stay, and in-hospital mortality.
Conclusions:
Reductions in vancomycin and meropenem were observed following the implementation of a more stringent febrile neutropenia management algorithm, without evidence of adverse outcomes. Successful implementation occurred through a collaborative effort and continues to be a core reinforcement strategy at our institution. Future studies evaluating patient-level data may identify further stewardship opportunities in this population.
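A common way to implement the interrupted time series analysis mentioned above is segmented regression, with terms for the baseline trend, an immediate level change at the intervention, and a post-intervention slope change. The sketch below uses simulated monthly DOT data, with the breakpoint placed at June 2016 as in the study period:

```python
# Hypothetical segmented-regression interrupted time series: monthly days
# of therapy (DOT) per 1,000 patient days with a level and slope change at
# algorithm implementation. Simulated data, not the study's.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(1)
months = np.arange(54)                 # Jul 2014 - Dec 2018, monthly
post = (months >= 23).astype(int)      # algorithm implemented June 2016
dot = 120 + 0.1 * months - 22 * post + rng.normal(0, 5, months.size)

df = pd.DataFrame({
    "dot": dot,
    "time": months,
    "post": post,
    "time_after": np.maximum(0, months - 23) * 1.0,  # post-intervention slope
})
fit = smf.ols("dot ~ time + post + time_after", data=df).fit()
print(fit.params)   # `post` estimates the immediate level change in DOT
```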
This is the first report on the association between trauma exposure and depression from the Advancing Understanding of RecOvery afteR traumA (AURORA) multisite longitudinal study of adverse post-traumatic neuropsychiatric sequelae (APNS) among participants seeking emergency department (ED) treatment in the aftermath of a traumatic life experience.
Methods
We focus on participants presenting at EDs after a motor vehicle collision (MVC), which characterizes most AURORA participants, and examine associations of participant socio-demographics and MVC characteristics with 8-week depression as mediated through peritraumatic symptoms and 2-week depression.
Results
Eight-week depression prevalence was relatively high (27.8%) and associated with several MVC characteristics (being a passenger v. driver; injuries to other people). Peritraumatic distress was associated with 2-week but not 8-week depression. Most of these associations held when controlling for peritraumatic symptoms and, to a lesser degree, depressive symptoms at 2 weeks post-trauma.
Conclusions
These observations, coupled with substantial variation in the relative strength of the mediating pathways across predictors, raise the possibility of diverse and potentially complex underlying biological and psychological processes. These remain to be elucidated in more in-depth analyses of the rich and evolving AURORA database, with the aim of finding new targets for intervention and new tools for risk-based stratification following trauma exposure.
Deposits of at least three glaciations are present in New Jersey and the New York City area. The oldest deposits are magnetically reversed. Pollen and stratigraphic relations suggest that they are from the earliest Laurentide advance at ~2.4 Ma. Deposits of a second advance are overlain by peat dated to 41 ka and so are pre-Marine Isotope Stage (pre-MIS) 2. Their relation to marine deposits indicates that they predate MIS 5 but postdate MIS 11 and may postdate MIS 7 or 9, suggesting an MIS 6 age. The most recent deposits are of MIS 2 (last glacial maximum [LGM]) age. Radiocarbon dates and varve counts tied to glacial-lake events indicate that LGM ice arrived at its terminus at 25 ka, stood at the terminus until ~24 ka, retreated at a rate of 80 m/yr until 23.5 ka, and then retreated at a rate of 12 m/yr to 18 ka. At 18 ka the retreat record connects to the base of the North American Varve Chronology at Newburgh, New York. The 25–24 ka age for the LGM is slightly younger than, but within the uncertainty of, cosmogenic ages; it is significantly older than the oldest dated macrofossils in postglacial deposits in the region.
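Taken at face value, the quoted retreat rates imply a total retreat distance on the order of 100 km between ~24 ka and 18 ka; a quick back-of-the-envelope check:

```python
# Worked arithmetic from the retreat rates quoted above: total retreat
# distance implied between ~24 ka and 18 ka (illustrative only).
fast_phase_m = 80 * (24_000 - 23_500)   # 80 m/yr for 500 yr  = 40,000 m
slow_phase_m = 12 * (23_500 - 18_000)   # 12 m/yr for 5,500 yr = 66,000 m
total_km = (fast_phase_m + slow_phase_m) / 1000
print(f"Implied retreat distance: {total_km:.0f} km")   # ~106 km
```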
Lack of judicious testing can result in the incorrect diagnosis of Clostridioides difficile infection (CDI), unnecessary CDI treatment, increased costs and falsely augmented hospital-acquired infection (HAI) rates. We evaluated facility-wide interventions used at the VA San Diego Healthcare System (VASDHS) to reduce healthcare-onset, healthcare-facility–associated CDI (HO-HCFA CDI), including the use of diagnostic stewardship with test ordering criteria.
Design:
We conducted a retrospective study to assess the effectiveness of measures implemented to reduce the rate of HO-HCFA CDI at the VASDHS from fiscal year (FY)2015 to FY2018.
Interventions:
Measures executed in a stepwise fashion included a hand hygiene initiative, prompt isolation of CDI patients, enhanced terminal room cleaning, reduction of fluoroquinolone and proton-pump inhibitor use, laboratory rejection of solid stool samples, and, lastly, diagnostic stewardship with C. difficile toxin B gene nucleic acid amplification testing (NAAT) criteria instituted in FY2018.
Results:
From FY2015 to FY2018, 127 cases of HO-HCFA CDI were identified. Together, the rate-reducing initiatives decreased HO-HCFA cases from 44 to 13 (P ≤ .05). In particular, the number of HO-HCFA cases (from 34 to 13; P ≤ .05), potential false-positive tests associated with colonization and laxative use (from 11 to 4), hospital days (from 596 to 332), CDI-related hospitalization costs (from $2,780,681 to $1,534,190), and treatment costs (from $7,158 to $1,476) all decreased substantially following the introduction of diagnostic stewardship with test criteria from FY2017 to FY2018.
Conclusions:
Initiatives to decrease risk for CDI and diagnostic stewardship of C. difficile stool NAAT significantly reduced HO-HCFA CDI rates, detection of potential false-positives associated with laxative use, and lowered healthcare costs. Diagnostic stewardship itself had the most dramatic impact on outcomes observed and served as an effective tool in reducing HO-HCFA CDI rates.
Successful management of an event where health-care needs exceed regional health-care capacity requires coordinated strategies for scarce resource allocation. Publications on the rapid development, training, and coordination of regional hospital triage teams to manage the allocation of scarce resources during coronavirus disease 2019 (COVID-19) are lacking. Over a period of 3 weeks, more than 100 clinicians, ethicists, leaders, and public health authorities convened virtually to achieve consensus on how best to save the most lives possible and share resources; this is referred to as population-based crisis management. The rapid regionalization of 22 acute-care hospitals across 4,500 square miles in the midst of a pandemic with a shifting regulatory landscape was challenging, but it was overcome by mutual trust, transparency, and confidence in the public health authority. Because many cities are facing COVID-19 surges, we share a process for the successful rapid formation of health-care coalitions, Crisis Standards of Care, and training of triage teams. Incorporation of continuous process improvement and methods for communication is essential for successful implementation. Use of our regional health-care coalition communications, incident command system, and crisis care committee helped manage crisis care in the San Diego and Imperial County region as COVID-19 cases surged and collaborative decisions about scarce resources were required.
The Late Formative period immediately precedes the emergence of Tiwanaku, one of the earliest South American states, yet it is one of the most poorly understood periods in the southern Lake Titicaca Basin (Bolivia). In this article, we refine the ceramic chronology of this period with large sets of dates from eight sites, focusing on temporal inflection points in decorated ceramic styles. These points, estimated here by Bayesian models, index specific moments of change: (1) cal AD 120 (60–170, 95% probability): the first deposition of Kalasasaya red-rimmed and zonally incised styles; (2) cal AD 240 (190–340, 95% probability): a tentative estimate of the final deposition of Kalasasaya zonally incised vessels; (3) cal AD 420 (380–470, 95% probability): the final deposition of Kalasasaya red-rimmed vessels; and (4) cal AD 590 (500–660, 95% probability): the first deposition of Tiwanaku Redwares. These four modeled boundaries anchor an updated Late Formative chronology, which includes the Initial Late Formative phase, a newly identified decorative hiatus between the Middle and Late Formative periods. The models place Qeya and transitional vessels between inflection points 3 and 4 based on regionally consistent stratigraphic sequences. This more precise chronology will enable researchers to explore the trajectories of other contemporary shifts during this crucial period in Lake Titicaca Basin's prehistory.
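Boundary models of this kind are typically fit in specialist calibration software such as OxCal; as a toy analogue, the sketch below treats dated deposits of a ceramic style as uniformly distributed between unknown start and end boundaries and estimates those boundaries with PyMC, ignoring radiocarbon calibration entirely. All dates are invented:

```python
# Toy analogue of Bayesian boundary estimation for a ceramic style's
# depositional "phase": dated deposits are modeled as uniform between
# unknown start/end boundaries. Not the study's method (which would use
# calibrated 14C dates in software such as OxCal); dates are invented.
import numpy as np
import pymc as pm

dates_cal_ad = np.array([150.0, 190.0, 240.0, 300.0, 360.0, 410.0])

with pm.Model():
    start = pm.Uniform("start", lower=0.0, upper=150.0)    # first deposition
    end = pm.Uniform("end", lower=410.0, upper=700.0)      # final deposition
    pm.Uniform("deposits", lower=start, upper=end, observed=dates_cal_ad)
    idata = pm.sample(1000, tune=1000, chains=2, random_seed=0)

print(idata.posterior["start"].mean().item(),
      idata.posterior["end"].mean().item())
```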
Little is known about the neural substrates of suicide risk in mood disorders. Improving the identification of biomarkers of suicide risk, as indicated by a history of suicide-related behavior (SB), could lead to more targeted treatments to reduce risk.
Methods
Participants were 18 young adults with a mood disorder with a history of SB (as indicated by endorsing a past suicide attempt), 60 with a mood disorder with a history of suicidal ideation (SI) but not SB, 52 with a mood disorder with no history of SI or SB (MD), and 82 healthy comparison participants (HC). Resting-state functional connectivity within and between intrinsic neural networks, including cognitive control network (CCN), salience and emotion network (SEN), and default mode network (DMN), was compared between groups.
Results
Several fronto-parietal regions (k > 57, p < 0.005) were identified in which individuals with SB demonstrated distinct patterns of connectivity within (in the CCN) and across networks (CCN-SEN and CCN-DMN). Connectivity with some of these same regions also distinguished the SB group when participants were re-scanned after 1–4 months. Extracted data defined SB group membership with good accuracy, sensitivity, and specificity (79–88%).
Conclusions
These results suggest that individuals with a history of SB in the context of mood disorders may show reliably distinct patterns of intrinsic network connectivity, even when compared to those with mood disorders without SB. Resting-state fMRI is a promising tool for identifying subtypes of patients with mood disorders who may be at risk for suicidal behavior.
Item 9 of the Patient Health Questionnaire-9 (PHQ-9) queries about thoughts of death and self-harm, but not suicidality. Although it is sometimes used to assess suicide risk, most positive responses are not associated with suicidality. The PHQ-8, which omits Item 9, is thus increasingly used in research. We assessed equivalency of total score correlations and the diagnostic accuracy to detect major depression of the PHQ-8 and PHQ-9.
Methods
We conducted an individual patient data meta-analysis. We fit bivariate random-effects models to assess diagnostic accuracy.
Results
16 742 participants (2097 major depression cases) from 54 studies were included. The correlation between PHQ-8 and PHQ-9 scores was 0.996 (95% confidence interval 0.996 to 0.996). The standard cutoff score of 10 for the PHQ-9 maximized sensitivity + specificity for the PHQ-8 among studies that used a semi-structured diagnostic interview reference standard (N = 27). At cutoff 10, the PHQ-8 was less sensitive by 0.02 (−0.06 to 0.00) and more specific by 0.01 (0.00 to 0.01) among those studies (N = 27), with similar results for studies that used other types of interviews (N = 27). For all 54 primary studies combined, across all cutoffs, the PHQ-8 was less sensitive than the PHQ-9 by 0.00 to 0.05 (0.03 at cutoff 10), and specificity was within 0.01 for all cutoffs (0.00 to 0.01).
Conclusions
PHQ-8 and PHQ-9 total scores were similar. Sensitivity may be minimally reduced with the PHQ-8, but specificity is similar.
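For readers unfamiliar with the cutoff analysis, sensitivity and specificity at the standard cutoff of 10 reduce to counting true/false positives and negatives against the reference-standard diagnosis. A minimal sketch with invented scores:

```python
# Minimal sketch of how sensitivity and specificity at a cutoff of 10 are
# computed from total scores; data are invented, not from the meta-analysis.
import numpy as np

rng = np.random.default_rng(3)
phq8_scores = rng.integers(0, 25, 300)          # hypothetical PHQ-8 totals
major_depression = rng.integers(0, 2, 300)      # reference-standard diagnosis

positive = phq8_scores >= 10                    # standard cutoff
tp = np.sum(positive & (major_depression == 1))
fn = np.sum(~positive & (major_depression == 1))
tn = np.sum(~positive & (major_depression == 0))
fp = np.sum(positive & (major_depression == 0))

print(f"sensitivity = {tp / (tp + fn):.2f}, specificity = {tn / (tn + fp):.2f}")
```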