This is the first report on the association between trauma exposure and depression from the Advancing Understanding of RecOvery afteR traumA (AURORA) multisite longitudinal study of adverse post-traumatic neuropsychiatric sequelae (APNS) among participants seeking emergency department (ED) treatment in the aftermath of a traumatic life experience.
We focus on participants presenting at EDs after a motor vehicle collision (MVC), which characterizes most AURORA participants, and examine associations of participant socio-demographics and MVC characteristics with 8-week depression as mediated through peritraumatic symptoms and 2-week depression.
Eight-week depression prevalence was relatively high (27.8%) and associated with several MVC characteristics (being a passenger v. driver; injuries to other people). Peritraumatic distress was associated with 2-week but not 8-week depression. Most of these associations held when controlling for peritraumatic symptoms and, to a lesser degree, depressive symptoms at 2 weeks post-trauma.
These observations, coupled with substantial variation in the relative strength of the mediating pathways across predictors, raise the possibility of diverse and potentially complex underlying biological and psychological processes. These remain to be elucidated in more in-depth analyses of the rich and evolving AURORA database, with the aim of finding new targets for intervention and new tools for risk-based stratification following trauma exposure.
Finite-amplitude hydromagnetic Rossby waves in the magnetostrophic regime are studied. We consider the slow mode, which travels in the opposite direction to the hydrodynamic or fast mode, in the presence of a toroidal magnetic field and zonal flow by means of quasi-geostrophic models for thick spherical shells. The weakly nonlinear long waves are derived asymptotically using a reductive perturbation method. The problem at the first order is found to obey a second-order ordinary differential equation, leading to a hypergeometric equation for a Malkus field and a confluent Heun equation for an electrical wire field, and is non-singular when the wave speed approaches the mean flow. Investigating its neutral non-singular eigensolutions for different basic states, we find the evolution is described by the Korteweg–de Vries equation. This implies that the nonlinear slow wave forms solitons and solitary waves. These may take the form of a coherent eddy, such as a single anticyclone. We speculate on the relation of the anticyclone to the asymmetric gyre seen in the Earth's fluid core, and in state-of-the-art dynamo direct numerical simulations.
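For reference, the evolution equation named above has the generic Korteweg–de Vries form. Writing the wave amplitude as $A(x,t)$ with linear wave speed $c$ and coefficients $\alpha$, $\beta$ determined by the basic state (generic placeholders here, not the specific coefficients derived in the paper):

```latex
\partial_t A + c\,\partial_x A + \alpha A\,\partial_x A + \beta\,\partial_x^3 A = 0,
\qquad
A = A_0 \operatorname{sech}^2\!\left[\sqrt{\tfrac{\alpha A_0}{12\beta}}\,
\bigl(x - (c + \tfrac{\alpha A_0}{3})\,t\bigr)\right],
```

the second expression being the standard single-soliton solution, in which the amplitude $A_0$ sets both the width of the coherent eddy and its propagation speed relative to the linear wave.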
With human influences driving populations of apex predators into decline, more information is required on how factors affect species at national and global scales. However, camera-trap studies are seldom executed at a broad spatial scale. We demonstrate how uniting fine-scale studies and utilizing camera-trap data of non-target species is an effective approach for broadscale assessments through a case study of the brown hyaena Parahyaena brunnea. We collated camera-trap data from 25 protected and unprotected sites across South Africa into the largest detection/non-detection dataset collected on the brown hyaena, and investigated the influence of biological and anthropogenic factors on brown hyaena occupancy. Spatial autocorrelation had a significant effect on the data, and was corrected using a Bayesian Gibbs sampler. We show that brown hyaena occupancy is driven by specific co-occurring apex predator species and human disturbance. The relative abundance of spotted hyaenas Crocuta crocuta and people on foot had a negative effect on brown hyaena occupancy, whereas the relative abundance of leopards Panthera pardus and vehicles had a positive influence. We estimated that brown hyaenas occur across 66% of the surveyed camera-trap station sites. Occupancy varied geographically, with lower estimates in eastern and southern South Africa. Our findings suggest that brown hyaena conservation is dependent upon a multi-species approach focussed on implementing conservation policies that better facilitate coexistence between people and hyaenas. We also validate the conservation value of pooling fine-scale datasets and utilizing bycatch data to examine species trends at broad spatial scales.
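The occupancy analysis rests on a likelihood that distinguishes "absent" from "present but undetected" at each camera-trap site. Below is a minimal sketch of the standard single-season occupancy likelihood; it is illustrative only (constant occupancy ψ and detection probability p, no covariates, and none of the Bayesian Gibbs-sampler spatial correction used in the study):

```python
import math

def site_likelihood(detections, n_surveys, psi, p):
    """Single-season occupancy likelihood for one camera-trap site.

    detections: number of surveys on which the species was detected
    n_surveys:  total repeat surveys at the site
    psi:        probability the site is occupied
    p:          per-survey detection probability, given occupancy
    """
    occupied = psi * (p ** detections) * ((1 - p) ** (n_surveys - detections))
    if detections == 0:
        # A site with no detections may be occupied-but-missed, or truly empty.
        return occupied + (1 - psi)
    return occupied

def log_likelihood(data, psi, p):
    """Sum the log-likelihood over sites; data = [(detections, n_surveys), ...]."""
    return sum(math.log(site_likelihood(d, k, psi, p)) for d, k in data)

# Toy data: three sites, five repeat surveys each
data = [(2, 5), (0, 5), (1, 5)]
print(log_likelihood(data, psi=0.66, p=0.3))
```

Maximising this likelihood over ψ and p (or placing covariates on both, as in the study) is what turns raw detection/non-detection histories into the occupancy estimates reported above.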
Background: Hospitalists play a critical role in antimicrobial stewardship as the primary antibiotic prescriber for many inpatients. We sought to describe antibiotic prescribing variation among hospitalists within a healthcare system. Methods: We created a novel metric of hospitalist-specific antibiotic prescribing by linking hospitalist billing data to hospital medication administration records from January 2016 to December 2018 in 4 hospitals: two 500-bed academic hospitals (AMC1 and AMC2), one 400-bed community hospital (CH1), and one 100-bed community hospital (CH2). We attributed dates that a hospitalist electronically billed for a given patient as billed patient days (bPD) and mapped each antibiotic day of therapy (DOT) to a bPD. Each DOT was classified according to National Healthcare Safety Network antibiotic categories: broad-spectrum hospital-onset (BS-HO), broad-spectrum community-onset (BS-CO), anti-MRSA, and highest risk for Clostridioides difficile infection (CDI). DOT and bPD were pooled to calculate hospitalist-specific DOT per 1,000 bPD. Best subsets regression was performed to assess model fit and generate hospital- and antibiotic category-specific models adjusting for patient-level factors (eg, age ≥65, ICD-10 codes for comorbidities and infections). The models were used to calculate predicted hospitalist-specific DOT and observed-to-expected ratios (O:E) for each antibiotic category. Kruskal-Wallis tests and pairwise Wilcoxon rank-sum tests were used to determine significant differences between median DOT per 1,000 bPD and O:E between hospitals for each antibiotic category. Results: During the study period, 116 hospitalists across 4 hospitals contributed a total of 437,303 bPD. Median DOT per 1,000 bPD varied between hospitals (BS-HO range, 46.7–84.2; BS-CO range, 63.3–100; anti-MRSA range, 48.4–65.4; CDI range, 82.0–129.4).
CH2 had a significantly higher median DOT per 1,000 bPD compared to the academic hospitals (all antibiotic categories P < .001) and CH1 (BS-HO, P = .01; anti-MRSA, P = .02) (Fig. 1A). The 4 antibiotic groups at 4 hospitals resulted in 16 models, with good model fit for CH2 (R2 > 0.55 for all models), modest model fit for AMC2 (R2 = 0.46–0.55), fair model fit for CH1 (R2 = 0.19–0.35), and poor model fit for AMC1 (R2 < 0.12 for all models). Variation in hospitalist-specific O:E was moderate (IQR, 0.9–1.1). AMC1 showed greater variation than other hospitals, but we detected no significant differences in median O:E between hospitals (all antibiotic categories P > .10) (Fig. 1B). Conclusions: Adjusting for patient-level factors reduced much of the variation in hospitalist-specific DOT per 1,000 bPD in some but not all hospitals, suggesting that unmeasured factors may drive antibiotic prescribing. This metric may represent a target for stewardship intervention, such as hospitalist-specific feedback of antibiotic prescribing practices.
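The underlying metric is simple arithmetic; a minimal sketch follows (the function names and the example numbers are hypothetical, not study data):

```python
def dot_per_1000_bpd(dot, bpd):
    """Antibiotic days of therapy normalised per 1,000 billed patient days."""
    return 1000 * dot / bpd

def observed_to_expected(observed_dot, expected_dot):
    """O:E ratio; values above 1 indicate more prescribing than the
    case-mix-adjusted model predicts for that hospitalist."""
    return observed_dot / expected_dot

# Hypothetical hospitalist: 120 broad-spectrum DOT over 1,500 bPD,
# with a model-predicted 100 DOT for the same patient mix
rate = dot_per_1000_bpd(120, 1500)    # 80.0 DOT per 1,000 bPD
oe = observed_to_expected(120, 100)   # 1.2, i.e. above expected
print(rate, oe)
```

The O:E ratio is what allows fair comparison across hospitalists whose patients differ in age and comorbidity burden, which is why the abstract reports it alongside the raw rate.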
Disclosures: Scott Fridkin, consulting fee - vaccine industry (various) (spouse)
The EAT–Lancet Commission promulgated a universal reference diet. Subsequently, researchers constructed an EAT–Lancet diet score (0–14 points), with minimum intake values for various dietary components set at 0 g/d, and reported inverse associations with risks of major health outcomes in a high-income population. We assessed associations between EAT–Lancet diet scores, without or with lower bound values, and the mean probability of micronutrient adequacy (MPA) among nutrition-insecure women of reproductive age (WRA) from low- and middle-income countries (LMIC). We analysed single 24-h diet recall data (n 1950) from studies in rural DRC, Ecuador, Kenya, Sri Lanka and Vietnam. Associations between EAT–Lancet diet scores and MPA were assessed by fitting linear mixed-effects models. Mean EAT–Lancet diet scores were 8·8 (SD 1·3) and 1·9 (SD 1·1) without or with minimum intake values, respectively. Pooled MPA was 0·58 (SD 0·22) and energy intake was 10·5 (SD 4·6) MJ/d. A one-point increase in the EAT–Lancet diet score, without minimum intake values, was associated with a 2·6 (SD 0·7) percentage points decrease in MPA (P < 0·001). In contrast, the EAT–Lancet diet score, with minimum intake values, was associated with a 2·4 (SD 1·3) percentage points increase in MPA (P = 0·07). Further analysis indicated positive associations between EAT–Lancet diet scores and MPA adjusted for energy intake (P < 0·05). Our findings indicate that the EAT–Lancet diet score requires minimum intake values for nutrient-dense dietary components to avoid positively scoring non-consumption of food groups and subsequently predicting lower MPA of diets, when applied to rural WRA in LMIC.
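The scoring issue identified above can be made concrete with a toy calculation (the food group, its recommended window, and the lower bound used here are illustrative assumptions, not the published cut-offs):

```python
def eat_lancet_point(intake_g, target_low, target_high, min_intake=0.0):
    """Score one dietary component: 1 point if intake falls inside the
    recommended window AND meets the minimum intake bound, else 0."""
    in_window = target_low <= intake_g <= target_high
    return 1 if (in_window and intake_g >= min_intake) else 0

# A respondent who consumed no red meat at all (0 g/d), with an
# assumed recommended window of 0-28 g/d for that component:
no_bound = eat_lancet_point(0.0, 0, 28)                   # scores 1
with_bound = eat_lancet_point(0.0, 0, 28, min_intake=1.0) # scores 0
print(no_bound, with_bound)
```

Without a lower bound, non-consumption of a food group earns a point even though it contributes no micronutrients, which is exactly why the score without minimum intake values predicted lower, not higher, micronutrient adequacy in these populations.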
Most oviposition by Helicoverpa zea (Boddie) occurs near the top of the canopy in soybean, Glycine max (L.) Merr., and larval abundance is influenced by the growth habit of plants. However, the vertical distribution of larvae within the canopy is not as well known. We evaluated the vertical distribution of H. zea larvae in determinate and indeterminate varieties, hypothesizing that larval distribution in the canopy would vary between these two growth habits and over time. We tested this hypothesis in a naturally infested replicated field experiment and two experimentally manipulated cage experiments. In the field experiment, flowering time was synchronized between the varieties by manipulating planting date, while infestation timing was manipulated in the cage experiments. Larvae were recovered using destructive sampling of individual soybean plants, and their vertical distribution by instar was recorded from three sampling points over time in each experiment. While larval population growth and development varied between the determinate and indeterminate varieties within and among experiments, we found little evidence that larvae have a preference for different vertical locations in the canopy. This study lends support to the hypothesis that larval movement and location within soybean canopies do not result entirely from oviposition location and nutritional requirements.
A new high time resolution observing mode for the Murchison Widefield Array (MWA) is described, enabling full polarimetric observations with up to … MHz of bandwidth and a time resolution of … s. This mode makes use of a polyphase synthesis filter to ‘undo’ the polyphase analysis filter stage of the standard MWA Voltage Capture System observing mode. Sources of potential error in the reconstruction of the high time resolution data are identified and quantified, with the loss induced by the back-to-back system not exceeding … dB for typical noise-dominated samples. The system is further verified by observing three pulsars with known structure on microsecond timescales.
The COVID-19 pandemic has had a major impact on clinical practice. Safe standards of practice are essential to protect health care workers while still allowing them to provide good care. The Canadian Society of Clinical Neurophysiologists, the Canadian Association of Electroneurophysiology Technologists, the Association of Electromyography Technologists of Canada, the Board of Registration of Electromyography Technologists of Canada, and the Canadian Board of Registration of Electroencephalograph Technologists have combined to review current published literature about safe practices for neurophysiology laboratories. Herein, we present the results of our review and provide our expert opinion regarding the safe practice of neurophysiology during the COVID-19 pandemic in Canada.
Praziquantel (PZQ) is the drug of choice for schistosomiasis. The potential for drug resistance necessitates the search for adjunct or alternative therapies to PZQ. Previous functional genomics has shown that RNAi inhibition of the Ca2+/calmodulin-dependent protein kinase II (CaMKII) gene in Schistosoma adult worms significantly improved the effectiveness of PZQ. Here we tested the in vitro efficacy of 15 selective and non-selective CaMK inhibitors against Schistosoma mansoni and showed that PZQ efficacy was improved against refractory juvenile parasites when combined with these CaMK inhibitors. By measuring CaMK activity and the mobility of adult S. mansoni, we identified two non-selective CaMK inhibitors, staurosporine (STSP) and 1-Naphthyl PP1 (1NAPP1), as promising candidates for further study. The impact of STSP and 1NAPP1 was investigated in mice infected with S. mansoni in the presence or absence of a sub-lethal dose of PZQ against 2- and 7-day-old schistosomula and adults. Treatment with STSP/PZQ induced a significant (47–68%) liver egg burden reduction compared with mice treated with PZQ alone. The findings indicate that the combination of STSP and PZQ significantly improved anti-schistosomal activity compared to PZQ alone, demonstrating the potential of selective and non-selective CaMK/kinase inhibitors as a combination therapy with PZQ in treating schistosomiasis.
The aim of this study is to determine the species of parasite that infected the population of Brussels during the Medieval and Renaissance periods, and determine if there was notable variation between different households within the city. We compared multiple sediment layers from cesspits beneath three different latrines dating from the 14th–17th centuries. Helminths and protozoa were detected using microscopy and enzyme-linked immunosorbent assay (ELISA). We identified Ascaris sp., Capillaria sp., Dicrocoelium dendriticum, Entamoeba histolytica, Fasciola hepatica, Giardia duodenalis, Taenia sp. and Trichuris sp. in Medieval samples, and continuing presence of Ascaris sp., D. dendriticum, F. hepatica, G. duodenalis and Trichuris sp. into the Renaissance. While some variation existed between households, there was a broadly consistent pattern with the domination of species spread by fecal contamination of food and drink (whipworm, roundworm and protozoa that cause dysentery). These data allow us to explore diet and hygiene, together with routes for the spread of fecal–oral parasites. Key factors explaining our findings are manuring practices with human excrement in market gardens, and flooding of the polluted River Senne during the 14th–17th centuries.
An increasing number of patients are being prescribed direct oral anticoagulants (DOACs), while the patients who remain on warfarin are becoming more complex. There is currently no standardised anticoagulation review for patients in primary care, resulting in potentially preventable harm events. Our aim was to implement a new service, in which a standardised review is carried out by a specialist multidisciplinary secondary care anticoagulation team. Overall, the implementation of a standardised review resulted in better optimisation of anticoagulation management for patients taking either a DOAC or warfarin. Of the 172 eligible patients prescribed warfarin, 47 (27%) chose to switch to a DOAC. The average time in therapeutic range for patients on warfarin before and after the pilot increased from 73.5% to 75%. Of 482 patients taking a DOAC, 35 (7%) were found to be on an incorrect dose. In 32 (91%) of these 35 patients, the dose was amended after notifying the patient’s general practitioner. We also found a significant number of patients inappropriately prescribed concomitant medication, such as antiplatelet or non-steroidal anti-inflammatory drugs, potentially putting them at an elevated risk of bleeding. While further research is needed, we believe the results of this pilot can be used to help build a case to influence the commissioning of anticoagulation services. Secondary care anticoagulation teams, like our own, may be well placed to provide or support such services by working across the primary care and secondary care interface to support our primary care colleagues.
Global pork production has largely adopted on-farm biosecurity to minimize vectors of disease transmission and protect swine health. Feed and ingredients were not originally thought to be substantial vectors, but recent incidents have demonstrated their ability to harbor disease. The objective of this paper is to review the potential role of swine feed as a disease vector and describe biosecurity measures that have been evaluated as a way of maintaining swine health. Recent research has demonstrated that viruses such as porcine epidemic diarrhea virus and African swine fever virus can survive conditions of transboundary shipment in soybean meal, lysine, and complete feed, and that contaminated feed can cause animal illness. Recent research has focused on potential methods of preventing feed-based pathogens from infecting pigs, including prevention of entry to the feed system, mitigation by thermal processing, or decontamination by chemical additives. Strategies have been designed to understand the spread of pathogens throughout the feed manufacturing environment, including potential batch-to-batch carryover, thus reducing transmission risk. In summary, the focus on feed biosecurity in recent years is warranted, but additional research is needed to further understand the risk and identify cost-effective approaches to maintain feed biosecurity as a way of protecting swine health.
OBJECTIVES/GOALS: We conducted a review of CTSA websites to understand the current landscape of institutional professional development and training for clinical research professionals (CRPs) as revealed in CTSA hub websites. METHODS/STUDY POPULATION: We accessed and reviewed 59 currently funded CTSA hub websites for evidence of CRP training opportunities. Parameters reviewed included: 1) whether opportunities were specified for CRPs versus K and T trainees; 2) mandated training; 3) leveling; 4) delivery methods/resources; 5) public accessibility; 6) unique features. The website reviews informed a REDCap survey sent to the CTSA Administrators (n = 149) and the Coordinator Taskforce (n = 105) listservs to gain additional knowledge of CRP training available at each institution. A subsequent repeat review of the CTSA hub websites will be conducted to determine evolving trends. RESULTS/ANTICIPATED RESULTS: A total of 40 of the 59 CTSA hubs responded to the survey. Survey results are being analyzed. Website review data are being tabulated, and the subsequent review of websites will be collected in February; those findings are pending and will include a comparison with prior findings. Of the CTSA hubs, 42% list CRP training within the hub website. Required onboarding training (beyond CITI certificates) was evident for some hubs (15%). DISCUSSION/SIGNIFICANCE OF IMPACT: On our initial website review, less than half of the CTSA hub websites listed specific CRP training; many pages were hidden behind firewalls and could not be reviewed for content. The REDCap survey will provide more granular descriptions of programs, and data from a second website review will be collected for comparison. Based on a preliminary re-review of sites, there is a suggestion of increasing CRP workforce development information. CTSAs are well-positioned to be a central hub for promoting educational excellence of the institutional workforce, for medical centers and in other venues where clinical research is performed.
Objective: To utilise a community-based participatory approach in the design and implementation of an intervention targeting diet-related health problems on Navajo Nation.
Design: A dual strategy approach of community needs/assets assessment and engagement of cross-sectorial partners in programme design, with systematic cyclical feedback for programme modifications.
Setting: Navajo Nation, USA.
Participants: Navajo families with individuals meeting criteria for programme enrolment. Participant enrolment increased with iterative cycles.
Intervention: The Navajo Fruit and Vegetable Prescription (FVRx) Programme.
Results: A broad, community-driven and culturally relevant programme design has resulted in a programme able to maintain core programmatic principles, while also allowing for flexible adaptation to changing needs.
This paper explores the complex story of a particular style of rock art in western Arnhem Land known as ‘Painted Hands’. Using new evidence from recent fieldwork, we present a definition for their style, distribution and place in the stylistic chronologies of this region. We argue these motifs played an important cultural role in Aboriginal society during the period of European settlement in the region. We explore the complex messages embedded in the design features of the Painted Hands, arguing that they are more than simply hand stencils or markers of individuality. We suggest that these figures represent stylized and intensely encoded motifs with the power to communicate a high level of personal, clan and ceremonial identity at a time when all aspects of Aboriginal cultural identity were under threat.
The ‘jumping to conclusions’ (JTC) bias is associated with both psychosis and general cognition, but how these associations relate to one another is unclear. In this study, we set out to clarify the relationships between the JTC bias, IQ, psychosis, and polygenic liability to schizophrenia and IQ.
A total of 817 first episode psychosis patients and 1294 population-based controls completed assessments of general intelligence (IQ) and JTC, and provided blood or saliva samples from which we extracted DNA and computed polygenic risk scores (PRS) for IQ and schizophrenia.
The estimated proportion of the total effect of case/control differences on JTC mediated by IQ was 79%. The schizophrenia PRS was non-significantly associated with a higher number of beads drawn (B = 0.47, 95% CI −0.21 to 1.16, p = 0.17), whereas the IQ PRS significantly predicted the number of beads drawn (B = 0.51, 95% CI 0.25–0.76, p < 0.001) and was thus associated with a reduced JTC bias. The JTC bias was also associated with a higher level of psychotic-like experiences (PLEs) in controls, including after controlling for IQ (B = −1.7, 95% CI −2.8 to −0.5, p = 0.006), but was not related to delusions in patients.
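The mediated proportion reported above is simply the indirect effect divided by the total effect. A minimal sketch (the effect sizes are hypothetical numbers chosen only to reproduce a 79% proportion, not the study's estimates):

```python
def proportion_mediated(total_effect, direct_effect):
    """Proportion of a total effect carried through the mediator:
    indirect / total, where indirect = total - direct."""
    indirect = total_effect - direct_effect
    return indirect / total_effect

# Hypothetical case/control effect on JTC (arbitrary units): total
# effect 1.0, of which 0.21 remains after adjusting for IQ.
print(proportion_mediated(total_effect=1.0, direct_effect=0.21))  # 0.79
```

In the mediation framework, a large proportion like this is what licenses the interpretation that the group difference in JTC travels largely through general cognition rather than alongside it.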
Our findings suggest that the JTC reasoning bias in psychosis might not be a specific cognitive deficit but rather a manifestation, or consequence, of general cognitive impairment. In the general population, by contrast, the JTC bias is related to PLEs independent of IQ. This work has the potential to inform interventions targeting cognitive biases in early psychosis.