Silphium spp. have garnered interest in Europe as a bioenergy crop and in North America as a perennial oilseed crop. However, at this early stage of domestication, little has been done to characterize wild collections for key traits, including important oilseed traits. The objective of this work was to develop a basic understanding of how biogeography and associated population genetic forces have shaped seed phenotypes in plant collections across the native range of Silphium integrifolium Michx. (Asteraceae: Heliantheae), the primary domestication candidate for oilseed use. A collection of 53 accessions was grown in a common environment in Salina, KS, a location well within the native range of the species in central North America. Plants from each collection site were randomly mated by hand to produce seed representative of each accession, and the seeds were subjected to analyses of seed dimensional traits, oil content, and oil composition. Kernel width varied along a latitudinal cline of collection site, while kernel length varied along a longitudinal cline. Palmitic and linoleic acids were inversely correlated with each other and varied along a longitudinal cline of collection site. The results indicate that accessions collected from more southwesterly sites tended to have larger seed, and that those from more westerly sites had higher linoleic acid content and lower palmitic and myristic acid content, all desirable phenotypes for an oilseed Silphium.
In coastal and island archaeology, carbonate mollusk shells are often among the most abundant materials available for radiocarbon (14C) dating. The marsh periwinkle (Littorina irrorata) is one such species, found ubiquitously along the Atlantic and Gulf coasts of the United States in both modern and archaeological contexts. This paper presents a novel approach to dating estuarine mollusks: rather than attempting to characterize the size and variability of reservoir effects to “correct” shell carbonate dates, we describe a compound-specific approach that isolates conchiolin, the organic matter bound within the shell matrix of L. irrorata. Conchiolin typically constitutes <5% of shell weight. In L. irrorata, it is derived from the snail’s terrestrial diet and is thus not strongly influenced by marine, hardwater, or other carbon reservoir effects. We compare the carbon isotopes (δ13C and Δ14C) of L. irrorata shell carbonate, conchiolin, and bulk soft tissue from six modern, live-collected specimens from Apalachicola Bay, Florida, with samples that represent possible sources of carbon within their environment, including surface sediments, marsh plant tissues, and dissolved inorganic carbon (DIC) in water. Ultimately, this paper demonstrates that samples obtained from wet chemical oxidation of L. irrorata conchiolin produce accurate 14C dates.
To validate a system to detect ventilator-associated events (VAEs) autonomously and in real time.
Retrospective review of ventilated patients using a secure informatics platform to identify VAEs (ie, automated surveillance) compared to surveillance by infection control (IC) staff (ie, manual surveillance), including development and validation cohorts.
The Massachusetts General Hospital, a tertiary-care academic health center, during January–March 2015 (development cohort) and January–March 2016 (validation cohort).
Ventilated patients in 4 intensive care units.
The automated process included (1) analysis of physiologic data to detect increases in positive end-expiratory pressure (PEEP) and fraction of inspired oxygen (FiO2); (2) querying the electronic health record (EHR) for leukopenia or leukocytosis and antibiotic initiation data; and (3) retrieval and interpretation of microbiology reports. The cohorts were evaluated as follows: (1) manual surveillance by IC staff with independent chart review; (2) automated surveillance detection of ventilator-associated condition (VAC), infection-related ventilator-associated complication (IVAC), and possible VAP (PVAP); (3) senior IC staff adjudicated manual surveillance–automated surveillance discordance. Outcomes included sensitivity, specificity, positive predictive value (PPV), and manual surveillance detection errors. Errors detected during the development cohort resulted in algorithm updates applied to the validation cohort.
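As a concrete illustration of step (1), the sketch below encodes the CDC NHSN ventilator-associated condition (VAC) trigger that automated surveillance of this kind evaluates: a rise in daily minimum PEEP of ≥3 cm H2O, or in daily minimum FiO2 of ≥0.20, sustained for at least 2 calendar days relative to the preceding 2-day baseline. This is a minimal sketch under stated assumptions, not the authors' implementation; the function name is hypothetical, and the baseline handling is simplified (it omits the NHSN requirement that baseline days be stable or decreasing).

```python
# Minimal sketch of the NHSN ventilator-associated condition (VAC) trigger;
# not the authors' implementation. Inputs are one patient's daily minimum
# ventilator settings in calendar-day order. Simplification: the baseline is
# the minimum of the prior 2 days, without verifying that those days were
# stable or decreasing as the full NHSN definition requires.

def detect_vac(daily_min_peep, daily_min_fio2):
    """Return indices of days on which a VAC trigger fires: a rise of
    >=3 cm H2O in daily minimum PEEP or >=0.20 in daily minimum FiO2,
    sustained for 2 calendar days, over the prior 2-day baseline."""
    events = []
    for day in range(2, len(daily_min_peep) - 1):
        base_peep = min(daily_min_peep[day - 2:day])
        base_fio2 = min(daily_min_fio2[day - 2:day])
        peep_rise = all(p >= base_peep + 3 for p in daily_min_peep[day:day + 2])
        fio2_rise = all(f >= base_fio2 + 0.20 for f in daily_min_fio2[day:day + 2])
        if peep_rise or fio2_rise:
            events.append(day)
    return events

# Example: stable 2-day baseline, then a sustained FiO2 increase on day 2.
print(detect_vac([5, 5, 5, 5, 5], [0.40, 0.40, 0.65, 0.70, 0.55]))  # -> [2]
```

In a full system, steps (2) and (3) would then query the EHR for the leukocyte counts, antibiotic starts, and microbiology results needed to escalate a VAC to IVAC or PVAP.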
In the development cohort, there were 1,325 admissions, 479 ventilated patients, 2,539 ventilator days, and 47 VAEs. In the validation cohort, there were 1,234 admissions, 431 ventilated patients, 2,604 ventilator days, and 56 VAEs. With manual surveillance, in the development cohort, sensitivity was 40%, specificity was 98%, and PPV was 70%. In the validation cohort, sensitivity was 71%, specificity was 98%, and PPV was 87%. With automated surveillance, in the development cohort, sensitivity was 100%, specificity was 100%, and PPV was 100%. In the validation cohort, sensitivity was 85%, specificity was 99%, and PPV was 100%. Manual surveillance detection errors included missed detections, misclassifications, and false detections.
Manual surveillance is vulnerable to human error. Automated surveillance is more accurate and more efficient for VAE surveillance.
On August 25, 2017, Hurricane Harvey made landfall near Corpus Christi, Texas. The ensuing unprecedented flooding throughout the Texas coastal region affected millions of individuals.1 The statewide response in Texas included the sheltering of thousands of individuals at considerable distances from their homes. The Dallas area established large-scale general population sheltering as the number of evacuees arriving in the area grew. Historically, the Dallas area is one familiar with “mega-sheltering,” beginning with the response to Hurricane Katrina in 2005.2 Through continued efforts and development, the Dallas area had been readying a plan for the largest general population shelter in Texas. (Disaster Med Public Health Preparedness. 2019;13:33–37)
Over the past 30 years, the number of US doctoral anthropology graduates has increased by about 70%, but there has not been a corresponding increase in the availability of new faculty positions. Consequently, doctoral degree-holding archaeologists face more competition than ever before when applying for faculty positions. Here we examine where US and Canadian anthropological archaeology faculty originate and where they ultimately end up teaching. Using data derived from the 2014–2015 AnthroGuide, we rank doctoral programs whose graduates in archaeology have been most successful in the academic job market; identify long-term and ongoing trends in doctoral programs; and discuss gender division in academic archaeology in the US and Canada. We conclude that success in obtaining a faculty position upon graduation is predicated in large part on where one attends graduate school.
Debates about presidential greatness have been with us for decades, facilitated in part by numerous systematic surveys of scholars with expertise in American history and politics. Nevertheless, the voice of political scientists in this debate has been relatively muted, particularly compared with the role that historians have had in making these determinations. This article introduces and assesses the results of a recent effort to capture the attitudes of political science presidency experts about presidential greatness. By surveying the membership of the APSA Presidents and Executive Politics section, we were able to identify the attitudes of political scientists and compare them against the growing body of ratings and rankings of a phenomenon of long-standing interest and importance.
The Omani basement is located spatially distant from the dominantly juvenile Arabian–Nubian Shield (ANS) to its west, and its relationship to the amalgamation of those arc terranes has yet to be properly constrained. The Jebel Ja'alan (NE Oman) basement inlier provides an excellent opportunity to better understand the Neoproterozoic tectonic geography of Oman and its relationship to the ANS. To understand the origin of this basement inlier, we present new radiogenic isotopic data from igneous bodies in Jebel Ja'alan. U–Pb and 40Ar/39Ar geochronological data are used to constrain the timing of magmatism and metamorphism in the jebel. Positive εHf and εNd values indicate a juvenile origin for the igneous lithologies. Phase equilibria modelling is used to constrain the metamorphic conditions recorded by the basement. Pressure–temperature (P–T) pseudosections show that basement schists followed a clockwise P–T path, reaching peak metamorphic conditions of c. 650–700°C at 4–7.5 kbar, corresponding to a thermal gradient of c. 90–160°C/kbar. From the calculated thermal gradient, in conjunction with collected trace-element data, we interpret that the Jebel Ja'alan basement formed in an arc environment. Geochronological data indicate that this juvenile arc formed during Tonian time and is older than the basement further west in Oman. We argue that the difference in timing is related to westwards arc accretion and migration, which implies that the Omani basement represents its own tectonic domain separate from the ANS and may be the leading edge of the Neoproterozoic accretionary margin of India.
Neonates undergoing heart surgery for CHD are at risk for postoperative gastrointestinal complications and aspiration events. There are limited data regarding the prevalence of aspiration after neonatal cardiothoracic surgery; thus, the effects of aspiration events on this patient population are not well understood. This retrospective chart review examined the prevalence and effects of aspiration, as of the time of discharge, among neonates who had undergone cardiac surgery.
This study examined the prevalence of aspiration among neonates who had undergone cardiac surgery. Demographic data regarding these patients were analysed in order to determine risk factors for postoperative aspiration. Post-discharge feeding routes and therapeutic interventions were extracted to examine the time spent using alternate feeding routes because of aspiration risk or poor caloric intake. Modified barium swallow study results were used to evaluate the effectiveness of the test as a diagnostic tool.
Materials and methods
A retrospective study was undertaken of neonates who had undergone heart surgery from July, 2013 to January, 2014. Data describing patient demographics, feeding methods, and follow-up visits were recorded and compared using a χ2 test for goodness of fit and a Kaplan–Meier graph.
The patient population included 62 infants, 36 of whom were male and 10 of whom were born with single-ventricle circulation. The median age at surgery was 6 days (interquartile range=4 to 10 days). Modified barium swallow study results showed that 46% of patients (n=29) aspirated or were at risk for aspiration, as indicated by laryngeal penetration. In addition, 48% (n=10) of subjects with a negative barium swallow or no swallow study demonstrated clinical aspiration events. Tube feedings were required by 66% (n=41) of the participants. The median time spent on tube feeds, whether in combination with oral feeds or with exclusive use of a nasogastric or gastric tube, was 54 days; 44% (n=27) of patients received tube feedings for more than 120 days. Premature infants were significantly more likely to have aspiration events than infants delivered at full gestational age (p=0.002). Infants with single-ventricle circulation spent a longer time on tube feeds (median=95 days) than infants with two-ventricle defects (median=44 days); the type of cardiac defect was independent of the prevalence of an aspiration event.
Aspiration is common following neonatal cardiac surgery. The modified barium swallow study is often used to identify aspiration events and to determine an infant’s risk for aspirating. This leads to a high proportion of infants who require tube feedings following neonatal cardiac surgery.
In current practice, children with anatomically normal hearts routinely undergo fluoroscopy-free ablations. Infants and children with congenital heart disease (CHD) represent the most difficult population to perform catheter ablation without fluoroscopy. We report two neonatal patients with CHD in whom cardiac ablations were performed without fluoroscopy. The first infant had pulmonary atresia with intact ventricular septum with refractory supraventricular tachycardia, and the second infant presented with Ebstein’s anomaly of the tricuspid valve along with persistent supraventricular tachycardia. Both patients underwent uncomplicated, successful ablation without recurrence of arrhythmias. These cases suggest that current approaches to minimising fluoroscopy may be useful even in challenging patients such as neonates with CHD.
Metabarcoding, the coupling of DNA-based species identification and high-throughput sequencing, offers enormous promise for arthropod biodiversity studies, but factors such as cost, speed, and ease of use of bioinformatic pipelines, crucial for making the leap from demonstration studies to real-world application, have not yet been adequately addressed. Here, four published and one newly designed primer sets were tested across a diverse set of 80 arthropod species, representing 11 orders, to establish optimal protocols for Illumina-based metabarcoding of tropical Malaise trap samples. The two primer sets that showed the highest amplification success in individual-specimen polymerase chain reaction (PCR; 98%) were used for bulk PCR and Illumina MiSeq sequencing. The sequencing outputs were subjected to both a manual and a simple metagenomics quality-control and filtering pipeline. We obtained acceptable detection rates after bulk PCR and high-throughput sequencing (80–90% of input species), but analyses were complicated by putative heteroplasmic sequences and contamination. The manual pipeline produced outputs similar or superior to those of the simple metagenomics pipeline (1.4 compared with 0.5 expected:unexpected operational taxonomic units). Our study suggests that metabarcoding is slowly becoming as cheap, fast, and easy as conventional DNA barcoding, and that Malaise trap metabarcoding may soon fulfill its potential, providing a thermometer for biodiversity.
Longitudinal normative data obtained from a robust elderly sample (i.e., one believed to be free from neurodegenerative disease) are sparse. The purpose of the present study was to develop reliable change indices (RCIs) that can assist with interpretation of test score changes relative to a healthy sample of older adults (ages 50+). Participants were 4217 individuals who completed at least three annual evaluations at one of 34 past and present Alzheimer’s Disease Centers throughout the United States. All participants were diagnosed as cognitively normal at every study visit; participants completed from three to nine approximately annual evaluations. One-year RCIs were calculated for 11 neuropsychological variables in the Uniform Data Set by regressing follow-up test scores onto baseline test scores, age, education, visit number, post-baseline assessment interval, race, and sex in a linear mixed-effects regression framework. In addition, the cumulative frequency distributions of raw score changes were examined to describe the base rates of test score changes. Baseline test score, age, education, and race were robust predictors of follow-up test scores across most tests. The effects of maturation (aging) were more pronounced on tests related to attention and executive functioning, whereas practice effects were more pronounced on tests of episodic and semantic memory. Interpretation of longitudinal changes on 11 cognitive test variables can be facilitated through the use of reliable change intervals and base rates of score changes in this robust sample of older adults. A Web-based calculator is provided to assist neuropsychologists with interpretation of longitudinal change. (JINS, 2015, 21, 558–567)
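As a sketch of the regression approach just described, the following shows how one might fit such a mixed-effects model and turn its predictions into a reliable change interval. This is an illustrative sketch, not the authors' code: the DataFrame `df`, its column names, and the choice of a 90% interval (z = 1.645) are assumptions.

```python
# Illustrative sketch of a regression-based reliable change index (RCI);
# not the authors' code. Assumes a long-format DataFrame `df` with one row
# per follow-up observation and hypothetical column names.
import numpy as np
import statsmodels.formula.api as smf

def fit_rci_model(df):
    """Regress follow-up test score on baseline score and demographics,
    with a random intercept per participant (linear mixed-effects model)."""
    model = smf.mixedlm(
        "followup_score ~ baseline_score + age + education + visit_number"
        " + interval_years + race + sex",
        data=df,
        groups=df["participant_id"],
    )
    return model.fit()

def reliable_change_interval(result, new_obs, z=1.645):
    """Predicted follow-up score +/- z times the residual standard
    deviation (z = 1.645 gives a 90% interval)."""
    predicted = result.predict(new_obs)  # fixed-effects prediction
    resid_sd = np.sqrt(result.scale)     # residual (within-person) SD
    return predicted - z * resid_sd, predicted + z * resid_sd
```

Under this scheme, an examinee whose observed follow-up score falls outside the interval would be flagged as showing reliable decline or improvement relative to demographically similar healthy peers.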
Information-processing biases may contribute to the intergenerational transmission of depression. There is growing evidence that children of depressed mothers exhibit attentional biases for sad faces. However, findings are mixed as to whether this bias reflects preferential attention toward, versus attentional avoidance of, sad faces, suggesting the presence of unmeasured moderators. To address these mixed findings, we focused on the potential moderating role of genes associated with hypothalamic–pituitary–adrenal axis reactivity. Participants included children (8–14 years old) of mothers with (n = 81) and without (n = 81) a history of depression. Eye movements were recorded while children passively viewed arrays of angry, happy, sad, and neutral faces. DNA was obtained from buccal cells. Children of depressed mothers exhibited more sustained attention to sad faces than did children of nondepressed mothers. Importantly, however, this relation was moderated by children's genotype. Specifically, children of depressed mothers who carried reactive genotypes across the corticotropin-releasing hormone receptor 1 (CRHR1) TAT haplotype and FK506 binding protein 5 (FKBP5) rs1360780 (but not the serotonin transporter linked polymorphic region [5-HTTLPR] of solute carrier family 6 member 4 [SLC6A4]) exhibited less sustained attention to sad faces and more sustained attention to happy faces. These findings highlight the role played by specific genetic influences and suggest that previous mixed findings may have been due to genetic heterogeneity across samples.
To identify factors associated with the development of surgical site infection (SSI) among adult patients undergoing renal transplantation.
A retrospective cohort study.
An urban tertiary care center in Baltimore, Maryland, with a well-established renal transplantation program that performs ~200–250 renal transplant procedures annually.
A total of 441 adult patients underwent renal transplantation between January 1, 2010, and December 31, 2011. Of these 441 patients, 66 (15%) developed an SSI; of these 66, 31 (47%) were superficial incisional infections and 35 (53%) were deep incisional or organ-space infections. The average body mass index (BMI) in this patient cohort was 29.7; 184 (42%) were obese (BMI >30). Patients who developed an SSI had a greater mean BMI (31.7 vs 29.4; P=.004) and were more likely to have a history of peripheral vascular disease, rheumatologic disease, and narcotic abuse. A history of cerebrovascular disease was protective. Multivariate analysis showed BMI (odds ratio [OR], 1.06; 95% confidence interval [CI], 1.02–1.11) and a past history of narcotic use/abuse (OR, 4.86; 95% CI, 1.24–19.12) to be significantly associated with the development of SSI after controlling for National Healthcare Safety Network (NHSN) score and the presence of cerebrovascular, peripheral vascular, and rheumatologic disease.
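To make the multivariate analysis above concrete, a minimal sketch of the corresponding logistic regression is shown below. It is not the authors' code; the DataFrame `df`, its column names, and the 0/1 outcome coding are hypothetical.

```python
# Illustrative sketch of the multivariable logistic regression reported
# above; not the authors' code. Assumes one row per transplant recipient
# in `df`, with hypothetical column names and a 0/1 `ssi` outcome.
import numpy as np
import statsmodels.formula.api as smf

def fit_ssi_model(df):
    """Model SSI as a function of BMI and narcotic use/abuse, controlling
    for NHSN score and comorbidity indicators."""
    result = smf.logit(
        "ssi ~ bmi + narcotic_use + nhsn_score"
        " + cerebrovascular + peripheral_vascular + rheumatologic",
        data=df,
    ).fit()
    # Exponentiate the coefficients to obtain adjusted odds ratios
    # with 95% confidence intervals, as reported in the text.
    odds_ratios = np.exp(result.params)
    conf_int = np.exp(result.conf_int())
    return result, odds_ratios, conf_int
```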
We identified higher BMI as a risk factor for the development of SSI following renal transplantation. Notably, neither aggregate comorbidity scores nor the NHSN risk index was associated with SSI in this population. Additional risk-adjustment measures and research in this area are needed to compare SSIs across transplant centers.