Commercialization of 2,4-D–tolerant crops is a major concern for sweetpotato producers because of potential 2,4-D drift that can cause severe crop injury and yield reduction. A field study was initiated in 2014 and repeated in 2015 to assess the impacts of reduced rates of 2,4-D, glyphosate, or a combination of the two on sweetpotato. In one study, 2,4-D and glyphosate were applied alone and in combination at 1/10, 1/100, 1/250, 1/500, 1/750, and 1/1,000 of anticipated field use rates (1.05 kg ha−1 for 2,4-D and 1.12 kg ha−1 for glyphosate) to ‘Beauregard’ sweetpotato at storage root formation (10 d after transplanting [DAP]). In a separate study, the same treatments were applied to ‘Beauregard’ sweetpotato at storage root development (30 DAP). Injury with 2,4-D alone or in combination with glyphosate was generally equal to or greater than with glyphosate applied alone at equivalent rates, indicating that injury in the combination is attributable mostly to 2,4-D. There was a quadratic increase in crop injury and a quadratic decrease in crop yield (with respect to most yield grades) with increasing rate of 2,4-D applied alone or in combination with glyphosate at storage root development. However, with a few exceptions, neither this relationship nor a significant effect of herbicide rate was observed on crop injury or sweetpotato yield when application occurred at storage root formation. In general, crop injury and yield reduction were greatest at the highest rate (1/10×) of 2,4-D applied alone or in combination with glyphosate, although injury observed at lower rates would also be cause for concern to sweetpotato producers. However, in some cases, yield reduction of U.S. No. 1 and marketable grades was also observed after application of 1/250×, 1/100×, or 1/10× rates of 2,4-D alone or with glyphosate at storage root development.
A major concern of sweetpotato producers is the potential negative effect of herbicide drift or sprayer contamination when dicamba is applied to nearby dicamba-resistant crops. A field study was initiated in 2014 and repeated in 2015 to assess the effects of reduced rates of the N,N-bis(3-aminopropyl)methylamine (BAPMA) or diglycolamine (DGA) salt of dicamba, glyphosate, or each dicamba salt in combination with glyphosate on sweetpotato. Reduced rates of 1/10, 1/100, 1/250, 1/500, 1/750, and 1/1,000 of the 1× use rate of each dicamba formulation (0.56 kg ha−1), glyphosate (1.12 kg ha−1), and the combination of the two at the aforementioned rates were applied to ‘Beauregard’ sweetpotato at storage root formation (10 d after transplanting) in one trial and at storage root development (30 d after transplanting) in a separate trial. Injury with either salt of dicamba (BAPMA or DGA) applied alone or with glyphosate was generally equal to or greater than with glyphosate applied alone at equivalent rates, indicating that injury in the combination is attributable mostly to dicamba. There was a quadratic increase in crop injury and a quadratic decrease in crop yield (with respect to most yield grades) with increasing rate of dicamba applied alone or in combination with glyphosate at storage root development. However, with a few exceptions, neither this relationship nor a significant effect of herbicide rate was observed on crop injury or sweetpotato yield when application occurred at the storage root formation stage. In general, crop injury and yield reduction were greatest at the highest rate (1/10×) of either salt of dicamba applied alone or in combination with glyphosate, although injury observed at lower rates would also be cause for concern to sweetpotato producers.
However, in some cases yield reduction of No. 1 and marketable grades was observed following 1/250×, 1/100×, or 1/10× application rates of dicamba alone or with glyphosate when applied at storage root development.
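The quadratic rate-response relationship reported in both drift studies can be illustrated with a minimal curve-fitting sketch. The rate fractions mirror those in the abstracts, but the injury values below are purely hypothetical, not the studies' measurements:

```python
import numpy as np

# Fractions of a 1x field use rate, as in the abstracts; the injury
# percentages are illustrative placeholders, NOT the study's data.
rate_fraction = np.array([1/1000, 1/750, 1/500, 1/250, 1/100, 1/10])
injury_pct = np.array([0.5, 0.7, 1.0, 2.0, 6.0, 52.0])

# Fit injury = b2*x**2 + b1*x + b0, the quadratic form the abstracts describe.
b2, b1, b0 = np.polyfit(rate_fraction, injury_pct, deg=2)
predicted = np.polyval([b2, b1, b0], rate_fraction)
print("fitted coefficients:", b2, b1, b0)
```

A fit of this form captures injury rising disproportionately toward the 1/10× rate, consistent with the pattern both abstracts report at storage root development.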
De Fruyt and De Clercq (this volume) and Sellbom (this volume) raise important issues surrounding the use of the five-factor model (FFM) of personality to conceptualize, assess, and diagnose personality disorder, including how to incorporate a measure of impairment, the optimal level of abstraction (domains vs. facets), and the need for formal test manuals, normative data, and means of identifying non-credible responding. In this response, the authors note their agreement with De Fruyt and De Clercq regarding the importance of assessing impairment but note that (a) it is already included to a large degree in the assessment of pathological FFM traits and (b) they would prefer an approach that focuses explicitly on difficulties in occupational and social functioning. As to Sellbom’s comments suggesting that further work is necessary on test manuals, normative databases, and measures of valid responding, the authors agree and note some of the obstacles, including the need for funding to collect normative data (and for consensus as to the kind needed: clinical only, community only, or both) and to develop test manuals. As to measures of credible responding, they note that many already exist for the family of FFM PD scales that they helped create, and that they are aware of similar efforts for other popular measures of pathological traits.
The use of dimensional personality traits with explicit ties to general or normative personality has gone mainstream, with instantiation in the most recent version of the Diagnostic and Statistical Manual of Mental Disorders (DSM-5) and the soon-to-be-released 11th revision of the International Classification of Diseases (ICD-11). Much of the theoretical and empirical work supporting the transition to dimensional trait-based models of personality disorder has used the prominent five-factor model of personality, which suggests that five basic dimensions capture much of the important and reliable personality variance: neuroticism, extraversion, openness to experience, agreeableness, and conscientiousness. This chapter reviews this literature and demonstrates how general and pathological five-factor models of personality are parsimonious, valid, and useful. The authors believe that the use of such models for the diagnosis of personality disorder represents a much-needed and empirically supported movement to integrate normative and pathological personality.
Vascular surgery patients are nutritionally vulnerable. Various malnutrition screening and assessment tools are available; however, none has been developed or validated in vascular patients. The present study aimed to: (1) investigate the validity of four commonly administered malnutrition screening tools (the Malnutrition Screening Tool (MST), the Malnutrition Universal Screening Tool (MUST), the Nutritional Risk Screening 2002 (NRS-2002) and the Mini Nutritional Assessment – Short Form (MNA-SF)) and an assessment tool (the Patient-Generated Subjective Global Assessment (PG-SGA)) against a comprehensive dietitian’s assessment and (2) evaluate the ability of the instruments to predict outcomes. Vascular inpatients were screened using the four malnutrition screening tools and assessed using the PG-SGA. Each patient was also assessed by a dietitian, incorporating nutritional biochemistry, anthropometry and changes in dietary intake. Diagnostic accuracy, consistency and predictive ability were determined. A total of 322 patients (69·3 % male) participated, with 75 % having at least one parameter indicating nutritional deficits. No instrument achieved the a priori level for sensitivity (14·9–52·5 %). No tool predicted the EuroQoL 5-dimension 5-level score. All tools except the MNA-SF were associated with length of stay (LOS); however, the direction varied: increased risk of malnutrition on the MUST and NRS-2002 was associated with shorter LOS (P=0·029 and 0·045), and the reverse held for the MST and PG-SGA (P=0·005 and <0·001). The NRS-2002 was associated with increased risk of complications (P=0·039). The MST, NRS-2002 and PG-SGA were predictive of discharge to an institution (P=0·004, 0·005 and 0·003). The tools studied were unable to identify the high prevalence of undernutrition; hence, vascular disease-specific screening and/or assessment tools are warranted.
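The diagnostic accuracy reported for each screening tool rests on a standard 2×2 comparison against the reference (dietitian's) assessment. A minimal sketch of that calculation, with a made-up example rather than the study's data:

```python
def diagnostic_accuracy(screen_positive, reference_positive):
    """Sensitivity and specificity of a screening tool against a reference
    assessment, from paired boolean results (one pair per patient)."""
    pairs = list(zip(screen_positive, reference_positive))
    tp = sum(s and r for s, r in pairs)            # screen +, reference +
    fn = sum((not s) and r for s, r in pairs)      # screen -, reference +
    tn = sum((not s) and (not r) for s, r in pairs)  # screen -, reference -
    fp = sum(s and (not r) for s, r in pairs)      # screen +, reference -
    sensitivity = tp / (tp + fn) if (tp + fn) else float("nan")
    specificity = tn / (tn + fp) if (tn + fp) else float("nan")
    return sensitivity, specificity

# Hypothetical four-patient example (not the study's cohort):
sens, spec = diagnostic_accuracy([True, False, True, False],
                                 [True, True, False, False])
print(sens, spec)  # -> 0.5 0.5
```

The study's central finding, sensitivities of only 14·9–52·5 %, corresponds to a low tp/(tp+fn) ratio: most patients the dietitian classed as malnourished screened negative.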
A national need is to prepare for and respond to accidental or intentional disasters categorized as chemical, biological, radiological, nuclear, or explosive (CBRNE). These incidents require specific subject-matter expertise, yet they share commonalities. We identify 7 core elements of CBRNE science that require integration for effective preparedness planning and for public health and medical response and recovery. These core elements are (1) basic and clinical sciences, (2) modeling and systems management, (3) planning, (4) response and incident management, (5) recovery and resilience, (6) lessons learned, and (7) continuous improvement. A key feature is the ability of relevant subject-matter experts to integrate information into response operations. We propose the CBRNE medical operations science support expert as a professional who (1) understands that CBRNE incidents require an integrated systems approach, (2) understands the key functions and contributions of CBRNE science practitioners, (3) helps direct strategic and tactical CBRNE planning and responses through first-hand experience, and (4) advises senior decision-makers managing response activities. Recognizing CBRNE science as a distinct competency and establishing the CBRNE medical operations science support expert informs the public of the enormous progress made, broadcasts opportunities for new talent, and enhances the sophistication and analytic expertise of senior managers planning for and responding to CBRNE incidents.
The association between lower birth weight and increased disease risk in adulthood has drawn attention to the physiological processes that shape the gestational environment. We implement genome-wide transcriptional profiling of maternal blood samples to identify subsets of genes and associated transcription control pathways that predict offspring birth weight. Female participants (N = 178, mean = 27.0 years) in a prospective observational birth cohort study were contacted between 2009 and 2014 to identify new pregnancies. An in-home interview was scheduled for early in the third trimester (mean = 30.3 weeks) to collect pregnancy-related information and a blood sample, and birth weight was measured shortly after delivery. Transcriptional activity in white blood cells was determined with a whole-genome gene expression direct hybridization assay. Fifty transcripts were differentially expressed in association with offspring birth weight, with 18 up-regulated in relation to lower birth weight, and 32 down-regulated. Examination of transcription control pathways identified increased activity of the NF-κB, AP-1, EGR1, EGR4, and Gfi families, and reduced activity of CEBP, in association with lower birth weight. Transcript origin analyses identified non-classical CD16+ monocytes, CD1c+ myeloid dendritic cells, and neutrophils as the primary cellular mediators of differential gene expression. These results point toward a systematic regulatory shift in maternal white blood cell activity in association with lower offspring birth weight, and they suggest that analyses of gene expression during gestation may provide insight into regulatory and cellular mechanisms that influence birth outcomes.
Children with congenital heart disease are at high risk for malnutrition. Standardisation of feeding protocols has shown promise in decreasing some of this risk. With little standardisation between institutions’ feeding protocols and no understanding of protocol adherence, it is important to analyse the efficacy of individual aspects of the protocols.
Adherence to and deviation from a feeding protocol in high-risk congenital heart disease patients between December 2015 and March 2017 were analysed. Associations between adherence to and deviation from the protocol and clinical outcomes were also assessed. The primary outcome was change in weight-for-age z score between time intervals.
Increased adherence to and decreased deviation from individual instructions of a feeding protocol improves patients' change in weight-for-age z score between birth and hospital discharge (p = 0.031). Secondary outcomes such as markers of clinical severity and nutritional delivery were not statistically different between groups with high or low adherence or deviation rates.
High-risk feeding protocol adherence and fewer deviations are associated with weight gain independent of their influence on nutritional delivery and caloric intake. Future studies assessing the efficacy of feeding protocols should include measures of adherence and deviation, not merely caloric delivery and illness severity.
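The weight-for-age z score used as the primary outcome expresses a child's weight in standard deviations from an age-matched reference. A simplified sketch, with hypothetical reference values; published growth standards such as the WHO's actually use the LMS method rather than this plain normal-reference form:

```python
def weight_for_age_z(weight_kg: float, ref_median_kg: float, ref_sd_kg: float) -> float:
    """Simplified z score: standard deviations from the reference median
    for the child's age. Assumes a normal reference distribution; real
    growth standards (e.g., WHO) adjust for skew via the LMS method."""
    return (weight_kg - ref_median_kg) / ref_sd_kg

# Hypothetical infant weighing 3.1 kg at an age whose reference median
# is 3.3 kg with SD 0.4 kg (illustrative numbers only):
print(round(weight_for_age_z(3.1, 3.3, 0.4), 2))  # -> -0.5
```

The study's outcome is the change in this score between two time points, so a positive change indicates the child gaining ground relative to the reference population.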
The Enenteridae Yamaguti, 1958 and Gyliauchenidae Fukui, 1929 exhibit an interesting pattern of host partitioning in herbivorous fishes of the Indo-West Pacific. Enenterids are known almost exclusively from fishes of the family Kyphosidae, a group of herbivorous marine fishes common on tropical and temperate reefs. In contrast, gyliauchenids are found in most of the remaining lineages of marine herbivorous fishes, but until the present study, had never been known from kyphosids. Here we report on the first species of gyliauchenid known from a kyphosid. Endochortophagus protoporus gen. nov., sp. nov. was recovered from the Western buffalo bream, Kyphosus cornelii (Whitley, 1944), collected off Western Australia. Kyphosus cornelii also hosts an enenterid, Koseiria allanwilliamsi Bray & Cribb, 2002, and is thus the first fish known in which enenterids and gyliauchenids co-occur. Molecular phylogenetic analyses place the new species close to those of Affecauda Hall & Chambers, 1999 and Flagellotrema Ozaki, 1936, but there is sufficient morphological evidence, combined with the unusual host, to consider it distinct from these genera. We discuss factors which may have contributed to the host partitioning pattern observed between enenterids and gyliauchenids.
New evidence from archaeological investigations in north-east Thailand shows a transition in rice farming towards wetland cultivation that would have facilitated greater yields and surpluses. This evidence, combined with new dates and palaeoclimatic data, suggests that this transition took place in the Iron Age, at a time of increasingly arid climate, and when a number of broader societal changes become apparent in the archaeological record. For the first time, it is possible to relate changes in subsistence economy to shifts in regional climate and water-management strategies, and to the emergence of state societies in Southeast Asia.
As the IAU heads towards its second century, many changes have simultaneously transformed Astronomy and the human condition world-wide. Amid the amazing recent discoveries of exoplanets, primeval galaxies, and gravitational radiation, the human condition on Earth has become blazingly interconnected, yet beset with ever-increasing problems of over-population, pollution, and never-ending wars. Fossil-fueled global climate change has begun to yield perilous consequences. And the displacement of people from war-torn nations has reached levels not seen since World War II.
Good education requires student experiences that deliver lessons about practice as well as theory and that encourage students to work for the public good—especially in the operation of democratic institutions (Dewey 1923; Dewey 1938). We report on an evaluation of the pedagogical value of a research project involving 23 colleges and universities across the country. Faculty trained and supervised students who observed polling places in the 2016 General Election. Our findings indicate that this was a valuable learning experience in both the short and long terms. Students found their experiences to be valuable and reported learning both in general and specifically related to course material. Postelection, they also felt more knowledgeable about election science topics, voting behavior, and research methods. Students reported interest in participating in similar research in the future, would recommend it to other students, and expressed interest in more learning and research about the topics central to their experience. Our results suggest that participants appreciated the importance of elections and their study. Collectively, the participating students are engaged and efficacious—essential qualities of citizens in a democracy.
Mass casualty incidents are a concern in many urban areas. A community’s ability to cope with such events depends on the capacities and capabilities of its hospitals for handling a sudden surge of patients with resource-intensive and specialized medical needs. This paper uses a whole-hospital simulation model to replicate medical staff, resources, and space for the purpose of investigating hospital responsiveness to mass casualty incidents. It provides details of probable demand patterns of different mass casualty incident types in terms of patient categories and arrival patterns, and accounts for related transient system behavior over the response period. Using the layout of a typical urban hospital, it investigates a hospital’s capacity and capability to handle mass casualty incidents of various sizes with various characteristics, and assesses the effectiveness of designed demand management and capacity-expansion strategies. Average performance improvements gained through capacity-expansion strategies are quantified and best response actions are identified. Capacity-expansion strategies were found to have superadditive benefits when combined. In fact, an acceptable service level could be achieved by implementing only 2 to 3 of the 9 studied enhancement strategies. (Disaster Med Public Health Preparedness. 2018;12:778-790)
Outpatient central venous catheters (CVCs) are being used more frequently; however, data describing mechanical complications and central-line–associated bloodstream infections (CLABSI) in the outpatient setting are limited. We performed a retrospective observational cohort study to understand the burden of these complications to elucidate their impact on the healthcare system.
Data were retrospectively collected on patients discharged from Vanderbilt University Medical Center with a CVC in place and admitted into the care of Vanderbilt Home Care Services. Risk factors for medically attended catheter-associated complications (CACs) and outpatient CLABSIs were analyzed.
A CAC developed in 143 patients (21.9%), for a total of 165 discrete CAC events. Among these, 76 (46%) required at least 1 visit to the emergency department or an inpatient admission, while the remaining 89 (54%) required an outpatient clinic visit. The risk for developing a CAC was significantly increased in female patients, patients with a CVC with >1 lumen, and patients receiving total parenteral nutrition. The absolute number of CLABSIs identified in the study population was small at 16, or 2.4% of the total cohort.
Medically attended catheter complications were common among outpatients discharged with a CVC, and reduction of these events should be the focus of outpatient quality improvement programs.
Debates about the value of digital methods often return to the nature of knowledge itself. Specifically, do digital methods not simply tell us what we intuitively already know? Or, if we do not know something yet, is it trivial, or discoverable through other, more traditional humanistic modes of analysis?
The varied textual traditions of the premodern Islamicate World represent an opportunity and a problem for the Digital Humanities (DH). The opportunity lies in the sheer extent of this textual heritage: if we combine the textual output of premodern Persian and Arabic authors (not to mention Turkish and other less well-represented Islamicate languages), this body of texts constitutes arguably the largest written repository of human culture. Analytical methods developed for other linguistic heritages can be repurposed to make use of this wealth of texts, and efforts are now underway to apply to them a series of computationally enhanced methods that derive from a variety of disciplines (e.g., corpus linguistics, computational linguistics, the social sciences, and statistics). The application of these forms of analysis to these large new corpora promises new insights on premodern Islamicate cultures and the improvement of existing digital tools and methodologies.
To examine dietary Na and K intake at eating occasions in Australian adults and identify the contribution of major food sources to Na and K at different eating occasions.
Secondary analysis of 24 h recall diet data from the Australian Health Survey (2011–2013).
Nationally representative survey in Australia.
Male and female Australians aged 18–84 years (n 7818).
Dinner contributed the greatest proportion to total daily Na intake (33 %) and K intake (35 %). Na density was highest at lunch (380 mg/MJ) and K density highest at between-meal time eating occasions (401 mg/MJ). Between-meal time eating occasions provided 20 % of daily Na intake and 26 % of daily K intake. The major food group sources of Na were different at meal times (breads and mixed dishes) compared with between-meal times (cakes, muffins, scones, cake-type desserts). The top food group sources of K at meal times were potatoes and unprocessed meat products and dishes.
Foods which contributed to Na and K intake differed according to eating occasion. Major food sources of Na were bread and processed foods. Major food sources of K were potatoes and meat products and dishes. Public health messages that emphasise meal-based advice and diet patterns high in vegetables, fruits and unprocessed foods may also aid reduction in dietary Na intake and increase in dietary K intake.
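The nutrient-density figures reported above (e.g., 380 mg Na/MJ at lunch) follow from a simple ratio of nutrient intake to energy intake at an eating occasion. A minimal sketch, with hypothetical meal values chosen for illustration:

```python
def nutrient_density(nutrient_mg: float, energy_kj: float) -> float:
    """Nutrient density in mg per MJ of energy (1 MJ = 1000 kJ)."""
    return nutrient_mg / (energy_kj / 1000.0)

# Hypothetical lunch providing 950 mg Na over 2500 kJ (2.5 MJ) of energy:
print(round(nutrient_density(950, 2500), 1))  # -> 380.0
```

Expressing intake per MJ, rather than in absolute milligrams, lets the abstract compare the saltiness of small between-meal occasions with that of full meals on an equal-energy footing.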