Most oviposition by Helicoverpa zea (Boddie) occurs near the top of the canopy in soybean, Glycine max (L.) Merr., and larval abundance is influenced by the growth habit of plants. However, the vertical distribution of larvae within the canopy is not as well known. We evaluated the vertical distribution of H. zea larvae in determinate and indeterminate varieties, hypothesizing that larval distribution in the canopy would vary between these two growth habits and over time. We tested this hypothesis in a naturally infested, replicated field experiment and two experimentally manipulated cage experiments. In the field experiment, flowering time was synchronized between the varieties by manipulating planting date, while infestation timing was manipulated in the cage experiments. Larvae were recovered by destructive sampling of individual soybean plants, and their vertical distribution by instar was recorded at three sampling points over time in each experiment. While larval population growth and development varied between the determinate and indeterminate varieties within and among experiments, we found little evidence that larvae have a preference for different vertical locations in the canopy. This study lends support to the hypothesis that larval movement and location within soybean canopies do not result entirely from oviposition location and nutritional requirements.
Helicoverpa zea (Boddie) is a damaging pest of many crops including soybean, Glycine max (L.) Merr., especially in the southern United States. Previous studies have concluded that oviposition and development of H. zea larvae mirror the phenology of soybean, with oviposition occurring during full bloom, younger larvae developing on blooms and leaves, intermediate-aged larvae developing on varying tissue types, and older larvae developing on flowers and pods. In a field trial, we investigated natural infestations of H. zea larvae by instar in determinate and indeterminate soybean varieties. In complementary experiments, we introduced H. zea adults into replicated cages and allowed them to oviposit on plants (one experiment with a determinate variety and two with an indeterminate variety). Plants were sampled weekly while larvae were present. In the natural infestation experiment, most larvae were found on blooms during R3 and were early to middle instars; by R4, most larvae were found on leaves and were middle to late instars. In contrast, in the cage study, most larvae were found on leaves regardless of soybean growth stage or larval stage. Determinate and indeterminate growth habit did not affect larval preference for different soybean tissue types. Our studies suggest that H. zea larvae prefer specific tissue types, but also provide evidence that experimental design can influence the results. Finally, our finding of larval preference for leaves contrasts with findings from previous studies.
A new species of Eriocaulon, E. vamanae, is described from the southern Western Ghats of Kerala, India. It resembles Eriocaulon nepalense var. luzulifolium (Mart.) Praj. & J.Parn. but differs in the shape of its involucral bracts and receptacle, the fusion of the sepals in male flowers, the shape and indumentum of the sepals in female flowers, the size and indumentum of the petals in female flowers, and the seed coat appendages. Eriocaulon vamanae is so far known only from the type locality, Meesapulimala in Idukki District, Kerala, and is assessed as ‘Critically Endangered’ according to the IUCN’s Red List Categories and Criteria.
United Nations (UN) personnel address a diverse range of political, social, and cultural crises throughout the world. Yet compared with other occupations routinely exposed to traumatic stress, this population has been the subject of little research on mental health disorders and access to mental healthcare. To fill this gap, personnel from UN agencies were surveyed for mental health disorders and mental healthcare utilization.
Methods
UN personnel (N = 17 363) from 11 UN entities completed online measures of generalized anxiety disorder (GAD), major depressive disorder (MDD), posttraumatic stress disorder (PTSD), trauma exposure, mental healthcare usage, and socio-demographic information.
Results
Exposure to one or more traumatic events was reported by 36.2% of survey respondents. Additionally, 17.9% screened positive for GAD, 22.8% for MDD, and 19.9% for PTSD. In multivariable logistic regressions, low job satisfaction, younger age (<35 years), greater length of employment, and trauma exposure on or off duty were significantly associated with all three disorders. Among individuals screening positive for a mental health disorder, 2.05% sought mental health treatment within the UN and 10.01% outside the UN in the past year.
Conclusions
UN personnel appear to be at high risk for trauma exposure and for screening positive for a mental health disorder, yet only a small percentage of those screening positive sought treatment. Given the mental health gaps observed in this study, additional research is needed, as these data reflect a large convenience sample and it cannot be determined whether the findings are representative of UN personnel as a whole.
A more efficient utilisation of marine-derived sources of dietary n-3 long-chain PUFA (n-3 LC PUFA) in cultured Atlantic salmon (Salmo salar L.) could be achieved by nutritional strategies that maximise endogenous n-3 LC PUFA synthesis. The objective of the present study was to quantify the extent of n-3 LC PUFA biosynthesis and the resultant effect on fillet nutritional quality in large fish. Four diets were manufactured, providing altered levels of dietary n-3 substrate, namely 18 : 3n-3, and end products, namely 20 : 5n-3 and 22 : 6n-3. After 283 d of feeding, fish grew to in excess of 3000 g, and no differences in growth performance or biometric parameters were recorded. An analysis of fatty acid composition and in vivo metabolism revealed that endogenous production of n-3 LC PUFA in fish fed a diet containing no added fish oil resulted in fillet levels of n-3 LC PUFA comparable with those of fish fed a diet with added fish oil. However, this result was not consistent among all treatments. Another major finding of this study was that, in the presence of abundant dietary n-3 substrate, the addition of dietary n-3 end products (i.e. fish oil) served to increase final fillet levels of n-3 LC PUFA. Specifically, preferential β-oxidation of dietary C18 n-3 PUFA resulted in conservation of n-3 LC PUFA from catabolism. Ultimately, this study highlights the potential for endogenous synthesis of n-3 LC PUFA to partially support a substantial reduction in the amount of dietary fish oil in diets for Atlantic salmon reared in seawater.
OBJECTIVES/SPECIFIC AIMS: Delirium, a form of acute brain dysfunction characterized by changes in attention and alertness, is a known independent predictor of mortality in the Intensive Care Unit (ICU). We sought to understand whether catatonia, a more recently recognized form of acute brain dysfunction, is associated with increased 30-day mortality in critically ill older adults. METHODS/STUDY POPULATION: We prospectively enrolled critically ill patients at a single institution who were on a ventilator or in shock and evaluated them daily for delirium using the Confusion Assessment Method for the ICU (CAM-ICU) and for catatonia using the Bush Francis Catatonia Rating Scale. Coma was defined as a Richmond Agitation-Sedation Scale score of −4 or −5. We used a Cox proportional hazards model to predict 30-day mortality after adjusting for delirium, coma, and catatonia status. RESULTS/ANTICIPATED RESULTS: We enrolled 335 medical, surgical, or trauma critically ill patients with 1103 matched delirium and catatonia assessments. Median age was 58 years (IQR: 48–67). The main indications for admission to the ICU were airway disease or protection (32%; N=100) and sepsis and/or shock (25%; N=79). In the unadjusted analysis, regardless of the presence of catatonia, non-delirious individuals had the highest median survival times, while delirious patients had the lowest. Comparing the absence and presence of catatonia, the presence of catatonia worsened survival (Figure 1). In a time-dependent Cox model, holding catatonia status constant, delirious individuals had 1.72 times the hazard of death (95% CI: 1.32, 2.23) compared with non-delirious individuals, while those with coma had 5.48 times the hazard of death (95% CI: 4.30, 6.98). For DSM-5 catatonia scores, a 1-unit increase in the score was associated with 1.18 times the hazard of in-hospital mortality.
Comparing two individuals with the same delirium status, each one-point increase in the DSM-5 catatonia score was associated with 1.178 times the hazard of death (95% CI: 1.086, 1.278), so an individual with 3 catatonia items present would have 1.63 times the hazard of death of an individual with no catatonia. DISCUSSION/SIGNIFICANCE OF IMPACT: Non-delirious individuals have the highest median survival times, while those who are comatose have the lowest median survival times after a critical illness, holding catatonia status constant. Comparing the absence and presence of catatonia, the presence of catatonia appears to worsen survival. Individuals who are both comatose and catatonic have the lowest median survival time.
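As a quick arithmetic check (a sketch based only on the figures quoted above, not the study's analysis code), the three-item hazard follows from compounding the per-item estimate, since hazard ratios in a Cox model multiply across units of a covariate:

```python
# In a Cox model, a per-unit hazard ratio compounds geometrically with the
# covariate value: HR(x) = HR_per_unit ** x.
hr_per_item = 1.178            # reported hazard ratio per one-point catatonia score
hr_three_items = hr_per_item ** 3
print(round(hr_three_items, 2))  # → 1.63, matching the reported three-item hazard
```

This is why the abstract can report both a per-unit estimate (1.178) and a three-item estimate (1.63) from a single model coefficient.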
The purpose of this study was to demonstrate effectiveness of an educational training workshop using role-playing to teach medical students in Botswana to deliver bad news.
Method
A 3-hour small-group workshop for University of Botswana medical students rotating at the Princess Marina Hospital in Gaborone was developed. The curriculum included an overview of communication basics and an introduction to the validated SPIKES protocol for breaking bad news. Education strategies included didactic lecture, handouts, role-playing cases, and open forum discussion. Pre- and posttraining surveys assessed prior exposure and approach to breaking bad news using multiple-choice questions, and perception of skill in breaking bad news using a 5-point Likert scale. An objective structured clinical examination (OSCE) with a standardized breaking-bad-news skills assessment was conducted; scores were compared between two medical student classes, one before and one after the workshop was implemented.
Result
Forty-two medical students attended the workshop and 83% (35/42) completed the survey. Medical students reported delivering bad news on average 6.9 (SD = 13.7) times monthly, with 71% (25/35) having delivered bad news themselves without supervision. The proportion of students who rated their skill and confidence in breaking bad news as “good” or “very good” increased from 23% (8/35) before to 86% (30/35) after the workshop. Post-workshop feedback showed that 100% of students found the SPIKES approach helpful and planned to use it in clinical practice; students also found role-playing helpful and requested more sessions. Competency for delivering bad news increased from a mean score of 14/25 (56%, SD = 3.3) at baseline to 18/25 (72%, SD = 3.6) after the workshop (p = 0.0002).
Significance of results
This workshop was effective in increasing medical student skill and confidence in delivering bad news. Standardized role-playing communication workshops integrated into medical school curricula could be a low-cost, effective, and easily implementable strategy to improve communication skills of doctors.
OBJECTIVES/SPECIFIC AIMS: Delirium is a well-described form of acute brain organ dysfunction characterized by decreased or increased movement, changes in attention and concentration, as well as perceptual disturbances (i.e., hallucinations) and delusions. Catatonia, a neuropsychiatric syndrome traditionally described in patients with severe psychiatric illness, can present as phenotypically similar to delirium and is characterized by increased, decreased, and/or abnormal movements, staring, rigidity, and mutism. Delirium and catatonia can co-occur in the setting of medical illness, but no studies have explored this relationship by age. Our objective was to assess whether advancing age and the presence of catatonia are associated with delirium. METHODS/STUDY POPULATION: We prospectively enrolled critically ill patients at a single institution who were on a ventilator or in shock and evaluated them daily for delirium using the Confusion Assessment Method for the ICU (CAM-ICU) and for catatonia using the Bush Francis Catatonia Rating Scale. Measures of association (odds ratios, OR) were assessed with a simple logistic regression model with catatonia as the independent variable and delirium as the dependent variable. Effect measure modification by age was assessed using a likelihood ratio test. RESULTS/ANTICIPATED RESULTS: We enrolled 136 medical and surgical critically ill patients with 452 matched (concomitant) delirium and catatonia assessments. Median age was 59 years (IQR: 52–68). In our cohort of 136 patients, 58 patients (43%) had delirium only, 4 (3%) had catatonia only, 42 (31%) had both delirium and catatonia, and 32 (24%) had neither. Age was significantly associated with prevalent delirium (i.e., increasing age was associated with decreased risk for delirium) (p=0.04) after adjusting for catatonia severity. Catatonia was significantly associated with prevalent delirium (p<0.0001) after adjusting for age.
Peak delirium risk was observed in patients aged 55 years with 3 or more catatonic signs, who had 53.4 times the odds of delirium (95% CI: 16.06, 176.75) compared with those with no catatonic signs. Patients 70 years and older with 3 or more catatonia features had half this risk. DISCUSSION/SIGNIFICANCE OF IMPACT: Catatonia is significantly associated with prevalent delirium even after controlling for age. These data support an inverted U-shaped risk of delirium by age after adjusting for catatonia. This relationship and its clinical ramifications need to be examined in a larger sample, including patients with dementia. Additionally, we need to assess which acute brain syndrome (delirium or catatonia) develops first.
High definition video from a towed camera system was used to describe the deep-sea benthic habitats within an elongate depression located at the western margin of Rockall Bank in the Hatton–Rockall Basin. At depths greater than 1190 m, an extensive area (10 km long by 1.5 km wide) of what appeared to be reduced sediments, bacterial mats and flocculent matter indicated possible cold-seep habitat. Plumes of sediment-rich fluid were observed alongside raised elongate features that gave topographic relief to the otherwise flat seafloor. In the deepest section of the depression (1215 m) dense flocculent matter was observed suspended in the water column, in places obscuring the seabed. Away from the bacterial mats, the habitat changed rapidly to sediments dominated by tube-dwelling polychaete worms and then to deep-sea sedimentary habitats more typical for the water depth (sponges and burrowing megafauna in areas of gentle slopes, and coral gardens on steeper slopes).
To identify predominant dietary patterns in four African populations and examine their association with obesity.
Design
Cross-sectional study.
Setting/Subjects
We used data from the Africa/Harvard School of Public Health Partnership for Cohort Research and Training (PaCT) pilot study, established to investigate the feasibility of a multi-country longitudinal study of non-communicable chronic disease in sub-Saharan Africa. We applied principal component analysis to dietary intake data collected with a food-frequency questionnaire (FFQ) developed for PaCT to ascertain dietary patterns in Tanzania, South Africa, and peri-urban and rural Uganda. The sample consisted of 444 women and 294 men.
Results
We identified two dietary patterns: a Mixed Diet pattern, characterized by high intakes of unprocessed foods such as vegetables and fresh fish, but also cold cuts and refined grains; and a Processed Diet pattern, characterized by high intakes of salad dressing, cold cuts and sweets. Women in the highest tertile of the Processed Diet pattern score were 3·00 times more likely to be overweight (95 % CI 1·66, 5·45; prevalence = 74 % v. 47 % in the lowest tertile) and 4·24 times more likely to be obese (95 % CI 2·23, 8·05; prevalence = 44 % v. 14 %) than women in this pattern’s lowest tertile (both P < 0·0001). We found similarly strong associations in men. There was no association between the Mixed Diet pattern and overweight or obesity.
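For intuition about how these odds ratios relate to the quoted prevalences, the crude (unadjusted) odds ratios can be recomputed directly from the tertile prevalences. This is a sketch using only the figures reported above, not the study's adjusted model, so the results are close to, but not identical with, the reported estimates:

```python
def odds(p):
    # Convert a prevalence (proportion) to odds.
    return p / (1.0 - p)

def crude_or(p_exposed, p_unexposed):
    # Unadjusted odds ratio from two group prevalences.
    return odds(p_exposed) / odds(p_unexposed)

# Overweight: 74% prevalence in the top Processed Diet tertile v. 47% in the bottom.
print(round(crude_or(0.74, 0.47), 2))  # → 3.21 (reported adjusted OR: 3.00)

# Obesity: 44% v. 14%.
print(round(crude_or(0.44, 0.14), 2))  # → 4.83 (reported adjusted OR: 4.24)
```

The gap between the crude and reported values reflects the covariate adjustment in the published model.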
Conclusions
We identified two major dietary patterns in several African populations, a Mixed Diet pattern and a Processed Diet pattern. The Processed Diet pattern was associated with obesity.
The solar magnesium II (Mg II) core-to-wing ratio has been a well-studied proxy for chromospheric activity since 1978. Daily measurements at high spectral resolution (0.1 nm) began with the launch of the Solar Radiation and Climate Experiment (SORCE) in 2003. The next generation of measurements from the Extreme Ultraviolet Sensor (EUVS) on the Geostationary Operational Environmental Satellite 16 (GOES-16) will add a high time cadence (every 30 seconds) to the observational Mg II irradiance record. We present a comparison of the two measurements during the period of overlap.
Ultraviolet (UV) solar spectral irradiance (SSI) has been measured from orbit on a regular basis since the beginning of the space age. These observations span four solar cycles, and they are crucial for our understanding of the Sun-Earth connection and space weather. SSI at these wavelengths is the main driver of upper-atmosphere processes, including the production and destruction of ozone in the stratosphere. The instruments that measure UV SSI not only require good preflight calibration, but also need a robust method to maintain that calibration on orbit. We give an overview of the catalog of current and former UV SSI measurements, along with the calibration philosophy of each instrument and an estimate of the uncertainties in the published irradiances.
This report describes the effective public health response to a measles outbreak involving a university campus in Brisbane, Australia. Eleven cases in total were notified, mostly university students. The public health response included targeted measles vaccination clinics which were established on campus and focused on student groups most likely to have been exposed. The size of the university population, social interaction between students on and off campus, as well as limited vaccination records for the university community presented challenges for the control of this extremely infectious illness. We recommend domestic students ensure vaccinations are current prior to matriculation. Immunisation information should be included in university student enrolment packs. Incoming international students should ensure routine vaccinations are up-to-date prior to arrival in Australia, thereby reducing the risk of importation of measles and other infectious diseases.
Crop phenotype is usually expressed in terms of characteristics like plant height, leaf architecture and leaf area index (LAI). In the case of maize, stalk diameter is seldom quantified because its measurement does not readily lend itself to automation. Justification for automating the measurement of stalk diameter and plant spacing is based on the finding that stalk diameter was able to account for about 65% of the variability in maize yield per plant in three irrigated field studies. A high-speed reflectance sensor and simulation apparatus was developed to explore the potential for automating maize stalk diameter assessment. The prototyped system accurately measured both stalk diameter and plant spacing in the laboratory at simulated velocities up to 12 km/h.
When engaged in conversation, both parents and children tend to re-use words that their partner has just said. This study explored whether proportions of maternal and/or child utterances that overlapped in content with what their partner had just said contributed to growth in mean length of utterance (MLU), developmental sentence score, and vocabulary diversity over time. We analyzed the New England longitudinal corpus from the CHILDES database, comprising transcripts of mother–child conversations at 14, 20, and 32 months, using the CHIP command to compute proportions of utterances with overlapping content. Rates of maternal overlap, but not child overlap, at earlier time-points predicted child language outcomes at later time-points, after controlling for earlier child MLU. We suggest that maternal overlap plays a formative role in child language development by providing content that is immediately relevant to what the child has in mind.
The Antarctic Roadmap Challenges (ARC) project identified critical requirements to deliver high priority Antarctic research in the 21st century. The ARC project addressed the challenges of enabling technologies, facilitating access, providing logistics and infrastructure, and capitalizing on international co-operation. Technological requirements include: i) innovative automated in situ observing systems, sensors and interoperable platforms (including power demands), ii) realistic and holistic numerical models, iii) enhanced remote sensing and sensors, iv) expanded sample collection and retrieval technologies, and v) greater cyber-infrastructure to process ‘big data’ collection, transmission and analyses while promoting data accessibility. These technologies must be widely available, performance and reliability must be improved and technologies used elsewhere must be applied to the Antarctic. Considerable Antarctic research is field-based, making access to vital geographical targets essential. Future research will require continent- and ocean-wide environmentally responsible access to coastal and interior Antarctica and the Southern Ocean. Year-round access is indispensable. The cost of future Antarctic science is great but there are opportunities for all to participate commensurate with national resources, expertise and interests. The scope of future Antarctic research will necessitate enhanced and inventive interdisciplinary and international collaborations. The full promise of Antarctic science will only be realized if nations act together.
Over 300 cases of acute toxoplasmosis are confirmed by reference testing in England and Wales annually. We conducted a case-control study to identify risk factors for Toxoplasma gondii infection to inform prevention strategies. Twenty-eight cases and 27 seronegative controls participated. We compared their food history and environmental exposures using logistic regression to calculate odds ratios (OR) and 95% confidence intervals in a model controlling for age and sex. Univariable analysis showed that the odds of eating beef (OR 10·7, P < 0·001), poultry (OR 6·4, P = 0·01) or lamb/mutton (OR 4·9, P = 0·01) were higher for cases than controls. After adjustment for potential confounders, a strong association between beef and infection remained (OR 5·6, P = 0·01). The small sample size was a significant limitation and larger studies are needed to fully investigate potential risk factors. The study findings emphasize the need to ensure food is thoroughly cooked and handled hygienically, especially for those in vulnerable groups.
Public agencies at all levels of government and other organizations that manage archaeological resources often face the problem of many undertakings that collectively impact large numbers of individually significant archaeological resources. Such situations arise when an agency is managing a large area, such as a national forest, land management district, park unit, wildlife refuge, or military installation. These situations also may arise in regard to large-scale development projects, such as energy developments, highways, reservoirs, transmission lines, and other major infrastructure projects that cover substantial areas. Over time, the accumulation of impacts from small-scale projects to individual archaeological resources may degrade landscape- or regional-scale cultural phenomena. Typically, these impacts are mitigated at the site level without regard to how the impacts to individual resources affect the broader population of resources. Actions to mitigate impacts rarely are designed to do more than avoid resources or ensure some level of data recovery at single sites. Such mitigation activities are incapable of addressing research questions at a landscape or regional scale.