As the novel coronavirus disease 2019 changed patterns of patient presentation, this study aimed to prospectively identify these changes in a single ENT centre.
Design
A seven-week prospective case series was conducted of patients urgently referred from primary care and the accident and emergency department.
Results
There were 133 referrals in total. Referral rates fell by 93 per cent over seven weeks, from a mean of 5.4 to 0.4 per day. Reductions were seen in referrals from both primary care (89 per cent) and the accident and emergency department (93 per cent). Presentations of otitis externa and epistaxis fell by 83 per cent, and presentations of glandular fever, tonsillitis and peritonsillar abscess fell by 67 per cent.
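The reported percentage reductions follow from simple arithmetic on the mean daily referral rates; a minimal sketch of that calculation, using the figures quoted above:

```python
# Percentage reduction in the ENT referral rate over the seven-week period,
# computed from the mean daily referral rates reported in the abstract.
def percent_reduction(before: float, after: float) -> float:
    """Return the percentage fall from `before` to `after`."""
    return (before - after) / before * 100

# Mean referrals per day fell from 5.4 to 0.4 (all sources combined).
overall = percent_reduction(5.4, 0.4)
print(f"Overall reduction: {overall:.0f}%")  # → Overall reduction: 93%
```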
Conclusion
Coronavirus disease 2019 has greatly reduced the number of referrals into secondary care ENT. The cause of this reduction is likely to be patients' increased perception of the risk of contracting the virus in a medical setting. The impact of this reduction is yet to be ascertained, but it will likely result in a substantial increase in emergency pressures once the lockdown is lifted and the general public's perception of the coronavirus disease 2019 risk reduces.
Lewy body dementia, consisting of both dementia with Lewy bodies (DLB) and Parkinson's disease dementia (PDD), is considerably under-recognised clinically compared with its frequency in autopsy series.
Aims
This study investigated the clinical diagnostic pathways of patients with Lewy body dementia to assess if difficulties in diagnosis may be contributing to these differences.
Method
We reviewed the medical notes of 74 people with DLB and 72 with non-DLB dementia matched for age, gender and cognitive performance, together with 38 people with PDD and 35 with Parkinson's disease, matched for age and gender, from two geographically distinct UK regions.
Results
The cases of individuals with DLB took longer to reach a final diagnosis (1.2 v. 0.6 years, P = 0.017), underwent more scans (1.7 v. 1.2, P = 0.002) and had more alternative prior diagnoses (0.8 v. 0.4, P = 0.002) than the cases of those with non-DLB dementia. Individuals diagnosed in one region of the UK had significantly more core features (2.1 v. 1.5, P = 0.007) than those in the other region, and were less likely to have dopamine transporter imaging (P < 0.001). For patients with PDD, more than 1.4 years prior to receiving a dementia diagnosis: 46% (12 of 26) had documented impaired activities of daily living because of cognitive impairment, 57% (16 of 28) had cognitive impairment in multiple domains, with 38% (6 of 16) having both, and 39% (9 of 23) were already receiving anti-dementia drugs.
Conclusions
Our results show that the pathway to diagnosis of DLB is longer and more complex than that for non-DLB dementia. There were also marked differences between regions in the thresholds clinicians adopt for diagnosing DLB and in the use of dopamine transporter imaging. For PDD, a diagnosis of dementia was delayed well beyond symptom onset and even treatment.
We agree that the emergence of cumulative technological culture was tied to nonsocial cognitive skills, namely, technical-reasoning skills, which allowed humans to constantly acquire and improve information. Our concern is with a reading of the history of cumulative technological culture that is based largely on modern experiments in simulated settings and less on phenomena crucial to the long-term dynamics of cultural evolution.
The early Ordovician (∼485 Ma) Power Steps Formation, Newfoundland, Canada, exposes a well-preserved mudstone-dominated clinothem that serves as an excellent archive for understanding how mud was produced, transported and converted into mudstone prior to the evolution of globally widespread, deep soil horizons. Sedimentological analysis of four sandstone and five mudstone facies along the Ochre Cove clinothem reveals that mud and sand were delivered by unidirectional currents and experienced episodic reworking by storm waves. Petrographic examination and X-ray diffraction of the described mudstone facies reveal significant variability in the distribution of illite versus chlorite between the lower and upper parts of the Ochre Cove clinothem. This research highlights that, in the present-day clay mineral fraction, illite is often detrital whereas chlorite originated via the alteration of silt-sized, highly unstable, mafic (volcaniclastic?) grains. Throughout all sedimentologic facies, albeit in different proportions, these mafic lithic grains were diagenetically altered via in situ weathering before significant compaction occurred, resulting in the precipitation of significant volumes of pore-bridging, silica- and iron-rich chlorite cement. Compositional, diagenetic and textural attributes across the Ochre Cove mud clinothem vary as a function of starting composition, hydrodynamic sorting and grain density. Given that a significant proportion of clay minerals was generated via in situ transformation of a mafic, unstable precursor assemblage, we recommend that future studies incorporate detailed petrographic description along with X-ray diffraction analyses when aiming to employ trends in whole-rock clay mineral data as a proxy in provenance and palaeoclimate studies of very old (pre-Devonian) mudstones and sandstones.
Soldiers' operational performance is determined by their fitness, nutritional status, quality of rest/recovery, and remaining injury/illness free. Understanding large fluctuations in nutritional status during operations is critical to safeguarding health and well-being. There are limited data worldwide describing the effect of extreme climate change on nutrient profiles. This study investigated the effect of hot-dry deployments on the vitamin D status (assessed from 25-hydroxyvitamin D (25(OH)D) concentration) of young, male, military volunteers. Two data sets are presented (pilot study, n 37; main study, n 98), examining serum 25(OH)D concentrations before and during 6-month summer operational deployments to Afghanistan (March to October/November). Body mass, percentage of body fat, dietary intake and serum 25(OH)D concentrations were measured. In addition, parathyroid hormone (PTH), adjusted Ca and albumin concentrations were measured in the main study to better understand 25(OH)D fluctuations. Body mass and fat mass (FM) losses were greater for early (pre- to mid-) deployment compared with late (mid- to post-) deployment (P<0·05). Dietary intake was well-maintained despite high rates of energy expenditure. A pronounced increase in 25(OH)D was observed between pre- (March) and mid-deployment (June) (pilot study: 51 (sd 20) v. 212 (sd 85) nmol/l, P<0·05; main study: 55 (sd 22) v. 167 (sd 71) nmol/l, P<0·05) and remained elevated post-deployment (October/November). In contrast, PTH was highest pre-deployment, decreasing thereafter (main study: 4·45 (sd 2·20) v. 3·79 (sd 1·50) pmol/l, P<0·05). The typical seasonal cycling of vitamin D appeared exaggerated in this active male population undertaking an arduous summer deployment. Further research is warranted to establish whether such large seasonal vitamin D fluctuations may be detrimental to bone health in the longer term.
Late prehistoric archaeological research in Myanmar is in a phase of rapid expansion. Recent work by the Mission Archéologique Française au Myanmar aims to establish a reliable Neolithic to Iron Age culture-historical sequence, which can then be compared to surrounding regions of Southeast Asia. Excavations at Nyaung'gan and Oakaie in central Myanmar have provided 52 new AMS dates, which allow the creation of Myanmar's first reliable prehistoric radiometric chronology. They have also identified the Neolithic to Bronze Age transition in central Myanmar, which is of critical importance in understanding long-range interactions at the national, regional and inter-regional level. This research provides the first significant step towards placing late prehistoric Myanmar in its global context.
Precision farming advances are providing opportunities in both production agriculture and agricultural research. For growers and agronomists, the benefits of identifying where crops are stressed, the location of weeds and estimating yields on a large scale are clear. Researchers, who have different needs, can benefit from a detailed focus on a specific characteristic, such as one disease (e.g. yellow rust). This paper will review how recent advances in technology are beginning to allow the development of specialised tools within research and agriculture and how current precision agriculture tools can be effective at measuring desirable traits.
The insurance hypothesis is a reasonable explanation for the current obesity epidemic. One alternative explanation is that the marketing of high-sugar foods, especially sugar-sweetened beverages, drives the rise in obesity. Another prominent hypothesis is that obesity spreads through social influence. We offer a framework for estimating the extent to which these different models explain the rise in obesity.
The objective of this paper was to compare demographics, employment variables, satisfaction, and motivation for entering the field of Emergency Medical Services (EMS) between members of under-represented races/ethnicities and members of the majority group.
Methods
A cohort of nationally certified EMS professionals was followed for 10 years through annual surveys; however, race/ethnicity was only available for 9 years (2000-2008). Descriptive statistics and 95% confidence intervals (CIs) were calculated and significance was determined by lack of CI overlap.
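The significance criterion described above (non-overlapping 95% confidence intervals) can be sketched as follows. The counts are hypothetical, not taken from the study, and a normal-approximation (Wald) interval for a proportion is assumed:

```python
import math

def proportion_ci(successes: int, n: int, z: float = 1.96):
    """Wald 95% confidence interval for a proportion (normal approximation)."""
    p = successes / n
    half_width = z * math.sqrt(p * (1 - p) / n)
    return p - half_width, p + half_width

def cis_overlap(ci_a, ci_b) -> bool:
    """True if the two intervals share any common values."""
    return ci_a[0] <= ci_b[1] and ci_b[0] <= ci_a[1]

# Hypothetical counts (not from the study): proportion certified at a given
# level in two respondent groups of 500 each.
ci_group1 = proportion_ci(300, 500)   # ~60%
ci_group2 = proportion_ci(200, 500)   # ~40%
significant = not cis_overlap(ci_group1, ci_group2)
print(significant)  # → True (non-overlapping CIs are treated as significant)
```

Note that CI-overlap testing is conservative: intervals can overlap even when a direct two-proportion test would reject the null hypothesis.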
Results
From 2000 through 2008, the range of proportions of nationally certified EMS professionals by race/ethnicity was as follows: whites: 83.5%-86.0%, Hispanics: 4.2%-5.9%, and African-Americans: 2.5%-4.6%. There were no significant changes in the proportion of minority EMS professionals over the study period. Hispanics and African-Americans combined increased slightly from 6.7% of the population in 2000 to 9.9% in 2008. Likewise, the proportion of all under-represented races/ethnicities increased slightly from 2000 (14.0%) to 2008 (16.5%). Females were under-represented in all years. Nationally certified African-Americans were significantly more likely to be certified at the Emergency Medical Technician (EMT)-Basic level (compared with the EMT-Paramedic level) than whites in all but one survey year. The proportion of Hispanics registered at the EMT-Basic level was significantly higher than that of whites in three survey years. Accordingly, a larger proportion of whites than of African-Americans and Hispanics were nationally registered at the EMT-Paramedic level. A significantly larger proportion of African-Americans than of whites reported working in urban communities (population >25,000) for nine of the 10 survey years. Similarly, a significantly larger proportion of Hispanics than of whites worked in urban communities in 2002 and from 2005 to 2008. For satisfaction measures, there were no consistent differences between races/ethnicities. Among factors for entering EMS, the proportion of whites who reported having a friend or family member in the field was significantly higher than that of African-Americans in all years, and significantly higher than that of Hispanics in four of the nine years.
Conclusion
The ethnic/racial diversity of the population of nationally certified EMS professionals is not representative of the population served and has not improved over the 2000-2008 period. Similar to other health care professions, Hispanics and African-Americans are under-represented in EMS compared with the US population. This study serves as a baseline to examine under-represented populations in EMS.
Crowe RP, Levine R, Eggerichs JJ, Bentley MA. A Longitudinal Description of Emergency Medical Services Professionals by Race/Ethnicity. Prehosp Disaster Med. 2016;31(Suppl. 1):s30–s69.
Sustainability, culture change, inequality and global health are among the much-discussed challenges of our time, and rightly so, given the drastic effects such variables can have on modern populations. Yet with many populations today living in tightly connected geographic communities—cities, for example—or in highly networked electronic communities, can we still learn anything about societal challenges by studying simple farming communities from many thousands of years ago? We think there is much to learn, be it Malthusian pressures and ancient societal collapse, the devastating effects of European diseases on indigenous New World populations or endemic violence in pre-state societies (e.g. Pinker 2012). By affording a simpler, ‘slow motion’ view of processes that are greatly accelerated in this century, the detailed, long-term record of the European Neolithic can offer insight into many of these fundamental issues. These include: human adaptations to environmental change (Palmer & Smith 2014), agro-pastoral innovation, human population dynamics, biological and cultural development, hereditary inequality, specialised occupations and private ownership.
A history of self-injurious thoughts and behaviors (SITBs) is consistently cited as one of the strongest predictors of future suicidal behavior. However, stark discrepancies in the literature raise questions about the true magnitude of these associations. The objective of this study is to examine the magnitude and clinical utility of the associations between SITBs and subsequent suicide ideation, attempts, and death.
Method
We searched PubMed, PsycInfo, and Google Scholar for papers published through December 2014. Inclusion required that studies include at least one longitudinal analysis predicting suicide ideation, attempts, or death using any SITB variable. We identified 2179 longitudinal studies; 172 met inclusion criteria.
Results
The most common outcome was suicide attempt (47.80%), followed by death (40.50%) and ideation (11.60%). Median follow-up was 52 months (mean = 82.52, s.d. = 102.29). Overall prediction was weak, with weighted mean odds ratios (ORs) of 2.07 [95% confidence interval (CI) 1.76–2.43] for ideation, 2.14 (95% CI 2.00–2.30) for attempts, and 1.54 (95% CI 1.39–1.71) for death. Adjusting for publication bias further reduced estimates. Diagnostic accuracy analyses indicated acceptable specificity (86–87%) and poor sensitivity (10–26%), with areas under the curve marginally above chance (0.60–0.62). Most risk factors generated OR estimates of <2.0 and no risk factor exceeded 4.5. Effects were consistent regardless of sample severity, sample age groups, or follow-up length.
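A minimal sketch of how the reported diagnostic-accuracy metrics derive from a 2 × 2 exposure-by-outcome table. The counts below are hypothetical, chosen only to mirror the reported pattern (high specificity, low sensitivity, OR near 2), not drawn from the meta-analysis:

```python
def two_by_two_stats(tp: int, fp: int, fn: int, tn: int):
    """Sensitivity, specificity and odds ratio from a 2x2 table of
    (prior SITB exposure) x (subsequent suicidal outcome)."""
    sensitivity = tp / (tp + fn)   # exposed among those with the outcome
    specificity = tn / (tn + fp)   # unexposed among those without it
    odds_ratio = (tp * tn) / (fp * fn)
    return sensitivity, specificity, odds_ratio

# Hypothetical counts for 1,000 individuals.
sens, spec, or_ = two_by_two_stats(tp=25, fp=120, fn=75, tn=780)
print(f"sensitivity={sens:.2f} specificity={spec:.2f} OR={or_:.2f}")
# → sensitivity=0.25 specificity=0.87 OR=2.17
```

As the example shows, an odds ratio near 2 is compatible with a test that misses three-quarters of true cases, which is why the abstract distinguishes statistical association from clinical utility.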
Conclusions
Prior SITBs confer risk for later suicidal thoughts and behaviors. However, they only provide a marginal improvement in diagnostic accuracy above chance. Addressing gaps in study design, assessment, and underlying mechanisms may prove useful in improving prediction and prevention of suicidal thoughts and behaviors.
To assess how breast-feeding and dietary diversity relate to infant length-for-age Z-score (LAZ) and weight-for-age Z-score (WAZ).
Design
Breast-feeding, dietary and anthropometric data from the Cebu Longitudinal Health and Nutrition Survey were analysed using sex-stratified fixed-effects longitudinal regression models. A dietary diversity score (DDS) based on seven food groups was classified as low (<4) or high (≥4). The complementary feeding patterns were: (i) non-breast-fed with low DDS (referent); (ii) breast-fed with low DDS; (iii) non-breast-fed with high DDS; and (iv) breast-fed with high DDS (optimal). Interactions between age, energy intake and complementary feeding patterns were included.
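The four-category complementary feeding classification can be sketched as below. The seven food-group names are assumptions (the abstract does not enumerate them), loosely following the WHO infant and young child feeding groups:

```python
# Sketch of the dietary diversity classification described above.
# The seven food groups are assumed, not taken from the study.
FOOD_GROUPS = [
    "grains_roots_tubers", "legumes_nuts", "dairy", "flesh_foods",
    "eggs", "vitamin_a_fruits_veg", "other_fruits_veg",
]

def feeding_pattern(breastfed: bool, groups_consumed: set) -> str:
    """Assign one of the four complementary feeding patterns."""
    dds = sum(1 for g in FOOD_GROUPS if g in groups_consumed)
    high_dds = dds >= 4  # DDS classified as high at four or more groups
    if breastfed:
        return "breast-fed, high DDS (optimal)" if high_dds else "breast-fed, low DDS"
    return "non-breast-fed, high DDS" if high_dds else "non-breast-fed, low DDS (referent)"

print(feeding_pattern(True, {"dairy", "eggs", "grains_roots_tubers", "flesh_foods"}))
# → breast-fed, high DDS (optimal)
```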
Setting
Philippines.
Subjects
Infants (n 2822) measured bimonthly from 6 to 24 months.
Results
Breast-feeding (regardless of DDS) was significantly associated with higher LAZ (until 24 months) and WAZ (until 20 months). For example, at 6 months, breast-fed boys with low DDS were 0·246 (95 % CI 0·191, 0·302) sd longer and 0·523 (95 % CI 0·451, 0·594) sd heavier than the referent group. There was no significant difference in size between breast-fed infants with high v. low DDS. Similarly, high DDS conferred no advantage in LAZ or WAZ among non-breast-fed infants. There were modest correlations between the 7-point DDS and nutrient intakes but these correlations were substantially attenuated after energy adjustment. We elucidated several interactions between sex, age, energy intake and complementary feeding patterns.
Conclusions
These results demonstrate the importance of prolonged breast-feeding up to 24 months. The DDS provided qualitative information on infant diets but did not confer a significant advantage in LAZ or WAZ.