In Daily Life of the Aztecs, Frances Berdan and Michael E. Smith offer a view into the lives of real people, doing very human things, in the unique cultural world of Aztec central Mexico. The first section focuses on people from an array of social classes - the emperor, a priest, a feather worker, a merchant, a farmer, and a slave - who interacted in the economic, social and religious realms of the Aztec world. In the second section, the authors examine four important life events where the lives of these and others intersected: the birth and naming of a child, market day, a day at court, and a battle. Through these microscopic views of individual lives, and the interweaving of those lives into the broader Aztec world, Berdan and Smith recreate everyday life in the final years of the Aztec Empire.
Older patients with complex care needs and limited personal and social resources are heavy users of emergency department (ED) services and are often admitted when they present to the ED. Updated information is needed regarding the most effective strategies to appropriately avoid ED presentation and hospital admission among older patients.
This systematic review aimed to identify interventions that have demonstrated effectiveness in decreasing ED use and hospital admissions in older patients. We conducted a comprehensive literature search within Ovid MEDLINE, EMBASE, CINAHL, and Cochrane Central Register of Controlled Trials from database inception to July 2019 with no language restrictions. Interventional studies conducted in populations aged 65 years and older were included. Primary outcomes were ED visits and hospital admissions. Secondary outcomes included hospital readmission, mortality, cost, and patient-reported outcomes.
Of 7,943 citations reviewed for eligibility, 53 studies were included in our qualitative synthesis, including 26 randomized controlled trials (RCT), 8 cluster-RCTs, and 19 controlled before-after studies. Data characterization revealed that community-based strategies reduced ED visits, particularly those that included comprehensive geriatric assessments and home visits. Studies of these strategies reported decreases in mean ED use (for interventions versus controls) ranging from -0.12 to -1.32 visits/patient. Interventions that included home visits also showed reductions in hospital admissions ranging from -6% to -14%. There was, however, considerable variability across individual studies with respect to outcome reporting, statistical analyses, and risk of bias, which limited our ability to further quantify the effect of these interventions.
Various interventional strategies to avoid ED presentations and hospital admissions for older patients have been studied. While models of care that include comprehensive geriatric assessments and home visits may reduce acute care utilization, the standardization of outcome measures is needed to further delineate which parts of these complex interventions are contributing to efficacy. The potential effects of multidisciplinary team composition on patient outcomes also warrant further investigation.
Daily use of high-potency cannabis has been reported to carry a high risk for developing a psychotic disorder. However, the evidence is mixed on whether any pattern of cannabis use is associated with a particular symptomatology in first-episode psychosis (FEP) patients.
We analysed data from 901 FEP patients and 1235 controls recruited across six countries, as part of the European Network of National Schizophrenia Networks Studying Gene-Environment Interactions (EU-GEI) study. We used item response modelling to estimate two bifactor models, which included general and specific dimensions of psychotic symptoms in patients and psychotic experiences in controls. The associations between these dimensions and cannabis use were evaluated using linear mixed-effects models analyses.
In patients, there was a linear relationship between the positive symptom dimension and the extent of lifetime exposure to cannabis, with daily users of high-potency cannabis having the highest score (B = 0.35; 95% CI 0.14–0.56). Moreover, negative symptoms were more common among patients who never used cannabis compared with those with any pattern of use (B = −0.22; 95% CI −0.37 to −0.07). In controls, psychotic experiences were associated with current use of cannabis but not with the extent of lifetime use. Neither patients nor controls presented differences in depressive dimension related to cannabis use.
Our findings provide the first large-scale evidence that FEP patients with a history of daily use of high-potency cannabis present with more positive and less negative symptoms, compared with those who never used cannabis or used low-potency types.
As part of the ongoing effort to improve the Northern Hemisphere radiocarbon (14C) calibration curve, this study investigates the period of 856 BC to 626 BC (2805–2575 yr BP) with a total of 403 single-year 14C measurements. In this age range, IntCal13 was constructed largely from German and Irish oak as well as Californian bristlecone pine 14C dates, with most samples measured at a 10-yr resolution. The new data presented here constitute the first atmospheric 14C single-year record of the older end of the Hallstatt plateau based on an absolutely dated tree-ring chronology. The data helped reveal a major solar proton event (SPE) that caused a spike in the production rate of cosmogenic radionuclides around 2610/2609 BP. This production event is thought to have reached a magnitude similar to the 774/775 AD production event but had remained undetected due to averaging effects in the decadal calibration data. The record leading up to the 2610/2609 BP event reveals an 11-yr solar cycle with varying cyclicity. Features of the new data and the benefits of higher-resolution calibration are discussed.
Food insecurity has been suggested to be a risk factor for depression, stress and anxiety. We therefore undertook a systematic review and meta-analysis of available publications to examine these associations further.
Relevant studies were identified by searching Web of Science, Embase, Scopus and PubMed databases up to January 2019.
OR was pooled using a random-effects model. Standard methods were used for assessment of heterogeneity and publication bias.
Data were available from nineteen studies with 372 143 individual participants from ten different countries that were pooled for the meta-analysis.
The results showed there was a positive relationship between food insecurity (FI) and risk of depression (OR = 1·40; 95 % CI: 1·30, 1·58) and stress (OR = 1·34; 95 % CI: 1·24, 1·44) but not anxiety. Subgroup analysis by age showed that subjects aged ≥65 years exhibited a higher risk of depression (OR = 1·75; 95 % CI: 1·20, 2·56) than younger participants (OR = 1·34; 95 % CI: 1·20, 1·50), and subgroup analysis by sex showed a greater risk of depression in men (OR = 1·42; 95 % CI: 1·17, 1·72) than in women (OR = 1·30; 95 % CI: 1·16, 1·46). Finally, subgroup analysis according to geographical location illustrated that food-insecure households living in North America had the highest risk of stress and anxiety.
The evidence from this meta-analysis suggests that FI has a significant effect on the likelihood of being stressed or depressed. This indicates that health care services, which alleviate FI, would also promote holistic well-being in adults.
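The pooled odds ratios above come from a random-effects model. As an illustration only (using hypothetical study-level values, not the review's data), a minimal DerSimonian-Laird pooling sketch in Python:

```python
import math

def pool_random_effects(ors, ci_los, ci_his):
    """DerSimonian-Laird random-effects pooling of odds ratios.

    Each study contributes log(OR), with its standard error recovered
    from the width of its reported 95% CI.
    """
    y = [math.log(o) for o in ors]                         # log odds ratios
    se = [(math.log(h) - math.log(l)) / (2 * 1.96)         # SE from CI width
          for l, h in zip(ci_los, ci_his)]
    w = [1 / s**2 for s in se]                             # fixed-effect weights
    y_fe = sum(wi * yi for wi, yi in zip(w, y)) / sum(w)
    q = sum(wi * (yi - y_fe)**2 for wi, yi in zip(w, y))   # Cochran's Q
    c = sum(w) - sum(wi**2 for wi in w) / sum(w)
    tau2 = max(0.0, (q - (len(y) - 1)) / c)                # between-study variance
    w_re = [1 / (s**2 + tau2) for s in se]                 # random-effects weights
    y_re = sum(wi * yi for wi, yi in zip(w_re, y)) / sum(w_re)
    se_re = math.sqrt(1 / sum(w_re))
    return (math.exp(y_re),
            math.exp(y_re - 1.96 * se_re),
            math.exp(y_re + 1.96 * se_re))

# Hypothetical study-level ORs (for illustration, not from the review)
or_pooled, lo, hi = pool_random_effects(
    ors=[1.3, 1.5, 1.2], ci_los=[1.1, 1.2, 0.9], ci_his=[1.6, 1.9, 1.6])
```

When between-study heterogeneity (Q) is below its degrees of freedom, tau-squared is truncated at zero and the estimate coincides with the fixed-effect result.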
Our knowledge and understanding of the structure and function of complex host-associated communities has grown exponentially in the last decade through improvements in sequencing technologies. Despite this, there are still many outstanding research questions, which will undoubtedly lead to many more. Concerted effort is required to elucidate the composition and function of taxonomic groups other than bacteria that constitute host microbiomes, and to extend our current cataloguing efforts to non-model and field-based host organisms. Further to this, we need to continue to move beyond the 'who?' question provided by relatively cheap amplicon sequencing to gain a better understanding of 'what?' the microbiome is doing, using metatranscriptomics approaches. Critically, we need to understand how members of the microbiome interact to confer function. Given the current unprecedented environmental change, microbiome plasticity may prove vital to host resilience and fitness. Furthermore, there is considerable potential for microbial biotechnology to improve numerous aspects of humanity, although care must be taken to ensure environmental and social justice prevail.
A classic example of microbiome function is its role in nutrient assimilation in both plants and animals, but other less obvious roles are becoming more apparent, particularly in terms of driving infectious and non-infectious disease outcomes and influencing host behaviour. However, numerous biotic and abiotic factors influence the composition of these communities, and host microbiomes can be susceptible to environmental change. How microbial communities will be altered by, and mitigate, the rapid environmental change we can expect in the next few decades remains to be seen. That said, given the enormous range of functional diversity conferred by microbes, there is currently something of a revolution in microbial bioengineering and biotechnology in order to address real-world problems including human and wildlife disease and crop and biofuel production. All of these concepts are explored in further detail throughout the book.
Through a long history of co-evolution, multicellular organisms form a complex of host cells plus many associated microorganism species. Consisting of algae, bacteria, archaea, fungi, protists and viruses, and collectively referred to as the microbiome, these microorganisms contribute to a range of important functions in their hosts, from nutrition, to behaviour and disease susceptibility. In this book, a diverse and international group of active researchers outline how multicellular organisms have become reliant on their microbiomes to function, and explore this vital interdependence across the breadth of soil, plant, animal and human hosts. They draw parallels and contrasts across hosts in different environments, and discuss how this invisible microbial ecosystem influences everything from the food we eat, to our health, to the correct functioning of ecosystems we depend on. This insightful read also pertinently encourages students and researchers in microbial ecology, ecology, and microbiology to consider how this interdependence may be key to mitigating environmental changes and developing microbial biotechnology to improve life on Earth.
Low-carbohydrate diets (LCD) have been promoted for weight control and type 2 diabetes (T2D) management, based on an emerging body of evidence, including meta-analyses with an indication of publication bias. Proposed definitions vary between 50 and 130 g/d, or <10 and <40 % of energy from carbohydrate, with no consensus on LCD compositional criteria. LCD are usually followed with limited consideration for other macronutrients in the overall diet composition, introducing variance in the constituent foods and in metabolic responses. For weight management, extensive evidence supports LCD as a valid weight loss treatment, up to 1–2 years. Solely lowering carbohydrate intake does not, in the medium/long term, reduce HbA1c for T2D prevention or treatment, as many mechanisms interplay. Under controlled feeding conditions, LCD are not physiologically or clinically superior to diets with higher carbohydrates for weight loss, fat loss, energy expenditure or glycaemic outcomes; indeed, all metabolic improvements require weight loss. Long-term evidence also links the LCD pattern to increased CVD risks and mortality. LCD can lead to micronutrient deficiencies and increased LDL-cholesterol, depending on food selection to replace carbohydrates. Evidence is limited but promising regarding food choices/sources to replace high-carbohydrate foods that may alleviate the negative effects of LCD, demanding further insight into the dietary practice of medium- to long-term LCD followers. Long-term, high-quality studies of LCD with different food sources (animal and/or plant origins) are needed, aiming for clinical endpoints (T2D incidence and remission, cardiovascular events, mortality). Ensuring micronutrient adequacy by food selection or supplementation should be considered for people who wish to pursue long-term LCD.
Soccer is the most popular sport worldwide and is the only sport where athletes purposely use their head to deflect the ball during play, termed “heading” the ball. These repetitive head impacts (RHI) are associated with worse neuropsychological function; however, factors that can increase risk of injury following exposure to such head impacts remain largely unexamined. The present study provided a novel examination of the modifying role of sleep on the relationship between RHI exposure and neuropsychological function in college-age soccer players.
Fifty varsity and intramural college soccer players completed questionnaires assessing recent and long-term heading exposure, a self-report measure of sleep function, and a battery of neuropsychological tests.
A high level of recent heading exposure was significantly associated with poorer processing speed, independent of concussion history. With reduced sleep duration, a high level of recent heading exposure was related to worse sustained attention. However, with greater hours of sleep duration, heading exposure was related to preserved neuropsychological outcome in sustained attention.
We replicated our earlier finding of an association between recent head impact exposure and worse processing speed in an independent sample. In addition, we found that sleep may serve as a risk or protective factor for soccer players following extensive exposure to head impacts. Ultimately, this study furthers the understanding of factors impacting neuropsychological function in soccer players and provides empirical support for sleep interventions to help ensure safer soccer play and recovery from injury.
To evaluate the National Healthcare Safety Network (NHSN) hospital-onset Clostridioides difficile infection (HO-CDI) standardized infection ratio (SIR) risk adjustment for general acute-care hospitals with large numbers of intensive care unit (ICU), oncology unit, and hematopoietic cell transplant (HCT) patients.
Retrospective cohort study.
Eight tertiary-care referral general hospitals in California.
We used FY 2016 data and the published 2015 rebaseline NHSN HO-CDI SIR. We compared facility-wide inpatient HO-CDI events and SIRs, with and without ICU data, oncology and/or HCT unit data, and ICU bed adjustment.
For these hospitals, the median unmodified HO-CDI SIR was 1.24 (interquartile range [IQR], 1.15–1.34); 7 hospitals qualified for the highest ICU bed adjustment; 1 hospital received the second highest ICU bed adjustment; and all had oncology-HCT units with no additional adjustment per the NHSN. Removal of ICU data and the ICU bed adjustment decreased HO-CDI events (median, −25%; IQR, −20% to −29%) but increased the SIR at all hospitals (median, 104%; IQR, 90%–105%). Removal of oncology-HCT unit data decreased HO-CDI events (median, −15%; IQR, −14% to −21%) and decreased the SIR at all hospitals (median, −8%; IQR, −4% to −11%).
For tertiary-care referral hospitals with specialized ICUs and a large number of ICU beds, the ICU bed adjustor functions as a global adjustment in the SIR calculation, accounting for the increased complexity of patients in ICUs and non-ICUs at these facilities. However, the SIR decrease with removal of oncology and HCT unit data, even with the ICU bed adjustment, suggests that an additional adjustment should be considered for oncology and HCT units within general hospitals, perhaps similar to what is done for ICU beds in the current SIR.
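The SIR is the ratio of observed events to risk-adjusted predicted events, so removing a unit's data can lower the event count yet raise the ratio whenever the predicted events fall faster than the observed ones. A sketch with hypothetical numbers (not the study's):

```python
def sir(observed, predicted):
    """Standardized infection ratio: observed events / risk-adjusted predicted events."""
    return observed / predicted

# Hypothetical facility: 100 observed HO-CDI events against 80 predicted,
# where the ICU (with its bed adjustment) contributes 25 observed events
# but 45 of the predicted events.
full = sir(100, 80)                   # SIR = 1.25
without_icu = sir(100 - 25, 80 - 45)  # 75 / 35, a higher SIR

# Events fell 25%, but the SIR rose sharply, mirroring the direction of
# the pattern reported above when ICU data and the bed adjustment are removed.
```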
It is unclear what session frequency is most effective in cognitive–behavioural therapy (CBT) and interpersonal psychotherapy (IPT) for depression.
To compare the effects of once weekly and twice weekly sessions of CBT and IPT for depression.
We conducted a multicentre randomised trial from November 2014 through December 2017. We recruited 200 adults with depression across nine specialised mental health centres in the Netherlands. This study used a 2 × 2 factorial design, randomising patients to once or twice weekly sessions of CBT or IPT over 16–24 weeks, up to a maximum of 20 sessions. Main outcome measures were depression severity, measured with the Beck Depression Inventory-II at baseline, before session 1, and 2 weeks, 1, 2, 3, 4, 5 and 6 months after start of the intervention. Intention-to-treat analyses were conducted.
Compared with patients who received weekly sessions, patients who received twice weekly sessions showed a statistically significant decrease in depressive symptoms (estimated mean difference between weekly and twice weekly sessions at month 6: 3.85 points, difference in effect size d = 0.55), lower attrition rates (n = 16 compared with n = 32) and an increased rate of response (hazard ratio 1.48, 95% CI 1.00–2.18).
In clinical practice settings, delivery of twice weekly sessions of CBT and IPT for depression is a way to improve depression treatment outcomes.
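As a quick arithmetic check, assuming the reported effect size is the standardized mean difference (d = difference / pooled SD), the figures above imply a pooled BDI-II standard deviation of about 7 points:

```python
# Values from the trial report above
mean_diff = 3.85   # BDI-II mean difference at month 6 (twice weekly vs weekly)
d = 0.55           # reported effect size

# Implied pooled SD, assuming d = mean_diff / SD
implied_sd = mean_diff / d
```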