High-quality diets have been found to be beneficial in preventing long-term weight gain. However, concurrent changes in diet quality and body weight over time have rarely been reported. We examined the association between 10-year changes in diet quality and body weight in the Multiethnic Cohort Study. Analyses included 53 977 African Americans, Native Hawaiians, Japanese Americans, Latinos and Whites, who completed both baseline (1993–1996, 45–69 years) and 10-year follow-up (2003–2008) surveys including a FFQ and had no history of heart disease or cancer. Using multivariable regression, weight changes were regressed on changes in four diet quality indexes: the Healthy Eating Index-2015, the Alternative Healthy Eating Index-2010, the alternate Mediterranean Diet score and the Dietary Approaches to Stop Hypertension score. Mean weight change over 10 years was 1·2 (sd 6·8) kg in men and 1·5 (sd 7·2) kg in women. Compared with stable diet quality (< 0·5 sd change), the greatest increase (≥ 1 sd increase) in the diet scores was associated with less weight gain (by 0·55–1·17 kg in men and 0·62–1·31 kg in women). Smaller weight gain with improvement in diet quality was found in most subgroups defined by race/ethnicity, baseline age and baseline BMI; the inverse association was stronger in younger and higher-BMI groups. Ten-year improvement in diet quality was associated with smaller weight gain, which varied by race/ethnicity and by baseline age and BMI. Our findings suggest that maintaining a high-quality diet and improving diet quality over time may prevent excessive weight gain.
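As a hedged illustration of the analytic step this abstract describes, the sketch below regresses 10-year weight change on a categorised change in one diet score. It is not the authors' code: the file name, column names and covariates (hei2015_change, baseline_age, and so on) are assumptions, and the grouping is simplified to the two categories quoted above plus a catch-all.

```python
# Minimal sketch of the regression described above, not the authors' code.
# All file and column names are hypothetical.
import pandas as pd
import statsmodels.formula.api as smf

df = pd.read_csv("cohort.csv")  # one row per participant (hypothetical file)

# Simplified grouping of 10-year change in one diet score (HEI-2015 assumed):
# stable = |change| < 0.5 SD (reference); greatest increase = change >= 1 SD.
sd = df["hei2015_change"].std()
def group(x):
    if abs(x) < 0.5 * sd:
        return "stable"
    return "ge_1sd_increase" if x >= sd else "other"
df["hei_grp"] = df["hei2015_change"].apply(group)

# Multivariable linear regression of weight change on the change group,
# adjusted for assumed covariates.
fit = smf.ols(
    "weight_change_kg ~ C(hei_grp, Treatment('stable'))"
    " + baseline_age + baseline_bmi + C(sex) + C(race_ethnicity)",
    data=df,
).fit()
print(fit.params)  # a negative ge_1sd_increase coefficient means less weight gain
```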
Cognitive impairment plays a key role in determining the course of illness and functional outcomes in mood disorders. This article summarises and discusses important papers within this thematic series of BJPsych Open that contribute to a greater understanding of the complexity of ‘Cognition in Mood Disorders’.
Influenza vaccine effectiveness (VE) wanes over the course of a temperate-climate winter season, but few data are available from tropical countries with year-round influenza virus activity. In Singapore, a retrospective cohort study of adults vaccinated from 2013 to 2017 was conducted. Influenza vaccine failure was defined as hospital admission with polymerase chain reaction-confirmed influenza infection 2–49 weeks after vaccination. Relative VE was calculated by splitting the follow-up period into 8-week episodes (Lexis expansion) and comparing the odds of influenza infection in the first 8-week period after vaccination (weeks 2–9) with the odds in subsequent 8-week periods, using multivariable logistic regression adjusted for patient factors and influenza virus activity. Records of 19 298 influenza vaccinations were analysed with 617 (3.2%) influenza infections. Relative VE was stable for the first 26 weeks post-vaccination, but then declined for all three influenza types/subtypes to 69% at weeks 42–49 (95% confidence interval (CI) 52–92%, P = 0.011). VE declined fastest in older adults, in individuals with chronic pulmonary disease and in those who had been previously vaccinated within the last 2 years. Vaccine failure was significantly associated with a change in recommended vaccine strains between the vaccination and observation periods (adjusted odds ratio 1.26, 95% CI 1.06–1.50, P = 0.010).
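A hedged sketch of the Lexis-style episode splitting and logistic model described above follows. It assumes a simple one-row-per-vaccination table; the column names, the censoring rule and the adjustment set are illustrative, not the study's actual code.

```python
# Illustrative Lexis expansion: split weeks 2-49 of follow-up into 8-week
# episodes and model the odds of PCR-confirmed influenza admission per episode.
# Data layout and column names are assumptions.
import pandas as pd
import statsmodels.formula.api as smf

vax = pd.read_csv("vaccinations.csv")  # hypothetical: one row per vaccination

rows = []
for _, r in vax.iterrows():
    for start in range(2, 50, 8):                 # weeks 2-9, 10-17, ..., 42-49
        end = start + 7
        event = pd.notna(r["infection_week"]) and start <= r["infection_week"] <= end
        rows.append({"episode": f"wk{start:02d}",  # episode labelled by start week
                     "infected": int(event),
                     "age": r["age"],
                     "flu_activity": r["flu_activity"]})  # background activity (assumed)
        if event:
            break                                  # censor follow-up after infection
lexis = pd.DataFrame(rows)

# Odds of infection in later episodes relative to weeks 2-9 (reference),
# adjusted for patient factors and influenza activity (simplified here).
fit = smf.logit("infected ~ C(episode, Treatment('wk02')) + age + flu_activity",
                data=lexis).fit()
print(fit.summary())
```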
Patients with pregnancy-associated secondary brain tumors (PASBT) are challenging to manage. Because no guidelines for the management of such patients currently exist, we performed a systematic review of the literature using PRISMA guidelines with a discussion of management from a neurosurgeon’s perspective.
A systematic review of the literature published between 1999 and 2018 was performed following PRISMA guidelines.
We identified 301 studies, of which 16 publications (22 patients reporting 25 pregnancies, 20 deliveries and 5 early terminations) were suitable for final analysis. The most frequent primary cancers were breast (8/22, 36.36%), skin (6/22, 27.27%), and lung (5/22, 22.73%). Four patients (18.18%) had neurosurgical procedures during their pregnancies. Five patients (22.73%) underwent neurosurgical resection after their pregnancies. Nine patients (40.91%) received radiation therapy and seven patients (31.82%) received chemotherapy during pregnancy, while seven patients (31.82%) received chemotherapy and radiation after pregnancy. Of the 20 deliveries, there was 1 fetal death (5%). Five pregnancies (20%) were terminated in the first trimester because of a need for urgent neurosurgical intervention.
Management of PASBT remains a challenging issue. Maternal and fetal risks associated with surgical resection, and teratogenicity due to adjuvant therapy, should be discussed in the context of a multidisciplinary team. The timing of surgery and the use of systemic chemoradiation depend on the gestational age (GA) of the fetus and on the extent and control of the mother’s primary and metastatic disease. Guidelines need to be established to help neuro-oncology teams safely and effectively manage this group of patients.
The potential effectiveness of harvest weed seed control (HWSC) systems depends upon seed shatter of the target weed species at crop maturity, which enables seed collection and processing at crop harvest. However, seed retention is likely influenced by agroecological and environmental factors. In 2016 and 2017, we assessed seed-shatter phenology in 13 economically important broadleaf weed species in soybean [Glycine max (L.) Merr.] from crop physiological maturity to 4 wk after physiological maturity at multiple sites spread across 14 states in the southern, northern, and mid-Atlantic United States. Greater proportions of seeds were retained by weeds at southern latitudes, and shatter rates increased at northern latitudes. Amaranthus spp. seed shatter was low (0% to 2%), whereas shatter varied widely in common ragweed (Ambrosia artemisiifolia L.) (2% to 90%) over the weeks following soybean physiological maturity. Overall, the broadleaf species studied shattered less than 10% of their seeds by soybean harvest. Our results suggest that some of the broadleaf species with greater seed retention in the weeks following soybean physiological maturity may be good candidates for HWSC.
Seed shatter is an important weediness trait on which the efficacy of harvest weed seed control (HWSC) depends. The level of seed shatter in a species is likely influenced by agroecological and environmental factors. In 2016 and 2017, we assessed seed shatter of eight economically important grass weed species in soybean [Glycine max (L.) Merr.] from crop physiological maturity to 4 wk after maturity at multiple sites spread across 11 states in the southern, northern, and mid-Atlantic United States. From soybean maturity to 4 wk after maturity, cumulative percent seed shatter was lowest in the southern U.S. regions and increased moving north through the states. At soybean maturity, the percent of seed shatter ranged from 1% to 70%. That range had shifted to 5% to 100% (mean: 42%) by 25 d after soybean maturity. There were considerable differences in seed-shatter onset and rate of progression between sites and years in some species that could impact their susceptibility to HWSC. Our results suggest that many summer annual grass species are likely not ideal candidates for HWSC, although HWSC could substantially reduce their seed output during certain years.
Background: Surgical site infections (SSIs) among cardiothoracic (CT) surgery patients are associated with high rates of morbidity and mortality. Data are limited regarding SSI incidence among pediatric patients undergoing primary reparative procedures for congenital cardiac disease, and published evidence on targeted interventions to prevent pediatric CT-surgery SSI is lacking. We aimed to establish standard metrics for measuring CT-surgery SSI incidence and to implement bundled interventions for SSI prevention. Methods: A dedicated CT-surgery SSI prevention workgroup was established, consisting of hospital leadership, CT surgeons, cardiac critical care unit staff, anesthesia, perfusion, environmental services, instrument sterile processing, risk management, infection prevention, and antibiotic stewardship. We created a standard definition for CT-surgery SSI and calculated retrospective SSI rates over a 24-month period (2017–2019). The outcome measured was the incidence of CT-surgery SSI per 100 primary cardiac procedures with delayed (> 3 days after primary surgery) or non-delayed chest closure. The difference in the proportion of SSIs was reported separately for delayed and non-delayed closure; statistical significance was tested using Fisher’s exact test. We identified many potential improvement opportunities, including gaps in SSI surveillance, poor compliance with daily bathing, inconsistent perioperative antimicrobial prophylaxis, lack of a controlled environment for bedside chest closures, and lapses in environmental cleaning. These issues informed the enhanced SSI prevention bundle, which included education on sterility for operating room (OR) staff. Protocols for the care of cardiac patients with delayed chest closure focused on universal daily and preoperative chlorhexidine baths. In addition, the bundle incorporated stringent environmental cleaning interventions, including scheduled decluttering of patient rooms and clinical spaces, terminal cleaning of patient rooms before patients returned from the OR, and use of adjunctive ultraviolet light for daily cleaning of operating rooms and for patient rooms at discharge. Results: A surveillance definition based on microbiological growth from a clinical sample obtained within 30 days of a primary cardiac procedure sufficiently captured all CT-surgery SSIs. Of 551 CT-surgery procedures prior to intervention, 91 (17%) had delayed final operative closure. Prior to the intervention, 16 SSIs were identified from July 2017 to May 2019, for a rate of 2.90 per 100 procedures; the rate was higher among patients with delayed chest closure, at 6.59 per 100 procedures (6 SSIs/91 procedures), than among those with primary chest closure, at 2.17 per 100 procedures (10 SSIs/460 procedures; P = 0.034). Gram-positive organisms, including coagulase-negative staphylococci, were most frequently identified as the causative organisms for SSIs. Compliance with the bundled intervention, rolled out over a 2-month period, was associated with an immediate decrease in the number of SSIs for primary and delayed chest closures (6 SSIs/185 procedures) in the initial quarters (August–December 2019) of the post-intervention period. However, this decrease was not reflected in the overall rate (3.24 per 100 procedures) because fewer procedures were performed. Data collection to measure sustainability is ongoing. Conclusions: Bundled interventions targeting skin antisepsis and environmental cleaning may be associated with a decrease in SSIs among pediatric CT-surgery patients.
Ongoing surveillance is required to determine sustainability of these interventions.
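The delayed-versus-primary-closure comparison above can be checked directly. The snippet below is a worked example (assuming scipy is available) that rebuilds the 2x2 table from the counts quoted in the abstract and applies Fisher's exact test.

```python
# Rebuild the 2x2 table from the abstract's counts and test it.
from scipy.stats import fisher_exact

table = [[6, 91 - 6],      # delayed chest closure: SSI, no SSI
         [10, 460 - 10]]   # primary chest closure: SSI, no SSI
odds_ratio, p = fisher_exact(table)  # two-sided by default

print(f"delayed closure SSI rate: {6 / 91:.2%}")     # ~6.59 per 100 procedures
print(f"primary closure SSI rate: {10 / 460:.2%}")   # ~2.17 per 100 procedures
print(f"OR = {odds_ratio:.2f}, two-sided P = {p:.3f}")  # abstract reports P = 0.034
```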
Background: In March 2012, the Veterans’ Health Administration (VHA) published the Guideline for the Prevention of Clostridium difficile Infection (CDI) in VHA Inpatient Acute-Care Facilities, with a goal of a 30% reduction in cases within 2 years. In March 2011, this facility, along with 31 others, served as a pilot site to develop the guidelines. Methods: The CDI prevention bundle was implemented to prevent new-onset CDI cases in the facility, with 4 core measures: (1) environmental cleaning (EMS), (2) hand hygiene, (3) contact precautions, and (4) cultural transformation. Education was provided to EMS staff, nursing, and care providers on the CDI case definition, criteria for testing, empiric isolation for patients with diarrhea, hand hygiene, and PPE to control spread. In 2014, antimicrobial stewardship was added, and within 5 years an algorithm for isolation and testing was published. Cases were reviewed weekly using TheraDoc software and were reported monthly to the national VHA Inpatient Evaluation Center (IPEC). Isolation was communicated using a ward roster/isolation list in TheraDoc for all unit champions to consult daily. CDI cases were classified using NHSN definitions for laboratory-identified (LabID) events, recurrent cases, and community-onset cases. Real-time case review and weekly multidisciplinary case discussions identified opportunities for improved compliance with the core measures. Results: Over an 8-year period, healthcare-onset CDI LabID events decreased by 73%, from 149 to 40 cases. The infection rate decreased by 70%, from 16.19 per 10,000 bed days of care in FY2011 (October 2010) to 4.88 in FY2019. The incidence of community-onset infections increased from 75 in FY2011 to a high of 146 in FY2018, corresponding to rates of 8.15 and 18.17, respectively. In FY2019, both LabID events and community-onset cases decreased, to lows of 40 and 102, respectively. Inappropriate testing decreased by 84%, from 50 in FY2011 to 8 in FY2019. Conclusions: A multidisciplinary team approach that included support from leadership and clinical providers, as well as frontline staff involvement, daily rounding, and case review by infection preventionists, reduced CDI cases over an 8-year period using the modified VHA CDI bundle. TheraDoc enabled case review, correct isolation, changes to cleaning practices, and more appropriate laboratory testing. The antimicrobial stewardship program, which includes clinical pharmacists working daily with providers, was a strong driver for change.
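As a quick worked check, the percentage reductions quoted in this abstract follow directly from its own figures:

```python
# Verify the reported reductions from the abstract's own numbers.
cases_fy2011, cases_fy2019 = 149, 40
rate_fy2011, rate_fy2019 = 16.19, 4.88   # LabID events per 10,000 bed days of care
tests_fy2011, tests_fy2019 = 50, 8       # inappropriate tests

print(f"cases: {(cases_fy2011 - cases_fy2019) / cases_fy2011:.0%} decrease")   # 73%
print(f"rate:  {(rate_fy2011 - rate_fy2019) / rate_fy2011:.0%} decrease")      # 70%
print(f"tests: {(tests_fy2011 - tests_fy2019) / tests_fy2011:.0%} decrease")   # 84%
```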
To conduct international comparisons of self-reports, collateral reports, and cross-informant agreement regarding older adult psychopathology.
We compared self-ratings of problems (e.g. I cry a lot) and personal strengths (e.g. I like to help others) for 10,686 adults aged 60–102 years from 19 societies and collateral ratings for 7,065 of these adults from 12 societies.
Data were obtained via the Older Adult Self-Report (OASR) and the Older Adult Behavior Checklist (OABCL; Achenbach et al., 2004).
Cronbach’s alphas averaged .76 (OASR) and .80 (OABCL) across societies. Across societies, 27 of the 30 problem items with the highest mean ratings and 28 of the 30 items with the lowest mean ratings were the same on the OASR and the OABCL. Q correlations between the means of the 0–1–2 ratings for the 113 problem items, averaged across all pairs of societies, yielded means of .77 (OASR) and .78 (OABCL). For the OASR and OABCL, analyses of variance (ANOVAs) yielded effect sizes (ESs) for society of 15% and 18% for Total Problems and 42% and 31% for Personal Strengths, respectively. For 5,584 cross-informant dyads in 12 societies, cross-informant correlations averaged across societies were .68 for Total Problems and .58 for Personal Strengths. Mixed-model ANOVAs yielded large effects of society on both Total Problems (ES = 17%) and Personal Strengths (ES = 36%).
The OASR and OABCL are efficient, low-cost, easily administered mental health assessments that can be used internationally to screen for many problems and strengths.
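For readers unfamiliar with the internal-consistency statistic cited above, here is a minimal sketch of Cronbach's alpha for a respondents-by-items matrix of 0-1-2 ratings. The usage line feeds it random data, so the resulting alpha will be near zero rather than the .76-.80 reported for real scales.

```python
import numpy as np

def cronbach_alpha(items: np.ndarray) -> float:
    """alpha = k/(k-1) * (1 - sum of item variances / variance of total score)."""
    k = items.shape[1]
    item_var_sum = items.var(axis=0, ddof=1).sum()
    total_var = items.sum(axis=1).var(ddof=1)
    return (k / (k - 1)) * (1 - item_var_sum / total_var)

# Hypothetical usage: 500 respondents x 113 problem items rated 0-2
# (random, independent items, so alpha will be near zero).
rng = np.random.default_rng(0)
ratings = rng.integers(0, 3, size=(500, 113)).astype(float)
print(round(cronbach_alpha(ratings), 2))
```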
Anthocyanins and bromelain have gained significant attention due to their antioxidative and anti-inflammatory properties. Both have been shown to improve endothelial function, blood pressure (BP) and oxygen utility capacity in humans; however, the combination of the two, and its impact on endothelial function, BP, total antioxidant capacity (TAC) and oxygen utility capacity, has not previously been investigated. The purpose of this study was to investigate the impact of a combined anthocyanin and bromelain supplement (BE) on endothelial function, BP, TAC, oxygen utility capacity and fatigability in healthy adults. Healthy adults (n 18, age 24 (sd 4) years) received BE or placebo in a randomised crossover design. Brachial artery flow-mediated dilation (FMD), BP, TAC, resting heart rate, oxygen utility capacity and fatigability were measured pre- and post-BE and placebo intake. The BE condition showed significantly increased FMD, reduced systolic BP and improved oxygen utility capacity compared with placebo (P < 0·05). Tissue saturation and oxygenated Hb significantly increased following BE intake, while deoxygenated Hb significantly decreased (P < 0·05) during exercise. Additionally, TAC was significantly increased following BE intake (P < 0·05). There were no significant differences in resting heart rate, diastolic BP or fatigability index. These results suggest that BE intake is an effective nutritional strategy for improving endothelial function, BP, TAC and oxygen utility capacity, which may help support vascular health in humans.
There is a need for accurate, inexpensive and field-friendly methods to assess body composition in children. Bioelectrical impedance analysis (BIA) is a promising approach; however, it has seen limited validation and use among young children in resource-poor settings. We aimed to develop and validate population-specific prediction equations for estimating total fat mass (FM), fat-free mass (FFM) and percentage body fat (PBF) in Vietnamese children (4–7 years) using reactance and resistance from BIA, anthropometric variables and demographic information. We conducted a cross-sectional survey of 120 children. Body composition was measured using dual-energy X-ray absorptiometry (DXA), BIA and anthropometry. To develop the prediction equations, we split the data into development (70 %) and validation (30 %) datasets. Model performance was evaluated using the predicted residual error sum of squares, root mean squared error (RMSE), mean absolute error (MAE) and R2. We identified a top-performing model with the fewest parameters (age, sex, weight and resistance index, or resistance and height), low RMSE (FM 0·70, FFM 0·74, PBF 3·10), low MAE (FM 0·55, FFM 0·62, PBF 2·49), high R2 (FM 0·95, FFM 0·92, PBF 0·82) and the smallest difference between predicted values and actual values from DXA (FM 0·03 kg or 0·01 sd, FFM 0·06 kg or 0·02 sd, PBF 0·27 % or 0·04 sd). In conclusion, we developed the first valid and highly predictive equations to estimate FM, FFM and PBF in Vietnamese children using BIA. These findings have important implications for future research on the double burden of disease and on risks associated with overweight and obesity in young children.
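A hedged sketch of the equation-development step follows: an ordinary least-squares model predicting DXA fat mass from age, sex, weight and the BIA resistance index (height squared divided by resistance), with the 70/30 split and error metrics named above. The file and column names are assumptions, not the study's data dictionary.

```python
# Develop and validate a fat-mass prediction equation (illustrative only).
import pandas as pd
from sklearn.linear_model import LinearRegression
from sklearn.metrics import mean_absolute_error, mean_squared_error, r2_score
from sklearn.model_selection import train_test_split

df = pd.read_csv("bia_dxa.csv")  # hypothetical: one row per child
df["resistance_index"] = df["height_cm"] ** 2 / df["resistance_ohm"]

X = df[["age", "sex", "weight_kg", "resistance_index"]]  # sex coded 0/1 (assumed)
y = df["dxa_fat_mass_kg"]
X_dev, X_val, y_dev, y_val = train_test_split(X, y, test_size=0.3, random_state=0)

model = LinearRegression().fit(X_dev, y_dev)
pred = model.predict(X_val)
rmse = mean_squared_error(y_val, pred) ** 0.5
print(f"RMSE {rmse:.2f} kg, MAE {mean_absolute_error(y_val, pred):.2f} kg, "
      f"R2 {r2_score(y_val, pred):.2f}")
```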
A significant proportion of inpatient antimicrobial prescriptions are inappropriate. Post-prescription review with feedback has been shown to be an effective means of reducing inappropriate antimicrobial use. However, implementation is resource intensive. Our aim was to evaluate the performance of traditional statistical models and machine-learning models designed to predict which patients receiving broad-spectrum antibiotics require a stewardship intervention.
We performed a single-center retrospective cohort study of inpatients who received an antimicrobial tracked by the antimicrobial stewardship program. Data were extracted from the electronic medical record and used to develop logistic regression and boosted-tree models to predict whether antibiotic therapy required stewardship intervention on any given day, as compared with the criterion standard of a note left by the antimicrobial stewardship team in the patient’s chart. We measured the performance of these models using the area under the receiver operating characteristic curve (AUROC) and evaluated them using a hold-out validation cohort.
Both the logistic regression and boosted-tree models demonstrated fair discriminatory power, with AUROCs of 0.73 (95% confidence interval [CI], 0.69–0.77) and 0.75 (95% CI, 0.72–0.79), respectively (P = .07). Both models demonstrated good calibration. The number of patients who would need to be reviewed to identify 1 patient requiring stewardship intervention was high for both models (41.7–45.5 for models tuned to a sensitivity of 85%).
Complex models can be developed to predict which patients require a stewardship intervention. However, further work is required to develop models with adequate discriminatory power to be applicable to real-world antimicrobial stewardship practice.
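To make the evaluation pipeline concrete, here is a hedged sketch with synthetic data standing in for the EHR-derived features: both model families are fitted, AUROC is computed on a hold-out set, and the number needed to review is estimated as 1/PPV at a threshold tuned to roughly 85% sensitivity. Nothing here reflects the study's actual feature set.

```python
# Fit both model types, compute hold-out AUROC, and estimate the number
# needed to review (1/PPV) at ~85% sensitivity. Synthetic data only.
import numpy as np
from sklearn.datasets import make_classification
from sklearn.ensemble import GradientBoostingClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score, roc_curve
from sklearn.model_selection import train_test_split

X, y = make_classification(n_samples=5000, n_features=20, weights=[0.9],
                           random_state=0)   # stand-in for EHR-derived features
X_tr, X_ho, y_tr, y_ho = train_test_split(X, y, test_size=0.3, random_state=0)

for model in (LogisticRegression(max_iter=1000), GradientBoostingClassifier()):
    scores = model.fit(X_tr, y_tr).predict_proba(X_ho)[:, 1]
    auroc = roc_auc_score(y_ho, scores)

    # Lowest threshold whose sensitivity reaches 0.85, then PPV among flagged.
    fpr, tpr, thresholds = roc_curve(y_ho, scores)
    thr = thresholds[np.argmax(tpr >= 0.85)]
    flagged = scores >= thr
    ppv = y_ho[flagged].mean()
    print(f"{type(model).__name__}: AUROC {auroc:.2f}, "
          f"number needed to review at 85% sensitivity {1 / ppv:.1f}")
```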
Mood disorders, i.e. major depressive disorder (MDD) and bipolar disorders, are leading sources of disability worldwide. Currently available treatments do not yield remission in approximately a third of patients with a mood disorder. This is in part because these treatments do not target a specific core pathology underlying these heterogeneous disorders. In recent years, abnormal inflammatory processes have been identified as putative pathophysiological mechanisms and treatment targets in mood disorders, particularly among individuals with treatment-resistant conditions.
In this selective review, we aimed to summarise recent advances in the field of immunopsychiatry, including emerging pathophysiological models and findings from treatment trials of immunomodulatory agents for both MDD and bipolar disorders.
We performed a literature review by searching Medline for clinical trials of immunomodulating agents as monotherapy or adjunctive treatment in MDD and bipolar disorders. Included studies were randomised controlled trials (RCTs), cluster RCTs or cross-over trials of immunomodulating agents with an active comparator or a placebo arm.
Current evidence shows an association between inflammation and mood symptoms. However, there is conflicting evidence on whether this link is causal.
Future studies should focus on identifying specific neurobiological underpinnings for the putative causal association between an activated inflammatory response and mood disorders. Results of these studies are needed before further treatment trials of immunomodulatory agents can be justified.
OBJECTIVES/GOALS: Patient online portals (POPs) allow patients to access electronic health records (EHRs) and communicate efficiently with their clinicians. We assessed disparities in POP access among families of different socioeconomic status (SES), and the impact of such disparities on asthma research, which is little known in the literature. METHODS/STUDY POPULATION: A randomized controlled trial testing the efficacy of an EHR-based clinical decision support (CDS) system was conducted in a pediatric primary care setting at Mayo Clinic. After consent, the Asthma Control Test (ACT) questionnaire was administered to parents every 3 months by phone or email, and reminders were sent through the POP to those who had not responded. SES was measured by HOUSES (in quartiles), a validated individual-level SES index based on housing features (the higher the HOUSES, the higher the SES). The association of HOUSES with availability of POP access and with the rate of missing ACT scores was assessed. RESULTS/ANTICIPATED RESULTS: The mean age of the 184 participants was 9.0 years (57% male), and parents of 152 (83%) children had POP access. Only 68% of children from the lowest HOUSES quartile (Q1) had access to POP (vs. 74% (Q2), 88% (Q3), and 92% (Q4; highest SES); p = .02). The ACT was completed by 144 (78%), 150 (82%), 171 (94%), and 164 (95%) participants at the interventions conducted every 3 months, with a total of 61 (33%) missing at least once. Overall, children whose parents had access to POP had a lower rate of missing ACT scores at all interventions during the study: 16% (those with POP access) vs. 47% (those without), 13% vs. 44%, 3% vs. 16%, and 1% vs. 23% for the 1st, 2nd, 3rd, and 4th interventions, respectively (p < .007 for all). DISCUSSION/SIGNIFICANCE OF IMPACT: There are significant disparities in access to POP by SES as defined by HOUSES, which affect the availability of ACT scores, resulting in systematic bias in asthma research and potentially widening disparities in asthma care. CONFLICT OF INTEREST DESCRIPTION: NA.
Vertebrates may be born highly dependent (altricial) or may rapidly gain independence (precocial). Primates are generally considered somatically precocial. However, all are at least initially helpless, and many primates have a prolonged phase of juvenility. In this chapter, we discuss how selection may influence the relative timing of the appearance of morphological features (heterochrony). Newborn primate morphology offers unique insights into the roles of prenatal and postnatal growth processes, primarily because the metabolic costs of growth begin to shift from mother to infant at this point. With this in mind, primates vary remarkably at birth in dental eruption and mineralization status, as well as in limb skeleton ossification (e.g., wrists and ankles). We also discuss evidence, still relatively scant, that primates vary greatly at birth in the degree to which neural organs (e.g., brains, eyes) have achieved adult size and proportions. In preparation for the morphological descriptions that follow, the reader is introduced to the concept of modularity of growth: different parts of the skeleton, or even parts of regions, have different rates of growth and development.
Skeletal Anatomy of the Newborn Primate was written to broaden our knowledge of non-human primates from a comparative and developmental perspective. This chapter explains that the main focus of our book is the inherently risky neonatal period. The “neonate,” or newborn, is considered here to be a perinatal primate of up to seven days postnatal age. However, there is no simple way to physically identify primate newborns, not in the way many have defined “infants” based on dental maturity. This is precisely what makes the neonatal stage so interesting: primates, like most other groups of mammals, vary in how rapidly they attain physical maturity. This introductory chapter discusses terminology and methodological challenges in studying newborns.
Feeding ontogeny in primates has three stages: in utero, when nutrition is gained maternally; suckling after birth; and the transition to adult feeding. We know little about functional variation in these stages. The transition to adult feeding – highlighted by weaning – varies across species, and this variation is tied to many socioecological and morphological influences across primates. Primate feeding apparatus ontogeny is affected by many factors. Diet exhibits a complex relationship with it, the clearest signal being the rapid dental mineralization and eruption seen in folivorous strepsirrhines. Mineralization varies across primates. Emergence and eruption of postcanine teeth tend to follow size in both suborders, with smaller taxa showing earlier emergence, the exception being rapid eruption in some folivores. Compared with teeth, less is known about the musculoskeletal ontogeny of the feeding apparatus. Most studies compare closely related species and link musculoskeletal robustness to challenging diets. Looking forward, a better understanding of primate feeding apparatus growth will require improved samples (a challenge for long-lived species) and an emphasis on the evolutionary significance of feeding throughout ontogeny.