Influenza vaccine effectiveness (VE) wanes over the course of a temperate climate winter season, but few data are available from tropical countries with year-round influenza virus activity. In Singapore, a retrospective cohort study of adults vaccinated from 2013 to 2017 was conducted. Influenza vaccine failure was defined as hospital admission with polymerase chain reaction-confirmed influenza infection 2–49 weeks after vaccination. Relative VE was calculated by splitting the follow-up period into 8-week episodes (Lexis expansion) and comparing the odds of influenza infection in the first 8-week period after vaccination (weeks 2–9) with those in subsequent 8-week periods, using multivariable logistic regression adjusted for patient factors and influenza virus activity. Records of 19 298 influenza vaccinations were analysed, with 617 (3.2%) influenza infections. Relative VE was stable for the first 26 weeks post-vaccination, but then declined for all three influenza types/subtypes to 69% at weeks 42–49 (95% confidence interval (CI) 52–92%, P = 0.011). VE declined fastest in older adults, in individuals with chronic pulmonary disease and in those who had been previously vaccinated within the last 2 years. Vaccine failure was significantly associated with a change in recommended vaccine strains between vaccination and observation period (adjusted odds ratio 1.26, 95% CI 1.06–1.50, P = 0.010).
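As a rough illustration of the approach described above, the sketch below splits hypothetical vaccination records into 8-week episodes (a simple Lexis expansion) and fits a logistic regression comparing later episodes with weeks 2–9; the data frame and column names are assumptions for illustration, not the study's own dataset.

```python
# Rough sketch of the Lexis-expansion / logistic-regression approach described
# above. The input data frame 'vaccinations' and its column names are assumptions.
import pandas as pd
import statsmodels.formula.api as smf

def lexis_expand(df, start=2, width=8, max_week=49):
    """Split each vaccination record into 8-week episodes (weeks 2-9, 10-17, ..., 42-49)."""
    episodes = []
    for lo in range(start, max_week + 1, width):
        hi = min(lo + width - 1, max_week)
        ep = df.copy()
        ep["episode"] = f"{lo}-{hi}"
        # An episode counts as a vaccine failure only if the PCR-confirmed
        # infection occurred within that window of follow-up.
        ep["case"] = ((ep["weeks_since_vacc"] >= lo)
                      & (ep["weeks_since_vacc"] <= hi)
                      & ep["influenza_confirmed"]).astype(int)
        # Keep only episodes that started before follow-up ended for that person.
        episodes.append(ep[ep["followup_weeks"] >= lo])
    return pd.concat(episodes, ignore_index=True)

# expanded = lexis_expand(vaccinations)   # 'vaccinations': one row per vaccination
# model = smf.logit("case ~ C(episode, Treatment('2-9')) + age + sex + virus_activity",
#                   data=expanded).fit()
# print(model.summary())                  # odds ratios relative to weeks 2-9
```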
Gravitational waves from coalescing neutron stars encode information about nuclear matter at extreme densities, inaccessible to laboratory experiments. The late inspiral is influenced by the presence of tides, which depend on the neutron star equation of state. Neutron star mergers are expected to often produce rapidly rotating remnant neutron stars that emit gravitational waves, which will provide clues to the extremely hot post-merger environment. This signature of nuclear matter in gravitational waves contains most information in the 2–4 kHz frequency band, which lies outside the most sensitive band of current detectors. We present the design concept and science case for a Neutron Star Extreme Matter Observatory (NEMO): a gravitational-wave interferometer optimised to study nuclear physics with merging neutron stars. The concept uses high circulating laser power, quantum squeezing, and a detector topology specifically designed to achieve the high-frequency sensitivity necessary to probe nuclear matter using gravitational waves. Above 1 kHz, the proposed strain sensitivity is comparable to that of full third-generation detectors at a fraction of the cost. Such sensitivity changes the expected event rate for detection of post-merger remnants from approximately one every few decades with two A+ detectors to a few per year, and would potentially allow the first gravitational-wave observations of supernovae, isolated neutron stars, and other exotica.
Observational studies have found associations between smoking and both poorer cognitive ability and lower educational attainment; however, evaluating causality is challenging. We used two complementary methods to explore this.
Methods
We conducted observational analyses of up to 12 004 participants in a cohort study (Study One) and Mendelian randomisation (MR) analyses using summary and cohort data (Study Two). Outcome measures were cognitive ability at age 15 and educational attainment at age 16 (Study One), and educational attainment and fluid intelligence (Study Two).
Results
Study One: heaviness of smoking at age 15 was associated with lower cognitive ability at age 15 and lower educational attainment at age 16. Adjustment for potential confounders partially attenuated findings (e.g. fully adjusted cognitive ability β −0.736, 95% CI −1.238 to −0.233, p = 0.004; fully adjusted educational attainment β −1.254, 95% CI −1.597 to −0.911, p < 0.001). Study Two: MR indicated that both smoking initiation and lifetime smoking predict lower educational attainment (e.g. smoking initiation to educational attainment inverse-variance weighted MR β −0.197, 95% CI −0.223 to −0.171, p = 1.78 × 10⁻⁴⁹). Educational attainment results were robust to sensitivity analyses, while analyses of general cognitive ability were less so.
Conclusion
We find some evidence of a causal effect of smoking on lower educational attainment, but not cognitive ability. Triangulation of evidence across observational and MR methods is a strength, but the genetic variants associated with smoking initiation may be pleiotropic, suggesting caution in interpreting these results. The nature of this pleiotropy warrants further study.
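For readers unfamiliar with the inverse-variance weighted (IVW) estimator reported in Study Two, the following minimal sketch computes a fixed-effect IVW Mendelian randomisation estimate from per-SNP summary statistics; the input numbers are invented for illustration and are not the study's data.

```python
# Minimal sketch of a fixed-effect inverse-variance weighted (IVW) MR estimate.
# The arrays are hypothetical per-SNP summary statistics (exposure betas,
# outcome betas, outcome standard errors).
import numpy as np

def ivw_mr(beta_exposure, beta_outcome, se_outcome):
    """Return the IVW causal estimate and its standard error."""
    beta_exposure = np.asarray(beta_exposure, dtype=float)
    beta_outcome = np.asarray(beta_outcome, dtype=float)
    w = 1.0 / np.asarray(se_outcome, dtype=float) ** 2        # inverse-variance weights
    beta_ivw = np.sum(w * beta_exposure * beta_outcome) / np.sum(w * beta_exposure ** 2)
    se_ivw = np.sqrt(1.0 / np.sum(w * beta_exposure ** 2))
    return beta_ivw, se_ivw

# Example with made-up summary statistics for three SNPs:
beta, se = ivw_mr([0.10, 0.08, 0.12], [-0.020, -0.015, -0.025], [0.005, 0.006, 0.004])
print(f"IVW beta = {beta:.3f} (SE {se:.3f})")
```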
Diet has a major influence on the composition and metabolic output of the gut microbiome. Higher-protein diets are often recommended for older consumers; however, the effect of high-protein diets on the gut microbiota and faecal volatile organic compounds (VOC) of elderly participants is unknown. The purpose of the study was to establish if the faecal microbiota composition and VOC in older men are different after a diet containing the recommended dietary intake (RDA) of protein compared with a diet containing twice the RDA (2RDA). Healthy males (74⋅2 (sd 3⋅6) years; n 28) were randomised to consume the RDA of protein (0⋅8 g protein/kg body weight per d) or 2RDA, for 10 weeks. Dietary protein was provided via whole foods rather than supplementation or fortification. The diets were matched for dietary fibre from fruit and vegetables. Faecal samples were collected pre- and post-intervention for microbiota profiling by 16S ribosomal RNA amplicon sequencing and VOC analysis by head space/solid-phase microextraction/GC-MS. After correcting for multiple comparisons, no significant differences in the abundance of faecal microbiota or VOC associated with protein fermentation were evident between the RDA and 2RDA diets. Therefore, in the present study, a twofold difference in dietary protein intake did not alter gut microbiota or VOC indicative of altered protein fermentation.
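The per-feature testing with multiple-comparison correction described above can be sketched roughly as follows; the simulated abundance table, group sizes, and the choice of Mann-Whitney tests with Benjamini-Hochberg correction are illustrative assumptions, not the study's exact pipeline.

```python
# Hedged sketch of per-taxon group comparisons with multiple-testing correction.
# The abundance tables and group sizes below are simulated, not study data.
import numpy as np
from scipy.stats import mannwhitneyu
from statsmodels.stats.multitest import multipletests

rng = np.random.default_rng(0)
n_taxa, n_rda, n_2rda = 50, 14, 14
abund_rda = rng.lognormal(size=(n_taxa, n_rda))    # relative abundances, RDA group
abund_2rda = rng.lognormal(size=(n_taxa, n_2rda))  # relative abundances, 2RDA group

pvals = [mannwhitneyu(abund_rda[i], abund_2rda[i]).pvalue for i in range(n_taxa)]
reject, p_adj, _, _ = multipletests(pvals, alpha=0.05, method="fdr_bh")
print(f"{reject.sum()} of {n_taxa} taxa differ after Benjamini-Hochberg correction")
```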
Psychiatric morbidity was measured in a prospective follow-up study of 51 patients admitted to hospital after minor head injury. By means of self-report questionnaires (e.g. the General Health Questionnaire (GHQ) and the Impact of Event Scale), semistructured interviews and symptom checklists, it was found that nearly half of the patients suffered considerable discomfort after 1 week. Improvement during the 3-month follow-up was generally poor. Both concussional symptoms and stress response contributed to the compromised well-being as measured by the GHQ, but outcome did not correlate with severity of injury. The GHQ-60 score at 1 week showed a strong positive correlation with outcome after 3 months. The incidence of post-traumatic stress disorder was low.
Depression is a major public health problem in European countries, and health systems need to ensure access to effective psychological and pharmacological treatments. Research suggests that improvements in depression care require “complex interventions” that implement change in several areas simultaneously.
Methods
We describe an observational study of the implementation of a “stepped care” model to provide care for all adults presenting with a new case of depression in a mixed urban-rural area of Scotland with a population of 76,000 people.
A team of 5.2 clinicians provided care for about 1,000 new cases of depression each year. “Guided Self-Help” was the baseline intervention for all patients, supplemented where necessary with pharmacological treatment and Cognitive Behavioural or Interpersonal Therapy.
Service delivery systems were reformed to provide: specialist treatment in primary care settings using primarily non-medical clinicians, comprehensive electronic clinical records, continuous outcome monitoring and intensive investment in staff training and support.
Results
Clinical outcomes (measured by the Personal Health Questionnaire, Social and Work Adjustment Scale and EQ-5D) showed significant improvement despite relatively brief clinician contact (2.5 hours over 4.6 contacts). Savings of more than 50% were made on the antidepressant drug budget. Service user satisfaction ratings were high.
Conclusions
Population needs for depression care can be met using “stepped care” models such as that described above. A randomised controlled study of this approach would be required to fully test the model.
Why patients with psychosis use cannabis remains debated. The self-medication hypothesis has received some support, but other evidence points towards an alleviation of dysphoria model. This study investigated the reasons for cannabis use in first-episode psychosis (FEP) and whether the strength of their endorsement changed over time.
Methods:
FEP inpatients and outpatients who used cannabis, at the South London and Maudsley, Oxleas and Sussex NHS Trusts (UK), rated their motives at baseline (n = 69), 3 months (n = 29) and 12 months (n = 36). A random intercept model was used to test the change in strength of endorsement over the 12 months. Paired-sample t-tests assessed the differences in mean scores between the five subscales of the Reasons for Use Scale (enhancement; social motive; coping with unpleasant affect; conformity and acceptance; and relief of positive symptoms and side effects) at each time-point.
Results:
Time had a significant effect on scores when controlling for reason; average scores on each subscale were higher at baseline than at 3 months and 12 months. At each time-point, patients endorsed ‘enhancement’ followed by ‘coping with unpleasant affect’ and ‘social motive’ more highly for their cannabis use than any other reason. ‘Conformity and acceptance’ followed closely. ‘Relief of positive symptoms and side effects’ was the least endorsed motive.
Conclusions:
Patients endorsed their reasons for use at 3 months and 12 months less strongly than at baseline. Little support for the self-medication or alleviation of dysphoria models was found. Rather, patients rated ‘enhancement’ most highly for their cannabis use.
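As an illustration of the random intercept analysis described in the Methods above, the sketch below fits such a model to simulated endorsement scores; the data and column names are assumptions, not the study's own variables.

```python
# Illustrative random-intercept model of subscale endorsement over time.
# The data are simulated and the column names are assumptions.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(1)
timepoints = ["baseline", "3m", "12m"]
reasons = ["enhancement", "social", "coping", "conformity", "symptom_relief"]
intercepts = {p: rng.normal(scale=0.4) for p in range(30)}     # patient-level variation
rows = [{"patient_id": p, "timepoint": t, "reason": r,
         # endorsement drops after baseline, plus patient-level and residual noise
         "score": 3.0 - (0.5 if t != "baseline" else 0.0)
                  + intercepts[p] + rng.normal(scale=0.3)}
        for p in range(30) for t in timepoints for r in reasons]
ratings = pd.DataFrame(rows)

model = smf.mixedlm("score ~ C(timepoint, Treatment('baseline')) + C(reason)",
                    data=ratings, groups=ratings["patient_id"]).fit()
print(model.summary())   # fixed effect of time-point, controlling for reason
```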
Observational studies have linked elevated homocysteine to vascular conditions. Folate intake has been associated with lower homocysteine concentration, although randomised controlled trials of folic acid supplementation to decrease the incidence of vascular conditions have been inconclusive. We investigated determinants of maternal homocysteine during pregnancy, particularly in a folic acid-fortified population.
Design:
Data were from the Ottawa and Kingston Birth Cohort of 8085 participants. We used multivariable regression analyses to identify factors associated with maternal homocysteine, adjusted for gestational age at bloodwork. Continuous factors were modelled using restricted cubic splines. A subgroup analysis examined the modifying effect of MTHFR 677C>T genotype on folate, in determining homocysteine concentration.
Setting:
Participants were recruited in Ottawa and Kingston, Canada, from 2002 to 2009.
Participants:
Women were recruited when presenting for prenatal care in the early second trimester.
Results:
In 7587 participants, factors significantly associated with higher homocysteine concentration were nulliparity, smoking and chronic hypertension, while factors significantly associated with lower homocysteine concentration were non-Caucasian race, history of a placenta-mediated complication and folic acid supplementation. Maternal age and BMI demonstrated U-shaped associations. Folic acid supplementation of >1 mg/d during pregnancy did not substantially increase folate concentration. In the subgroup analysis, MTHFR 677C>T modified the effect of folate status on homocysteine concentration.
Conclusions:
We identified determinants of maternal homocysteine relevant to the lowering of homocysteine in the post-folic acid fortification era, characterised by folate-replete populations. A focus on periconceptional folic acid supplementation and improving health status may form an effective approach to lower homocysteine.
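To illustrate the use of restricted cubic splines for continuous factors mentioned in the Methods above, the sketch below fits a regression using patsy's natural cubic spline basis (a close analogue of restricted cubic splines) to simulated data; the variables and data are hypothetical, not the cohort's.

```python
# Sketch of modelling a continuous predictor with a restricted (natural) cubic
# spline in a multivariable regression. The data are simulated and the variable
# names are hypothetical, not the cohort's own.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(2)
n = 500
age = rng.uniform(18, 45, n)
smoking = rng.integers(0, 2, n)
cohort = pd.DataFrame({
    "maternal_age": age,
    "smoking": smoking,
    # simulate a U-shaped age effect so the spline has something to capture
    "homocysteine": 5 + 0.01 * (age - 30) ** 2 + 0.5 * smoking
                    + rng.normal(scale=0.5, size=n),
})

# patsy's cr() gives a natural cubic spline basis.
model = smf.ols("homocysteine ~ cr(maternal_age, df=4) + smoking", data=cohort).fit()
print(model.summary())
```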
The prevalence of many diseases in pigs displays seasonal distributions. Despite growing concerns about the impacts of climate change, we do not yet have a good understanding of the role that weather factors play in explaining such seasonal patterns. In this study, national and county-level aggregated abattoir inspection data were assessed for England and Wales during 2010–2015. Seasonally adjusted relationships were characterised between weekly ambient maximum temperature and the prevalence of both respiratory conditions and tail biting detected at slaughter. The prevalence of respiratory conditions showed cyclical annual patterns, with peaks in the summer months and troughs in the winter months each year. However, there were no obvious associations with either high or low temperatures. The prevalence of tail biting generally increased as temperatures decreased, but associations were not supported by statistical evidence: across all counties there was a relative risk of 1.028 (95% CI 0.776–1.363) for every 1 °C fall in temperature. Whilst the seasonal patterns observed in this study are similar to those reported in previous studies, the lack of statistical evidence for an explicit association with ambient temperature may be explained by the lack of information on date of disease onset. There is also the possibility that other time-varying factors not investigated here may be driving some of the seasonal patterns.
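A seasonally adjusted temperature-prevalence regression of the general kind described above might look like the following sketch, which fits a Poisson model with harmonic terms for season to simulated weekly counts; the model form and data are illustrative assumptions, not the study's analysis.

```python
# Hedged sketch of a seasonally adjusted regression of weekly detected cases on
# ambient maximum temperature. All data below are simulated.
import numpy as np
import pandas as pd
import statsmodels.api as sm
import statsmodels.formula.api as smf

rng = np.random.default_rng(3)
weeks = np.arange(1, 313)                                  # ~6 years of weekly data
week_of_year = ((weeks - 1) % 52) + 1
tmax = 12 + 8 * np.sin(2 * np.pi * week_of_year / 52) + rng.normal(0, 2, weeks.size)
inspected = rng.integers(8000, 12000, weeks.size)
rate = 0.04 * np.exp(0.3 * np.sin(2 * np.pi * week_of_year / 52))   # weak seasonal signal
weekly = pd.DataFrame({
    "cases": rng.poisson(rate * inspected),
    "inspected": inspected,
    "tmax": tmax,
    # harmonic terms adjust for within-year seasonality
    "sin1": np.sin(2 * np.pi * week_of_year / 52),
    "cos1": np.cos(2 * np.pi * week_of_year / 52),
})

model = smf.glm("cases ~ tmax + sin1 + cos1", data=weekly,
                family=sm.families.Poisson(),
                offset=np.log(weekly["inspected"])).fit()
rr_per_degree_fall = np.exp(-model.params["tmax"])         # RR per 1 degree C fall
print(f"RR per 1 degree C fall in temperature: {rr_per_degree_fall:.3f}")
```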
Infants undergoing stage 1 palliation for hypoplastic left heart syndrome may have post-operative feeding difficulties. Although the cause of feeding difficulties in these patients is multi-factorial, residual arch obstruction may affect gut perfusion, contributing to feeding intolerance. We hypothesised that undergoing arch reintervention following stage 1 palliation would be associated with post-operative feeding difficulties.
Methods:
This was a retrospective cohort study. We analysed data from the National Pediatric Cardiology Quality Improvement Collaborative, which maintains a multicentre registry for infants with hypoplastic left heart syndrome discharged home following stage 1 palliation. Patients who underwent arch reintervention (percutaneous or surgical) prior to discharge following stage 1 palliation were compared with those who underwent non-aortic arch interventions after stage 1 palliation and those who underwent no intervention. Median post-operative days to full enteral feeds and weight for age z-scores were compared. Predictors of post-operative days to full feeds were identified.
Results:
Among patients who underwent arch reintervention, post-operative days to full enteral feeds were greater than for those who underwent non-aortic arch interventions (median 25 versus 16 days, p = 0.003) or no intervention (median 25 versus 12 days, p < 0.001). Arch intervention, multiple interventions, gestational age, and the presence of a gastrointestinal anomaly were predictors of days to full feeds.
Conclusions:
Repeat arch intervention is associated with a longer time to achieve full enteral feeding in patients with hypoplastic left heart syndrome after stage 1 palliation. Further investigation of this association is needed to understand the role of arch obstruction in feeding problems in these patients.
Aging is associated with numerous stressors that negatively impact older adults’ well-being. Resilience improves the ability to cope with stressors and can be enhanced in older adults. Senior housing communities are promising settings in which to deliver positive psychiatry interventions, given their rising resident populations and the potential impact of delivering interventions directly in the community. However, few intervention studies have been conducted in these communities. We present a pragmatic stepped-wedge trial of a novel psychological group intervention intended to improve resilience among older adults in senior housing communities.
Design:
A pragmatic modified stepped-wedge trial design.
Setting:
Five senior housing communities in three states in the US.
Participants:
Eighty-nine adults aged over 60 years residing in the independent living sector of senior housing communities.
Intervention:
Raise Your Resilience, a manualized 1-month group intervention that incorporated savoring, gratitude, and engagement in value-based activities, administered by unlicensed residential staff trained by researchers. There was a 1-month control period and a 3-month post-intervention follow-up.
Measurements:
Validated self-report measures of resilience, perceived stress, well-being, and wisdom collected at months 0 (baseline), 1 (pre-intervention), 2 (post-intervention), and 5 (follow-up).
Results:
Treatment adherence and satisfaction were high. Compared to the control period, perceived stress and wisdom improved from pre-intervention to post-intervention, while resilience improved from pre-intervention to follow-up. Effect sizes were small in this sample, which had relatively high baseline resilience. Physical and mental well-being did not improve significantly, and no significant moderators of change in resilience were identified.
Conclusion:
This study demonstrates feasibility of conducting pragmatic intervention trials in senior housing communities. The intervention resulted in significant improvement in several measures despite ceiling effects. The study included several features that suggest high potential for its implementation and dissemination across similar communities nationally. Future studies are warranted, particularly in samples with lower baseline resilience or in assisted living facilities.
Rules of thumb (RoTs) are proposed as a means of promoting higher levels of Defined Contribution (DC) pension saving and of stimulating debate about the high and uncertain cost of pension provision, leading to the development of solutions. The Lifetime Pension Contribution (LPC), calculated as 23% of average UK earnings, tells young people what pension contribution is required over a full working life to achieve a decent retirement income. Another RoT is that each 1% of earnings contributed provides a pension of 1.5% of earnings. Other RoTs show how costs vary by retirement age and whether the saver’s retirement planning is on track. The current high cost of pensions is partly due to low interest rates and the inefficiencies of the DC market, with inadequate bulk purchasing power and risk sharing. RoTs might help encourage higher employer contributions, either through automatic enrolment or on a voluntary basis.
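A worked example of the arithmetic behind these rules of thumb is sketched below; the salary figure is hypothetical, and the calculation simply applies the 23% LPC and the 1%-to-1.5%-of-earnings rules quoted above.

```python
# Worked illustration of the rules of thumb quoted above (a sketch, not the
# paper's own calculation). The example salary is a hypothetical stand-in for
# average UK earnings.
def pension_from_contribution_rate(salary, contribution_pct, pension_per_1pct=1.5):
    """Estimated annual pension using the 1% -> 1.5%-of-earnings rule of thumb."""
    return salary * (contribution_pct * pension_per_1pct) / 100.0

salary = 30_000                      # hypothetical annual earnings (GBP)
lpc = 0.23 * salary                  # Lifetime Pension Contribution rule: 23% of earnings
pension = pension_from_contribution_rate(salary, 23)
print(f"LPC: {lpc:,.0f} per year; implied pension: {pension:,.0f} per year "
      f"({pension / salary:.0%} of earnings)")
```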
Effective management of uncertainty can lead to better, more informed decisions. However, many decision makers and their advisers do not always face up to uncertainty, in part because little constructive guidance and few tools are available to help. This paper outlines six Uncertainty Principles for managing uncertainty:
Face up to uncertainty
Deconstruct the problem
Don’t be fooled (un/intentional biases)
Models can be helpful, but also dangerous
Think about adaptability and resilience
Bring people with you
These were arrived at following extensive discussions and literature reviews over a 5-year period. While this is an important topic for actuaries, the intended audience is any decision maker or adviser in any sector (public or private).
The science of studying diamond inclusions to understand Earth history has developed significantly over the past decades, with new instrumentation and techniques applied to diamond sample archives revealing the stories contained within diamond inclusions. This chapter reviews what diamonds can tell us about the deep carbon cycle over the course of Earth’s history: how the geochemistry of diamonds and their inclusions informs us about the deep carbon cycle, the origin of diamonds in Earth’s mantle, and the evolution of diamonds through time.
Moral reasoning and decision making help guide behavior and facilitate interpersonal relationships. Accounts of morality that position commonsense psychology as the foundation of moral development (i.e., rationalist theories) have dominated research on morality in autism spectrum disorder (ASD). Given the well-documented differences in commonsense psychology among autistic individuals, researchers have investigated whether the development and execution of moral judgement and reasoning differ in this population compared with neurotypical individuals. In light of the diverse findings of investigations of moral development and reasoning in ASD, a summation and critical evaluation of the literature could help make sense of what is known about this important social-cognitive skill in ASD. To that end, we conducted a systematic review of the literature investigating moral decision making among autistic children and adults. Our search identified 29 studies. In this review, we synthesize the research in the area and provide suggestions for future research. Such research could include the application of an alternative theoretical framework to studying morality in ASD that does not assume a deficits-based perspective.
Review findings on the role of dietary patterns in preventing depression are inconsistent, possibly due to variation in assessment of dietary exposure and depression. We studied the association between dietary patterns and depressive symptoms in six population-based cohorts and meta-analysed the findings using a standardised approach that defined dietary exposure, depression assessment and covariates.
Methods
Included were cross-sectional data from 23 026 participants in six cohorts: InCHIANTI (Italy), LASA, NESDA, HELIUS (the Netherlands), ALSWH (Australia) and Whitehall II (UK). Analysis of incidence was based on three cohorts with repeated measures of depressive symptoms at 5–6 years of follow-up in 10 721 participants: Whitehall II, InCHIANTI and ALSWH. Three a priori dietary patterns (the Mediterranean diet score (MDS), the Alternative Healthy Eating Index (AHEI-2010) and the Dietary Approaches to Stop Hypertension (DASH) diet) were investigated in relation to depressive symptoms. Cohort-level analyses adjusted for a fixed set of confounders; the meta-analysis used a random-effects model.
Results
Cross-sectional and prospective analyses showed statistically significant inverse associations of the three dietary patterns with depressive symptoms (continuous and dichotomous). In cross-sectional analysis, the association of diet with depressive symptoms using a cut-off yielded an adjusted OR of 0.87 (95% confidence interval 0.84–0.91) for MDS, 0.93 (0.88–0.98) for AHEI-2010, and 0.94 (0.87–1.01) for DASH. Similar associations were observed prospectively: 0.88 (0.80–0.96) for MDS; 0.95 (0.84–1.06) for AHEI-2010; 0.90 (0.84–0.97) for DASH.
Conclusion
Population-scale observational evidence indicates that adults following a healthy dietary pattern have fewer depressive symptoms and lower risk of developing depressive symptoms.
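The random-effects pooling used in the meta-analysis above can be illustrated with a minimal DerSimonian-Laird sketch on the log-odds scale; the per-cohort odds ratios and standard errors below are invented for illustration and are not the study's estimates.

```python
# Minimal DerSimonian-Laird random-effects meta-analysis sketch on the log-odds
# scale. The per-cohort ORs and standard errors are made up for illustration.
import numpy as np

def random_effects_pool(log_or, se):
    """Pool per-cohort log-ORs with DerSimonian-Laird random-effects weights."""
    y, v = np.asarray(log_or, float), np.asarray(se, float) ** 2
    w = 1.0 / v                                              # fixed-effect weights
    y_fixed = np.sum(w * y) / np.sum(w)
    q = np.sum(w * (y - y_fixed) ** 2)                       # Cochran's Q
    tau2 = max(0.0, (q - (len(y) - 1)) / (w.sum() - (w ** 2).sum() / w.sum()))
    w_star = 1.0 / (v + tau2)                                # random-effects weights
    pooled = np.sum(w_star * y) / np.sum(w_star)
    se_pooled = np.sqrt(1.0 / np.sum(w_star))
    return (np.exp(pooled),
            np.exp(pooled - 1.96 * se_pooled),
            np.exp(pooled + 1.96 * se_pooled))

# Hypothetical per-cohort ORs for one dietary pattern and the SEs of their log-ORs:
ors = np.array([0.85, 0.90, 0.88, 0.92, 0.84, 0.89])
ses = np.array([0.04, 0.05, 0.06, 0.05, 0.07, 0.04])
print("Pooled OR (95%% CI): %.2f (%.2f-%.2f)" % random_effects_pool(np.log(ors), ses))
```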
A growing interest in constellations of small satellites has recently emerged due to the increasing capability of these platforms and their reduced time and cost of development. However, in the absence of dedicated launch services for these systems, alternative methods for the deployment of these constellations must be considered which can take advantage of the availability of secondary-payload launch opportunities. Furthermore, a means of exploring the effects and tradeoffs in corresponding system architectures is required. This paper presents a methodology to integrate the deployment of constellations of small satellites into the wider design process for these systems. Using a method of design-space exploration, enhanced understanding of the tradespace is supported, whilst identification of system designs for development is enabled by the application of an optimisation process. To demonstrate the method, a simplified analysis framework and a multiobjective genetic algorithm are implemented for three mission case studies with differing applications. The first two cases, modelled on existing constellations, indicate the benefits of design-space exploration and the possible savings which could be made in cost, system mass, or deployment time. The third case, based on a proposed Earth observation nanosatellite constellation, focuses on deployment following launch using a secondary-payload opportunity and demonstrates the breadth of feasible solutions which may not be considered if only point designs are generated by a priori analysis. These results indicate that the presented method can support the development of future constellations of small satellites by improving the knowledge of different deployment strategies available during the early design phases and through enhanced exploration and identification of promising design alternatives.
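As a toy illustration of design-space exploration with non-dominated (Pareto) filtering, one ingredient of the kind of tradespace analysis described above, the sketch below samples hypothetical constellation designs and extracts the Pareto front; the design variables and objective models are invented and are not the paper's analysis framework.

```python
# Toy tradespace exploration with non-dominated filtering. Design variables and
# the surrogate cost / deployment-time models are invented for illustration.
import numpy as np

rng = np.random.default_rng(4)
n_designs = 2000
planes = rng.integers(1, 9, n_designs)           # number of orbital planes
sats_per_plane = rng.integers(1, 13, n_designs)  # satellites per plane
total_sats = planes * sats_per_plane

# Crude surrogate objectives (both to be minimised), in arbitrary units:
cost = total_sats * 0.5 + planes * 2.0
deploy_time = planes * 30 / np.maximum(sats_per_plane, 1) + rng.uniform(0, 10, n_designs)
objectives = np.column_stack([cost, deploy_time])

def pareto_mask(points):
    """True for points not dominated by any other point (minimisation)."""
    mask = np.ones(len(points), dtype=bool)
    for i, p in enumerate(points):
        dominated = np.all(points <= p, axis=1) & np.any(points < p, axis=1)
        if dominated.any():
            mask[i] = False
    return mask

front = pareto_mask(objectives)
print(f"{front.sum()} non-dominated designs out of {n_designs} sampled")
```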