The impact of the coronavirus disease 2019 (COVID-19) pandemic on mental health is still being unravelled. It is important to identify which individuals are at greatest risk of worsening symptoms. This study aimed to examine changes in depression, anxiety and post-traumatic stress disorder (PTSD) symptoms using prospective and retrospective assessments of symptom change, and to identify key risk factors and examine their effects.
Online questionnaires were administered to 34 465 individuals (aged 16 years or above) in April/May 2020 in the UK, recruited from existing cohorts or via social media. Around one-third (n = 12 718) of included participants had prior diagnoses of depression or anxiety and had completed pre-pandemic mental health assessments (between September 2018 and February 2020), allowing prospective investigation of symptom change.
Prospective symptom analyses showed small decreases in depression (PHQ-9: −0.43 points) and anxiety [generalised anxiety disorder scale – 7 items (GAD-7): −0.33 points] and a small increase in PTSD symptoms (PCL-6: 0.22 points). Conversely, retrospective symptom analyses demonstrated significant large increases (PHQ-9: 2.40; GAD-7: 1.97), and 55% of participants reported worsening mental health since the beginning of the pandemic on a global change rating. Across both prospective and retrospective measures of symptom change, worsening depression, anxiety and PTSD symptoms were associated with prior mental health diagnoses, female gender, young age and unemployed/student status.
We highlight the effect of prior mental health diagnoses on worsening mental health during the pandemic and confirm previously reported sociodemographic risk factors. Discrepancies between prospective and retrospective measures of changes in mental health may be related to recall bias-related underestimation of prior symptom severity.
The coronavirus disease 2019 (COVID-19) pandemic has challenged the ability of Emergency Medical Services (EMS) providers to maintain personal safety during the treatment and transport of potentially infected patients. Increased rates of COVID-19 infection in EMS providers after patient care exposure, and notably after performing aerosol-generating procedures (AGPs), have been reported. With an already strained workforce facing rising call volumes and more patient presentations requiring AGPs, the development of novel devices for the protection of EMS providers is of great importance.
Based on the concept of a negative pressure room, the AerosolVE BioDome is designed to encapsulate the patient and contain aerosolized infectious particles produced during AGPs, making the cabin of an EMS vehicle safer for providers. The objective of this study was to determine the efficacy and safety of the tent in mitigating simulated infectious particle spread in varied EMS transport platforms during AGP utilization.
Fifteen healthy volunteers were enrolled and distributed amongst three EMS vehicles: a ground ambulance, an aeromedical-configured helicopter, and an aeromedical-configured jet. Sodium chloride particles were used to simulate infectious particles and particle counts were obtained in numerous locations close to the tent and around the patient compartment. Counts near the tent were compared to ambient air with and without use of AGPs (non-rebreather mask, continuous positive airway pressure [CPAP] mask, and high-flow nasal cannula [HFNC]).
For all transport platforms, with the tent fan off, the particle generator, alone and in combination with each AGP, produced particle counts inside the tent significantly higher than ambient particle counts (P < .0001). With the tent fan powered on, particle counts near the tent, where EMS providers are expected to be located, showed no significant elevation over baseline ambient particle counts during use of the particle generator alone or with any of the AGPs, across all transport platforms.
Development of devices to improve safety for EMS providers to allow for use of all available therapies to treat patients while reducing risk of communicable respiratory disease transmission is of paramount importance. The AerosolVE BioDome demonstrated efficacy in creating a negative pressure environment and workspace around the patient and provided significant filtration of simulated respiratory droplets, thus making the confined space of transport vehicles potentially safer for EMS personnel.
The coronavirus disease 2019 (COVID-19) pandemic has created challenges in maintaining the safety of prehospital providers caring for patients. Reports have shown increased rates of Emergency Medical Services (EMS) provider infection with COVID-19 after patient care exposure, especially while utilizing aerosol-generating procedures (AGPs). Given the increased risk and rising call volumes for AGP-necessitating complaints, development of novel devices for the protection of EMS clinicians is of great importance.
Drawn from the concept of the powered air purifying respirator (PAPR), the AerosolVE helmet creates a personal negative pressure space to contain aerosolized infectious particles produced by patients, making the cabin of an EMS vehicle safer for providers. The helmet was developed initially for use in hospitals and could be of significant use in the prehospital setting. The objective of this study was to determine the efficacy and safety of the helmet in mitigating simulated infectious particle spread in varied EMS transport platforms during AGP utilization.
Fifteen healthy volunteers were enrolled and distributed amongst three EMS vehicles: a ground ambulance, a medical helicopter, and a medical jet. Sodium chloride particles were used to simulate infectious particles, and particle counts were obtained in numerous locations close to the helmet and around the patient compartment. Counts near the helmet were compared to ambient air with and without use of AGPs (non-rebreather mask [NRB], continuous positive airway pressure mask [CPAP], and high-flow nasal cannula [HFNC]).
With the helmet fan off, the particle generator, alone and in combination with each AGP, produced particle counts inside the helmet significantly higher than ambient particle counts. With the fan on, there was no significant difference in particle counts around the helmet compared to baseline ambient particle counts. Particle counts at the filter exit averaged less than one despite markedly higher particle counts inside the helmet.
Given the risk to EMS providers by communicable respiratory diseases, development of devices to improve safety while still enabling use of respiratory therapies is of paramount importance. The AerosolVE helmet demonstrated efficacy in creating a negative pressure environment and provided significant filtration of simulated respiratory droplets, thus making the confined space of transport vehicles potentially safer for EMS personnel.
Gravitational waves from coalescing neutron stars encode information about nuclear matter at extreme densities, inaccessible by laboratory experiments. The late inspiral is influenced by the presence of tides, which depend on the neutron star equation of state. Neutron star mergers are expected to often produce rapidly rotating remnant neutron stars that emit gravitational waves. These will provide clues to the extremely hot post-merger environment. This signature of nuclear matter in gravitational waves contains most information in the 2–4 kHz frequency band, which is outside of the most sensitive band of current detectors. We present the design concept and science case for a Neutron Star Extreme Matter Observatory (NEMO): a gravitational-wave interferometer optimised to study nuclear physics with merging neutron stars. The concept uses high-circulating laser power, quantum squeezing, and a detector topology specifically designed to achieve the high-frequency sensitivity necessary to probe nuclear matter using gravitational waves. Above 1 kHz, the proposed strain sensitivity is comparable to full third-generation detectors at a fraction of the cost. Such sensitivity changes expected event rates for detection of post-merger remnants from approximately one per few decades with two A+ detectors to a few per year and potentially allow for the first gravitational-wave observations of supernovae, isolated neutron stars, and other exotica.
Our research group demonstrated that vitamin A restriction affected meat quality of Angus cross and Simmental steers. Therefore, the aim of this study is to highlight the genotype variations in response to dietary vitamin A levels. Commercial Angus and Simmental steers (n = 32 per breed; initial BW = 337.2 ± 5.9 kg; ~8 months of age) were fed a low-vitamin A (LVA) (1017 IU/kg DM) backgrounding diet for 95 days to reduce hepatic vitamin A stores. During finishing, steers were randomly assigned to treatments in a 2 × 2 factorial arrangement of genotype × dietary vitamin A concentration. The LVA treatment was a finishing diet with no supplemental vitamin A (723 IU vitamin A/kg DM); the control (CON) was the LVA diet plus supplementation with 2200 IU vitamin A/kg DM. Blood samples were collected at three time points throughout the study to analyze serum retinol concentration. At the completion of finishing, steers were slaughtered at a commercial abattoir. Meat characteristics assessed were intramuscular fat concentration, color, Warner-Bratzler shear force, cook loss and pH. Camera image analysis was used for determination of marbling, 12th rib back fat and longissimus muscle area (LMA). The LVA steers had lower (P < 0.001) serum retinol concentration than CON steers. The LVA treatment resulted in greater (P = 0.03) average daily gain than the CON treatment, 1.52 and 1.44 ± 0.03 kg/day, respectively; however, there was no effect of treatment on final BW, DM intake or feed efficiency. Cooking loss and yield grade were greater and LMA was smaller in LVA steers (P < 0.05). There was an interaction between breed and treatment for marbling score (P = 0.01) and percentage of carcasses grading United States Department of Agriculture (USDA) Prime (P = 0.02). For Angus steers, LVA treatment resulted in a 16% greater marbling score than CON (683 and 570 ± 40, respectively) and 27% of LVA Angus steers graded USDA Prime compared with 0% for CON. 
Conversely, there was no difference in marbling score or USDA Quality Grades between LVA and CON for Simmental steers. In conclusion, feeding a LVA diet during finishing increased marbling in Angus but not in Simmental steers. Reducing the vitamin A level of finishing diets fed to cattle with a high propensity to marble, such as Angus, has the potential to increase economically important traits such as marbling and quality grade without negatively impacting gain : feed or yield grade.
The disproportionate burden of prevalent, persistent pathogens among disadvantaged groups may contribute to socioeconomic and racial/ethnic disparities in long-term health. We assessed whether the social patterning of pathogen burden changed over 16 years in a U.S.-representative sample. Data came from 17 660 National Health and Nutrition Examination Survey participants. Pathogen burden was quantified by summing the number of positive serologies for cytomegalovirus, herpes simplex virus-1, HSV-2, human papillomavirus and Toxoplasma gondii and dividing by the number of pathogens tested, giving a percent-seropositive for each participant. We examined sex- and age-adjusted mean pathogen burdens from 1999–2014, stratified by race/ethnicity and socioeconomic status (SES; poverty-to-income ratio (PIR) and educational attainment). Those with a PIR < 1.3 had a mean pathogen burden 1.4–1.8 times that of those with a PIR > 3.5, with no change over time. Educational disparities were even greater and showed some evidence of increasing over time, with the mean pathogen burden among those with less than a high school education approximately twice that of those who completed more than high school. Non-Hispanic Black, Mexican American and other Hispanic participants had a mean pathogen burden 1.3–1.9 times that of non-Hispanic Whites. We demonstrate that socioeconomic and racial/ethnic disparities in pathogen burden have persisted across 16 years, with little evidence that the gap is closing.
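The percent-seropositive measure described above is straightforward to compute. A minimal sketch, assuming a per-participant mapping of pathogen to serology result (the function name and data layout are illustrative, not from the study):

```python
def pathogen_burden(serologies):
    """Percent-seropositive: positive serologies / pathogens tested * 100.

    serologies maps pathogen name -> True (positive), False (negative),
    or None (not tested); untested pathogens are excluded from the
    denominator, mirroring the per-participant measure described above.
    """
    tested = {p: r for p, r in serologies.items() if r is not None}
    if not tested:
        return None  # no serologies available for this participant
    return 100.0 * sum(tested.values()) / len(tested)

# A participant positive for CMV and HSV-1, tested for all five pathogens:
burden = pathogen_burden({
    "CMV": True, "HSV-1": True, "HSV-2": False,
    "HPV": False, "T. gondii": False,
})
# burden == 40.0
```

Dividing by the number of pathogens actually tested (rather than a fixed five) keeps participants with incomplete serology panels comparable on the same 0–100 scale.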
Space Infrared Telescope for Cosmology and Astrophysics (SPICA), the cryogenic infrared space telescope recently pre-selected for a ‘Phase A’ concept study as one of the three remaining candidates for the European Space Agency's (ESA) fifth medium class (M5) mission, is foreseen to include a far-infrared polarimetric imager [SPICA-POL, now called B-fields with BOlometers and Polarizers (B-BOP)], which would offer a unique opportunity to resolve major issues in our understanding of the nearby, cold magnetised Universe. This paper presents an overview of the main science drivers for B-BOP, including high dynamic range polarimetric imaging of the cold interstellar medium (ISM) in both our Milky Way and nearby galaxies. Thanks to a cooled telescope, B-BOP will deliver wide-field 100–350 $\mu$m images of linearly polarised dust emission in Stokes Q and U with a resolution, signal-to-noise ratio, and both intensity and spatial dynamic ranges comparable to those achieved by Herschel images of the cold ISM in total intensity (Stokes I). The B-BOP 200 $\mu$m images will also have a factor $\sim $30 higher resolution than Planck polarisation data. This will make B-BOP a unique tool for characterising the statistical properties of the magnetised ISM and probing the role of magnetic fields in the formation and evolution of the interstellar web of dusty molecular filaments giving birth to most stars in our Galaxy. B-BOP will also be a powerful instrument for studying the magnetism of nearby galaxies and testing Galactic dynamo models, constraining the physics of dust grain alignment, informing the problem of the interaction of cosmic rays with molecular clouds, tracing magnetic fields in the inner layers of protoplanetary disks, and monitoring accretion bursts in embedded protostars.
On August 25, 2017, Hurricane Harvey made landfall near Corpus Christi, Texas. The ensuing unprecedented flooding throughout the Texas coastal region affected millions of individuals.1 The statewide response in Texas included the sheltering of thousands of individuals at considerable distances from their homes. The Dallas area established large-scale general population sheltering as the number of evacuees to the area began to amass. Historically, the Dallas area is one familiar with “mega-sheltering,” beginning with the response to Hurricane Katrina in 2005.2 Through continued efforts and development, the Dallas area had been readying a plan for the largest general population shelter in Texas. (Disaster Med Public Health Preparedness. 2019;13:33–37)
Arterial wall thickening, stimulated by low-grade systemic inflammation, underlies many cardiovascular events. As diet is a significant moderator of systemic inflammation, the dietary inflammatory index (DII™) has recently been devised to assess the overall inflammatory potential of an individual’s diet. The primary objective of this study was to assess the association of the DII with common carotid artery–intima-media thickness (CCA–IMT) and carotid plaques. To substantiate the clinical importance of these findings we assessed the relationship of DII score with atherosclerotic vascular disease (ASVD)-related mortality, ischaemic cerebrovascular disease (CVA)-related mortality and ischaemic heart disease (IHD)-related mortality. The study was conducted in Western Australian women aged over 70 years (n 1304). Dietary data derived from a validated FFQ (completed at baseline) were used to calculate a DII score for each individual. In multivariable-adjusted models, DII scores were associated with sub-clinical atherosclerosis: a 1 sd (2·13 units) higher DII score was associated with a 0·013-mm higher mean CCA–IMT (P=0·016) and a 0·016-mm higher maximum CCA–IMT (P=0·008), measured at 36 months. No relationship was seen between DII score and carotid plaque severity. There were 269 deaths during follow-up. High DII scores were positively associated with ASVD-related death (per sd, hazard ratio (HR): 1·36; 95 % CI 1·15, 1·60), CVA-related death (per sd, HR: 1·30; 95 % CI 1·00, 1·69) and IHD-related death (per sd, HR: 1·40; 95 % CI 1·13, 1·75). These results support the hypothesis that a pro-inflammatory diet increases systemic inflammation leading to development and progression of atherosclerosis and eventual ASVD-related death.
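The per-SD hazard ratios quoted above come from rescaling a per-unit Cox log-hazard coefficient by the predictor's standard deviation. A hedged sketch of that arithmetic (the beta value below is hypothetical, chosen only to show that a plausible per-unit coefficient combined with the reported 1 sd = 2·13 units yields an HR near 1·36):

```python
import math

def hr_per_sd(beta_per_unit, sd):
    """Convert a per-unit Cox log-hazard coefficient into a per-SD hazard ratio.

    HR per SD = exp(beta * sd): log-hazards add linearly, so scaling the
    predictor by its standard deviation scales the coefficient accordingly.
    """
    return math.exp(beta_per_unit * sd)

# Hypothetical per-unit coefficient for DII; 1 SD = 2.13 units as in the study.
hr = hr_per_sd(0.144, 2.13)
# hr ≈ 1.36, i.e. roughly 36% higher hazard per SD increase in DII score
```

Reporting hazard ratios per standard deviation rather than per raw unit makes effect sizes comparable across dietary indices measured on different scales.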
Two broad aims drive weed science research: improved management and improved understanding of weed biology and ecology. In recent years, agricultural weed research addressing these two aims has effectively split into separate subdisciplines despite repeated calls for greater integration. Although some excellent work is being done, agricultural weed research has developed a very high level of repetitiveness and a preponderance of purely descriptive studies, and has failed to clearly articulate novel hypotheses linked to established bodies of ecological and evolutionary theory. In contrast, invasive plant research attracts a diverse cadre of non-weed scientists using invasions to explore broader and more integrated biological questions grounded in theory. We propose that although studies focused on weed management remain vitally important, agricultural weed research would benefit from deeper theoretical justification, a broader vision, and increased collaboration across diverse disciplines. To initiate change in this direction, we call for more emphasis on interdisciplinary training for weed scientists, and for focused workshops and working groups to develop specific areas of research and promote interactions among weed scientists and with the wider scientific community.
Herbicides are the foundation of weed control in commercial crop-production systems. However, herbicide-resistant (HR) weed populations are evolving rapidly as a natural response to selection pressure imposed by modern agricultural management activities. Mitigating the evolution of herbicide resistance depends on reducing selection through diversification of weed control techniques, minimizing the spread of resistance genes and genotypes via pollen or propagule dispersal, and eliminating additions of weed seed to the soil seedbank. Effective deployment of such a multifaceted approach will require shifting from the current concept of basing weed management on single-year economic thresholds.
Objectives: Treatment switching occurs when patients in a randomized clinical trial switch from the treatment initially assigned to them to another treatment, typically from the control to experimental treatment. This study discusses the issues this raises and possible approaches to addressing them in trials of cancer drugs.
Methods: Stakeholders from around the world were invited to a 1.5-day Workshop in Adelaide, Australia. This study attempts to capture the key points from the discussion and the perspectives of the various stakeholder groups, but is not a formal consensus statement.
Results: Treatment switching raises challenging ethical issues with arguments for and against allowing it. It is increasingly common in cancer drug trials and presents challenges for the interpretation of results by regulators, clinicians, patients, and payers. Proposals are offered for good practice in the design, management, and analysis of trials and wider development programs for cancer drugs in which treatment switching has occurred or is likely to. Recommendations are also offered for further action to improve understanding of the importance and challenges of treatment switching and to promote agreement between key stakeholders on guidelines and other steps to address these challenges.
Conclusions: The handling of treatment switching in trials is of concern to all stakeholders. On the basis of the discussions at the Adelaide International Workshop, there would appear to be common ground on approaches to addressing treatment switching in cancer trials and scope for the development of formal guidelines to inform the work of regulators, payers, industry, trial designers and other stakeholders.
To characterize meal patterns across ten European countries participating in the European Prospective Investigation into Cancer and Nutrition (EPIC) calibration study.
Cross-sectional study utilizing dietary data collected through a standardized 24 h diet recall during 1995–2000. Eleven predefined intake occasions across a 24 h period were assessed during the interview. In the present descriptive report, meal patterns were analysed in terms of daily number of intake occasions, the proportion reporting each intake occasion and the energy contributions from each intake occasion.
Twenty-seven centres across ten European countries.
Women (64 %) and men (36 %) aged 35–74 years (n 36 020).
Pronounced differences in meal patterns emerged both across centres within the same country and across different countries, with a trend for fewer intake occasions per day in Mediterranean countries compared with central and northern Europe. Differences were also found for daily energy intake provided by lunch, with 38–43 % for women and 41–45 % for men within Mediterranean countries compared with 16–27 % for women and 20–26 % for men in central and northern European countries. Likewise, a south–north gradient was found for daily energy intake from snacks, with 13–20 % (women) and 10–17 % (men) in Mediterranean countries compared with 24–34 % (women) and 23–35 % (men) in central/northern Europe.
We found distinct differences in meal patterns with marked diversity for intake frequency and lunch and snack consumption between Mediterranean and central/northern European countries. Monitoring of meal patterns across various cultures and populations could provide critical context to the research efforts to characterize relationships between dietary intake and health.
Higher fruit intake is associated with lower risk of all-cause and disease-specific mortality. However, data on individual fruits are limited, and the generalisability of these findings to the elderly remains uncertain. The objective of this study was to examine the association of apple intake with all-cause and disease-specific mortality over 15 years in a cohort of women aged over 70 years. Secondary analyses explored relationships of other fruits with mortality outcomes. Usual fruit intake was assessed in 1456 women using a FFQ. Incidence of all-cause and disease-specific mortality over 15 years was determined through the Western Australian Hospital Morbidity Data system. Cox regression was used to determine the hazard ratios (HR) for mortality. During 15 years of follow-up, 607 (41·7 %) women died from any cause. In the multivariable-adjusted analysis, the HR for all-cause mortality was 0·89 (95 % CI 0·81, 0·97) per sd (53 g/d) increase in apple intake, HR 0·80 (95 % CI 0·65, 0·98) for consumption of 5–100 g/d and HR 0·65 (95 % CI 0·48, 0·89) for consumption of >100 g/d (an apple a day), compared with apple intake of <5 g/d (P for trend=0·03). Our analysis also found that higher apple intake was associated with lower risk of cancer mortality, and that higher total fruit and banana intakes were associated with lower risk of CVD mortality (P<0·05). Our results support the view that regular apple consumption may contribute to lower risk of mortality.
Twin pairs discordant for disease may help elucidate the epigenetic mechanisms and causal environmental factors in disease development and progression. To obtain the numbers of pairs, especially monozygotic (MZ) twin pairs, necessary for in-depth studies while also allowing for replication, twin studies worldwide need to pool their resources. The Discordant Twin (DISCOTWIN) consortium was established for this goal. Here, we describe the DISCOTWIN Consortium and present an analysis of type 2 diabetes (T2D) data in nearly 35,000 twin pairs. Seven twin cohorts from Europe (Denmark, Finland, Norway, the Netherlands, Spain, Sweden, and the United Kingdom) and one from Australia investigated the rate of discordance for T2D in same-sex twin pairs aged 45 years and older. Data were available for 34,166 same-sex twin pairs, of which 13,970 were MZ, with T2D diagnosis based on self-reported diagnosis and medication use, fasting glucose and insulin measures, or medical records. The prevalence of T2D ranged from 2.6% to 12.3% across the cohorts depending on age, body mass index (BMI), and national diabetes prevalence. T2D discordance rate was lower for MZ (5.1%, range 2.9–11.2%) than for same-sex dizygotic (DZ) (8.0%, range 4.9–13.5%) pairs. Across DISCOTWIN, 720 discordant MZ pairs were identified. Except for the oldest of the Danish cohorts (mean age 79), heritability estimates based on contingency tables were moderate to high (0.47–0.77). From a meta-analysis of all data, the heritability was estimated at 72% (95% confidence interval 61–78%). This study demonstrated high T2D prevalence and high heritability for T2D liability across twin cohorts. Therefore, the number of discordant MZ pairs for T2D is limited. 
By combining national resources, the DISCOTWIN Consortium maximizes the number of discordant MZ pairs needed for in-depth genotyping, multi-omics, and phenotyping studies, which may provide unique insights into the pathways linking genes to the development of many diseases.
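Classical twin designs such as the one above estimate heritability from the contrast between MZ and DZ pair resemblance. As one illustrative approach, Falconer's formula doubles the difference between MZ and DZ twin correlations; this is a sketch of the general idea, not necessarily the contingency-table method the consortium used, and the correlation values are hypothetical:

```python
def falconer_h2(r_mz, r_dz):
    """Falconer's classical heritability estimate from twin correlations:
    h^2 = 2 * (r_MZ - r_DZ).

    MZ twins share ~100% of segregating genes and DZ twins ~50%, so
    doubling the MZ-DZ correlation gap attributes it to additive genetics.
    """
    return 2.0 * (r_mz - r_dz)

# Hypothetical tetrachoric correlations for T2D liability:
h2 = falconer_h2(0.72, 0.42)
# h2 ≈ 0.60, within the moderate-to-high range reported above
```

The same MZ-versus-DZ logic underlies the reported discordance rates: a lower MZ discordance rate (5.1% vs. 8.0% for DZ) is itself a signature of substantial genetic influence on T2D liability.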