This study aimed to systematically review the literature to synthesise and summarise evidence on whether knowledge-based planning (KBP) can improve the planning of stereotactic radiotherapy treatments.
Methods:
A systematic literature search was carried out using the Medline, Scopus and Cochrane databases to evaluate the use of KBP in stereotactic radiotherapy planning. Three hundred twenty-five potential studies were identified and screened to find 25 relevant studies.
Results:
Twenty-five studies met the inclusion criteria. Where a commercial KBP system was used, 72.7% of studies reported a quality improvement and 45.5% reported a reduction in planning time. There is evidence that, when used as a quality control tool, KBP can highlight stereotactic plans that need revision. In studies that used KBP as the starting point for radiotherapy plan optimisation, the plans generated were typically equal to or superior to those planned manually.
Conclusions:
There is evidence that KBP has the potential to improve the quality and speed of stereotactic radiotherapy planning. Further research is required to accurately quantify such systems’ quality improvements and time savings. Notably, there has been little research into their use for prostate, spinal or liver stereotactic radiotherapy, and research in these areas would be desirable. It is recommended that future studies use the ICRU 91 level 2 reporting format; blinded physician review could add a qualitative assessment of KBP system performance.
Increasing the availability of lower energy food options is a promising public health approach. However, it is unclear to what extent availability interventions may result in consumers later ‘compensating’ for reductions in energy intake caused by selecting lower energy food options, and to what extent these effects may differ based on socio-economic position (SEP). Our objective was to examine the impact of increasing the availability of lower energy meal options on immediate meal energy intake and subsequent energy intake in participants of higher v. lower SEP. In a within-subjects design, seventy-seven UK adults ordered meals from a supermarket ready meal menu with standard (30 %) and increased (70 %) availability of lower energy options. The meals were delivered to be consumed at home, with meal intake measured using the Digital Photography of Foods Method. Post-meal compensation was measured using food diaries to determine self-reported energy intake after the meal and on the next day. Participants consumed significantly less energy (196 kcal (820 kJ), 95 % CI 138, 252) from the menu with increased availability of lower energy options v. the standard availability menu (P < 0·001). There was no statistically significant evidence that this reduction in energy intake was substantially compensated for (33 % compensated, P = 0·57). The effects of increasing the availability of lower energy food items were similar in participants from lower and higher SEP. Increasing the availability of lower energy food options is likely to be an effective and equitable approach to reducing energy intake, which may contribute to improving diet and population health.
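As a rough illustration of how the reported compensation estimate can be read (a minimal Python sketch; the 65 kcal follow-up figure and the variable names are hypothetical, not values taken from the study):

# Illustrative post-meal compensation calculation (hypothetical follow-up value).
KJ_PER_KCAL = 4.184                                    # standard conversion factor
meal_reduction_kcal = 196                              # reported reduction at the manipulated meal
meal_reduction_kj = meal_reduction_kcal * KJ_PER_KCAL  # ~820 kJ, matching the abstract
extra_post_meal_kcal = 65                              # hypothetical extra intake later that day/next day
compensation_pct = 100 * extra_post_meal_kcal / meal_reduction_kcal
print(f"{meal_reduction_kj:.0f} kJ saved at the meal; {compensation_pct:.0f}% compensated afterwards")

A compensation of roughly 33 % would therefore mean that only about a third of the energy saved at the meal reappeared as later intake.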
Portion sizes of many foods have increased over time. However, the size of effect that reducing food portion sizes has on daily energy intake and body weight is less clear. We used a systematic review methodology to identify eligible articles that used an experimental design to manipulate portion size served to human participants and measured energy intake for a minimum of 1 d. Searches were conducted in September 2020 and again in October 2021. Fourteen eligible studies contributing eighty-five effects were included in the primary meta-analysis. There was a moderate-to-large reduction in daily energy intake when comparing smaller v. larger portions (Standardised Mean Difference (SMD) = –0·709 (95 % CI: –0·956, –0·461), approximately 235 kcal (983·24 kJ)). Larger reductions to portion size resulted in larger decreases in daily energy intake. There was evidence of a curvilinear relationship between portion size and daily energy intake; reductions to daily energy intake were markedly smaller when reducing portion size from very large portions. In a subset of studies that measured body weight (four studies contributing five comparisons), being served smaller v. larger portions was associated with less weight gain (0·58 kg). Reducing food portion sizes may be an effective population-level strategy to prevent weight gain.
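For readers unfamiliar with how pooled effects of this kind are obtained, the sketch below shows a generic inverse-variance pooling of standardised mean differences in Python. It is illustrative only: the per-study effect sizes are invented, and the review's actual analysis may differ (for example, by using a random-effects model).

import numpy as np

# Hypothetical per-study standardised mean differences and their variances.
smd = np.array([-0.55, -0.80, -0.62, -0.90])
var = np.array([0.04, 0.06, 0.05, 0.09])

# Inverse-variance (fixed-effect) pooling.
w = 1.0 / var
pooled = np.sum(w * smd) / np.sum(w)
se = np.sqrt(1.0 / np.sum(w))
lo, hi = pooled - 1.96 * se, pooled + 1.96 * se
print(f"Pooled SMD = {pooled:.3f}, 95% CI ({lo:.3f}, {hi:.3f})")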
From 2014 to 2020, we compiled radiocarbon ages from the lower 48 states, creating a database of more than 100,000 archaeological, geological, and paleontological ages that will be freely available to researchers through the Canadian Archaeological Radiocarbon Database. Here, we discuss the process used to compile ages, general characteristics of the database, and lessons learned from this exercise in “big data” compilation.
As the US faced its lowest levels of reported trust in government, the COVID-19 crisis revealed the essential service that various federal agencies provide as sources of information. This Element explores variations in trust across various levels of government and government agencies based on a nationally representative survey conducted in March of 2020. First, it examines trust in agencies including the Department of Health and Human Services, state health departments, and local health care providers. This includes variation across key characteristics including party identification, age, and race. Second, the Element explores the evolution of trust in health-related organizations throughout 2020 as the pandemic continued. The Element concludes with a discussion of the implications for agency-specific assessments of trust and their importance as we address historically low levels of trust in government. This title is also available as Open Access on Cambridge Core.
Perinatal light exposure influences health and behaviour in adulthood. Season of birth is associated with psychiatric, allergic, cardiovascular and metabolic problems. It has been proposed that early-life environmental light disrupts the development of biological rhythms which, in turn, influence later-life health. However, the mechanisms linking perinatal seasonal light to later-life biological rhythms and health in humans are unknown. In this study, we investigated the association between season of birth and epigenome-wide DNA methylation in two postmortem human brain regions (16 hypothalamus, 14 temporal cortex). We did not find statistically significant differences at the whole epigenome level, either because we lacked statistical power or because no association exists. However, when we examined the 24 CpG sites with the highest significance or greatest differential methylation, we identified regions which may be associated with circadian rhythm entrainment, cholinergic neurotransmission and neural development. Among the core clock genes, we identified that the hypothalamic Neuronal PAS Domain Protein 2 (NPAS2) gene has hypermethylated regions in long photoperiod-born individuals. In addition, we found nominal associations between season of birth and genes linked to chronotype and narcolepsy. The season of birth-related brain DNA methylation profile differed from a previously reported blood methylation profile, suggesting a tissue-specific mechanism of perinatal light programming. Overall, we are the first to analyse the relationship between season of birth and human brain DNA methylation. Further studies with larger sample sizes are required to confirm an imprinting effect of perinatal light on the circadian clock.
A profound characteristic of field cancerization is alterations in chromatin packing. This study aimed to quantify these alterations using electron microscopy image analysis of buccal mucosa cells from laryngeal, esophageal, and lung cancer patients. Analysis was performed on normal-appearing mucosa, believed to lie within the cancerization field, and not on the tumor itself. Large-scale electron microscopy (nanotomy) images were acquired from cancer patients and controls. Within the nuclei, the chromatin packing of euchromatin and heterochromatin was characterized. Furthermore, the chromatin organization was quantified through chromatin packing density scaling. A significant difference was found between the cancer and control groups in the chromatin packing density scaling parameter for length scales below the optical diffraction limit (200 nm) in both the euchromatin (p = 0.002) and the heterochromatin (p = 0.006). The chromatin packing scaling analysis also indicated that the chromatin organization of cancer patients deviated significantly from that of the control group. These findings might enable novel strategies for cancer risk stratification and diagnosis with high sensitivity. This could aid clinicians in personalizing screening strategies for high-risk patients and follow-up strategies for treated cancer patients.
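The packing density scaling parameter referred to above is, broadly speaking, the exponent of a power-law relationship between chromatin packing density and length scale. The sketch below shows one generic way such an exponent could be estimated by a log-log fit; the data are synthetic and the procedure is a simplification, not the study's actual image-analysis pipeline.

import numpy as np

# Hypothetical mean chromatin packing density measured at several length scales (nm).
length_scale_nm = np.array([20, 40, 60, 100, 150, 200])
packing_density = np.array([0.85, 0.62, 0.51, 0.39, 0.31, 0.27])  # arbitrary units

# Fit density ~ length_scale**(-alpha) by linear regression in log-log space.
slope, intercept = np.polyfit(np.log(length_scale_nm), np.log(packing_density), 1)
print(f"Estimated packing scaling exponent: {-slope:.2f}")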
COVID-19-related morbidity and mortality have disproportionately affected communities of colour across the United States. Originally dubbed the ‘great equalizer’, many individuals believed that COVID-19 affected everyone equally (Gupta, 2020). However, COVID-19 has exposed ethnic and racial differences in morbidity and mortality (Yaya et al, 2020). Early data showed that African Americans, Latinos and Native Americans were more likely to grow ill and die from COVID-19 than White Americans (Bassett et al, 2020). As data continues to emerge, it is evident that communities of colour bear a disproportionate burden of COVID-19. Thus, relevant COVID-19 data must be viewed as a foundation for conducting health disparities research.
Health disparities research identifies groups that receive inequitable access to care, treatment and resources (Chan et al, 2018). This research is necessary because it offers an in-depth understanding of the demographic framework (for example, race, ethnicity, gender, age, socioeconomic status, marital status and ability status) for addressing COVID-19 (Chan et al, 2018). Zastrow and Kirst-Ashman (2010) posited that academic researchers should encompass cultural competence and cultural sensitivity when investigating the behaviour and social environment of specific groups. See (2007) suggested that Eurocentric research may generate a misunderstanding of the issues that communities of colour face in light of COVID-19. Therefore, establishing multicultural and multidisciplinary research teams with an inherent understanding of health disparities is paramount to understanding communities of colour.
Since the onset of the COVID-19 global pandemic, academic researchers have been forced to change their approaches to research and team building (Kupferschmidt, 2020). These rapid changes were driven by the infectivity of COVID-19 and the need to socially distance and isolate. Fortunately, technology such as Cisco WebEx enabled a newly created diverse research team to work without geographical constraints to facilitate COVID-19 research. The purpose of this chapter is to describe how a diverse research team worked together to conduct meaningful research regarding the impact of stress and coping in the age of COVID-19. Colleagues from the University of Nevada, Las Vegas (UNLV) and the University of Wisconsin, Madison led the development of a social media-disseminated research project.
Can multicellular life be distinguished from single cellular life on an exoplanet? We hypothesize that abundant upright photosynthetic multicellular life (trees) will cast shadows at high sun angles that will distinguish them from single cellular life and test this using Earth as an exoplanet. We first test the concept using unmanned aerial vehicles at a replica moon-landing site near Flagstaff, Arizona and show trees have both a distinctive reflectance signature (red edge) and geometric signature (shadows at high sun angles) that can distinguish them from replica moon craters. Next, we calculate reflectance signatures for Earth at several phase angles with POLDER (Polarization and Directionality of Earth's reflectance) satellite directional reflectance measurements and then reduce Earth to a single pixel. We compare Earth to other planetary bodies (Mars, the Moon, Venus and Uranus) and hypothesize that Earth's directional reflectance will be between strongly backscattering rocky bodies with no weathering (like Mars and the Moon) and cloudy bodies with more isotropic scattering (like Venus and Uranus). Our modelling results put Earth in line with strongly backscattering Mars, while our empirical results put Earth in line with more isotropic scattering Venus. We identify potential weaknesses in both the modelled and empirical results and suggest additional steps to determine whether this technique could distinguish upright multicellular life on exoplanets.
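One step mentioned above, reducing Earth to a single pixel, amounts to collapsing a spatially resolved reflectance map into one disk-integrated value. The sketch below illustrates the idea with a simple area-weighted average in Python; the weighting scheme and numbers are assumptions for illustration, not the POLDER processing actually used.

import numpy as np

# Hypothetical per-pixel directional reflectances and the surface area each pixel represents.
reflectance = np.array([0.12, 0.35, 0.28, 0.08, 0.41])          # unitless
pixel_area_km2 = np.array([2.1e5, 1.8e5, 2.0e5, 1.9e5, 1.7e5])

# "Single pixel" value: area-weighted mean over the visible disk.
disk_integrated = np.average(reflectance, weights=pixel_area_km2)
print(f"Disk-integrated reflectance: {disk_integrated:.3f}")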
We discuss the factors influencing the relationship between government policy-makers and scientists and how they affect the use of science in policy. We highlight issues related to context, values, culture, timeframes, communication and interpersonal relationships, providing insights from policy-makers and scientists. A spectrum of working strategies is given with examples of practical mechanisms that improve the effective use of science in policy. The shared governance model is a relatively mature approach with the potential to overcome many of the barriers discussed. At its core, shared governance, or co-production, invites policy-makers and scientists to develop and manage research priorities collaboratively. We explore the primary features of a successful shared governance arrangement, exemplified by the collaborative working model between the Australian Government Department of Agriculture and the Centre of Excellence for Biosecurity Risk Analysis. We conclude by outlining the advantages and disadvantages of the co-production of research priorities by scientists and policy-makers and present the learnings from its implementation in the biosecurity sector in Australia.
This paper follows up on a previous study on this topic and outlines the second part of a wider, two-part study on the information seeking behaviour (ISB) of law students. Exploratory work was outlined in a previous publication, where we found that although mobile technologies offered benefits to law students seeking information for their academic studies, law librarians were concerned that the use of electronic resources via both non-mobile and mobile interfaces resulted in a loss of the skills required for information retrieval, owing to the increasing capabilities of electronic resources’ search interfaces. To gain more insight into how law students were using mobile information resources, and to better understand their advantages and disadvantages, we extended our study to a wider cohort and employed additional research techniques, including a focus group. This final phase of our study was conducted between 2015 and 2017. Here our cohort included another set of law librarians (13) and a further 54 law students. We expanded our research tools to include two thematic questionnaires and a focus group exercise. We found that law librarians were concerned about the intangibility of digital formats. Law students remained indifferent to this aspect and valued the speed, multi-tasking and near-ubiquitous accessibility that electronic format use via mobile technologies provided. These learnings and more, with conclusions, are reported in the course of this paper written by Zaki Abbas, Andrew MacFarlane and Lyn Robinson.
Apolipoprotein E (APOE) E4 is the main genetic risk factor for Alzheimer’s disease (AD). Given this consistent association, there is interest in whether E4 also influences the risk of other neurodegenerative diseases. Further, there is an ongoing search for other genetic biomarkers contributing to these phenotypes, such as microtubule-associated protein tau (MAPT) haplotypes. Here, participants from the Ontario Neurodegenerative Disease Research Initiative were genotyped to investigate whether the APOE E4 allele or MAPT H1 haplotype is associated with five neurodegenerative diseases: (1) AD and mild cognitive impairment (MCI), (2) amyotrophic lateral sclerosis, (3) frontotemporal dementia (FTD), (4) Parkinson’s disease, and (5) vascular cognitive impairment.
Methods:
APOE allele and MAPT haplotype calls were defined for each participant, and logistic regression analyses were performed to identify associations with the presentation of the neurodegenerative diseases.
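As a generic illustration of the kind of model described (not the authors' code; the allele-dosage coding, covariates and simulated data below are assumptions), a case-control logistic regression on APOE E4 dosage might look like this in Python:

import numpy as np
import statsmodels.api as sm

# Hypothetical data: disease status (1 = case), APOE E4 allele dosage (0/1/2) and age.
rng = np.random.default_rng(0)
e4_dosage = rng.integers(0, 3, size=200)
age = rng.normal(70, 8, size=200)
status = rng.binomial(1, 1 / (1 + np.exp(-(0.8 * e4_dosage - 0.5))))

# Logistic regression of case status on E4 dosage, adjusting for age.
X = sm.add_constant(np.column_stack([e4_dosage, age]))
fit = sm.Logit(status, X).fit(disp=False)
print(np.exp(fit.params[1:]))  # per-allele and per-year odds ratios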
Results:
Our work confirmed the association of the E4 allele with a dose-dependent increased presentation of AD, and an association between the E4 allele alone and MCI; however, the other four diseases were not associated with E4. Further, the APOE E2 allele was associated with decreased presentation of both AD and MCI. No associations were identified between MAPT haplotype and the neurodegenerative disease cohorts, but following subtyping of the FTD cohort, the H1 haplotype was significantly associated with progressive supranuclear palsy.
Conclusion:
This is the first study to concurrently analyze the association of APOE isoforms and MAPT haplotypes with five neurodegenerative diseases using consistent enrollment criteria and broad phenotypic analysis.
In response to increasing numbers of older people in general hospitals who have cognitive impairment such as dementia and delirium, many hospitals have developed education and training programmes to prepare staff for this area of clinical practice.
Aims
To review the evidence on educational interventions on hospital care for older people with cognitive impairment.
Method
A mixed methods systematic review and narrative synthesis was undertaken. The following electronic databases were searched: Medline, Embase, CINAHL, PsycINFO, EBM Reviews, ASSIA and Scopus, as well as Health Management Information Consortium (HMIC), ProQuest, PubMed and SCIE: Social Care Online. Initial searches were run in August 2014 (update search September 2016). Titles and abstracts of retrieved studies were screened independently. The full texts of eligible studies were then independently assessed by two review team members. All included studies were assessed using a standard quality appraisal tool.
Results
Eight studies relating to delirium, six on dementia and two on delirium and dementia were included, each testing the use of a different educational intervention. Overall, the quality of the studies was low. In relation to delirium, all studies reported a significant increase in participants' knowledge immediately post-intervention. Two of the dementia studies reported an increase in dementia knowledge and dementia confidence immediately post-intervention.
Conclusions
The variety of outcomes measured makes it difficult to summarise the findings. Although studies found increases in staff knowledge, there is insufficient evidence to conclude that educational interventions for staff lead to improved patient outcomes.
The objective of this WSSA Weed Loss Committee report is to provide quantitative data on the potential yield loss in sugar beet due to weed interference from the major sugar beet growing areas of the United States and Canada. Researchers and extension specialists who conducted research on weed control in sugar beet in the United States and Canada provided quantitative data on sugar beet yield loss due to weed interference in their regions. Specifically, data were requested from weed control studies in sugar beet from up to 10 individual studies per calendar year over a 15-yr period between 2002 and 2017. Data collected indicated that if weeds are left uncontrolled under optimal agronomic practices, growers in Idaho, Michigan, Minnesota, Montana, Nebraska, North Dakota, Ontario, Oregon, and Wyoming would potentially lose an average of 79%, 61%, 66%, 68%, 63%, 75%, 83%, 78%, and 77% of the sugar beet yield. The corresponding monetary loss would be approximately US$234, US$122, US$369, US$43, US$40, US$211, US$12, US$14, and US$32 million, respectively. The average yield loss due to weed interference for the primary sugar beet growing areas of North America was estimated to be 70%. Thus, if weeds are not controlled, growers in the United States would lose approximately 22.4 million tonnes of sugar beet yield valued at approximately US$1.25 billion, and growers in Canada would lose approximately 0.5 million tonnes of sugar beet yield valued at approximately US$25 million. The high return on investment in weed management highlights the importance of continued weed science research for sustaining high crop yield and profitability of sugar beet production in North America.
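A back-of-envelope check of the headline US figures (values taken from the paragraph above; the per-tonne value is merely implied by those figures, not reported directly):

# Implied average value per tonne of sugar beet from the reported US totals.
us_loss_tonnes = 22.4e6      # potential yield loss, tonnes
us_loss_value_usd = 1.25e9   # value of that loss, US$
print(f"Implied value: ~US${us_loss_value_usd / us_loss_tonnes:.0f} per tonne")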
Increased use of dicamba and/or glyphosate in dicamba/glyphosate-tolerant soybean might affect many sensitive crops, including potato. The objective of this study was to determine the growth and yield of ‘Russet Burbank’ potato grown from seed tubers (generation 2) from mother plants (generation 1) treated with dicamba (4, 20, and 99 g ae ha−1), glyphosate (8, 40, and 197 g ae ha−1), or a combination of dicamba and glyphosate during tuber initiation. Generation 2 tubers were planted near Oakes and Inkster, ND, in 2016 and 2017, at the same research farm where the generation 1 tubers were grown the previous year. Treatment with 99 g ha−1 dicamba, 197 g ha−1 glyphosate, or 99 g ha−1 dicamba + 197 g ha−1 glyphosate reduced emergence of generation 2 plants by up to 84%, 86%, and 87%, respectively, at 5 wk after planting. Total tuber yield of generation 2 was reduced by up to 67%, 55%, and 68% when 99 g ha−1 dicamba, 197 g ha−1 glyphosate, or 99 g ha−1 dicamba + 197 g ha−1 glyphosate, respectively, was applied to generation 1 plants. In each site-year, 197 g ha−1 glyphosate reduced total yield and marketable yield, while 99 g ha−1 dicamba reduced total yield and marketable yield in some site-years. This study confirms that exposure of potato grown for seed tubers to glyphosate and dicamba can negatively affect the growth and yield potential of the subsequently grown daughter generation.
Increasing longevity and the strain on state and occupational pensions have brought into question long-held assumptions about the age of retirement, and raised the prospect of a workplace populated by ageing workers. In the United Kingdom the default retirement age has gone, incremental increases in state pension age are being implemented and ageism has been added to workplace anti-discrimination laws. These changes are yet to bring about the anticipated transformation in workplace demographics, but it is coming, making it timely to ask if the workplace is ready for the ageing worker and how the extension of working life will be managed. We report findings from qualitative case studies of five large organisations located in the United Kingdom. Interviews and focus groups were conducted with employees, line managers, occupational health staff and human resources managers. Our findings reveal a high degree of uncertainty and ambivalence among workers and managers regarding the desirability and feasibility of extending working life; wide variations in how older workers are managed within workplaces; a gap between policies and practices; and evidence that while casualisation might be experienced negatively by younger workers, it may be viewed positively by financially secure older workers seeking flexibility. We conclude with a discussion of the challenges facing employers and policy makers in making the modern workplace fit for the ageing worker.