To test the effectiveness of a social network intervention (SNI) to improve children’s healthy drinking behaviors.
A three-arm cluster randomized controlled trial design was used. In the SNI, a subset of children was selected and trained as ‘influence agents’ to promote water consumption—as an alternative to sugar-sweetened beverages (SSBs)—among their peers. In the active control condition, all children were simultaneously exposed to the benefits of water consumption. The control condition received no intervention.
11 schools in the Netherlands.
451 children (Mage = 10.74 years, SDage = 0.97; 50.8% girls).
Structural path models showed that children exposed to the SNI consumed 0.20 fewer SSBs per day than those in the control condition (β = 0.25, p = .035). There was a trend for children exposed to the SNI to consume 0.17 fewer SSBs per day than those in the active control condition (β = 0.20, p = .061). No differences were found between conditions for water consumption. However, the moderation effects of descriptive norms (β = -0.12, p = .028) and injunctive norms (β = 0.11 to 0.14, both p = .050) indicated that norms were more strongly linked to water consumption in the SNI condition than in the active control and control conditions.
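For readers unfamiliar with moderation analysis, the norm effects above correspond to condition × norms interaction terms. Below is a minimal sketch of how such a test might be specified in Python with statsmodels; the dataset and variable names are hypothetical, not the authors' actual structural path model.

```python
# Minimal sketch of a moderation (interaction) test, assuming one row per
# child. File and variable names are hypothetical, not the authors' model.
import pandas as pd
import statsmodels.formula.api as smf

df = pd.read_csv("drinking_behaviour.csv")  # hypothetical dataset

# Water consumption regressed on condition, descriptive norms, and their
# interaction; a significant interaction coefficient indicates that the
# norm-consumption link differs across conditions (moderation).
model = smf.ols("water_consumption ~ C(condition) * descriptive_norms",
                data=df).fit()
print(model.summary())
```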
These findings suggest that an SNI promoting healthy drinking behaviors may prevent children from consuming more SSBs. Moreover, for water consumption, the prevailing social norms in the context appear to moderate the effectiveness of the SNI.
Potential effectiveness of harvest weed seed control (HWSC) systems depends upon seed shatter of the target weed species at crop maturity, enabling its collection and processing at crop harvest. However, seed retention is likely influenced by agroecological and environmental factors. In 2016 and 2017, we assessed seed shatter phenology in thirteen economically important broadleaf weed species in soybean [Glycine max (L.) Merr.] from crop physiological maturity to four weeks after physiological maturity at multiple sites spread across fourteen states in the southern, northern, and mid-Atlantic U.S. Weeds at southern latitudes retained greater proportions of their seeds, and shatter rates increased at northern latitudes. Seed shatter in Amaranthus species was low (0 to 2%), whereas shatter in common ragweed (Ambrosia artemisiifolia L.) varied widely (2 to 90%) over the weeks following soybean physiological maturity. Overall, the broadleaf species studied shattered less than ten percent of their seeds by soybean harvest. Our results suggest that some of the broadleaf species with greater seed retention rates in the weeks following soybean physiological maturity may be good candidates for HWSC.
Hyperprolific sows rear more piglets than they have teats, and to accommodate this, milk replacers are often offered as a supplement. Milk replacers are based on bovine milk, yet components of vegetable origin are often added. This may reduce growth, but could also accelerate maturational changes. Therefore, we investigated the effect of feeding piglets a milk replacer with gradually increasing levels of wheat flour on growth, gut enzyme activity and immune function, compared to a diet based entirely on bovine milk. The hypothesis tested was that adding a starch component (wheat flour) induces maturation of the mucosa, as measured by higher digestive activity and improved integrity and immunity of the small intestine (SI). To test this hypothesis, piglets were removed from the sow at day 3 and fed either a pure milk replacer diet (MILK) or, from day 11, a milk replacer diet with increasing levels of wheat (WHEAT). The WHEAT piglets had increased enzyme activity of maltase and sucrase in the proximal part of the SI compared with the MILK group. There were no differences in gut morphology, histopathology or gene expression between the groups. In conclusion, the pigs given a milk replacer with added wheat displayed immunological and gut mucosal enzyme maturational changes, indicative of adaptation toward a vegetable-based diet. This was not associated with any clinical complications, and future studies are needed to show whether this could improve responses in the subsequent weaning process.
Seed shatter is an important weediness trait on which the efficacy of harvest weed seed control (HWSC) depends. The level of seed shatter in a species is likely influenced by agroecological and environmental factors. In 2016 and 2017, we assessed seed shatter of eight economically important grass weed species in soybean [Glycine max (L.) Merr.] from crop physiological maturity to four weeks after maturity at multiple sites spread across eleven states in the southern, northern, and mid-Atlantic U.S. From soybean maturity to four weeks after maturity, cumulative percent seed shatter was lowest in the southern U.S. regions and increased at sites farther north. At soybean maturity, percent seed shatter ranged from 1 to 70%. That range had shifted to 5 to 100% (mean: 42%) by 25 days after soybean maturity. There were considerable differences in seed shatter onset and rate of progression between sites and years in some species that could impact their susceptibility to HWSC. Our results suggest that many summer annual grass species are likely not ideal candidates for HWSC, although HWSC could substantially reduce their seed output during certain years.
There are sparse data on the outcomes of endoscopic stapling of pharyngeal pouches. The Mersey ENT Trainee Collaborative compared regional practice against published benchmarks.
A 10-year retrospective analysis of endoscopic pharyngeal pouch surgery was conducted and practice was assessed against eight standards. Comparisons were made between results from the tertiary centre and other sites.
A total of 225 procedures were performed (range, 1.2–9.2 cases per centre per year). All centres achieved 90 per cent resumption of oral intake within 2 days and hospital stays of less than 2 days. Primary success (84 per cent; endoscopic stapling was abandoned in 16 per cent of cases), symptom resolution (83 per cent) and recurrence rates (13 per cent) failed to meet the standard across the non-tertiary centres.
Endoscopic pharyngeal pouch stapling is a procedure with low mortality and a brief in-patient stay. There was significant variance in outcomes across the region. This raises the question of whether this service should be centralised and become the preserve of either tertiary centres or sub-specialist practitioners.
SHEA endorses adhering to the recommendations by the CDC and ACIP for immunizations of all children and adults. All persons providing clinical care should be familiar with these recommendations, routinely assess the immunization compliance of their patients, and strongly recommend all routine immunizations to patients. All healthcare personnel (HCP) should be immunized against vaccine-preventable diseases as recommended by the CDC/ACIP (unless immunity is demonstrated by another recommended method). SHEA endorses the policy that immunization should be a condition of employment or functioning (students, contract workers, volunteers, etc.) at a healthcare facility. Only recognized medical contraindications should be accepted as reasons for not receiving recommended immunizations.
Establishment of alfalfa by interseeding into corn planted for silage can enhance crop productivity, but weed management is a challenge to adoption. Although a glyphosate-based herbicide program could be a simple and effective approach, concerns about herbicide resistance and limitations in available alfalfa varieties exist. Field experiments were conducted to compare the efficacy and selectivity of preemergence (PRE), postemergence (POST) and PRE followed by POST herbicide programs to a glyphosate-only strategy when interseeding alfalfa into corn. Experiment 1 compared PRE applications of acetochlor, mesotrione, S-metolachlor, metribuzin, and flumetsulam, and found that both rates of acetochlor and metribuzin, and S-metolachlor at 1.1 kg ha-1, were the most effective and selective PRE herbicides 4 weeks after treatment (WAT), but each resulted in greater overall weed cover than glyphosate by 8 WAT. Experiment 2 evaluated POST applications at early and late timings of bentazon, bromoxynil, 2,4-DB, and mesotrione. Several POST herbicides exhibited similar effectiveness and selectivity to glyphosate, including early applications of bromoxynil (0.14 kg ha-1) and 2,4-DB (0.84 or 1.68 kg ha-1), as well as late applications of bromoxynil (0.42 kg ha-1), 2,4-DB (0.84 kg ha-1) and mesotrione (0.05 or 0.11 kg ha-1). A third experiment compared applications of acetochlor PRE, bromoxynil POST, and the combination of acetochlor PRE with bromoxynil POST. All treatments were effective and safe for use in this interseeded system, although interseeded alfalfa provided 65-70% weed suppression in corn planted for silage without any herbicide. Herbicide treatments had no observable impacts on corn and alfalfa yields, so weed management was likely of limited economic importance in these experiments. However, weed competitiveness can vary based on several factors, including weed species, density, and site-specific conditions, so further investigation under different environments is needed.
Antarctica's ice shelves modulate the grounded ice flow, and weakening of ice shelves due to climate forcing will decrease their ‘buttressing’ effect, causing a response in the grounded ice. While the processes governing ice-shelf weakening are complex, uncertainties in the response of the grounded ice sheet are also difficult to assess. The Antarctic BUttressing Model Intercomparison Project (ABUMIP) compares ice-sheet model responses to a decrease in buttressing by investigating the ‘end-member’ scenario of total and sustained loss of ice shelves. Although unrealistic, this scenario enables gauging the sensitivity of an ensemble of 15 ice-sheet models to a total loss of buttressing, hence exhibiting the full potential of marine ice-sheet instability. All models predict that this scenario leads to multi-metre (1–12 m) sea-level rise over 500 years from present day. West Antarctic ice sheet collapse alone leads to a 1.91–5.08 m sea-level rise due to the marine ice-sheet instability. Mass loss rates are a strong function of the sliding/friction law, with plastic laws causing a further destabilization of the Aurora and Wilkes Subglacial Basins, East Antarctica. Improvements to marine ice-sheet models have greatly reduced variability between modelled ice-sheet responses to extreme ice-shelf loss, e.g. compared to the SeaRISE assessments.
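As background on the sliding/friction-law dependence noted above, a sketch of the two end-members (notation ours, not ABUMIP's): a Weertman-type power law relates basal shear stress $\tau_b$ to sliding velocity $u_b$, whereas a plastic law caps basal drag at a yield stress $\tau_c$:

$$\tau_b = C\,|u_b|^{1/m - 1}\,u_b \quad (\text{power law, commonly } m \approx 3), \qquad \tau_b = \tau_c\,\frac{u_b}{|u_b|} \quad (\text{plastic}).$$

Under a plastic law, basal resistance cannot rise as flow accelerates, so perturbations at the margin propagate farther inland, which is consistent with the stronger destabilization reported for the Aurora and Wilkes Subglacial Basins.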
Few studies have examined burnout in psychosocial oncology clinicians. The aim of this systematic review was to summarize what is known about the prevalence and severity of burnout in psychosocial clinicians who work in oncology settings and the factors believed to contribute to or protect against it.
Articles on burnout (including compassion fatigue and secondary trauma) in psychosocial oncology clinicians were identified by searching PubMed/MEDLINE, EMBASE, PsycINFO, the Cumulative Index to Nursing and Allied Health Literature, and the Web of Science Core Collection.
Thirty-eight articles were reviewed at the full-text level, and of those, nine met study inclusion criteria. All were published between 2004 and 2018 and included data from 678 psychosocial clinicians. Quality assessment revealed relatively low risk of bias and high methodological quality. Study composition and sample size varied greatly, and the majority of clinicians were aged between 40 and 59 years. Across studies, 10 different measures were used to assess burnout, secondary traumatic stress, and compassion fatigue, in addition to factors that might impact burnout, including work engagement, meaning, and moral distress. When compared with other medical professionals, psychosocial oncology clinicians endorsed lower levels of burnout.
Significance of results
This systematic review suggests that psychosocial clinicians are not at increased risk of burnout compared with other health care professionals working in oncology or in mental health. Although the data are quite limited, several factors appear to be associated with less burnout in psychosocial clinicians, including exposure to patient recovery, discussing traumas, less moral distress, and finding meaning in their work. More research using standardized measures of burnout with larger samples of clinicians is needed to examine both prevalence rates and how the experience of burnout changes over time. By virtue of their training, psychosocial clinicians are well placed to support each other and their nursing and medical colleagues.
Clinicians are consistently presented with the arduous task of characterizing, identifying, classifying, and evaluating response to intervention when treating or examining a broad array of patient populations. The primary aim of this chapter is to outline and define wellness among patients living with chronic medical conditions (PLW-CMC). Operationally, a chronic medical condition is one requiring ongoing management and treatment over an extended period of time, spanning a broad constellation of conditions including heart disease, stroke, cancer, chronic respiratory diseases, infectious diseases, metabolic/endocrine disorders, genetic disorders, and disorders resulting in disability/impairment. The number of persons living with one or more chronic medical conditions continues to increase, both nationally and internationally. Thus, literature on interventions that optimize a patient's quality of life (QOL) is increasingly pertinent, as health status is known to be associated with an individual's perception or appraisal of wellness, life satisfaction, happiness, and overall well-being.
Reward Deficiency Syndrome (RDS) is an umbrella term for all drug and nondrug addictive behaviors attributable to a dopamine deficiency, “hypodopaminergia.” There is an opioid-overdose epidemic in the USA, which may result in or worsen RDS. A paradigm shift is needed to combat a system that is not working. This shift involves the recognition of dopamine homeostasis as the ultimate goal of RDS treatment, via precision, genetically guided KB220 variants, an approach called Precision Behavioral Management (PBM). Recognition of RDS as an endophenotype and an umbrella term in the future DSM-6, following the Research Domain Criteria (RDoC), would assist in shifting this paradigm.
Prolonged survival of severe acute respiratory syndrome coronavirus 2 (SARS-CoV-2) on environmental surfaces and personal protective equipment may lead to these surfaces transmitting this pathogen to others. We sought to determine the effectiveness of a pulsed-xenon ultraviolet (PX-UV) disinfection system in reducing the load of SARS-CoV-2 on hard surfaces and N95 respirators.
Chamber slides and N95 respirator material were directly inoculated with SARS-CoV-2 and were exposed to different durations of PX-UV.
For hard surfaces, disinfection for 1, 2, and 5 minutes resulted in 3.53 log10, >4.54 log10, and >4.12 log10 reductions in viral load, respectively. For N95 respirators, disinfection for 5 minutes resulted in >4.79 log10 reduction in viral load. PX-UV significantly reduced SARS-CoV-2 on hard surfaces and N95 respirators.
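For context on the units (a worked conversion, not additional data): a log10 reduction compares viral load before and after disinfection,

$$\text{log}_{10}\ \text{reduction} = \log_{10}\!\left(\frac{N_{\text{before}}}{N_{\text{after}}}\right),$$

so the >4.79 log10 figure for N95 respirators corresponds to a factor of $10^{4.79} \approx 6.2 \times 10^{4}$, i.e. a >99.998% reduction in recoverable virus.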
With the potential to rapidly disinfect environmental surfaces and N95 respirators, PX-UV devices are a promising technology for reducing environmental and personal protective equipment bioburden and enhancing both healthcare worker and patient safety by reducing the risk of exposure to SARS-CoV-2.
Radiocarbon (14C) ages cannot provide absolutely dated chronologies for archaeological or paleoenvironmental studies directly but must be converted to calendar age equivalents using a calibration curve compensating for fluctuations in atmospheric 14C concentration. Although calibration curves are constructed from independently dated archives, they invariably require revision as new data become available and our understanding of the Earth system improves. In this volume, the international 14C calibration curves for both the Northern and Southern Hemispheres, as well as for the ocean surface layer, have been updated to include a wealth of new data and extended to 55,000 cal BP. Based on tree rings, IntCal20 now extends as a fully atmospheric record to ca. 13,900 cal BP. For the older part of the timescale, IntCal20 comprises statistically integrated evidence from floating tree-ring chronologies, lacustrine and marine sediments, speleothems, and corals. We utilized improved evaluation of the timescales and location-variable 14C offsets from the atmosphere (reservoir age, dead carbon fraction) for each dataset. New statistical methods have refined the structure of the calibration curves while maintaining a robust treatment of uncertainties in the 14C ages, the calendar ages and other corrections. The inclusion of modeled marine reservoir ages derived from a three-dimensional ocean circulation model has allowed us to apply more appropriate reservoir corrections to the marine 14C data, rather than the previous use of constant regional offsets from the atmosphere. Here we provide an overview of the new and revised datasets and the associated methods used for the construction of the IntCal20 curve, and explore potential regional offsets for tree-ring data. We discuss the main differences with respect to the previous calibration curve, IntCal13, and some of the implications for archaeology and geosciences, ranging from the recent past to the time of the extinction of the Neanderthals.
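As a concrete illustration of what a calibration curve is for (a minimal numerical sketch with a synthetic curve, not IntCal20 data or the IntCal construction method): given a measured 14C age and its error, the calendar-age likelihood is obtained by comparing the measurement against the curve at every calendar age.

```python
# Minimal sketch of single-sample 14C calibration against a calibration
# curve, using a synthetic curve rather than real IntCal20 data.
import numpy as np

# Synthetic calibration curve: calendar age (cal BP) -> 14C age and 1-sigma.
cal_bp = np.arange(0, 5000)                             # calendar ages, cal BP
curve_c14 = cal_bp * 0.95 + 30 * np.sin(cal_bp / 80.0)  # toy 14C ages
curve_sig = np.full_like(cal_bp, 15.0, dtype=float)     # toy curve errors

def calibrate(c14_age, c14_err):
    """Posterior density over calendar age for one 14C measurement,
    assuming a flat prior (standard probabilistic calibration)."""
    var = c14_err**2 + curve_sig**2
    like = np.exp(-0.5 * (c14_age - curve_c14)**2 / var) / np.sqrt(var)
    return like / like.sum()  # normalise over the calendar grid

post = calibrate(c14_age=2450.0, c14_err=25.0)
print("Posterior mode:", cal_bp[post.argmax()], "cal BP")
```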
Although Britain's electrification started with considerable technological and market advantages, it proceeded remarkably slowly and hesitantly. Using share-price data, this study investigates the conventional explanations for this disappointing outcome: notably, perverse regulation and competition from entrenched gas-light providers. It finds that these oft-cited factors had an imperceptible impact on the course of the British electrical industry's turbulent market launch in 1882. However, we show that, owing to the fledgling electrical industry's need for incessant experimentation, short-sighted, self-serving decisions by the management of the early British industry's most prominent firm squandered a well-funded start, with long-lasting adverse consequences.
Cognitive deficits affect a significant proportion of patients with bipolar disorder (BD). Problems with sustained attention have been found independent of mood state, and their causes are unclear. We aimed to investigate whether physical parameters such as activity levels, sleep, and body mass index (BMI) may be contributing factors.
Forty-six patients with BD and 42 controls completed a battery of neuropsychological tests and wore, for 21 days, a triaxial accelerometer that collected information on physical activity, sleep, and circadian rhythm. Ex-Gaussian analyses were used to characterise reaction time distributions. We used hierarchical regression analyses to examine whether physical activity, BMI, circadian rhythm, and sleep predicted variance in performance on the cognitive tasks.
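The ex-Gaussian distribution referred to here is the convolution of a Gaussian (μ, σ) with an exponential tail (τ), a standard model for reaction times. A minimal sketch of fitting it with SciPy follows; the data are synthetic, not the study's.

```python
# Minimal sketch of an ex-Gaussian fit to reaction times using SciPy's
# exponnorm distribution (parameterised by K = tau/sigma, loc = mu,
# scale = sigma). Synthetic data stand in for real RTs.
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
# Simulate RTs: Gaussian component (mu=400 ms, sigma=40 ms) plus an
# exponential tail (tau=100 ms), the classic ex-Gaussian structure.
rts = rng.normal(400, 40, size=1000) + rng.exponential(100, size=1000)

K, mu, sigma = stats.exponnorm.fit(rts)
tau = K * sigma  # recover the exponential component from the shape parameter
print(f"mu={mu:.1f} ms, sigma={sigma:.1f} ms, tau={tau:.1f} ms")
```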
Neither physical activity, BMI, nor circadian rhythm predicted significant variance on any of the cognitive tasks. However, the presence of a sleep abnormality significantly predicted a higher intra-individual variability of the reaction time distributions on the Attention Network Task.
This study suggests that there is an association between sleep abnormalities and cognition in BD, with little or no relationship with physical activity, BMI, and circadian rhythm.
Emergency Medical Services (EMS) systems have developed protocols for prehospital activation of the cardiac catheterization laboratory for patients with suspected ST-elevation myocardial infarction (STEMI) to decrease first-medical-contact-to-balloon time (FMC2B). The rate of “false positive” prehospital activations is high. In order to decrease this rate and expedite care for patients with true STEMI, the American Heart Association (AHA; Dallas, Texas USA) developed the Mission Lifeline PreAct STEMI algorithm, which was implemented in Los Angeles County (LAC; California USA) in 2015. The hypothesis of this study was that implementation of the PreAct algorithm would increase the positive predictive value (PPV) of prehospital activation.
This is an observational pre-/post-study of the effect of the implementation of the PreAct algorithm for patients with suspected STEMI transported to one of five STEMI Receiving Centers (SRCs) within the LAC Regional System. The primary outcome was the PPV of cardiac catheterization laboratory activation for percutaneous coronary intervention (PCI) or coronary artery bypass graft (CABG). The secondary outcome was FMC2B.
A total of 1,877 patients were analyzed for the primary outcome in the pre-intervention period and 405 patients in the post-intervention period. There was an overall decrease in cardiac catheterization laboratory activations, from 67% in the pre-intervention period to 49% in the post-intervention period (95% CI for the difference, -22% to -14%). The overall rate of cardiac catheterization declined in the post-intervention period as compared with the pre-intervention period, from 34% to 30% (95% CI for the difference, -7.6% to 0.4%), but actually increased for subjects who had activation (48% versus 58%; 95% CI for the difference, 4.6% to 15.0%). Implementation of the PreAct algorithm was associated with an increase in the PPV of activation for PCI or CABG from 37.9% to 48.6%. The overall odds ratio (OR) associated with the intervention was 1.4 (95% CI, 1.1-1.8). The effect of the intervention was to decrease variability between medical centers. There was no associated change in average FMC2B.
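As a reminder of the metric (a worked illustration with hypothetical round numbers, not the study's raw counts), PPV is the share of activations confirmed as true positives:

$$\text{PPV} = \frac{TP}{TP + FP},$$

so if 1,000 prehospital activations led to 486 patients receiving PCI or CABG, the PPV would be 486/1,000 = 48.6%, matching the post-intervention figure.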
The implementation of the PreAct algorithm in the LAC EMS system was associated with an overall increase in the PPV of cardiac catheterization laboratory activation.
Addition of fats to the diets of ruminants has long been known to reduce enteric methane emissions. Tannins have also been used to reduce methane emissions, but with mixed success, and the effect of feeding fat in combination with tannin is unknown. Eight ruminally cannulated Holstein-Friesian cows were fed four diets in a double Latin-square, full crossover sequence. The treatments were 800 ml/day of water (CON), 800 g/day of cottonseed oil, 400 g/day of tannin, and 800 g/day of cottonseed oil plus 400 g/day of tannin in combination (the fat- and tannin-supplemented diet). Methane emissions were measured using open-circuit respiration chambers. Intake of the basal diets did not differ between treatments. Cows fed cottonseed oil had greater milk yield (34.9 kg/day) than those fed CON (32.3 kg/day), but the reduced concentration of milk fat meant there was no difference in energy-corrected milk between treatments. Methane yield was reduced when either cottonseed oil (14%) or tannin (11%) was added directly to the rumen, and their effects were additive when given in combination (20% reduction). The mechanism of the anti-methanogenic effect remains unclear, but both fat and tannin appear to cause a general reduction in fermentation rather than a change in the type of fermentation.
A substantial proportion of patients drop out of treatment before they receive minimally adequate care, and they tend to have worse health outcomes than those who complete treatment. Our main goal is to describe the frequency and determinants of dropout from treatment for mental disorders in low-, middle-, and high-income countries.
Respondents from 13 low- or middle-income countries (N = 60 224) and 15 high-income countries (N = 77 303) were screened for mental and substance use disorders. Cross-tabulations were used to examine the distribution of treatment and dropout rates for those who screened positive. The timing of dropout was examined using Kaplan–Meier curves. Predictors of dropout were examined with survival analysis using a logistic link function.
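For readers unfamiliar with the method, the Kaplan–Meier estimator treats each visit as a time point and multiplies per-visit continuation probabilities, with patients still in treatment entering as censored observations. A minimal self-contained sketch follows (synthetic data, not the survey's).

```python
# Minimal Kaplan-Meier estimator for time-to-dropout, where "time" is the
# visit number and censoring marks patients still in treatment. Synthetic
# data stand in for the survey's records.
import numpy as np

def kaplan_meier(times, events):
    """Return [(event time, survival probability), ...].
    events[i] = 1 if dropout observed at times[i], 0 if censored."""
    times, events = np.asarray(times), np.asarray(events)
    surv, s = [], 1.0
    for t in np.unique(times[events == 1]):
        at_risk = np.sum(times >= t)            # still in treatment at t
        dropped = np.sum((times == t) & (events == 1))
        s *= 1.0 - dropped / at_risk            # product-limit step
        surv.append((t, s))
    return surv

# Visit at which each patient dropped out (event=1) or was last seen (event=0).
times  = [1, 1, 2, 2, 3, 4, 5, 5, 6, 6]
events = [1, 1, 1, 0, 1, 0, 1, 1, 0, 0]
print(kaplan_meier(times, events))
```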
Dropout rates are high in both high-income (30%) and low/middle-income (45%) countries. Dropout mostly occurs during the first two visits. It is higher in general medical settings than in specialist settings (nearly 60% v. 20% in lower-income settings). It is also higher for mild and moderate presentations than for severe ones. The lack of financial protection for mental health services is associated with overall increased dropout from care.
Extending financial protection and coverage for mental disorders may reduce dropout. Efficiency can be improved by managing the milder clinical presentations at the entry point to the mental health system, providing adequate training, support and specialist supervision for non-specialists, and streamlining referral to psychiatrists for more severe cases.