Radiocarbon (14C or carbon-14, half-life 5730 yr) is a key radionuclide in the assessment of the safety of a geological disposal facility (GDF) for radioactive waste. In particular, the radiological impact of gaseous carbon-14-bearing species has been recognized as a potential issue. Irradiated steels are one of the main sources of carbon-14 in the United Kingdom’s radioactive waste inventory. However, there is considerable uncertainty about the chemical form(s) in which the carbon-14 will be released. The objective of this work was to measure the rate and speciation of carbon-14 release from irradiated 316L(N) stainless steel on leaching under high-pH anoxic conditions, representative of a cement-based near field for low-heat-generating wastes. Periodic measurements of carbon-14 releases to both the gas phase and to solution were made in duplicate experiments over a period of up to 417 days. An initial fast release of carbon-14 from the surface of the steel was observed during the first week of leaching, followed by a drop in the rate of release at longer times. Carbon-14 was released primarily to the solution phase, with differing fractions released to the gas phase in the two experiments: about 1% of the total release in one and 6% in the other. The predominant dissolved carbon-14 releases are in inorganic form (as 14C-carbonate) but also include organic species. The predominant gas-phase species are hydrocarbons, with a smaller fraction of 14CO (which may include some volatile oxygen-containing carbon species). The experiments are continuing, with final sampling and termination planned after leaching for a total of two years.
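As an aside on the half-life quoted above, the fraction of a carbon-14 inventory remaining after time t follows the standard decay law N(t)/N0 = 2^(−t/t½). A minimal Python sketch, in which the helper name and the two-year example are illustrative rather than taken from the study:

```python
# Decay of carbon-14 from its half-life (5730 yr, as quoted above).
# Illustrative sketch; `fraction_remaining` is a made-up helper name.
T_HALF = 5730.0  # carbon-14 half-life in years


def fraction_remaining(years: float) -> float:
    """Fraction of the original carbon-14 inventory left after `years`."""
    return 2.0 ** (-years / T_HALF)


# Over a roughly two-year leaching experiment, decay itself is negligible:
print(f"{fraction_remaining(2.0):.6f}")  # ≈ 0.999758
```

This is why the measured releases over the experiment's timescale reflect leaching chemistry rather than radioactive decay.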
Research consistently demonstrates that common polymorphic variation in monoamine oxidase A (MAOA) moderates the influence of childhood maltreatment on later antisocial behavior, with growing evidence that the “risk” allele (high vs. low activity) differs for females. However, little is known about how this Gene × Environment interaction functions to increase risk, or if this risk pathway is specific to antisocial behavior. Using a prospectively assessed, longitudinal sample of females (n = 2,004), we examined whether changes in emotional reactivity (ER) during adolescence mediated associations between this Gene × Environment and antisocial personality disorder in early adulthood. In addition, we assessed whether this putative risk pathway also conferred risk for borderline personality disorder, a related disorder characterized by high ER. While direct associations between early maltreatment and later personality pathology did not vary by genotype, there was a significant difference in the indirect path via ER during adolescence. Consistent with hypotheses, females with high-activity MAOA genotype who experienced early maltreatment had greater increases in ER during adolescence, and higher levels of ER predicted both antisocial personality disorder and borderline personality disorder symptom severity. Taken together, findings suggest that the interaction between MAOA and early maltreatment places women at risk for a broader range of personality pathology via effects on ER.
This study determines the prevalence of inadequate micronutrient intakes consumed by long-term care (LTC) residents. This cross-sectional study was completed in thirty-two LTC homes in four Canadian provinces. Weighed and estimated food and beverage intakes were collected over 3 non-consecutive days from 632 randomly selected residents. Nutrient intakes were adjusted for intra-individual variation and compared with the Dietary Reference Intakes. The proportion of participants, stratified by sex and use of modified (MTF) or regular texture foods, with intakes below the Estimated Average Requirement (EAR) or Adequate Intake (AI) was identified. The number of participants who met these adequacy values with the use of micronutrient supplements was determined. Mean age of males (n 197) was 85·2 (sd 7·6) years and of females (n 435) was 87·4 (sd 7·8) years. In all, 33 % consumed MTF; 78·2 % (males) and 76·1 % (females) took at least one micronutrient pill. Participants on MTF had lower intakes of some nutrients (males=4; females=8), but also consumed a few nutrients in larger amounts than regular texture consumers (males=4; females=1). More than 50 % of participants in both sexes and texture groups consumed inadequate amounts of folate, vitamin B6, Ca, Mg and Zn (males only), with >90 % consuming amounts below the EAR/AI for vitamins D, E and K, Mg (males only) and K. Vitamin D supplements resolved inadequate intakes for 50–70 % of participants. High proportions of LTC residents have intakes below the EAR or AI for nine of the twenty nutrients examined. Strategies to improve intake specific to these nutrients are needed.
Hyperhidrosis is characterized by uncontrollable excessive sweating, which occurs at rest, regardless of temperature, and can significantly affect quality of life. There is substantial variation in the availability of treatments in secondary care and uncertainty regarding optimal patient management. A systematic review was undertaken to assess the clinical effectiveness of treatments prescribed by dermatologists (iontophoresis, anticholinergic medications, botulinum toxin injections) and minor surgical treatments (curettage and newer energy-based technologies) for primary hyperhidrosis, and to identify areas for further research.
Fifteen databases and trial registers were searched to July 2016. Pairwise meta-analyses were conducted for comparisons between botulinum toxin injections and placebo for axillary hyperhidrosis. For other treatments, data were synthesised narratively owing to their limited and heterogeneous nature.
Fifty studies were included in the review: thirty-two randomized controlled trials (RCTs), seventeen non-RCTs and one case series. There was substantial variation between the studies in terms of country of origin (indicating climate and population differences), interventions and methods of outcome assessment. Most studies were small, at high risk of bias and poorly reported. There was moderate-quality evidence of a large statistically significant effect of botulinum toxin injections on axillary hyperhidrosis symptoms in the short to medium term (up to 16 weeks), compared with placebo. There was weak but consistent evidence for iontophoresis for palmar hyperhidrosis. Evidence for other interventions was of low or very low quality. Combining the evidence with patient advisor input, we established that further research on the clinical and cost-effectiveness of botulinum toxin injections (with anesthesia) versus iontophoresis for palmar hyperhidrosis would be useful.
The evidence for the effectiveness and safety of treatments for primary hyperhidrosis is limited overall and few firm conclusions can be drawn. However, there is moderate quality evidence to support the use of botulinum toxin injections for axillary hyperhidrosis. A trial comparing botulinum toxin injections with iontophoresis for palmar hyperhidrosis is warranted.
Hyperhidrosis is characterized by uncontrollable excessive sweating, which occurs at rest, regardless of temperature. Symptoms can significantly affect quality of life. There is substantial variation in the secondary care treatment of hyperhidrosis and uncertainty regarding optimal patient management. The objective of the Health Technology Assessment (HTA) was to review the evidence and establish the expected value of undertaking additional research into effective interventions for the management of primary hyperhidrosis in secondary care. Capturing the perspectives of patients and clinicians treating hyperhidrosis was an important part of the research.
The assessment included a systematic review and economic model, including value of information analysis. Patients, dermatologists, a vascular surgeon and a specialist nurse (who set up the UK Hyperhidrosis Support Group) provided advice at various stages, including at an end-of-project workshop, to help interpret results and prioritize research recommendations.
Patients and clinicians considered the key findings of the systematic review and economic analyses to be appropriate. Advisors advocated a trial of botulinum toxin injections (plus anaesthetic) versus iontophoresis for palmar hyperhidrosis. Patients preferred the HidroQoL® tool over other commonly used quality-of-life tools in hyperhidrosis research.
Primary hyperhidrosis has no discernible cause and is characterised by uncontrollable excessive and unpredictable sweating, which occurs at rest, regardless of temperature. The symptoms of hyperhidrosis can significantly affect quality of life, and can lead to social embarrassment, loneliness, anxiety and depression.
The aim of this literature review was to identify the tools used to measure quality of life in studies of hyperhidrosis. Patient advisors provided insight and their perspective.
Studies were identified through searches undertaken in January 2016. The search strategies combined topic terms for hyperhidrosis with a recognised search filter for “quality of life”. All studies that reported measuring quality of life or described a quality of life measure/tool in the context of primary hyperhidrosis were included. The information on the tools and their use in hyperhidrosis was summarized in a narrative synthesis. Patient advisors contributed to the interpretation of the findings.
The Neotoma Paleoecology Database is a community-curated data resource that supports interdisciplinary global change research by enabling broad-scale studies of taxon and community diversity, distributions, and dynamics during the large environmental changes of the past. By consolidating many kinds of data into a common repository, Neotoma lowers costs of paleodata management, makes paleoecological data openly available, and offers a high-quality, curated resource. Neotoma’s distributed scientific governance model is flexible and scalable, with many open pathways for participation by new members, data contributors, stewards, and research communities. The Neotoma data model supports, or can be extended to support, any kind of paleoecological or paleoenvironmental data from sedimentary archives. Data additions to Neotoma are growing and now include >3.8 million observations, >17,000 datasets, and >9200 sites. Dataset types currently include fossil pollen, vertebrates, diatoms, ostracodes, macroinvertebrates, plant macrofossils, insects, testate amoebae, geochronological data, and the recently added organic biomarkers, stable isotopes, and specimen-level data. Multiple avenues exist to obtain Neotoma data, including the Explorer map-based interface, an application programming interface, the neotoma R package, and digital object identifiers. As the volume and variety of scientific data grow, community-curated data resources such as Neotoma have become foundational infrastructure for big data science.
Objectives: Fatigue is a common and persisting symptom after childhood brain injury. This study examined whether child characteristics and symptomatology preinjury or 6 months postinjury (pain, sleep, mood, and inattention) predicted fatigue at 12 months postinjury. Methods: Parents of 79 children (0–18 years) rated fatigue at 12 months after injury on a multidimensional scale (general, sleep/rest, and cognitive). Demographic and clinical data were collected at injury. Parents rated child sleep, pain, physical/motor function, mood, and inattention at injury (preinjury description) and 6 months postinjury. Children were divided into two traumatic brain injury (TBI) severity groups: mild TBI (n=57) and moderate/severe TBI (n=27). Hierarchical regression models were used to examine (i) preinjury factors and (ii) symptoms 6 months postinjury predictive of fatigue (general, sleep/rest, and cognitive) at 12 months postinjury. Results: Sleep/rest fatigue was predicted by preinjury fatigue (7% of variance) and psychological symptoms preinjury (10% of variance). General fatigue was predicted by physical/motor symptoms (27%), sleep symptoms (10%) and mood symptoms (9%) 6 months postinjury. Sleep/rest fatigue was predicted by physical/motor symptoms (10%), sleep symptoms (13%) and mood symptoms (9%) 6 months postinjury. Cognitive fatigue was predicted by physical/motor symptoms (17%) 6 months postinjury. Conclusions: Preinjury fatigue and psychological functioning identified those at greatest risk of fatigue 12 months post-TBI. Predictors of specific fatigue domains at 12 months differed across the domains, although they consistently included physical/motor function as well as sleep and mood symptoms postinjury. (JINS, 2018, 24, 224–236)
This paper reports the findings of a scoping review on the organisation and delivery of health improvement activities in general practice and the primary healthcare team. The project was designed to examine who delivers these interventions, where they are located, what approaches are developed in practices, how individual practices and the primary healthcare team organise such public health activities, and how these contribute to health improvement. Our focus was on health promotion and prevention activities, and we aimed to identify the current extent of knowledge about health improvement activities in general practice and the wider primary healthcare team. Many of the research studies reviewed gave some details about the type, process, location or provider of the intervention. Little attention is paid in the literature to examining the impact of the organisational context on the way services are delivered, or how this affects the effectiveness of health improvement interventions in general practice. We found that the focus of attention is mainly on individual prevention approaches, with practices engaging in both primary and secondary prevention. Although many GPs do not take a population approach and focus on individual patients, some do see health promotion as an integral part of practice – whether as individual approaches to primary or secondary health improvement or as a practice-based approach to improving the health of their patients. Based on our analysis, we conclude that there is insufficient good evidence to support many of the health improvement interventions undertaken in general practice and primary care.
The revised Dietary Guideline Index (DGI-2013) scores individuals’ diets according to their compliance with the Australian Dietary Guidelines (ADG). This cross-sectional study assesses the diet quality of 794 community-dwelling men aged 74 years and older, living in Sydney, Australia, participating in the Concord Health and Ageing in Men Project; it also examines sociodemographic and lifestyle factors associated with DGI-2013 scores; and it studies associations between DGI-2013 scores and the following measures: homoeostasis model assessment – insulin resistance, LDL-cholesterol, HDL-cholesterol, TAG, blood pressure, waist:hip ratio, BMI, number of co-morbidities and medications, and frailty status, while also accounting for the effect of ethnicity in these relationships. Median DGI-2013 score was 93·7 (54·4, 121·2); most individuals failed to meet recommendations for vegetables, dairy products and alternatives, added sugar, unsaturated fat and SFA, fluid and discretionary foods. Lower education, income, physical activity levels and smoking were associated with low scores. After adjustments for confounders, high DGI-2013 scores were associated with lower HDL-cholesterol, lower waist:hip ratios and lower probability of being frail. Proxies of good health (fewer co-morbidities and medications) were not associated with better compliance with the ADG. However, in participants with a Mediterranean background, low DGI-2013 scores were not generally associated with poorer health. Older men demonstrated poor diet quality as assessed by the DGI-2013, and the association between dietary guidelines and health measures and indices may be influenced by ethnic background.
Research into the analysis, physical properties and health effects of dietary fibre has continued steadily over the last 40–50 years. From the knowledge gained, countries have developed guidelines for their populations on the optimal amount of fibre to be consumed each day. Food composition tables from many countries now contain values for the dietary fibre content of foods, and, from these, combined with dietary surveys, population intakes have been determined. The present review assessed the uniformity of the analytical methods used, health claims permitted, recommendations and intakes, particularly from national surveys across Europe and around the world. It also assessed current knowledge on the health effects of dietary fibre and related the impact of different fibre types on health. The overall intent was to provide more detailed guidance on the types of fibre that should be consumed for good health, rather than simply a total intake figure, as is currently the case. Analysis of the data indicated a fair degree of uniformity in the definition of dietary fibre, the methods used for analysis, the recommended amounts to be consumed, and a growing literature on effects on digestive health and disease risk. However, national dietary survey data showed that intakes do not reach recommendations, and very few countries provide guidance on the types of fibre that are preferable to achieve recommended intakes. Research gaps were identified and ideas suggested to provide information for more detailed advice to the public about specific food sources that should be consumed to achieve health benefits.
Nutrition in the second year is important as this is a period of rapid growth and development. Milk is a major food for young children and this analysis evaluated the impact of the type of milk consumed on nutrient intakes and nutritional status. Data from the Diet and Nutrition Survey of Infants and Young Children were used to investigate the intakes of key nutrients, and Fe and vitamin D status, of children aged 12–18 months, not breastfed, and consuming >400 g/d fortified milk (n 139) or >400 g/d of whole cows’ milk (n 404). Blood samples from eligible children for measurement of Hb (n 113), serum ferritin and plasma 25-hydroxyvitamin D (25(OH)D) concentrations (n 105) were available for approximately 20 % of children. Unpaired Mann–Whitney tests were used to compare nutrient intakes and status between consumers of fortified and cows’ milk. Mean daily total dietary intakes of Fe, Zn, vitamin A and vitamin D were significantly higher in the fortified milk group. Mean daily total dietary intakes of energy, protein, Ca, iodine, Na and saturated fat were significantly higher in the cows’ milk group. Hb was not different between groups. The fortified milk group had significantly higher serum ferritin (P = 0·049) and plasma 25(OH)D (P = 0·014). This analysis demonstrates significantly different nutrient intakes and status between infants consuming >400 g/d fortified milk v. those consuming >400 g/d whole cows’ milk. These results indicate that fortified milks can play a significant role in improving the quality of young children's diets in their second year of life.
Studying irregular meal patterns fits in with the latest research focusing not only on what people eat but also on when they eat, also called chrono-nutrition. Chrono-nutrition involves studying the impact of nutrition on metabolism via circadian patterns, including three aspects of time: (ir)regularity, frequency and clock time. The present paper aimed to narratively review research on irregular meal patterns and cardiometabolic consequences. Only a few cross-sectional studies and prospective cohort studies were identified, and most of these suggested that eating meals irregularly is associated with a higher risk of the metabolic syndrome and cardiometabolic risk factors, including BMI and blood pressure. This was supported by two randomised controlled intervention studies showing that consuming meals regularly for 2 weeks, v. an irregular meal pattern, had a beneficial impact on cardiometabolic risk factors such as lower peak insulin and lower fasting total and LDL-cholesterol, in both lean and obese women. In conclusion, the limited evidence on meal regularity and cardiometabolic consequences supports the hypothesis that consuming meals irregularly is adversely associated with cardiometabolic risk. However, it also highlights the need for more large-scale studies, including detailed dietary assessment, to further advance the understanding of the impact of chrono-nutrition on public health.
The benefits of fetoscopic laser photocoagulation (FLP) for treatment of twin-to-twin transfusion syndrome (TTTS) have been recognized for over a decade, yet access to FLP remains limited in many settings. This means that, at a population level, the potential benefits of FLP for TTTS are far from being fully realized. In part, this is because there are many centers where the case volume is relatively low. This creates an inevitable tension: on one hand, wanting FLP to be readily accessible to all women who may need it, yet on the other, needing to ensure that a high degree of procedural competence is maintained. Some of the solutions to these apparently competing priorities may be found in novel training approaches to achieve, and maintain, procedural proficiency, and in the increased utilization of ‘competence-based’ assessment and credentialing frameworks. We suggest that an under-utilized approach is the development of collaborative surgical services, in which pooling of personnel and resources can improve timely access to surgery, improve standardized assessment and management of TTTS, minimize the impact of the surgical learning curve, and facilitate audit, education, and research. In considering which centers should offer laser therapy for TTTS, and how that decision should be made, we propose some solutions based on a collaborative model.
Irregularity in eating patterns could be a potential cardiometabolic risk factor. We aimed to study the associations of irregular intake of energy at meals with cardiometabolic risk factors 10 and 17 years later. Variability of energy intake data – derived from 5-d estimated diet diaries of cohort members of the National Survey of Health and Development collected at ages 36 (n 1416), 43 (n 1505) and 53 years (n 1381) – was used as a measure of irregularity. Associations between meal irregularity scores and cardiometabolic risk factors measured 10 and 17 years later were investigated using linear mixed models and logistic regression models. The results showed that irregularity scores changed significantly over the years (P<0·05). At age 36 years, subjects with a more irregular intake of energy at lunch (OR 1·42; 95 % CI 1·05, 1·91) and between meals (OR 1·35; 95 % CI 1·01, 1·82) had an increased risk for the metabolic syndrome 17 years later; irregular intake at lunch was also associated with an increased waist circumference (OR 1·58; 95 % CI 1·27, 1·96) and TAG levels (OR 1·33; 95 % CI 1·02, 1·72). At age 43 years, subjects with a more irregular intake at breakfast had an increased risk of the metabolic syndrome 10 years later (OR 1·53; 95 % CI 1·15, 2·04), as well as an increased BMI (OR 1·66; 95 % CI 1·31, 2·10), waist circumference (OR 1·53; 95 % CI 1·23, 1·90) and diastolic blood pressure (OR 1·42; 95 % CI 1·13, 1·78). In conclusion, subjects with a more irregular intake of energy, mostly at breakfast and lunch, appeared to have an increased cardiometabolic risk 10 and 17 years later.
Disorganized attachment is an important early risk factor for socioemotional problems throughout childhood and into adulthood. Prevailing models of the etiology of disorganized attachment emphasize the role of highly dysfunctional parenting, to the exclusion of complex models examining the interplay of child and parental factors. Decades of research have established that extreme child birth weight may have long-term effects on developmental processes. These effects are typically negative, but this is not always the case. Recent studies have also identified the dopamine D4 receptor (DRD4) as a moderator of childrearing effects on the development of disorganized attachment. However, there are inconsistent findings concerning which variant of the polymorphism (seven-repeat long-form allele or non–seven-repeat short-form allele) is most likely to interact with caregiving in predicting disorganized versus organized attachment. In this study, we examined possible two- and three-way interactions among child DRD4 polymorphisms, birth weight, and maternal caregiving at age 6 months in longitudinally predicting attachment disorganization at 36 months. Our sample is drawn from the Maternal Adversity, Vulnerability and Neurodevelopment project, comprising 650 mother–child dyads. Birth weight was cross-referenced with normative data to calculate birth weight percentile. Infant DRD4 genotype was obtained with buccal swabs and categorized according to the presence of the putative seven-repeat allele. Macroanalytic and microanalytic measures of maternal behavior were extracted from a videotaped session of 20 min of nonfeeding interaction followed by a 10-min divided-attention maternal task at 6 months. Attachment was assessed at 36 months using the Strange Situation procedure, and categorized into disorganized attachment and others.
The results indicated that a main effect for DRD4 and a two-way interaction of birth weight and 6-month maternal attention (frequency of maternal looking-away behavior) and sensitivity predicted disorganized attachment in robust logistic regression models adjusted for sociodemographic covariates. Specifically, children in the midrange of birth weight were more likely to develop a disorganized attachment when exposed to less attentive maternal care. However, the association reversed with extreme birth weight (low and high). The DRD4 seven-repeat allele was associated with less disorganized attachment (protective), while non–seven-repeat children were more likely to be classified as disorganized. The implications for understanding inconsistencies in the literature about which DRD4 genotype confers risk are also considered. Suggestions for intervention with families with infants at different levels of biological risk and caregiving risk are also discussed.
The 2008 Centers for Medicare & Medicaid Services hospital-acquired conditions policy limited additional payment for conditions deemed reasonably preventable.
To examine whether this policy was associated with decreases in billing rates for 2 targeted conditions, vascular catheter-associated infections (VCAI) and catheter-associated urinary tract infections (CAUTI).
Adult Medicare patients admitted to 569 acute care hospitals in California, Massachusetts, or New York and subject to the policy.
We used an interrupted time-series design to assess whether the hospital-acquired conditions policy was associated with changes in billing rates for VCAI and CAUTI.
Before the policy, billing rates for VCAI and CAUTI were increasing (prepolicy odds ratio per quarter for VCAI, 1.17 [95% CI, 1.11–1.23]; for CAUTI, 1.19 [1.16–1.23]). The policy was associated with an immediate drop in billing rates for VCAI and CAUTI (odds ratio for change at policy implementation for VCAI, 0.75 [95% CI, 0.69–0.81]; for CAUTI, 0.87 [0.79–0.96]). In the postpolicy period, we observed a decreasing trend in the billing rate for VCAI and a leveling-off in the billing rate for CAUTI (postpolicy odds ratio per quarter for VCAI, 0.98 [95% CI, 0.97–0.99]; for CAUTI, 0.99 [0.97–1.00]).
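An interrupted time-series analysis of this kind is commonly estimated as a segmented regression with a baseline trend, an immediate level change at the intervention, and a change in slope afterwards. The sketch below is a minimal illustration on synthetic data, not the study's actual model (which reported odds ratios from quarterly billing data); every variable name and effect size here is invented for the example:

```python
import numpy as np

# Hedged sketch of a segmented (interrupted) time-series regression.
# All data are synthetic and all names are illustrative assumptions.
rng = np.random.default_rng(0)

quarters = np.arange(20)                      # quarterly observations
policy_at = 10                                # hypothetical policy quarter
post = (quarters >= policy_at).astype(float)  # post-policy indicator

# Synthetic outcome: rising pre-policy trend, then an immediate level
# drop and a flattened slope after the intervention, plus noise.
rate = (2.0 + 0.15 * quarters
        - 0.8 * post
        - 0.12 * post * (quarters - policy_at)
        + rng.normal(0.0, 0.05, quarters.size))

# Design matrix: intercept, baseline trend, level change, slope change.
X = np.column_stack([np.ones(quarters.size),
                     quarters.astype(float),
                     post,
                     post * (quarters - policy_at)])

coef, *_ = np.linalg.lstsq(X, rate, rcond=None)
intercept, trend, level_change, slope_change = coef
print(f"trend={trend:.3f}, level_change={level_change:.3f}, "
      f"slope_change={slope_change:.3f}")
```

The three fitted coefficients map onto the quantities reported above: the prepolicy trend, the immediate change at policy implementation, and the postpolicy change in slope.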
The Centers for Medicare & Medicaid Services hospital-acquired conditions policy appears to have been associated with immediate reductions in billing rates for VCAI and CAUTI, followed by a slight decreasing trend or leveling-off in rates. These billing rates, however, may not correlate with changes in clinically meaningful patient outcomes and may reflect changes in coding practices.
Infect. Control Hosp. Epidemiol. 2015;36(8):871–877
Policymakers may wish to align healthcare payment and quality of care while minimizing unintended consequences, particularly for safety net hospitals.
To determine whether the 2008 Centers for Medicare and Medicaid Services Hospital-Acquired Conditions policy had a differential impact on targeted healthcare-associated infection rates in safety net compared with non–safety net hospitals.
Interrupted time-series design.
Nonfederal acute care hospitals that reported central line–associated bloodstream infection and ventilator-associated pneumonia rates to the Centers for Disease Control and Prevention’s National Health Safety Network from July 1, 2007, through December 31, 2013.
We did not observe changes in the slope of targeted infection rates in the postpolicy period compared with the prepolicy period for either safety net (postpolicy vs prepolicy ratio, 0.96 [95% CI, 0.84–1.09]) or non–safety net (0.99 [0.90–1.10]) hospitals. Controlling for prepolicy secular trends, we did not detect differences in an immediate change at the time of the policy between safety net and non–safety net hospitals (P for 2-way interaction, .87).
The Centers for Medicare and Medicaid Services Hospital-Acquired Conditions policy did not have an impact, either positive or negative, on already declining rates of central line–associated bloodstream infection in safety net or non–safety net hospitals. Continued evaluations of the broad impact of payment policies on safety net hospitals will remain important as the use of financial incentives and penalties continues to expand in the United States.