To determine sociodemographic factors associated with occupational, recreational and firearm-related noise exposure.
This nationally representative, multistage, stratified, cluster cross-sectional study surveyed eligible National Health and Nutrition Examination Survey participants aged 20–69 years (n = 4675) about exposure to occupational and recreational noise and recurrent firearm use; associations with sociodemographic factors were assessed using weighted multivariate logistic regression.
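As a rough, hypothetical sketch of the survey-weighted logistic regression described above (not the authors' actual analysis code), the model could be fitted as follows; the file name and column names are placeholders, and a full NHANES analysis would additionally use the survey strata and primary sampling units for design-based variance estimation.

```python
# Hypothetical sketch of a survey-weighted logistic regression; the file and
# column names are illustrative placeholders, not actual NHANES identifiers.
import numpy as np
import pandas as pd
import statsmodels.api as sm
import statsmodels.formula.api as smf

df = pd.read_csv("nhanes_noise_subset.csv")  # assumed extract, participants aged 20-69

fit = smf.glm(
    "noise_exposed ~ gender + race_ethnicity + insured + age_group + education",
    data=df,
    family=sm.families.Binomial(),
    freq_weights=df["exam_weight"],  # NHANES examination sample weights
).fit()

# Adjusted odds ratios with 95% confidence intervals
result = pd.concat([np.exp(fit.params), np.exp(fit.conf_int())], axis=1)
result.columns = ["aOR", "2.5%", "97.5%"]
print(result)
```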
Thirty-four per cent of participants had exposure to occupational noise and 12 per cent to recreational noise, and 13 per cent repeatedly used firearms. Males were more likely than females to have exposure to all three noise types (adjusted odds ratio range = 2.63–14.09). Hispanics and Asians were less likely to have exposure to the three noise types than Whites. Blacks were less likely than Whites to have occupational and recurrent firearm noise exposure. Those with insurance were 26 per cent less likely to have exposure to occupational noise than those without insurance (adjusted odds ratio = 0.74, 95 per cent confidence interval = 0.60–0.93).
Whites, males and uninsured people are more likely to have exposure to potentially hazardous loud noise.
Nutritional therapy is a cornerstone of burns management. The optimal macronutrient intake for wound healing after burn injury has not been identified, although high-energy, high-protein diets are favoured. This study aimed to identify the optimal macronutrient intake for burn wound healing. The Geometric Framework (GF) was used to analyse wound healing after a 10% TBSA contact burn in mice fed ad libitum one of 11 high-energy diets varying in macronutrient composition: protein (P, 5%–60%), carbohydrate (C, 20%–75%) and fat (F, 20%–75%). In the GF study, the optimal ratio for wound healing was identified as a moderate-protein, high-carbohydrate diet with a protein:carbohydrate:fat (P:C:F) ratio of 1:4:2. High carbohydrate intake was associated with lower mortality, improved body weight and a beneficial pattern of body fat reserves. Protein intake was essential to prevent weight loss and mortality, but a protein intake target of ~7 kJ/day (~15% of energy intake) was identified, above which no further benefit was gained. High protein intake was associated with delayed wound healing and increased liver and spleen weight. Because the GF study demonstrated that an initial very high protein intake prevented mortality, a very high-protein, moderate-carbohydrate diet (P40:C42:F18) was specifically designed, and a dynamic diet study was conducted to combine and validate the benefits of an initial very high protein intake for survival with those of a subsequent moderate-protein, high-carbohydrate intake for optimal wound healing. This dynamic feeding experiment showed that switching from the initial very high-protein diet to the optimal moderate-protein, high-carbohydrate diet accelerated wound healing whilst preventing mortality and liver enlargement.
NASA has put people in unique and extreme environments for over six decades, and the comprehensive health-care system supporting these individuals has evolved over this period. As the Apollo program ended and NASA began to contemplate a space shuttle and space station program, societal pressures in the late 1960s and early 1970s caused federal agencies such as NASA to reconsider how to link the needs of the space program with a growing pressure to address societal needs by forging interagency partnerships. The Space Technology Applied to Rural Papago Advanced Health Care (STARPAHC) project provides an example of how NASA sought to balance these two imperatives in an age of diminishing federal support. This project can provide lessons for today’s uncertain budgetary future for agencies such as NASA, which are once again being asked to find creative and innovative ways to support their missions while demonstrating their larger value to society.
Anecdotal evidence suggests that the use of bolus tube feeding is increasing among long-term home enteral tube feeding (HETF) patients. A cross-sectional survey was undertaken to assess the prevalence of bolus tube feeding and to characterise these patients. Dietitians from ten centres across the UK collected data on all adult HETF patients on the dietetic caseload receiving bolus tube feeding (n 604, 60 % male, age 58 years). Demographic data, reasons for tube and bolus feeding, tube and equipment types, feeding method and patients’ complete tube feeding regimens were recorded. Over a third of patients receiving HETF used bolus feeding (37 %). Patients were long-term tube fed (4·1 years tube feeding, 3·5 years bolus tube feeding), living at home (71 %) and sedentary (70 %). The largest diagnostic group was head and neck cancer (22 %); these patients were significantly more active (79 %) and more likely to live at home (97 %), while those with cerebral palsy (12 %) were typically younger (age 31 years) but sedentary (94 %). Bolus feeding was most often used as the sole feeding method (46 %), chosen because it was quick and easy to use, as a top-up to an oral diet or to mimic mealtimes. Importantly, oral nutritional supplements (ONS) were used for bolus feeding in 85 % of patients, with 51 % of these being compact-style ONS (2·4 kcal (10·0 kJ)/ml, 125 ml). This survey shows that bolus tube feeding is common among UK HETF patients, is used by a wide variety of patient groups and can be adapted to meet the needs of a variety of patients, clinical conditions, nutritional requirements and lifestyles.
Online self-reported 24-h dietary recall systems promise increased feasibility of dietary assessment. Comparison against interviewer-led recalls established their convergent validity; however, reliability and criterion-validity information is lacking. The validity of energy intakes (EI) reported using Intake24, an online 24-h recall system, was assessed against concurrent measurement of total energy expenditure (TEE) using doubly labelled water in ninety-eight UK adults (40–65 years). Accuracy and precision of EI were assessed using correlation and Bland–Altman analysis. Test–retest reliability of energy and nutrient intakes was assessed using data from three further UK studies where participants (11–88 years) completed Intake24 at least four times; reliability was assessed using intra-class correlations (ICC). Compared with TEE, participants under-reported EI by 25 % (95 % limits of agreement −73 % to +68 %) in the first recall, 22 % (−61 % to +41 %) for average of first two, and 25 % (−60 % to +28 %) for first three recalls. Correlations between EI and TEE were 0·31 (first), 0·47 (first two) and 0·39 (first three recalls), respectively. ICC for a single recall was 0·35 for EI and ranged from 0·31 for Fe to 0·43 for non-milk extrinsic sugars (NMES). Considering pairs of recalls (first two v. third and fourth recalls), ICC was 0·52 for EI and ranged from 0·37 for fat to 0·63 for NMES. EI reported with Intake24 was moderately correlated with objectively measured TEE and underestimated on average to the same extent as seen with interviewer-led 24-h recalls and estimated weight food diaries. Online 24-h recall systems may offer low-cost, low-burden alternatives for collecting dietary information.
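The Bland–Altman comparison of reported energy intake (EI) against doubly-labelled-water total energy expenditure (TEE) described above could be sketched roughly as below, working on a percentage-difference scale; the values are invented for illustration and the published analysis may have used a different scale (e.g. ratios or log differences).

```python
# Illustrative Bland-Altman bias and 95% limits of agreement on the
# percentage-difference scale; the example values are made up, not study data.
import numpy as np

def bland_altman_percent(ei_kj, tee_kj):
    """Mean bias and 95% limits of agreement for reported EI vs measured TEE."""
    ei, tee = np.asarray(ei_kj, float), np.asarray(tee_kj, float)
    pct_diff = (ei - tee) / tee * 100.0          # % difference relative to TEE
    bias = pct_diff.mean()
    sd = pct_diff.std(ddof=1)
    return bias, bias - 1.96 * sd, bias + 1.96 * sd

ei = [8200.0, 10500.0, 7300.0, 9100.0]           # kJ/day reported via Intake24
tee = [11000.0, 12400.0, 10100.0, 11800.0]       # kJ/day from doubly labelled water
print(bland_altman_percent(ei, tee))             # a negative bias indicates under-reporting
```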
Introduction: The Prehospital Evidence-Based Practice (PEP) program is an online, freely accessible, continuously updated Emergency Medical Services (EMS) evidence repository. This summary describes the research evidence for the identification and management of adult patients suffering from sepsis syndrome or septic shock. Methods: PubMed was searched in a systematic manner. One author reviewed titles and abstracts for relevance, and two authors appraised each study selected for inclusion. Primary outcomes were extracted. Studies were scored by trained appraisers on a three-point Level of Evidence (LOE) scale (based on study design and quality) and a three-point Direction of Evidence (DOE) scale (supportive, neutral, or opposing findings based on each study’s primary outcome for each intervention). The LOE and DOE of each intervention were plotted on an evidence matrix (DOE × LOE). Results: Eighty-eight studies were included for 15 interventions listed in PEP. The interventions with the most evidence were related to identification tools (ID) (n = 26, 30%) and early goal-directed therapy (EGDT) (n = 21, 24%). ID tools included the Systemic Inflammatory Response Syndrome (SIRS) criteria, the quick Sequential Organ Failure Assessment (qSOFA), and other unique measures. The most common primary outcomes were related to diagnosis (n = 30, 34%), mortality (n = 40, 45%) and treatment goals (e.g., time to antibiotic) (n = 14, 16%). The evidence rankings for the supported interventions were: supportive-high quality (n = 1, 7%) for crystalloid infusion; supportive-moderate quality (n = 7, 47%) for identification tools, prenotification, point-of-care lactate, titrated oxygen, and temperature monitoring; and supportive-low quality (n = 1, 7%) for vasopressors. The benefit of prehospital antibiotics and EGDT remains inconclusive, with a neutral DOE. There is moderate-level evidence opposing the use of high-flow oxygen. Conclusion: EMS sepsis interventions are informed primarily by moderate-quality supportive evidence. Several standard treatments are well supported by moderate- to high-quality evidence, as are identification tools. However, some standard in-hospital therapies, such as antibiotics and EGDT, are not supported by evidence in the prehospital setting. Based on primary outcomes, no identification tool appears superior. This evidence analysis can guide selection of appropriate prehospital therapies.
Salmonella enterica serovar Wangata (S. Wangata) is an important cause of endemic salmonellosis in Australia, with human infections occurring from undefined sources. This investigation sought to examine possible environmental and zoonotic sources of human infection with S. Wangata in north-eastern New South Wales (NSW), Australia. The investigation adopted a One Health approach and comprised three complementary components: a case–control study examining human risk factors; environmental and animal sampling; and genomic analysis of human, animal and environmental isolates. Forty-eight human S. Wangata cases were interviewed during a 6-month period from November 2016 to April 2017, together with 55 Salmonella Typhimurium (S. Typhimurium) controls and 130 neighbourhood controls. In multivariable analyses, indirect contact with bats/flying foxes (adjusted odds ratio (aOR) 2.63, 95% confidence interval (CI) 1.06–6.48 compared with S. Typhimurium controls; aOR 8.33, 95% CI 2.58–26.83 compared with neighbourhood controls), wild frogs (aOR 3.65, 95% CI 1.32–10.07) and wild birds (aOR 6.93, 95% CI 2.29–21.00) was statistically associated with illness. S. Wangata was detected in dog faeces, wildlife scats and a compost specimen collected from the outdoor environments of cases’ residences. In addition, S. Wangata was detected in the faeces of wild birds and sea turtles in the investigation area. Genomic analysis revealed that S. Wangata isolates were relatively clonal. Our findings suggest that S. Wangata is present in the environment and may have a reservoir in wildlife populations in north-eastern NSW. Further investigation is required to better understand the occurrence of Salmonella in wildlife groups and to identify possible transmission pathways for human infections.
The design of mixed-technology quasi-reflectionless planar bandpass filters (BPFs), bandstop filters (BSFs), and multi-band filters is reported. The proposed quasi-reflectionless filter architectures comprise a main filtering section that determines the power transmission response (bandpass, bandstop, or multi-band type) of the overall circuit network and auxiliary sections that absorb the reflected radio-frequency (RF) signal energy. By loading the input and output ports of the main filtering section with auxiliary filtering sections that exhibit a complementary transfer function with regard to the main one, a symmetric quasi-reflectionless behavior can be obtained at both accesses of the overall filter. The operating principles of the proposed filter concept are shown through synthesized first-order BPF and BSF designs. Selectivity-increase techniques are also described. They are based on: (i) cascading in-series multiple first-order stages and (ii) increasing the order of the filtering sections. Moreover, the RF design of quasi-reflectionless multi-band BPFs and BSFs is discussed. A hybrid integration scheme in which microstrip-type and lumped elements are effectively combined within the filter volume is investigated for size miniaturization purposes. For experimental validation purposes, two quasi-reflectionless BPF prototypes (one- and two-stage architectures) centered at 2 GHz and a second-order BSF prototype centered at 1 GHz were designed, manufactured, and measured.
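In idealised terms (a simplification for illustration, not the paper's exact synthesis conditions), the complementary loading can be viewed as a power-balance condition: if the resistively terminated auxiliary sections present a transmission response complementary to that of the main section,

$$
\left|S_{21}^{\mathrm{main}}(\omega)\right|^{2} + \left|S_{21}^{\mathrm{aux}}(\omega)\right|^{2} \approx 1
\quad\Longrightarrow\quad
\left|S_{11}(\omega)\right|^{2} \approx 0 \ \ \text{for all } \omega,
$$

so that signal energy rejected by the main filtering section is dissipated in the auxiliary sections rather than reflected back towards the source.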
We assessed whether paternal demographic, anthropometric and clinical factors influence the risk of an infant being born large-for-gestational-age (LGA). We examined data on 3659 fathers of term offspring (including 662 LGA infants) born to primiparous women from the Screening for Pregnancy Endpoints (SCOPE) study. LGA was defined as birth weight >90th centile as per INTERGROWTH-21st standards, with the reference group being infants ⩽90th centile. Associations between paternal factors and the likelihood of an LGA infant were examined using univariable and multivariable models. Men who fathered LGA babies were 180 g heavier at birth (P<0.001) and were more likely to have been born macrosomic (P<0.001) than those whose infants were not LGA. Fathers of LGA infants were 2.1 cm taller (P<0.001) and 2.8 kg heavier (P<0.001), but had a similar body mass index (BMI). In multivariable models, increasing paternal birth weight and height were independently associated with greater odds of having an LGA infant, irrespective of maternal factors. A one-unit increase in paternal BMI was associated with 2.9% greater odds of having an LGA boy, but not an LGA girl; however, this association disappeared after adjustment for maternal BMI. There were no associations between paternal demographic factors or clinical history and infant LGA. In conclusion, fathers who were heavier at birth and taller were more likely to have an LGA infant, but maternal BMI had a dominant influence on LGA.
Herbicide active ingredients, formulation type, ambient temperature, and humidity can influence volatility. A method was developed using volatility chambers to compare the relative volatility of different synthetic auxin herbicide formulations in controlled environments. 2,4-D or dicamba acid vapors emanating after application were captured in air-sampling tubes at 24, 48, 72, and 96 h after herbicide application. The 2,4-D or dicamba was extracted from the sample tubes and quantified using liquid chromatography and tandem mass spectrometry. Volatility from 2,4-D dimethylamine (DMA) was determined to be greater than that of 2,4-D choline in chambers where temperatures were held at 30 or 40 C and relative humidity (RH) was 20% or 50%. The air concentration of 2,4-D DMA was 0.399 µg m⁻³ at 40 C and 20% RH compared with 0.005 µg m⁻³ for 2,4-D choline at the same temperature and humidity at 24 h after application. Volatility from both 2,4-D DMA and 2,4-D choline increased as temperature increased from 30 to 40 C; however, volatility from 2,4-D choline remained lower than that observed from 2,4-D DMA. Volatility from 2,4-D choline at 40 C increased from 0.00458 to 0.0263 µg m⁻³ and from 0.00341 to 0.025 µg m⁻³ when humidity increased from 20% to 50% at 72 and 96 h after treatment, respectively, whereas volatility from 2,4-D DMA tended to be higher at 20% RH than at 50% RH. The air concentration of dicamba diglycolamine was similar at all time points when measured at 40 C and 20% RH. By 96 h after treatment, there was a trend toward lower air concentrations of dicamba compared with earlier timings. This method using volatility chambers provided good repeatability, with low variability across replications, experiments, and herbicides.
The two major approaches to studying macroevolution in deep time are the fossil record and reconstructed relationships among extant taxa from molecular data. Results based on one approach sometimes conflict with those based on the other, with inconsistencies often attributed to inherent flaws of one (or the other) data source. Any contradiction between the molecular and fossil records represents a failure of our ability to understand the imperfections of our data, as both are limited reflections of the same evolutionary history. We therefore need to develop conceptual and mathematical models that jointly explain our observations in both records. Fortunately, the different limitations of each record provide an opportunity to test or calibrate the other, and new methodological developments leverage both records simultaneously. However, we must reckon with the distinct relationships between sampling and time in the fossil record and molecular phylogenies. These differences impact our recognition of baselines and the analytical incorporation of age estimate uncertainty.
SNP in the vitamin D receptor (VDR) gene have been associated with risk of lower respiratory infections. The influence of genetic variation in the vitamin D pathway on susceptibility to upper respiratory infections (URI) has not been investigated. We evaluated the influence of thirty-three SNP in eleven vitamin D pathway genes (DBP, DHCR7, RXRA, CYP2R1, CYP27B1, CYP24A1, CYP3A4, CYP27A1, LRP2, CUBN and VDR) on URI risk in 725 adults in London, UK, using an additive model with adjustment for potential confounders and correction for multiple comparisons. Significant associations in this cohort were investigated in a validation cohort of 737 children in Manchester, UK. In all, three SNP in VDR (rs4334089, rs11568820 and rs7970314) and one SNP in CYP3A4 (rs2740574) were associated with risk of URI in the discovery cohort after adjusting for potential confounders and correcting for multiple comparisons (adjusted incidence rate ratio per additional minor allele ≥1·15, P for trend ≤0·030). This association was replicated for rs4334089 in the validation cohort (P for trend = 0·048) but not for rs11568820, rs7970314 or rs2740574. Carriage of the minor allele of the rs4334089 SNP in VDR was associated with increased susceptibility to URI in both the child and adult cohorts in the United Kingdom.
Families of children born with CHD face added stress owing to uncertainty about the magnitude of the financial burden they will bear for medical costs. This study seeks to assess family responsibility for healthcare bills during the first 12 months of life for commercially insured children undergoing surgery for severe CHD.
The MarketScan® database from Truven was used to identify commercially insured infants in 39 states from 2010 to 2012 with an ICD-9 diagnosis code for transposition of the great arteries, tetralogy of Fallot, or truncus arteriosus, as well as the corresponding procedure code for complete repair. Data extraction identified payment responsibilities of the patients’ families in the form of co-payments, deductibles, and co-insurance during the 1st year of life.
In total, 481 infants met the inclusion criteria. The average family responsibility for healthcare bills during the 1st year of life was $2928, with no difference between the three groups. Out-of-pocket costs ranged from $50 to $18,167. The initial hospitalisation and outpatient care accounted for the majority of these responsibilities.
Families of commercially insured children with severe CHD requiring corrective surgery face an average of ~$3000 in out-of-pocket costs for healthcare bills during the first 12 months of their child’s life, although the amount varied considerably. This information provides a framework to alleviate some of the uncertainty surrounding healthcare financial responsibilities, and further examination of the origins of these expenditures may be useful in informing future healthcare policy discussions.
Access to transition-related medical interventions (TRMIs) for transgender veterans has been the subject of substantial public interest and debate. To better inform these important conversations, the current study investigated whether undergoing hormone or surgical transition intervention(s) relates to the frequency of recent suicidal ideation (SI) and symptoms of depression in transgender veterans.
This study included a cross-sectional, national sample of 206 self-identified transgender veterans. They self-reported basic demographics, TRMI history, recent SI, and symptoms of depression through an online survey.
When controlling for sample demographics (e.g., gender identity and annual income), veterans with a history of both hormone intervention and surgery on both the chest and genitals reported significantly lower levels of SI in the past year and the past 2 weeks than those who reported no medical intervention, hormone therapy without surgical intervention, or hormone therapy with surgery on either (but not both) the chest or genitals. Indirect effect analyses indicated that lower depressive symptoms in the last 2 weeks mediated the relationship between a history of surgery on both the chest and genitals and SI in the last 2 weeks.
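A highly simplified, hypothetical sketch of an indirect-effect (mediation) analysis of this kind is shown below; the variable names are placeholders, ordinary least squares is used for both paths, and the published analysis also adjusted for demographic covariates and may have used different estimators.

```python
# Simplified indirect-effect sketch: path a (predictor -> mediator), path b
# (mediator -> outcome, controlling for the predictor), indirect effect = a * b,
# with a percentile bootstrap interval. Column names are hypothetical placeholders.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

def indirect_effect(df):
    a = smf.ols("depression_2wk ~ surgery_both", data=df).fit().params["surgery_both"]
    b = smf.ols("si_2wk ~ depression_2wk + surgery_both", data=df).fit().params["depression_2wk"]
    return a * b

df = pd.read_csv("veteran_survey.csv")            # assumed survey extract
point = indirect_effect(df)
boot = [indirect_effect(df.sample(frac=1, replace=True)) for _ in range(2000)]
low, high = np.percentile(boot, [2.5, 97.5])      # bootstrap 95% CI
print(point, (low, high))
```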
Results indicate a potential protective effect of TRMIs on symptoms of depression and SI in transgender veterans, particularly when both the chest and genitals are aligned with one's gender identity. Implications for policymakers, providers, and researchers are discussed.
Accurate and reproducible patient positioning is a critical step in radiotherapy for breast cancer, and the use of permanent skin markings has therefore become standard practice in many centres. Permanent skin markings may have a negative impact on long-term cosmetic outcome, which may, in turn, have psychological implications in terms of body image. The aim of this study was to investigate the feasibility of using a semi-permanent tattooing device for the administration of skin marks for breast radiotherapy set-up.
Materials and methods
This was designed as a phase II, double-blinded, randomised controlled study comparing our standard permanent tattoos with the Precision Plus Micropigmentation (PPMS) device method. Patients referred for radical breast radiotherapy were eligible for the study. Each study participant had three marks applied using a randomised combination of the standard permanent and PPMS methods and was blinded to the type of each mark. Follow-up took place at routine appointments until 24 months post radiotherapy. Participants and a blinded assessor were invited to score the visibility of each tattoo at each follow-up using a Visual Analogue Scale. Tattoo scores at each time point, and the change in tattoo scores at 24 months, were analysed by a general linear model using the patient as a fixed effect and the type of tattoo (standard or research) as a covariate. A simple questionnaire was used to assess radiographer feedback on using the PPMS.
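For illustration only, the general linear model described above might be specified along these lines; the data file and column names are assumptions, not the study's actual records.

```python
# Sketch of a general linear model with patient as a fixed effect and tattoo type
# (standard vs research/PPMS) as a covariate; file and column names are hypothetical.
import pandas as pd
import statsmodels.formula.api as smf

scores = pd.read_csv("tattoo_vas_scores.csv")     # long format: one row per tattoo score

fit = smf.ols("vas_score ~ C(patient_id) + C(tattoo_type)", data=scores).fit()
print(fit.summary().tables[1])                    # the tattoo_type coefficient is the contrast of interest
```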
In total, 60 patients were recruited to the study, of whom 55 were available for follow-up at 24 months. Semi-permanent tattoos were more visible at 24 months than the permanent tattoos. Semi-permanent tattoos also demonstrated a greater degree of fade than the permanent tattoos at 24 months (the final time point) post completion of radiotherapy; this difference was not statistically significant, although it was more apparent for the patient scores (p=0·071) than for the blind assessor scores (p=0·27). No semi-permanent tattoos required re-marking before the end of radiotherapy and no adverse skin reactions were observed.
The PPMS presents a safe and feasible alternative to our permanent tattooing method. An extended period of follow-up is required to fully assess the extent of semi-permanent tattoo fade.
Experimental studies have shown that human macronutrient regulation minimizes variation in absolute protein intake, so that energy intake varies passively with dietary protein density (‘protein leverage’). According to the ‘protein leverage hypothesis’ (PLH), protein leverage interacts with a reduction in dietary protein density to drive energy overconsumption and obesity. The worldwide increase in consumption of ultra-processed foods (UPF) has been hypothesized to be an important determinant of dietary protein dilution, and consequently an ecological driving force of energy overconsumption and the obesity pandemic. The present study examined the relationships between the dietary contribution of UPF, dietary proportional protein content and the absolute intakes of protein and energy.
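In its simplest form (an idealisation for illustration, not an estimate from the present data), protein leverage implies that if absolute protein intake is defended at a roughly fixed target P* while dietary protein density p (protein as a proportion of energy) falls, total energy intake E rises as

$$
E \approx \frac{P^{*}}{p},
$$

so that, under complete leverage, diluting dietary protein from 18·2 % to 13·3 % of energy (the quintile range reported in the Results below) would be expected to raise energy intake by a factor of roughly 18·2/13·3 ≈ 1·37.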
Nationally representative cross-sectional study.
National Health and Nutrition Examination Survey 2009–2010.
Participants (n 9042) aged ≥2 years with at least one day of 24 h dietary recall data.
We found a strong inverse relationship between consumption of UPF and dietary protein density, with mean protein content dropping from 18·2 to 13·3 % between the lowest and highest quintiles of dietary contribution of UPF. Consistent with the PLH, an increase in the dietary contribution of UPF (previously shown to be inversely associated with protein density) was also associated with a rise in total energy intake, while absolute protein intake remained relatively constant.
The protein-diluting effect of UPF might be one mechanism accounting for their association with excess energy intake. Reducing UPF contribution in the US diet may be an effective way to increase its dietary protein concentration and prevent excessive energy intake.