We surveyed emergency department and urgent care clinicians to assess patterns of use and perceived usefulness of a local antibiotic stewardship application to deliver institution-specific prescribing guidance. Among 114 eligible respondents, the application was widely utilized, and it was perceived to be a useful clinical resource that improved prescribing.
Little is known about the neural substrates of suicide risk in mood disorders. Improving the identification of biomarkers of suicide risk, as indicated by a history of suicide-related behavior (SB), could lead to more targeted treatments to reduce risk.
Participants were 18 young adults with a mood disorder with a history of SB (as indicated by endorsing a past suicide attempt), 60 with a mood disorder with a history of suicidal ideation (SI) but not SB, 52 with a mood disorder with no history of SI or SB (MD), and 82 healthy comparison participants (HC). Resting-state functional connectivity within and between intrinsic neural networks, including cognitive control network (CCN), salience and emotion network (SEN), and default mode network (DMN), was compared between groups.
Several fronto-parietal regions (k > 57, p < 0.005) were identified in which individuals with SB demonstrated distinct patterns of connectivity within (in the CCN) and across networks (CCN-SEN and CCN-DMN). Connectivity with some of these same regions also distinguished the SB group when participants were re-scanned after 1–4 months. Extracted data defined SB group membership with good accuracy, sensitivity, and specificity (79–88%).
These results suggest that individuals with a history of SB in the context of mood disorders may show reliably distinct patterns of intrinsic network connectivity, even when compared to those with mood disorders without SB. Resting-state fMRI is a promising tool for identifying subtypes of patients with mood disorders who may be at risk for suicidal behavior.
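The accuracy, sensitivity, and specificity figures reported above (79–88%) are the standard confusion-matrix summaries of a classifier. As a minimal illustration of how they are derived — the counts below are hypothetical, chosen only to match the study's 18-versus-194 group split, and are not the study's actual confusion matrix:

```python
def classification_metrics(tp, fp, tn, fn):
    """Accuracy, sensitivity, and specificity from confusion-matrix counts."""
    accuracy = (tp + tn) / (tp + fp + tn + fn)
    sensitivity = tp / (tp + fn)   # true-positive rate: SB cases correctly flagged
    specificity = tn / (tn + fp)   # true-negative rate: non-SB cases correctly cleared
    return accuracy, sensitivity, specificity

# Hypothetical counts for an 18 SB vs. 194 non-SB split (illustrative only)
acc, sens, spec = classification_metrics(tp=15, fp=25, tn=169, fn=3)
```

With imbalanced groups such as these, sensitivity and specificity are more informative than raw accuracy, which is dominated by the larger non-SB group.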
Nonhuman primate (NHP) studies are crucial to biomedical research. NHPs are the species most similar to humans in lifespan, body size, and hormonal profiles. Planning research requires statistical power evaluation, which is difficult to perform when lacking directly relevant preliminary data. This is especially true for NHP developmental programming studies, which are scarce. We review the sample sizes reported, challenges, areas needing further work, and goals of NHP maternal nutritional programming studies. The literature search included 27 keywords, for example, maternal obesity, intrauterine growth restriction, maternal high-fat diet, and maternal nutrient reduction. Only fetal and postnatal offspring studies involving tissue collection or imaging were included. Twenty-eight studies investigated maternal over-nutrition and 33 under-nutrition; 23 involved macaques and 38 baboons. Analysis by sex was performed in 19 studies; minimum group size ranged from 1 to 8 (mean 4.7 ± 0.52, median 4, mode 3) and maximum group size from 3 to 16 (8.3 ± 0.93, 8, 8). Sexes were pooled in 42 studies; minimum group size ranged from 2 to 16 (mean 5.3 ± 0.35, median 6, mode 6) and maximum group size from 4 to 26 (10.2 ± 0.92, 8, 8). A typical study with sex-based analyses had group size minimum 4 and maximum 8 per sex. Among studies with sexes pooled, minimum group size averaged 6 and maximum 8. All studies reported some significant differences between groups. Therefore, studies with group sizes 3–8 can detect significance between groups. To address deficiencies in the literature, goals include increasing the age range studied, more frequently considering sex as a biological variable, expanding topics, replicating studies, exploring intergenerational effects, and examining interventions.
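The power evaluation this review motivates can be sketched with a simple normal approximation for a two-sided, two-group comparison. This is a rough back-of-the-envelope calculation, not the method used by the reviewed studies; effect size and alpha below are illustrative assumptions:

```python
from math import erf, sqrt

def approx_power_two_sample(d, n_per_group, z_crit=1.96):
    """Normal-approximation power for a two-sided two-sample comparison.

    d: standardized effect size (Cohen's d); n_per_group: animals per group;
    z_crit: critical value for the chosen alpha (1.96 for alpha = 0.05).
    A rough sketch; exact t-based power is slightly lower at small n.
    """
    phi = lambda x: 0.5 * (1.0 + erf(x / sqrt(2.0)))  # standard normal CDF
    return phi(d * sqrt(n_per_group / 2.0) - z_crit)

# With group sizes typical of the reviewed studies (4-8 per group),
# only large effects (here an assumed d = 1.5) reach reasonable power
p4 = approx_power_two_sample(d=1.5, n_per_group=4)
p8 = approx_power_two_sample(d=1.5, n_per_group=8)
```

The calculation makes the review's point concrete: groups of 3–8 animals can only reliably detect large between-group effects.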
Systematic, national surveillance of outbreaks of intestinal infectious disease has been undertaken by Public Health England (PHE) since 1992. Between 1992 and 2002, there were 19 outbreaks linked to raw drinking milk (RDM) or products made using raw milk, involving 229 people; 36 of these were hospitalised. There followed an eleven-year period (2003–2013) during which no outbreaks linked to RDM were reported. However, since 2014 seven outbreaks of Escherichia coli O157:H7 (n = 3) or Campylobacter jejuni (n = 4) caused by contaminated RDM have been investigated and reported. Between 2014 and 2017, there were 114 cases, five reported hospitalisations and one death. The data presented within this review indicate that the risk associated with consuming RDM has increased since 2014. Despite the labelling requirements and recommendations that children should not consume RDM, almost a third of outbreak cases were children. In addition, there has been an increase in consumer popularity and in registered RDM producers in the UK. The Food Standards Agency (FSA) continues to provide advice on RDM to consumers and has recently made additional recommendations to enhance existing controls around registration and hygiene of RDM producers.
Replicate radiocarbon (14C) measurements of organic and inorganic control samples, with known Fraction Modern values in the range Fm = 0–1.5 and mass range 6 μg–2 mg carbon, are used to determine both the mass and radiocarbon content of the blank carbon introduced during sample processing and measurement in our laboratory. These data are used to model, separately for organic and inorganic samples, the blank contribution and subsequently “blank correct” measured unknowns in the mass range 25–100 μg. Data, formulas, and an assessment of the precision and accuracy of the blank correction are presented.
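The mass-balance logic behind a constant-blank correction of this kind can be sketched as follows. The function and numbers are illustrative assumptions for a single blank source — the paper models organic and inorganic blanks separately, and this sketch is not its exact formulation:

```python
def blank_correct(fm_meas, m_sample, m_blank, fm_blank):
    """Recover the sample's Fraction Modern from a measurement that
    includes a constant-mass, constant-Fm blank contribution.

    Mass balance:
        fm_meas * (m_sample + m_blank) = fm_true * m_sample + fm_blank * m_blank
    solved for fm_true.
    """
    return (fm_meas * (m_sample + m_blank) - fm_blank * m_blank) / m_sample

# Illustrative numbers: a 50 ug carbon sample with a 1 ug blank of Fm = 0.5
fm_measured = (1.0 * 50 + 0.5 * 1) / 51   # what the measurement would report
fm_corrected = blank_correct(fm_measured, m_sample=50, m_blank=1, fm_blank=0.5)
# fm_corrected recovers the true value of 1.0
```

The correction matters most at the small end of the 25–100 μg mass range, where the blank is a proportionally larger fraction of the measured carbon.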
Shiga toxin-producing Escherichia coli (STEC) infection can cause serious illness including haemolytic uraemic syndrome. The role of socio-economic status (SES) in differential clinical presentation and exposure to potential risk factors amongst STEC cases has not previously been reported in England. We conducted an observational study using a dataset of all STEC cases identified in England, 2010–2015. Odds ratios for clinical characteristics of cases and foodborne, waterborne and environmental risk factors were estimated using logistic regression, stratified by SES, adjusting for baseline demographic factors. Incidence was higher in the highest SES group compared to the lowest (RR 1.54, 95% CI 1.19–2.00). Odds of Accident and Emergency attendance (OR 1.35, 95% CI 1.10–1.75) and hospitalisation (OR 1.71, 95% CI 1.36–2.15) because of illness were higher in the most disadvantaged compared to the least, suggesting potential lower ascertainment of milder cases or delayed care-seeking behaviour in disadvantaged groups. Advantaged individuals were significantly more likely to report salad/fruit/vegetable/herb consumption (OR 1.59, 95% CI 1.16–2.17), non-UK or UK travel (OR 1.76, 95% CI 1.40–2.27; OR 1.85, 95% CI 1.35–2.56) and environmental exposures (walking in a paddock, OR 1.82, 95% CI 1.22–2.70; soil contact, OR 1.52, 95% CI 1.09–2.13) suggesting other unmeasured risks, such as person-to-person transmission, could be more important in the most disadvantaged group.
Oats can be processed in a variety of ways ranging from minimally processed such as steel-cut oats (SCO), to mildly processed such as large-flake oats (old fashioned oats, OFO), moderately processed such as instant oats (IO) or highly processed in ready-to-eat oat cereals such as Honey Nut Cheerios (HNC). Although processing is believed to increase glycaemic and insulinaemic responses, the effect of oat processing in these respects is unclear. Thus, we compared the glycaemic and insulinaemic responses elicited by 628 kJ portions of SCO, OFO, IO and HNC and a portion of Cream of Rice cereal (CR) containing the same amount of available-carbohydrate (23 g) as the oatmeals. Healthy males (n 18) and females (n 12) completed this randomised, cross-over trial. Blood was taken fasting and at intervals for 3 h following test-meal consumption. Glucose and insulin peak-rises and incremental AUC (iAUC) were subjected to repeated-measures ANOVA using Tukey’s test (two-sided P<0·05) to compare individual means. Glucose peak-rise (primary endpoint, mean (sem) mmol/l) after OFO, 2·19 (sem 0·11), was significantly less than after CR, 2·61 (sem 0·13); and glucose peak-rise after SCO, 1·93 (sem 0·13), was significantly less than after CR, HNC, 2·49 (sem 0·13) and IO 2·47 (sem 0·13). Glucose iAUC was significantly lower after SCO than CR and HNC. Insulin peak rise was similar among the test meals, but insulin iAUC was significantly less after SCO than IO. Thus, the results show that oat processing affects glycaemic and insulinaemic responses with lower responses associated with less processing.
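The incremental AUC (iAUC) used as an endpoint above is conventionally the trapezoidal area of rises above the fasting baseline. The sketch below uses a simple clipped-increment variant — an assumption, since published iAUC algorithms handle below-baseline segments in slightly different ways — with a toy glucose curve, not trial data:

```python
def incremental_auc(times, values, baseline):
    """Trapezoidal area of (value - baseline) over time, with negative
    increments clipped to zero. times in minutes, values in mmol/l."""
    rises = [max(v - baseline, 0.0) for v in values]
    auc = 0.0
    for i in range(1, len(times)):
        auc += (rises[i - 1] + rises[i]) / 2.0 * (times[i] - times[i - 1])
    return auc

# Toy curve: fasting 5.0 mmol/l, peak rise of 2.0 mmol/l at 30 min
iauc = incremental_auc([0, 30, 60, 90], [5.0, 7.0, 5.5, 5.0], baseline=5.0)
```

Because iAUC integrates the whole postprandial excursion, two meals with the same peak rise can still differ in iAUC, which is why the abstract reports both endpoints.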
Cougar Mountain Cave is located in Oregon's Fort Rock Basin. In 1958, avocationalist John Cowles excavated most of the cave's deposits and recovered abundant fiber, lithic, wood, and osseous artifacts. A crew from the University of California, Davis returned to the site in 1966 to evaluate the potential for further research, collecting additional lithic and fiber artifacts from disturbed deposits and in situ charcoal from apparently undisturbed deposits. Because Cowles took few notes or photographs, the Cougar Mountain Cave collection—most of which is housed at the Favell Museum in Klamath Falls, Oregon—has largely gone unstudied even though it contains diagnostic artifacts spanning the Holocene and, potentially, the terminal Pleistocene. We recently submitted charcoal and basketry from the site for radiocarbon dating, providing the first reliable sense of when Cougar Mountain Cave was first occupied. Our results indicate at least a Younger Dryas age for initial occupation. The directly dated basketry has provided new information about the age ranges and spatial distributions of diagnostic textile types in the northwestern Great Basin.
Recent commercialization of auxin herbicide–based weed control systems has led to increased off-target exposure of susceptible cotton cultivars to auxin herbicides. Off-target deposition of dilute concentrations of auxin herbicides can occur on cotton at any stage of growth. Field experiments were conducted at two locations in Mississippi from 2014 to 2016 to assess the response of cotton at various growth stages after exposure to a sublethal 2,4-D concentration of 8.3 g ae ha−1. Herbicide applications occurred weekly from 0 to 14 weeks after emergence (WAE). Cotton exposure to 2,4-D at 2 to 9 WAE resulted in up to 64% visible injury, whereas 2,4-D exposure 5 to 6 WAE resulted in machine-harvested yield reductions of 18% to 21%. Cotton maturity was delayed after exposure 2 to 10 WAE, and height was increased from exposure 6 to 9 WAE due to decreased fruit set after exposure. Total hand-harvested yield was reduced from 2,4-D exposure 3, 5 to 8, and 13 WAE. Growth stage at time of exposure influenced the distribution of yield by node and position. Yield on lower and inner fruiting sites generally decreased from exposure, and yield partitioned to vegetative or aborted positions and upper fruiting sites increased. Reductions in gin turnout, micronaire, fiber length, fiber-length uniformity, and fiber elongation were observed after exposure at certain growth stages, but the overall effects on fiber properties were small. These results indicate that cotton is most sensitive to low concentrations of 2,4-D during late vegetative and squaring growth stages.
Frequent calls to 911 and requests for emergency services by individuals place a costly burden on emergency response systems and emergency departments (EDs) in the United States. Many of the calls by these individuals are non-emergent exacerbations of chronic conditions and could be treated more effectively and cost efficiently through another health care service. Mobile integrated community health (MICH) programs present a possible partial solution to the over-utilization of emergency services by addressing factors which contribute to a patient’s likelihood of frequent Emergency Medical Services (EMS) use. To provide effective care to eligible individuals, MICH providers must have a working understanding of the common conditions they will encounter.
The purpose of this descriptive study was to evaluate the diagnosis prevalence and comorbidity among participants in the Queen Anne’s County (Maryland USA) MICH Program. This fundamental knowledge of the most common medical conditions within the MICH Program will inform future mobile integrated health programs and providers.
This study examined preliminary data from the MICH Program, as well as 2017 Maryland census data. It involved secondary analysis of de-identified patient records and descriptive statistical analysis of the disease prevalence, degree of comorbidity, insurance coverage, and demographic characteristics among 97 program participants. Diagnoses were grouped by their ICD-9 classification codes to determine the most common categories of medical conditions. Multiple linear regression models and chi-squared tests were used to assess the association between age, sex, race, ICD-9 diagnosis groups, and comorbidity among program enrollees.
Results indicated the most prevalent diagnoses included hypertension, high cholesterol, esophageal reflux, and diabetes mellitus. Additionally, 94.85% of MICH patients were comorbid; the number of comorbidities per patient ranged from one to 13 conditions, with a mean of 5.88 diagnoses per patient (SD=2.74).
Overall, patients in the MICH Program are decidedly medically complex and may be well-suited to additional community intervention to better manage their many conditions. The potential for MICH programs to simultaneously improve patient outcomes and reduce health care costs by expanding into the larger public health arena and addressing the needs of the most vulnerable citizens warrants further study.
Scharf BM, Bissell RA, Trevitt JL, Jenkins JL. Diagnosis Prevalence and Comorbidity in a Population of Mobile Integrated Community Health Care Patients. Prehosp Disaster Med. 2019;34(1):46–55.
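The chi-squared tests of association used in the methods above can be illustrated for a 2×2 contingency table with the standard shortcut formula. The counts and grouping below are invented for illustration and are not program data:

```python
def chi2_2x2(a, b, c, d):
    """Pearson chi-squared statistic for a 2x2 contingency table
    [[a, b], [c, d]], via the shortcut n(ad - bc)^2 / (product of margins)."""
    n = a + b + c + d
    return n * (a * d - b * c) ** 2 / ((a + b) * (c + d) * (a + c) * (b + d))

# Hypothetical: high/low comorbidity burden (rows) by older/younger age (columns)
chi2 = chi2_2x2(30, 20, 10, 40)
# Compare against the critical value 3.84 (df = 1, alpha = 0.05)
```

A statistic above the critical value indicates that comorbidity burden and the grouping variable are associated, which is the form of result the study's chi-squared tests assess.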
The introduction of auxin herbicide weed control systems has led to increased occurrence of crop injury in susceptible soybeans and cotton. Off-target exposure to sublethal concentrations of dicamba can occur at varying growth stages, which may affect crop response. Field experiments were conducted in Mississippi in 2014, 2015, and 2016 to characterize cotton response to a sublethal concentration of dicamba equivalent to 1/16X the labeled rate. Weekly applications of dicamba at 35 g ae ha−1 were made to separate sets of replicated plots immediately following planting until 14 wk after emergence (WAE). Exposure to dicamba from 1 to 9 WAE resulted in up to 32% visible injury, and exposure from 7 to 10 WAE delayed crop maturity. Exposure from 8 to 10 and 13 WAE led to increased cotton height, while an 18% reduction in machine-harvested yield resulted from exposure at 6 WAE. Cotton exposure at 3 to 9 WAE reduced the seed cotton weight partitioned to position 1 fruiting sites, while exposure at 3 to 6 WAE also reduced yield in position 2 fruiting sites. Exposure at 2, 3, and 5 to 7 WAE increased the percent of yield partitioned to vegetative branches. An increase in percent of yield partitioned to plants with aborted terminals occurred following exposure from 3 to 7 WAE and corresponded with reciprocal decreases in yield partitioned to positional fruiting sites. Minimal effects were observed on fiber quality, except for decreases in fiber length uniformity resulting from exposure at 9 and 10 WAE.
The American College of Cardiology Quality Network enables national benchmarking and collaborative quality improvement through vetted metrics. We describe here our initial experience with the Quality Network.
Quarterly data for metrics pertaining to chest pain, Kawasaki disease, tetralogy of Fallot, elevated body mass index, and others were shared with the collaboratives for benchmarking. National improvement efforts focussed on counselling for elevated body mass index and 22q11.2 testing in tetralogy of Fallot. Improvement strategies included developing multi-disciplinary workgroups, educational materials, and electronic health record advances.
Chest pain metric performance was high compared with national means: obtaining family history (90–100% versus 51–77%), electrocardiogram (100% versus 89–99%), and echocardiogram for exertional complaints (95–100% versus 74–96%). Kawasaki metric performance was high, including obtaining coronary measurements (100% versus 85–97%), prescribing aspirin (100% versus 86–99%), follow-up with imaging (100% versus 85–98%), and documenting no activity restriction without coronary aneurysms (83–100% versus 64–93%). Counselling for elevated body mass index was variable (25–75% versus 31–50%) throughout quality improvement efforts. Testing for 22q11.2 deletion in tetralogy of Fallot patients was consistently above the national mean (60–85% versus 54–68%) with improved genetics data capture.
The Quality Network promotes meaningful benchmarking and collaborative quality improvement. Our high performance for chest pain and Kawasaki metrics is likely related to previous improvement efforts in chest pain management and a dedicated Kawasaki team. Uptake of counselling for elevated body mass index is variable; stronger engagement among numerous providers is needed. Recommendations for 22q11.2 testing in tetralogy of Fallot were widely recognised and implemented.
Tuberculosis (TB) is the leading global infectious cause of death. Understanding TB transmission is critical to creating policies and monitoring the disease with the end goal of TB elimination. To our knowledge, there has been no systematic review of key transmission parameters for TB. We carried out a systematic review of the published literature to identify studies estimating either of the two key TB transmission parameters: the serial interval (SI) and the reproductive number. We identified five publications that estimated the SI and 56 publications that estimated the reproductive number. The SI estimates from four studies were: 0.57, 1.42, 1.44 and 1.65 years; the fifth paper presented age-specific estimates ranging from 20 to 30 years (for infants <1 year old) to <5 years (for adults). The reproductive number estimates ranged from 0.24 in the Netherlands (during 1933–2007) to 4.3 in China in 2012. We found a limited number of publications and many high TB burden settings were not represented. Certain features of TB dynamics, such as slow transmission, complicate parameter estimation and require novel methods. Additional efforts to estimate these parameters for TB are needed so that we can monitor and evaluate interventions designed to achieve TB elimination.
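The link between the two parameters reviewed above can be sketched in its simplest form: under exponential epidemic growth with a fixed (point-mass) serial interval, the Lotka–Euler relation reduces to R ≈ exp(r·T). This toy special case is an assumption for illustration — the reviewed studies use a variety of more sophisticated estimators and full SI distributions:

```python
from math import exp, log

def reproductive_number(growth_rate, serial_interval):
    """Reproductive number under exponential growth with a point-mass
    serial interval (both in years): the simplest Lotka-Euler special case."""
    return exp(growth_rate * serial_interval)

# A stable epidemic (zero growth) implies R = 1 regardless of the SI
r_stable = reproductive_number(0.0, 1.44)
# A 5%-per-year decline in incidence with a 1.44-year SI implies R just below 1
r_decline = reproductive_number(log(0.95), 1.44)
```

This also shows why TB's long and variable serial interval complicates estimation: the same observed growth rate maps to very different R values depending on the assumed SI.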
Smartphones are increasingly used to access clinical decision support, and many medical applications provide antimicrobial prescribing guidance. However, these applications do not account for local antibiotic resistance patterns and formularies. We implemented an institution-specific antimicrobial stewardship smartphone application and studied patterns of use over a 1-year period.
Konjac glucomannan (KGM) is a viscous dietary fibre that can form a solid, low-energy gel when hydrated and is commonly consumed in a noodle form (KGM-gel). Increased meal viscosity from gel-forming fibres has been associated with prolonged satiety, but no studies to date have evaluated this effect with KGM-gel. Thus, our objective was to evaluate subsequent food intake and satiety of KGM-gel noodles when replacing a high-carbohydrate preload, in a dose–response manner. Utilising a randomised, controlled, cross-over design, sixteen healthy individuals (twelve females/four males; age: 26·0 (sd 11·8) years; BMI: 23·1 (sd 3·2) kg/m2) received 325 ml volume-matched preloads of three KGM-gel noodle substitution levels: (i) all pasta with no KGM-gel (1849 kJ (442 kcal), control), half pasta and half KGM-gel (1084 kJ (259 kcal), 50-KGM) or no pasta and all KGM-gel (322 kJ (77 kcal), 100-KGM). Satiety was assessed over 90 min followed by an ad libitum dessert. Compared with control, cumulative energy intake was 47 % (−1761 kJ (−421 kcal)) and 23 % (−841 kJ (−201 kcal)) lower for 100-KGM and 50-KGM, respectively (both P<0·001), but no differences in subsequent energy intake were observed. Ratings of hunger were 31 % higher (P=0·03) for 100-KGM when compared with control, and were 19 % lower (P=0·04) for fullness and 28 % higher (P=0·04) for prospective consumption when comparing 100-KGM to 50-KGM. Palatability was similar across all treatments. Replacement of a high-carbohydrate preload with low-energy KGM-gel noodles did not promote additional food intake despite large differences in energy. The energy deficit incurred from partial KGM-gel substitution may have relevance in weight loss regimens, and should be further evaluated beyond the healthy population.
Because the Anthropocene by definition is an epoch during which environmental change is largely anthropogenic and driven by social, economic, psychological and political forces, environmental social scientists can effectively analyse human behaviour and knowledge systems in this context. In this subject review, we summarize key ways in which the environmental social sciences can better inform fisheries management policy and practice and marine conservation in the Anthropocene. We argue that environmental social scientists are particularly well positioned to synergize research to fill the gaps between: (1) local behaviours/needs/worldviews and marine resource management and biological conservation concerns; and (2) large-scale drivers of planetary environmental change (globalization, affluence, technological change, etc.) and local cognitive, socioeconomic, cultural and historical processes that shape human behaviour in the marine environment. To illustrate this, we synthesize the roles of various environmental social science disciplines in better understanding the interaction between humans and tropical marine ecosystems in developing nations where issues arising from human–coastal interactions are particularly pronounced. We focus on: (1) the application of the environmental social sciences in marine resource management and conservation; (2) the development of ‘new’ socially equitable marine conservation; (3) repopulating the seascape; (4) incorporating multi-scale dynamics of marine social–ecological systems; and (5) envisioning the future of marine resource management and conservation for producing policies and projects for comprehensive and successful resource management and conservation in the Anthropocene.
Management of volunteer horseradish is a challenge when it is grown in rotation with other crops, such as corn and soybean. Although volunteer horseradish may not cause yield loss, these plants serve as hosts for various soilborne pathogens that damage subsequent horseradish crops. In addition to volunteer horseradish, glyphosate-resistant Palmer amaranth is becoming difficult to control in southwestern Illinois, as a consequence of the plant’s ability to withstand glyphosate and drought, produce many seeds, and grow rapidly. Field studies were conducted to evaluate the effect of glyphosate and dicamba on volunteer horseradish and Palmer amaranth control in 2014 and 2015. Glyphosate alone (1,265 and 1,893 g ae ha−1) and glyphosate plus dicamba at the high rate (1,680 g ae ha−1) provided the greatest volunteer horseradish control, ranging from 81% to 89% and 90% to 93%, respectively. Measures of root biomass reduction followed similar trends. Glyphosate alone provided the greatest reduction in volunteer horseradish root viability (79% to 100%) but was similar in efficacy to applications of glyphosate plus dicamba in most comparisons. Efficacy of PRE-only applications on Palmer amaranth control ranged from 92% to 99% control in 2014 and 68% to 99% in 2015. However, PRE-only applications were often similar in efficacy to PRE followed by (fb) glyphosate plus dicamba applied POST. Treatments containing flumioxazin did not control Palmer amaranth as well as other treatments. POST applications alone were not effective in managing Palmer amaranth. Many of the PRE fb POST treatment options tested will improve resistance management over PRE-only treatments, provide control of Palmer amaranth, and allow horseradish to be planted the following spring.
Fetal growth restriction (FGR) and preterm birth are frequent co-morbidities, and both are independent risk factors for brain injury. However, few studies have examined the mechanisms by which preterm FGR increases the risk of adverse neurological outcomes. We aimed to determine the effects of prematurity and mechanical ventilation (VENT) on the brain of FGR and appropriately grown (AG, control) lambs. We hypothesized that FGR preterm lambs are more vulnerable to ventilation-induced acute brain injury. FGR was surgically induced in fetal sheep (0.7 gestation) by ligation of a single umbilical artery. After 4 weeks, preterm lambs were euthanized at delivery or delivered and ventilated for 2 h before euthanasia. Brains and cerebrospinal fluid (CSF) were collected for analysis of molecular and structural indices of early brain injury. FGRVENT lambs had increased oxidative cell damage and brain injury marker S100B levels compared with all other groups. Mechanical ventilation increased inflammatory marker IL-8 within the brain of FGRVENT and AGVENT lambs. Abnormalities in the neurovascular unit and increased blood–brain barrier permeability were observed in FGRVENT lambs, as well as an altered density of vascular tight junction markers. FGR and AG preterm lambs have different responses to acute injurious mechanical ventilation, changes which appear to have been developmentally programmed in utero.