Patients with single-ventricle CHD undergo a series of palliative surgeries that culminate in the Fontan procedure. While the Fontan procedure allows most patients to survive to adulthood, the Fontan circulation can eventually lead to multiple cardiac complications and multi-organ dysfunction. Care for adolescents and adults with a Fontan circulation has begun to transition from a primarily cardiac-focused model to care models designed to monitor multiple organ systems and, using clues from this screening, to identify patients at risk for adverse outcomes. The complexity of care required for these patients led our centre to develop a multidisciplinary Fontan Management Programme with the primary goals of earlier detection and treatment of complications through the development of a cohesive network of diverse medical subspecialists with Fontan expertise.
Little is known about the neural substrates of suicide risk in mood disorders. Improving the identification of biomarkers of suicide risk, as indicated by a history of suicide-related behavior (SB), could lead to more targeted treatments to reduce risk.
Participants were 18 young adults with a mood disorder with a history of SB (as indicated by endorsing a past suicide attempt), 60 with a mood disorder with a history of suicidal ideation (SI) but not SB, 52 with a mood disorder with no history of SI or SB (MD), and 82 healthy comparison participants (HC). Resting-state functional connectivity within and between intrinsic neural networks, including cognitive control network (CCN), salience and emotion network (SEN), and default mode network (DMN), was compared between groups.
Several fronto-parietal regions (k > 57, p < 0.005) were identified in which individuals with SB demonstrated distinct patterns of connectivity within (in the CCN) and across networks (CCN-SEN and CCN-DMN). Connectivity with some of these same regions also distinguished the SB group when participants were re-scanned after 1–4 months. Extracted data defined SB group membership with good accuracy, sensitivity, and specificity (79–88%).
These results suggest that individuals with a history of SB in the context of mood disorders may show reliably distinct patterns of intrinsic network connectivity, even when compared to those with mood disorders without SB. Resting-state fMRI is a promising tool for identifying subtypes of patients with mood disorders who may be at risk for suicidal behavior.
Replicate radiocarbon (14C) measurements of organic and inorganic control samples, with known Fraction Modern values in the range Fm = 0–1.5 and mass range 6 μg–2 mg carbon, are used to determine both the mass and radiocarbon content of the blank carbon introduced during sample processing and measurement in our laboratory. These data are used to model, separately for organic and inorganic samples, the blank contribution and subsequently “blank correct” measured unknowns in the mass range 25–100 μg. Data, formulas, and an assessment of the precision and accuracy of the blank correction are presented.
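The mass-balance idea behind such a blank correction can be sketched as follows. This is an illustration only, assuming a simple two-component model with a constant blank mass and Fraction Modern; the function name, argument units, and example values are assumptions and not the laboratory's actual fitted model, which is determined separately for organic and inorganic samples.

```python
def blank_correct(fm_meas, m_meas_ug, fm_blank, m_blank_ug):
    """Recover a sample's Fraction Modern from a measurement that
    includes a constant-mass blank, via the two-component mass balance:
    Fm_meas * m_meas = Fm_sample * m_sample + Fm_blank * m_blank.
    Masses are in micrograms of carbon."""
    m_sample_ug = m_meas_ug - m_blank_ug
    if m_sample_ug <= 0:
        raise ValueError("blank mass must be smaller than measured mass")
    return (fm_meas * m_meas_ug - fm_blank * m_blank_ug) / m_sample_ug

# e.g. a 50 ug measurement reading Fm = 0.500, with a 1 ug blank of Fm = 0.300
corrected = blank_correct(0.500, 50.0, 0.300, 1.0)
```

Because the hypothetical blank here is less modern than the measurement, the corrected value comes out slightly above the measured Fm; the smaller the sample, the larger this shift, which is why the correction matters most in the 25–100 μg range.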
Shiga toxin-producing Escherichia coli (STEC) infection can cause serious illness including haemolytic uraemic syndrome. The role of socio-economic status (SES) in differential clinical presentation and exposure to potential risk factors amongst STEC cases has not previously been reported in England. We conducted an observational study using a dataset of all STEC cases identified in England, 2010–2015. Odds ratios for clinical characteristics of cases and foodborne, waterborne and environmental risk factors were estimated using logistic regression, stratified by SES, adjusting for baseline demographic factors. Incidence was higher in the highest SES group compared to the lowest (RR 1.54, 95% CI 1.19–2.00). Odds of Accident and Emergency attendance (OR 1.35, 95% CI 1.10–1.75) and hospitalisation (OR 1.71, 95% CI 1.36–2.15) because of illness were higher in the most disadvantaged compared to the least, suggesting potential lower ascertainment of milder cases or delayed care-seeking behaviour in disadvantaged groups. Advantaged individuals were significantly more likely to report salad/fruit/vegetable/herb consumption (OR 1.59, 95% CI 1.16–2.17), non-UK or UK travel (OR 1.76, 95% CI 1.40–2.27; OR 1.85, 95% CI 1.35–2.56) and environmental exposures (walking in a paddock, OR 1.82, 95% CI 1.22–2.70; soil contact, OR 1.52, 95% CI 1.09–2.13) suggesting other unmeasured risks, such as person-to-person transmission, could be more important in the most disadvantaged group.
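The odds ratios above come from logistic regression adjusted for demographic factors; as a minimal sketch of the underlying quantity, the unadjusted odds ratio and Wald confidence interval from a 2x2 exposure table can be computed directly. The counts in the example are invented for illustration and are not the study's data.

```python
import math

def odds_ratio_wald(a, b, c, d, z=1.96):
    """Unadjusted odds ratio with a Wald 95% CI from a 2x2 table:
    a = exposed cases, b = unexposed cases,
    c = exposed non-cases, d = unexposed non-cases."""
    or_ = (a * d) / (b * c)
    se_log = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)
    lo = math.exp(math.log(or_) - z * se_log)
    hi = math.exp(math.log(or_) + z * se_log)
    return or_, lo, hi

# hypothetical counts: 10 exposed cases, 5 unexposed cases,
# 5 exposed non-cases, 10 unexposed non-cases
or_, lo, hi = odds_ratio_wald(10, 5, 5, 10)
```

A CI that excludes 1 (as in several of the estimates reported above) indicates a statistically significant association at the chosen level, though the adjusted regression estimates remain the ones to interpret.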
Oats can be processed in a variety of ways ranging from minimally processed such as steel-cut oats (SCO), to mildly processed such as large-flake oats (old fashioned oats, OFO), moderately processed such as instant oats (IO) or highly processed in ready-to-eat oat cereals such as Honey Nut Cheerios (HNC). Although processing is believed to increase glycaemic and insulinaemic responses, the effect of oat processing in these respects is unclear. Thus, we compared the glycaemic and insulinaemic responses elicited by 628 kJ portions of SCO, OFO, IO and HNC and a portion of Cream of Rice cereal (CR) containing the same amount of available-carbohydrate (23 g) as the oatmeals. Healthy males (n 18) and females (n 12) completed this randomised, cross-over trial. Blood was taken fasting and at intervals for 3 h following test-meal consumption. Glucose and insulin peak-rises and incremental AUC (iAUC) were subjected to repeated-measures ANOVA using Tukey’s test (two-sided P<0·05) to compare individual means. Glucose peak-rise (primary endpoint, mean (sem) mmol/l) after OFO, 2·19 (sem 0·11), was significantly less than after CR, 2·61 (sem 0·13); and glucose peak-rise after SCO, 1·93 (sem 0·13), was significantly less than after CR, HNC, 2·49 (sem 0·13) and IO 2·47 (sem 0·13). Glucose iAUC was significantly lower after SCO than CR and HNC. Insulin peak rise was similar among the test meals, but insulin iAUC was significantly less after SCO than IO. Thus, the results show that oat processing affects glycaemic and insulinaemic responses with lower responses associated with less processing.
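The incremental AUC (iAUC) used above can be sketched with a simplified trapezoidal computation over the rise above the fasting baseline. This is an approximation for illustration: the standard glycaemic-response method handles intervals that cross the baseline geometrically, whereas this sketch simply clips dips below baseline to zero, and the example values are invented.

```python
def incremental_auc(times_min, values):
    """Simplified incremental AUC: trapezoidal area of the response
    above the fasting baseline (taken as values[0], the t=0 sample).
    Excursions below baseline contribute zero rather than negative area."""
    baseline = values[0]
    inc = [max(v - baseline, 0.0) for v in values]
    auc = 0.0
    for i in range(1, len(times_min)):
        auc += (inc[i - 1] + inc[i]) / 2.0 * (times_min[i] - times_min[i - 1])
    return auc

# hypothetical glucose curve (mmol/l) sampled at 0, 30, and 60 min
iauc = incremental_auc([0, 30, 60], [5.0, 7.0, 5.0])
```

Using the area above baseline rather than the total area is what makes iAUC sensitive to the postprandial excursion itself, which is the quantity the oat-processing comparison turns on.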
Cougar Mountain Cave is located in Oregon's Fort Rock Basin. In 1958, avocationalist John Cowles excavated most of the cave's deposits and recovered abundant fiber, lithic, wood, and osseous artifacts. A crew from the University of California, Davis returned to the site in 1966 to evaluate the potential for further research, collecting additional lithic and fiber artifacts from disturbed deposits and in situ charcoal from apparently undisturbed deposits. Because Cowles took few notes or photographs, the Cougar Mountain Cave collection—most of which is housed at the Favell Museum in Klamath Falls, Oregon—has largely gone unstudied even though it contains diagnostic artifacts spanning the Holocene and, potentially, the terminal Pleistocene. We recently submitted charcoal and basketry from the site for radiocarbon dating, providing the first reliable sense of when Cougar Mountain Cave was first occupied. Our results indicate at least a Younger Dryas age for initial occupation. The directly dated basketry has provided new information about the age ranges and spatial distributions of diagnostic textile types in the northwestern Great Basin.
Recent commercialization of auxin herbicide–based weed control systems has led to increased off-target exposure of susceptible cotton cultivars to auxin herbicides. Off-target deposition of dilute concentrations of auxin herbicides can occur on cotton at any stage of growth. Field experiments were conducted at two locations in Mississippi from 2014 to 2016 to assess the response of cotton at various growth stages after exposure to a sublethal 2,4-D concentration of 8.3 g ae ha−1. Herbicide applications occurred weekly from 0 to 14 weeks after emergence (WAE). Cotton exposure to 2,4-D at 2 to 9 WAE resulted in up to 64% visible injury, whereas 2,4-D exposure 5 to 6 WAE resulted in machine-harvested yield reductions of 18% to 21%. Cotton maturity was delayed after exposure 2 to 10 WAE, and height was increased from exposure 6 to 9 WAE due to decreased fruit set after exposure. Total hand-harvested yield was reduced from 2,4-D exposure 3, 5 to 8, and 13 WAE. Growth stage at time of exposure influenced the distribution of yield by node and position. Yield on lower and inner fruiting sites generally decreased from exposure, and yield partitioned to vegetative or aborted positions and upper fruiting sites increased. Reductions in gin turnout, micronaire, fiber length, fiber-length uniformity, and fiber elongation were observed after exposure at certain growth stages, but the overall effects on fiber properties were small. These results indicate that cotton is most sensitive to low concentrations of 2,4-D during late vegetative and squaring growth stages.
Frequent calls to 911 and requests for emergency services by individuals place a costly burden on emergency response systems and emergency departments (EDs) in the United States. Many of the calls by these individuals are non-emergent exacerbations of chronic conditions and could be treated more effectively and cost efficiently through another health care service. Mobile integrated community health (MICH) programs present a possible partial solution to the over-utilization of emergency services by addressing factors which contribute to a patient’s likelihood of frequent Emergency Medical Services (EMS) use. To provide effective care to eligible individuals, MICH providers must have a working understanding of the common conditions they will encounter.
The purpose of this descriptive study was to evaluate the diagnosis prevalence and comorbidity among participants in the Queen Anne’s County (Maryland USA) MICH Program. This fundamental knowledge of the most common medical conditions within the MICH Program will inform future mobile integrated health programs and providers.
This study examined preliminary data from the MICH Program, as well as 2017 Maryland census data. It involved secondary analysis of de-identified patient records and descriptive statistical analysis of the disease prevalence, degree of comorbidity, insurance coverage, and demographic characteristics among 97 program participants. Diagnoses were grouped by their ICD-9 classification codes to determine the most common categories of medical conditions. Multiple linear regression models and chi-squared tests were used to assess the association between age, sex, race, ICD-9 diagnosis groups, and comorbidity among program enrollees.
Results indicated the most prevalent diagnoses included hypertension, high cholesterol, esophageal reflux, and diabetes mellitus. Additionally, 94.85% of MICH patients were comorbid; the number of comorbidities per patient ranged from one to 13 conditions, with a mean of 5.88 diagnoses per patient (SD=2.74).
Overall, patients in the MICH Program are decidedly medically complex and may be well-suited to additional community intervention to better manage their many conditions. The potential for MICH programs to simultaneously improve patient outcomes and reduce health care costs by expanding into larger public health roles and addressing the needs of the most vulnerable citizens warrants further study.
Scharf BM, Bissell RA, Trevitt JL, Jenkins JL. Diagnosis Prevalence and Comorbidity in a Population of Mobile Integrated Community Health Care Patients. Prehosp Disaster Med. 2019;34(1):46–55.
The introduction of auxin herbicide weed control systems has led to increased occurrence of crop injury in susceptible soybeans and cotton. Off-target exposure to sublethal concentrations of dicamba can occur at varying growth stages, which may affect crop response. Field experiments were conducted in Mississippi in 2014, 2015, and 2016 to characterize cotton response to a sublethal concentration of dicamba equivalent to 1/16X the labeled rate. Weekly applications of dicamba at 35 g ae ha−1 were made to separate sets of replicated plots immediately following planting until 14 wk after emergence (WAE). Exposure to dicamba from 1 to 9 WAE resulted in up to 32% visible injury, and exposure from 7 to 10 WAE delayed crop maturity. Exposure from 8 to 10 and 13 WAE led to increased cotton height, while an 18% reduction in machine-harvested yield resulted from exposure at 6 WAE. Cotton exposure at 3 to 9 WAE reduced the seed cotton weight partitioned to position 1 fruiting sites, while exposure at 3 to 6 WAE also reduced yield in position 2 fruiting sites. Exposure at 2, 3, and 5 to 7 WAE increased the percent of yield partitioned to vegetative branches. An increase in percent of yield partitioned to plants with aborted terminals occurred following exposure from 3 to 7 WAE and corresponded with reciprocal decreases in yield partitioned to positional fruiting sites. Minimal effects were observed on fiber quality, except for decreases in fiber length uniformity resulting from exposure at 9 and 10 WAE.
Fetal growth restriction (FGR) and preterm birth are frequent co-morbidities, and both are independent risk factors for brain injury. However, few studies have examined the mechanisms by which preterm FGR increases the risk of adverse neurological outcomes. We aimed to determine the effects of prematurity and mechanical ventilation (VENT) on the brain of FGR and appropriately grown (AG, control) lambs. We hypothesized that FGR preterm lambs are more vulnerable to ventilation-induced acute brain injury. FGR was surgically induced in fetal sheep (0.7 gestation) by ligation of a single umbilical artery. After 4 weeks, preterm lambs were euthanized at delivery or delivered and ventilated for 2 h before euthanasia. Brains and cerebrospinal fluid (CSF) were collected for analysis of molecular and structural indices of early brain injury. FGRVENT lambs had increased oxidative cell damage and brain injury marker S100B levels compared with all other groups. Mechanical ventilation increased inflammatory marker IL-8 within the brain of FGRVENT and AGVENT lambs. Abnormalities in the neurovascular unit and increased blood–brain barrier permeability were observed in FGRVENT lambs, as well as an altered density of vascular tight junction markers. FGR and AG preterm lambs have different responses to acute injurious mechanical ventilation, changes which appear to have been developmentally programmed in utero.
Oregon's Fort Rock Cave is iconic in respect to both the archaeology of the northern Great Basin and the history of debate about when the Great Basin was colonized. In 1938, Luther Cressman recovered dozens of sagebrush bark sandals from beneath Mt. Mazama ash that were later radiocarbon dated to between 10,500 and 9350 cal B.P. In 1970, Stephen Bedwell reported finding lithic tools associated with a date of more than 15,000 cal B.P., a date dismissed as unreasonably old by most researchers. Now, with evidence of a nearly 15,000-year-old occupation at the nearby Paisley Five Mile Point Caves, we returned to Fort Rock Cave to evaluate the validity of Bedwell's claim, assess the stratigraphic integrity of remaining deposits, and determine the potential for future work at the site. Here, we report the results of additional fieldwork at Fort Rock Cave undertaken in 2015 and 2016, which supports the early Holocene occupation, but does not confirm a pre–10,500 cal B.P. human presence.
A 10-year descriptive analysis of morbidity and mortality associated with water-related activities in the Top End, Northern Territory (NT), Australia.
An outdoor, water-orientated lifestyle characterises the Top End due to its tropical climate, lengthy coastline, many inland-waterways, and common domestic-pool ownership. However, the water holds many dangers: from drowning to the prospect of crocodile attacks.
Data were retrospectively collected from two sources: the Trauma Registry (TR), Royal Darwin Hospital, NT and the National Coronial Information System. Inclusion criteria: all mortality or injury with an Injury Severity Score (ISS) ≥9 from water-related activity in the Top End. Exclusion criteria: envenomation. Data included: demographics, geographical location, time/mechanism of injury, injury narrative/outcome, alcohol consumption, ISS, and Indigenous race.
Ninety-five deaths occurred from 1/1/2005–12/31/2014; 87 prehospital (92%). The leading three mechanisms of injury for the 138 TR admissions were drowning (40%), falling/diving (35%), and watercraft events (14%). Median age was 27 (range 0–90); 78% were males. There were 74 children (<16 years), including 20 deaths. Indigenous Australians represent 30% of the NT population, but had 43% of deaths and 12% of admissions. Deaths from crocodile attacks are increasing, with 14 deaths from 2005–2014 compared to 10 deaths from 1971–2004 (Caldicutt). Alcohol was recorded in 31% of admissions and 52% of deaths in those aged >16. The Top End’s crude rate of drowning averaged over 10 years was 4.36/100,000/annum, compared to 1.31/100,000/annum in Australia.
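A crude rate of the kind reported above is a simple ratio of events to person-years at risk. The sketch below shows the arithmetic; the counts and population figure are illustrative placeholders, not the Top End's actual denominators.

```python
def crude_rate_per_100k(events, population, years):
    """Crude annual event rate per 100,000 population, assuming a
    stable population over the observation period."""
    return events / (population * years) * 100_000

# Illustrative only: 44 drownings over 10 years in a population of
# 100,000 gives a crude rate of 4.4 per 100,000 per annum, the order
# of magnitude reported for the Top End.
rate = crude_rate_per_100k(44, 100_000, 10)
```

Crude rates like this are unadjusted for age structure, which matters when comparing a young population such as the NT's against the national figure.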
Alcohol plays a major role in the Top End’s water-related harm, associated with all mechanisms and over one-half of adult deaths. Also striking is the increase in crocodile fatalities, possibly caused by population recovery from endangered to plentiful since hunting ceased in 1971. Local authorities and advocates promote water-safety and crocodile-awareness programs. However, the lure of tropical waters combined with alcohol remains a risk to life and limb. Further public health campaigns focusing on these issues are called for.
Although the Great Basin of North America has produced some of the most robust and ancient fiber artifact assemblages in the world, many were recovered with poor chronological controls. Consequently, this class of artifacts has seldom been effectively incorporated into general discussions of early chronological and cultural patterns. In recent years, the Great Basin Textile Dating Project has accumulated direct AMS dates on textiles (bags, sandals, mats, cordage, and basketry) from dry caves in the Great Basin, particularly in the northern and western areas. We focus here on the terminal Pleistocene/early Holocene, to identify chronological patterns in this class of artifacts and to evaluate Adovasio’s characterization of the region’s earliest basketry as simple and undecorated. New AMS dates now suggest that the region’s earliest people had sophisticated textile traditions that incorporated numerous decorative elaborations. Some distinctive structures, including Fort Rock sandals and weft-faced plaited textiles, have limited early temporal ranges and may serve as diagnostic indicators for terminal Pleistocene/early Holocene times. Other basketry forms and structures that appear by about 9000 cal B.P. persist into the historic period, suggesting a stronger thread of continuity (especially in the north) from this time than is apparent in lithic traditions.
Culture does not change because we desire to change it. Culture changes when the organization is transformed; the culture reflects the realities of people working together every day.
– Frances Hesselbein
Have you ever had a job you loved and felt empowered to fulfill your responsibilities? If so, what was it about your co-workers, your manager/supervisor, and your work environment that made your experience so positive? Perhaps you've never felt that way about a job and, instead, you've dreaded heading to work every morning. Your boss might have rarely recognized your efforts. It's possible you weren't sure how to perform your job, but felt uncomfortable asking for help. Your co-workers might have seemed like characters from the movie Mean Girls. In this perfect storm of the forces of disengagement, we suspect you didn't last too long at that job. Or you felt overwhelmed with too much to do, with too little support, as depicted in the illustration on the next page.
According to a 2013 survey, more than half of workers in the United States were dissatisfied with their jobs. This statistic is alarming; after all, we spend approximately one-third of our waking hours and energy at work, plus dissatisfied employees tend to find new employers. Because we spend so much of our time and energy at work, the organizational culture can have a profound impact on our lives and the lives of those around us. If work cultures support interdependent, prosocial behavior instead of individualism and competition, we believe the business world, indeed our everyday lives, will be more positive and productive for almost everyone.
Any organization's mission will benefit from employees who care about their work and their colleagues. It's a win-win scenario. What factors influence employee job satisfaction? Aside from the obvious – job security, pay, and benefits (e.g., health insurance) – employees report that feeling safe at work, having a positive relationship with their immediate supervisor, and communicating openly and cooperatively with other employees and senior management contribute significantly to their work satisfaction. The bad news: In many organizational cultures, managers/supervisors struggle with these very issues, resulting in unacceptably high rates of employee dissatisfaction and turnover and a climate of distrust.
Imagine these disgruntled employees as supervisors who are responsible for mentoring newly recruited employees.
Objectives: There is a well-known association between memory impairment and major depressive disorder (MDD). Additionally, recent studies are also showing resting-state functional magnetic resonance imaging (rsMRI) abnormalities in active and remitted MDD. However, no studies to date have examined both rs connectivity and memory performance in early course remitted MDD, nor the relationship between connectivity and semantically cued episodic memory. Methods: The rsMRI data from two 3.0 Tesla GE scanners were collected from 34 unmedicated young adults with remitted MDD (rMDD) and 23 healthy controls (HCs) between 18 and 23 years of age using bilateral seeds in the hippocampus. Participants also completed a semantically cued list-learning test, and their performance was correlated with hippocampal seed-based rsMRI. Regression models were also used to predict connectivity patterns from memory performance. Results: After correcting for sex, rMDD subjects performed worse than HCs on the total number of words recalled and recognized. rMDD demonstrated significant in-network hypoactivation between the hippocampus and multiple fronto-temporal regions, and multiple extra-network hyperconnectivities between the hippocampus and fronto-parietal regions when compared to HCs. Memory performance negatively predicted connectivity in HCs and positively predicted connectivity in rMDD. Conclusions: Even when individuals with a history of MDD are no longer displaying active depressive symptoms, they continue to demonstrate worse memory performance, disruptions in hippocampal connectivity, and a differential relationship between episodic memory and hippocampal connectivity. (JINS, 2016, 22, 225–239)
Recent meta-analyses of resting-state networks in major depressive disorder (MDD) implicate network disruptions underlying cognitive and affective features of illness. Heterogeneity of findings to date may stem from the relative lack of data parsing clinical features of MDD such as phase of illness and the burden of multiple episodes.
Resting-state functional magnetic resonance imaging data were collected from 17 active MDD and 34 remitted MDD patients, and 26 healthy controls (HCs) across two sites. Participants were medication-free and further subdivided into those with single v. multiple episodes to examine disease burden. Seed-based connectivity analyses were conducted using the posterior cingulate cortex (PCC) seed to probe the default mode network, as well as the amygdala and subgenual anterior cingulate cortex (sgACC) seeds to probe the salience network (SN).
Young adults with remitted MDD demonstrated hyperconnectivity of the left PCC to the left inferior frontal gyrus and of the left sgACC to the right ventromedial prefrontal cortex (PFC) and left hippocampus compared with HCs. Episode-independent effects were observed between the left PCC and the right dorsolateral PFC, as well as between the left amygdala and right insula and caudate, whereas the burden of multiple episodes was associated with hypoconnectivity of the left PCC to multiple cognitive control regions as well as hypoconnectivity of the amygdala to large portions of the SN.
This is the first study of a homogeneous sample of unmedicated young adults with a history of adolescent-onset MDD illustrating brain-based episodic features of illness.