The sternocleidomastoid can be used as a pedicled flap in head and neck reconstruction. It has previously been associated with high complication rates, likely due in part to the variable nature of its blood supply.
This article aims to provide clinicians with an up-to-date review of clinical outcomes of sternocleidomastoid flap surgery in head and neck reconstruction, integrated with a review of vascular anatomical studies of the sternocleidomastoid.
A literature search of the Medline and Web of Science databases was conducted. Complications were analysed for each study, and the trend in success rates was analysed by study date.
Reported complication rates have improved over time. The preservation of two vascular pedicles rather than one may have contributed to improved outcomes.
The sternocleidomastoid flap is a versatile option for patients where prolonged free flap surgery is inappropriate. Modern vascular imaging techniques could optimise pre-operative planning.
Six on-farm studies determined the effects of a rolled rye cover crop, herbicide program, and planting technique on cotton stand, weed control, and cotton yield in Georgia. Treatments included: (1) rye drilled broadcast with 19-cm row spacing and a broadcast-herbicide program; (2) rye drilled with a 25-cm rye-free zone in the cotton row and a broadcast-herbicide program; (3) rye drilled with a 25-cm rye-free zone in the cotton row with PPI and PRE herbicides banded in the cotton planting row; and (4) no cover crop (i.e., weedy cover) with broadcast herbicides. At two locations, cotton stand was lowest with rye drilled broadcast; at these sites the rye-free zone restored stand to a level equal to that of the no-cover system. At a third location, cover crop systems resulted in greater stand, due to enhanced soil moisture preservation compared with the no-cover system. Treatments did not influence cotton stand at the other three locations and did not differ in the control of weeds other than Palmer amaranth at any location. Treatments controlled Palmer amaranth equally at three locations; however, differences were observed at the three locations having the greatest glyphosate-resistant plant densities. For these locations, when broadcasting herbicides, Palmer amaranth populations were reduced 82% to 86% in the broadcast rye and rye-free zone systems compared with the no-cover system at harvest. The system with banded herbicides was nearly 21 times less effective than the similar system broadcasting herbicides. At these locations, yields in the rye broadcast and rye-free zone systems with broadcast herbicides were increased 9% to 16% compared with systems with no cover or a rye-free zone with PPI and PRE herbicides banded. A rolled rye cover crop can lessen weed emergence and selection pressure while improving weed control and cotton yield, but herbicides should be broadcast in fields heavily infested with glyphosate-resistant Palmer amaranth.
Chemical bonding in native oxides of GaAs, before and after etching, is detected by X-Ray Photoelectron Spectroscopy (XPS). It is correlated with surface energy engineering (SEE), measured via Three Liquid Contact Angle Analysis (3LCAA), and oxygen coverage, measured by High Resolution Ion Beam Analysis (HR-IBA).
Before etching, GaAs native oxides are found to be hydrophobic, with an average surface energy, γT, of 33 ± 1 mJ/m², as measured by 3LCAA. After dilute NH4OH etching, GaAs becomes highly hydrophilic and its surface energy, γT, increases by a factor of 2 to a reproducible value of 66 ± 1 mJ/m². Using HR-IBA, oxygen coverage on GaAs is found to decrease from 7.2 ± 0.5 monolayers (ML) to 3.6 ± 0.5 ML. The Ga:As ratio of 1.17, measured by HR-IBA, remains constant after etching.
XPS is used to measure oxidation of Ga and As, as well as surface stoichiometry, on two locations of several GaAs(100) wafers before and after etching. The relative proportions of Ga and As are unaffected by adventitious carbon contamination. The Ga:As ratio of 1.16, measured by XPS, matches the HR-IBA analysis. The proportions of oxidized Ga and As do not change significantly after etching. However, the initial ratio of As2O5 to As2O3, within the oxidized As, significantly decreases after etching, from approximately 3:1 to 3:2.
Absolute oxygen coverage, as a function of surface processing, is determined within 0.5 ML by HR-IBA. XPS offers insight into these modifications by detecting electronic states and phase composition changes of GaAs oxides. The changes in surface chemistry are correlated to changes in hydro-affinity and surface energies measured by 3LCAA.
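For context on how three contact-angle measurements yield a total surface energy: 3LCAA-type analyses are commonly interpreted with the van Oss–Chaudhury–Good model. The equations below give the standard form of that model; treating it as the analysis used here is an assumption, since the abstract does not spell out the method.

```latex
% van Oss-Chaudhury-Good relation for one probe liquid L on solid S:
\gamma_L \,(1 + \cos\theta) =
  2\left(\sqrt{\gamma_S^{LW}\gamma_L^{LW}}
       + \sqrt{\gamma_S^{+}\gamma_L^{-}}
       + \sqrt{\gamma_S^{-}\gamma_L^{+}}\right)

% Measuring theta with three liquids of known components
% (gamma_L^{LW}, gamma_L^{+}, gamma_L^{-}) gives three equations in the
% three solid unknowns; the total surface energy is then
\gamma_S^{T} = \gamma_S^{LW} + 2\sqrt{\gamma_S^{+}\gamma_S^{-}}
```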
Project management expertise is employed across many professional sectors, including clinical research organizations, to ensure that efforts undertaken by the organization are completed on time and according to specifications and are capable of achieving the needed impact. Increasingly, project leaders (PLs) who possess this expertise are being employed in academic settings to support clinical and preclinical translational research team science. Duke University’s clinical and translational science enterprise has been an early adopter of project management to support clinical and preclinical programs. We review the history and evolution of project management and the PL role at Duke, examine case studies that illustrate their growing value to our academic research environment, and address challenges and solutions to employing project management in academia. Furthermore, we describe the critical role project leadership plays in accelerating and increasing the success of translational team science and team approaches frequently required for systems biology and “big data” scientific studies. Finally, we discuss perspectives from Duke project leadership professionals regarding the training needs and requirements for PLs working in academic clinical and translational science research settings.
Angiostrongylus cantonensis is a pathogenic nematode and the cause of neuroangiostrongyliasis, an eosinophilic meningitis more commonly known as rat lungworm disease. Transmission is thought to be primarily due to ingestion of infective third stage larvae (L3) in gastropods, on produce, or in contaminated water. The gold standard to determine the effects of physical and chemical treatments on the infectivity of A. cantonensis L3 larvae is to infect rodents with treated L3 larvae and monitor for infection, but animal studies are laborious and expensive and also raise ethical concerns. This study demonstrates propidium iodide (PI) to be a reliable marker of parasite death and loss of infective potential without adversely affecting the development and future reproduction of live A. cantonensis larvae. PI staining allows evaluation of the efficacy of test substances in vitro, an improvement upon the use of lack of motility as an indicator of death. Some potential applications of this assay include determining the effectiveness of various anthelmintics, vegetable washes, electromagnetic radiation and other treatments intended to kill larvae in the prevention and treatment of neuroangiostrongyliasis.
Replicate radiocarbon (14C) measurements of organic and inorganic control samples, with known Fraction Modern values in the range Fm = 0–1.5 and mass range 6 μg–2 mg carbon, are used to determine both the mass and radiocarbon content of the blank carbon introduced during sample processing and measurement in our laboratory. These data are used to model, separately for organic and inorganic samples, the blank contribution and subsequently “blank correct” measured unknowns in the mass range 25–100 μg. Data, formulas, and an assessment of the precision and accuracy of the blank correction are presented.
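The blank model referred to above is, in its simplest form, a two-component isotope mass balance: the measured sample is treated as a mixture of true sample carbon and a constant process blank. The sketch below is illustrative only; the function name and the default blank mass and blank Fraction Modern are our assumptions, not the paper's measured values.

```python
def blank_correct(fm_meas, m_sample_ug, fm_blank=0.4, m_blank_ug=1.0):
    """Blank-correct a measured Fraction Modern (Fm).

    Treats the measurement as a mixture of true sample carbon
    (mass m_sample_ug, the quantity of interest) and a constant
    process blank (mass m_blank_ug with Fraction Modern fm_blank):

        fm_meas * (m_s + m_b) = fm_true * m_s + fm_blank * m_b
    """
    m_total = m_sample_ug + m_blank_ug
    return (fm_meas * m_total - fm_blank * m_blank_ug) / m_sample_ug

# A 50 ug sample measured at Fm = 0.20, with the assumed 1 ug, Fm = 0.4
# blank: the correction is small but non-negligible at this mass.
fm_true = blank_correct(0.20, 50.0)  # = 0.196 under these assumptions
```

Note how the correction grows as sample mass approaches blank mass, which is why the paper restricts blank-corrected unknowns to a bounded mass range.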
Clinical Enterobacteriaceae isolates with a colistin minimum inhibitory concentration (MIC) ≥4 mg/L from a United States hospital were screened for the mcr-1 gene using real-time polymerase chain reaction (RT-PCR) and confirmed by whole-genome sequencing. Four colistin-resistant Escherichia coli isolates contained mcr-1. Two isolates belonged to the same sequence type (ST-632). All subjects had prior international travel and antimicrobial exposure.
Few studies have used genomic epidemiology to understand tuberculosis (TB) transmission in rural and remote settings – regions often unique in history, geography and demographics. To improve our understanding of TB transmission dynamics in Yukon Territory (YT), a circumpolar Canadian territory, we conducted a retrospective analysis in which we combined epidemiological data collected through routine contact investigations with clinical and laboratory results. Mycobacterium tuberculosis isolates from all culture-confirmed TB cases in YT (2005–2014) were genotyped using 24-locus Mycobacterial Interspersed Repetitive Units-Variable Number of Tandem Repeats (MIRU-VNTR) and compared to each other and to those from the neighbouring province of British Columbia (BC). Whole genome sequencing (WGS) of genotypically clustered isolates revealed three sustained transmission networks within YT, two of which also involved BC isolates. While each network had distinct characteristics, all had at least one individual acting as the probable source of three or more culture-positive cases. Overall, WGS revealed that TB transmission dynamics in YT are distinct from patterns of spread in other, more remote Northern Canadian regions, and that the combination of WGS and epidemiological data can provide actionable information to local public health teams.
All Fire and Emergency Services (FES) personnel must balance FES work with their other responsibilities. Given that women tend to take on a greater responsibility for management of household/domestic activities than men, the on-call component of their FES work may be associated with very different challenges. Despite this, women have rarely been the focus of on-call research.
To explore women’s on-call experiences in the FES by examining coping styles and strategies, with the goal of helping to innovate the way women are supported in FES roles.
Relevant findings from two studies are included. The first study involved FES personnel from two agencies in Australia (n=24) who participated in a semi-structured interview. The second study was an anonymous online survey to determine work characteristics, sleep, stress, and coping in on-call workers more broadly, with workers from all industries across Australia (n=228) invited to participate.
Interview data identified two major themes in coping with on-call work: support (from family, social networks, and work) and planning/preparation, both of which helped women cope with the unpredictability of on-call work. Results from the survey (43% women) showed that on-call workers were an engaged group in terms of their coping, with 67% classified as having a positive coping style and 58% of women indicating that they agreed/strongly agreed with the statement, “I cope well with on-call work.”
Taken together, these data highlight engagement with positive coping by women who do on-call work, including in the FES. Importantly, positive coping strategies, such as talking about emotions, problem-solving, and seeking support have been linked to increased shift work tolerance in other populations. Coping style and strategies represent modifiable variables which could be specifically applied to assist women to manage the unique challenges associated with on-call work in the FES.
Recent commercialization of auxin herbicide–based weed control systems has led to increased off-target exposure of susceptible cotton cultivars to auxin herbicides. Off-target deposition of dilute concentrations of auxin herbicides can occur on cotton at any stage of growth. Field experiments were conducted at two locations in Mississippi from 2014 to 2016 to assess the response of cotton at various growth stages after exposure to a sublethal 2,4-D concentration of 8.3 g ae ha−1. Herbicide applications occurred weekly from 0 to 14 weeks after emergence (WAE). Cotton exposure to 2,4-D at 2 to 9 WAE resulted in up to 64% visible injury, whereas 2,4-D exposure 5 to 6 WAE resulted in machine-harvested yield reductions of 18% to 21%. Cotton maturity was delayed after exposure 2 to 10 WAE, and height was increased from exposure 6 to 9 WAE due to decreased fruit set after exposure. Total hand-harvested yield was reduced from 2,4-D exposure 3, 5 to 8, and 13 WAE. Growth stage at time of exposure influenced the distribution of yield by node and position. Yield on lower and inner fruiting sites generally decreased from exposure, and yield partitioned to vegetative or aborted positions and upper fruiting sites increased. Reductions in gin turnout, micronaire, fiber length, fiber-length uniformity, and fiber elongation were observed after exposure at certain growth stages, but the overall effects on fiber properties were small. These results indicate that cotton is most sensitive to low concentrations of 2,4-D during late vegetative and squaring growth stages.
Objectives: Studies of neurocognitively elite older adults, termed SuperAgers, have identified clinical predictors and neurobiological indicators of resilience against age-related neurocognitive decline. Despite rising rates of older persons living with HIV (PLWH), SuperAging (SA) in PLWH remains undefined. We aimed to establish neuropsychological criteria for SA in PLWH and examined clinically relevant correlates of SA. Methods: 734 PLWH and 123 HIV-uninfected participants between 50 and 64 years of age underwent neuropsychological and neuromedical evaluations. SA was defined as demographically corrected (i.e., sex, race/ethnicity, education) global neurocognitive performance within normal range for 25-year-olds. Remaining participants were labeled cognitively normal (CN) or impaired (CI) based on actual age. Chi-square and analysis of variance tests examined HIV group differences on neurocognitive status and demographics. Within PLWH, neurocognitive status differences were tested on HIV disease characteristics, medical comorbidities, and everyday functioning. Multinomial logistic regression explored independent predictors of neurocognitive status. Results: Neurocognitive status rates and demographic characteristics differed between PLWH (SA=17%; CN=38%; CI=45%) and HIV-uninfected participants (SA=35%; CN=55%; CI=11%). In PLWH, neurocognitive groups were comparable on demographic and HIV disease characteristics. Younger age, higher verbal IQ, absence of diabetes, fewer depressive symptoms, and lifetime cannabis use disorder increased the likelihood of SA. SA participants reported greater independence in everyday functioning, higher rates of employment, and better health-related quality of life than non-SA participants. Conclusions: Despite the combined neurological risk of aging and HIV, youthful neurocognitive performance is possible for older PLWH.
SA relates to improved real-world functioning and may be better explained by cognitive reserve and maintenance of cardiometabolic and mental health than HIV disease severity. Future research investigating biomarker and lifestyle (e.g., physical activity) correlates of SA may help identify modifiable neuroprotective factors against HIV-related neurobiological aging. (JINS, 2019, 25, 507–519)
Medical procedures and patient care activities may facilitate environmental dissemination of healthcare-associated pathogens such as methicillin-resistant Staphylococcus aureus (MRSA).
We conducted an observational cohort study of MRSA-colonized patients to determine the frequency of, and risk factors for, environmental shedding of MRSA during procedures and care activities in carriers with positive nares and/or wound cultures. Bivariate analyses were performed to identify factors associated with environmental shedding.
The study was conducted at a Veterans Affairs hospital.
This study included 75 patients in contact precautions for MRSA colonization or infection.
Of 75 patients in contact precautions for MRSA, 55 (73%) had MRSA in nares and/or wounds and 25 (33%) had positive skin cultures. For the 52 patients with MRSA in nares and/or wounds and at least 1 observed procedure, environmental shedding of MRSA occurred more frequently during procedures and care activities than in the absence of a procedure (59 of 138, 43% vs 8 of 83, 10%; P < .001). During procedures, increased shedding occurred ≤0.9 m versus >0.9 m from the patient (52 of 138, 38% vs 25 of 138, 18%; P = .0004). Contamination occurred frequently on surfaces touched by personnel (12 of 38, 32%) and on portable equipment used for procedures (25 of 101, 25%). By bivariate analysis, the presence of a wound with MRSA was associated with shedding (17 of 29, 59% versus 6 of 23, 26%; P = .04).
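The abstract does not name the test behind the bivariate wound association (P = .04), but a Fisher's exact test on the reported counts (shedding in 17 of 29 patients with an MRSA-positive wound vs 6 of 23 without) gives a comparable result. The stdlib-only sketch below is illustrative; the function name is ours.

```python
from math import comb

def fisher_exact_two_sided(a, b, c, d):
    """Two-sided Fisher's exact p-value for a 2x2 table [[a, b], [c, d]].

    Sums hypergeometric probabilities of all tables with the same
    margins that are no more probable than the observed table.
    """
    n = a + b + c + d
    row1, col1 = a + b, a + c

    def p_table(x):
        # P(X = x) for X ~ Hypergeometric(n, col1, row1)
        return comb(col1, x) * comb(n - col1, row1 - x) / comb(n, row1)

    p_obs = p_table(a)
    lo, hi = max(0, row1 + col1 - n), min(row1, col1)
    return sum(p for p in (p_table(x) for x in range(lo, hi + 1))
               if p <= p_obs * (1 + 1e-9))

# Counts from the abstract: wound-positive 17/29 shed, wound-negative 6/23.
p = fisher_exact_two_sided(17, 29 - 17, 6, 23 - 6)   # below 0.05
odds_ratio = (17 * 17) / (12 * 6)                    # sample OR, ~4.0
```

The exact p differs slightly depending on the test chosen (chi-square, Yates-corrected chi-square, or Fisher), but all agree the association is significant at the .05 level.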
Environmental shedding of MRSA occurs frequently during medical procedures and patient care activities. There is a need for effective strategies to disinfect surfaces and equipment after procedures.
Adverse pregnancy outcomes, including prematurity and low birth weight (LBW), have been associated with life-long chronic disease risk for the infant. Stress during pregnancy increases the risk of adverse pregnancy outcomes. Many studies have reported the incidence of adverse pregnancy outcomes in Indigenous populations, and a smaller number have measured rates of stress and depression in these populations. This study sought to examine the potential association between stress during pregnancy and the rate of adverse pregnancy outcomes in Australian Indigenous women residing in rural and remote communities in New South Wales. This study found higher rates of post-traumatic stress disorder, depression, and anxiety symptoms during pregnancy than in the general population. There was also a higher incidence of prematurity and LBW deliveries. Unfortunately, missing post-traumatic stress disorder and depressive symptomatology data impeded the examination of the associations of interest. This was largely due to the highly sensitive nature of the issues under investigation and the need to establish adequate levels of trust between Indigenous women and research staff before sensitive research data could be disclosed and recorded. We were unable to demonstrate a significant association between the level of stress and the incidence of adverse pregnancy outcomes at this stage. We recommend this longitudinal study continue until complete data sets are available. Future research in this area should prioritize building trust with participants and should overestimate sample size so that no undue pressure is placed on already stressed participants.
The introduction of auxin herbicide weed control systems has led to increased occurrence of crop injury in susceptible soybeans and cotton. Off-target exposure to sublethal concentrations of dicamba can occur at varying growth stages, which may affect crop response. Field experiments were conducted in Mississippi in 2014, 2015, and 2016 to characterize cotton response to a sublethal concentration of dicamba equivalent to 1/16X the labeled rate. Weekly applications of dicamba at 35 g ae ha−1 were made to separate sets of replicated plots immediately following planting until 14 wk after emergence (WAE). Exposure to dicamba from 1 to 9 WAE resulted in up to 32% visible injury, and exposure from 7 to 10 WAE delayed crop maturity. Exposure from 8 to 10 and 13 WAE led to increased cotton height, while an 18% reduction in machine-harvested yield resulted from exposure at 6 WAE. Cotton exposure at 3 to 9 WAE reduced the seed cotton weight partitioned to position 1 fruiting sites, while exposure at 3 to 6 WAE also reduced yield in position 2 fruiting sites. Exposure at 2, 3, and 5 to 7 WAE increased the percent of yield partitioned to vegetative branches. An increase in percent of yield partitioned to plants with aborted terminals occurred following exposure from 3 to 7 WAE and corresponded with reciprocal decreases in yield partitioned to positional fruiting sites. Minimal effects were observed on fiber quality, except for decreases in fiber length uniformity resulting from exposure at 9 and 10 WAE.
The Canadian Stroke Best Practice Recommendations suggest that patients suspected of transient ischemic attack (TIA)/minor stroke receive urgent brain imaging, preferably computed tomography angiography (CTA). Yet high requisition rates for non-cerebrovascular patients overburden limited radiological resources, putting patients at risk. We hypothesize that our clinical decision support tool (CDST), developed for risk stratification of TIA in the emergency department (ED) and incorporating Canadian guidelines, could improve CTA utilization.
This retrospective study used clinical information gathered from ED patient referrals to an outpatient TIA unit in Victoria, BC, from 2015 to 2016. Actual CTA orders by ED and TIA unit staff were compared with the hypothetical CTA orders that would have been generated had our CDST been used in the ED upon patient arrival.
For 1,679 referrals, clinicians ordered 954 CTAs. Our CDST would have ordered a total of 977 CTAs for these patients. Overall, this would have increased the number of imaged-TIA patients by 89 (10.1%) while imaging 98 (16.1%) fewer non-cerebrovascular patients over the 2-year period. Our CDST would have ordered CTA for 18 (78.3%) of the recurrent stroke patients in the sample.
Our CDST could enhance CTA utilization in the ED for suspected TIA patients, and facilitate guideline-based stroke care. Use of our CDST would increase the number of TIA patients receiving CTA before ED discharge (rather than later at TIA units) and reduce the burden of imaging stroke mimics in radiological departments.
What constitutes a ‘good place to grow old’? This study aimed to characterise salient features of built and social environments that are essential to support low-income ageing residents. Seated and mobile interviews were conducted with community-dwelling older participants (aged 55–92, mean = 71 years) in three distinct socio-economic and geographic samples of the Minneapolis (Minnesota, United States of America) metropolitan area. The interviews prompted participants to evaluate their homes and neighbourhoods, and probed for particular socio-spatial characteristics that impact residential wellbeing. Qualitative thematic analyses focused on 38 individuals living in subsidised housing and homeless shelters. Four interrelated themes encompassed essential residential qualities: (a) safety and comfort, (b) service access, (c) social connection, and (d) stimulation. These broad ideals, when achieved, enabled participants to cultivate residential wellbeing and fulfilling place attachment. Analyses of the empirical data complicate theoretical assumptions by recognising unequal access to, irregular opportunities for and potential dangers of place attachment. Rich descriptions of participant homelessness, health hazards, crime, lack of supportive infrastructure and social isolation illustrate how place attachment is not inherently positive or necessarily attainable; rather, it is problematic and can involve risk. This article extends geographical gerontology's engagement with socio-spatial inequalities by focusing on disadvantaged ageing individuals.
Objectives: Anecdotal reports suggest that following traumatic brain injury (TBI) retrograde memories are initially impaired and recover in order of remoteness. However, there has been limited empirical research investigating whether a negative gradient in retrograde amnesia—relative preservation of remote over recent memory—exists during post-traumatic amnesia (PTA) compared with the acute phase post-emergence. This study used a repeated-measures design to examine the pattern of personal semantic (PS) memory performance during PTA and within two weeks of emergence to improve understanding of the nature of the memory deficit during PTA and its relationship with recovery. Methods: Twenty patients with moderate-severe TBI and 20 healthy controls (HCs) were administered the Personal Semantic Schedule of the Autobiographical Memory Interview. The TBI group was assessed once during PTA and post-emergence. Analysis of variance was used to compare the gradient across lifetime periods during PTA relative to post-emergence, and between groups. Results: PS memory was significantly lower during PTA than post-emergence from PTA, with no relative preservation of remote memories. The TBI group was still impaired relative to HCs following emergence from PTA. Lower overall PS memory scores during PTA were associated with increased days to emerge from PTA post-interview. Conclusions: These results suggest a global impairment in PS memory across lifetime periods particularly during PTA, but still present within 2 weeks of emergence from PTA. PS memory performance may be sensitive to the diffuse nature of TBI and may, therefore, function as a clinically valuable indicator of the likely time to emerge from PTA. (JINS, 2018, 24, 1064–1072)
Individuals with schizophrenia have deficits in social cognition that are associated with poor functional outcome. Unfortunately, current treatments result in only modest improvement in social cognition. Oxytocin, a neuropeptide with pro-social effects, has significant benefits for social cognition in the general population. However, studies examining the efficacy of oxytocin in schizophrenia have yielded inconsistent results. One reason for inconsistency may be that oxytocin has typically not been combined with psychosocial interventions. It may be necessary for individuals with schizophrenia to receive concurrent psychosocial treatment while taking oxytocin to have the context needed to make gains in social cognitive skills.
The current study tested this hypothesis in a 24-week (48-session) double-blind, placebo-controlled trial that combined oxytocin and Cognitive-Behavioral Social Skills Training (CBSST), which included elements from Social Cognition and Interaction Training (SCIT). Participants included 62 outpatients diagnosed with schizophrenia (placebo n = 31; oxytocin n = 31) who received 36 IU of oxytocin or placebo twice daily (BID), with supervised administration 45 min prior to sessions on CBSST group therapy days. Participants completed a battery of measures assessing social cognition, administered at 0, 12, and 24 weeks.
CBSST generally failed to enhance social cognition from baseline to end of study, and there was no additive benefit of oxytocin beyond the effects of CBSST alone.
Findings suggest that combined CBSST and oxytocin had minimal benefit for social cognition, adding to the growing literature indicating null effects of oxytocin in multi-dose trials. Methodological and biological factors may contribute to inconsistent results across studies.