Racial disparities in colorectal cancer (CRC) can be addressed through increased adherence to screening guidelines. In real-life encounters, patients may be more willing to follow screening recommendations delivered by a race-concordant clinician. The growth of telehealth to deliver care provides an opportunity to explore whether these effects translate to a virtual setting. The primary purpose of this pilot study is to explore the relationships between virtual clinician (VC) characteristics and CRC screening intentions after engagement with a telehealth intervention leveraging technology to deliver tailored CRC prevention messaging.
Using a posttest-only design with three factors (VC race-matching, VC gender, intervention type), participants (N = 2267) were randomised to one of eight intervention treatments. Participants self-reported perceptions and behavioral intentions.
The benefits of matching participants with a racially similar VC trended positive but did not reach statistical significance. Specifically, race-matching positively influenced screening intentions for Black participants but not for White participants (b = 0.29, p = 0.10). Importantly, perceptions of credibility, attractiveness, and message relevance significantly influenced screening intentions and the relationship with race-matching.
To reduce racial CRC screening disparities, investments are needed to identify patient-focused interventions to address structural barriers to screening. This study suggests that telehealth interventions that match Black patients with a Black VC can enhance perceptions of credibility and message relevance, which may then improve screening intentions. Future research is needed to examine how to increase VC credibility and attractiveness, as well as message relevance without race-matching.
Using an ensemble of close- and long-range remote sensing, lake bathymetry and regional meteorological data, we present a detailed assessment of the geometric changes of El Morado Glacier in the Central Andes of Chile and its adjacent proglacial lake between 1932 and 2019. Overall, the results revealed a period of marked glacier downwasting, with a mean geodetic glacier mass balance of −0.39 ± 0.15 m w.e. a−1 observed for the entire glacier between 1955 and 2015 with an area loss of 40% between 1955 and 2019. We estimate an ice elevation change of −1.00 ± 0.17 m a−1 for the glacier tongue between 1932 and 2019. The increase in the ice thinning rates and area loss during the last decade is coincident with the severe drought in this region (2010–present), which our minimal surface mass-balance model is able to reproduce. As a result of the glacier changes observed, the proglacial lake increased in area substantially between 1955 and 2019, with bathymetry data suggesting a water volume of 3.6 million m3 in 2017. This study highlights the need for further monitoring of glacierised areas in the Central Andes. Such efforts would facilitate a better understanding of the downstream impacts of glacier downwasting.
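The geodetic mass balance quoted above comes from differencing elevation surveys and converting the ice-thickness change to water equivalent via an assumed density. A minimal sketch of that conversion, assuming a commonly used conversion density of 850 kg m−3 and example numbers that are illustrative, not values from this study:

```python
# Sketch: converting a geodetic ice-elevation change to a mass balance in
# metres water equivalent per year (m w.e. a^-1), as in DEM-differencing
# studies. The 850 kg m^-3 conversion density is an assumed convention,
# not stated in this abstract.

RHO_ICE_OVER_WATER = 850.0 / 1000.0  # assumed ice-to-water density ratio

def elevation_change_to_mwe(dh_total_m, years):
    """Mean annual geodetic mass balance (m w.e. a^-1) from a DEM difference."""
    dh_rate = dh_total_m / years        # mean elevation change rate, m a^-1
    return dh_rate * RHO_ICE_OVER_WATER

# e.g. a hypothetical 60-year DEM difference of -24 m of ice:
print(elevation_change_to_mwe(-24.0, 60))
```

The choice of conversion density is itself a source of uncertainty, which is why geodetic balances are usually reported with error bounds like the ±0.15 m w.e. a−1 above.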
Soldier operational performance is determined by their fitness, nutritional status, quality of rest/recovery, and remaining injury/illness free. Understanding large fluctuations in nutritional status during operations is critical to safeguarding health and well-being. There are limited data worldwide describing the effect of extreme climate change on nutrient profiles. This study investigated the effect of hot-dry deployments on vitamin D status (assessed from 25-hydroxyvitamin D (25(OH)D) concentration) of young, male, military volunteers. Two data sets are presented (pilot study, n 37; main study, n 98), examining serum 25(OH)D concentrations before and during 6-month summer operational deployments to Afghanistan (March to October/November). Body mass, percentage of body fat, dietary intake and serum 25(OH)D concentrations were measured. In addition, parathyroid hormone (PTH), adjusted Ca and albumin concentrations were measured in the main study to better understand 25(OH)D fluctuations. Body mass and fat mass (FM) losses were greater for early (pre- to mid-) deployment compared with late (mid- to post-) deployment (P<0·05). Dietary intake was well-maintained despite high rates of energy expenditure. A pronounced increase in 25(OH)D was observed between pre- (March) and mid-deployment (June) (pilot study: 51 (sd 20) v. 212 (sd 85) nmol/l, P<0·05; main study: 55 (sd 22) v. 167 (sd 71) nmol/l, P<0·05) and remained elevated post-deployment (October/November). In contrast, PTH was highest pre-deployment, decreasing thereafter (main study: 4·45 (sd 2·20) v. 3·79 (sd 1·50) pmol/l, P<0·05). The typical seasonal cycling of vitamin D appeared exaggerated in this active male population undertaking an arduous summer deployment. Further research is warranted, where such large seasonal vitamin D fluctuations may be detrimental to bone health in the longer-term.
The Zika virus was largely unknown to many health care systems before the outbreak of 2015. The unique public health threat posed by the Zika virus and the evolving understanding of its pathology required continuous communication between a health care delivery system and a local public health department. By leveraging an existing relationship, NYC Health+Hospitals worked closely with New York City Department of Health and Mental Hygiene to ensure that Zika-related processes and procedures within NYC Health+Hospitals facilities aligned with the most current Zika virus guidance. Support given by the public health department included prenatal clinical and laboratory support and the sharing of data on NYC Health+Hospitals Zika virus screening and testing rates, thus enabling this health care delivery system to make informed decisions and practices. The close coordination, collaboration, and communication between the health care delivery system and the local public health department examined in this article demonstrate the importance of working together to combat a complex public health emergency and how this relationship can serve as a guide for other jurisdictions to optimize collaboration between external partners during major outbreaks, emerging threats, and disasters that affect public health. (Disaster Med Public Health Preparedness. 2018;12:689-691)
Control of cucurbit pests, such as striped cucumber beetle (Acalymma vittatum), spotted cucumber beetle (Diabrotica undecimpunctata howardi) and squash bug (Anasa tristis), in organic systems is difficult due to a lack of effective insecticide options. This has led to the development of many integrated pest management techniques, such as use of row covers, crop rotation and cover crops. This study explored the novel use of strip tillage and row covers to reduce pest pressure in summer squash (Cucurbita pepo) and muskmelon (Cucumis melo) production systems. Results showed that although strip tillage reduced striped cucumber beetle and squash bug numbers, there was a yield reduction in both crops compared with the plasticulture system. Row cover increased marketable yield in both systems, with the highest yield being in the plasticulture system. Unmarketable fruit directly attributed to insect damage was numerically higher in the plasticulture system but not significantly different from the strip tillage system. Although there are many documented positive attributes of strip tillage, results from this study indicate that a combination of plasticulture and row cover may be a superior system for organic cucurbit production.
The British Library and open bibliographic metadata
The British Library is the national library of the UK. Among its core responsibilities set out in the British Library Act 1972 is that of disseminating metadata describing its rich collections and UK publishing output since 1950 via the British National Bibliography (www.bl.uk/bibliographic/natbib.html). This requirement resulted in the Library offering bibliographic metadata services from its foundation. These services were originally operated commercially and were aimed primarily at the library community. However, in 2010 the British Library began to develop an open metadata strategy in response to calls from the UK government, such as Putting the Frontline First, which encouraged increased access to public sector data in order to promote transparency, economic growth and research. At the same time there was growing interest in the potential of linked data for improving reach to new users and exploiting new information sources. Such opportunities were felt compelling enough to warrant action despite the significant technical and licensing issues that needed to be addressed.
The new open metadata strategy aimed to remove constraints imposed by restrictive licensing and domain-specific library standards (e.g. MARC21) and to develop new modes of access in collaboration with the communities using the metadata. It was believed that proactively enabling the reuse of metadata could increase its community value and improve access to information and culture, while reinforcing the relevance of libraries. However, in order to justify and sustain the initiative in a period of diminishing funding it was important to try to achieve institutional recognition via any licence model selected to support reuse. In addition, a number of risks required active management, notably:
• legal risks, for example complex copyright and licensing frameworks that require proactive management of derived metadata licensing to protect against possible liabilities
• reputational risks, for example possible perception that the British Library is not satisfying government and community expectations due to variant definitions of ‘open’ data.
Rather than create a targeted service that satisfied only one audience segment or a generic offering that risked satisfying none, a multi-track approach was adopted to address the requirements of three core user groups: researchers, linked data users and developers, and libraries.
Access, standards and licence options were then tailored to the specific needs of these groups.
This study investigated organisational factors impacting disability support worker (DSW) psychosocial wellbeing and work safety to understand the relationship between wellbeing, using measures of burnout and job satisfaction, and work conditions and safety performance. This study also investigated factors predicting wellbeing using the Job Demand-Control-Support (JDCS) model. A sample of 87 DSWs completed normed measures of burnout, work conditions, and safety climate. Results showed DSWs experienced significantly higher personal and work-related burnout than normative samples, but significantly lower client-related burnout. Although the JDCS model components did not all predict any single wellbeing measure, they each predicted aspects of burnout and job satisfaction, with these wellbeing measures associated with safety performance. Findings highlighted the importance of monitoring worker job demands, support availability, and job control to improve safety performance. Compared to normative data, DSWs were also experiencing significantly higher role conflict, whose negative impact on personal and work-related burnout and job satisfaction was effectively moderated by support. Findings suggest the need to consider DSW work conditions, particularly work practices contributing to role conflict, as well as increasing support for DSWs to prevent the development of personal and work-related burnout. Findings also suggest further research on client-related burnout is required.
In hypoplastic left heart syndrome, thrombosis of the native ascending aorta is rare and often fatal; there are no previously reported cases presenting with acute heart block. We review a case of native ascending aorta thrombosis in a 2-year-old boy with hypoplastic left heart syndrome, presenting with acute heart block. This case highlights the benefit of multi-modality imaging in complex cases.
In 1976, David Sugden and Brian John developed a classification for Antarctic landscapes of glacial erosion based upon exposed and eroded coastal topography, providing insight into the past glacial dynamics of the Antarctic ice sheets. We extend this classification to cover the continental interior of Antarctica by analysing the hypsometry of the subglacial landscape using a recently released dataset of bed topography (BEDMAP2). We used the existing classification as a basis for first developing a low-resolution description of landscape evolution under the ice sheet before building a more detailed classification of patterns of glacial erosion. Our key finding is that a more widespread distribution of ancient, preserved alpine landscapes may survive beneath the Antarctic ice sheets than has been previously recognized. Furthermore, the findings suggest that landscapes of selective erosion exist further inland than might be expected, and may reflect the presence of thinner, less extensive ice in the past. Much of the selective nature of erosion may be controlled by pre-glacial topography, and especially by the large-scale tectonic structure and fluvial valley network. The hypotheses of landscape evolution presented here can be tested by future surveys of the Antarctic ice sheet bed.
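The hypsometric analysis the authors apply to BEDMAP2 reduces a bed-topography grid to an area–elevation distribution. A minimal sketch of that idea, assuming equal-area grid cells; the binning scheme and sample elevations are illustrative assumptions, not BEDMAP2 data:

```python
# Sketch: a minimal hypsometric analysis of the kind used to characterise
# subglacial landscapes from gridded bed topography (e.g. BEDMAP2).
# Elevation samples below are made up for illustration.

def hypsometry(elevations, n_bins=5):
    """Return (bin_lower_edge, fraction_of_area) pairs for a list of
    bed-elevation samples, assuming each sample covers equal area."""
    lo, hi = min(elevations), max(elevations)
    width = (hi - lo) / n_bins
    counts = [0] * n_bins
    for z in elevations:
        i = min(int((z - lo) / width), n_bins - 1)  # clamp the top edge
        counts[i] += 1
    total = len(elevations)
    return [(lo + i * width, c / total) for i, c in enumerate(counts)]

bed = [-500, -120, 40, 300, 310, 320, 900, 1500, 1480, 1510]
for edge, frac in hypsometry(bed):
    print(f"{edge:8.1f} m: {frac:.0%}")
```

The shape of such a distribution is what distinguishes landscape types: for example, area concentrated at high elevations can indicate preserved upland surfaces, while area concentrated in deep troughs suggests selective linear erosion.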
Understanding the nutritional demands on serving military personnel is critical to inform training schedules and dietary provision. Troops deployed to Afghanistan face austere living and working environments. Observations from the military and those reported in the British and US media indicated possible physical degradation of personnel deployed to Afghanistan. Therefore, the present study aimed to investigate the changes in body composition and nutritional status of military personnel deployed to Afghanistan and how these were related to physical fitness. In a cohort of British Royal Marines (n 249) deployed to Afghanistan for 6 months, body size and body composition were estimated from body mass, height, girth and skinfold measurements. Energy intake (EI) was estimated from food diaries and energy expenditure measured using the doubly labelled water method in a representative subgroup. Strength and aerobic fitness were assessed. The mean body mass of volunteers decreased over the first half of the deployment (−4·6 (sd 3·7) %), predominantly reflecting fat loss. Body mass partially recovered (mean +2·2 (sd 2·9) %) between the mid- and post-deployment periods (P < 0·05). Daily EI (mean 10 590 (sd 3339) kJ) was significantly lower than the estimated daily energy expenditure (mean 15 167 (sd 1883) kJ) measured in a subgroup of volunteers. However, despite the body mass loss, aerobic fitness and strength were well maintained. Nutritional provision for British military personnel in Afghanistan appeared sufficient to maintain physical capability and micronutrient status, but providing appropriate nutrition in harsh operational environments must remain a priority.
The success of central line-associated bloodstream infection (CLABSI) prevention programs in intensive care units (ICUs) has led to the expansion of surveillance at many hospitals. We sought to compare non-ICU CLABSI (nCLABSI) rates with national reports and describe methods of surveillance at several participating US institutions.
Design. An electronic survey of several medical centers about infection surveillance practices and rate data for non-ICU patients.
Setting. Ten tertiary care hospitals.
In March 2011, a survey was sent to 10 medical centers. The survey consisted of 12 questions regarding demographics and CLABSI surveillance methodology for non-ICU patients at each center. Participants were also asked to provide available rate and device utilization data.
Hospitals ranged in size from 238 to 1,400 total beds (median, 815). All hospitals reported using Centers for Disease Control and Prevention (CDC) definitions. Denominators were collected by different means: counting patients with central lines every day (5 hospitals), indirectly estimating on the basis of electronic orders (n = 4), or another automated method (n = 1). Rates of nCLABSI ranged from 0.2 to 4.2 infections per 1,000 catheter-days (median, 2.5). The national rate reported by the CDC using 2009 data from the National Healthcare Surveillance Network was 1.14 infections per 1,000 catheter-days.
Only 2 hospitals were below the pooled CLABSI rate for inpatient wards; all others exceeded this rate. Possible explanations include differences in average central line utilization or hospital size, in the impact of certain clinical risk factors notably absent from the definition, and in interpretation and reporting practices. Further investigation is necessary to determine whether the national benchmarks are low or whether the hospitals surveyed here represent a selection of outliers.
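The rates compared above follow the standard convention of infections per 1,000 central-line days. A minimal sketch of that calculation; the counts are illustrative, not data from the surveyed hospitals:

```python
# Sketch: the rate calculation implied by the abstract, expressed per
# 1,000 catheter-days. Example counts are hypothetical.

def clabsi_rate(infections, catheter_days):
    """CLABSI rate per 1,000 central-line days."""
    return 1000.0 * infections / catheter_days

# e.g. 12 infections over 4,800 catheter-days:
print(clabsi_rate(12, 4800))  # -> 2.5
```

Because the denominator is line-days rather than patient-days, how a hospital counts its lines (daily manual counts versus electronic estimation, as in the survey) directly shifts the reported rate.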
To describe the difficulties and differing techniques in the transcatheter placement of Amplatzer ventricular septal defect devices to close perimembranous ventricular septal defects, and to place these in the context of the expanding literature on ventricular septal defect catheter closure.
Surgery remains the established first-line therapy for closure of haemodynamically significant perimembranous ventricular septal defects. Transcatheter techniques appeared to promise a possible alternative, obviating the need for cardiac surgery. However, significant technical and anatomical constraints coupled with ongoing reports of a high incidence of heart block have prevented these hopes from being realised to any significant extent. It is likely that there are important methodological reasons for the high complication rates observed. The potential advantages of transcatheter perimembranous ventricular septal defect closure over surgery warrant further exploration of differing transcatheter techniques.
Between August, 2004 and November, 2009, 21 patients had a perimembranous ventricular septal defect closed with transcatheter techniques. Of these, 14 were closed with a muscular Amplatzer ventricular septal defect device. The median age and weight at device placement were 8 years, ranging from 2 to 19 years, and 18.6 kilograms, ranging from 10 to 21 kilograms, respectively.
There were 25 procedures performed on 23 patients using 21 Amplatzer ventricular septal defect devices. Median defect size on angiography was 7.8 millimetres, ranging from 4 to 14.3 millimetres, with a median device size of 8 millimetres, ranging from 4 to 18 millimetres, and a defect/device ratio of 1.1, with a range from 0.85 to 1.33. Median procedure time was 100 minutes, with a range from 38 to 235 minutes. Adverse events included device embolisation following haemolysis in one, and new aortic incompetence in another, but there were no cases of heart block. Median follow-up was 41.7 months, with a range from 2 to 71 months.
Evaluating transcatheter closure of perimembranous ventricular septal defect using Amplatzer ventricular septal defect devices remains important, if a technically feasible method with low and acceptable complication rates is to be identified. Incidence of heart block may be minimised by avoiding oversized devices, using muscular devices, and accepting defeat if an appropriately selected device pulls through. Given the current transcatheter technologies, the closure of perimembranous ventricular septal defects should generally be performed in children when they weigh at least 10 kilograms.
Scimitar syndrome is a rare condition often with a separate systemic arterial supply from the abdominal aorta. Occlusion of this systemic arterial supply is frequently performed, though it can be difficult in small patients or in those with tortuous vessels. This case documents use of the new Amplatzer Vascular Plug IV for arterial occlusion. It has major advantages in being able to deliver the device through a 4F catheter without the need to upsize to a dedicated delivery sheath. This is particularly appealing in paediatric practice and in older patients with difficult anatomy.
Medusahead is one of the most problematic rangeland weeds in the western United States. In previous studies, prescribed burning has been used successfully to control medusahead in some situations, but burning has failed in other circumstances. In this study, trials were conducted using the same protocol at four locations in central to northern California to evaluate plant community response to two consecutive years of summer burning and to determine the conditions resulting in successful medusahead control. From 2002 through 2003, large-scale experiments were established at two low-elevation, warm-winter sites (Fresno and Yolo counties) and two higher elevation, cool-winter sites (Siskiyou and Modoc counties). Plant species cover was estimated using point-intercept transects, and biomass samples were taken in each plot. After 2 yr of burning, medusahead cover was reduced by 99, 96, and 93% for Fresno, Yolo, and Siskiyou counties, respectively, compared to unburned control plots. Other annual grasses were also reduced, but less severely, and broadleaf species increased at all three sites. In contrast, 2 yr of burning resulted in a 55% increase in medusahead at the coolest winter site in Modoc County. In the second season after the final burn, medusahead cover remained low in burned plots at Fresno and Yolo counties (1 and 12% of cover in unburned controls, respectively), but at the Siskiyou site medusahead recovered to 45% relative to untreated controls. The success of prescribed burning was correlated with biomass of annual grasses, excluding medusahead, preceding a burn treatment. It is hypothesized that greater production of combustible forage resulted in increased fire intensity and greater seed mortality in exposed inflorescences.
These results demonstrate that burning can be an effective control strategy for medusahead in low-elevation, warm-winter areas characterized by high annual grass biomass production, but may not be successful in semiarid, cool-winter areas.
The aim was to investigate the effects of a 48 h period of fluid, energy or combined fluid and energy restriction on salivary IgA (s-IgA) responses at rest and after exercise. Thirteen healthy males (age 21 (sem 1) years) participated in four randomised 48 h trials. In the control trial participants received their estimated energy (12 154 (sem 230) kJ/d) and water (3912 (sem 140) ml/d) requirements. On fluid restriction (FR) participants received their energy requirements and 193 (sem 19) ml water/d to drink and on energy restriction (ER) participants received their water requirements and 1214 (sem 25) kJ/d. Fluid and energy restriction (F+ER) was a combination of FR and ER. After 48 h, participants performed a 30 min treadmill time trial (TT) followed by rehydration (0–2 h) and refeeding (2–6 h). Unstimulated saliva was collected at 0, 24 and 48 h, post-TT, and 2 and 6 h post-TT. Saliva flow rate (sflw) and s-IgA (ELISA) remained unchanged in control conditions and on ER. However, 48 h on FR decreased sflw (64 %) which most probably accounted for the increase in s-IgA concentration (P < 0·01). Despite a decrease in sflw (54 %), s-IgA concentration did not increase on F+ER, resulting in a decreased s-IgA secretion rate by 24 h (0 h: 20 (sem 2); 24 h: 12 (sem 2) μg/min; P < 0·01). Post-TT s-IgA secretion rate was not lower compared with 48 h on any trial. s-IgA secretion rate returned to within 0 h values by 6 h post-TT on F+ER. In conclusion, a 24–48 h period of combined F+ER decreased s-IgA secretion rate but normalisation occurred upon refeeding.
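The interplay of flow rate and concentration in this abstract reflects that the s-IgA secretion rate is simply the concentration multiplied by the saliva flow rate, which is why a fall in flow can raise concentration while leaving (or lowering) the secretion rate. A minimal sketch; the values are illustrative, not study data:

```python
# Sketch: the secretion-rate relationship implied by the abstract.
# s-IgA secretion rate (ug/min) = concentration (ug/ml) x flow rate (ml/min).
# Example values are hypothetical.

def secretion_rate(conc_ug_per_ml, flow_ml_per_min):
    """s-IgA secretion rate in ug/min."""
    return conc_ug_per_ml * flow_ml_per_min

# A halved flow rate with unchanged concentration halves the secretion rate,
# so only a compensating rise in concentration (as on FR) keeps it stable:
print(secretion_rate(40.0, 0.5))   # -> 20.0
print(secretion_rate(40.0, 0.25))  # -> 10.0
```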
The ability to controllably position individual phosphorus dopant atoms in silicon surfaces is a critical first step in creating nanoscale electronic devices in silicon, for example a phosphorus-in-silicon quantum computer. While individual P atom placement in Si(001) has been achieved, the ability to routinely position P atoms in Si for large-scale device fabrication requires a more detailed understanding of the physical and chemical processes leading to P atom incorporation. Here we present an atomic-resolution scanning tunneling microscopy study of the interaction of the P precursor molecule phosphine (PH3) with the Si(001) surface. In particular, we present the direct observation of PH3 dissociation and diffusion on Si(001) at room temperature and show that this dissociation is occasionally complete, leaving a P monomer bound to the surface. Such surface bound P monomers are important because they are the most likely entry point for P atoms to incorporate into the substrate surface at elevated temperature.