Telomeres are nucleoprotein complexes that form the ends of eukaryotic chromosomes, where they protect DNA from genomic instability, prevent end-to-end fusion and limit cellular replicative capacity. Increased telomere attrition rates, and relatively shorter telomere length, are associated with genomic instability and have been linked with several chronic diseases, malignancies and reduced longevity. Telomeric DNA is highly susceptible to oxidative damage, and dietary habits may influence telomere attrition rates through the mediation of oxidative stress and chronic inflammation. The aim of this study was to examine the association of leucocyte telomere length (LTL) with both the Dietary Inflammatory Index® 2014 (DII®) and the Alternative Healthy Eating Index 2010 (AHEI-2010). This is a cross-sectional analysis using baseline data from 263 postmenopausal women from the Alberta Physical Activity and Breast Cancer Prevention (ALPHA) Trial, in Calgary and Edmonton, Alberta, Canada. No statistically significant association was detected between LTL z-score and the AHEI-2010 (P = 0·20) or DII® (P = 0·91) in multivariable adjusted models. An exploratory analysis of AHEI-2010 and DII® parameters and LTL revealed that anthocyanidin intake was associated with LTL (P < 0·01); however, this association was non-significant after a Bonferroni correction was applied (P = 0·27). No effect modification by age, smoking history, or recreational physical activity was detected for either relationship. Increased dietary antioxidant and decreased oxidant intake were not associated with LTL in this analysis.
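For context on the correction reported above, a Bonferroni adjustment simply multiplies each raw p-value by the number of comparisons made (capped at 1). The abstract does not state how many parameter-level tests were run, so the count below (m = 27) is an illustrative assumption chosen only because it reconciles a raw P just under 0·01 with the adjusted P = 0·27; a minimal Python sketch:

```python
# Minimal sketch of a Bonferroni adjustment, as applied in the exploratory
# parameter-level analysis above. The exact number of comparisons is not
# stated in the abstract; m = 27 is an illustrative assumption only.
def bonferroni(p_raw, m):
    """Return the Bonferroni-adjusted p-value, capped at 1.0."""
    return min(1.0, p_raw * m)

# A raw p-value just under 0.01 inflates to roughly 0.27 when ~27 tests
# are corrected for, consistent with the anthocyanidin result reported.
print(bonferroni(0.01, 27))  # 0.27
```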
Objective: The objective of this study was to evaluate the impact of directed and sustained attention on the allocation of visuospatial attention. Healthy people often have left lateral and upward vertical spatial attentional biases. However, it is not known whether volitionally allocating spatial attention to a portion of a stimulus increases bias toward the attended portion, whether these alterations of bias are spatially asymmetrical, and how sustained attention influences these biases. Methods: We assessed spatial bias in 36 healthy, right-handed participants using a variant of horizontal and vertical line bisection. Participants were asked to focus on one or the other end of vertical or horizontal lines, or on entire vertical or horizontal lines, and then to bisect the line either immediately or after a 20-second delay. Results: We found a significant main effect of attentional focus and an interaction between attentional focus and prolonged viewing with delayed bisection. Focusing on a particular portion of the line resulted in a significant deviation toward the attended portion, and prolonged viewing of the line prior to bisection significantly enhanced the degree of deviation toward the attended portion. Conclusions: The enhanced bias produced by directed and sustained attention suggests that these may be useful modifications of the line bisection test, particularly in clinical populations. Thus, future studies should determine whether prolonged viewing with delayed bisection and spatially focused attention reveals attentional biases in patients with hemispheric lesions who perform normally on the traditional line bisection test. (JINS, 2019, 25, 65–71)
BACKGROUND: Exercise has been shown to benefit health-related fitness, psychosocial health, and disease outcomes in cancer survivors. PURPOSE: To review the evidence on exercise for individuals diagnosed with neurological cancer (NC); present data on NC participants in the ACE pilot and ongoing implementation study; and propose a framework to incorporate exercise into the care of NC survivors in Alberta. METHODS: The ACE program is open to survivors with any cancer diagnosis at any stage of treatment. Exercise programming consists of two training sessions per week, with the pilot and implementation studies being 8 and 12 weeks in duration, respectively. Outcomes are assessed at study baseline, post-exercise intervention, and 24-week follow-up, and include recruitment and follow-up rates, health-related fitness, psychosocial outcomes, and cancer symptoms. RESULTS: NC survivors represented 7 of 80 participants in the ACE pilot; however, only 3 of the 7 (43%) completed the study. Findings suggested a need to consider supervised exercise for some survivors with NC. To date, 14 NC survivors have enrolled in the ACE implementation study. Participants are screened and then referred to either supervised clinic-based or community-based exercise. Seven of 9 participants have completed the ACE intervention, and 5 of 5 have completed the 24-week follow-up. NC participants improved or maintained health-related physical fitness, and reported reduced symptom burden and fatigue. CONCLUSION: Preliminary results suggest exercise training is feasible and beneficial for NC survivors. To optimize recruitment and outcomes, efforts are needed to better identify, screen, and refer survivors to appropriate exercise programming.
Introduction: Outside of key conditions such as cardiac arrest and trauma, little is known about the epidemiology of mortality among all transported EMS patients. The objective of this study is to describe the characteristics of EMS patients who die in a health care facility after transport. Methods: EMS transport events over one year (April 2015-16) from a BLS/ALS system serving an urban/rural population of approximately 2 million were linked with in-hospital datasets to determine the proportion of all-cause in-hospital mortality by Medical Priority Dispatch System (MPDS) determinant (911 call triage system), age in years (adult: >=18 yrs; pediatric: <=17 yrs), sex, day of week, season, time of day (in six-hour periods), and emergency department Canadian Triage and Acuity Scale (CTAS) level. The MPDS card, patient chief complaint, and ED diagnosis category (International Classification of Diseases v.10 - Canadian) with the highest proportion of mortality are also reported. Analyses included two-sided t-tests or chi-square tests with alpha <0.05. Results: A total of 239,534 EMS events resulted in 159,507 patient transports; 141,114 were included for analysis after duplicate removal (89.1% linkage), with 127,867 reporting a final healthcare system outcome. There were 4,269 patients who died (3.3%; 95% CI 3.2%, 3.4%). The proportion of mortality by MPDS determinant, from most to least critical 911 call, was Echo (7.3%), Delta (37.2%), Charlie (31.3%), Bravo (5.8%), Alpha (18.3%), and Omega (0.3%). For adults, the mean age of survivors was lower than that of non-survivors (57.7 vs. 75.8; p<0.001), whereas pediatric survivors were older than non-survivors (8.7 vs. 2.8; p<0.001). More males than females died (53.0% vs. 47.0%; p<0.001). There was no statistically significant difference by day of week (p=0.592), but there was by season, with the highest mortality in winter (27.1%; p=0.045). The highest mortality occurred among patients presenting to EMS between 0600-1200 hours (34.6%), and the lowest between 0000-0600 hours (11.8%; p<0.001). Mortality by CTAS level was category 1 (27.1%), 2 (36.7%), 3 (29.9%), 4 (4.3%), and 5 (0.5%). The highest mortality was seen in MPDS card 26-Sick Person (specific diagnosis) (19.1%), the chief complaint of shortness of breath (19.3%), and ED diagnoses pertaining to the circulatory system (31.1%). Conclusion: Significant differences in all-cause in-hospital mortality were found across event, patient, and clinical characteristics. These data provide foundational, hypothesis-generating knowledge regarding mortality in transported EMS patients that can be used to guide research and training. Future research should further explore the characteristics of those who access health care through the EMS system.
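The abstract does not state how the 95% confidence interval around the overall mortality proportion was computed; a simple normal-approximation (Wald) interval on the reported counts (4,269 deaths among 127,867 patients with a known outcome) reproduces the published figures, as this hedged sketch shows:

```python
# Sketch: reproducing the overall mortality proportion and 95% CI from the
# counts reported above (4,269 deaths among 127,867 patients with a known
# outcome). The abstract does not specify the interval method; a simple
# normal-approximation (Wald) interval is assumed here for illustration.
import math

deaths, n = 4269, 127867
p = deaths / n
se = math.sqrt(p * (1 - p) / n)
lo, hi = p - 1.96 * se, p + 1.96 * se
print(f"{p:.1%} (95% CI {lo:.1%}, {hi:.1%})")  # ~3.3% (3.2%, 3.4%)
```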
Introduction: Undertreated pain is known to cause short- and long-term harm in children. Limb injuries are a common painful condition in emergency department (ED) patients, accounting for 12% of ED visits by children. Our city has one pediatric ED in a freestanding children’s hospital and 3 general EDs that treat both adults and children; 68% of pediatric limb injuries in our city are treated in the pediatric ED and 32% in a general ED. A quality improvement (QI) initiative focusing on “Commitment to Comfort” was developed at the children’s hospital ED in April 2015. After achieving its aims at the children’s hospital, a QI collaborative was formed among the pediatric ED and the 3 general EDs to 1) improve the proportion of children citywide receiving analgesia for limb injuries from 27% to 40% and 2) reduce the median time to analgesia from 37 minutes to 15 minutes, during the time period of April-September 2016. Methods: Data were obtained from computerized order entry records for children 0-17.99 years visiting any participating ED with a chief complaint of limb injury. Project teams from each site met monthly to discuss aims, develop key driver diagrams, plan tests of change, and share learnings. Implementation strategies were based on the Model for Improvement with PDSA cycles. Patient and family consultation was obtained. Process measures included the proportion of children treated with analgesic medication and time to analgesia; balancing measures were duration of triage and length of stay for limb injury and all patients. Site-specific run charts were used to detect special cause variation. Data from all sites were combined at study end to measure city-wide impact using χ² and interrupted time series analysis. Results: During the 3.5-year time period studied (April 1, 2014-September 30, 2017), there were 45,567 visits to the participating EDs by children 0-17.99 years with limb injury. All visits were included in the analysis. Special cause variation was detected in run charts of all process measures. Interrupted time series analysis comparing the year prior to implementation at the children’s hospital in April 2015 to the year following completion of implementation at the 3 general hospitals in October 2016 demonstrated that the proportion of patients with limb injury receiving analgesia increased from 27% to 40% (p<0.01), and the median time from arrival to analgesia decreased from 37 to 11 minutes (p<0.01). Balancing measure analysis is in progress. Conclusion: This multisite initiative emphasizing “Commitment to Comfort” was successful in improving pain outcomes for children with limb injuries seen in EDs city-wide, and improvements were sustained for one year following implementation. A QI collaborative can be an effective method for spreading improvement. The project team is now spreading the Commitment to Comfort initiative to over 30 rural and regional EDs throughout the province through the establishment of a provincial QI collaborative.
Introduction: Community Paramedics (CPs) require access to timely blood analysis in the field to guide treatment and transport decisions. Point of care testing (POCT), as opposed to traditional laboratory analysis, may offer a solution, but limited research exists on CP POCT. The objective of this study is to compare the validity of two POCT devices (Abbott i-STAT® and Alere epoc®) and their use by CPs in the community. Methods: In a CP programme responding to 6,000 annual patient care events, a split-sample validation of POCT against traditional laboratory analysis for seven analytes (sodium, potassium, chloride, creatinine, hemoglobin, hematocrit, and glucose) was conducted on a consecutive sample of patients. The difference in the proportion of discrepant results between POCT and the laboratory was compared using a two-sample proportion test. Usability was analysed by a survey of CP experience, an expert heuristic evaluation of the devices, a review of device-logged errors, coded observations of POCT use during quality control testing, and a linear mixed effects model of System Usability Scale (SUS) scores adjusted for CP clinical and POCT experience. Results: Of 1,649 CP calls for service screened for enrollment, 174 had a blood draw, with 108 patient care encounters (62.1%) enrolled from 73 participants. Participants had a mean age of 58.7 years (SD 16.3); 49% were female. In 4 of 646 (0.6%) individual comparisons, POCT reported a critical value that the laboratory did not, with no statistically significant difference in the number of discrepant critical values reported by epoc® compared to i-STAT®. There were no instances of the laboratory reporting a critical value when POCT did not. In 88 of 1,046 (8.4%) individual comparisons, the a priori defined acceptable difference between POCT and the laboratory was exceeded; this occurred more often with epoc® (10.7%; 95% CI 8.1%, 13.3%) than with i-STAT® (6.1%; 95% CI 4.1%, 8.2%) (p=0.007). Eighteen of 19 CP surveys were returned, with 11/18 (61.1%) preferring i-STAT® over epoc®. The i-STAT® had a higher mean SUS score (higher usability) than the epoc® (84.0/100 vs. 59.6/100; p=0.011). Fewer field blood analysis device-logged errors occurred with i-STAT® (7.8%; 95% CI 2.9%, 12.7%) than with epoc® (15.5%; 95% CI 9.3%, 21.7%), although the difference was not statistically significant (p=0.063). Conclusion: CP programs can expect valid results from POCT. The usability assessment suggests a preference for i-STAT®.
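The per-device denominators behind the 10.7% and 6.1% discrepancy rates are not reported; assuming the 1,046 individual comparisons split roughly evenly between devices (~523 each, giving about 56 discrepant epoc® results and 32 discrepant i-STAT® results), a standard two-sample proportion test reproduces a p-value close to the reported 0.007. A minimal sketch under those assumptions:

```python
# Sketch of the two-sample proportion comparison described above. The
# abstract does not report per-device denominators; an even split of the
# 1,046 comparisons (~523 per device) and discrepant counts of 56 (epoc)
# and 32 (i-STAT) are assumptions chosen to match the reported percentages.
from statsmodels.stats.proportion import proportions_ztest

discrepant = [56, 32]     # epoc, i-STAT (assumed counts)
comparisons = [523, 523]  # assumed per-device totals

z, p_value = proportions_ztest(discrepant, comparisons)
print(f"z = {z:.2f}, p = {p_value:.3f}")  # p close to the reported 0.007
```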
Introduction: Pediatric pain is often under-treated in emergency departments (EDs), which is known to cause short- and long-term harm. A recent quality improvement collaborative (QIC) was successful in improving the treatment of children’s pain across 4 EDs in our city. A new QIC was then formed among EDs across our province to improve the treatment of presenting and procedural pain. Aims were to improve the proportion of children <12 years of age who receive topical anesthetic before needle procedures from 13% to 50%, and, for children <17 years of age with fractures, to 1) improve the proportion who receive analgesic medication from 35% to 50%, 2) improve the proportion who have a documented pain score from 23% to 50%, and 3) reduce median time to analgesia from 59 minutes to 30 minutes, within 1 year. Methods: Invitations to participate in the QIC were sent to all 113 EDs in the province that treat children and had not participated in the previous QIC. Each site was asked to form a project team, participate in monthly webinars, develop key driver diagrams and project aims, undertake PDSA tests of change, and audit charts to assess performance. Sites were given a list of 20 randomly selected charts per month for audit. Audit data were entered into REDCap and uploaded to a provincial run chart dashboard. All participating sites received a “comfort kit” consisting of distraction items for children as well as educational materials. Measures of presenting pain included the proportion of children <17 years with a diagnosis of fracture who have a documented pain score, the proportion who receive an analgesic medication, and minutes to analgesia. The measure for procedural pain was the proportion of children <12 years who receive topical anesthetic prior to a needle procedure for a laboratory test. Length of stay for pediatric patients and for all patients was a balancing measure. Run charts were used to detect special cause variation. Differences in proportions were compared using χ². The final analysis will include interrupted time series. Results: 34 of 113 invited sites (30%) agreed to participate, including rural and regional representation from all geographic zones; 4,222 visits since June 2016 were analyzed. Implementation began June 2017. Comparing the first 4 months following implementation to the preceding year, the proportion of children receiving topical anesthetic prior to needle procedures increased from 13% to 25% (p<0.001). For children with fractures, the proportion with pain scores increased from 23% to 35% (p<0.001), the proportion receiving analgesic medication increased from 35% to 42% (p<0.001), and median minutes to analgesia decreased from 59 to 43. Insufficient time points at this stage preclude identification of special cause variation. Conclusion: This province-wide QIC has already resulted in significant progress toward its aims during the first 4 months of implementation. The QIC approach shows promise for improving pain outcomes in children visiting diverse EDs across a province.
Megafossils and macrofossils of terrestrial plants (trees, leaves, fruiting bodies, etc.) are found in sedimentary and pyroclastic units interbedded with lavas in many ancient lava fields worldwide, attesting to subaerial environments of eruption and the establishment of viable plant communities during periods of volcanic quiescence. Preservation within lava is relatively rare and generally confined to the more robust woody tissues of trees, which are then revealed in the form of charcoal, mineralised tissue or as trace fossil moulds (tree moulds) and casts of igneous rock (tree casts, s.s.).
In this contribution, we document several such fossil trees (s.l.), and the lavas with which they are associated, from the Palaeocene Mull Lava Field (MLF) on the Isle of Mull, NW Scotland. We present the first detailed geological account of a unique site within the Mull Plateau Lava Formation (MPLF) at Quinish in the north of the island and provide an appraisal of the famous upright fossil tree – MacCulloch's Tree – remotely located on the Ardmeanach Peninsula on the west coast of the island, and of another large upright tree (the Carsaig Tree) near Malcolm's Point in the district of Brolass, SW Mull; both occur within the earlier Staffa Lava Formation (SLF). The taphonomy of these megafossils, along with palynological and lithofacies assessments of associated strata, allows speculation about their likely taxonomic affinity and about the duration of the hiatuses that supported the establishment of forest/woodland communities. The Ardmeanach and Carsaig specimens, because of their size and preservation as upright (? in situ) casts enveloped by spectacularly columnar-jointed basaltic lava, appear to be unique. The aspect of these trees, the thickness of the enveloping lavas and the arrangement of cooling joints adjacent to the trees imply rapid emplacement, ponding and slow, static cooling of voluminous and highly fluid basaltic magma. The specimens from Quinish include two prostrate casts and several prostrate moulds that collectively have a preferred orientation, aligning approximately perpendicular to that of the regional Mull Dyke Swarm, the putative fissure source of the lavas, suggesting that local palaeo-flow was directed towards the WSW. The Quinish Lava is an excellent example of a classic pāhoehoe (compound-braided) type, preserving some of the best examples of surface and internal features so far noted from the Hebridean Igneous Province (HIP) lava fields.
These Mull megafossils are some of the oldest recorded examples, remarkably well preserved, and form a significant feature of the island's geotourism industry.
The focal article by Bergman and Jean (2016) raises an important issue by documenting the underrepresentation of nonprofessional and nonmanagerial workers in industrial and organizational (I-O) research. They defined workers as “people who were not executive, professional or managerial employees; who were low- to medium-skill; and/or who were wage earners rather than salaried” (p. 89). This definition encompasses a wide range of employee samples: from individuals working in blue-collar skilled trades such as electricians and plumbers, to police officers, soldiers, and call center representatives, to low-skill jobs such as fast-food worker, tollbooth operator, and migrant day worker. Because there is considerable variability in the pay, benefits, skill level, autonomy, job security, schedule flexibility, and working conditions that define these workers’ experiences, a more fine-grained examination of who these workers are is necessary to understand the scope of the problem and the specific subpopulations of workers represented (or not) in existing I-O research.
In an attempt to distill what we know about the effects of workplace mindfulness-based training, Hyland, Lee, and Mills (2015) cast a wide net with regard to the array of studies included in their review. For example, they include studies that investigate the benefits associated with workplace mindfulness training (e.g., Wolever et al., 2012) as well as training conducted for patients within primary care settings (e.g., Allen, Bromley, Kuyken, & Sonnenberg, 2009). In addition, their review includes studies based on self-reports of individual differences in mindfulness traits/skills (e.g., Hafenbrack, Kinias, & Barsade, 2014). Reviewing a broad cross-section of research is helpful to illustrate the wide-ranging nature of mindfulness research but also has the potential to obfuscate what we know about mindfulness as it pertains to workers and workplaces.
An abattoir-based study was undertaken between January and May 2013 to estimate the prevalence of Salmonella spp. and Yersinia spp. carriage and the seroprevalence of antibodies to Toxoplasma gondii and porcine reproductive and respiratory syndrome virus (PRRSv) in UK pigs at slaughter. In total, 626 pigs were sampled at 14 abattoirs that together process 80% of the annual UK pig slaughter throughput. Sampling was weighted by abattoir throughput, and sampling dates and pig carcasses were randomly selected. Rectal swabs, blood samples, carcass swabs and the whole caecum, tonsils, heart and tongue were collected. Salmonella spp. was isolated from 30·5% [95% confidence interval (CI) 26·5–34·6] of caecal content samples but from only 9·6% (95% CI 7·3–11·9) of carcass swabs, which was significantly lower than in a UK survey in 2006–2007. S. Typhimurium and S. 4,[5],12:i:- were the most commonly isolated serovars, followed by S. Derby and S. Bovismorbificans. The prevalence of Yersinia enterocolitica carriage in tonsils was 28·7% (95% CI 24·8–32·7), whereas carcass contamination was much lower at 1·8% (95% CI 0·7–2·8). The seroprevalence of antibodies to Toxoplasma gondii and PRRSv was 7·4% (95% CI 5·3–9·5) and 58·3% (95% CI 53·1–63·4), respectively. This study provides a comparison to previous abattoir-based prevalence surveys for Salmonella and Yersinia, and the first UK-wide seroprevalence estimates for antibodies to Toxoplasma and PRRSv in pigs at slaughter.
We carried out an extensive photometric and spectroscopic investigation of the SPB binary HD 25558 (see Fig. 1 for the time and geographic distribution of the observations). The ~2000 spectra obtained at 13 observatories during 5 observing seasons, the ground-based multi-colour light curves and the photometric data from the MOST satellite revealed that this object is a double-lined spectroscopic binary with a very long orbital period of about 9 years. We determined the physical parameters of the components and found that both lie within the SPB instability strip. Accordingly, both components show line-profile variations consistent with stellar pulsations. Altogether, 11 independent frequencies and one harmonic frequency were identified in the data. The observational data do not allow the inference of a reliable orbital solution; thus, disentangling cannot be performed on the spectra. Since the lines of the two components are never completely separated, the analysis is very complicated. Nevertheless, pixel-by-pixel variability analysis of the cross-correlated line profiles was successful, and we were able to attribute all the frequencies to either the primary or the secondary component. Spectroscopic and photometric mode identification was also performed for several of these frequencies for both binary components. The spectroscopic mode-identification results suggest that the inclination and rotation of the two components are rather different. While the primary is a slow rotator with a ~6 d rotation period, seen at ~60° inclination, the secondary rotates fast, with a ~1.2 d rotation period, and is seen at ~20° inclination. Our spectropolarimetric measurements revealed that the secondary component has a magnetic field with a strength of at least a few hundred gauss, whereas no magnetic field was detected in the primary.
The detailed analysis and results of this study will be published elsewhere.
The twin summits of Preshal More and Preshal Beg, near Talisker, Isle of Skye, comprise the erosional remnants of a thick (at least 120 m) compound olivine tholeiite lava, or flow field, that ponded in palaeo-valleys within the Palaeocene lava field of west-central Skye. This unique flow field constitutes the Talisker Formation and is the youngest preserved extrusive unit of the Skye Lava Field. The lava inundated a complex of palaeo-valleys incised into the higher stratigraphical levels of the existing lava field, and remnants of the original sedimentary fill of these valleys still exist, the Preshal Beg Conglomerate Formation. The lava displays spectacularly well-developed two-tier (colonnade-entablature) columnar joint sets that formed as a consequence of slow, uninterrupted cooling through its base and sidewalls, aided by groundwater circulation and water ingress (from displaced drainage) directed into the lava's interior by master-joint systems. Intrusive phenomena developed at both the base and the top of the lava and there is evidence for the existence of subsurface feeder tubes. The tholeiitic composition of the Talisker Formation lava contrasts with the transitional, mildly alkaline characteristics of the remainder of the (older) lavas of Skye Lava Field. In broad terms, the Talisker Formation lava is compositionally very similar to the suite of cone-sheets emplaced into the oldest of the four intrusive centres that comprise the Skye Central Complex – the Cuillin Intrusive Centre – together with a high proportion of the Skye regional dyke swarm. The stratigraphical position, field relationships and compositional characteristics of the lava indicate that it was erupted and emplaced as an intracanyon-style flow field during the early shield-building stage in the growth of the (tholeiitic) Cuillin Volcano, which post-dates the main Skye ‘plateau’ Lava Field. Although the remnant outcrops are detached from their likely source area through erosion, this tholeiitic lava provides the first direct evidence linking the central complexes of the British Palaeogene Igneous Province and their eruptive products.
We used differential time to positivity between central and peripheral blood cultures to evaluate the positive predictive value (PPV) of the National Healthcare Safety Network central line–associated bloodstream infection (CLABSI) surveillance definition among hematology patients with febrile neutropenia. The PPV was 27.7%, which suggests that, when the definition is applied to this population, CLABSI rates will be substantially overestimated.
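For reference, the positive predictive value quoted above follows the standard definition; the underlying episode counts are not given in this summary, so the expression is stated in general terms only, with differential time to positivity serving as the reference standard:

$$\mathrm{PPV} = \frac{\text{definition-positive episodes confirmed as CLABSI by differential time to positivity}}{\text{all episodes meeting the NHSN CLABSI surveillance definition}} \approx 0.277$$

On this basis, roughly seven in ten definition-positive episodes in this population would not be confirmed central line–associated bloodstream infections, which is the sense in which surveillance rates would be overestimated.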
Field biologists adopted the term habituation from physiology, where it denotes the relatively persistent waning of a response as a result of repeated stimulation that is not followed by any kind of reinforcement (Thorpe, 1963). Repeated neutral contacts between primates and humans can lead to a reduction in fear, and ultimately to the ignoring of an observer. Historically, the techniques and processes involved were rarely described, as habituation was generally viewed as a means to an end (Tutin & Fernandez, 1991). As we become increasingly aware of the potential effects of observer presence on primate behaviour, and especially of the potential risks of close proximity to humans, it behoves us to measure as much about the habituation process as possible. However, most recent studies that have quantified primate behaviour in relation to habituators have focussed on great apes (see, for example, Ando et al., 2008; Bertolani & Boesch, 2008; Blom et al., 2004; Cipolletta, 2003; Doran-Sheehy et al., 2007; Sommer et al., 2004; Werdenich et al., 2003), with little information available for other primate taxa (but see Jack et al., 2008).
There are limits to what studies of unhabituated primates can achieve: it is difficult to observe at close range, so subtle or cryptic behaviour such as facial expressions and soft vocalizations may be missed, and even individual identification may be difficult, resulting in analyses based only on age–sex classes.
The US National Security Council uses collaboration as a means of integrating agencies across the federal government during the planning and execution of common goals in support of unified national security. The concept of collaboration also has benefits in the healthcare system, building trust, sharing resources, and reducing costs. Current terrorist threats have made collaborative medical training between military and civilian agencies crucial.
This review summarizes the long and rich history of collaboration between civilians and the military in various countries and provides support for the continuation and improvement of collaborative efforts. Through collaboration, advances in the treatment of injuries have been realized, deaths have been reduced, and significant strides in the betterment of the Emergency Medical System have been achieved. This review promotes collaborative medical training between military and civilian medical professionals and provides recommendations for the future based on medical collaboration.