The practice of medicine often requires procedures that cause pain and anxiety. With the advent of modern anaesthesia these procedures have become commonplace and tolerable. Procedures with the greatest degree of pain are frequently accomplished during a state of general anaesthesia. Many procedures, however, are performed under sedation and analgesia. In contrast to general anaesthesia, sedation and analgesia use short-acting medications to alleviate pain and anxiety while leaving patients capable of maintaining their airway and basic physiological functions.
Translocation and rehabilitation programmes are critical tools for wildlife conservation. These methods achieve greater impact when integrated into a combined strategy for enhancing population or ecosystem restoration. During 2002–2016 we reared 37 orphaned southern sea otter Enhydra lutris nereis pups, using captive sea otters as surrogate mothers, then released them into a degraded coastal estuary. Because the sea otter is a keystone species, observed increases in the local population brought many ecosystem benefits. The role that surrogate-reared otters played in this success story, however, remained uncertain. To resolve this, we developed an individual-based model of the local population using surveyed individual fates (survival and reproduction) of surrogate-reared and wild-captured otters, and modelled estimates of immigration. Estimates derived from a decade of population monitoring indicated that surrogate-reared and wild sea otters had similar reproductive and survival rates. This was true for males and females, across all ages (1–13 years) and locations evaluated. The model simulations indicated that reconstructed counts of the wild population are best explained by surrogate-reared otters combined with low levels of unassisted immigration. In addition, the model shows that 55% of observed population growth over this period is attributable to surrogate-reared otters and their wild progeny. Together, our results indicate that the integration of surrogacy methods and reintroduction of juvenile sea otters helped establish a biologically successful population and restore a once-impaired ecosystem.
We analyzed antibiotic use data from 29 southeastern US hospitals over a 5-year period to determine changes in antibiotic use after the fluoroquinolone US Food and Drug Administration (FDA) advisory update in 2016. Fluoroquinolone use declined both before and after the FDA announcement, and the use of select, alternative antibiotics increased after the announcement.
Fluoroquinolones are among the 4 most commonly prescribed antibiotic classes.1,2 Postmarketing reports of serious adverse events linked to fluoroquinolones include tendonitis, neuropathy, hypoglycemia, psychiatric side effects, and possible aortic vessel rupture, leading to safety label changes in July 2008 and August 2013.3 In July 2016, the US Food and Drug Administration (FDA) strengthened the “black box” warning following an initial safety announcement in May 2016, recommending avoidance of fluoroquinolones for uncomplicated infections such as acute exacerbation of chronic bronchitis, uncomplicated urinary tract infections, and acute bacterial sinusitis.4 Concerns over safety and the association with Clostridioides difficile infection have led inpatient antimicrobial stewardship programs (ASPs) to develop initiatives to promote avoidance of quinolones. The objective of this study was to quantify the effect of the 2016 FDA “black box” update on inpatient antibiotic use among a cohort of southeastern US hospitals.
The objective of the CAEP Global Emergency Medicine (EM) panel was to identify successes, challenges, and barriers to engaging in global health in Canadian academic emergency departments, formulate recommendations for increasing engagement of faculty, and guide departments in developing a Global EM program.
A panel of academic Global EM practitioners and residents met regularly via teleconference in the year leading up to the CAEP 2018 Academic Symposium. Recommendations were drafted based on a literature review, three mixed methods surveys (CAEP general members, Canadian Global EM practitioners, and Canadian academic emergency department leaders), and panel members’ experience. Recommendations were presented at the CAEP 2018 Academic Symposium in Calgary and further refined based on feedback from the Academic Section.
A total of nine recommendations are presented here. Seven of these are directed towards Canadian academic departments and divisions and intend to increase their engagement in Global EM by recognizing it as an integral part of the practice of emergency medicine, deliberately incorporating it into strategic plans, identifying local leaders, providing tangible supports (e.g., research, administrative or financial support, shift flexibility), mitigating barriers, encouraging collaboration, and promoting academic deliverables. The final two recommendations pertain to CAEP increasing its own engagement and support of Global EM.
These recommendations serve as guidance for Canadian academic emergency departments and divisions to increase their engagement in Global EM.
The rocky shores of the north-east Atlantic have long been studied. Our focus is from Gibraltar to Norway plus the Azores and Iceland. Phylogeographic processes shape biogeographic patterns of biodiversity. Long-term and broadscale studies have shown the responses of biota to past climate fluctuations and more recent anthropogenic climate change. Inter- and intra-specific species interactions along sharp local environmental gradients shape distributions and community structure and hence ecosystem functioning. Shifts in domination by fucoids in shelter to barnacles/mussels in exposure are mediated by grazing by patellid limpets. Further south fucoids become increasingly rare, with species disappearing or restricted to estuarine refuges, caused by greater desiccation and grazing pressure. Mesoscale processes influence bottom-up nutrient forcing and larval supply, hence affecting species abundance and distribution, and can be proximate factors setting range edges (e.g., the English Channel, the Iberian Peninsula). Impacts of invasive non-native species are reviewed. Knowledge gaps, such as work on rockpools and host–parasite dynamics, are also outlined.
The widespread use of herbicides in cropping systems has led to the evolution of resistance in major weeds. The resultant loss of herbicide efficacy is compounded by a lack of new herbicide sites of action, driving demand for alternative weed control technologies. While there are many alternative methods for control, identifying the most appropriate method to pursue for commercial development has been hampered by the inability to compare techniques in a fair and equitable manner. Given that all currently available and alternative weed control methods share an intrinsic energy consumption, the aim of this review was to compare methods based on energy consumption. Energy consumption was compared for chemical, mechanical, and thermal weed control technologies when applied as broadcast (whole-field) and site-specific treatments. Tillage systems, such as flex-tine harrow (4.2 to 5.5 MJ ha⁻¹), sweep cultivator (13 to 14 MJ ha⁻¹), and rotary hoe (12 to 17 MJ ha⁻¹), consumed the least energy of broadcast weed control treatments. Thermal-based approaches, including flaming (1,008 to 4,334 MJ ha⁻¹) and infrared (2,000 to 3,887 MJ ha⁻¹), are more appropriate for use in conservation cropping systems; however, their energy requirements are 100- to 1,000-fold greater than those of tillage treatments. The site-specific application of weed control treatments to control 2-leaf-stage broadleaf weeds at a density of 5 plants m⁻² reduced energy consumption of herbicidal, thermal, and mechanical treatments by 97%, 99%, and 97%, respectively. Significantly, this site-specific approach resulted in similar energy requirements for current and alternative technologies (e.g., electrocution [15 to 19 MJ ha⁻¹], laser pyrolysis [15 to 249 MJ ha⁻¹], hoeing [17 MJ ha⁻¹], and herbicides [15 MJ ha⁻¹]).
Because these methods draw on similar energy sources, a standardized energy comparison provides an opportunity to estimate weed control costs, and it suggests that site-specific weed management is critical to the economically realistic implementation of alternative technologies.
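The site-specific savings quoted above follow from simple arithmetic. The sketch below is an illustration built on the figures reported in this review (it is not code from the study): it applies a reported percentage reduction to a broadcast energy figure.

```python
def site_specific_energy(broadcast_mj_per_ha, reduction_pct):
    """Energy remaining after site-specific application, given a broadcast
    figure (MJ/ha) and the percentage reduction reported in the review."""
    return broadcast_mj_per_ha * (1 - reduction_pct / 100)

# Thermal example: broadcast flaming spans 1,008 to 4,334 MJ/ha; the reported
# 99% site-specific reduction brings that down to roughly 10 to 43 MJ/ha,
# in the same range as the 15 to 19 MJ/ha quoted for electrocution.
flaming_low = site_specific_energy(1008, 99)
flaming_high = site_specific_energy(4334, 99)
```

Applied to the broadcast figures above, the same calculation shows why site-specific application closes most of the energy gap between thermal treatments and tillage.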
Background: Cervical spondylotic myelopathy (CSM) is the leading cause of spinal cord impairment. In a public healthcare system, wait times to see spine specialists and eventually access surgical treatment for CSM can be substantial. The goals of this study were to determine consultation wait times (CWT) and surgical wait times (SWT), and identify predictors of wait time length. Methods: Consecutive patients enrolled in the Canadian Spine Outcomes and Research Network (CSORN) prospective and observational CSM study from March 2015 to July 2017 were included. A data-splitting technique was used to develop and internally validate multivariable models of potential predictors. Results: A CSORN query returned 264 CSM patients for CWT. The median CWT was 46 days. There were 31% mild, 35% moderate, and 33% severe CSM. There was a statistically significant difference in median CWT between the moderate and severe groups. In total, 207 patients underwent surgical treatment. The median SWT was 42 days. There was a statistically significant difference in SWT between the mild/moderate and severe groups. Short symptom duration, less pain, lower BMI, and lower physical component score of SF-12 were predictive of shorter CWT. Only baseline pain and medication duration were predictive of SWT. Both CWT and SWT were shorter compared to a concurrent cohort of lumbar stenosis patients (p < 0.001). Conclusions: Patients with shorter duration (either symptoms or medication) and less neck pain waited less to see a spine specialist in Canada and to undergo surgical treatment. This study highlights some of the obstacles to overcome in expedited care for this patient population.
Although the thermal evolution of the mantle before c. 3.0 Ga remains unclear, since c. 3.0 Ga secular cooling has dominated over heat production—this is time's arrow. By contrast, the thermal history of the crust, which is preserved in the record of metamorphism, is more complex. Heat to drive metamorphism is generated by radioactive decay and viscous dissipation, and is augmented by the influx of heat from the mantle. Notwithstanding that reliable data are sparse before the Neoarchean, we use a dataset of temperature (T), pressure (P) and thermobaric ratio (T/P at the metamorphic ‘peak’), and age of metamorphism (t, the timing of the metamorphic ‘peak’) for rocks from 564 localities ranging in age from the Cenozoic to the Eoarchean eras to interrogate the crustal record of metamorphism as a proxy for the heat budget of the crust through time. On the basis of T/P, metamorphic rocks are classified into three natural groups: high T/P type (T/P >775°C/GPa, mean T/P ~1105°C/GPa), including common and ultrahigh-temperature granulites, intermediate T/P type (T/P between 775 and 375°C/GPa, mean T/P ~575°C/GPa), including high-pressure granulites and medium- and high-temperature eclogites, and low T/P type (T/P <375°C/GPa, mean T/P ~255°C/GPa), including blueschists, low-temperature eclogites and ultrahigh-pressure metamorphic rocks. A monotonic increase in the P of intermediate T/P metamorphism from the Neoarchean to the Neoproterozoic reflects strengthening of the lithosphere during secular cooling of the mantle—this is also time's arrow. However, temporal variation in the P of intermediate T/P metamorphism and in the moving means of T and T/P of high T/P metamorphism, combined with the clustered age distribution, demonstrate the cyclicity of collisional orogenesis and cyclic variations in the heat budget of the crust superimposed on secular cooling since c. 3.0 Ga—this is time's cycle. 
A first cycle began with the widespread appearance/survival of intermediate T/P and high T/P metamorphism in the Neoarchean rock record coeval with amalgamation of dispersed blocks of lithosphere to form protocontinents. This cycle was terminated by the fragmentation of the protocontinents into cratons in the early Paleoproterozoic, which signalled the start of a new cycle. The second cycle continued with the progressive amalgamation of the cratons into the supercontinent Columbia and extended until the breakup of the supercontinent Rodinia in the Neoproterozoic. This cycle represented a period of relative tectonic and environmental stability, and perhaps reduced subduction during at least part of the cycle. During most of the Proterozoic the moving means for both T and T/P of high T/P metamorphism exceeded the arithmetic means, reflecting insulation of the mantle beneath the quasi-integrated lithosphere of Columbia and, after a limited reorganisation, Rodinia. The third cycle began with the steep decline in thermobaric ratios of high T/P metamorphism to their lowest value, synchronous with the breakup of Rodinia and the formation of Pannotia, and the widespread appearance/preservation of low T/P metamorphism in the rock record. The thermobaric ratios for high T/P metamorphism rise to another peak associated with the Pan-African event, again reflecting insulation of the mantle. The subsequent steep decline in thermobaric ratios of high T/P metamorphism associated with the breakup of Pangea at c. 0.175 Ga may indicate the start of a fourth cycle. The limited occurrence of high and intermediate T/P metamorphism before the Neoarchean suggests either that suitable tectonic environments to generate these types of metamorphism were not widely available before then or that the rate of survival was low. 
We interpret the first cycle to record stabilisation of subduction and the emergence of a network of plate boundaries in a plate tectonics regime once the balance between heat production and heat loss changed in favour of secular cooling, possibly as early as c. 3.0 Ga in some areas. This is inferred to have been a globally linked system by the early Paleoproterozoic, but whether it remained continuous to the present is unclear. The second cycle was characterised by stability from the formation of Columbia to the breakup of Rodinia, generating higher than average T and T/P of high T/P metamorphism. The third cycle reflects colder collisional orogenesis and deep subduction of the continental crust, features that are characteristic of modern plate tectonics, which became possible once the average temperature of the asthenospheric mantle had declined to <100°C warmer than the present day after c. 1.0 Ga.
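The three-fold classification used above reduces to two thresholds on the thermobaric ratio. A minimal sketch, using the 775 and 375°C/GPa cut-offs quoted in the text (how to assign points falling exactly on a threshold is our assumption, since the text gives only the open ranges):

```python
def classify_thermobaric(t_celsius, p_gpa):
    """Classify a metamorphic locality by its thermobaric ratio T/P (deg C/GPa),
    using the 775 and 375 deg C/GPa thresholds quoted in the text."""
    ratio = t_celsius / p_gpa
    if ratio > 775:
        return "high T/P"          # common and ultrahigh-temperature granulites
    if ratio >= 375:
        return "intermediate T/P"  # high-P granulites, medium/high-T eclogites
    return "low T/P"               # blueschists, low-T eclogites, UHP rocks
```

For example, a granulite at 900°C and 0.8 GPa gives T/P = 1125°C/GPa, close to the quoted high-T/P mean of ~1105°C/GPa.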
Bowel cancer risk is strongly influenced by lifestyle factors including diet and physical activity. Several studies have investigated the effects of adherence to the World Cancer Research Fund (WCRF)/American Institute for Cancer Research (AICR) cancer prevention recommendations on outcomes such as all-cause and cancer-specific mortality, but the relationships with molecular mechanisms that underlie the effects on bowel cancer risk are unknown. This study aimed to investigate the relationships between adherence to the WCRF/AICR cancer prevention recommendations and wingless/integrated (WNT)-pathway-related markers of bowel cancer risk, including the expression of WNT pathway genes and regulatory microRNA (miRNA), secreted frizzled-related protein 1 (SFRP1) methylation and colonic crypt proliferative state in colorectal mucosal biopsies. Dietary and lifestyle data from seventy-five healthy participants recruited as part of the DISC Study were used. A scoring system was devised including seven of the cancer prevention recommendations and smoking status. The effects of total adherence score and scores for individual recommendations on the measured outcomes were assessed using Spearman’s rank correlation analysis and unpaired t tests, respectively. Total adherence score correlated negatively with expression of Myc proto-oncogene (c-MYC) (P=0·039) and WNT11 (P=0·025), and high adherers had significantly reduced expression of cyclin D1 (CCND1) (P=0·042), WNT11 (P=0·012) and c-MYC (P=0·048). Expression of axis inhibition protein 2 (AXIN2), glycogen synthase kinase (GSK3β), catenin β1 (CTNNB1) and WNT11 and of the oncogenic miRNA miR-17 and colonic crypt kinetics correlated significantly with scores for individual recommendations, including body fatness, red meat intake, plant food intake and smoking status. 
The findings from this study provide evidence for positive effects of adherence to the WCRF/AICR cancer prevention recommendations on WNT-pathway-related markers of bowel cancer risk.
Ralph Waldo Emerson is known to have said, “the greatest wonder is that we can see these trees and not wonder more.” As industrial and organizational (I-O) psychologists, we often encounter this very dilemma when we examine how numerous professions rise and fall in relevance. More recently, however, we have encountered this dilemma from an existential perspective as we strive to understand the evolution of our own profession and the situational characteristics making change inevitable. We have fallen into a trap—we, too, now look at all of our practices, aiming to reconfigure the makeup of our profession while losing sight of the macrotrends affecting more than just our evolved existence. Rather than focusing on the smaller issue first, we need to start by examining the broader issues affecting it.
To evaluate probiotics for the primary prevention of Clostridium difficile infection (CDI) among hospital inpatients.
A before-and-after quality improvement intervention comparing 12-month baseline and intervention periods.
A 694-bed teaching hospital.
We administered a multispecies probiotic comprising Lactobacillus acidophilus (CL1285), L. casei (LBC80R), and L. rhamnosus (CLR2) to eligible antibiotic recipients, from within 12 hours of initial antibiotic receipt through 5 days after the final dose. We excluded (1) all patients on neonatal, pediatric, and oncology wards; (2) all individuals receiving perioperative prophylactic antibiotics; (3) all those restricted from oral intake; and (4) those with pancreatitis or leukopenia, or who were posttransplant. We defined CDI by symptoms plus C. difficile toxin detection by polymerase chain reaction. Our primary outcome was hospital-onset CDI incidence on eligible hospital units, analyzed using segmented regression.
The study included 251 CDI episodes among 360,016 patient days during the baseline and intervention periods, and the incidence rate was 7.0 per 10,000 patient days. The incidence rate was similar during baseline and intervention periods (6.9 vs 7.0 per 10,000 patient days; P=.95). However, compared to the first 6 months of the intervention, we detected a significant decrease in CDI during the final 6 months (incidence rate ratio, 0.6; 95% confidence interval, 0.4–0.9; P=.009). Testing intensity remained stable between the baseline and intervention periods: 19% versus 20% of stools tested were C. difficile positive by PCR, respectively. From medical record reviews, only 26% of eligible patients received a probiotic per the protocol.
Despite poor adherence to the protocol, there was a reduction in the incidence of CDI during the intervention, which emerged ~6 months after the probiotic was introduced for primary prevention.
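The incidence rates reported above follow directly from episode counts and patient days. A one-line check (illustrative arithmetic, not code from the study):

```python
def incidence_per_10000(episodes, patient_days):
    """Hospital-onset CDI incidence per 10,000 patient days."""
    return episodes / patient_days * 10_000

# 251 episodes over 360,016 patient days gives roughly 7.0 per 10,000
# patient days, matching the overall rate reported above.
overall_rate = incidence_per_10000(251, 360_016)
```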
We surveyed resident physicians at 2 academic medical centers regarding urinary testing and treatment as they progressed through training. Demographics and self-reported confidence were compared to overall knowledge using clinical vignette-based questions. Overall knowledge was 40% in 2011 and increased to 48%, 55%, and 63% in subsequent years (P<.001).
To determine the impact of recurrent Clostridium difficile infection (RCDI) on patient behaviors following illness.
Using a computer algorithm, we searched the electronic medical records of 7 Chicago-area hospitals to identify patients with RCDI (2 episodes of CDI within 15 to 56 days of each other). RCDI was validated by medical record review. Patients were asked to complete a telephone survey. The survey included questions regarding general health, social isolation, symptom severity, emotional distress, and prevention behaviors.
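The case definition above (a second episode 15 to 56 days after the first) is a simple interval test. A sketch of the screening logic, with the inclusive boundary handling as our assumption:

```python
from datetime import date

def is_recurrent_cdi(first_episode: date, second_episode: date) -> bool:
    """Study definition of RCDI: a second CDI episode occurring
    15 to 56 days (assumed inclusive) after the first."""
    gap_days = (second_episode - first_episode).days
    return 15 <= gap_days <= 56

# Example: episodes 30 days apart qualify; episodes 10 days apart do not.
```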
In total, 119 patients completed the survey (32%). On average, respondents were 57.4 years old (standard deviation, 16.8); 57% were white, and ~50% reported hospitalization for CDI. At the time of their most recent illness, patients rated their diarrhea as high severity (58.5%) and their exhaustion as extreme (30.7%). Respondents indicated that they were very worried about getting sick again (41.5%) and about infecting others (31%). Almost 50% said that they had washed their hands more frequently (47%) and had increased their use of soap and water (45%) since their illness. Some of these patients (22%–32%) reported eating out less, avoiding certain medications and public areas, and increasing probiotic use. Most behavioral changes were unrelated to disease severity.
Having had RCDI appears to increase prevention-related behaviors in some patients. While some behaviors are appropriate (eg, handwashing), others are not supported by evidence of decreased risk and may negatively impact patient quality of life. Providers should discuss appropriate prevention behaviors with their patients and should clarify that other behaviors (eg, eating out less) will not affect their risk of future illness.
OBJECTIVES/SPECIFIC AIMS: Obesity is a rapidly growing epidemic, and long-term interventions aimed at reducing body weight are largely unsuccessful due to an increased drive to eat and a reduced metabolic rate established during weight loss. Previously, our lab demonstrated that exercise has beneficial effects on weight loss maintenance by increasing total energy expenditure above and beyond the cost of an exercise bout and reducing the drive to eat when animals are allowed to eat ad libitum (relapse). We hypothesized that exercise’s ability to counter these obesogenic impetuses is mediated via improvements in skeletal muscle oxidative capacity, and we tested this using a mouse model with augmented oxidative capacity in skeletal muscle. METHODS/STUDY POPULATION: We recapitulated the exercise-induced improvements in oxidative capacity using FVB mice that overexpress lipoprotein lipase in skeletal muscle (mLPL). mLPL and wild-type (WT) mice were put through a weight-loss-weight-regain paradigm consisting of a high-fat diet challenge for 13 weeks, with a subsequent 1-week calorie-restricted medium-fat diet to induce a ~15% weight loss. This newly established weight was maintained for 2 weeks and followed by a 24-hour relapse. Metabolic phenotype was characterized by indirect calorimetry during each phase. At the conclusion of the relapse day, mice were sacrificed and tissues were harvested for molecular analysis. RESULTS/ANTICIPATED RESULTS: During weight loss maintenance, mLPL mice had a higher metabolic rate (p=0.0256) that was predominantly evident in the dark cycle (p=0.0015). Furthermore, this increased metabolic rate was not due to differences in activity (p=0.2877) or resting metabolic rate (p=0.4881). During relapse, mLPL mice ingested fewer calories and were protected from rapid weight regain (p=0.0235), despite WT mice exhibiting higher metabolic rates during the light cycle (p=0.0421).
DISCUSSION/SIGNIFICANCE OF IMPACT: These results highlight the importance of muscular oxidative capacity in preventing a depression in total energy expenditure during weight loss maintenance, and in curbing overfeeding and weight regain during a relapse. Moreover, our data suggest that the thermic effect of food is responsible for the differences in metabolic rate, because no differences were found in activity or resting metabolic rate. Additional studies are warranted to determine the molecular mechanisms driving the ability of oxidative capacity to assist with weight loss maintenance.
We present color-magnitude diagrams (CMDs) based on HST F555W (“V”) and F814W (“I”) observations of three old LMC clusters: NGC 2210, NGC 1786, and Reticulum. The fiducial derived from the CMD of NGC 2257, another old LMC cluster, provided a good fit to the data for the new clusters. Because NGC 2257 has a metallicity ([Fe/H] ∼ −1.8) similar to those of NGC 2210, NGC 1786, and Reticulum, the agreement between the CMDs of all four clusters indicates that they have the same age. This preliminary analysis suggests that any age differences are smaller than 2 Gyr. These new results mean that there are now 11 old LMC clusters with similar ages. An initial epoch of star cluster formation therefore happened over a short period within a large volume of space, one much larger than that covered by the present-day optical LMC.