With the exception of near-occlusion, CEA is of overall benefit for selected patients with recent symptomatic carotid stenosis ≥50% (NASCET method), provided surgical stroke/death risk is low. Benefit is greater with greater stenosis, in men, in the elderly (aged ≥75 years), when the most recent ischaemic event occurred within 2 weeks, with an irregular plaque surface, and with impaired cerebral perfusion reserve. Patients with recent symptomatic carotid territory ischaemic events should be screened by Doppler ultrasonography, MRA, or CTA, with substantial stenosis confirmed by a second non-invasive investigation. Catheter angiography may be required to resolve uncertain results. The surgical peri-operative stroke and death rate (7% in RCTs) is higher in women and in patients with hypertension, peripheral arterial disease, or occlusion of the contralateral ICA or ipsilateral ECA. The experience of the surgeon and hospital is crucial, and audited peri-operative complication rates should be publicly available. Carotid stenting is less invasive than CEA and causes fewer local complications (cranial neuropathy and neck haematoma), but carries a higher procedural risk of stroke. Stenting should be considered in younger patients, or those at increased risk from CEA. Stenting carries high risk for intracranial vertebral artery stenosis but low risk for extracranial stenosis, where it should be considered for recurrent symptoms despite optimal medical therapy.
Intracerebral haemorrhage and subarachnoid haemorrhage are associated with considerable morbidity and mortality. Too often the focus is on acute treatment after a haemorrhage has occurred, instead of primary and secondary prevention. Medical therapies to control hypertension, achieve tobacco abstinence, and avoid excessive alcohol consumption can confer broad reductions in haemorrhage risk across pathophysiological subtypes. Judicious restriction of antiplatelet and anticoagulant therapies to only those individuals and those intensities for which they are indicated also can substantially reduce haemorrhagic stroke frequency. Specific endovascular and surgical therapies, judiciously employed, will further reduce risk of first or recurrent haemorrhage from structural vascular anomalies, including arteriovenous malformations, cavernous malformations, and saccular aneurysms. For unruptured intracranial aneurysms, features that favour consideration of preventive occlusion include younger patient age, prior subarachnoid haemorrhage from a different aneurysm, familial intracranial aneurysms, large aneurysm size, irregular shape, basilar or vertebral artery location, and aneurysm growth on serial imaging. Among individuals who are technical candidates for either coiling or clipping, endovascular coiling is associated with a reduction in procedural morbidity and mortality but has a higher risk of recurrence.
Typical enteropathogenic Escherichia coli (tEPEC) infection is a major cause of diarrhoea and contributor to mortality in children <5 years old in developing countries. Data were analysed from the Global Enteric Multicenter Study examining children <5 years old seeking care for moderate-to-severe diarrhoea (MSD) in Kenya. Stool specimens were tested for enteric pathogens, including by multiplex polymerase chain reaction for gene targets of tEPEC. Demographic, clinical and anthropometric data were collected at enrolment and ~60 days later; multivariable logistic regressions were constructed. Of 1778 MSD cases enrolled from 2008 to 2012, 135 (7.6%) children tested positive for tEPEC. In a case-to-case comparison among MSD cases, tEPEC was independently associated with presentation at enrolment with a loss of skin turgor (adjusted odds ratio (aOR) 2.08, 95% confidence interval (CI) 1.37–3.17) and convulsions (aOR 2.83, 95% CI 1.12–7.14). At follow-up, infants with tEPEC were more likely than those without to be underweight (OR 2.2, 95% CI 1.3–3.6) and wasted (OR 2.5, 95% CI 1.3–4.6). Among MSD cases, tEPEC was associated with mortality (aOR 2.85, 95% CI 1.47–5.55). This study suggests that tEPEC contributes to morbidity and mortality in children. Interventions aimed at defining and reducing the burden of tEPEC and its sequelae should be urgently investigated, prioritised and implemented.
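As a worked illustration of the odds-ratio arithmetic behind figures such as aOR 2.08 (95% CI 1.37–3.17), here is a minimal sketch computing an unadjusted odds ratio with a Wald confidence interval from a 2×2 table. The counts are invented for illustration only; the study's adjusted estimates come from multivariable logistic regression, not from this calculation.

```python
import math

def odds_ratio_ci(a, b, c, d, z=1.96):
    """Unadjusted odds ratio and 95% Wald CI from a 2x2 table:
    a = exposed cases, b = exposed non-cases,
    c = unexposed cases, d = unexposed non-cases."""
    or_ = (a * d) / (b * c)
    se = math.sqrt(1/a + 1/b + 1/c + 1/d)  # SE of log(OR)
    lo = math.exp(math.log(or_) - z * se)
    hi = math.exp(math.log(or_) + z * se)
    return or_, lo, hi

# Hypothetical counts, for illustration only (not the study's data)
or_, lo, hi = odds_ratio_ci(40, 95, 120, 600)
print(f"OR {or_:.2f} (95% CI {lo:.2f}-{hi:.2f})")  # OR 2.11 (95% CI 1.39-3.20)
```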
Gravitational waves from coalescing neutron stars encode information about nuclear matter at extreme densities, inaccessible to laboratory experiments. The late inspiral is influenced by the presence of tides, which depend on the neutron star equation of state. Neutron star mergers are expected to often produce rapidly rotating remnant neutron stars that emit gravitational waves. These will provide clues to the extremely hot post-merger environment. This signature of nuclear matter in gravitational waves contains most information in the 2–4 kHz frequency band, which is outside of the most sensitive band of current detectors. We present the design concept and science case for a Neutron Star Extreme Matter Observatory (NEMO): a gravitational-wave interferometer optimised to study nuclear physics with merging neutron stars. The concept uses high circulating laser power, quantum squeezing, and a detector topology specifically designed to achieve the high-frequency sensitivity necessary to probe nuclear matter using gravitational waves. Above 1 kHz, the proposed strain sensitivity is comparable to full third-generation detectors at a fraction of the cost. Such sensitivity changes the expected event rates for detection of post-merger remnants from approximately one per few decades with two A+ detectors to a few per year, and could potentially allow the first gravitational-wave observations of supernovae, isolated neutron stars, and other exotica.
Before his incoherent response to the COVID-19 pandemic, the focus of President Trump's health policy agenda was the elimination of the Patient Protection and Affordable Care Act (ACA), which he has called a ‘disaster’. The attacks on the ACA included proposals to repeal the law through the legislative process, to erode it through a series of executive actions, and to ask the courts to declare it unconstitutional. Despite these ongoing challenges, the ACA remains largely intact as the U.S. heads into the 2020 election. The longer term fate of the law, however, is uncertain and the outcome of the 2020 election is likely to have a dramatic effect on the direction of health policy in the U.S.
Gender is a highly salient and important social group that shapes how children interact with others and how they are treated by others. In this Element, we offer an overview and review of the research on gender development in childhood from a developmental science perspective. We first define gender and the related concepts of sex and gender identity. Second, we discuss how variations in cultural context shape gender development around the world and how variations within gender groups add to the complexity of gender identity development. Third, we discuss major theoretical perspectives in developmental science for studying child gender. Fourth, we examine differences and similarities between girls and boys using the latest meta-analytic evidence. Fifth, we discuss the development of gender, gender identity, and gender socialization throughout infancy, early childhood, and middle childhood. We conclude with a discussion of future directions for the study of gender development in childhood.
The medical savings account model of health insurance in the United States combines a high-deductible health insurance plan with a dedicated savings account used to pay expenses incurred below the deductible. (A high-deductible health plan is an insurance plan under which the beneficiary is responsible for a substantial amount of expense, the deductible, before the insurer begins paying benefits. US federal law as of 2015 requires a deductible of at least US$1300 for single coverage and US$2600 for family coverage for health savings account-qualified high-deductible health plans.) Savings in the plan can roll over from one year to the next and, after some predefined period during which they are dedicated to health spending, can be used for non-health-related expenses. (Use of savings for non-health expenses before this predefined period, age 65 under current law, incurs a tax and a 20% penalty.) In principle, this model combines the incentives for frugal use of health services that exist in high-deductible health insurance with assurance that the funds required in the event of true medical need will be available.
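The account mechanics described above can be sketched as a toy one-year simulation. This is a simplified illustration only: real plans involve coinsurance, contribution limits, and the tax rules noted above, none of which are modelled here.

```python
def msa_year(balance, contribution, expenses, deductible):
    """One plan-year of a medical savings account paired with a
    high-deductible plan. The account pays expenses up to the deductible;
    the insurer pays amounts above it. Unspent balance rolls over.
    (Simplified sketch: ignores coinsurance, limits, and penalties.)"""
    balance += contribution
    out_of_pocket = min(expenses, deductible)        # beneficiary's share
    paid_from_account = min(out_of_pocket, balance)  # covered by savings
    balance -= paid_from_account
    insurer_pays = max(expenses - deductible, 0)     # above-deductible share
    return balance, paid_from_account, insurer_pays

# Example using the US$1300 single-coverage minimum deductible cited above
balance, from_account, insurer = msa_year(0, 2000, 1800, 1300)
print(balance, from_account, insurer)  # 700 1300 500
```

The rollover is visible if the returned balance is fed into the next year's call.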
Life can be a tale of woe, so people contrive arrangements that insulate them from the consequences of some familiar feared mishaps. Insurance – a small present payment in promise of larger compensation in the event of future losses – is one such arrangement. Insurance contracts recompense policy-holders for, for example, the loss of or damage to their home in a fire or their car in an accident, or for the death of a benefactor who took out life insurance. People may also buy insurance that will cover some (maybe most) of their medical costs if they or members of their family fall ill and need care.
Insights into the dynamics of electrochemical processes are critically needed to improve our fundamental understanding of electron, charge, and mass transfer mechanisms and reaction kinetics that influence a broad range of applications, from the functionality of electrical energy-storage and conversion devices (e.g., batteries, fuel cells, and supercapacitors), to materials degradation issues (e.g., corrosion and oxidation), and materials synthesis (e.g., electrodeposition). To unravel these processes, in situ electrochemical scanning/transmission electron microscopy (ec-S/TEM) was developed to permit detailed site-specific characterization of evolving electrochemical processes that occur at electrode–electrolyte interfaces in their native electrolyte environment, in real time and at high-spatial resolution. This approach utilizes “closed-form” microfabricated electrochemical cells that couple the capability for quantitative electrochemical measurements with high spatial and temporal resolution imaging, spectroscopy, and diffraction. In this article, we review the state-of-the-art instrumentation for in situ ec-S/TEM and how this approach has resulted in new observations of electrochemical processes.
Glyphosate is an important component of herbicide programs in orchard crops in California. It can be applied alone or in tank-mix combinations under the crop rows or to the entire field and often is used multiple times each year. There has been speculation about the potential impacts of repeated use of glyphosate in perennial crop systems, whether directly through uptake from shallow root systems or indirectly through effects on nutrient availability in soil. To address these concerns, research was conducted from 2013 to 2020 on key orchard crops to evaluate tree response to glyphosate regimens. Almond, cherry, and prune were evaluated in separate experiments. In each crop, the experimental design was a factorial arrangement of two soil types, four glyphosate rates (0, 1.1, 2.2, and 4.4 kg ae ha−1, applied three times annually), and two post-glyphosate application irrigation treatments. In the first 2 yr of the study, there was no clear impact of the glyphosate regimens on shikimate accumulation or leaf chlorophyll content, which suggested no direct effect on the crop. In the seventh year of the study, after six consecutive years of glyphosate application to the orchard floors, there were no negative impacts of glyphosate application on leaf nutrient concentration or on cumulative trunk growth in any of the three orchard crops. The lack of a negative growth impact even at the highest treatment rate, which included 18 applications of glyphosate totaling nearly 80 kg ae ha−1 over the course of the experiment, suggests there is not likely a significant risk to tree health from judicious use of the herbicide in these production systems. Given the economic importance of orchard crops in California, and grower and industry concerns about pesticides generally and about glyphosate specifically, these findings are timely contributions to weed management concerns in perennial specialty crops.
The radiocarbon (14C) calibration curve so far contains annually resolved data only for a short period of time. With accelerator mass spectrometry (AMS) matching the precision of decay counting, it is now possible to efficiently produce large datasets of annual resolution for calibration purposes using small amounts of wood. The radiocarbon intercomparison on single-year tree-ring samples presented here is the first to specifically investigate possible offsets between AMS laboratories at high precision. The results show that AMS laboratories are capable of measuring samples of Holocene age with an accuracy and precision that are comparable to, or even go beyond, what is possible with decay counting, even though they require a thousand times less wood. The results also show that not all AMS laboratories always produce results that are consistent with their stated uncertainties. The long-term benefit of studies of this kind is more accurate radiocarbon measurements with, in the future, better quantified uncertainties.
Previous genetic association studies have failed to identify loci robustly associated with sepsis, and there have been no published genetic association studies or polygenic risk score analyses of patients with septic shock, despite evidence suggesting genetic factors may be involved. We systematically collected genotype and clinical outcome data in the context of a randomized controlled trial from patients with septic shock to enrich the presence of disease-associated genetic variants. We performed genome-wide association studies of susceptibility and mortality in septic shock using 493 patients with septic shock and 2442 population controls, and polygenic risk score analysis to assess genetic overlap between septic shock risk/mortality and clinically relevant traits. One variant, rs9489328, located in AL589740.1 noncoding RNA, was significantly associated with septic shock (p = 1.05 × 10⁻¹⁰); however, it is likely a false positive. We were unable to replicate variants previously reported to be associated (p < 1.00 × 10⁻⁶ in previous scans) with susceptibility to and mortality from sepsis. Polygenic risk scores for hematocrit and granulocyte count were negatively associated with 28-day mortality (p = 3.04 × 10⁻³; p = 2.29 × 10⁻³), and scores for C-reactive protein levels were positively associated with susceptibility to septic shock (p = 1.44 × 10⁻³). Results suggest that common variants of large effect do not influence septic shock susceptibility, mortality and resolution; however, genetic predispositions to clinically relevant traits are significantly associated with increased susceptibility and mortality in septic individuals.
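A polygenic risk score of the kind used above is, at its core, a weighted sum of effect-allele dosages across selected variants. A minimal sketch, with made-up SNP names and effect sizes (the study's actual scores are built from published GWAS summary statistics):

```python
# Minimal polygenic risk score sketch: PRS_i = sum_j beta_j * dosage_ij,
# where beta_j is the per-allele effect size (e.g. log-odds from a GWAS)
# and dosage_ij is individual i's count (0/1/2) of the effect allele at SNP j.
# SNP names and effect sizes below are hypothetical, for illustration only.

effect_sizes = {"rs1": 0.12, "rs2": -0.05, "rs3": 0.30}

def polygenic_risk_score(dosages, betas):
    """Weighted sum of allele dosages; missing genotypes count as 0."""
    return sum(betas[snp] * dosages.get(snp, 0) for snp in betas)

individual = {"rs1": 2, "rs2": 1, "rs3": 0}
score = polygenic_risk_score(individual, effect_sizes)
print(round(score, 2))  # 0.12*2 - 0.05*1 = 0.19
```

In practice scores are then standardised across the cohort and tested for association with the outcome, as in the regression analyses reported above.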
This chapter provides an overview of economic and behavioral economic approaches to behavior change. The chapter begins with a description of the traditional or neoclassical economic view of decision-making using expected utility theory as its basis. Attempts by an external party (e.g., a government or agency) to change behavior are viewed as justifiable in a limited number of circumstances, such as when there are externalities or coordination failures. When behavior change is warranted, neoclassical economics has focused on four options: provide information, increase incentives, reduce prices or increase subsidies, or impose regulations. To be successful, the approach must change the net benefits of the promoted behavior. The chapter then describes the rationale behind behavioral economic approaches to behavior change, emphasizing the role that “nudges” play in behavior change. Examples are provided of common heuristics and the associated decision errors that can result, and of how nudges are designed to overcome these decision errors. The underlying rationale and steps for developing nudges are summarized. Current evidence suggests that some nudges can be effective in changing behavior, but more research is needed to demonstrate the effectiveness of many nudge strategies. The chapter concludes with a discussion of the likely long-term impact of nudges in the field of behavior change.
OBJECTIVES/GOALS: Primary graft dysfunction (PGD) is acute lung injury in the first three days after lung transplant. Patients who experience PGD have increased mortality and an increased risk of chronic lung allograft dysfunction. The pathogenesis is thought to be an ischemia-reperfusion injury but is incompletely understood, and there are no specific therapies. We investigated the role of the microbiome in PGD and associations with inflammation and markers of aspiration. METHODS/STUDY POPULATION: We collected airway lavage samples from lung transplant donors before procurement and recipients after reperfusion. We extracted DNA, amplified the bacterial 16S rRNA gene, and sequenced on the Illumina MiSeq platform. QIIME2 and Deblur were used for bioinformatic analysis. R packages were used for downstream analysis and visualizations. The host response was quantified using the Millipore 41-plex Luminex panel and an ELISA for pepsin. Clinical data were collected by the Penn Lung Transplant Outcomes Group. PGD was assessed by degree of hypoxemia and chest X-ray findings in the 72 hours after transplant. RESULTS/ANTICIPATED RESULTS: There was no significant difference in alpha diversity (Shannon index, p = 0.51), biomass (via comparison of 16S amplicon PicoGreen quantification, p = 0.6), or beta diversity (weighted UniFrac, p = 0.472, PERMANOVA) between subjects who developed PGD grade 3 (n = 36) and those who did not (n = 96). On taxonomic analysis, we found an enrichment of Prevotella in donor and recipient lungs that went on to develop PGD (p = 0.05). To follow up this finding, we measured immune response and pepsin concentrations in recipient lungs. We found elevated levels of 35/41 cytokines measured in subjects that developed PGD, as well as an elevation in pepsin and a correlation between pepsin concentration and Prevotella relative abundance (Figure 1).
Additionally, Prevotella relative abundance had statistically significant positive correlations with multiple cytokines, such as IL-6 (Pearson's r = 0.26, p = 0.009) and eotaxin (Pearson's r = 0.24, p = 0.016). DISCUSSION/SIGNIFICANCE OF IMPACT: There is an enrichment of oral anaerobes in lung allografts that eventually develop PGD. This is associated with elevated levels of pepsin and markers of inflammation. These lines of evidence suggest aspiration contributes to priming the allograft for PGD.
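The Shannon index used for the alpha-diversity comparison above has a simple closed form, H' = −Σ pᵢ ln pᵢ over taxon proportions. A minimal sketch on toy taxon count vectors (not the study's data):

```python
import math

def shannon_index(counts):
    """Shannon diversity H' = -sum p_i * ln(p_i) over taxa with nonzero counts."""
    total = sum(counts)
    return -sum((c / total) * math.log(c / total) for c in counts if c > 0)

# Toy taxon count vectors, for illustration only
even_community = [25, 25, 25, 25]   # maximally even over 4 taxa: H' = ln(4)
skewed_community = [97, 1, 1, 1]    # dominated by one taxon: low H'
print(shannon_index(even_community))    # ~1.386
print(shannon_index(skewed_community))  # ~0.168
```

In the study this index is computed per sample and compared between PGD and non-PGD groups; pipelines such as QIIME2 compute it the same way from the feature table.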
The conopid fly Stylogaster neglecta Williston (Diptera: Conopidae) is a parasitoid with no known host. We report this species as the first recorded dipteran parasitoid of Oecanthus nigricornis Walker (Orthoptera: Gryllidae) (black-horned tree crickets). We reared field-collected O. nigricornis juveniles over several months in 2017 and found that larval S. neglecta emerged from them during late July into August. We estimated the incubation period for S. neglecta larvae to be around 30 days, based on the length of time it took for them to emerge from the host and pupate (subsequently all hosts died). We documented several cases of multiple parasitism. In 2018, we dissected O. nigricornis sampled from four sites across southern Ontario, Canada, and upstate New York, United States of America, and found that the percentage of juvenile O. nigricornis parasitised ranged from 2% to 39%. Further sampling will be necessary to determine whether this variation represents consistent population differences or between-year variation in parasitism.
Raw milk cheeses are commonly consumed in France and are also a common source of foodborne outbreaks (FBOs). Both an FBO surveillance system and a laboratory-based surveillance system aim to detect Salmonella outbreaks. In early August 2018, five familial FBOs due to Salmonella spp. were reported to a regional health authority. Investigation identified common exposure to a raw goats' milk cheese, from which Salmonella spp. were also isolated, leading to an international product recall. Three weeks later, on 22 August, a national increase in Salmonella Newport ST118 was detected through laboratory surveillance. Concomitantly, isolates from the earlier familial clusters were confirmed as S. Newport ST118. Interviews with a selection of the laboratory-identified cases revealed exposure to the same cheese, including exposure to batches not included in the previous recall, leading to an expansion of the recall. The outbreak affected 153 cases, including six cases in Scotland. S. Newport was detected in the cheese and in the milk of one of the producer's goats. The difference between the two alerts generated by this outbreak highlights the timeliness of the FBO system and the precision of the laboratory-based surveillance system. It is also a reminder of the risks associated with raw milk cheeses.
Introduction: Paramedics commonly administer intravenous dextrose to severely hypoglycemic patients. Typically, the treatment provided is a 25 g ampule of 50% dextrose (D50). This dose of D50 is meant to ensure a return to consciousness. However, this dose may be unnecessary and lead to harm or difficulties regulating blood glucose post treatment. We hypothesize that a lower dose such as 10% dextrose (D10), or titrating D50 to the desired level of consciousness, may be optimal and avoid adverse events. Methods: We systematically searched Medline, Embase, CINAHL and Cochrane Central on June 5, 2019. PRISMA guidelines were followed. GRADE methods and risk of bias assessments were applied to determine the certainty of the evidence. We included primary literature investigating the use of intravenous dextrose in hypoglycemic diabetic patients presenting to paramedics or the emergency department. Outcomes of interest were related to the safe and effective reversal of symptoms and blood glucose levels (BGL). Results: We screened 660 abstracts and 40 full-text articles, and included eight studies. Data from three randomized controlled trials and five observational studies were analyzed. A single RCT comparing D10 to D50 was identified. The primary significant finding of that study was a post-treatment glycemic profile higher by 3.2 mmol/L in the D50 group; no other outcomes differed significantly between groups. When comparing pooled data from all the included studies, we found higher symptom resolution in the D10 group than in the D50 group, at 99.8% and 94.9% respectively. However, the mean time to resolution was approximately 4 minutes longer in the D10 group (4.1 minutes (D50) vs 8 minutes (D10)). There was more need for subsequent doses in the D10 group, at 23.0% versus 16.5% in the D50 group. The post-treatment glycemic profile was lower in the D10 group, at 5.9 mmol/L versus 8.5 mmol/L in the D50 group.
Both treatments had nearly complete resolution of hypoglycemia: 98.7% (D50) and 99.2% (D10). No adverse events were observed in the D10 group (0/871), compared to 12/133 adverse events in the D50 group. Conclusion: D10 may be as effective as D50 at resolving symptoms and correcting hypoglycemia. Although the desired effect can take several minutes longer, there appear to be fewer adverse events. The lower post-treatment glycemic profile may also make ongoing glucose management less challenging for patients.
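As a back-of-envelope check, the pooled adverse-event figures above (12/133 for D50 vs 0/871 for D10) correspond to an absolute risk difference of about 9%:

```python
def risk_difference(events_a, n_a, events_b, n_b):
    """Absolute risk difference between two groups (proportion_a - proportion_b)."""
    return events_a / n_a - events_b / n_b

# Adverse-event counts taken from the pooled results reported above
ae_diff = risk_difference(12, 133, 0, 871)  # D50 minus D10
print(f"{ae_diff:.1%}")  # 9.0% absolute excess of adverse events with D50
```

This is a crude unadjusted comparison across pooled studies, not a formal meta-analytic estimate.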
Introduction: For rhythm control of acute atrial flutter (AAFL) in the emergency department (ED), choices include initial drug therapy or initial electrical cardioversion (ECV). We compared the strategies of pharmacological cardioversion followed by ECV if necessary (Drug-Shock), and ECV alone (Shock Only). Methods: We conducted a randomized, blinded, placebo-controlled trial (1:1 allocation) comparing two rhythm control strategies at 11 academic EDs. We included stable adult patients with AAFL, where onset of symptoms was <48 hours. Patients underwent central web-based randomization stratified by site. The Drug-Shock group received an infusion of procainamide (15 mg/kg over 30 minutes) followed 30 minutes later, if necessary, by ECV at 200 joules x 3 shocks. The Shock Only group received an infusion of saline followed, if necessary, by ECV x 3 shocks. The primary outcome was conversion to sinus rhythm for ≥30 minutes at any time following onset of infusion. Patients were followed for 14 days. The primary outcome was evaluated on an intention-to-treat basis. Statistical significance was assessed using chi-squared tests and multivariable logistic regression. Results: We randomized 76 patients, and none was lost to follow-up. The Drug-Shock (N = 33) and Shock Only (N = 43) groups were similar for all characteristics including mean age (66.3 vs 63.4 yrs), duration of AAFL (30.1 vs 24.5 hrs), previous AAFL (72.7% vs 69.8%), median CHADS2 score (1 vs 1), and mean initial heart rate (128.9 vs 126.0 bpm). The Drug-Shock and Shock Only groups were similar for the primary outcome of conversion (100% vs 93%; absolute difference 7.0%, 95% CI −0.6 to 14.6; P = 0.25). The multivariable analyses confirmed the similarity of the two strategies (P = 0.19). In the Drug-Shock group 21.2% of patients converted with the infusion.
There were no statistically significant differences for time to conversion (84.2 vs 97.6 minutes), total ED length of stay (9.4 vs 7.5 hours), disposition home (100% vs 95.3%), and stroke within 14 days (0 vs 0). Premature discontinuation of infusion (usually for transient hypotension) was more common in the Drug-Shock group (9.1% vs 0.0%) but there were no serious adverse events. Conclusion: Both the Drug-Shock and Shock Only strategies were highly effective and safe in allowing AAFL patients to go home in sinus rhythm. IV procainamide alone was effective in only one fifth of patients, much less than for acute AF.
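The reported primary-outcome comparison can be reproduced from the counts given above (33/33 conversions with Drug-Shock vs 40/43 with Shock Only) using a Wald interval for a difference of proportions:

```python
import math

def prop_diff_ci(x1, n1, x2, n2, z=1.96):
    """Difference of two proportions (p1 - p2) with a Wald 95% CI."""
    p1, p2 = x1 / n1, x2 / n2
    diff = p1 - p2
    se = math.sqrt(p1 * (1 - p1) / n1 + p2 * (1 - p2) / n2)
    return diff, diff - z * se, diff + z * se

# Conversion counts implied by the abstract above
diff, lo, hi = prop_diff_ci(33, 33, 40, 43)
print(f"{diff:.1%} (95% CI {lo:.1%} to {hi:.1%})")  # 7.0% (95% CI -0.6% to 14.6%)
```

This matches the abstract's reported absolute difference of 7.0% (95% CI −0.6 to 14.6); note the Wald interval degenerates when a proportion is exactly 0 or 1, so exact or score intervals are often preferred in practice.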
Introduction: CAEP recently developed the acute atrial fibrillation (AF) and flutter (AFL) [AAFF] Best Practices Checklist to promote optimal care and guidance on cardioversion and rapid discharge of patients with AAFF. We sought to assess the impact of implementing the Checklist into large Canadian EDs. Methods: We conducted a pragmatic stepped-wedge cluster randomized trial in 11 large Canadian ED sites in five provinces, over 14 months. All hospitals started in the control period (usual care), and then crossed over to the intervention period in random sequence, one hospital per month. We enrolled consecutive, stable patients presenting with AAFF, where symptoms required ED management. Our intervention was informed by qualitative stakeholder interviews to identify perceived barriers and enablers for rapid discharge of AAFF patients. The many interventions included local champions, presentation of the Checklist to physicians in group sessions, an online training module, a smartphone app, and targeted audit and feedback. The primary outcome was length of stay in ED in minutes from time of arrival to time of disposition, and this was analyzed at the individual patient-level using linear mixed effects regression accounting for the stepped-wedge design. We estimated a sample size of 800 patients. Results: We enrolled 844 patients with none lost to follow-up. Those in the control (N = 316) and intervention periods (N = 528) were similar for all characteristics including mean age (61.2 vs 64.2 yrs), duration of AAFF (8.1 vs 7.7 hrs), AF (88.6% vs 82.9%), AFL (11.4% vs 17.1%), and mean initial heart rate (119.6 vs 119.9 bpm). Median lengths of stay for the control and intervention periods respectively were 413.0 vs. 354.0 minutes (P < 0.001). Comparing control to intervention, there was an increase in: use of antiarrhythmic drugs (37.4% vs 47.4%; P < 0.01), electrical cardioversion (45.1% vs 56.8%; P < 0.01), and discharge in sinus rhythm (75.3% vs. 86.7%; P < 0.001). 
There was a decrease in ED consultations to cardiology and medicine (49.7% vs 41.1%; P < 0.01), but a small, nonsignificant increase in anticoagulant prescriptions (39.6% vs 46.5%; P = 0.21). Conclusion: This multicenter implementation of the CAEP Best Practices Checklist led to a significant decrease in ED length of stay along with more ED cardioversions, fewer ED consultations, and more discharges in sinus rhythm. Widespread and rigorous adoption of the CAEP Checklist should lead to improved care of AAFF patients in all Canadian EDs.