Proglacial braided river systems discharge large volumes of meltwater from ice sheets and transport coarse-grained sediments from glaciated areas to the oceans. Here, we test the hypothesis that high-energy hydrological events can leave distinctive signatures in the sedimentary record of braided river systems. We characterize the morphology and infer the mode of formation of a 25 km long and 1–3 km wide Early Pleistocene incised valley recently imaged in 3-D seismic data in the Hoop area, SW Barents Sea. The fluvial system, named Bjørnelva River Valley, carved 20 m deep channels into Lower Cretaceous bedrock at a glacial paleo-surface and deposited 28 channel bars along a paleo-slope gradient of ~0.64 m km⁻¹. The landform morphologies and their position relative to the paleo-surface support the interpretation that Bjørnelva River Valley formed in the proglacial domain of the Barents Sea Ice Sheet. Based on valley width and depth, we suggest that Bjørnelva River Valley represents a braided river system fed by violent outburst floods from a glacial lake, with estimated outburst discharges of ~160,000 m³ s⁻¹. The morphological configuration of Bjørnelva River Valley can inform geohazard assessments in areas at risk of outburst flooding today and is an analogue for landscapes evolving in areas currently covered by the Greenland and Antarctic ice sheets.
The introduced meadow knapweed (Centaurea × moncktonii C.E. Britton), a hybrid of black (Centaurea nigra L.) and brown (Centaurea jacea L.) knapweeds, is increasingly common in pastures, meadows, and waste areas across many U.S. states, including New York. We evaluated the effects of temperature, light, seed stratification, scarification, and population on percent germination in four experiments over 2 yr. Percent germination ranged from 3% to 100% across treatment combinations. Higher temperatures (30:20, 25:15, and sometimes 20:10 C day:night regimes compared with 15:5 C) promoted germination, especially when combined with the stimulatory effect of light (14:10 h L:D compared with continuous darkness). Under the three lowest temperature treatments, light increased percent germination by 15% to 86%. Cold-wet seed stratification also increased germination rates, especially at lower germination temperatures, but was not a prerequisite for germination. Scarification did not increase percent germination. Differences between C. × moncktonii populations were generally less significant than differences between temperature, light, and stratification treatments. Taken together, these results indicate that C. × moncktonii is capable of germinating under a broad range of environments, which may have facilitated this species’ range expansion in recent decades. However, C. × moncktonii also shows evidence of germination polymorphism: some seeds will germinate under suboptimal conditions, while others may remain dormant until the abiotic environment improves. Subtle differences in dormancy mechanisms and their relative frequencies may affect phenological traits like the timing of seedling emergence and ultimately shape the sizes and ranges of C. × moncktonii populations.
Through diversity of composition, sequence, and interfacial structure, hybrid materials greatly expand the palette of materials available to access novel functionality. The NSF Division of Materials Research recently supported a workshop (October 17–18, 2019) aiming to (1) identify fundamental questions and potential solutions common to multiple disciplines within the hybrid materials community; (2) initiate interfield collaborations between hybrid materials researchers; and (3) raise awareness in the wider community about experimental toolsets, simulation capabilities, and shared facilities that can accelerate this research. This article reports on the outcomes of the workshop as a basis for cross-community discussion. The interdisciplinary challenges and opportunities are presented, followed by a discussion of current areas of progress in subdisciplines including hybrid synthesis, functional surfaces, and functional interfaces.
Introduction: Cases of anaphylaxis in children are often not appropriately managed by caregivers. We aimed to develop and test the effectiveness of an education tool to help pediatric patients and their families better understand anaphylaxis and its management, and to improve adherence to current knowledge and treatment guidelines. Methods: GEAR (Guidelines and Educational programs based on an Anaphylaxis Registry) is an initiative that recruits children with food-induced anaphylaxis who have visited the ED at the Montreal Children's Hospital and The Children's Clinic in Montreal, Quebec. Patients and parents together were asked to complete six questions related to the triggers, recognition and management of anaphylaxis at the time of presentation to the allergy clinic. Participants were then automatically shown a 5-minute animated video addressing the main knowledge gaps related to the causes and management of anaphylaxis. At the end of the video, participants were redirected to the same six questions to answer again. To test long-term knowledge retention, the questionnaire will be presented again in one year's time. A paired t-test was used to compare the baseline and follow-up scores, based on the percentage of correct answers on the questionnaire. Results: From June to November 2019, 95 pediatric patients with diagnosed food-induced anaphylaxis were recruited. The median patient age was 4.5 years (interquartile range [IQR]: 1.6–7.4) and half were male (51.6%). The mean baseline questionnaire score was 0.77 (77.0%, standard deviation [SD]: 0.16) and the mean follow-up score was 0.83 (83.0%, SD: 0.17). There was a significant difference between the follow-up and baseline scores (difference: 0.06, 95% CI: 0.04, 0.09). Neither baseline scores nor change in scores was associated with age or sex.
Conclusion: Our video teaching method was successful in educating patients and their families to better understand anaphylaxis. The next step is to acquire long-term follow-up scores to determine retention of knowledge.
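The paired t-test described above compares each participant's follow-up score with their own baseline score. The following is a minimal sketch of that calculation; the individual-level data are not given in the abstract, so the ten pairs below are invented and merely match the reported group means of 0.77 (baseline) and 0.83 (follow-up).

```python
import math
import statistics

# Hypothetical per-participant scores (fraction of the six questions
# answered correctly). Invented values; only the group means match
# the abstract.
baseline = [0.67, 0.83, 0.50, 0.83, 1.00, 0.67, 0.83, 0.67, 0.83, 0.87]
followup = [0.83, 0.83, 0.67, 1.00, 1.00, 0.83, 0.83, 0.83, 0.67, 0.81]

# Paired t-test: does the mean within-participant change differ from 0?
diffs = [f - b for b, f in zip(baseline, followup)]
mean_d = statistics.mean(diffs)
sd_d = statistics.stdev(diffs)           # sample SD of the differences
n = len(diffs)
t_stat = mean_d / (sd_d / math.sqrt(n))  # t with n - 1 degrees of freedom

print(f"mean change = {mean_d:.3f}, t({n - 1}) = {t_stat:.2f}")
```

The study applied this test to its 95 paired questionnaire scores; with real data the resulting t statistic is compared against the t distribution with n − 1 degrees of freedom to obtain the p-value.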
Introduction: CAEP recently developed the acute atrial fibrillation (AF) and flutter (AFL) [AAFF] Best Practices Checklist to promote optimal care and guidance on cardioversion and rapid discharge of patients with AAFF. We sought to assess the impact of implementing the Checklist in large Canadian EDs. Methods: We conducted a pragmatic stepped-wedge cluster randomized trial in 11 large Canadian ED sites in five provinces, over 14 months. All hospitals started in the control period (usual care) and then crossed over to the intervention period in random sequence, one hospital per month. We enrolled consecutive, stable patients presenting with AAFF whose symptoms required ED management. Our intervention was informed by qualitative stakeholder interviews that identified perceived barriers and enablers to rapid discharge of AAFF patients. Intervention components included local champions, presentation of the Checklist to physicians in group sessions, an online training module, a smartphone app, and targeted audit and feedback. The primary outcome was ED length of stay in minutes from time of arrival to time of disposition, analyzed at the individual patient level using linear mixed effects regression accounting for the stepped-wedge design. We estimated a sample size of 800 patients. Results: We enrolled 844 patients with none lost to follow-up. Those in the control (N = 316) and intervention periods (N = 528) were similar for all characteristics including mean age (61.2 vs 64.2 yrs), duration of AAFF (8.1 vs 7.7 hrs), AF (88.6% vs 82.9%), AFL (11.4% vs 17.1%), and mean initial heart rate (119.6 vs 119.9 bpm). Median lengths of stay for the control and intervention periods were 413.0 vs 354.0 minutes, respectively (P < 0.001). Comparing control to intervention, there was an increase in the use of antiarrhythmic drugs (37.4% vs 47.4%; P < 0.01), electrical cardioversion (45.1% vs 56.8%; P < 0.01), and discharge in sinus rhythm (75.3% vs 86.7%; P < 0.001).
There was a decrease in ED consultations to cardiology and medicine (49.7% vs 41.1%; P < 0.01) and a nonsignificant increase in anticoagulant prescriptions (39.6% vs 46.5%; P = 0.21). Conclusion: This multicenter implementation of the CAEP Best Practices Checklist led to a significant decrease in ED length of stay, along with more ED cardioversions, fewer ED consultations, and more discharges in sinus rhythm. Widespread and rigorous adoption of the CAEP Checklist should lead to improved care of AAFF patients in all Canadian EDs.
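The stepped-wedge design described above can be made concrete as a site-by-month schedule: every hospital contributes control time before its randomized crossover month and intervention time afterwards. A minimal sketch, with invented site labels and an arbitrary random seed (the trial's actual randomization sequence is not given in the abstract):

```python
import random

# 11 hospitals all begin in control and cross over to the intervention
# one per month, in random order, over a 14-month horizon (per the
# abstract). Site names and seed are illustrative.
random.seed(42)
hospitals = [f"site_{i}" for i in range(1, 12)]   # 11 EDs
crossover_order = hospitals[:]
random.shuffle(crossover_order)                   # randomized sequence

months = 14
# schedule[site][m] is 0 (control) or 1 (intervention) in month m
schedule = {
    site: [1 if m >= crossover_order.index(site) + 1 else 0
           for m in range(months)]
    for site in hospitals
}

# Every site contributes both control and intervention time.
for site in hospitals:
    assert schedule[site][0] == 0 and schedule[site][-1] == 1
```

The primary analysis then regresses each patient's ED length of stay on the intervention indicator from such a schedule, with adjustment for calendar time and a random effect for site, as in the linear mixed effects model the authors describe.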
Introduction: There are few large-scale studies assessing the true risk of epinephrine use during anaphylaxis in adults. We aimed to assess the demographics, clinical characteristics, and secondary effects of epinephrine treatment, and to determine factors associated with major and minor secondary effects of epinephrine use among adults with anaphylaxis. Methods: From May 2012 to February 2018, adults presenting to the Hôpital du Sacré-Coeur de Montréal (HSCM) emergency department (ED) with anaphylaxis were recruited prospectively as part of the Cross-Canada Anaphylaxis Registry (C-CARE). Missed cases were identified through a previously validated algorithm. Data were collected on demographics, clinical characteristics, and management of anaphylaxis using a structured chart review. Multivariate logistic regression models were used to estimate factors associated with secondary effects of epinephrine administration. Results: Over a 6-year period, 402 adult patients presented to the ED at HSCM with anaphylaxis. The median age was 38 years (interquartile range [IQR]: 27, 52) and 40.4% were male. The main trigger for anaphylaxis was food (53.0%). A total of 286 patients (71.1%) received epinephrine treatment, of whom 23.9% were treated in the pre-hospital setting, 47.0% received treatment in the ED, and 5.0% received epinephrine in both settings. Among patients treated with epinephrine, major secondary effects were rare (1.4% of patients), including new electrocardiogram changes, arrhythmia, and neurological symptoms. Minor secondary effects of epinephrine were reported in 50.0% of patients, mainly inappropriate sinus tachycardia (defined as a rate over 100 beats/minute; 30.1%). Major cardiovascular secondary effects were associated with regular use of beta-blockers (aOR 1.10 [95% CI, 1.02, 1.18]), regular use of ACE inhibitors (aOR 1.16 [95% CI, 1.07, 1.27]), and receiving more than two doses of epinephrine (aOR 1.09 [95% CI, 1.00, 1.18]).
The model was adjusted for age, history of ischemic heart disease, trigger of anaphylaxis, presence of asthma, sex, and reaction severity. Inappropriate sinus tachycardia was more likely in females (aOR 1.18 [95% CI, 1.04, 1.33]); palpitations, tremors, and psychomotor agitation were more likely in females (aOR 1.09 [95% CI, 1.00, 1.19]) and among those receiving more than two doses of epinephrine (aOR 1.49 [95% CI, 1.14, 1.96]). The models were adjusted for age, regular use of medications, history of ischemic heart disease, triggers of anaphylaxis, presence of asthma, reaction severity, and IV administration of epinephrine. Conclusion: The low rate of major secondary effects of epinephrine in the treatment of anaphylaxis in our study demonstrates the overall safety of epinephrine use.
Background: Traditionally, radiologists have routinely recommended oral contrast agents (such as Telebrix®) for patients undergoing computed tomography of the abdomen/pelvis (CTAP), but recent evidence has shown limited diagnostic benefit for most emergency department (ED) patients. Additionally, the use of oral contrast has numerous drawbacks, including patient nausea/vomiting, risk of aspiration, delays to CTAP completion and increased ED length of stay (LOS). Aim Statement: The aim was to safely reduce the number of ED patients receiving oral contrast prior to undergoing CTAP and thereby reduce ED LOS. Measures & Design: An evidence-based ED protocol was developed in collaboration with radiology. PDSA cycle #1 was implementation at a pilot site to identify potential barriers. Challenges identified included the need to change the electronic order sets to reflect the new protocol, to improve communication with frontline providers, and to add an online BMI calculator. PDSA cycle #2 was widespread implementation across all four EDs in the Calgary zone. The protocol was incorporated into all relevant electronic ED order sets to act as a physician prompt. Using administrative data, we extracted and analyzed data using descriptive and inferential statistics for the outcomes and balancing measures over 12 months pre- and 12 months post-intervention. Evaluation/Results: A total of 14,868 and 17,995 CTAP exams were included in the pre and post periods, respectively. Oral contrast use fell from 71% in the pre-study period to 30% in the post-study period (P < 0.0001). This corresponded to a reduction in average time from CT requisition to completed CT report from 3.30 hours to 2.31 hours (−0.99 hrs, P = 0.001) and a reduction in average ED LOS from 11.01 hours to 9.92 hours (−1.08 hrs, P < 0.0001). The protocol resulted in a reduction of 19,434.6 patient hours in the ED. Run charts demonstrate the change was sustained over time.
Our protocol did not demonstrate an increase in rates of repeat CTAP (P = 0.563) at 30 days, nor an increase in patient re-admission within 7 days (P = 0.295). Discussion/Impact: Successful implementation of an ED and radiology developed protocol significantly reduced the use of oral contrast in patients requiring enhanced CTAP as part of their diagnostic work up and, thereby, reduced overall ED LOS without increasing the need for repeat examinations within 30 days or re-admission within 7 days.
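The headline reduction in oral-contrast use (71% to 30%, P < 0.0001) can be checked with a standard two-proportion z-test. A sketch using counts reconstructed from the reported percentages and denominators; the exact counts are not given in the abstract, so these are approximations.

```python
import math

# Approximate counts of CTAPs with oral contrast, reconstructed from
# the reported percentages (71% of 14,868 pre vs 30% of 17,995 post).
n_pre, n_post = 14868, 17995
x_pre = round(0.71 * n_pre)
x_post = round(0.30 * n_post)

# Two-proportion z-test with a pooled variance estimate.
p_pre, p_post = x_pre / n_pre, x_post / n_post
p_pool = (x_pre + x_post) / (n_pre + n_post)
se = math.sqrt(p_pool * (1 - p_pool) * (1 / n_pre + 1 / n_post))
z = (p_pre - p_post) / se

print(f"pre {p_pre:.1%} vs post {p_post:.1%}, z = {z:.1f}")
```

With samples this large, even a much smaller shift would clear conventional significance thresholds, which is consistent with the reported P < 0.0001.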
Little is known about the experiences of people living alone with dementia in the community and their non-resident relatives and friends who support them. In this paper, we explore their respective attitudes and approaches to the future, particularly regarding the future care and living arrangements of those living with dementia. The study is based on a qualitative secondary analysis of interviews with 24 people living alone with early-stage dementia in North Wales, United Kingdom, and one of their relatives or friends who supported them. All but four of the dyads were interviewed twice over 12 months (a total of 88 interviews). In the analysis, it was observed that several people with dementia expressed the desire to continue living at home for ‘as long as possible’. A framework approach was used to investigate this theme in more depth, drawing on concepts from the existing studies of people living with dementia and across disciplines. Similarities and differences in the future outlook and temporal orientation of the participants were identified. The results support previous research suggesting that the future outlook of people living with early-stage dementia can be interpreted in part as a response to their situation and a way of coping with the threats that it is perceived to present, and not just an impaired view of time. Priorities for future research are highlighted in the discussion.
Electroconvulsive therapy (ECT) is recommended in treatment guidelines as an efficacious therapy for treatment-resistant depression. However, it has been associated with loss of autobiographical memory and short-term reduction in new learning.
To provide clinically useful guidelines to aid clinicians in informing patients regarding the cognitive side-effects of ECT and in monitoring these during a course of ECT, using complex data.
A committee of clinical and academic experts from Australia and New Zealand met to discuss the key issues pertaining to ECT and cognitive side-effects. Evidence regarding cognitive side-effects was reviewed, as was the limited evidence regarding how to monitor them. Both issues were supplemented by the clinical experience of the authors.
Meta-analyses suggest that new learning is impaired immediately following ECT but that group mean scores return at least to baseline by 14 days after ECT. Other cognitive functions are generally unaffected. However, a group mean score that has not fallen below baseline does not rule out impairment, particularly of new learning, in individual patients, especially those at greater risk; monitoring therefore remains important. Evidence suggests that ECT does cause deficits in autobiographical memory. The evidence for schedules of testing to monitor cognitive side-effects is currently limited. We therefore make practical recommendations based on clinical experience.
Despite modern ECT techniques, cognitive side-effects remain an important issue, although their nature and degree remains to be clarified fully. In these circumstances it is useful for clinicians to have guidance regarding what to tell patients and how to monitor these side-effects clinically.
Brain-derived neurotrophic factor (BDNF) gene variants may potentially influence behaviour. In order to test this hypothesis, we investigated the relationship between BDNF Val66Met polymorphism and aggressive behaviour in a population of schizophrenic patients. Our results showed that increased number of BDNF Met alleles was associated with increased aggressive behaviour.
Individuals with schizophrenia who participated in a psychosocial and educative rehabilitation programme showed a 46% improvement in quality of life in the absence of any significant change in symptom severity. In contrast, there was no significant change in quality of life for individuals who continued with supportive rehabilitation. Our preliminary findings highlight the ‘quality of life’ benefits of psychosocial and educative rehabilitation for individuals with schizophrenia who are clinically stable and living in the community.
Major depression is a significant problem for people with a traumatic brain injury (TBI), and its treatment remains difficult. A promising approach to treating depression is mindfulness-based cognitive therapy (MBCT), a relatively new therapeutic approach rooted in mindfulness-based stress reduction (MBSR) and cognitive behavioral therapy (CBT). We conducted this study to examine the effectiveness of MBCT in reducing depression symptoms among people who have a TBI.
Twenty individuals diagnosed with major depression were recruited from a rehabilitation clinic and completed the 8-week MBCT intervention. Instruments used to measure depression symptoms included: BDI-II, PHQ-9, HADS, SF-36 (Mental Health subscale), and SCL-90 (Depression subscale). They were completed at baseline and post-intervention.
All instruments indicated a statistically significant reduction in depression symptoms post-intervention (p < .05). For example, the total mean (SD) score on the BDI-II decreased from 25.2 (9.8) at baseline to 18.2 (11.7) post-intervention (p = .001). Using a PHQ-9 threshold of 10, the proportion of participants meeting criteria for major depression was reduced by 59% at follow-up (p = .012).
Most participants reported reductions in depression symptoms after the intervention such that many would not meet the criteria for a diagnosis of major depression. This intervention may provide an opportunity to address a debilitating aspect of TBI and could be implemented concurrently with more traditional forms of treatment, possibly enhancing their success. The next step will involve the execution of multi-site, randomized controlled trials to fully demonstrate the value of the intervention.
Dystrobrevin binding protein 1 (DTNBP1) is a schizophrenia susceptibility gene involved in the regulation of neurotransmission (especially dopamine and glutamate) and in neurodevelopment. The gene is known to be associated with cognitive deficit phenotypes within schizophrenia. In our previous studies, DTNBP1 was found to be associated not only with schizophrenia but also with other psychiatric disorders including psychotic depression, post-traumatic stress disorder, nicotine dependence and opiate dependence. These findings suggest that DTNBP1 may be involved in pathways that lead to multiple psychiatric phenotypes. In this study, we explored the association between DTNBP1 single nucleotide polymorphisms (SNPs) and multiple psychiatric phenotypes included in the Diagnostic Interview for Psychosis (DIP).
Five DTNBP1 SNPs, rs17470454, rs1997679, rs4236167, rs9370822 and rs9370823, were genotyped in 235 schizophrenia subjects screened for various phenotypes in the domains of depression, mania, hallucinations, delusions, subjective thought disorder, behaviour and affect, and speech disorder. SNP-phenotype association was determined with ANOVA under general, dominant/recessive and over-dominance models.
Post hoc tests determined that SNP rs1997679 was associated with visual hallucination; SNP rs4236167 was associated with general auditory hallucination as well as specific features including non-verbal, abusive and third-person form auditory hallucinations; and SNP rs9370822 was associated with visual and olfactory hallucinations. SNPs that survived correction for multiple testing were rs4236167 for third-person and abusive form auditory hallucinations; and rs9370822 for olfactory hallucinations.
These data suggest that DTNBP1 is likely to play a role in the development of auditory, visual and olfactory hallucinations, which is consistent with evidence of DTNBP1 activity in auditory processing regions, in visual processing and in the regulation of glutamate and dopamine activity.
Training in Psychiatry is especially debated, since biological, psychological and social perspectives all need to be integrated in the education of Early-Career Psychiatrists (ECPs).
To describe the opinion of ECP about the training received and to evaluate their self-confidence in therapeutic interventions.
A training event for ECPs from all over Italy takes place yearly in Rome. A 30-item ad hoc questionnaire with both yes/no and rating-scale items was administered to all participants in the event.
Over the past three years, 224 questionnaires were collected from 216 final-year trainees and 8 recently qualified psychiatrists (68.5% women; mean age 30.5 ± 3.5 years). Only 13% of participants were globally satisfied with their training program in psychiatry; most were only partially or slightly satisfied (51.4% and 32.0%, respectively). The most critical training areas were Forensic Psychiatry and Psychotherapy, followed by Psychiatric Rehabilitation. Conversely, Clinical Psychiatry and Psychopharmacology were the most satisfying areas of training. Likewise, ECPs felt most confident in Clinical Psychiatry (87.9%) and Psychopharmacology (48.7%), whereas the least comfortable areas were Forensic Psychiatry (62.5%), Child and Adolescent Psychiatry (37.2%), and Dual Diagnosis/Substance Abuse-Related Disorders (33.9%).
Forty-five percent of ECPs identified Psychotherapy as a critical issue. Although 46.4% of participants received supervision within the training program (less than two hours per week), 87.4% sought help from external psychotherapeutic training programs.
To achieve a satisfactory educational standard and adequate self-confidence, network programs (within Italy and/or Europe) might be helpful.
We conduct the first broad-based international study on bank-level failures covering 92 countries over 2000–2014, investigating national cultural variables as failure determinants. We find individualism and masculinity are positively associated with bank failure, but they operate through different channels. Managers in individualist countries assume more portfolio risk, while governments in masculine countries allow banks to operate with less liquidity and less often bail out troubled institutions. Findings are robust to accounting for endogeneity, different techniques and measures, and additional controls. Results have implications for prudential policies, including regulation, supervision, and bailout strategies, that may partially mitigate some negative effects of culture.
Cardiovascular risk prediction tools are important for cardiovascular disease (CVD) prevention; however, it is unclear which algorithms are appropriate for people with severe mental illness (SMI).
To determine, using the net monetary benefit (NMB) approach, the cost-effectiveness of two bespoke SMI-specific risk algorithms compared with standard risk algorithms for primary CVD prevention in people with SMI, from an NHS perspective.
A microsimulation model was populated with 1000 individuals with SMI from The Health Improvement Network Database, aged 30–74 years and without CVD. Four cardiovascular risk algorithms were assessed: (1) general population lipid, (2) general population BMI, (3) SMI-specific lipid and (4) SMI-specific BMI, each compared against no algorithm. At baseline, each cardiovascular risk algorithm was applied and those at high risk (>10%) were assumed to be prescribed statin therapy; others received usual care. Individuals entered the model in a ‘healthy’, CVD-free health state and in each year could retain their current health state, have cardiovascular events (non-fatal/fatal) or die from other causes according to transition probabilities.
The SMI-specific BMI and general population lipid algorithms had the highest NMB of the four algorithms resulting in 12 additional QALYs and a cost saving of approximately £37,000 (US$ 58,000) per 1000 patients with SMI over 10 years.
The general population lipid and SMI-specific BMI algorithms performed equally well. The ease and acceptability of use of a SMI-specific BMI algorithm (blood tests not required) makes it an attractive algorithm to implement in clinical settings.
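The net monetary benefit figures above follow from the standard formula NMB = (incremental QALYs × willingness-to-pay threshold) − incremental cost. A sketch using the reported per-1000-patient values; the £20,000-per-QALY threshold is an assumed (commonly cited NICE reference) value, not stated in the abstract.

```python
# Net monetary benefit from the reported per-1000-patient figures.
WTP_PER_QALY = 20_000           # pounds per QALY gained (assumed threshold)

incremental_qalys = 12          # additional QALYs vs no algorithm
incremental_cost = -37_000      # pounds; negative = cost saving

# NMB = (QALYs gained x threshold) - incremental cost.
# A cost saving therefore adds to the benefit rather than subtracting.
nmb = incremental_qalys * WTP_PER_QALY - incremental_cost
print(f"NMB per 1000 patients over 10 years: £{nmb:,}")
```

Because the top-ranked algorithms both gain QALYs and save money, the NMB is positive at any plausible threshold, which is why they dominate in the comparison.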
Disclosure of interest
The authors have not supplied their declaration of competing interest.