Numerous developmental scholars have been influenced by the research, policies, and thinking of the late Edward Zigler, who was instrumental in founding Head Start and Early Head Start. In line with the research and advocacy work of Zigler, we discuss two models that support the development of the whole child. We begin by reviewing how adverse and protective experiences “get under the skin” and affect developmental trajectories and risk and resilience processes. We then present research and examples of how experiences affect the whole child, the heart and the head (social, emotional, cognitive, and physical development), and consider development within context and across domains. We discuss examples of interventions that strengthen nurturing relationships as the mechanism of change. We offer a public health perspective on promoting optimal development through nurturing relationships and access to resources during early childhood. We end with a discussion of the myth that our current society is child-focused and argue for radical, essential change to make promoting optimal development for all children the cornerstone of our society.
Delineating the proximal urethra can be critical for radiotherapy planning but is challenging on computerised tomography (CT) imaging.
Materials and methods:
We trialled a novel non-invasive technique for visualising the proximal urethra: a rapid-sequence magnetic resonance imaging (MRI) protocol capturing urinary flow in patients who voided during the simulation scan.
Results:
Of the seven patients enrolled, four were able to void during the MRI scan. For these four patients, direct visualisation of urinary flow through the proximal urethra was achieved. The average volume of the proximal urethra contoured on voiding MRI was significantly higher than the proximal urethra contoured on CT, 4·07 and 1·60 cc, respectively (p = 0·02). The proximal urethra location also differed; the Dice coefficient average was 0·28 (range 0–0·62).
Conclusions:
In this small, proof-of-concept prospective clinical trial, the volume and location of the proximal urethra differed significantly when contoured on a voiding MRI scan compared to that determined by a conventional CT simulation. The shape of the proximal urethra on voiding MRI may be more anatomically correct compared to the proximal urethra shape determined with a semi-rigid catheter in place.
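The Dice coefficient quoted above measures the spatial overlap of two contours (2·|A∩B| / (|A| + |B|); 0 means no overlap, 1 means identical). A minimal sketch on toy binary masks, where the arrays are illustrative placeholders rather than the trial's contours:

```python
import numpy as np

def dice_coefficient(mask_a, mask_b):
    """Dice similarity coefficient of two binary masks: 2|A intersect B| / (|A| + |B|)."""
    a = np.asarray(mask_a, dtype=bool)
    b = np.asarray(mask_b, dtype=bool)
    denom = a.sum() + b.sum()
    if denom == 0:
        return 1.0  # both masks empty: treat as perfect agreement
    return 2.0 * np.logical_and(a, b).sum() / denom

# Toy 4x4 voxel masks standing in for CT- and MRI-based contours
ct = np.zeros((4, 4), dtype=bool)
ct[1:3, 1:3] = True   # 4 voxels
mri = np.zeros((4, 4), dtype=bool)
mri[1:4, 1:4] = True  # 9 voxels, partially overlapping the CT mask

print(round(dice_coefficient(ct, mri), 3))  # 2*4 / (4 + 9) -> 0.615
```

A value near the study's mean of 0·28 would indicate that barely a quarter of the combined contoured volume is shared between the two contours.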
The Real Time Mesoscale Analysis (RTMA), a two-dimensional variational analysis algorithm, is used to provide hourly analyses of surface sensible weather elements for situational awareness at spatial resolutions of 3 km over Alaska. In this work we focus on the analysis of horizontal visibility in Alaska, which is a region prone to weather related aviation accidents that are in part due to a relatively sparse observation network. In this study we evaluate the impact of assimilating estimates of horizontal visibility derived from a novel network of web cameras in Alaska with the RTMA. Results suggest that the web camera-derived estimates of visibility can capture low visibility conditions and have the potential to improve the RTMA visibility analysis under conditions of low instrument flight rules and instrument flight rules.
Proglacial braided river systems discharge large volumes of meltwater from ice sheets and transport coarse-grained sediments from the glaciated areas to the oceans. Here, we test the hypothesis that high-energy hydrological events can leave distinctive signatures in the sedimentary record of braided river systems. We characterize the morphology and infer a mode of formation of a 25 km long and 1–3 km wide Early Pleistocene incised valley recently imaged in 3-D seismic data in the Hoop area, SW Barents Sea. The fluvial system, named Bjørnelva River Valley, carved 20 m deep channels into Lower Cretaceous bedrock at a glacial paleo-surface and deposited 28 channel bars along a paleo-slope gradient of ~0.64 m km−1. The landform morphologies and their position relative to the paleo-surface indicate that Bjørnelva River Valley was formed in the proglacial domain of the Barents Sea Ice Sheet. Based on valley width and valley depth, we suggest that Bjørnelva River Valley represents a braided river system fed by violent outburst floods from a glacial lake, with estimated outburst discharges of ~160 000 m3 s−1. The morphological configuration of Bjørnelva River Valley can inform geohazard assessments in areas at risk of outburst flooding today and is an analogue for landscapes evolving in areas currently covered by the Greenland and Antarctic ice sheets.
The introduced meadow knapweed (Centaurea × moncktonii C.E. Britton), a hybrid of black (Centaurea nigra L.) and brown (Centaurea jacea L.) knapweeds, is increasingly common in pastures, meadows, and waste areas across many U.S. states, including New York. We evaluated the effects of temperature, light, seed stratification, scarification, and population on percent germination in four experiments over 2 yr. Percent germination ranged from 3% to 100% across treatment combinations. Higher temperatures (30:20, 25:15, and sometimes 20:10 C day:night regimes compared with 15:5 C) promoted germination, especially when combined with the stimulatory effect of light (14:10 h L:D compared with continuous darkness). Under the three lowest temperature treatments, light increased percent germination by 15% to 86%. Cold-wet seed stratification also increased germination rates, especially at lower germination temperatures, but was not a prerequisite for germination. Scarification did not increase percent germination. Differences between C. × moncktonii populations were generally less significant than differences between temperature, light, and stratification treatments. Taken together, these results indicate that C. × moncktonii is capable of germinating under a broad range of environments, which may have facilitated this species’ range expansion in recent decades. However, C. × moncktonii also shows evidence of germination polymorphism: some seeds will germinate under suboptimal conditions, while others may remain dormant until the abiotic environment improves. Subtle differences in dormancy mechanisms and their relative frequencies may affect phenological traits like the timing of seedling emergence and ultimately shape the sizes and ranges of C. × moncktonii populations.
Through diversity of composition, sequence, and interfacial structure, hybrid materials greatly expand the palette of materials available to access novel functionality. The NSF Division of Materials Research recently supported a workshop (October 17–18, 2019) aiming to (1) identify fundamental questions and potential solutions common to multiple disciplines within the hybrid materials community; (2) initiate interfield collaborations between hybrid materials researchers; and (3) raise awareness in the wider community about experimental toolsets, simulation capabilities, and shared facilities that can accelerate this research. This article reports on the outcomes of the workshop as a basis for cross-community discussion. The interdisciplinary challenges and opportunities are presented, followed by a discussion of current areas of progress in subdisciplines including hybrid synthesis, functional surfaces, and functional interfaces.
Introduction: Cases of anaphylaxis in children are often not appropriately managed by caregivers. We aimed to develop and test the effectiveness of an education tool to help pediatric patients and their families better understand anaphylaxis and its management, and to improve knowledge of, and adherence to, current treatment guidelines. Methods: The GEAR (Guidelines and Educational programs based on an Anaphylaxis Registry) is an initiative that recruits children with food-induced anaphylaxis who have visited the ED at the Montreal Children's Hospital and at The Children's Clinic located in Montreal, Quebec. The patients and parents, together, were asked to complete six questions related to the triggers, recognition, and management of anaphylaxis at the time of presentation to the allergy clinic. Participants were automatically shown a 5-minute animated video addressing the main knowledge gaps related to the causes and management of anaphylaxis. At the end of the video, participants were redirected to the same six questions and asked to respond again. To test long-term knowledge retention, the questionnaire will be presented again in one year's time. A paired t-test was used to compare baseline and follow-up scores, calculated as the percentage of correct questionnaire answers. Results: From June to November 2019, 95 pediatric patients with diagnosed food-induced anaphylaxis were recruited. The median patient age was 4.5 years (Interquartile Range (IQR): 1.6–7.4) and half were male (51.6%). The mean questionnaire baseline score was 0.77 (77.0%, standard deviation (sd): 0.16) and the mean questionnaire follow-up score was 0.83 (83.0%, sd: 0.17). There was a significant difference between the follow-up score and baseline score (difference: 0.06, 95% CI: 0.04, 0.09). There were no associations of baseline questionnaire scores and change in scores with age and sex.
Conclusion: Our video teaching method was successful in educating patients and their families to better understand anaphylaxis. The next step is to acquire long-term follow-up scores to determine retention of knowledge.
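The paired t-test used above compares each participant's follow-up score with that same participant's baseline. A sketch with simulated placeholder data (n = 95 and a baseline mean near 0.77, as in the abstract; the actual GEAR scores are not reproduced here), assuming SciPy is available:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)

# Simulated per-participant scores (fraction of correct answers), NOT registry data
baseline = rng.normal(0.77, 0.16, size=95).clip(0, 1)
followup = (baseline + rng.normal(0.06, 0.05, size=95)).clip(0, 1)

# Paired t-test: tests whether the mean within-participant change differs from zero
t_stat, p_value = stats.ttest_rel(followup, baseline)
mean_change = float(np.mean(followup - baseline))
print(f"mean change = {mean_change:.3f}, t = {t_stat:.2f}, p = {p_value:.3g}")
```

The pairing matters: because each person serves as their own control, between-participant variability is removed and the test is more sensitive than an unpaired comparison of the two group means.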
Introduction: CAEP recently developed the acute atrial fibrillation (AF) and flutter (AFL) [AAFF] Best Practices Checklist to promote optimal care and guidance on cardioversion and rapid discharge of patients with AAFF. We sought to assess the impact of implementing the Checklist into large Canadian EDs. Methods: We conducted a pragmatic stepped-wedge cluster randomized trial in 11 large Canadian ED sites in five provinces, over 14 months. All hospitals started in the control period (usual care), and then crossed over to the intervention period in random sequence, one hospital per month. We enrolled consecutive, stable patients presenting with AAFF, where symptoms required ED management. Our intervention was informed by qualitative stakeholder interviews to identify perceived barriers and enablers for rapid discharge of AAFF patients. The multifaceted intervention included local champions, presentation of the Checklist to physicians in group sessions, an online training module, a smartphone app, and targeted audit and feedback. The primary outcome was length of stay in ED in minutes from time of arrival to time of disposition, and this was analyzed at the individual patient-level using linear mixed effects regression accounting for the stepped-wedge design. We estimated a sample size of 800 patients. Results: We enrolled 844 patients with none lost to follow-up. Those in the control (N = 316) and intervention periods (N = 528) were similar for all characteristics including mean age (61.2 vs 64.2 yrs), duration of AAFF (8.1 vs 7.7 hrs), AF (88.6% vs 82.9%), AFL (11.4% vs 17.1%), and mean initial heart rate (119.6 vs 119.9 bpm). Median lengths of stay for the control and intervention periods respectively were 413.0 vs. 354.0 minutes (P < 0.001). Comparing control to intervention, there was an increase in: use of antiarrhythmic drugs (37.4% vs 47.4%; P < 0.01), electrical cardioversion (45.1% vs 56.8%; P < 0.01), and discharge in sinus rhythm (75.3% vs. 86.7%; P < 0.001).
There was a decrease in ED consultations to cardiology and medicine (49.7% vs 41.1%; P < 0.01), and a small, non-significant increase in anticoagulant prescriptions (39.6% vs 46.5%; P = 0.21). Conclusion: This multicenter implementation of the CAEP Best Practices Checklist led to a significant decrease in ED length of stay along with more ED cardioversions, fewer ED consultations, and more discharges in sinus rhythm. Widespread and rigorous adoption of the CAEP Checklist should lead to improved care of AAFF patients in all Canadian EDs.
Introduction: There are few large-scale studies assessing the true risk of epinephrine use during anaphylaxis in adults. We aimed to assess the demographics, clinical characteristics, and secondary effects of epinephrine treatment and to determine factors associated with major and minor secondary effects of epinephrine use among adults with anaphylaxis. Methods: From May 2012 to February 2018, adults presenting to the Hôpital du Sacré-Coeur de Montréal (HSCM) emergency department (ED) with anaphylaxis were recruited prospectively as part of the Cross-Canada Anaphylaxis Registry (C-CARE). Missed cases were identified through a previously validated algorithm. Data were collected on demographics, clinical characteristics, and management of anaphylaxis using a structured chart review. Multivariate logistic regression models were used to identify factors associated with side effects of epinephrine administration. Results: Over a 6-year period, 402 adult patients presented to the ED at HSCM with anaphylaxis. The median age was 38 years (Interquartile Range [IQR]: 27, 52) and 40.4% were male. The main trigger for anaphylaxis was food (53.0%). A total of 286 patients (71.1%) received epinephrine treatment, of whom 23.9% were treated in the pre-hospital setting, 47.0% received treatment in the ED, and 5.0% received epinephrine in both settings. Among patients treated with epinephrine, major secondary effects were rare (1.4% of patients), including new changes to electrocardiogram, arrhythmia, and neurological symptoms. Minor secondary effects due to epinephrine were reported in 50.0% of patients, mainly inappropriate sinus tachycardia (defined as a rate over 100 beats/minute in 30.1%). Major cardiovascular secondary effects were associated with regular use of beta-blockers (aOR 1.10 [95%CI, 1.02, 1.18]), regular use of ACE-inhibitors (aOR 1.16 [95%CI, 1.07, 1.27]), and receiving more than two doses of epinephrine (aOR 1.09 [95%CI, 1.00, 1.18]).
The model was adjusted for age, history of ischemic heart disease, trigger of anaphylaxis, presence of asthma, sex, and reaction severity. Inappropriate sinus tachycardia was more likely in females (aOR 1.18 [95%CI, 1.04, 1.33]) and palpitations, tremors, and psychomotor agitation were more likely in females (aOR 1.09 [95%CI, 1.00, 1.19]) and among those receiving more than two doses of epinephrine (aOR 1.49 [95%CI, 1.14, 1.96]). The models were adjusted for age, regular use of medications, history of ischemic heart disease, triggers of anaphylaxis, presence of asthma, reaction severity, and IV administration of epinephrine. Conclusion: The low rate of occurrence of major secondary effects of epinephrine in the treatment of anaphylaxis in our study demonstrates the overall safety of epinephrine use.
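The adjusted odds ratios above come from multivariate logistic regression. As a simpler illustration of how an odds ratio and its Wald 95% confidence interval are derived, here is a crude (unadjusted) calculation on a hypothetical 2×2 table; the counts are invented for the example and are not the registry's data:

```python
import math

def odds_ratio_ci(exposed_events, exposed_n, unexposed_events, unexposed_n, z=1.96):
    """Crude odds ratio with a Wald 95% CI from a 2x2 table of counts."""
    a = exposed_events                  # exposed, event
    b = exposed_n - exposed_events      # exposed, no event
    c = unexposed_events                # unexposed, event
    d = unexposed_n - unexposed_events  # unexposed, no event
    or_ = (a * d) / (b * c)
    se = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)  # SE of the log odds ratio
    lower = math.exp(math.log(or_) - z * se)
    upper = math.exp(math.log(or_) + z * se)
    return or_, lower, upper

# Hypothetical counts: major secondary effects among beta-blocker users vs non-users
or_, lower, upper = odds_ratio_ci(2, 40, 2, 246)
print(f"OR = {or_:.2f} (95% CI {lower:.2f} to {upper:.2f})")
```

Note that with rare events the Wald interval is wide and unstable; the adjusted odds ratios reported in the abstract additionally control for the listed covariates, which a crude 2×2 calculation cannot do.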
Background: Traditionally, radiologists have routinely recommended oral contrast agents (such as Telebrix®) for patients undergoing a computed tomography of the abdomen/pelvis (CTAP), but recent evidence has shown limited diagnostic benefits for most emergency department (ED) patients. Additionally, the use of oral contrast has numerous drawbacks, including patient nausea/vomiting, risk of aspiration, delays to CTAP completion, and increased ED length of stay (LOS). Aim Statement: The aim was to safely reduce the number of ED patients receiving oral contrast prior to undergoing CTAP and thereby reduce ED length of stay. Measures & Design: An evidence-based ED protocol was developed in collaboration with radiology. PDSA cycle #1 was implementation at a pilot site to identify potential barriers. Challenges identified included the need to change the electronic order sets to reflect the new protocol, improved communication with frontline providers, and the addition of an online BMI calculator. PDSA cycle #2 was widespread implementation across all four EDs in the Calgary zone. The protocol was incorporated into all relevant electronic ED order sets to act as a physician prompt. Using administrative data, we extracted and analyzed data using descriptive and inferential statistics for the outcomes and balancing measures from a period of 12 months pre- and 12 months post-intervention. Evaluation/Results: A total of 14,868 and 17,995 CTAP exams were included in the pre and post periods, respectively. Usage of oral contrast fell from 71% in the pre-study period to 30% in the post-study period (P < 0.0001). This corresponded to a reduction in the average time from CT requisition to CT report completion, from 3.30 hours to 2.31 hours (-0.99 hrs, P = 0.001), and a reduction in average ED LOS from 11.01 hours to 9.92 hours (-1.08 hrs, P < 0.0001). The protocol resulted in a reduction of 19,434.6 patient-hours in the ED. Run charts demonstrate change was sustained over time.
Our protocol did not demonstrate an increase in rates of repeat CTAP (P = 0.563) at 30 days, nor an increase in patient re-admission within 7 days (P = 0.295). Discussion/Impact: Successful implementation of a protocol developed jointly by the ED and radiology significantly reduced the use of oral contrast in patients requiring enhanced CTAP as part of their diagnostic work-up and, thereby, reduced overall ED LOS without increasing the need for repeat examinations within 30 days or re-admission within 7 days.
Little is known about the experiences of people living alone with dementia in the community and their non-resident relatives and friends who support them. In this paper, we explore their respective attitudes and approaches to the future, particularly regarding the future care and living arrangements of those living with dementia. The study is based on a qualitative secondary analysis of interviews with 24 people living alone with early-stage dementia in North Wales, United Kingdom, and one of their relatives or friends who supported them. All but four of the dyads were interviewed twice over 12 months (a total of 88 interviews). In the analysis, it was observed that several people with dementia expressed the desire to continue living at home for ‘as long as possible’. A framework approach was used to investigate this theme in more depth, drawing on concepts from existing studies of people living with dementia and from other disciplines. Similarities and differences in the future outlook and temporal orientation of the participants were identified. The results support previous research suggesting that the future outlook of people living with early-stage dementia can be interpreted in part as a response to their situation and a way of coping with the threats that it is perceived to present, and not just an impaired view of time. Priorities for future research are highlighted in the discussion.
Electroconvulsive therapy (ECT) is recommended in treatment guidelines as an efficacious therapy for treatment-resistant depression. However, it has been associated with loss of autobiographical memory and short-term reduction in new learning.
To provide clinically useful guidelines to aid clinicians in informing patients regarding the cognitive side-effects of ECT and in monitoring these during a course of ECT, using complex data.
A committee of clinical and academic experts from Australia and New Zealand met to discuss the key issues pertaining to ECT and cognitive side-effects. Evidence regarding cognitive side-effects was reviewed, as was the limited evidence regarding how to monitor them. Both issues were supplemented by the clinical experience of the authors.
Meta-analyses suggest that new learning is impaired immediately following ECT but that group mean scores return at least to baseline by 14 days after ECT. Other cognitive functions are generally unaffected. However, an unchanged group mean score does not rule out impairment, particularly of new learning, in individual patients, especially those at greater risk; monitoring therefore remains important. Evidence suggests that ECT does cause deficits in autobiographical memory. The evidence for schedules of testing to monitor cognitive side-effects is currently limited. We therefore make practical recommendations based on clinical experience.
Despite modern ECT techniques, cognitive side-effects remain an important issue, although their nature and degree remains to be clarified fully. In these circumstances it is useful for clinicians to have guidance regarding what to tell patients and how to monitor these side-effects clinically.
Brain-derived neurotrophic factor (BDNF) gene variants may influence behaviour. To test this hypothesis, we investigated the relationship between the BDNF Val66Met polymorphism and aggressive behaviour in a population of schizophrenic patients. Our results showed that an increased number of BDNF Met alleles was associated with increased aggressive behaviour.
Individuals with schizophrenia who participated in a psychosocial and educative rehabilitation programme showed a 46% improvement in quality of life in the absence of any significant change in symptom severity. In contrast, there was no significant change in quality of life for individuals who continued with supportive rehabilitation. Our preliminary findings highlight the ‘quality of life’ benefits of psychosocial and educative rehabilitation for individuals with schizophrenia who are clinically stable and living in the community.
Major depression is a significant problem for people with a traumatic brain injury (TBI), and its treatment remains difficult. A promising approach is mindfulness-based cognitive therapy (MBCT), a relatively new therapy rooted in mindfulness-based stress reduction (MBSR) and cognitive behavioral therapy (CBT). We conducted this study to examine the effectiveness of MBCT in reducing depression symptoms among people who have a TBI.
Twenty individuals diagnosed with major depression were recruited from a rehabilitation clinic and completed the 8-week MBCT intervention. Instruments used to measure depression symptoms included: BDI-II, PHQ-9, HADS, SF-36 (Mental Health subscale), and SCL-90 (Depression subscale). They were completed at baseline and post-intervention.
All instruments indicated a statistically significant reduction in depression symptoms post-intervention (p < .05). For example, the total mean score on the BDI-II decreased from 25.2 (9.8) at baseline to 18.2 (11.7) post-intervention (p = .001). Using a PHQ-9 threshold of 10, the proportion of participants with a diagnosis of major depression was reduced by 59% at follow-up (p = .012).
Most participants reported reductions in depression symptoms after the intervention such that many would not meet the criteria for a diagnosis of major depression. This intervention may provide an opportunity to address a debilitating aspect of TBI and could be implemented concurrently with more traditional forms of treatment, possibly enhancing their success. The next step will involve the execution of multi-site, randomized controlled trials to fully demonstrate the value of the intervention.
Dystrobrevin binding protein 1 (DTNBP1) is a schizophrenia susceptibility gene involved with neurotransmission regulation (especially dopamine and glutamate) and neurodevelopment. The gene is known to be associated with cognitive deficit phenotypes within schizophrenia. In our previous studies, DTNBP1 was found associated not only with schizophrenia but also with other psychiatric disorders including psychotic depression, post-traumatic stress disorder, nicotine dependence and opiate dependence. These findings suggest that DTNBP1 may be involved in pathways that lead to multiple psychiatric phenotypes. In this study, we explored the association between DTNBP1 SNPs (single nucleotide polymorphisms) and multiple psychiatric phenotypes included in the Diagnostic Interview of Psychosis (DIP).
Five DTNBP1 SNPs, rs17470454, rs1997679, rs4236167, rs9370822 and rs9370823, were genotyped in 235 schizophrenia subjects screened for various phenotypes in the domains of depression, mania, hallucinations, delusions, subjective thought disorder, behaviour and affect, and speech disorder. SNP-phenotype association was determined with ANOVA under general, dominant/recessive and over-dominance models.
Post hoc tests determined that SNP rs1997679 was associated with visual hallucination; SNP rs4236167 was associated with general auditory hallucination as well as specific features including non-verbal, abusive and third-person form auditory hallucinations; and SNP rs9370822 was associated with visual and olfactory hallucinations. SNPs that survived correction for multiple testing were rs4236167 for third-person and abusive form auditory hallucinations; and rs9370822 for olfactory hallucinations.
These data suggest that DTNBP1 is likely to play a role in the development of auditory, visual and olfactory hallucinations, consistent with evidence of DTNBP1 activity in auditory and visual processing regions and in the regulation of glutamate and dopamine activity.