Positive symptoms are a useful predictor of aggression in schizophrenia. Although a similar pattern of abnormal brain structures related to both positive symptoms and aggression has been reported, this observation has not yet been confirmed in a single sample.
To study the association between positive symptoms and aggression in schizophrenia on a neurobiological level, a prospective meta-analytic approach was employed to analyze harmonized structural neuroimaging data from 10 research centers worldwide. We analyzed brain MRI scans from 902 individuals with a primary diagnosis of schizophrenia and 952 healthy controls.
The results identified widespread cortical thickness reductions in individuals with schizophrenia compared to healthy controls. Two separate meta-regression analyses revealed that a common pattern of reduced cortical gray matter thickness within the left lateral temporal lobe and right midcingulate cortex was significantly associated with both positive symptoms and aggression.
These findings suggest that positive symptoms such as formal thought disorder and auditory misperception, combined with cognitive impairments reflecting difficulties in deploying adaptive control toward perceived threats, could escalate the likelihood of aggression in schizophrenia.
Translocation and rehabilitation programmes are critical tools for wildlife conservation. These methods achieve greater impact when integrated in a combined strategy for enhancing population or ecosystem restoration. During 2002–2016 we reared 37 orphaned southern sea otter Enhydra lutris nereis pups, using captive sea otters as surrogate mothers, then released them into a degraded coastal estuary. Because the sea otter is a keystone species, observed increases in the local population unsurprisingly brought many ecosystem benefits. The role that surrogate-reared otters played in this success story, however, remained uncertain. To resolve this, we developed an individual-based model of the local population using surveyed individual fates (survival and reproduction) of surrogate-reared and wild-captured otters, and modelled estimates of immigration. Estimates derived from a decade of population monitoring indicated that surrogate-reared and wild sea otters had similar reproductive and survival rates. This was true for males and females, across all ages (1–13 years) and locations evaluated. The model simulations indicated that reconstructed counts of the wild population are best explained by surrogate-reared otters combined with low levels of unassisted immigration. In addition, the model shows that 55% of observed population growth over this period is attributable to surrogate-reared otters and their wild progeny. Together, our results indicate that the integration of surrogacy methods and reintroduction of juvenile sea otters helped establish a biologically successful population and restore a once-impaired ecosystem.
Treatment-resistant schizophrenia, affecting approximately 20–30% of patients with schizophrenia, has a high burden both for patients and healthcare services. There is a need to identify treatment resistance earlier in the course of the illness, in order that effective treatment, such as clozapine, can be offered promptly. We conducted a systematic literature review of prospective longitudinal studies with the aim of identifying predictors of treatment-resistant schizophrenia from the first episode. From the 545 results screened, we identified 12 published studies where data at the first episode were used to predict treatment resistance. Younger age of onset was the most consistent predictor of treatment resistance. We discuss the gaps in the literature and how future prediction models can identify predictors of treatment response more robustly.
Invasive shrubs like Tamarix spp. are ecological and economic threats in the U.S. Southwest and West, as they displace native vegetation and require innovative management approaches. Tamarix control typically consists of chemical and mechanical removal, but these methods may have negative ecological and economic impacts. Tamarisk leaf beetles (Diorhabda spp.) released for biocontrol are becoming increasingly established within Western river systems and can provide additional control. Previous Diorhabda research studied integration of beetle herbivory with fire and with mechanical management methods and herbicide application (e.g., cut stump), but little research has been conducted on integration with mowing and foliar herbicide application, which cause minimal soil disturbance. At Caballo Reservoir in southern New Mexico, we addressed the question: “How does Tamarix respond to chemical and mechanical control when Diorhabda is well established at a site?” A field experiment was conducted by integrating mowing and foliar imazapyr herbicide at standard (3.6 g ae L−1 [0.75% v/v]) and low (1.2 g ae L−1 [0.25% v/v]) rates with herbivory. Treatments were replicated five times at two sites—a dry site and a seasonally flooded site. Beetles and larvae were counted and green foliage was measured over 2 yr. Mowing and full herbicide rates reduced green foliage and limited regrowth compared with the low herbicide rate and beetles alone. Integrating conventional management such as mowing and herbicide with biocontrol could improve Tamarix management by providing stresses in addition to herbivory alone.
Pelvic internal organs change in volume and position during radiotherapy. This may compromise the efficacy of treatment or worsen its toxicity. There may be limitations to fully correcting these changes using online image guidance; therefore, effective and consistent patient preparation and positioning remain important. This review aims to provide an overview of the extent of pelvic organ motion and strategies to manage this motion.
Methods and Materials:
Given the breadth of this topic, a systematic review was not undertaken. Instead, existing systematic reviews and individual high-quality studies addressing strategies to manage pelvic organ motion have been discussed. Suggested levels of evidence and grades of recommendation for each strategy have been applied.
Various strategies to manage rectal changes have been investigated, including diet and laxatives, enemas and rectal emptying tubes, and rectal displacement with endorectal balloons (ERBs) and rectal spacers. Bladder-filling protocols and bladder ultrasound have been used to try to standardise bladder volume. Positioning the patient supine, using a full bladder and positioning prone with or without a belly board, has been examined in an attempt to reduce the volume of irradiated small bowel. Some randomised trials have been performed, with evidence to support the use of ERBs, rectal spacers, bladder-filling protocols and the supine over prone position in prostate radiotherapy. However, there was a lack of consistent high-quality evidence that would be applicable to different disease sites within the pelvis. Many studies included small numbers of patients, were non-randomised, used less conformal radiotherapy techniques, or did not report clinical outcomes such as toxicity.
There is uncertainty as to the clinical benefit of many of the commonly adopted interventions to minimise pelvic organ motion. Given this and the limitations in online image guidance compensation, further investigation of adaptive radiotherapy strategies is required.
We apply two methods to estimate the 21-cm bispectrum from data taken within the Epoch of Reionisation (EoR) project of the Murchison Widefield Array (MWA). Using data acquired with the Phase II compact array allows a direct bispectrum estimate to be undertaken on the multiple redundantly spaced triangles of antenna tiles, as well as an estimate based on data gridded to the uv-plane. The direct and gridded bispectrum estimators are applied to 21 h of high-band (167–197 MHz; z = 6.2–7.5) data from the 2016 and 2017 observing seasons. Analytic predictions for the bispectrum bias and variance for point-source foregrounds are derived. We compare the output of these approaches, the foreground contribution to the signal, and future prospects for measuring the bispectra with redundant and non-redundant arrays. We find that some triangle configurations yield bispectrum estimates that are consistent with the expected noise level after 10 h, while equilateral configurations are strongly foreground-dominated. Careful choice of triangle configurations may be made to reduce foreground bias that hinders power spectrum estimators, and the 21-cm bispectrum may be accessible in less time than the 21-cm power spectrum for some wave modes, with detections in hundreds of hours.
Item 9 of the Patient Health Questionnaire-9 (PHQ-9) queries about thoughts of death and self-harm, but not suicidality. Although it is sometimes used to assess suicide risk, most positive responses are not associated with suicidality. The PHQ-8, which omits Item 9, is thus increasingly used in research. We assessed equivalency of total score correlations and the diagnostic accuracy to detect major depression of the PHQ-8 and PHQ-9.
We conducted an individual patient data meta-analysis. We fit bivariate random-effects models to assess diagnostic accuracy.
16 742 participants (2097 major depression cases) from 54 studies were included. The correlation between PHQ-8 and PHQ-9 scores was 0.996 (95% confidence interval 0.996 to 0.996). The standard cutoff score of 10 for the PHQ-9 maximized sensitivity + specificity for the PHQ-8 among studies that used a semi-structured diagnostic interview reference standard (N = 27). At cutoff 10, the PHQ-8 was less sensitive by 0.02 (−0.06 to 0.00) and more specific by 0.01 (0.00 to 0.01) among those studies (N = 27), with similar results for studies that used other types of interviews (N = 27). For all 54 primary studies combined, across all cutoffs, the PHQ-8 was less sensitive than the PHQ-9 by 0.00 to 0.05 (0.03 at cutoff 10), and specificity was within 0.01 for all cutoffs (0.00 to 0.01).
PHQ-8 and PHQ-9 total scores were similar. Sensitivity may be minimally reduced with the PHQ-8, but specificity is similar.
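The cutoff-based accuracy comparison above can be illustrated with a minimal sketch; the scores and diagnoses below are hypothetical, not study data:

```python
# Minimal sketch of cutoff-based diagnostic accuracy (sensitivity and
# specificity at a screening cutoff). All values below are hypothetical
# illustrations, not data from the meta-analysis.

def sensitivity_specificity(scores, has_depression, cutoff=10):
    """Classify score >= cutoff as screen-positive and compare against
    the reference-standard diagnosis."""
    tp = sum(1 for s, d in zip(scores, has_depression) if s >= cutoff and d)
    fn = sum(1 for s, d in zip(scores, has_depression) if s < cutoff and d)
    tn = sum(1 for s, d in zip(scores, has_depression) if s < cutoff and not d)
    fp = sum(1 for s, d in zip(scores, has_depression) if s >= cutoff and not d)
    return tp / (tp + fn), tn / (tn + fp)

# Hypothetical example: 4 cases of major depression, 4 non-cases
scores = [14, 12, 9, 11, 4, 8, 10, 3]
dx = [True, True, True, True, False, False, False, False]
sens, spec = sensitivity_specificity(scores, dx, cutoff=10)
```

Lowering the cutoff trades specificity for sensitivity, which is why the comparison is reported across all cutoffs rather than at the standard cutoff of 10 alone.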
The current review aimed to synthesise the literature on food literacy interventions among adolescents in secondary schools, the attitudes and perceptions of food literacy interventions in secondary schools, and their effects on dietary outcomes.
The systematic review searched five electronic databases from the earliest record to the present.
The studies selected for the review were from sixteen countries: Australia (n 10), Canada (n 1), China (n 1), France (n 1), Greece (n 2), Iran (n 1), South Africa (n 1), South India (n 1), Kenya (n 1), Norway (n 2), Portugal (n 1), Denmark (n 1), Northern Ireland (n 1), USA (n 17), UK (n 1) and Sweden (n 2).
Adolescents aged 10–19 years.
Forty-four studies were eligible for inclusion. Adolescents with greater nutritional knowledge and food skills showed healthier dietary practices. Studies found a mixed association between food literacy and long-term healthy dietary behaviour. Two studies showed an improvement in adolescents’ cooking skills and food safety knowledge; six studies showed an improvement in overall food safety knowledge; six studies showed an improvement in overall food and nutritional knowledge; and two studies showed an improvement in short-term healthy dietary behaviour.
Food literacy interventions conducted in a secondary-school setting have demonstrated a positive impact on healthy food and nutritional knowledge. However, there appears to be limited evidence supporting food literacy interventions and long-term dietary behaviours in adolescents. More evidence-based research is required to adequately measure all domains of food literacy and more age-specific food literacy interventions.
Background: The classic ketogenic diet is the main non-pharmacological treatment for refractory epilepsy; however, adherence is often challenging. The low glycemic index diet (LGID) is less strict, almost equally effective, and associated with improved adherence. Little is known about the quality of life of children treated with the LGID. The objective of this study was to explore changes in the quality of life of children with epilepsy transitioning to the LGID. Methods: Patients on the LGID and their parents filled out Pediatric Quality of Life Epilepsy Module questionnaires: one while on the LGID, and one retrospectively for the time prior to starting the LGID. Results: Data were collected from five children aged 3–13 and their parents. Complete seizure control was seen in two children, >50% seizure reduction in one, and no change in two children. Parent-reported quality of life while on the LGID increased in two participants but decreased in all child self-reports. Conclusions: Although the LGID led to improved seizure control in three out of five patients, the child-reported quality of life decreased in all children. Larger prospective studies are warranted to reliably assess the impact of the LGID on the quality of life in children with epilepsy.
Background: Spinal muscular atrophy (SMA) is a children’s neuromuscular disorder. Although motor neuron loss is a major feature of the disease, we have identified fatty acid abnormalities in SMA patients and in preclinical animal models, suggesting metabolic perturbation is also an important component of SMA. Methods: Biochemical, histological, proteomic, and high-resolution respirometry analyses were used. Results: SMA patients are more susceptible to dyslipidemia than the average population, as determined by a standard lipid profile in a cohort of 72 pediatric patients. We also observed a non-alcoholic liver disease phenotype in a preclinical mouse model. Denervation alone was not sufficient to induce liver steatosis, as a mouse model of ALS did not develop fatty liver. Hyperglucagonemia in Smn2B/- mice could explain the hepatic steatosis by increasing plasma substrate availability via glycogen depletion and peripheral lipolysis. Proteomic analysis identified mitochondrion and lipid metabolism as major clusters. Alterations in mitochondrial function were revealed by high-resolution respirometry. Finally, low-fat diets led to increased survival in Smn2B/- mice. Conclusions: These results provide strong evidence for lipid metabolism defects in SMA. Further investigation will be required to establish the primary mechanism of these alterations and understand how they lead to additional co-morbidities in SMA patients.
Cold-water coral reefs are biodiversity hotspots of the deep sea. The most dominant reef-building cold-water coral in the Atlantic is Lophelia pertusa, which builds vast and structurally complex habitats. Studying the behaviours of deep-sea species is challenging due to the technological difficulties in making prolonged observations in situ, so little is known about the behavioural ecology of this important species. Observations in laboratory studies can help to enhance our understanding of the range of behaviours these species exhibit. Here we present video evidence that the cold-water coral Lophelia pertusa is capable of producing mucus nets as part of their feeding strategy. This finding suggests that L. pertusa has a more diverse range of feeding strategies than previously thought.
Introduction: Accurate forecasting of emergency department (ED) patient visits can inform better resource matching. Calendar variables such as day of week and time of day are routinely used as predictors of ED volume. Further improvement in forecasting will likely come from dynamic variables. The effect of snowfall on ED volumes in colder climates remains poorly understood. We sought to determine whether accounting for snowfall improves ED patient volume forecasting. Our secondary objective was to characterize the magnitude of effect of snowfall on ED volume. Methods: This was a retrospective observational study using historical patient volume data and local snowfall records from April 1st, 2011 to March 31st, 2018 (2,542 days) at a single urban ED. We fit a series of four generalized linear models: a baseline model which included calendar variables, and three different snowfall models which contained the variables in the baseline model plus an indicator variable for modelling snowfall. Each snowfall model had a different daily threshold for its indicator variable: any snowfall (>0 cm), moderate snowfall (≥1 cm), or high snowfall (≥5 cm). We modeled daily ED volume as the dependent variable using a Poisson distribution. To evaluate model fit, we examined the Akaike Information Criterion (AIC) and Bayesian Information Criterion (BIC) in each of the four models. In both cases, a lower number indicates better model fit. Incident rate ratios were calculated to determine the effect of snowfall. We used the delta method to calculate confidence intervals. Results: A total of 2,542 days were used to develop the model. All three snowfall models demonstrated improved model fit compared to the baseline model, with lower AIC and BIC values. The best fitting model included a binary variable for moderate snowfall (≥1 cm/day).
This model showed a statistically significant decrease in ED volume of 2.65% (95% CI: 1.23%–4.00%) on snowfall days, representing 5.4 (95% CI: 2.5–8.2) patients per day at our hospital with an average daily volume of 205 patients. Conclusion: The addition of a snowfall variable results in improved forecasting model performance in ED volume forecasting, with the optimal threshold set at 1 cm of snow in our setting. Snowfall is associated with a modest, but statistically significant, reduction in ED volume.
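The reported effect size can be sanity-checked with simple arithmetic: a 2.65% decrease corresponds to an incident rate ratio of about 0.974 (the Poisson coefficient β for the snowfall indicator being ln IRR), and applied to an average of 205 patients per day it yields the reported reduction of roughly 5.4 patients. A back-of-envelope sketch, not the study's model:

```python
import math

# Back-of-envelope check of the reported effect size; the 2.65% decrease
# and 205 patients/day average come from the abstract, the regression
# model itself is not reproduced here.
avg_daily_volume = 205
pct_decrease = 0.0265           # 2.65% decrease on moderate-snowfall days

irr = 1 - pct_decrease          # incident rate ratio implied by the decrease
beta = math.log(irr)            # Poisson coefficient for the >=1 cm indicator

patients_fewer = avg_daily_volume * pct_decrease
print(round(patients_fewer, 1))  # ~5.4 patients/day, matching the abstract
```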
Introduction: Individualizing risk for stroke following a transient ischemic attack (TIA) is a topic of intense research, as existing scores are context-dependent or have not been well validated. The Canadian TIA Score stratifies risk of subsequent stroke into low, moderate and high risk. Our objective was to prospectively validate the Canadian TIA Score in a new cohort of emergency department (ED) patients. Methods: We conducted a prospective cohort study in 14 Canadian EDs over 4 years. We enrolled consecutive adult patients with an ED visit for TIA or nondisabling stroke. Treating physicians recorded standardized clinical variables onto data collection forms. Given the ability of prompt emergency carotid endarterectomy (CEA) to prevent stroke (NNT = 3) in high-risk patients, our primary outcome was the composite of subsequent stroke or CEA ≤7 days. We conducted telephone follow-up using the validated Questionnaire for Verifying Stroke Free Status at 7 and 90 days. Outcomes were adjudicated by panels of 3 local stroke experts, blinded to the index ED data collection form. Based on prior work, we estimated that a sample size of 5,004 patients, including 93 subsequent strokes, would yield 95% confidence bands of +/− 10% for sensitivity and likelihood ratio (LR). Our analyses assessed interval LRs (iLR) with 95% CIs. Results: We prospectively enrolled 7,569 patients with mean age 68.4 ± 14.7 years and 52.4% female, of whom 107 (1.4%) had a subsequent stroke and 74 (1.0%) CEA ≤7 days (total outcomes = 181). We enrolled 81.2% of eligible patients; missed patients were similar to those enrolled. The Canadian TIA Score stratified the stroke/CEA ≤7 days risk as: Low (probability <0.2%, iLR 0.20 [95% CI 0.091–0.44]); Moderate (probability 1.3%, iLR 0.79 [0.68–0.92]); High (probability 2.6%, iLR 2.2 [1.9–2.6]). Sensitivity analysis for stroke alone ≤7 days yielded similar results: Low iLR 0.17 [95% CI 0.056–0.52], Moderate iLR 0.89 [0.75–1.1], High iLR 2.0 [1.6–2.4].
Conclusion: The Canadian TIA Score accurately identifies TIA patients’ risk for stroke/CEA ≤7 days. Patients classified as low risk can be safely discharged following a careful ED assessment, with elective follow-up. Patients at moderate risk can undergo additional testing in the ED, have antithrombotic therapy optimized, and be offered early stroke specialist follow-up. Patients at high risk should, in most cases, be fully investigated and managed during their index ED visit, ideally in consultation with a stroke specialist.
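An interval likelihood ratio, as reported above, is the proportion of outcome-positive patients falling in a risk stratum divided by the proportion of outcome-negative patients in the same stratum. A minimal sketch with hypothetical stratum counts (only the totals of 181 outcomes among 7,569 patients come from the abstract):

```python
def interval_lr(stratum_pos, total_pos, stratum_neg, total_neg):
    """iLR = P(stratum | outcome) / P(stratum | no outcome)."""
    return (stratum_pos / total_pos) / (stratum_neg / total_neg)

# Totals from the abstract: 181 outcomes, 7,569 - 181 = 7,388 non-outcomes.
# The high-risk stratum counts below are hypothetical, for illustration only:
# 20 outcomes among 600 patients classified high risk.
total_pos, total_neg = 181, 7388
high_pos = 20
high_neg = 600 - high_pos
ilr_high = interval_lr(high_pos, total_pos, high_neg, total_neg)
```

An iLR above 1 means patients in that stratum are more likely among those who went on to have the outcome, shifting post-test probability upward; an iLR below 1 shifts it downward.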
Whereas genetic susceptibility increases the risk for major depressive disorder (MDD), non-genetic protective factors may mitigate this risk. In a large-scale prospective study of US Army soldiers, we examined whether trait resilience and/or unit cohesion could protect against the onset of MDD following combat deployment, even in soldiers at high polygenic risk.
Data were analyzed from 3079 soldiers of European ancestry assessed before and after their deployment to Afghanistan. Incident MDD was defined as no MDD episode at pre-deployment, followed by a MDD episode following deployment. Polygenic risk scores were constructed from a large-scale genome-wide association study of major depression. We first examined the main effects of the MDD PRS and each protective factor on incident MDD. We then tested the effects of each protective factor on incident MDD across strata of polygenic risk.
Polygenic risk showed a dose–response relationship to depression, such that soldiers at high polygenic risk had greatest odds for incident MDD. Both unit cohesion and trait resilience were prospectively associated with reduced risk for incident MDD. Notably, the protective effect of unit cohesion persisted even in soldiers at highest polygenic risk.
Polygenic risk was associated with new-onset MDD in deployed soldiers. However, unit cohesion – an index of perceived support and morale – was protective against incident MDD even among those at highest genetic risk, and may represent a potent target for promoting resilience in vulnerable soldiers. Findings illustrate the value of combining genomic and environmental data in a prospective design to identify robust protective factors for mental health.
Objectives: We assessed trends in the incidence, prevalence, and post-diagnosis mortality of parkinsonism in Ontario, Canada over 18 years. We also explored the influence of a range of risk factors for brain health on the trend of incident parkinsonism. Methods: We established an open cohort by linking population-based health administrative databases from 1996 to 2014 in Ontario. The study population comprised residents aged 20–100 years with an incident diagnosis of parkinsonism ascertained using a validated algorithm. We calculated age- and sex-standardized incidence, prevalence, and mortality of parkinsonism, stratified by young onset (20–39 years) and mid/late onset (≥40 years). We assessed trends in incidence using Poisson regression, mortality using negative binomial regression, and prevalence of parkinsonism and pre-existing conditions (e.g., head injury) using the Cochran–Armitage trend test. To better understand trends in the incidence of mid/late-onset parkinsonism, we adjusted for various pre-existing conditions in the Poisson regression model. Results: From 1996 to 2014, we identified 73,129 incident cases of parkinsonism (source population of ∼10.5 million), of whom 56% were male, mean age at diagnosis was 72.6 years, and 99% had mid/late-onset parkinsonism. Over 18 years, the age- and sex-standardized incidence decreased by 13.0% for mid/late-onset parkinsonism but remained unchanged for young-onset parkinsonism. The age- and sex-standardized prevalence increased by 22.8%, while post-diagnosis mortality decreased by 5.5%. Adjustment for pre-existing conditions did not appreciably explain the declining incidence of mid/late-onset parkinsonism. Conclusion: Young-onset and mid/late-onset parkinsonism exhibited differing trends in incidence over 18 years in Ontario. Further research to identify other factors that may appreciably explain trends in incident parkinsonism is warranted.
Journals use social media to increase the awareness of their publications. Infographics show research findings in a concise and visually appealing manner, well suited for dissemination on social media platforms. We hypothesized that infographic abstracts promoted on social media would increase the dissemination and online readership of the parent research articles.
Twenty-four articles were chosen from the six issues of CJEM published between July 2016 and June 2017 and randomized to infographic or control groups. All articles were disseminated through the journal’s social media accounts (Twitter and Facebook). Control articles were promoted using a screen capture image of each article’s abstract on the journal’s social media accounts. Infographic articles were promoted similarly using a visual infographic. Infographics were also published and promoted on the CanadiEM.org website and social media channels. Abstract views, full-text views, and the change in Altmetric score were compared between groups using unpaired two-tailed t-tests.
There were no significant differences between the groups at baseline. Abstract views (mean, 95% CI) were higher in the infographic group (379, 287–471) than in the control group (176, 136–215; p<0.001). Mean change in Altmetric scores was higher in the infographic group (26, 18–34) than in the control group (3, 2–4; p<0.0001). There was no difference in full-text views between the infographic (50, 0–101) and control groups (25, 18–32).
The promotion of CJEM articles using infographics on social media and the CanadiEM.org website increased Altmetric scores and abstract views. Infographics may have a role in increasing awareness of medical literature.