Cognitive-behavioural therapy (CBT) is an effective treatment for depressed adults. CBT interventions are complex, as they include multiple content components and can be delivered in different ways. We compared the effectiveness of different types of therapy, different components, combinations of components, and aspects of delivery used in CBT interventions for adult depression. We conducted a systematic review of randomised controlled trials in adults with a primary diagnosis of depression, which included a CBT intervention. Outcomes were pooled using a component-level network meta-analysis. Our primary analysis classified interventions according to the type of therapy and delivery mode. We also fitted more advanced models to examine the effectiveness of each content component or combination of components. We included 91 studies and found strong evidence that CBT interventions yielded a larger short-term decrease in depression scores compared to treatment-as-usual, with a standardised difference in mean change of −1.11 (95% credible interval −1.62 to −0.60) for face-to-face CBT, −1.06 (−2.05 to −0.08) for hybrid CBT, and −0.59 (−1.20 to 0.02) for multimedia CBT, whereas wait list control showed a detrimental effect of 0.72 (0.09 to 1.35). We found no evidence of specific effects of any content components or combinations of components. Technology is increasingly used in the context of CBT interventions for depression. Multimedia and hybrid CBT might be as effective as face-to-face CBT, although results need to be interpreted cautiously. The effectiveness of specific combinations of content components and delivery formats remains unclear. Wait list controls should be avoided if possible.
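The effect measure quoted above, a standardised difference in mean change, divides the between-arm difference in pre-to-post change scores by a pooled standard deviation of those change scores. The following is a minimal illustrative sketch; the function and the example trial numbers are hypothetical, not taken from the review:

```python
import math

def smd_change(mc_tx, sd_tx, n_tx, mc_ctrl, sd_ctrl, n_ctrl):
    """Standardised difference in mean change between two trial arms.

    mc_* are mean pre-to-post changes in depression score, sd_* the
    standard deviations of those change scores, n_* the arm sizes.
    """
    pooled_sd = math.sqrt(
        ((n_tx - 1) * sd_tx ** 2 + (n_ctrl - 1) * sd_ctrl ** 2)
        / (n_tx + n_ctrl - 2)
    )
    return (mc_tx - mc_ctrl) / pooled_sd

# Hypothetical trial: CBT arm improves by 12 points (SD 9, n = 50),
# treatment-as-usual by 4 points (SD 9, n = 50).
d = smd_change(-12, 9, 50, -4, 9, 50)
```

A negative value, as with the −1.11 reported for face-to-face CBT, indicates a larger decrease in depression scores in the intervention arm than in the comparator.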
Background: Biallelic variants in POLR1C are associated with POLR3-related leukodystrophy (POLR3-HLD), or 4H leukodystrophy (Hypomyelination, Hypodontia, Hypogonadotropic Hypogonadism), and Treacher Collins syndrome (TCS). The clinical spectrum of POLR3-HLD caused by variants in this gene has not been described. Methods: A cross-sectional observational study involving 25 centers worldwide was conducted between 2016 and 2018. The clinical, radiologic and molecular features of 23 unreported and previously reported cases of POLR3-HLD caused by POLR1C variants were reviewed. Results: Most participants presented between birth and age 6 years with motor difficulties. Neurological deterioration was seen during childhood, suggesting a more severe phenotype than previously described. The dental, ocular and endocrine features often seen in POLR3-HLD were not invariably present. Five patients (22%) had a combination of hypomyelinating leukodystrophy and abnormal craniofacial development, including one individual with clear TCS features. Several cases did not exhibit all the typical radiologic characteristics of POLR3-HLD. A total of 29 different pathogenic variants in POLR1C were identified, including 13 new disease-causing variants. Conclusions: Based on the largest cohort of patients to date, these results suggest novel characteristics of POLR1C-related disorder, with a spectrum of clinical involvement characterized by hypomyelinating leukodystrophy with or without abnormal craniofacial development reminiscent of TCS.
Rapid increases in herbicide resistance have highlighted the ability of weeds to undergo genetic change within a short period of time. That change, in turn, has resulted in an increasing emphasis in weed science on the evolutionary ecology and potential adaptation of weeds to herbicide selection. Here we argue that a similar emphasis would also be invaluable for understanding another challenge that will profoundly alter weed biology: the rapid rise in atmospheric carbon dioxide (CO2) and the associated changes in climate. Our review of the literature suggests that elevated CO2 and climate change will impose strong selection pressures on weeds and that weeds will often have the capacity to respond with rapid adaptive evolution. Based on current data, climate change and rising CO2 levels are likely to alter the evolution of agronomic and invasive weeds, with consequences for distribution, community composition, and herbicide efficacy. In addition, we identify four key areas that represent clear knowledge gaps in weed evolution: (1) differential herbicide resistance in response to a rapidly changing CO2/climate confluence; (2) shifts in the efficacy of biological constraints (e.g., pathogens) and resultant selection shifts in affected weed species; (3) climate-induced phenological shifts in weed distribution, demography, and fitness relative to crop systems; and (4) understanding and characterization of epigenetics and the differential expression of phenotypic plasticity versus evolutionary adaptation. These consequences, in turn, should be of fundamental interest to the weed science community.
Introduction: Although use of point of care ultrasound (PoCUS) protocols for patients with undifferentiated hypotension in the Emergency Department (ED) is widespread, our previously reported SHoC-ED study showed no clear survival or length of stay benefit for patients assessed with PoCUS. In this analysis, we examine whether the use of PoCUS changed fluid administration and rates of other emergency interventions between patients with different shock types. The primary comparison was between cardiogenic and non-cardiogenic shock types. Methods: A post-hoc analysis was completed on the database from an RCT of 273 patients who presented to the ED with undifferentiated hypotension (SBP < 100 mmHg or shock index > 1) and who had been randomized to receive standard care with or without PoCUS in 6 centres in Canada and South Africa. PoCUS-trained physicians performed scans after initial assessment. Shock categories and diagnoses, recorded at 60 minutes after ED presentation, were used to allocate patients into subcategories of shock for analysis of treatment. We analyzed actual care delivered, including initial IV fluid bolus volumes (mL), rates of inotrope use and major procedures. Standard statistical tests were employed. Sample size was powered at 0.80 (α:0.05) for a moderate difference. Results: Although there were expected differences in the mean fluid bolus volume between patients with non-cardiogenic and cardiogenic shock, there was no difference in fluid bolus volume between the control and PoCUS groups (non-cardiogenic control 1878 mL (95% CI 1550 – 2206 mL) vs. non-cardiogenic PoCUS 1687 mL (1458 – 1916 mL); and cardiogenic control 768 mL (194 – 1341 mL) vs. cardiogenic PoCUS 981 mL (341 – 1620 mL)). Likewise, there were no differences in rates of inotrope administration or major procedures for any of the subcategories of shock between the control group and PoCUS group patients. The most common subcategory of shock was distributive.
Conclusion: Despite differences in care delivered by subcategory of shock, we did not find any significant difference in actual care delivered between patients who were examined using PoCUS and those who were not. This may help to explain the previously reported lack of outcome difference between groups.
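The enrolment criterion quoted in the Methods above (SBP < 100 or shock index > 1) is easy to make concrete. Shock index is the standard ratio of heart rate to systolic blood pressure; the helper names below are ours, not the trial's:

```python
def shock_index(heart_rate, sbp):
    """Shock index: heart rate (beats/min) divided by systolic BP (mmHg)."""
    return heart_rate / sbp

def shoc_ed_eligible(sbp, heart_rate):
    # SHoC-ED enrolled undifferentiated hypotension:
    # SBP < 100 mmHg or shock index > 1.
    return sbp < 100 or shock_index(heart_rate, sbp) > 1
```

For example, a patient with SBP 110 mmHg and heart rate 120 qualifies via shock index (120/110 ≈ 1.09) despite a non-hypotensive pressure.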
Introduction: Point of care ultrasound has been reported to improve diagnosis in non-traumatic hypotensive ED patients. We compared the diagnostic performance of physicians with and without PoCUS in undifferentiated hypotensive patients as part of an international prospective randomized controlled study. The primary outcome was the diagnostic performance of PoCUS for cardiogenic vs. non-cardiogenic shock. Methods: SHoC-ED recruited hypotensive patients (SBP < 100 mmHg or shock index > 1) in 6 centres in Canada and South Africa. We describe previously unreported secondary outcomes relating to diagnostic accuracy. Patients were randomized to standard clinical assessment (no PoCUS) or PoCUS groups. PoCUS-trained physicians performed scans after initial assessment. Demographics, clinical details and findings were collected prospectively. Initial and secondary diagnoses, including shock category, were recorded at 0 and 60 minutes. Final diagnosis was determined by independent blinded chart review. Standard statistical tests were employed. Sample size was powered at 0.80 (α:0.05) for a moderate difference. Results: 273 patients were enrolled, with follow-up for the primary outcome completed for 270. Baseline demographics and perceived category of shock were similar between groups. 11% of patients were determined to have cardiogenic shock. PoCUS had a sensitivity of 80.0% (95% CI 54.8 to 93.0%), specificity 95.5% (90.0 to 98.1%), LR+ve 17.9 (7.34 to 43.8), LR−ve 0.21 (0.08 to 0.58), diagnostic OR 85.6 (18.2 to 403.6) and accuracy 93.7% (88.0 to 97.2%) for cardiogenic shock. Standard assessment without PoCUS had a sensitivity of 91.7% (64.6 to 98.5%), specificity 93.8% (87.8 to 97.0%), LR+ve 14.8 (7.1 to 30.9), LR−ve 0.09 (0.01 to 0.58), diagnostic OR 166.6 (18.7 to 1481) and accuracy of 93.6% (87.8 to 97.2%). There was no significant difference in sensitivity (−11.7% (−37.8 to 18.3%)) or specificity (1.73% (−4.67 to 8.29%)).
Diagnostic performance was also similar between other shock subcategories. Conclusion: As reported in other studies, PoCUS-based assessment performed well diagnostically in undifferentiated hypotensive patients, especially as a rule-in test. However, performance was similar to standard (non-PoCUS) assessment, which was excellent in this study.
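All the diagnostic indices quoted above derive from a single 2×2 table of test call versus final (chart-review) diagnosis. The sketch below uses hypothetical counts chosen only to be of similar magnitude to this study (roughly 270 patients, about 11% cardiogenic); the real cell counts are not given in the abstract:

```python
def diagnostic_metrics(tp, fp, fn, tn):
    """Standard accuracy metrics from a 2x2 diagnostic table."""
    sens = tp / (tp + fn)            # sensitivity (true positive rate)
    spec = tn / (tn + fp)            # specificity (true negative rate)
    lr_pos = sens / (1 - spec)       # positive likelihood ratio
    lr_neg = (1 - sens) / spec       # negative likelihood ratio
    return {
        "sensitivity": sens,
        "specificity": spec,
        "LR+": lr_pos,
        "LR-": lr_neg,
        "DOR": lr_pos / lr_neg,      # diagnostic odds ratio
        "accuracy": (tp + tn) / (tp + fp + fn + tn),
    }

# Hypothetical counts: 24 true positives, 6 false negatives,
# 11 false positives, 229 true negatives.
m = diagnostic_metrics(tp=24, fp=11, fn=6, tn=229)
```

With these counts, sensitivity is 0.80 and the positive likelihood ratio exceeds 10, which is what makes a test behave as a rule-in test, as noted in the conclusion above.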
Despite established clinical associations among major depression (MD), alcohol dependence (AD), and alcohol consumption (AC), the nature of the causal relationship between them is not completely understood. We leveraged genome-wide data from the Psychiatric Genomics Consortium (PGC) and UK Biobank to test for the presence of shared genetic mechanisms and causal relationships among MD, AD, and AC.
Linkage disequilibrium score regression and Mendelian randomization (MR) were performed using genome-wide data from the PGC (MD: 135 458 cases and 344 901 controls; AD: 10 206 cases and 28 480 controls) and UK Biobank (AC-frequency: 438 308 individuals; AC-quantity: 307 098 individuals).
Positive genetic correlation was observed between MD and AD (rg(MD−AD) = +0.47, P = 6.6 × 10^−10). AC-quantity showed positive genetic correlation with both AD (rg(AD−AC quantity) = +0.75, P = 1.8 × 10^−14) and MD (rg(MD−AC quantity) = +0.14, P = 2.9 × 10^−7), while there was negative correlation of AC-frequency with MD (rg(MD−AC frequency) = −0.17, P = 1.5 × 10^−10) and a non-significant result with AD. MR analyses confirmed the presence of pleiotropy among these four traits. However, the MD-AD results reflect a mediated-pleiotropy mechanism (i.e. causal relationship) with an effect of MD on AD (beta = 0.28, P = 1.29 × 10^−6). There was no evidence for reverse causation.
This study supports a causal role for genetic liability of MD on AD based on genetic datasets including thousands of individuals. Understanding mechanisms underlying MD-AD comorbidity addresses important public health concerns and has the potential to facilitate prevention and intervention efforts.
Environmental risk factors for dementia are poorly understood. Aluminium and fluoride in drinking water have been linked with dementia, but uncertainties remain about this relationship.
In the largest longitudinal study in this context, we set out to explore the individual effects of aluminium and fluoride in drinking water on dementia risk and, as fluoride can increase the absorption of aluminium, we also examine any synergistic influence on dementia.
We used Cox models to investigate the association between mean aluminium and fluoride levels in drinking water at their residential location (collected 2005–2012 by the Drinking Water Quality Regulator for Scotland) with dementia in members of the Scottish Mental Survey 1932 cohort who were alive in 2005.
A total of 1972 out of 6990 individuals developed dementia by the linkage date in 2012. Dementia risk was raised with increasing mean aluminium levels in women (hazard ratio per s.d. increase 1.09, 95% CI 1.03–1.15, P < 0.001) and men (1.12, 95% CI 1.03–1.21, P = 0.004). A dose-response pattern of association was observed between mean fluoride levels and dementia in women (1.34, 95% CI 1.28–1.41, P < 0.001) and men (1.30, 95% CI 1.22–1.39, P < 0.001), with dementia risk more than doubled in the highest quartile compared with the lowest. There was no statistical interaction between aluminium and fluoride levels in relation with dementia.
Higher levels of aluminium and fluoride were related to dementia risk in a population of men and women who consumed relatively low drinking-water levels of both.
OBJECTIVES/SPECIFIC AIMS: The objective of this project is to determine whether HRV, collected peri-operatively, is predictive of cognitive decline among older adults who undergo elective surgery/anesthesia. METHODS/STUDY POPULATION: This project is a part of the ongoing INTUIT/PRIME study, which is collecting pre- and post-operative cognitive testing, fMRI imaging, CSF samples, and EEG recordings from 200 older adults (age ≥ 60) undergoing elective non-cardiac/non-neurologic surgery scheduled to last > 2 hours at Duke University Medical Center and Duke Regional Hospital. This project utilizes data from the first 60 INTUIT participants who contributed continuous heart rate data before and during surgery. Participants undergo cognitive testing prior to surgery (baseline) and at 6 weeks after surgery. Our primary dependent variable is the change in the composite score from baseline to 6 weeks. Delirium is assessed in the hospital with the twice-daily 3D-CAM tool, so we will report the proportion of individuals with 6-week cognitive decline who exhibited delirium in the days following surgery. Participants’ electrocardiogram (ECG) recordings are extracted pre- and intraoperatively from B650/B850 patient monitors with VSCapture software. HRV is defined as the variability between successive R-spikes or inter-beat intervals on ECG. RESULTS/ANTICIPATED RESULTS: We anticipate that lower intraoperative HRV is associated with worse cognitive decline at 6 weeks after surgery. As secondary objectives, we will determine whether pre-operative HRV or change in HRV (from pre-operative to intra-operative measures) are predictive of cognitive decline after surgery. We expect that in-hospital delirium will be detected in a higher proportion of those with 6-week cognitive decline, compared to those with stable or improved cognition at 6 weeks. DISCUSSION/SIGNIFICANCE OF IMPACT: HRV may address the present need for pre- and intra-operative cognitive risk stratification in the elderly.
Physiological indices like HRV have the potential to dramatically change our understanding of CI in older adults undergoing surgery, as they offer an accessible, cost-effective, and non-invasive means whereby clinicians, particularly those unfamiliar with the nuances of geriatric and CI/dementia-related care, can monitor patients and refer those at high-risk of CI after surgery for early intervention.
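HRV, defined in the Methods above as the variability between successive R-spikes (inter-beat intervals), is commonly summarised with time-domain statistics such as SDNN and RMSSD. The abstract does not say which metric INTUIT uses, so the sketch below is illustrative only, and the R-R series is invented:

```python
import math

def sdnn(rr_ms):
    """Sample SD of inter-beat (R-R) intervals in ms: overall HRV."""
    mean = sum(rr_ms) / len(rr_ms)
    return math.sqrt(sum((x - mean) ** 2 for x in rr_ms) / (len(rr_ms) - 1))

def rmssd(rr_ms):
    """Root mean square of successive R-R differences: beat-to-beat HRV."""
    diffs = [b - a for a, b in zip(rr_ms, rr_ms[1:])]
    return math.sqrt(sum(d * d for d in diffs) / len(diffs))

# Hypothetical 5-beat R-R interval series (ms) extracted from an ECG trace.
rr = [800, 810, 790, 805, 795]
```

Lower values of either statistic would correspond to the "lower intraoperative HRV" hypothesised above to predict worse 6-week cognitive decline.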
Healthcare organizations are required to provide workers with respiratory protection (RP) to mitigate hazardous airborne inhalation exposures. This study sought to better identify gaps that exist between RP guidance and clinical practice to understand issues that would benefit from additional research or clarification.
Night-migratory songbirds appear to sense the direction of the Earth's magnetic field via radical pair intermediates formed photochemically in cryptochrome flavoproteins contained in photoreceptor cells in their retinas. It is an open question whether this light-dependent mechanism could be sufficiently sensitive given the low-light levels experienced by nocturnal migrants. The scarcity of available photons results in significant uncertainty in the signal generated by the magnetoreceptors distributed around the retina. Here we use results from Information Theory to obtain a lower bound estimate of the precision with which a bird could orient itself using only geomagnetic cues. Our approach bypasses the current lack of knowledge about magnetic signal transduction and processing in vivo by computing the best-case compass precision under conditions where photons are in short supply. We use this method to assess the performance of three plausible cryptochrome-derived flavin-containing radical pairs as potential magnetoreceptors.
Middle-third helical rim defects may arise from trauma or oncological resection, and pose a challenging reconstructive problem. Reconstructing defects larger than 2 cm using traditional methods commits patients to the inconvenience of staged procedures.
This paper describes a single-stage helical rim reconstruction technique using a post-auricular bipedicled flap and ipsilateral conchal cartilage graft for delayed middle-third helical rim reconstruction.
Two examples of this technique used in post-trauma and oncological reconstruction cases are presented, with pre- and post-operative photographs provided for demonstration.
Contralateral graft harvest and staged operations for helical rim reconstruction are associated with donor site morbidity and the inconvenience of multiple operations to achieve the desired reconstructive outcome. Our single-stage helical rim reconstruction technique was well tolerated by patients, and showed satisfactory aesthetic results in terms of size and symmetry.
The role that vitamin D plays in pulmonary function remains uncertain. Epidemiological studies have reported mixed findings for the serum 25-hydroxyvitamin D (25(OH)D)–pulmonary function association. We conducted the largest cross-sectional meta-analysis of the 25(OH)D–pulmonary function association to date, based on nine European ancestry (EA) cohorts (n 22 838) and five African ancestry (AA) cohorts (n 4290) in the Cohorts for Heart and Aging Research in Genomic Epidemiology Consortium. Data were analysed using linear models by cohort and ancestry. Effect modification by smoking status (current/former/never) was tested. Results were combined using fixed-effects meta-analysis. Mean serum 25(OH)D was 68 (sd 29) nmol/l for EA and 49 (sd 21) nmol/l for AA. For each 1 nmol/l higher 25(OH)D, forced expiratory volume in the 1st second (FEV1) was higher by 1·1 ml in EA (95 % CI 0·9, 1·3; P<0·0001) and 1·8 ml (95 % CI 1·1, 2·5; P<0·0001) in AA (P for race difference = 0·06), and forced vital capacity (FVC) was higher by 1·3 ml in EA (95 % CI 1·0, 1·6; P<0·0001) and 1·5 ml (95 % CI 0·8, 2·3; P=0·0001) in AA (P for race difference = 0·56). Among EA, the 25(OH)D–FVC association was stronger in smokers: per 1 nmol/l higher 25(OH)D, FVC was higher by 1·7 ml (95 % CI 1·1, 2·3) for current smokers and 1·7 ml (95 % CI 1·2, 2·1) for former smokers, compared with 0·8 ml (95 % CI 0·4, 1·2) for never smokers. In summary, the 25(OH)D associations with FEV1 and FVC were positive in both ancestries. In EA, a stronger association was observed for smokers compared with never smokers, which supports the importance of vitamin D in vulnerable populations.
The role of vegetable and fruit intake in reducing falls risk in elderly populations is uncertain. This study examined the associations of vegetable and fruit intake with falls-related hospitalisations in a prospective cohort study of elderly women (n 1429, ≥70 years), including effects on muscular function, which represented a potential causal pathway. Muscular function, measured using grip strength and timed-up-and-go (TUG), and vegetable and fruit intake, quantified using a validated FFQ, were assessed at baseline (1998). Incident falls-related hospitalisation over 14·5-year follow-up was captured by the Hospital Morbidity Data Collection, linked via the Western Australian Data Linkage System. Falls-related hospitalisation occurred in 568 (39·7 %) of the women. In multivariable-adjusted models, falls-related hospitalisations were lower in participants consuming more vegetables (hazard ratio (HR) per 75 g serve: 0·90 (95 % CI 0·82, 0·99)), but not more fruit (per 150 g serve: 1·03 (95 % CI 0·93, 1·14)). Only total cruciferous vegetable intake was inversely associated with falls-related hospitalisation (HR per 20 g serve: 0·90 (95 % CI 0·83, 0·97)). Higher total vegetable intake was associated with lower odds for poor grip strength (OR: 0·87 (95 % CI 0·77, 0·97)) and slow TUG (OR: 0·88 (95 % CI 0·78, 0·99)). Including grip strength and TUG in the multivariable-adjusted model attenuated the association between total vegetable intake and falls-related hospitalisations. In conclusion, elderly women with higher total and cruciferous vegetable intake had lower injurious falls risk, which may be explained in large part by better physical function. Falls reduction may be considered an additional benefit of higher vegetable intake in older women.
The ideal sampling method and benefit of qualitative versus quantitative culture for carbapenem-resistant Enterobacteriaceae (CRE) recovery in hospitalized patient rooms and bathrooms is unknown. Although the use of nylon-flocked swabs improved overall gram-negative organism recovery compared with cellulose sponges, they were similar for CRE recovery. Quantitative culture was inferior and unrevealing beyond the qualitative results.
We report on the Bulge Asymmetries and Dynamic Evolution (BAaDE) survey, which has observed 19 000 MSX color-selected red giant stars for SiO maser emission at 43 GHz with the VLA and is in the process of observing 9 000 of these stars with ALMA at 86 GHz in the Southern sky. Our setup covers the main maser transitions, as well as those of isotopologues and selected lines of carbon-bearing species. Observations of this set of lines allow a far-reaching catalog of line-of-sight velocities in the dust-obscured regions where optical surveys cannot reach. Our preliminary detection rate is close to 70%, promising a wealth of new information on the distribution of metal-rich stars, their kinematics as a function of location in the Galaxy, as well as the occurrence of lines and line ratios between the different transitions in combination with the spectral energy distribution from about 1 to 100 μm. Similar to the OH/IR stars, a clear kinematic signature between disk and bulge stars can be seen. Furthermore, the SiO J = 1→0 (v = 3) line plays a prominent role in the derived maser properties.
Patient days and days present were compared to directly measured person time to quantify how choice of different denominator metrics may affect antimicrobial use rates. Overall, days present were approximately one-third higher than patient days. This difference varied among hospitals and units and was influenced by short length of stay.
To determine the effect of mandatory and nonmandatory influenza vaccination policies on vaccination rates and symptomatic absenteeism among healthcare personnel (HCP).
Retrospective observational cohort study.
This study took place at 3 university medical centers with mandatory influenza vaccination policies and 4 Veterans Affairs (VA) healthcare systems with nonmandatory influenza vaccination policies.
The study included 2,304 outpatient HCP at mandatory vaccination sites and 1,759 outpatient HCP at nonmandatory vaccination sites.
To determine the incidence and duration of absenteeism in outpatient settings, HCP participating in the Respiratory Protection Effectiveness Clinical Trial at both mandatory and nonmandatory vaccination sites over 3 viral respiratory illness (VRI) seasons (2012–2015) reported their influenza vaccination status and symptomatic days absent from work weekly throughout a 12-week period during the peak VRI season each year. The adjusted effects of vaccination and other modulating factors on absenteeism rates were estimated using multivariable regression models.
The proportion of participants who received influenza vaccination was lower each year at nonmandatory than at mandatory vaccination sites (odds ratio [OR], 0.09; 95% confidence interval [CI], 0.07–0.11). Among HCP who reported at least 1 sick day, vaccinated HCP had fewer symptomatic days absent than unvaccinated HCP (OR for 2012–2013 and 2013–2014, 0.82; 95% CI, 0.72–0.93; OR for 2014–2015, 0.81; 95% CI, 0.69–0.95).
These data suggest that mandatory HCP influenza vaccination policies increase influenza vaccination rates and that HCP symptomatic absenteeism diminishes as rates of influenza vaccination increase. These findings should be considered in formulating HCP influenza vaccination policies.