Although it has been known for many decades that depressive psychopathology is common in first-episode schizophrenia spectrum disorders (FES), there is limited knowledge regarding the extent and nature of such psychopathology (degree of comorbidity, caseness, severity) and its demographic, clinical, functional and treatment correlates. This study aimed to determine the pooled prevalence of depressive disorder and caseness, and the pooled mean severity of depressive symptoms, as well as the demographic, illness, functional and treatment correlates of depressive psychopathology in FES.
This systematic review, meta-analysis and meta-regression was prospectively registered (CRD42018084856) and conducted in accordance with PRISMA and MOOSE guidelines.
Forty studies comprising 4041 participants were included. The pooled prevalence of depressive disorder and caseness was 26.0% (seven samples, N = 855, 95% CI 22.1–30.3) and 43.9% (11 samples, N = 1312, 95% CI 30.3–58.4), respectively. The pooled mean percentage of maximum depressive symptom severity was 25.1 (38 samples, N = 3180, 95% CI 21.49–28.68). Correlates of depressive psychopathology were also found.
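The pooled estimates above combine event counts across samples. As a rough illustration, a sample-size-weighted pooled proportion with a normal-approximation 95% confidence interval can be sketched as follows; note the study itself used formal random-effects meta-analysis, and the event counts below are hypothetical, not the study's data:

```python
import math

def pooled_prevalence(events, totals):
    """Crude fixed-effect pooled prevalence: total events over total N,
    with a normal-approximation 95% CI. A sketch only; formal
    meta-analysis weights studies and models between-study variance."""
    n = sum(totals)
    p = sum(events) / n
    se = math.sqrt(p * (1 - p) / n)
    return p, (p - 1.96 * se, p + 1.96 * se)

# Hypothetical counts from three samples (illustrative only)
p, ci = pooled_prevalence([20, 35, 50], [80, 120, 200])
```

A random-effects model would instead down-weight large samples when between-study heterogeneity is high, which is why the study reports sample counts alongside each pooled estimate.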
At least one-quarter of individuals with FES will experience, and therefore require treatment for, a full-threshold depressive disorder. Nearly half will experience levels of depressive symptoms that are severe enough to warrant diagnostic investigation and therefore clinical intervention – regardless of whether they actually fulfil diagnostic criteria for a depressive disorder. Depressive psychopathology is prominent in FES, manifesting not only as superimposed comorbidity, but also as an inextricable symptom domain.
The aim of this study was to describe patient-level costing methods and develop a database of healthcare resource use and cost in patients with advanced heart failure (AHF) receiving ventricular assist device (VAD) therapy.
Patient-level micro-costing was used to identify documented activity in the years preceding and following VAD implantation, and preceding heart transplant, for a cohort of seventy-seven consecutive patients listed for heart transplantation (2009–12). Clinician interviews verified activity, established the time resource required for each activity, and added additional undocumented activities. Costs were sourced from the general ledger, salary, stock price, pharmacy formulary data, and from national medical benefits and prostheses lists. Linked administrative data analyses of activity external to the implanting institution used National Weighted Activity Units (NWAU), the 2014 efficient price, and admission complexity cost weights, and were compared with micro-costed data for the implanting admission.
The database produced includes patient level activity and costs associated with the seventy-seven patients across thirteen resource areas including hospital activity external to the implanting center. The median cost of the implanting admission using linked administrative data was $246,839 (interquartile range [IQR] $246,839–$271,743), versus $270,716 (IQR $211,740–$378,482) for the institutional micro-costing (p = .08).
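Median and interquartile-range summaries like those above can be reproduced mechanically; a minimal sketch using Python's statistics module, with made-up admission costs rather than the study's data:

```python
import statistics

def median_iqr(costs):
    """Median and interquartile range of admission costs.
    statistics.quantiles with n=4 returns the three quartiles
    (default 'exclusive' method)."""
    q1, q2, q3 = statistics.quantiles(costs, n=4)
    return q2, (q1, q3)

# Illustrative admission costs in dollars (not the study cohort)
m, (lo, hi) = median_iqr([210_000, 247_000, 260_000, 271_000, 380_000])
```

Because cost data are typically right-skewed, the median/IQR summary used in the study is more robust than a mean, and a non-parametric test (as implied by the reported p = .08 comparison) is the natural companion.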
Linked administrative data provide a useful alternative for imputing costs external to the implanting center and, combined with institutional data, can illuminate both the pathways to transplant referral and the hospital activity generated by patients experiencing the terminal phases of heart failure in the year before transplant, continuous-flow VAD (cf-VAD) implantation, or death.
We present a re-analysis of the results obtained from a series of measurements on freshwater and saline ice beams under various centrifugal accelerations. The data show a strong influence of beam size, brine volume and centrifugal acceleration on the elastic modulus of ice. The data suggest a transition brine volume at around 9%, which might occur close to the melting point, at which the elastic modulus of ice drops rapidly due to a possible change of brine-pocket structure. Furthermore, for brine volumes less than 9%, there is a negligible increase in the elastic modulus measured under high centrifugal acceleration, but for brine volumes more than 9% the increase is considerable, approaching that measured with freshwater ice. This may be due to necking of brine drainage channels just above the ice/water interface at high centrifugal acceleration. A model of sea ice was constructed based on existing theories of brine inclusions in sea ice, which satisfactorily predicts the observed trends.
The objectives of this study were: 1) to evaluate whether transient ischemic attack (TIA) management in emergency departments (EDs) of the Nova Scotia Capital District Health Authority followed Canadian Best Practice Recommendations, and 2) to assess the impact of being followed up in a dedicated outpatient neurovascular clinic.
We performed a retrospective chart review of all patients discharged from EDs in our district from January 1, 2011 to December 31, 2012 with a diagnosis of TIA. Cox proportional hazards models, Kaplan-Meier survival curves, and propensity-matched analyses were used to evaluate 90-day mortality and readmission.
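The Kaplan-Meier estimator mentioned above has a simple product-limit form; a minimal sketch for distinct event times, using toy follow-up data rather than the study cohort:

```python
def kaplan_meier(times, events):
    """Product-limit survival estimate. `times` are follow-up days,
    `events` is 1 for an observed event (e.g. readmission/death) and
    0 for censoring. Assumes distinct times; ties would need grouping.
    Returns [(time, S(t))] at each event time."""
    at_risk = len(times)
    surv, out = 1.0, []
    for t, e in sorted(zip(times, events)):
        if e:
            surv *= (at_risk - 1) / at_risk  # step down at each event
            out.append((t, surv))
        at_risk -= 1  # censored subjects leave the risk set silently
    return out

# Toy data: events at days 10 and 40, censoring at days 30 and 90
curve = kaplan_meier([10, 30, 40, 90], [1, 0, 1, 0])
```

The key property, used implicitly in the 90-day analyses above, is that censored patients contribute person-time to the risk set without being counted as events.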
Of the 686 patients seen in the ED for TIA, 88.3% received computed tomography (CT) scanning, 86.3% received an electrocardiogram (ECG), 35% received vascular imaging within 24 hours of triage, 36% were seen in a neurovascular clinic, and 4.2% experienced stroke, myocardial infarction, or vascular death within 90 days. Rates of antithrombotic use were increased in patients seen in a neurovascular clinic compared to those who were not (94% v. 86.3%, p<0.0001). After adjustment for age, sex, vascular disease risk factors, and stroke symptoms, the risk of readmission for stroke, myocardial infarction, or vascular death was lower for those seen in a neurovascular clinic compared to those who were not (adjusted hazard ratio 0.28; 95% confidence interval 0.08–0.99, p=0.048).
The majority of patients in our study were treated with antithrombotic agents in the ED and investigated with CT and ECG within 24 hours; however, vascular imaging and neurovascular clinic follow-up were underutilized. For those with neurovascular clinic follow-up, there was an association with reduced risk of subsequent stroke, myocardial infarction, or vascular death.
The aim of this study was to examine whether people differed in change in performance across the first five blocks of an online flanker task, and whether those trajectories of change were associated with self-reported aerobic or resistance exercise frequency according to age. A total of 8752 men and women aged 13–89 completed a lifestyle survey and five 45-s games (each game was a block of ~46 trials) of an online flanker task. Accuracy on the congruent and incongruent flanker stimuli was analyzed using latent class and growth curve modeling, adjusting for time between blocks, whether the blocks occurred on the same or different days, education, smoking, sleep, caffeinated coffee and tea use, and Lumosity training status (“free play” or part of a “daily brain workout”). Aerobic and resistance exercise were unrelated to first-block accuracies. For the more cognitively demanding incongruent flanker stimuli, aerobic activity was positively related to the linear increase in accuracy [B = 0.577%; 95% confidence interval (CI), 0.112 to 1.25 per day above the weekly mean of 2.8 days] and inversely related to the quadratic deceleration of accuracy gains (B = −0.619%; 95% CI, −1.117 to −0.121 per day). An interaction of aerobic activity with age indicated that active participants younger than age 45 had a larger linear increase and a smaller quadratic deceleration compared to other participants. Age moderates the association between self-reported aerobic, but not self-reported resistance, exercise and changes in cognitive control that occur with practice during incongruent presentations across five blocks of a 45-s online flanker task. (JINS, 2015, 21, 802–815)
Volcanic eruptions commonly produce buoyant ash-laden plumes that rise through the stratified atmosphere. On reaching their level of neutral buoyancy, these plumes cease rising and transition to horizontally spreading intrusions. Such intrusions occur widely in density-stratified fluid environments, and in this paper we develop a shallow-layer model that governs their motion. We couple this dynamical model to a model for particle transport and sedimentation, to predict both the time-dependent distribution of ash within volcanic intrusions and the flux of ash that falls towards the ground. In an otherwise quiescent atmosphere, the intrusions spread axisymmetrically. We find that the buoyancy-inertial scalings previously identified for continuously supplied axisymmetric intrusions are not realised by solutions of the governing equations. By calculating asymptotic solutions to our model we show that the flow is not self-similar, but is instead time-dependent only in a narrow region at the front of the intrusion. This non-self-similar behaviour results in the radius of the intrusion growing with time at a rate different from that suggested previously. We also identify a transition to drag-dominated flow, which is described by a similarity solution with a distinct power-law radial growth. In the presence of an ambient wind, intrusions are not axisymmetric. Instead, they are predominantly advected downstream, while at the same time spreading laterally and thinning vertically due to persistent buoyancy forces. We show that close to the source, this lateral spreading is in a buoyancy-inertial regime, whereas far downwind, the horizontal buoyancy forces that drive the spreading are balanced by drag. Our results emphasise the important role of buoyancy-driven spreading, even at large distances from the source, in the formation of the flowing thin horizontally extensive layers of ash that form in the atmosphere as a result of volcanic eruptions.
Longitudinal, patient-level data on resource use and costs after an ischemic stroke are lacking in Canada. The objectives of this analysis were to calculate costs for the first year post-stroke and determine the impact of disability on costs.
The Economic Burden of Ischemic Stroke (BURST) Study was a one-year prospective study of a cohort of ischemic stroke patients recruited at 12 Canadian stroke centres. Clinical history, disability, health preference and resource utilization information was collected at discharge, three months, six months and one year. Resources included direct medical costs (2009 CAN$) such as emergency services, hospitalizations, rehabilitation, physician services, diagnostics, medications, allied health professional services, homecare, medical/assistive devices, changes to residence and paid caregivers, as well as indirect costs. Results were stratified by disability measured at discharge using the modified Rankin Score (mRS): non-disabling stroke (mRS 0–2) and disabling stroke (mRS 3–5).
We enrolled 232 ischemic stroke patients (age 69.4 ± 15.4 years; 51.3% male) and 113 (48.7%) were disabled at hospital discharge. The average annual cost was $74,353; $107,883 for disabling strokes and $48,339 for non-disabling strokes.
An average annual cost for ischemic stroke was calculated in which a disabling stroke was associated with a two-fold increase in costs compared to a non-disabling stroke. Costs during the hospitalization-to-three-months phase were the highest contributor to the annual cost. A “back of the envelope” calculation using 38,000 stroke admissions and the average annual cost yields $2.8 billion as the burden of ischemic stroke.
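The “back of the envelope” figure follows directly from the two numbers quoted, annual admissions multiplied by the mean annual per-patient cost:

```python
# Burden estimate: annual stroke admissions x mean annual per-patient cost,
# both taken from the abstract above.
admissions = 38_000
mean_annual_cost = 74_353  # CAN$ per patient in the first year post-stroke
burden = admissions * mean_annual_cost  # total first-year burden in CAN$
```

The product is roughly $2.8 billion, matching the abstract's rounded figure.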
Fatigue affects 33–77% of stroke survivors. There is no consensus concerning risk factors for post-stroke fatigue, perhaps reflecting the multifaceted nature of fatigue. We characterized post-stroke fatigue using the Fatigue Impact Scale (FIS), a validated questionnaire capturing the physical, cognitive, and psychosocial aspects of fatigue.
The Stroke Outcomes Study (SOS) prospectively enrolled ischemic stroke patients from 2001 to 2002. Measures collected included basic demographics, pre-morbid function (Oxford Handicap Scale, OHS), stroke severity (Stroke Severity Scale, SSS), stroke subtype (Oxfordshire Community Stroke Project Classification, OCSP), and discharge function (OHS; Barthel Index, BI). An interview was performed at 12 months evaluating function (BI; Modified Rankin Score, mRS), quality of life (Reintegration into Normal Living Scale, RNL), depression (Geriatric Depression Scale, GDS), and fatigue (FIS).
We enrolled 522 ischemic stroke patients, and 228 (57.6%) survivors completed one-year follow-up. In total, 36.8% endorsed fatigue (59.5% rated it one of their worst post-stroke symptoms). Linear regression demonstrated that younger age was associated with increased fatigue frequency (β = −0.20; p = 0.01), duration (β = −0.22; p < 0.01), and disability (β = −0.24; p < 0.01). Younger patients were more likely to describe fatigue as one of the worst symptoms post-stroke (β = −0.24; p = 0.001), and experienced greater impact on cognitive (β = −0.27; p < 0.05) and psychosocial (β = −0.27; p < 0.05) function due to fatigue. Fatigue was correlated with depressive symptoms and diminished quality of life, yet occurred without depression: 49.0% of respondents rating fatigue as one of their worst symptoms did not have an elevated GDS.
Age was the only consistent predictor of fatigue severity at one year. Younger participants experienced increased cognitive and psychosocial fatigue.
The success of central line-associated bloodstream infection (CLABSI) prevention programs in intensive care units (ICUs) has led to the expansion of surveillance at many hospitals. We sought to compare non-ICU CLABSI (nCLABSI) rates with national reports and describe methods of surveillance at several participating US institutions.
An electronic survey of 10 tertiary care hospitals about their infection surveillance practices and rate data for non-ICU patients.
In March 2011, a survey was sent to 10 medical centers. The survey consisted of 12 questions regarding demographics and CLABSI surveillance methodology for non-ICU patients at each center. Participants were also asked to provide available rate and device utilization data.
Hospitals ranged in size from 238 to 1,400 total beds (median, 815). All hospitals reported using Centers for Disease Control and Prevention (CDC) definitions. Denominators were collected by different means: counting patients with central lines every day (n = 5), indirectly estimating on the basis of electronic orders (n = 4), or another automated method (n = 1). Rates of nCLABSI ranged from 0.2 to 4.2 infections per 1,000 catheter-days (median, 2.5). The national rate reported by the CDC using 2009 data from the National Healthcare Safety Network was 1.14 infections per 1,000 catheter-days.
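The rates quoted above use the standard per-1,000 catheter-day denominator; the conversion is a one-liner, shown here with illustrative counts rather than figures from the survey:

```python
def clabsi_rate(infections, catheter_days):
    """CLABSI rate expressed per 1,000 central-line days, the
    denominator convention used in device-associated surveillance."""
    return infections / catheter_days * 1000

# Illustrative: 12 infections over 4,800 catheter-days
rate = clabsi_rate(12, 4800)  # 2.5 per 1,000 catheter-days
```

Using device-days rather than patient-days as the denominator is what makes rates comparable across hospitals with very different central-line utilization.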
Only 2 hospitals were below the pooled CLABSI rate for inpatient wards; all others exceeded this rate. Possible explanations include differences in average central line utilization or hospital size, in the impact of certain clinical risk factors notably absent from the definition, and in interpretation and reporting practices. Further investigation is necessary to determine whether the national benchmarks are low or whether the hospitals surveyed here represent a selection of outliers.