The timely identification and treatment of psychosis is increasingly the focus of early intervention, with research targeting the initial high-risk period in the months following first-episode hospitalization. However, ongoing psychiatric treatment and service utilization in the years after symptoms have stabilized following a first episode have received less research attention.
To model the variables predicting continued service utilization with psychiatrists among adolescents following first-episode psychosis, and to examine associated temporal patterns in continued psychiatric service utilization.
This study utilized a cohort design to assess adolescents (mean age 14.4 ± 2.5 years) discharged following their index hospitalization for first-episode psychosis. Bivariate analyses were conducted on predictor variables associated with psychiatric service utilization. All significant predictor variables were included in a logistic regression model.
Variables that were significantly associated with psychiatric service utilization included: diagnosis with a schizophrenia spectrum disorder rather than a major mood disorder with psychotic features (OR = 24.0; P = 0.02), a first-degree relative with depression (OR = 0.12; P = 0.05), and months since last psychiatric inpatient discharge (OR = 0.92; P = 0.02). Further examination of time since last hospitalization found that all adolescents continued service utilization up to 18 months post-discharge.
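As a hedged illustration of how the reported odds ratios relate to logistic-regression coefficients (a sketch with hypothetical worked arithmetic, not the study's data or model):

```python
import math

# Odds ratios reported in the abstract; in a logistic regression each
# OR equals exp(coefficient), so the coefficient is its natural log.
reported_or = {
    "schizophrenia_spectrum_dx": 24.0,       # vs. major mood disorder with psychotic features
    "first_degree_relative_depression": 0.12,
    "months_since_discharge": 0.92,          # per additional month
}
coefficients = {name: math.log(or_) for name, or_ in reported_or.items()}

# An OR < 1 lowers the odds of continued utilization: each extra month
# since discharge multiplies the odds by 0.92, so after six months the
# odds are 0.92**6 (roughly 0.61) of baseline -- hypothetical arithmetic.
odds_multiplier_6_months = 0.92 ** 6
```

The variable names above are illustrative labels, not variables from the study's dataset.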
Key findings highlight the importance of early diagnosis, that a first-degree relative with depression may negatively influence the adolescent's ongoing service utilization, and that 18 months post-discharge may be a critical time to review current treatment strategies and collaborate with youth and families to ensure that services continue to meet their needs.
Disclosure of interest
The authors have not supplied their declaration of competing interest.
The Sort, Access, Life-saving interventions, Treatment and/or Triage (SALT) mass-casualty incident (MCI) algorithm is unique in that it includes two subjective questions during the triage process: “Is the victim likely to survive given the resources?” and “Is the injury minor?”
Given this subjectivity, it was hypothesized that as casualties increase, the inter-rater reliability (IRR) of the tool would decline, due to an increase in the number of patients triaged as Minor and Expectant.
A pre-collected dataset of pediatric trauma patients age <14 years from a single Level 1 trauma center was used to generate “patients.” Three trained raters triaged each patient using SALT as if they were in each of the following scenarios: 10, 100, and 1,000 victim MCIs. Cohen’s kappa test was used to evaluate IRR between the raters in each of the scenarios.
A total of 247 patients were available for triage. The kappas were consistently “poor” to “fair:” 0.37 to 0.59 in the 10-victim scenario; 0.13 to 0.36 in the 100-victim scenario; and 0.05 to 0.36 in the 1,000-victim scenario. There was an increasing percentage of subjects triaged Minor as the number of estimated victims increased: 27.8% increase from 10- to 100-victim scenario and 7.0% increase from 100- to 1,000-victim scenario. Expectant triage categorization of patients remained stable as victim numbers increased.
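Cohen's kappa, the IRR statistic used in this study, compares observed agreement between two raters against the agreement expected by chance. A minimal sketch (the SALT categories are real; the ratings below are invented for illustration, not the study's data):

```python
from collections import Counter

def cohens_kappa(rater_a, rater_b):
    """Cohen's kappa for two raters over the same items (nominal categories)."""
    assert len(rater_a) == len(rater_b)
    n = len(rater_a)
    # Observed proportion of items on which the raters agree.
    observed = sum(a == b for a, b in zip(rater_a, rater_b)) / n
    # Chance agreement: product of each rater's marginal category frequencies.
    counts_a, counts_b = Counter(rater_a), Counter(rater_b)
    expected = sum(counts_a[c] * counts_b[c] for c in counts_a) / (n * n)
    return (observed - expected) / (1 - expected)

# Hypothetical SALT triage assignments for six "patients" by two raters.
a = ["Minor", "Delayed", "Minor", "Immediate", "Minor",   "Expectant"]
b = ["Minor", "Minor",   "Minor", "Immediate", "Delayed", "Expectant"]
kappa = cohens_kappa(a, b)  # approximately 0.5 for these invented ratings
```

By the conventional benchmarks cited in the abstract, a kappa near 0.5 would fall in the "fair" to "moderate" range.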
Overall, SALT demonstrated poor IRR in this study of increasing casualty counts while triaging pediatric patients. Increased casualty counts in the scenarios did lead to increased Minor but not Expectant categorizations.
Recent investigations suggest that cerebrovascular reactivity (CVR) is impaired in Alzheimer’s disease (AD) and may underpin part of the disease’s neurovascular component. However, our understanding of the relationship between the magnitude of CVR, the speed of cerebrovascular response, and the progression of AD is still limited. This is especially true in patients with mild cognitive impairment (MCI), which is recognized as an intermediate stage between normal aging and dementia. The purpose of this study was to investigate AD and MCI patients by mapping repeatable and accurate measures of cerebrovascular function, namely the magnitude and speed of cerebrovascular response (τ) to a vasoactive stimulus in key predilection sites for vascular dysfunction in AD.
Thirty-three subjects (age range: 52–83 years, 20 males) were prospectively recruited. CVR and τ were assessed using blood oxygen level-dependent MRI during a standardized carbon dioxide stimulus. Temporal and parietal cortical regions of interest (ROIs) were generated from anatomical images using the FreeSurfer image analysis suite.
Of 33 subjects recruited, 3 individuals were excluded, leaving 30 subjects for analysis, consisting of 6 individuals with early AD, 11 individuals with MCI, and 13 older healthy controls (HCs). τ was found to be significantly higher in the AD group compared to the HC group in both the temporal (p = 0.03) and parietal cortex (p = 0.01) following a one-way ANCOVA correcting for age and microangiopathy scoring and a Bonferroni post-hoc correction.
The study findings suggest that AD is associated with a slowing of the cerebrovascular response in the temporal and parietal cortices.
The dissolution of the United Kingdom’s vitrified high-level-waste simulant, CaZn MW28, was investigated following the Product Consistency Test-B protocol for 112 d at 90 °C and in ultra-high-quality water. Residual-rate dissolution (Stage II) and rate resumption (Stage III) after 28 d were observed. Thermodynamic modelling suggested that solutions were saturated with respect to Mg- and Zn-bearing phases, and the presence of Mg- and Zn-smectite clays was tentatively observed. The formation of these phases was concurrent with a significant increase in the dissolution rate, similar to Stage III behavior seen in other nuclear waste simulant glass materials, indicating that the addition of Mg and Zn to high-level-waste glass (7.3 wt. % combined) significantly influences the dissolution rate.
The COllaborative project of Development of Anthropometrical measures in Twins (CODATwins) project is a large international collaborative effort to analyze individual-level phenotype data from twins in multiple cohorts from different environments. The main objective is to study factors that modify genetic and environmental variation of height, body mass index (BMI, kg/m2) and size at birth, and additionally to address other research questions such as long-term consequences of birth size. The project started in 2013 and is open to all twin projects in the world having height and weight measures on twins with information on zygosity. Thus far, 54 twin projects from 24 countries have provided individual-level data. The CODATwins database includes 489,981 twin individuals (228,635 complete twin pairs). Since many twin cohorts have collected longitudinal data, there is a total of 1,049,785 height and weight observations. For many cohorts, we also have information on birth weight and length, own smoking behavior and own or parental education. We found that the heritability estimates of height and BMI systematically changed from infancy to old age. Remarkably, only minor differences in the heritability estimates were found across cultural–geographic regions, measurement time and birth cohort for height and BMI. In addition to genetic epidemiological studies, we looked at associations of height and BMI with education, birth weight and smoking status. Within-family analyses examined differences within same-sex and opposite-sex dizygotic twins in birth size and later development. The CODATwins project demonstrates the feasibility and value of international collaboration to address gene-by-exposure interactions that require large sample sizes and address the effects of different exposures across time, geographical regions and socioeconomic status.
Ecosystem services typically benefit multiple groups of people. However, natural resource management decisions aiming to secure ecosystem services for one beneficiary group rarely consider potential consequences for others. Here, we examine records of moose hunting in Vermont, USA, a recreational ecosystem service with at least two beneficiary groups: hunters, who benefit from recreational experiences and moose meat, and residents, who live in hunting areas and benefit from hunters’ expenditures. We ask how the allocation of hunting permits has affected (1) the total number of hunters and therefore the benefits enjoyed by this group, (2) the benefits residents received, and (3) the spatial distribution of benefits for each group. We found that changes in the allocation of permits had heterogeneous effects on the beneficiaries. For example, increasing the number of hunting permits increased the total number of hunters, but not necessarily the number of residents who potentially benefit. Also, a more balanced distribution of permits across Vermont increased the total number of potentially benefiting residents, but not those from lower socio-economic groups. Understanding these differences and interactions between beneficiary groups is necessary to distribute benefits equitably amongst them.
Background: Cervical spondylotic myelopathy (CSM) may present with neck and arm pain. This study investigates the change in neck/arm pain post-operatively in CSM. Methods: This ambispective study enrolled 402 patients through the Canadian Spine Outcomes and Research Network. Outcome measures were the visual analogue scales for neck and arm pain (VAS-NP and VAS-AP) and the Neck Disability Index (NDI). The thresholds for minimum clinically important differences (MCIDs) for VAS-NP and VAS-AP were determined to be 2.6 and 4.1, respectively. Results: VAS-NP improved from a mean of 5.6±2.9 to 3.8±2.7 at 12 months (P<0.001). VAS-AP improved from 5.8±2.9 to 3.5±3.0 at 12 months (P<0.001). The MCIDs for VAS-NP and VAS-AP were also reached at 12 months. Based on the NDI, patients were grouped into those with mild/no pain (33%) versus moderate/severe pain (67%). At 3 months, a significantly greater proportion of patients with moderate/severe pain (45.8%) demonstrated an improvement into mild/no pain, whereas 27.2% with mild/no pain demonstrated worsening into moderate/severe pain (P<0.001). At 12 months, 17.4% with mild/no pain experienced worsening of their NDI (P<0.001). Conclusions: This study suggests that neck and arm pain responds to surgical decompression in patients with CSM and reaches the MCIDs for VAS-AP and VAS-NP at 12 months.
Background: Buprenorphine/naloxone (bup/nal) is a partial opioid agonist/antagonist and a recommended first-line treatment for opioid use disorder (OUD). Emergency departments (EDs) are a key point of contact with the healthcare system for patients living with OUD. Aim Statement: We implemented a multi-disciplinary quality improvement project to screen patients for OUD, initiate bup/nal for eligible individuals, and provide rapid next-business-day walk-in referrals to addiction clinics in the community. Measures & Design: From May to September 2018, our team worked with three ED sites and three addiction clinics to pilot the program. Implementation involved alignment with regulatory requirements, physician education, coordination with pharmacy to ensure in-ED medication access, and nurse education. The project is supported by a full-time project manager, data analyst, operations leaders, physician champions, provincial pharmacy, and the Emergency Strategic Clinical Network leadership team. For our pilot, our evaluation objective was to determine the degree to which our initiation and referral pathway was being utilized. We used administrative data to track the number of patients given bup/nal in the ED, their demographics, and whether they continued to fill bup/nal prescriptions 30 days after their ED visit. Addiction clinics reported both the number of patients referred to them and the number of patients attending their referral. Evaluation/Results: Administrative data show 568 opioid-related visits to ED pilot sites during the pilot phase. Bup/nal was given to 60 unique patients in the ED during 66 unique visits. There were 32 (53%) male patients and 28 (47%) female patients. Median patient age was 34 (range: 21 to 79). ED visits where bup/nal was given had a median length of stay of 6 hours 57 minutes (IQR: 6 hours 20 minutes) and Canadian Triage and Acuity Scale scores as follows: Level 1 – 1 (2%), Level 2 – 21 (32%), Level 3 – 32 (48%), Level 4 – 11 (17%), Level 5 – 1 (2%).
51 (77%) of these visits led to discharge. 24 (47%) discharged patients given bup/nal in ED continued to fill bup/nal prescriptions 30 days after their index ED visit. EDs also referred 37 patients with OUD to the 3 community clinics, and 16 of those individuals (43%) attended their first follow-up appointment. Discussion/Impact: Our pilot project demonstrates that with dedicated resources and broad institutional support, ED patients with OUD can be appropriately initiated on bup/nal and referred to community care.
Distinguishing a disorder of persistent and impairing grief from normative grief allows clinicians to identify this often undetected and disabling condition. As four diagnostic criteria sets for a grief disorder have been proposed, their similarities and differences need to be elucidated.
Participants were family members bereaved by US military service death (N = 1732). We conducted analyses to assess the accuracy of each criteria set in identifying threshold cases (participants who endorsed baseline Inventory of Complicated Grief ⩾30 and Work and Social Adjustment Scale ⩾20) and excluding those below this threshold. We also calculated agreement among criteria sets by varying numbers of required associated symptoms.
All four criteria sets accurately excluded participants below our identified clinical threshold (i.e. correctly excluding 86–96% of those subthreshold), but they varied in identification of threshold cases (i.e. correctly identifying 47–82%). When the number of associated symptoms was held constant, criteria sets performed similarly. Accurate case identification was optimized when one or two associated symptoms were required. When employing optimized symptom numbers, pairwise agreements among criteria became correspondingly ‘very good’ (κ = 0.86–0.96).
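The reported percentages correspond to sensitivity (the proportion of threshold cases a criteria set correctly identifies) and specificity (the proportion of subthreshold participants it correctly excludes). A hedged sketch with invented labels, not the study's data:

```python
def sensitivity_specificity(predicted, actual):
    """predicted/actual: lists of booleans (True = meets the clinical threshold)."""
    tp = sum(p and a for p, a in zip(predicted, actual))           # true positives
    tn = sum((not p) and (not a) for p, a in zip(predicted, actual))  # true negatives
    fp = sum(p and (not a) for p, a in zip(predicted, actual))     # false positives
    fn = sum((not p) and a for p, a in zip(predicted, actual))     # false negatives
    return tp / (tp + fn), tn / (tn + fp)

# Hypothetical participants: "actual" marks those above the study's clinical
# threshold (ICG >= 30 and WSAS >= 20); "predicted" marks those a criteria
# set would diagnose. Both lists are invented for illustration.
actual    = [True, True, True, False, False, False, False, True]
predicted = [True, False, True, False, False, True, False, True]
sens, spec = sensitivity_specificity(predicted, actual)
```

In the abstract's terms, the criteria sets shared high specificity (86–96%) but varied widely in sensitivity (47–82%).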
The four proposed criteria sets describe a similar condition of persistent and impairing grief, but differ primarily in criteria restrictiveness. Diagnostic guidance for prolonged grief disorder in the International Classification of Diseases, 11th Revision (ICD-11) functions well, whereas the criteria put forth in Section III of the Diagnostic and Statistical Manual of Mental Disorders, Fifth Edition (DSM-5) are unnecessarily restrictive.
The last 12 years have seen the evolution of a new funding regime under the supervision of the Pensions Regulator. Over this period, there has been significant turbulence in financial markets, including record low interest rates. This paper takes a critical look at the development of funding approaches and methodologies over this period. It analyses the Pensions Regulator guidance and how scheme specific actuarial methods have emerged since the move away from the Minimum Funding Requirement in 2001 and the introduction of the Scheme Specific Funding Requirements in 2005. It asks whether these new methodologies have been successful from the perspective of members, trustees, employers and shareholders. At a time when actuarial valuation methodologies have faced considerable criticism, this paper aims to propose a pension funding methodology which is fit for purpose and also reflects the latest guidance from the Pensions Regulator on integrated risk management.
The forward rate of dissolution of the International Simple Glass (ISG) was determined under alkaline conditions at 40 °C using the Single Pass Flow Through (SPFT) method. Forward rates were consistent with those obtained in the literature for this glass composition. The formation of altered gel layers and surface pits was observed on the surface of glass particles, especially at the highest pH values, despite the application of high flow rates to prevent the build-up of solubility-limiting phases. These features could be attributed to preferential localized dissolution at sites with a higher alkali concentration or from a separate, less durable, vitreous phase. These results may indicate that surface pit and altered gel formation occurs under the forward rate of dissolution as imposed by the SPFT method, particularly for simplified borosilicate glass materials.
We analyzed intestinal contents of two late-glacial mastodons preserved in lake sediments in Ohio (Burning Tree mastodon) and Michigan (Heisler mastodon). A multi-proxy suite of macrofossils and microfossils provided unique insights into what these individuals had eaten just before they died and added significantly to knowledge of mastodon diets. We reconstructed the mastodons’ habitats with similar multi-proxy analyses of the embedding lake sediments. Non-pollen palynomorphs, especially spores of coprophilous fungi, differentiated intestinal and environmental samples. The Burning Tree mastodon gut sample originates from the small intestine. The Heisler mastodon sample is part of the large intestine, to which humans had added clastic material to anchor parts of the carcass under water to cache the meat. Both carcasses had been dismembered, suggesting that the mastodons had been hunted or scavenged, in line with other contemporaneous mastodon finds and the timing of early human incursion into the Midwest. Both mastodons lived in mixed coniferous-deciduous late-glacial forests. They browsed tree leaves and twigs, especially Picea. They also ate sedge-swamp plants and drank the lake water. Our multi-proxy estimates for a spring/summer season of death contrast with autumn estimates derived from prior tusk analyses. We document the recovered fossil remains with photographs.
Given the history of the International Criminal Court in Africa, the relationship between African states and the Court is particularly significant to its legitimacy. If the power of the Court is grounded in international political support and the perception that it transcends international and national politics to deny impunity for ‘atrocity’ crimes, the Court's perceived legitimacy and normative legitimacy are so intertwined that charges of illegitimacy from significant regional stakeholders hold particular weight. More importantly, criticisms voiced by African actors point to a valid challenge to the Court's legitimate moral standing as an arbiter of global justice: the international power imbalance that seems to be becoming more entrenched and apparent in the Court's work. Tactics adopted by some African leaders of prioritising the issue of heads-of-state immunity, however, minimise the broader issue of power differentials and reduce the chance that African states will find allies in their cause to challenge the Court's operations.
The intent of this study was to determine whether there are differences in disaster preparedness between urban and rural community hospitals across New York State.
A descriptive and analytical cross-sectional survey of 207 community hospitals; 35 questions evaluated six disaster preparedness elements: disaster plan development, on-site surge capacity, available materials and resources, disaster education and training, disaster preparedness funding levels, and perception of disaster preparedness.
Completed surveys were received from 48 urban hospitals and 32 rural hospitals. There were differences in disaster preparedness between urban and rural hospitals with respect to disaster plan development, on-site surge capacity, available materials and resources, disaster education and training, and perception of disaster preparedness. No difference was identified between these hospitals with respect to disaster preparedness funding levels.
The results of this study provide an assessment of the current state of disaster preparedness in urban and rural community hospitals in New York. Differences in preparedness between the two settings may reflect differing priorities with respect to perceived threats, as well as opportunities for improvement that may require additional advocacy and legislation. (Disaster Med Public Health Preparedness. 2019;13:424-428)