Emergency psychiatric care, unplanned hospital admissions, and inpatient health care are the costliest forms of mental health care. According to Statistics Canada (2018), almost 18% (5.3 million) of Canadians reported needing mental health support; however, only just over half of them (56.2%) reported that their needs were fully met. To further expand capacity and access to mental health care in the province, Nova Scotia Health has launched a novel mental health initiative, the Rapid Access and Stabilization Program (RASP).
Objectives
This study evaluates the effectiveness and impact of the RASP on high-cost health services utilization (e.g. ED visits, mobile crisis visits, and inpatient treatments) and related costs. It also assesses healthcare partners’ (e.g. healthcare providers, policymakers, community leaders) perceptions and patient experiences and satisfaction with the program and identifies sociodemographic characteristics, psychological conditions, recovery, well-being, and risk measures in the assisted population.
Methods
This is a hypothesis-driven program evaluation study that employs a mixed methods approach. A within-subject comparison will examine health services utilization data from patients attending RASP, one year before and one year after their psychiatry assessment at the program. A controlled between-subject comparison will use historical data from a control population to examine whether possible changes in high-cost health services utilization are associated with the intervention (RASP). The primary analysis involves extracting secondary data from provincial information systems, electronic medical records, and regular self-reported clinical assessments. Additionally, a qualitative sub-study will examine patient experience and satisfaction, and examine health care partners’ impressions.
Results
The results for the primary, secondary, and qualitative outcome measures are expected to be available within 6 months of study completion. We expect the RASP evaluation findings to demonstrate a minimum 10% reduction in high-cost health services utilization, a corresponding 10% cost saving, and a reduction in wait times for patient consultations with psychiatrists to less than 30 calendar days. In addition, we anticipate that patients, healthcare providers, and healthcare partners will express high levels of satisfaction with the new service.
Conclusions
This study will demonstrate the results of the Mental Health and Addictions Program (MHAP) efforts to provide stepped-care, particularly community-based support, to individuals with mental illnesses. Results will provide new insights into a novel community-based approach to mental health service delivery and contribute to knowledge on how to implement mental health programs across varying contexts.
Poor mental health among university students is a growing public health concern. Indeed, academic settings may exacerbate students’ vulnerability to mental health issues. Nonetheless, university students are often unable to seek mental health support because of barriers at both the individual and organisational levels. Digital technologies have proved effective in collecting health-related information and in managing psychological distress, making them useful instruments for tackling mental health needs, especially given their accessibility and cost-effectiveness.
Objectives
Although digital tools are recognised to be useful for mental health support, university students’ opinions and experiences related to such interventions are still to be explored. In this qualitative research, we aimed to address this gap in the scientific literature.
Methods
Data were drawn from “the CAMPUS study”, which longitudinally assesses students’ mental health at the University of Milano-Bicocca (Italy) and the University of Surrey (United Kingdom). We performed detailed interviews and analysed the main themes of the transcripts. We also performed a cross-cultural comparison between Italy and the United Kingdom.
Results
Across 33 interviews, five themes were identified, and an explanatory model was developed. From the students’ perspective, social media, podcasts, and apps could be sources of significant mental health content. On the one hand, students recognised wide availability and anonymity as advantages that make digital technologies suitable for primary to tertiary prevention, to reduce mental health stigma, and as an extension of face-to-face interventions. On the other hand, perceived disadvantages were lower efficacy compared to in-person approaches, lack of personalisation, and difficulties in engagement. Students’ opinions and perspectives could be widely influenced by cultural and individual background.
Conclusions
Digital tools may be an effective option to address mental health needs of university students. Since face-to-face contact remains essential, digital interventions should be integrated with in-person ones, in order to offer a multi-modal approach to mental well-being.
Aluminum-substituted hematites (Fe2−xAlxO3) were synthesized from Fe-Al coprecipitates at pH 5.5, 7.0, and in 10−1, 10−2, and 10−2 M KOH at 70°C. As little as 1 mole % Al suppressed goethite completely at pH 7 whereas in KOH higher Al concentrations were necessary. Al substitution as determined chemically and by XRD line shift was related to Al addition up to a maximum of 16–17 mole %. The relationship between the crystallographic a0 parameter and Al substitution deviated from the Vegard rule. At low substitution crystallinity of the hematites was improved whereas higher substitution impeded crystal growth in the crystallographic z-direction as indicated by differential XRD line broadening. At still higher Al addition crystal growth was strongly retarded. The initial Al-Fe coprecipitate behaved differently from a mechanical mixture of the respective “hydroxides” and was, therefore, considered an aluminous ferrihydrite.
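The deviation from the Vegard rule noted above can be made concrete with a short sketch. Vegard's rule predicts a linear variation of the unit-cell edge a0 with substitution between the two end members; the hematite (α-Fe2O3) and corundum (α-Al2O3) a0 values below are assumed literature figures for illustration, not values from this study:

```python
A0_HEMATITE = 5.038  # Å, α-Fe2O3 cell edge (assumed literature value)
A0_CORUNDUM = 4.759  # Å, α-Al2O3 cell edge (assumed literature value)

def vegard_a0(mole_fraction_al):
    """Vegard prediction: linear interpolation between end-member cell edges."""
    x = mole_fraction_al
    return (1 - x) * A0_HEMATITE + x * A0_CORUNDUM

def vegard_deviation(measured_a0, mole_fraction_al):
    """Positive result: the measured cell edge lies above the Vegard line."""
    return measured_a0 - vegard_a0(mole_fraction_al)
```

A measured a0 that sits above or below `vegard_a0(x)` at a given substitution level is the kind of deviation the abstract reports.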
Analysis of the Mössbauer effect in layer silicates provides a spectroscopic method for determining the valence and coordination of iron. In this study, Mössbauer spectra were obtained for amesite, cronstedtite, nontronite, two glauconites, biotite, lepidomelane, chlorite, minnesotaite, vermiculite, stilpnomelane, and chloritoid.
Trivalent iron was detected in tetrahedral coordination. Abundant trivalent iron in octahedral coordination apparently causes quadrupole splitting values of divalent iron in the same mineral to decrease. This phenomenon was noted in cronstedtite and glauconite. In cases where divalent iron predominates in the mineral, the quadrupole splitting is larger. It is generally accepted that ferrous iron is largely in octahedral coordination. This suggests that the octahedral sites may be more distorted when ferric iron is present in the octahedral sheet. In biotite, quadrupole splitting of divalent iron is decreased when trivalent iron is present in tetrahedral sheets. This suggests that there is also more distortion in the octahedral sheet because of iron in tetrahedral positions.
Background: Carotid body tumours (CBT) are rare neoplasms of the paraganglia at the carotid bifurcation. Histopathologic analysis alone is insufficient to confirm malignancy, requiring metastases to non-neuroendocrine tissue including cervical lymph nodes for definitive diagnosis. The role of selective neck dissection (SND) during CBT surgeries in detecting malignancy and guiding subsequent management remains uncertain. Methods: A retrospective case series was performed on all patients undergoing CBT surgeries with SND between 2002 and 2022. Data collection included demographics, genetic and laboratory testing, imaging, intra- and post-operative complications, follow-up and histopathology. Results: Twenty-one patients underwent CBT resection with SND. Of these, 3 had carotid artery injuries, and 5 had nerve injuries. One patient experienced peri-operative embolic strokes, presumed related to tumour embolization. Three patients were found to have lymph node involvement, confirming malignancy. Malignancy was significantly associated with the risk of carotid injury (p = 0.04). Conclusions: SND is a useful adjunct in detecting malignancy during CBT resection. The incidence of malignancy in CBT is low but not negligible and SND should be considered in patients with suspected malignancy or high-risk factors. This study’s 14% incidence of malignancy suggests there may be a rationale for considering universal implementation of SND during CBT resection.
Hard-to-treat childhood cancers are those where standard treatment options do not exist and the prognosis is poor. Healthcare professionals (HCPs) are responsible for communicating with families about prognosis and complex experimental treatments. We aimed to identify HCPs’ key challenges and skills required when communicating with families about hard-to-treat cancers and their perceptions of communication-related training.
Methods
We interviewed Australian HCPs who had direct responsibilities in managing children/adolescents with hard-to-treat cancer within the past 24 months. Interviews were analyzed using qualitative content analysis.
Results
We interviewed 10 oncologists, 7 nurses, and 3 social workers. HCPs identified several challenges for communication with families, including: balancing information provision while maintaining realistic hope; managing their own uncertainty; and nurses and social workers being underutilized during conversations with families, despite widespread preferences for multidisciplinary teamwork. HCPs perceived that making themselves available to families, empowering them to ask questions, and repeating information helped to establish and maintain trusting relationships with families. Half the HCPs reported receiving no formal training for communicating prognosis and treatment options with families of children with hard-to-treat cancers. Nurses, social workers, and less experienced oncologists supported the development of communication training resources, more so than more experienced oncologists.
Significance of results
Resources are needed which support HCPs to communicate with families of children with hard-to-treat cancers. Such resources may be particularly beneficial for junior oncologists and other HCPs during their training, and they should aim to prepare them for common challenges and foster greater multidisciplinary collaboration.
We demonstrate the importance of radio selection in probing heavily obscured galaxy populations. We combine Evolutionary Map of the Universe (EMU) Early Science data in the Galaxy and Mass Assembly (GAMA) G23 field with the GAMA data, providing optical photometry and spectral line measurements, together with Wide-field Infrared Survey Explorer (WISE) infrared (IR) photometry, providing IR luminosities and colours. We investigate the degree of obscuration in star-forming galaxies, based on the Balmer decrement (BD), and explore how this trend varies, over a redshift range of $0<z<0.345$. We demonstrate that the radio-detected population has on average higher levels of obscuration than the parent optical sample, arising through missing the lowest BD and lowest mass galaxies, which are also the lower star formation rate (SFR) and metallicity systems. We discuss possible explanations for this result, including speculation around whether it might arise from steeper stellar initial mass functions in low mass, low SFR galaxies.
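As a minimal sketch of the Balmer decrement (BD) approach to quantifying obscuration described above, the snippet below converts an observed Hα/Hβ flux ratio into a colour excess E(B−V) and a dust-corrected line flux. The Case B intrinsic ratio of 2.86 and the Calzetti-style k(λ) values are standard assumptions for illustration, not numbers taken from this study:

```python
import math

# Assumed attenuation-curve values at the Balmer lines (Calzetti-like).
K_HBETA = 3.61    # k(4861 Å)
K_HALPHA = 2.53   # k(6563 Å)
INTRINSIC_BD = 2.86  # Case B recombination ratio Hα/Hβ

def ebv_from_balmer_decrement(f_halpha, f_hbeta):
    """Colour excess E(B-V) implied by the observed Balmer decrement.
    Clipped at zero: a decrement below 2.86 implies no measurable dust."""
    bd = f_halpha / f_hbeta
    return max(0.0, 2.5 / (K_HBETA - K_HALPHA) * math.log10(bd / INTRINSIC_BD))

def dust_corrected_flux(flux, k_lambda, ebv):
    """Correct an emission-line flux for attenuation given k(λ) and E(B-V)."""
    return flux * 10 ** (0.4 * k_lambda * ebv)
```

A galaxy with BD = 4.0, for example, yields E(B−V) ≈ 0.34 under these assumptions, and its Hα flux would be corrected upward by roughly a factor of two.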
Various water-based heater-cooler devices (HCDs) have been implicated in nontuberculous mycobacteria (NTM) outbreaks. Ongoing rigorous surveillance for healthcare-associated M. abscessus (HA-Mab), put in place following a prior institutional outbreak of M. abscessus, alerted investigators to a cluster of 3 extrapulmonary M. abscessus infections among patients who had undergone cardiothoracic surgery.
Methods:
Investigators convened a multidisciplinary team and launched a comprehensive investigation to identify potential sources of M. abscessus in the healthcare setting. Adherence to tap water avoidance protocols during patient care and HCD cleaning, disinfection, and maintenance practices were reviewed. Relevant environmental samples were obtained. Patient and environmental M. abscessus isolates were compared using multilocus-sequence typing and pulsed-field gel electrophoresis. Smoke testing was performed to evaluate the potential for aerosol generation and dispersion during HCD use. The entire HCD fleet was replaced to mitigate continued transmission.
Results:
Clinical presentations of case patients and epidemiologic data supported intraoperative acquisition. M. abscessus was isolated from HCDs used on patients and molecular comparison with patient isolates demonstrated clonality. Smoke testing simulated aerosolization of M. abscessus from HCDs during device operation. Because the HCD fleet was replaced, no additional extrapulmonary HA-Mab infections due to the unique clone identified in this cluster have been detected.
Conclusions:
Despite adhering to HCD cleaning and disinfection strategies beyond manufacturer instructions for use, HCDs became colonized with and ultimately transmitted M. abscessus to 3 patients. Design modifications to better contain aerosols or filter exhaust during device operation are needed to prevent NTM transmission events from water-based HCDs.
Recent studies have sought to understand the epidemiology and impact of beta-lactam allergy labels on children; however, most of these studies have focused on penicillin allergy labels. Fewer studies assess cephalosporin antibiotic allergy labels in children. The objective of this study was to determine the prevalence, factors associated with, and impact of cephalosporin allergy labels in children cared for in the primary care setting.
Methods:
Cephalosporin allergy labels were reviewed among children in a dual center, retrospective, birth cohort who were born between 2010 and 2020 and followed in 90 pediatric primary care practices. Antibiotic prescriptions for acute otitis media were compared in children with and without cephalosporin allergies.
Results:
The birth cohort comprised 334,465 children, of whom 2,877 (0.9%) were labeled as cephalosporin allergic during the study period, at a median age of 1.6 years. Third-generation cephalosporins were the most common class of cephalosporin allergy (83.0%). Cephalosporin allergy labels were more common in children with penicillin allergy labels than in those without (5.8% vs. 0.6%). Other factors associated with a cephalosporin allergy label included white race, private insurance, presence of a chronic condition, and increased health care utilization. Children with third-generation cephalosporin allergy labels received more amoxicillin/clavulanate (28.8% vs. 10.2%) and macrolides (10.4% vs. 1.9%) and less amoxicillin (55.8% vs. 70.9%) for treatment of acute otitis media than non-allergic peers (p < 0.001).
Conclusions:
One in 100 children is labeled as cephalosporin allergic, and these children receive different antibiotics for the treatment of acute otitis media compared to non-allergic peers.
Key theoretical frameworks have proposed that examining the impact of exposure to specific dimensions of stress at specific developmental periods is likely to yield important insight into processes of risk and resilience. Utilizing a sample of N = 549 young adults who completed an online survey providing a detailed retrospective history of their lifetime exposure to numerous dimensions of traumatic stress, together with ratings of their current trauma-related symptomatology, we test whether an individual’s perception of their lifetime stress as either controllable or predictable buffered the impact of exposure on trauma-related symptomatology assessed in adulthood. Further, we tested whether this moderation effect differed when evaluated in the context of early childhood, middle childhood, adolescence, and young adulthood stress. Consistent with hypotheses, results highlight both stressor controllability and stressor predictability as buffering the impact of traumatic stress exposure on trauma-related symptomatology, and suggest that the potency of this buffering effect varies across unique developmental periods. Leveraging dimensional ratings of lifetime stress exposure to probe heterogeneity in outcomes following stress, and, critically, considering interactions between dimensions of exposure and the developmental period when stress occurred, is likely to yield increased understanding of risk and resilience following traumatic stress.
The worldwide spread of the COVID-19 pandemic affected all major sectors, including higher education. The measures to contain this deadly disease led to the closure of universities across the globe, introducing several changes in students’ academic and social experience. During the last two years, self-isolation together with the difficulties linked to online teaching and learning, have amplified psychological burden and mental health vulnerability of students.
Objectives
We aimed to explore in depth students’ feelings and perspectives regarding the impact of the COVID-19 on their mental health and to compare these data among students from Italy and the UK.
Methods
Data were resulting from the qualitative arm of “the CAMPUS study”, a large ongoing project to longitudinally assess the mental health of university students enrolled at the University of Milano-Bicocca (Unimib, Italy) and the University of Surrey (UoS, Guildford, UK). We conducted in-depth interviews through the Microsoft Teams online platform between September 2021 and April 2022, and thematically analysed the transcripts.
Results
A total of 33 students (15 from Unimib and 18 from UoS), with a wide range of sociodemographic characteristics, were interviewed. Four themes were identified: i) impact of COVID-19 on students’ mental health; ii) causes of poor mental health; iii) most vulnerable subgroups; iv) coping strategies.
Anxiety symptoms, social anxiety, and stress were frequently reported as negative effects of the pandemic, while the main sources of poor mental health were identified as loneliness, excessive time spent online, unhealthy management of space and time, poor organization of and communication with the university, low motivation, and uncertainty about the future. Freshers, international and off-campus students, as well as both extremely extroverted and extremely introverted individuals, represented the most vulnerable populations because of their extensive exposure to loneliness. Among coping strategies, taking time for oneself, family support, and mental health support were common in the sample.
Some differences emerged when comparing students from Italy and the UK. While at Unimib the impact of COVID-19 on mental health was described mainly in relation to academic worries and the inadequate organization of the university system, UoS students, familiar with the conviviality of campus life, explained these effects as a result of the drastic loss of social connectedness.
Conclusions
The current study highlights the key role of mental health support for university students, mainly during crisis times, and calls for measures to improve communication between students and the educational institution, as well as to encourage social connectedness.
Background: ALS is a progressive neurodegenerative disease without a cure and with limited treatment options. Edaravone, a free radical scavenger, was shown to slow disease progression over 6 months in a select group of patients with ALS; however, its effect on survival was not investigated in randomized trials. The objective of this study is to describe real-world survival effectiveness over a longer timeframe. Methods: This retrospective cohort study included patients with ALS across Canada with symptom onset within the previous three years. Those with a minimum 6-month edaravone exposure between 2017 and 2022 were enrolled in the interventional arm, and those without formed the control arm. The primary outcome of tracheostomy-free survival was compared between the two groups, accounting for age, sex, ALS disease progression rate, disease duration, pulmonary vital capacity, bulbar ALS onset, and presence of frontotemporal dementia or C9ORF72 mutation using inverse propensity treatment weights. Results: 182 patients with mean ± SD age 60±11 years were enrolled in the edaravone arm and 860 in the control arm (mean ± SD age 63±12 years). Mean ± SD time from onset to edaravone initiation was 18±10 months. Tracheostomy-free survival will be calculated. Conclusions: This study will provide evidence on the effectiveness of edaravone for tracheostomy-free survival in patients with ALS.
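The inverse propensity treatment weighting mentioned in the methods can be sketched minimally as follows. In practice the propensity scores would come from a model of treatment assignment on the listed covariates (age, sex, progression rate, and so on), which is not shown here; the weighting step itself is:

```python
def iptw_weights(treated, propensity):
    """Inverse probability of treatment weights: each patient is weighted
    by the inverse probability of the arm they actually ended up in,
    1/p if treated and 1/(1-p) if control, creating a pseudo-population
    in which measured covariates are balanced across arms."""
    return [1.0 / p if t == 1 else 1.0 / (1.0 - p)
            for t, p in zip(treated, propensity)]

def weighted_mean(values, weights):
    """Weighted average of an outcome in the weighted pseudo-population."""
    return sum(v * w for v, w in zip(values, weights)) / sum(weights)
```

For survival outcomes such as tracheostomy-free survival, these weights would feed into a weighted Kaplan-Meier or Cox analysis rather than a simple weighted mean; the mean is shown only to illustrate the mechanics.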
Setting:
Community-based, public, not-for-profit teaching hospital in the southeastern United States.
Participants:
Adult inpatients with a positive urine culture and the absence of urinary tract infection signs and symptoms.
Intervention:
Implementation of a microbiology comment nudge on urine cultures.
Results:
In total, 204 patients were included in the study. Antibiotics were less likely to be continued beyond 72 hours in the postimplementation group: 57 (55%) of 104 versus 38 (38%) of 100 (P = .016). They were less likely to have antibiotics continued beyond 48 hours: 60 (58%) of 104 versus 43 (43%) of 100 (P = .036). They were also less likely to have antibiotics prescribed at discharge: 35 (34%) of 104 versus 20 (20%) of 100 (P = .028). In addition, they had fewer total antibiotic days of therapy: 4 (IQR, 1–6) versus 1 (IQR, 0–6) (P = .022).
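The abstract does not state which statistical test produced the quoted P values; a pooled two-proportion z-test (equivalent to a 2×2 chi-square test) is one standard choice for comparisons of this form, and a sketch of it reproduces the reported magnitudes:

```python
import math

def two_proportion_z_test(x1, n1, x2, n2):
    """Two-sided z-test for equality of two proportions using the
    pooled standard error; returns the z statistic and the P value."""
    p1, p2 = x1 / n1, x2 / n2
    pooled = (x1 + x2) / (n1 + n2)
    se = math.sqrt(pooled * (1 - pooled) * (1 / n1 + 1 / n2))
    z = (p1 - p2) / se
    # Two-sided P value from the standard normal CDF.
    p_value = 2 * (1 - 0.5 * (1 + math.erf(abs(z) / math.sqrt(2))))
    return z, p_value
```

Applied to the 72-hour endpoint (57 of 104 versus 38 of 100), this gives P ≈ 0.016, consistent with the reported value.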
Conclusion:
Microbiology comment nudging may contribute to lower antibiotic utilization in patients with asymptomatic bacteriuria (ASB).
Intensive agricultural crop production is typically associated with low biodiversity. Low biodiversity is associated with a deficit of ecosystem services, which may limit crop yield (e.g., low pollination of insect-pollinated crops) at the individual field level or exacerbate the landscape-level impacts of intensive agriculture. To increase biodiversity and enhance ecosystem services with minimal loss of crop production area, farmers can plant desirable non-crop species near crop fields. Adoption of this practice is limited by inefficiencies in existing establishment methods. We have developed a novel seed-molding method allowing non-crop species to be planted with a conventional corn (Zea mays L.) planter, reducing labor and capital costs associated with native species establishment. Common milkweed (Asclepias syriaca L.) was selected as a model native species, because Asclepias plants are the sole food source for monarch butterfly (Danaus plexippus L.) larvae. Stratified A. syriaca seeds were added to a mixture of binder (maltodextrin) and filler (diatomaceous earth and wood flour) materials in a 3D-printed mold with the dimensions of a corn seed. The resulting Multi-Seed Zea Pellets (MSZP), shaped like corn seeds, were tested against non-pelleted A. syriaca seeds in several indoor and outdoor pot experiments. Molding into MSZP did not affect percent emergence or time to emergence from a 2-cm planting depth. Intraspecific competition among seedlings that emerged from an MSZP did not differ from competition among seedlings that emerged from a cluster of non-pelleted seeds. These findings demonstrate the potential of MSZP technology as a precise and efficient method for increasing agroecosystem biodiversity.
The attempt to provide a firm scientific basis for understanding consciousness is now in full swing, with special contributions from two areas. One is experimental: brain imaging is providing ever increasing detail of the brain structures used by humans (and other animals) as they solve a variety of tasks, including those of higher cognition. The other is theoretical: the discipline of neural networks is allowing models of these cognitive processes to be constructed and tested against the available data. In particular, a control framework can be created to give a global view of the brain. The highest cognitive process, that of consciousness, is naturally a target for such experimentation and modelling. This paper reviews available data and related models leading to the central representation, which involves particular brain regions and functional processing. Principles of consciousness, which have great relevance to the question in the title, are thereby deduced. The requisite neuronal systems needed to provide animal experience, and the problem of assessing the quality and quantity of such experience, will then be considered. In conclusion, animal consciousness is seen to exist broadly across those species with the requisite control structures; the level of pain and other sensations depends in an increasingly well-defined manner on the complexity of the cerebral apparatus.
We assessed patterns of enteric infections caused by 14 pathogens in a longitudinal cohort study of sequelae in British Columbia (BC), Canada, 2005–2014. Our population cohort of 5.8 million individuals was followed for an average of 7.5 years/person; during this time, 40 523 individuals experienced 42 308 incident laboratory-confirmed, provincially reported enteric infections (96.4 incident infections per 100 000 person-years). Most individuals (38 882/40 523; 96%) had only one infection, but 4% had multiple concurrent infections or more than one infection across the study period. Among individuals with more than one infection, the pathogens and combinations occurring most frequently per individual matched the pathogens occurring most frequently in the BC population. An additional 298 557 new fee-for-service physician visits and hospitalisations for enteric infections that did not coincide with a reported enteric infection also occurred, and some may represent unreported enteric infections. Our findings demonstrate that sequelae risk analyses should explore the possible impacts of multiple infections, and that estimating risk for individuals who may have had an unreported enteric infection is warranted.
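The quoted incidence rate can be approximately reproduced from the cohort figures in the abstract; the small discrepancy from the reported 96.4 arises because 7.5 years/person is a rounded average rather than the exact person-time denominator:

```python
def incidence_rate_per_100k(events, person_years):
    """Incident infections per 100,000 person-years of follow-up."""
    return events / person_years * 100_000

# Figures from the abstract: 42 308 incident infections in a cohort of
# 5.8 million people followed for an average of 7.5 years per person.
person_years = 5_800_000 * 7.5
rate = incidence_rate_per_100k(42_308, person_years)  # ~97 per 100,000
```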