Family interventions are critical in addressing many of the risks and issues of children and adolescents. However, a key factor in ensuring their effectiveness is understanding the context in which they are needed. This chapter describes the role of culture in shaping the acceptance of and access to family interventions. It focuses on how culture can influence the recognition of problems, access to information, openness to help-seeking, social support and acceptance of providers and interventions. It also discusses critical factors that enable community engagement with diverse ethnic and cultural groups, including information sharing, referral pathways, building social support, cultural adaptation, building trusting relationships and cross-cultural competence, harnessing resources and using multidisciplinary teams.
Experiencing poverty increases vulnerability for dysregulated hypothalamic–pituitary–adrenal (HPA) axis functioning and compromises long-term health. Positive parenting buffers children from HPA axis reactivity, yet this has primarily been documented among families not experiencing poverty. We tested the theorized power of positive parenting in 124 parent–child dyads recruited from Early Head Start (mean age = 25.21 months) by examining child cortisol trajectories using five samples collected across a standardized stress paradigm. Piecewise latent growth models revealed that positive parenting buffered children's stress responses when controlling for time of day, last stress task completed, and demographics. Positive parenting also interacted with income such that positive parenting was especially protective for cortisol reactivity in families experiencing greater poverty. Findings suggest that positive parenting behaviors are important for protecting children in families experiencing low income from heightened or prolonged physiologic stress reactivity to an acute stressor.
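For readers unfamiliar with piecewise growth modelling, the sketch below illustrates the general idea with a piecewise linear mixed-effects model in Python; the published analysis used piecewise latent growth models in SEM software, so this is only an approximation, and the simulated data, knot placement and variable names are illustrative assumptions, not the study's.

```python
# Minimal sketch (not the study's code): a piecewise linear mixed model
# approximating a piecewise latent growth model. Simulated data; the knot
# at the third sample (reactivity vs. recovery) and effect sizes are assumptions.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
rows = []
for child in range(124):                      # 124 parent-child dyads
    parenting = rng.normal()                  # positive-parenting composite (z)
    for s in range(5):                        # 5 cortisol samples per child
        react = min(s, 2)                     # piece 1: rise to the peak
        recov = max(s - 2, 0)                 # piece 2: recovery after the peak
        cortisol = (0.20 + 0.15 * react - 0.10 * recov
                    - 0.05 * parenting * react     # buffering interaction
                    + rng.normal(scale=0.05))
        rows.append((child, parenting, react, recov, cortisol))

df = pd.DataFrame(rows, columns=["child", "parenting", "react", "recov", "cortisol"])

# Random intercept per child; parenting x reactivity is the buffering test.
fit = smf.mixedlm("cortisol ~ react + recov + parenting + parenting:react",
                  df, groups=df["child"]).fit()
print(fit.summary())
```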
As the pathophysiology of Covid-19 emerges, this paper describes dysphagia as a sequela of the disease, including its diagnosis and management, hypothesised causes, symptomatology in relation to viral progression, and concurrent variables such as intubation, tracheostomy and delirium, at a tertiary UK hospital.
Results
During the first wave of the Covid-19 pandemic, 208 of the 736 patients (28.3 per cent) admitted to our institution with SARS-CoV-2 were referred for swallow assessment. Of the 208 patients, 102 were admitted to the intensive treatment unit for mechanical ventilation support, of whom 82 were tracheostomised. The majority of patients regained near-normal swallow function prior to discharge, regardless of intubation duration or tracheostomy status.
Conclusion
Dysphagia is prevalent in patients admitted either to the intensive treatment unit or the ward with Covid-19 related respiratory issues. This paper describes the crucial role of intensive swallow rehabilitation to manage dysphagia associated with this disease, including therapeutic respiratory weaning for those with a tracheostomy.
Background: Nosocomial central-line–associated bloodstream infections (CLABSIs) are an important cause of morbidity and mortality in hospitalized patients. CLABSI surveillance establishes rates for internal and external comparison, identifies risk factors, and allows assessment of interventions. Objectives: To determine the frequency of CLABSIs among adult patients admitted to intensive care units (ICUs) in CNISP hospitals and to evaluate trends over time. Methods: CNISP is a collaborative effort of the Canadian Hospital Epidemiology Committee, the Association of Medical Microbiology and Infectious Disease Canada, and the Public Health Agency of Canada. Since 1995, CNISP has conducted hospital-based sentinel surveillance of healthcare-associated infections. Overall, 55 CNISP hospitals participated in ≥1 year of CLABSI surveillance. Adult ICUs are categorized as mixed ICUs or cardiovascular (CV) surgery ICUs. Data were collected using standardized definitions and collection forms. Line-day denominators for each participating ICU were collected. Negative-binomial regression was used to test for linear trends, with robust standard errors to account for clustering by hospital. We used the Fisher exact test to compare binary variables. Results: Each year, 28–42 adult ICUs participated in surveillance (27–37 mixed, 6–8 CV surgery). In both mixed ICUs and CV-ICUs, rates remained relatively stable between 2011 and 2018 (Fig. 1). In mixed ICUs, CLABSI rates were 1.0 per 1,000 line days in both 2011 and 2018 (test for linear trend, P = .66). In CV-ICUs, CLABSI rates were 1.1 per 1,000 line days in 2011 and 0.8 per 1,000 line days in 2018 (P = .19). Case age and gender distributions were consistent across the surveillance period. The 30-day all-cause mortality rate was 29% in both 2011 and 2018 (annual range, 29%–35%). Between 2011 and 2018, the percentage of isolated microorganisms that were coagulase-negative staphylococci (CONS) decreased from 31% to 18% (P = .004). The percentage of other gram-positive organisms increased from 32% to 37% (P = .34); Bacillus increased from 0% to 4% of isolates, and methicillin-susceptible Staphylococcus aureus from 2% to 6%. Gram-negative organisms increased from 21% to 27% (P = .19). Yeast represented 16% of isolates in 2011 and 18% in 2018; however, the percentage of yeasts that were Candida albicans decreased over time (58% of yeasts in 2011 and 30% in 2018; P = .04). In each year between 2011 and 2018, the most commonly identified microorganisms were CONS (18% in 2018) and Enterococcus spp. (18% in 2018). Conclusions: Ongoing CLABSI surveillance has shown stable rates of CLABSI in adult ICUs from 2011 to 2018. The causative microorganisms have changed, with CONS decreasing from 31% to 18%.
Funding: CNISP is funded by the Public Health Agency of Canada.
Disclosures: Allison McGeer reports funds to her for studies, for which she is the principal investigator, from Pfizer and Merck, as well as consulting fees from Sanofi-Pasteur, Sunovion, GSK, Pfizer, and Cidara.
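As a methodological aside, the trend test described in the Methods above (negative-binomial regression of CLABSI counts with line-day exposure and hospital-clustered robust standard errors) can be sketched as follows; the data layout and column names are assumptions, and the simulated rates are illustrative only.

```python
# Minimal sketch (assumed data layout): negative-binomial regression of
# annual CLABSI counts on year, with line-days as exposure and standard
# errors clustered by hospital. Simulated counts target ~1 per 1,000 line days.
import numpy as np
import pandas as pd
import statsmodels.api as sm

rng = np.random.default_rng(1)
df = pd.DataFrame({
    "hospital": np.repeat(np.arange(30), 8),       # 30 hospitals
    "year": np.tile(np.arange(2011, 2019), 30),    # 2011-2018
})
df["line_days"] = rng.integers(2000, 9000, len(df))
df["clabsi"] = rng.poisson(df["line_days"] / 1000.0)

X = sm.add_constant(df["year"] - 2011)             # linear trend term
fit = sm.GLM(df["clabsi"], X,
             family=sm.families.NegativeBinomial(),  # dispersion fixed at 1
             exposure=df["line_days"]).fit(
                 cov_type="cluster",
                 cov_kwds={"groups": df["hospital"]})
print(fit.summary())  # the P-value on 'year' is the linear-trend test
```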
This study aimed to identify diets with improved nutrient quality and environmental impact within the boundaries of dietary practices.
Design:
We used Data Envelopment Analysis to benchmark diets for improved adherence to food-based dietary guidelines (FBDG). We then optimised these diets for dietary preferences, nutrient quality and environmental impact. Diets were evaluated using the Nutrient Rich Diet score (NRD15.3), diet-related greenhouse gas emission (GHGE) and a diet similarity index that quantified the proportion of food intake that remained similar as compared with the observed diet.
Setting:
National dietary surveys of four European countries (Denmark, Czech Republic, Italy and France).
Subjects:
Approximately 6500 adults, aged 18–64 years.
Results:
When dietary preferences were prioritised, NRD15.3 was ~6 % higher, GHGE was ~4 % lower and ~85 % of food intake remained similar. This diet had higher amounts of fruit, vegetables and whole grains than the observed diet. When nutrient quality was prioritised, NRD15.3 was ~16 % higher, GHGE was ~3 % lower and ~72 % of food intake remained similar. This diet had higher amounts of legumes and fish and lower amounts of sweetened and alcoholic beverages. Finally, when environmental impact was prioritised, NRD15.3 was ~9 % higher, GHGE was ~21 % lower and ~73 % of food intake remained similar. In this diet, red and processed meat partly shifted to eggs, poultry, fish or dairy.
Conclusions:
Benchmark modelling can generate diets with improved adherence to FBDG within the boundaries of dietary practices, but fully maximising health and minimising GHGE cannot be achieved simultaneously.
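To illustrate the benchmarking step described in the Design, the sketch below implements a minimal input-oriented, constant-returns Data Envelopment Analysis efficiency score via linear programming. Treating GHGE as the sole input and a nutrient-quality score as the sole output is a simplifying assumption for illustration; the study's DEA used dietary-guideline adherence and richer input/output sets.

```python
# Minimal DEA sketch (illustrative inputs/outputs, not the study's model):
# CCR input-oriented efficiency of each diet relative to the benchmark frontier.
import numpy as np
from scipy.optimize import linprog

def dea_efficiency(inputs, outputs, o):
    """CCR input-oriented efficiency of diet (DMU) `o`.
    inputs: (m, n) array; outputs: (s, n) array; n diets in total."""
    m, n = inputs.shape
    s = outputs.shape[0]
    c = np.r_[1.0, np.zeros(n)]                  # minimise theta
    # inputs:  sum_j lam_j * x_ij - theta * x_io <= 0
    A_in = np.hstack([-inputs[:, [o]], inputs])
    b_in = np.zeros(m)
    # outputs: -sum_j lam_j * y_rj <= -y_ro
    A_out = np.hstack([np.zeros((s, 1)), -outputs])
    b_out = -outputs[:, o]
    res = linprog(c, A_ub=np.vstack([A_in, A_out]),
                  b_ub=np.r_[b_in, b_out],
                  bounds=[(0, None)] * (n + 1))
    return res.fun                                # 1.0 = on the frontier

rng = np.random.default_rng(2)
ghge = rng.uniform(2, 8, size=(1, 50))           # input: kg CO2-eq/day (assumed)
nrd = rng.uniform(300, 700, size=(1, 50))        # output: NRD-style score (assumed)
scores = [dea_efficiency(ghge, nrd, j) for j in range(50)]
print(round(min(scores), 3), round(max(scores), 3))
```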
Middle grades education has been the object of efforts to remediate US education to address an array of social problems. Districts have sought out K-8 models to create smaller learning communities, require fewer school transitions, and allow sustained student connections. This paper offers a historical analysis of K-8 schools, drawing on statistical and spatial methods and a DisCrit intersectional lens to illustrate how creating K-8 schools produced enclaves of privilege in one urban school district. K-8 schools in our target district became whiter and wealthier than district averages. Students with disabilities attending K-8 schools tended to be placed in more inclusive classrooms, where they were more likely to be integrated alongside nondisabled peers than counterparts attending traditional middle schools. We consider how the configuration of K-8 schools, which could be considered an administrative decision to better serve students, has obscured the interworkings of power and privilege.
As the unemployment rate continues to shrink, organizations are increasingly in competition for the best talent. For this reason, the ability to effectively source, recruit, and hire qualified employees has become a cornerstone of effective human resource management, and a critical function in creating value through human capital. However, in order to capitalize on effective recruitment and hiring, organizations need to be able to retain and motivate new employees throughout their first year, during which studies estimate the risk of newcomer turnover ranges from 10 percent to as much as 50 percent and above for some jobs (Maurer, 2017). Thus the organizational entry period comes on the heels of substantial investment on the part of employers with the potential for both significant payoff and significant risk (Kammeyer-Mueller & Wanberg, 2003; Wanberg, 2012).
To utilise a community-based participatory approach in the design and implementation of an intervention targeting diet-related health problems on Navajo Nation.
Design:
A dual strategy approach of community needs/assets assessment and engagement of cross-sectorial partners in programme design with systematic cyclical feedback for programme modifications.
Setting:
Navajo Nation, USA.
Participants:
Navajo families with individuals meeting criteria for programme enrolment. Participant enrolment increased with iterative cycles.
Results:
The Navajo Fruit and Vegetable Prescription (FVRx) Programme.
Conclusions:
A broad, community-driven and culturally relevant programme design has resulted in a programme able to maintain core programmatic principles, while also allowing for flexible adaptation to changing needs.
Tourniquets (TQs) save lives. Although military-approved TQs appear more effective than improvised TQs in controlling exsanguinating extremity hemorrhage, their bulk may preclude every day carry (EDC) by civilian lay-providers, limiting availability during emergencies.
Study Objective:
The purpose of the current study was to compare the efficacy of three novel commercial TQ designs to a military-approved TQ.
Methods:
Nine Emergency Medicine residents evaluated four different TQ designs: Gen 7 Combat Application Tourniquet (CAT7; control), Stretch Wrap and Tuck Tourniquet (SWAT-T), Gen 2 Rapid Application Tourniquet System (RATS), and Tourni-Key (TK). Popliteal artery flow cessation was determined using a ZONARE ZS3 ultrasound. Steady state maximal generated force was measured for 30 seconds with a thin-film force sensor.
Results:
Success rates for distal arterial flow cessation were 89% CAT7; 67% SWAT-T; 89% RATS; and 78% TK (H = 0.89; P = .83). Mean application times were 10.4 (SD = 1.7) seconds CAT7; 23.1 (SD = 9.0) seconds SWAT-T; 11.1 (SD = 3.8) seconds RATS; and 20.0 (SD = 7.1) seconds TK (F = 9.71; P < .001). Steady state maximal forces were 29.9 (SD = 1.2) N CAT7; 23.4 (SD = 0.8) N SWAT-T; 33.0 (SD = 1.3) N RATS; and 41.9 (SD = 1.3) N TK.
Conclusion:
All novel TQ systems were non-inferior to the military-approved CAT7. Mean application times were less than 30 seconds for all four designs. The size of these novel TQs may make them more conducive to lay-provider EDC, thereby increasing community resiliency and improving the response to high-threat events.
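For context, the test statistics reported above (H for a Kruskal–Wallis test, F for a one-way ANOVA) can be reproduced on data of this shape as follows; the per-resident values below are made up to approximate the reported means and are not the study's data.

```python
# Minimal sketch: comparing application times across four tourniquet designs
# with Kruskal-Wallis and one-way ANOVA. All sample values are illustrative.
from scipy import stats

cat7 = [10.4, 9.1, 11.8, 10.0, 12.1, 8.9, 10.7, 11.2, 9.5]   # seconds
swat = [23.1, 30.2, 15.4, 25.0, 19.8, 33.1, 21.5, 27.4, 12.9]
rats = [11.1, 14.2, 8.0, 12.7, 9.3, 15.8, 10.2, 13.5, 7.9]
tk   = [20.0, 25.3, 13.8, 22.1, 17.4, 28.9, 19.5, 24.0, 12.6]

h, p_h = stats.kruskal(cat7, swat, rats, tk)    # reported as H, P
f, p_f = stats.f_oneway(cat7, swat, rats, tk)   # reported as F, P
print(f"H = {h:.2f}, P = {p_h:.3f}; F = {f:.2f}, P = {p_f:.3g}")
```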
The UK has longstanding problems with psychiatry recruitment. Various initiatives aim to improve psychiatry's image among medical students, but few involve research and none are student-led. Providing opportunities to take part in psychiatry research and quality improvement could increase the number of students who choose to enter the speciality.
Objectives
We have developed the student psychiatry audit and research collaborative (SPARC), a student-led initiative for nationwide collaboration in high-quality research and audits.
Methods
Our model is inspired by the success of the UK Student Audit and Research in Surgery collaborative (STARSurg). Area teams, located in medical schools, take part in multi-centre projects. The area teams consist of medical students, who have the main responsibility for collecting data; a junior doctor, to supervise the process; and a consultant, with overall responsibility for patient care. The data are collected centrally and analysed by a team of medical students and doctors. Student leads from each site are named authors on resulting papers. All other students are acknowledged and are able to present the work.
Results
We have completed our first audits in Cardiff and London; other sites will return data in 2017. Student feedback indicated a high level of satisfaction with the project and interest in psychiatry as a future career.
Conclusions
This initiative aims to tackle the recruitment problems in psychiatry by giving students a chance to take part in high quality research and audits.
Disclosure of interest
The authors have not supplied their declaration of competing interest.
Venous thromboembolism (VTE) is a potentially fatal condition. Hospital-associated VTE leads to more than 25,000 deaths per year in the UK. Therefore, identification of at-risk patients is crucial. Psychiatric in-patients have unique factors which may affect their risk of VTE (antipsychotic prescription, restraint); however, there are currently no UK guidelines which specifically address VTE risk in this population.
Objectives
We assessed VTE risk among psychiatric in-patients in Cardiff and Vale University Health Board, Wales, UK, and whether the proformas currently provided for VTE risk assessment were being completed.
Methods
All acute adult in-patient and old age psychiatric wards were assessed by a team of medical students and a junior doctor over three days. We used the UK Department of Health VTE risk assessment tool, adapted to include factors specific to psychiatric patients. We also assessed whether there were concerns about prescribing VTE prophylaxis (compression stockings or anticoagulants) because of a history of self-harm or ligature use.
Results
Of the 145 patients included, 0% had a completed VTE risk assessment form. We found 38.6% to be at an increased risk of VTE and there were concerns about prescribing VTE prophylaxis in 31% of patients.
Conclusions
Our findings suggest that VTE risk assessment is not being carried out on psychiatric wards. Staff education is needed to improve awareness of VTE. Specific guidance for this population is needed due to the presence of unique risk factors in psychiatric in-patients and concerns regarding VTE prophylaxis.
Disclosure of interest
The authors have not supplied their declaration of competing interest.
Mechanistic models (MMs) have served as causal pathway analysis and ‘decision-support’ tools within animal production systems for decades. Such models quantitatively define how a biological system works based on causal relationships and use that cumulative biological knowledge to generate predictions and recommendations (in practice) and generate/evaluate hypotheses (in research). Their limitations revolve around obtaining sufficiently accurate inputs, user training and accuracy/precision of predictions on-farm. The new wave in digitalization technologies may negate some of these challenges. New data-driven (DD) modelling methods such as machine learning (ML) and deep learning (DL) examine patterns in data to produce accurate predictions (forecasting, classification of animals, etc.). The deluge of sensor data and new self-learning modelling techniques may address some of the limitations of traditional MM approaches – access to input data (e.g. sensors) and on-farm calibration. However, most of these new methods lack transparency in the reasoning behind predictions, in contrast to MM that have historically been used to translate knowledge into wisdom. The objective of this paper is to propose means to hybridize these two seemingly divergent methodologies to advance the models we use in animal production systems and support movement towards truly knowledge-based precision agriculture. In order to identify potential niches for models in animal production of the future, a cross-species (dairy, swine and poultry) examination of the current state of the art in MM and new DD methodologies (ML, DL analytics) is undertaken. We hypothesize that there are several ways via which synergy may be achieved to advance both our predictive capabilities and system understanding, namely: (1) building and utilizing data streams (e.g. intake, rumination behaviour, rumen sensors, activity sensors, environmental sensors, cameras and near IR) to apply MM in real-time and/or with new resolution and capabilities; (2) hybridization of MM and DD approaches where, for example, a ML framework is augmented by MM-generated parameters or predicted outcomes and (3) hybridization of the MM and DD approaches, where biological bounds are placed on parameters within a MM framework, and the DD system parameterizes the MM for individual animals, farms or other such clusters of data. As animal systems modellers, we should expand our toolbox to explore new DD approaches and big data to find opportunities to increase understanding of biological systems, find new patterns in data and move the field towards intelligent, knowledge-based precision agriculture systems.
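A minimal sketch of hybridization route (2) above: a mechanistic model's prediction is supplied to a data-driven learner as an additional feature. Here a Gompertz growth curve stands in for "the MM", and all parameters, variable names and data are illustrative assumptions, not a published model.

```python
# Minimal hybrid MM+DD sketch: feed a mechanistic (Gompertz) prediction to a
# gradient-boosted regressor as an extra feature. Entirely simulated example.
import numpy as np
from sklearn.ensemble import GradientBoostingRegressor
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(3)
n = 500
age = rng.uniform(10, 300, n)                    # days (assumed range)
intake = rng.uniform(0.5, 3.0, n)                # kg DM/day (assumed range)

def gompertz_weight(age_days):
    """Mechanistic stand-in: Gompertz growth curve with assumed parameters."""
    return 120 * np.exp(-np.exp(-(age_days - 100) / 40))

# "True" weight deviates from the MM with an intake-driven effect plus noise.
weight = gompertz_weight(age) * (0.9 + 0.08 * intake) + rng.normal(0, 2, n)

mm_pred = gompertz_weight(age)                   # MM output used as a feature
X = np.column_stack([age, intake, mm_pred])
X_tr, X_te, y_tr, y_te = train_test_split(X, weight, random_state=0)

model = GradientBoostingRegressor().fit(X_tr, y_tr)
print("hybrid model R^2:", round(model.score(X_te, y_te), 3))
```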
Drought and high temperature each damage rice (Oryza sativa L.) crops. Their effects during seed development and maturation on subsequent seed quality development were investigated in Japonica (cv. Gleva) and Indica (cv. Aeron 1) plants grown in controlled environments and subjected to drought (irrigation ended) and/or brief high temperature (HT; 3 days at 40/30°C). Ending irrigation early in cv. Gleva (7 or 14 days after anthesis, DAA) resulted in earlier plant senescence, a more rapid decline in seed moisture content and more rapid seed quality development initially, but a substantial later decline in planta in the ability of seeds to germinate normally. Subsequent seed storage longevity amongst later harvests was greatest with no drought: with drought, longevity declined in planta from 16 or 22 DAA onwards, that is, 9 or 8 days after irrigation ended, respectively. Later drought (14 or 28 DAA) also reduced seed longevity at harvest maturity (42 DAA). Well-irrigated plants provided poorer longevity the earlier during seed development they were exposed to HT (damage was greatest at anthesis and histodifferentiation, with no effect during seed maturation). Combining drought and HT damaged seed quality more than each stress alone, and more so in the Japonica cv. Gleva than the Indica cv. Aeron 1. Overall, the earlier plant drought occurred, the greater the damage to subsequent seed quality; seed quality was most vulnerable to damage from plant drought and HT at anthesis and histodifferentiation; and the seed quality of the Indica rice was more resilient to these stresses than that of the Japonica.
Frascati international research criteria for HIV-associated neurocognitive disorders (HAND) are controversial; some investigators have argued that the Frascati criteria are too liberal, resulting in a high false-positive rate. Meyer et al. recommended more conservative revisions to the HAND criteria, including exploring other methodologies commonly used to define neurocognitive impairment (NCI) in HIV, such as the global deficit score (GDS). This study compares NCI classifications by the Frascati, Meyer, and GDS methods in relation to neuroimaging markers of brain integrity in HIV.
Method:
Two hundred forty-one people living with HIV (PLWH) without current substance use disorder or severe (confounding) comorbid conditions underwent comprehensive neurocognitive testing and brain structural magnetic resonance imaging and magnetic resonance spectroscopy. Participants were classified using Frascati versus Meyer criteria as concordant unimpaired [Frascati(Un)/Meyer(Un)], concordant impaired [Frascati(Imp)/Meyer(Imp)], or discordant [Frascati(Imp)/Meyer(Un)], i.e., impaired by Frascati criteria but unimpaired by Meyer criteria. To compare the GDS with the Meyer criteria, the same groupings were constructed using GDS criteria in place of Frascati criteria.
Results:
When examining Frascati versus Meyer criteria, discordant Frascati(Imp)/Meyer(Un) individuals had less cortical gray matter, greater sulcal cerebrospinal fluid volume, and greater evidence of neuroinflammation (i.e., choline) than concordant Frascati(Un)/Meyer(Un) individuals. GDS versus Meyer comparisons indicated that discordant GDS(Imp)/Meyer(Un) individuals had less cortical gray matter and lower levels of energy metabolism (i.e., creatine) than concordant GDS(Un)/Meyer(Un) individuals. In both sets of analyses, the discordant group did not differ from the concordant impaired group on any neuroimaging measure.
Conclusions:
The Meyer criteria failed to capture a substantial portion of PLWH with brain abnormalities. These findings support continued use of Frascati or GDS criteria to detect HIV-associated CNS dysfunction.
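For readers unfamiliar with the GDS method discussed above, the sketch below applies the commonly cited T-score-to-deficit-score mapping and the usual GDS ≥ 0.5 impairment cutoff; this is the conventional scheme from the GDS literature, not necessarily the exact implementation used in this study.

```python
# Minimal GDS sketch using the commonly cited deficit-score convention
# (assumed here; verify against the study's exact scoring rules).
import numpy as np

def t_to_deficit(t):
    """Map a demographically corrected T-score to a 0-5 deficit score."""
    if t >= 40: return 0          # within normal limits
    if t >= 35: return 1          # mild impairment
    if t >= 30: return 2
    if t >= 25: return 3
    if t >= 20: return 4
    return 5                      # severe impairment

def gds(t_scores):
    """GDS = mean deficit score across the battery; >= 0.5 is the usual cutoff."""
    return np.mean([t_to_deficit(t) for t in t_scores])

battery = [42, 38, 51, 33, 45, 29, 47]   # illustrative T-scores
score = gds(battery)
print(f"GDS = {score:.2f} -> {'impaired' if score >= 0.5 else 'unimpaired'}")
```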
England has recently started a new paediatric influenza vaccine programme using a live-attenuated influenza vaccine (LAIV). There is uncertainty over how well the vaccine protects against more severe end-points. A test-negative case–control study was used to estimate vaccine effectiveness (VE) against laboratory-confirmed influenza hospitalisation in vaccine-eligible children aged 2–16 years in England in the 2015–2016 season, using a national sentinel laboratory surveillance system. Logistic regression was used to estimate VE with adjustment for sex, risk group, age group, region, ethnicity, deprivation and month of sample collection. A total of 977 individuals were included in the study (348 cases and 629 controls). The overall adjusted VE for all study ages and vaccine types was 33.4% (95% confidence interval (CI) 2.3–54.6) after adjusting for age group, sex, index of multiple deprivation, ethnicity, region, sample month and risk group. Risk group was shown to be an important confounder. The adjusted VE for all influenza types was 41.9% (95% CI 7.3–63.6) for the live-attenuated vaccine and 28.8% (95% CI −31.1 to 61.3) for the inactivated vaccine. The study provides evidence of the effectiveness of influenza vaccination in preventing hospitalisation due to laboratory-confirmed influenza in children in 2015–2016 and continues to support the rollout of the LAIV childhood programme.
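In a test-negative design such as this, adjusted VE is typically computed as 100 × (1 − adjusted odds ratio) from a logistic regression of case status on vaccination and covariates. The sketch below shows that calculation on simulated data; the covariates are trimmed to two illustrative ones (the study adjusted for seven), and all values and names are assumptions.

```python
# Minimal test-negative VE sketch on simulated data (not the study's dataset):
# adjusted VE = 100 * (1 - adjusted odds ratio of vaccination in cases vs. controls).
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(4)
n = 977
df = pd.DataFrame({
    "vaccinated": rng.integers(0, 2, n),
    "age_group": rng.integers(0, 3, n),      # e.g. 2-4, 5-10, 11-16 y (assumed)
    "risk_group": rng.integers(0, 2, n),
})
# Simulate case status with a protective vaccine effect (true OR ~ 0.65).
lin_pred = -0.6 + np.log(0.65) * df["vaccinated"] + 0.5 * df["risk_group"]
df["case"] = (rng.random(n) < 1 / (1 + np.exp(-lin_pred))).astype(int)

fit = smf.logit("case ~ vaccinated + C(age_group) + risk_group", df).fit(disp=0)
or_vax = np.exp(fit.params["vaccinated"])
lo, hi = np.exp(fit.conf_int().loc["vaccinated"])
print(f"adjusted VE = {100 * (1 - or_vax):.1f}% "
      f"(95% CI {100 * (1 - hi):.1f} to {100 * (1 - lo):.1f})")
```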
Decreases in cognitive function related to increases in oxidative stress and inflammation occur with ageing. Acknowledging the free radical-quenching activity and anti-inflammatory action of the carotenoid lycopene, the aim of the present review was to assess if there is evidence for a protective relationship between lycopene and maintained cognitive function or between lycopene and development or progression of dementia. A systematic literature search identified five cross-sectional and five longitudinal studies examining these outcomes in relation to circulating or dietary lycopene. Among four studies evaluating relationships between lycopene and maintained cognition, three reported significant positive relationships. Neither of the two studies reporting on relationship between lycopene and development of dementia reported significant results. Of four studies investigating circulating lycopene and pre-existing dementia, only one reported significant associations between lower circulating lycopene and higher rates of Alzheimer's disease mortality. Acknowledging heterogeneity among studies, there is insufficient evidence and a paucity of data to draw firm conclusions or tease apart direct effects of lycopene. Nevertheless, as low circulating lycopene is a predictor of all-cause mortality, further investigation into its relationship with cognitive longevity and dementia-related mortality is warranted.
Introduction: Little is known about the variety of roles volunteers play in the emergency department (ED) and the potential impact they have on patient experience. The objective of this scoping review was to identify published and unpublished reports that described volunteer programs in EDs and to determine how these programs affected patient experiences or outcomes. Methods: Electronic searches of Medline, EMBASE, the Cochrane Central Register of Controlled Trials, the Cochrane Database of Systematic Reviews and CINAHL were conducted, and reference lists were hand-searched. A grey literature search was also conducted (Web of Science, ProQuest, Canadian Business and Current Affairs Database, ProQuest Dissertations and Theses Global). Two reviewers independently screened titles and abstracts, reviewed full-text articles, and extracted data. Results: The search strategy yielded 4,589 potentially relevant citations. After eliminating duplicate citations and articles that did not meet eligibility criteria, 87 reports were included in the review. Of the included reports, 18 were peer-reviewed articles, 6 were conference proceedings, 59 were magazine or newspaper articles, and 4 were graduate dissertations or theses. Volunteer activities were categorized as non-clinical tasks (e.g., provision of meals/snacks, comfort items and mobility assistance), navigation, emotional support/communication, and administrative duties. Fifty-two programs (59.8%) had general volunteers in the ED and 35 (40.2%) had volunteers targeting a specific patient population, including pediatrics, geriatrics, patients with mental health and addiction issues, and other vulnerable populations. Twenty programs (23.0%) included an evaluative component describing how ED volunteers affected patient experiences and outcomes. Patient satisfaction, follow-up and referral rates, ED and hospital costs and length of stay, subsequent ED visits, medical complications, and malnutrition in hospital were all reported to be positively affected by volunteers in the ED. Conclusion: This scoping review demonstrates the important role volunteers play in enhancing patient and caregiver experience in the ED. Future volunteer engagement programs implemented in the ED should be formally described and evaluated so that their successes and experiences can be shared with others interested in implementing similar programs.