As the unemployment rate continues to shrink, organizations are increasingly in competition for the best talent. For this reason, the ability to effectively source, recruit, and hire qualified employees has become a cornerstone of effective human resource management, and a critical function in creating value through human capital. However, in order to capitalize on effective recruitment and hiring, organizations need to be able to retain and motivate new employees throughout their first year, during which studies estimate the risk of newcomer turnover ranges from 10 percent to as much as 50 percent and above for some jobs (Maurer, 2017). Thus the organizational entry period comes on the heels of substantial investment on the part of employers with the potential for both significant payoff and significant risk (Kammeyer-Mueller & Wanberg, 2003; Wanberg, 2012).
Objective: To utilise a community-based participatory approach in the design and implementation of an intervention targeting diet-related health problems on Navajo Nation.
Design: A dual strategy approach of community needs/assets assessment and engagement of cross-sectorial partners in programme design with systematic cyclical feedback for programme modifications.
Setting: Navajo Nation, USA.
Participants: Navajo families with individuals meeting criteria for programme enrolment. Participant enrolment increased with iterative cycles.
Intervention: The Navajo Fruit and Vegetable Prescription (FVRx) Programme.
Conclusions: A broad, community-driven and culturally relevant programme design has resulted in a programme able to maintain core programmatic principles, while also allowing for flexible adaptation to changing needs.
Tourniquets (TQs) save lives. Although military-approved TQs appear more effective than improvised TQs in controlling exsanguinating extremity hemorrhage, their bulk may preclude everyday carry (EDC) by civilian lay-providers, limiting availability during emergencies.
The purpose of the current study was to compare the efficacy of three novel commercial TQ designs to a military-approved TQ.
Nine Emergency Medicine residents evaluated four different TQ designs: Gen 7 Combat Application Tourniquet (CAT7; control), Stretch Wrap and Tuck Tourniquet (SWAT-T), Gen 2 Rapid Application Tourniquet System (RATS), and Tourni-Key (TK). Popliteal artery flow cessation was determined using a ZONARE ZS3 ultrasound. Steady state maximal generated force was measured for 30 seconds with a thin-film force sensor.
Success rates for distal arterial flow cessation were 89% CAT7; 67% SWAT-T; 89% RATS; and 78% TK (H 0.89; P = .83). Mean (SD) application times were 10.4 (SD = 1.7) seconds CAT7; 23.1 (SD = 9.0) seconds SWAT-T; 11.1 (SD = 3.8) seconds RATS; and 20.0 (SD = 7.1) seconds TK (F 9.71; P <.001). Steady state maximal forces were 29.9 (SD = 1.2) N CAT7; 23.4 (SD = 0.8) N SWAT-T; 33.0 (SD = 1.3) N RATS; and 41.9 (SD = 1.3) N TK.
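The group comparisons reported above (a Kruskal-Wallis H test and a one-way ANOVA across the four designs) can be reproduced with standard statistical tools. A minimal sketch using SciPy follows; the per-resident application times below are hypothetical values invented for illustration, not the study's raw data:

```python
from scipy import stats

# Hypothetical application times (seconds) for nine residents per design;
# illustrative only, not the study's measurements.
cat7 = [9.1, 10.3, 11.8, 10.0, 12.2, 8.9, 10.7, 11.0, 9.6]
swat = [15.2, 30.1, 24.8, 18.9, 35.0, 21.4, 26.3, 19.7, 16.5]
rats = [8.4, 14.9, 10.2, 12.7, 9.8, 15.6, 11.3, 10.9, 6.1]
tk   = [14.3, 28.0, 19.5, 22.8, 17.1, 26.9, 13.8, 21.6, 16.4]

# One-way ANOVA across the four designs (parametric, as reported for
# the application times).
f_stat, p_anova = stats.f_oneway(cat7, swat, rats, tk)

# Kruskal-Wallis H test (rank-based; the study applied it to the binary
# success rates, here shown on the same hypothetical times).
h_stat, p_kw = stats.kruskal(cat7, swat, rats, tk)
```

With clearly separated group means, both tests return small p-values, mirroring the significant ANOVA result reported for application times.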
All novel TQ systems were non-inferior to the military-approved CAT7. Mean application times were less than 30 seconds for all four designs. The size of these novel TQs may make them more conducive to lay-provider EDC, thereby increasing community resiliency and improving the response to high-threat events.
The UK has longstanding problems with psychiatry recruitment. Various initiatives aim to improve psychiatry's image among medical students, but few involve research and none are student-led. Providing opportunities to take part in psychiatry research and quality improvement could increase the number of students who choose to enter the speciality.
We have developed the student psychiatry audit and research collaborative (SPARC), a student-led initiative for nationwide collaboration in high-quality research and audits.
Our model is inspired by the success of the UK Student Audit and Research in Surgery collaborative (STARSurg). Area teams, located in medical schools, take part in multi-centre projects. The area teams consist of medical students, who have the main responsibility for collecting data; a junior doctor, to supervise the process; and a consultant, with overall responsibility for patient care. The data are collected centrally and analysed by a team of medical students and doctors. Student leads from each site are named authors on resulting papers. All other students are acknowledged and are able to present the work.
We have completed our first audits in Cardiff and London; other sites will return data in 2017. Student feedback indicated a high level of satisfaction with the project and interest in psychiatry as a future career.
This initiative aims to tackle the recruitment problems in psychiatry by giving students a chance to take part in high-quality research and audits.
Disclosure of interest: The authors have not supplied their declaration of competing interest.
Venous thromboembolism (VTE) is a potentially fatal condition. Hospital-associated VTE leads to more than 25,000 deaths per year in the UK. Therefore, identification of at-risk patients is crucial. Psychiatric in-patients have unique factors which may affect their risk of VTE (antipsychotic prescription, restraint); however, there are currently no UK guidelines which specifically address VTE risk in this population.
We assessed VTE risk among psychiatric in-patients in Cardiff and Vale University Health Board, Wales, UK, and whether proformas currently provided for VTE risk assessment were being completed.
All acute adult in-patient and old age psychiatric wards were assessed by a team of medical students and a junior doctor over three days. We used the UK Department of Health VTE risk assessment tool, which was adapted to include factors specific to psychiatric patients. We also assessed whether there were concerns about prescribing VTE prophylaxis (compression stockings or anticoagulants) because of a history of self-harm or ligature use.
Of the 145 patients included, 0% had a completed VTE risk assessment form. We found 38.6% to be at an increased risk of VTE and there were concerns about prescribing VTE prophylaxis in 31% of patients.
Our findings suggest that VTE risk assessment is not being carried out on psychiatric wards. Staff education is needed to improve awareness of VTE. Specific guidance for this population is needed due to the presence of unique risk factors in psychiatric in-patients and concerns regarding VTE prophylaxis.
Disclosure of interest: The authors have not supplied their declaration of competing interest.
Mechanistic models (MMs) have served as causal pathway analysis and ‘decision-support’ tools within animal production systems for decades. Such models quantitatively define how a biological system works based on causal relationships and use that cumulative biological knowledge to generate predictions and recommendations (in practice) and generate/evaluate hypotheses (in research). Their limitations revolve around obtaining sufficiently accurate inputs, user training and accuracy/precision of predictions on-farm. The new wave in digitalization technologies may negate some of these challenges. New data-driven (DD) modelling methods such as machine learning (ML) and deep learning (DL) examine patterns in data to produce accurate predictions (forecasting, classification of animals, etc.). The deluge of sensor data and new self-learning modelling techniques may address some of the limitations of traditional MM approaches – access to input data (e.g. sensors) and on-farm calibration. However, most of these new methods lack transparency in the reasoning behind predictions, in contrast to MM that have historically been used to translate knowledge into wisdom. The objective of this paper is to propose means to hybridize these two seemingly divergent methodologies to advance the models we use in animal production systems and support movement towards truly knowledge-based precision agriculture. In order to identify potential niches for models in animal production of the future, a cross-species (dairy, swine and poultry) examination of the current state of the art in MM and new DD methodologies (ML, DL analytics) is undertaken. We hypothesize that there are several ways via which synergy may be achieved to advance both our predictive capabilities and system understanding, being: (1) building and utilizing data streams (e.g. 
intake, rumination behaviour, rumen sensors, activity sensors, environmental sensors, cameras and near IR) to apply MM in real-time and/or with new resolution and capabilities; (2) hybridization of MM and DD approaches where, for example, a ML framework is augmented by MM-generated parameters or predicted outcomes; and (3) hybridization of the MM and DD approaches, where biological bounds are placed on parameters within a MM framework, and the DD system parameterizes the MM for individual animals, farms or other such clusters of data. As animal systems modellers, we should expand our toolbox to explore new DD approaches and big data to find opportunities to increase understanding of biological systems, find new patterns in data and move the field towards intelligent, knowledge-based precision agriculture systems.
Drought and high temperature each damage rice (Oryza sativa L.) crops. Their effect during seed development and maturation on subsequent seed quality development was investigated in Japonica (cv. Gleva) and Indica (cv. Aeron 1) plants grown in controlled environments subjected to drought (irrigation ended) and/or brief high temperature (HT; 3 days at 40/30°C). Ending irrigation early in cv. Gleva (7 or 14 days after anthesis, DAA) resulted in earlier plant senescence, more rapid decline in seed moisture content, more rapid seed quality development initially, but substantial decline later in planta in the ability of seeds to germinate normally. Subsequent seed storage longevity amongst later harvests was greatest with no drought because with drought it declined from 16 or 22 DAA onwards in planta, 9 or 8 days after irrigation ended, respectively. Later drought (14 or 28 DAA) also reduced seed longevity at harvest maturity (42 DAA). Well-irrigated plants provided poorer longevity the earlier during seed development they were exposed to HT (greatest at anthesis and histodifferentiation; no effect during seed maturation). Combining drought and HT damaged seed quality more than each stress alone, and more so in the Japonica cv. Gleva than the Indica cv. Aeron 1. Overall, the earlier plant drought occurred the greater the damage to subsequent seed quality; seed quality was most vulnerable to damage from plant drought and HT at anthesis and histodifferentiation; and seed quality of the Indica rice was more resilient to damage from these stresses than the Japonica.
Frascati international research criteria for HIV-associated neurocognitive disorders (HAND) are controversial; some investigators have argued that Frascati criteria are too liberal, resulting in a high false positive rate. Meyer et al. recommended more conservative revisions to HAND criteria, including exploring other commonly used methods of identifying neurocognitive impairment (NCI) in HIV, such as the global deficit score (GDS). This study compares NCI classifications by Frascati, Meyer, and GDS methods, in relation to neuroimaging markers of brain integrity in HIV.
Two hundred forty-one people living with HIV (PLWH) without current substance use disorder or severe (confounding) comorbid conditions underwent comprehensive neurocognitive testing and brain structural magnetic resonance imaging and magnetic resonance spectroscopy. Participants were classified using Frascati criteria versus Meyer criteria: concordant unimpaired [Frascati(Un)/Meyer(Un)], concordant impaired [Frascati(Imp)/Meyer(Imp)], or discordant [Frascati(Imp)/Meyer(Un)], i.e., impaired via Frascati criteria but unimpaired via Meyer criteria. To investigate the GDS versus Meyer criteria, the same groupings were utilized using GDS criteria instead of Frascati criteria.
When examining Frascati versus Meyer criteria, discordant Frascati(Imp)/Meyer(Un) individuals had less cortical gray matter, greater sulcal cerebrospinal fluid volume, and greater evidence of neuroinflammation (i.e., choline) than concordant Frascati(Un)/Meyer(Un) individuals. GDS versus Meyer comparisons indicated that discordant GDS(Imp)/Meyer(Un) individuals had less cortical gray matter and lower levels of energy metabolism (i.e., creatine) than concordant GDS(Un)/Meyer(Un) individuals. In both sets of analyses, the discordant group did not differ from the concordant impaired group on any neuroimaging measure.
The Meyer criteria failed to capture a substantial portion of PLWH with brain abnormalities. These findings support continued use of Frascati or GDS criteria to detect HIV-associated CNS dysfunction.
England has recently started a new paediatric influenza vaccine programme using a live-attenuated influenza vaccine (LAIV). There is uncertainty over how well the vaccine protects against more severe end-points. A test-negative case–control study was used to estimate vaccine effectiveness (VE) in vaccine-eligible children aged 2–16 years in preventing laboratory-confirmed influenza hospitalisation in England in the 2015–2016 season using a national sentinel laboratory surveillance system. Logistic regression was used to estimate the VE with adjustment for sex, risk-group, age group, region, ethnicity, deprivation and month of sample collection. A total of 977 individuals were included in the study (348 cases and 629 controls). The overall adjusted VE for all study ages and vaccine types was 33.4% (95% confidence interval (CI) 2.3–54.6) after adjusting for age group, sex, index of multiple deprivation, ethnicity, region, sample month and risk group. Risk group was shown to be an important confounder. The adjusted VE for all influenza types for the live-attenuated vaccine was 41.9% (95% CI 7.3–63.6) and 28.8% (95% CI −31.1 to 61.3) for the inactivated vaccine. The study provides evidence of the effectiveness of influenza vaccination in preventing hospitalisation due to laboratory-confirmed influenza in children in 2015–2016 and continues to support the rollout of the LAIV childhood programme.
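The arithmetic behind a test-negative VE estimate can be sketched simply: VE = (1 − odds ratio) × 100, where the odds ratio compares vaccination odds in test-positive cases against test-negative controls (the study's estimates additionally adjust for covariates via logistic regression). The cell counts below are hypothetical and are not taken from the study:

```python
def vaccine_effectiveness(vacc_cases, unvacc_cases, vacc_controls, unvacc_controls):
    """Crude test-negative-design VE from a 2x2 table:
    VE = (1 - OR) * 100, with OR the cross-ratio of vaccination
    odds in cases versus controls."""
    odds_ratio = (vacc_cases / unvacc_cases) / (vacc_controls / unvacc_controls)
    return (1 - odds_ratio) * 100

# Hypothetical counts: vaccinated/unvaccinated among cases and controls.
ve = vaccine_effectiveness(60, 288, 180, 449)  # ~48% crude VE
```

A crude estimate like this would then be refined with covariate adjustment (age group, region, risk group, and so on), which is why the study reports adjusted rather than raw figures.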
Decreases in cognitive function related to increases in oxidative stress and inflammation occur with ageing. Acknowledging the free radical-quenching activity and anti-inflammatory action of the carotenoid lycopene, the aim of the present review was to assess whether there is evidence for a protective relationship between lycopene and maintained cognitive function or between lycopene and development or progression of dementia. A systematic literature search identified five cross-sectional and five longitudinal studies examining these outcomes in relation to circulating or dietary lycopene. Among four studies evaluating relationships between lycopene and maintained cognition, three reported significant positive relationships. Neither of the two studies reporting on the relationship between lycopene and development of dementia reported significant results. Of four studies investigating circulating lycopene and pre-existing dementia, only one reported significant associations between lower circulating lycopene and higher rates of Alzheimer's disease mortality. Acknowledging heterogeneity among studies, there is insufficient evidence and a paucity of data to draw firm conclusions or tease apart direct effects of lycopene. Nevertheless, as low circulating lycopene is a predictor of all-cause mortality, further investigation into its relationship with cognitive longevity and dementia-related mortality is warranted.
Introduction: Little is known about the variety of roles volunteers play in the emergency department (ED), and the potential impact they have on patient experience. The objective of this scoping review was to identify published and unpublished reports that described volunteer programs in EDs, and determine how these programs impacted patient experiences or outcomes. Methods: Electronic searches of Medline, EMBASE, Cochrane Central Register of Controlled Trials, Cochrane Database of Systematic Reviews and CINAHL were conducted and reference lists were hand-searched. A grey literature search was also conducted (Web of Science, ProQuest, Canadian Business and Current Affairs Database, ProQuest Dissertations and Theses Global). Two reviewers independently screened titles and abstracts, reviewed full text articles, and extracted data. Results: The search strategy yielded 4,589 potentially relevant citations. After eliminating duplicate citations and articles that did not meet eligibility criteria, 87 reports were included in the review. Of the included reports, 18 were peer-reviewed articles, 6 were conference proceedings, 59 were magazine or newspaper articles, and 4 were graduate dissertations or theses. Volunteer activities were categorized as non-clinical tasks (e.g., provision of meals/snacks, comfort items and mobility assistance), navigation, emotional support/communication, and administrative duties. Fifty-two (59.8%) programs had general volunteers in the ED and 35 (40.2%) had volunteers targeting a specific patient population, including pediatrics, geriatrics, patients with mental health and addiction issues and other vulnerable populations. Twenty (23.0%) programs included an evaluative component describing how ED volunteers affected patient experiences and outcomes.
Patient satisfaction, follow-up and referral rates, ED and hospital costs and length of stay, subsequent ED visits, medical complications, and malnutrition in the hospital were all reported to be positively affected by volunteers in the ED. Conclusion: This scoping review demonstrates the important role volunteers play in enhancing patient and caregiver experience in the ED. Future volunteer engagement programs implemented in the ED should be formally described and evaluated to share their success and experience with others interested in implementing similar programs in the ED.
Objectives: Studies of neurocognitively elite older adults, termed SuperAgers, have identified clinical predictors and neurobiological indicators of resilience against age-related neurocognitive decline. Despite rising rates of older persons living with HIV (PLWH), SuperAging (SA) in PLWH remains undefined. We aimed to establish neuropsychological criteria for SA in PLWH and examined clinically relevant correlates of SA. Methods: 734 PLWH and 123 HIV-uninfected participants between 50 and 64 years of age underwent neuropsychological and neuromedical evaluations. SA was defined as demographically corrected (i.e., sex, race/ethnicity, education) global neurocognitive performance within normal range for 25-year-olds. Remaining participants were labeled cognitively normal (CN) or impaired (CI) based on actual age. Chi-square and analysis of variance tests examined HIV group differences on neurocognitive status and demographics. Within PLWH, neurocognitive status differences were tested on HIV disease characteristics, medical comorbidities, and everyday functioning. Multinomial logistic regression explored independent predictors of neurocognitive status. Results: Neurocognitive status rates and demographic characteristics differed between PLWH (SA=17%; CN=38%; CI=45%) and HIV-uninfected participants (SA=35%; CN=55%; CI=11%). In PLWH, neurocognitive groups were comparable on demographic and HIV disease characteristics. Younger age, higher verbal IQ, absence of diabetes, fewer depressive symptoms, and lifetime cannabis use disorder increased likelihood of SA. SA participants reported greater independence in everyday functioning, more frequent employment, and better health-related quality of life than non-SA participants. Conclusions: Despite combined neurological risk of aging and HIV, youthful neurocognitive performance is possible for older PLWH.
SA relates to improved real-world functioning and may be better explained by cognitive reserve and maintenance of cardiometabolic and mental health than HIV disease severity. Future research investigating biomarker and lifestyle (e.g., physical activity) correlates of SA may help identify modifiable neuroprotective factors against HIV-related neurobiological aging. (JINS, 2019, 25, 507–519)
Approximately 70% of the 30 000 known bee (Hymenoptera) species and most flower-visiting, solitary wasps (Hymenoptera) nest in the ground. However, nesting behaviours of most ground-nesting bees and wasps are poorly understood. Habitat loss, including loss of nesting habitat, threatens populations of ground-nesting bees and wasps. Most ground-nesting bee and wasp studies implement trapping methods that capture foraging individuals, but provide little insight into the nesting preferences of these taxa. Some researchers have suggested that emergence traps may provide a suitable means of determining ground-nesting bee and wasp abundance. We sought to evaluate nest-site selection of ground-nesting bees and wasps using emergence traps in two study systems: (1) planted wildflower enhancement plots and fallow control plots in agricultural land; and (2) upland pine and hammock habitat in forests. Over the course of three years (2015–2017), we collected 306 ground-nesting bees and wasps across all study sites from emergence traps. In one study, we compared captures per trap between coloured pan traps and emergence traps and found that coloured pan traps captured far more ground-nesting bees and wasps than did emergence traps. Based on our emergence trap data, our results also suggest ground-nesting bees and wasps are more apt to nest within wildflower enhancement plots than in fallow control plots, and in upland pine habitats than in hammock forests. In conclusion, emergence traps have potential to be a unique tool to gain understanding of ground-nesting bee and wasp habitat requirements.
Unfavourable dietary habits, such as skipping breakfast, are common among ethnic minority children and may contribute to inequalities in cardiometabolic disease. We conducted a longitudinal follow-up of a subsample of the UK multi-ethnic Determinants of Adolescent Social well-being and Health cohort, which represents the main UK ethnic groups and is now aged 21–23 years. We aimed to describe longitudinal patterns of dietary intake and investigate their impact on cardiometabolic risk in young adulthood. Participants completed a dietary behaviour questionnaire and a 24 h dietary intake recall; anthropometry, blood pressure, total cholesterol and HDL-cholesterol and HbA1c were measured. The cohort consisted of 107 White British, 102 Black Caribbean, 132 Black African, 98 Indian, 111 Bangladeshi/Pakistani and 115 other/mixed ethnicity. Unhealthful dietary behaviours such as skipping breakfast and low intake of fruits and vegetables were common (56, 57 and 63 %, respectively). Rates of skipping breakfast and low fruit and vegetable consumption were highest among Black African and Black Caribbean participants. BMI and cholesterol levels at 21–23 years were higher among those who regularly skipped breakfast at 11–13 years (BMI 1·41 (95 % CI 0·57, 2·26), P=0·001; cholesterol 0·15 (95 % CI –0·01, 0·31), P=0·063) and at 21–23 years (BMI 1·05 (95 % CI 0·22, 1·89), P=0·014; cholesterol 0·22 (95 % CI 0·06, 0·37), P=0·007). Childhood breakfast skipping is more common in certain ethnic groups and is associated with cardiometabolic risk factors in young adulthood. Our findings highlight the importance of targeting interventions to improve dietary behaviours such as breakfast consumption at specific population groups.
Flexible piezoelectric generators (PEGs) present a unique opportunity for renewable and sustainable energy harvesting. Here, we present a low-temperature and low-energy deposition method using solvent evaporation-assisted three-dimensional printing to deposit electroactive poly(vinylidene fluoride) (PVDF)-trifluoroethylene (TrFE) up to 19 structured layers. Visible-wavelength transmittance was above 92%, while ATR-FTIR spectroscopy showed little change in the electroactive phase fraction between layer depositions. Electroactivity from the fabricated PVDF-TrFE PEGs showed that a single structured layer gave the greatest output at 289.3 mV peak-to-peak voltage. This was proposed to be due to shear-induced polarization affording the alignment of the fluoropolymer dipoles without an electric field or high temperature.
Two Category 5 storms hit the US Virgin Islands (USVI) within 13 days of each other in September 2017. This caused an almost complete loss of power and devastated critical infrastructure such as hospitals and airports.
The USVI Department of Health conducted 2 response Community Assessments for Public Health Emergency Response (CASPERs) in November 2017 and a recovery CASPER in February 2018. CASPER is a 2-stage cluster sampling method designed to provide household-based information about a community’s needs in a timely, inexpensive, and representative manner.
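The 2-stage cluster design described above is commonly implemented as a selection of clusters with probability proportional to household count, followed by a simple random draw of households within each selected cluster (CASPER guidance typically uses 30 clusters of 7 households). A minimal sketch under those assumptions, with an entirely hypothetical sampling frame and a with-replacement simplification at stage 1:

```python
import random

def two_stage_sample(clusters, n_clusters=30, n_households=7, seed=42):
    """clusters: mapping of cluster name -> number of households.
    Stage 1: draw clusters with probability proportional to size
    (with replacement, a simplification of field practice).
    Stage 2: simple random sample of household indices per draw."""
    rng = random.Random(seed)
    names = list(clusters)
    weights = [clusters[c] for c in names]
    chosen = rng.choices(names, weights=weights, k=n_clusters)
    return {
        i: (c, sorted(rng.sample(range(clusters[c]),
                                 min(n_households, clusters[c]))))
        for i, c in enumerate(chosen)
    }

# Hypothetical frame: four census blocks and their household counts.
frame = {"A": 120, "B": 45, "C": 300, "D": 80}
selection = two_stage_sample(frame)  # 30 draws of 7 households each
```

Size-proportional selection at stage 1 combined with a fixed take at stage 2 is what gives each household an approximately equal overall probability of selection, which is why CASPER results can be reported as representative of the community.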
Almost 70% of homes were damaged or destroyed, 81.2% of homes still needed repair, and 10.4% of respondents felt their home was unsafe to live in approximately 5 months after the storms. Eighteen percent of individual respondents indicated that their mental health was “not good” for 14 or more days in the past month, a significant increase from 2016.
The CASPERs helped characterize the status and needs of residents after the devastating hurricanes and illustrate the evolving needs of the community and the progression of the recovery process. CASPER findings were shared with response and recovery partners to promote data-driven recovery efforts, improve the efficiency of the current response and recovery efforts, and strengthen emergency preparedness in USVI. (Disaster Med Public Health Preparedness. 2019;13:53-62)
Two Category 5 storms, Hurricane Irma and Hurricane Maria, hit the U.S. Virgin Islands (USVI) within 13 days of each other in September 2017. These storms caused catastrophic damage across the territory, including widespread loss of power, destruction of homes, and devastation of critical infrastructure. During large scale disasters such as Hurricanes Irma and Maria, public health surveillance is an important tool to track emerging illnesses and injuries, identify at-risk populations, and assess the effectiveness of response efforts. The USVI Department of Health (DoH) partnered with shelter staff volunteers to monitor the health of the sheltered population and help guide response efforts.
Shelter volunteers collect data on the American Red Cross Aggregate Morbidity Report form that tallies the number of client visits at a shelter’s health services every 24 hours. Morbidity data were collected at all 5 shelters on St. Thomas and St. Croix between September and October 2017. This article describes the health surveillance data collected in response to Hurricanes Irma and Maria.
Following Hurricanes Irma and Maria, 1130 health-related client visits were reported, accounting for 1655 reasons for the visits (each client may have more than 1 reason for a single visit). Only 1 shelter reported data daily. Over half of visits (51.2%) were for health care management; 17.7% for acute illnesses, which include respiratory conditions, gastrointestinal symptoms, and pain; 14.6% for exacerbation of chronic disease; 9.8% for mental health; and 6.7% for injury. Shelter volunteers treated many clients within the shelters; however, reporting of the disposition (eg, referred to physician, pharmacist) was often missed (78.1%).
Shelter surveillance is an efficient means of quickly identifying and characterizing health issues and concerns in sheltered populations following disasters, allowing for the development of evidence-based strategies to address identified needs. When incorporated into broader surveillance strategies using multiple data sources, shelter data can enable disaster epidemiologists to paint a more comprehensive picture of community health, thereby supporting planning and response to health issues both within and outside of shelters. The findings from this report illustrated that managing chronic conditions presented a more notable resource demand than acute injuries and illnesses. Although there remains room for improvement because reporting was inconsistent throughout the response, the capacity of shelter staff to address the health needs of shelter residents and the ability to monitor the health needs in the sheltered population were critical resources for the USVI DoH overwhelmed by the disaster. (Disaster Med Public Health Preparedness. 2019;13:38-43)