Across Eurasia, horse transport transformed ancient societies. Although evidence for chariotry is well dated, the origins of horse riding are less clear. Techniques to distinguish chariotry from riding in archaeological samples rely on elements not typically recovered from many steppe contexts. Here, the authors examine horse remains of Mongolia's Deer Stone-Khirigsuur (DSK) Complex, comparing them with ancient and modern East Asian horses used for both types of transport. DSK horses demonstrate unique dentition damage that could result from steppe chariotry, but may also indicate riding with a shallow rein angle at a fast gait. A key role for chariots in Late Bronze Age Mongolia helps explain the trajectory of horse use in early East Asia.
Background: Sink drains can act as breeding grounds for multidrug-resistant (MDR) bacteria, leading to outbreaks. Drains provide a protected humid environment where nutrient-rich substances are available. Recent and growing installation of water and energy conservation devices has led to increased frequency of drain blockage due to biofilm accumulation. Ineffective drainage may lead to backflow and accumulation of water in the sink during use, increasing the risk of contaminated aerosol formation or direct contamination of surrounding material and equipment. Cleaning and disinfection procedures of sink drains need to be improved to prevent amplification and dispersion of MDR bacteria. The objective of this study was to investigate alternatives to reduce the biofilm and risk of contamination through aerosols. Methods: Sink drains from patient rooms were randomly selected in the neonatal intensive care unit of a 450-bed pediatric hospital. We tested 4 approaches: (1) new drain; (2) self-disinfecting heating-vibration drain; (3) chemical disinfection with 20 ppm chlorine for 30 minutes; and (4) thermal disinfection with >90°C water for 30 minutes. A special device was used during disinfection to increase the disinfectant contact time with the biofilm. Treatments were conducted weekly, with prior sampling of drain water. Other drains were also sampled weekly, including a control drain with no intervention. Bacterial loads were evaluated using flow cytometry and heterotrophic plate counts. The drains were made of stainless steel, a heat-conductive material. Results: Preliminary results show that chlorine disinfection had a small impact (<1 log) on culturable bacteria at 48 hours after disinfection but not after a week or repeated weekly disinfection. Thermal disinfection using boiling water is promising, showing a substantial 4-log decrease in culturable cells after 48 hours and concentrations still 100× lower 1 week after disinfection.
Repeated weekly thermal disinfection maintained lower culturable levels in the drain. No culturable cells were detected in water from the self-disinfecting drain 2 months after installation, whereas, over the same period, the new drain became fully colonized, reaching concentrations similar to those of drains prior to intervention. Conclusions: Thermal disinfection of drains is a promising alternative to chlorine. This solution is attractive because it is nontoxic and easy to perform, requiring only a small volume of hot water. The rapid recolonization of the new drain suggests that replacing contaminated drains is not a sustainable solution and would need to be paired with a thermal disinfection program to maintain low levels of culturable cells.
Given the aging population of people with HIV (PWH), along with increasing rates of binge drinking among both PWH and the general older adult population, this study examined the independent and interactive effects of HIV, binge drinking, and age on neurocognition.
Participants were 146 drinkers stratified by HIV and binge drinking status (i.e., ≥4 drinks for women and ≥5 drinks for men within approximately 2 h): HIV+/Binge+ (n = 30), HIV−/Binge+ (n = 23), HIV+/Binge− (n = 55), HIV−/Binge− (n = 38). All participants completed a comprehensive neuropsychological battery measuring demographically-corrected global and domain-specific neurocognitive T scores. ANCOVA models examined independent and interactive effects of HIV and binge drinking on neurocognitive outcomes, adjusting for overall alcohol consumption, lifetime substance use, sex, and age. Subsequent multiple linear regressions examined whether HIV/Binge group moderated the relationship between age and neurocognition.
HIV+/Binge+ participants had worse global neurocognition, processing speed, delayed recall, and working memory than HIV−/Binge− participants (p’s < .05). While there were significant main effects of HIV and binge drinking, their interaction did not predict any of those neurocognitive outcomes (p’s > .05). Significant interactions between age and HIV/Binge group showed that HIV+/Binge+ participants demonstrated steeper negative relationships between age and neurocognitive outcomes of learning, delayed recall, and motor skills compared to HIV−/Binge− participants (p’s < .05).
Results showed adverse additive effects of HIV and binge drinking on neurocognitive functioning, with older adults demonstrating the most vulnerability to these effects. Findings support the need for interventions to reduce binge drinking, especially among older PWH.
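The moderation analysis described above (whether HIV/Binge group moderates the age–neurocognition relationship) is conventionally tested with an age × group interaction term in a linear regression. A minimal sketch on simulated data follows; the variable names, effect sizes, and sample values are illustrative assumptions, not the study's actual data or model specification:

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(1)
n = 146  # matches the reported sample size; data themselves are simulated

# Hypothetical stand-ins for the study variables
df = pd.DataFrame({
    "age": rng.uniform(25, 70, n),
    "binge_hiv": rng.integers(0, 2, n),  # 1 = HIV+/Binge+, 0 = HIV-/Binge-
})
# Simulate a steeper negative age slope in the HIV+/Binge+ group
df["t_score"] = (55 - 0.10 * df["age"]
                 - 0.25 * df["age"] * df["binge_hiv"]
                 + rng.normal(0, 3, n))

# The age:binge_hiv interaction coefficient is the moderation test:
# a significant negative value means the age slope is steeper for
# the HIV+/Binge+ group than for the HIV-/Binge- group.
fit = smf.ols("t_score ~ age * binge_hiv", data=df).fit()
print(fit.params["age:binge_hiv"].round(3))
```

In practice the study's models also adjusted for covariates (overall alcohol consumption, lifetime substance use, sex), which would simply be added to the formula.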
This study aimed to conduct longitudinal analysis of suicide reviews for mental health service users in Ayrshire to improve local practice and outcomes. Traditional risk factors – middle-age, male and alcohol misuse – were hypothesised to convey greater risk of completing suicide.
Suicide is an important public health issue in Scotland, with potentially devastating impacts. Practice and policy may lag behind emerging evidence. Mental health problems are associated with an increased suicide risk, and care provided to those who take their own lives is reviewed to identify recommendations and learning points to improve practice and outcomes. However, these reviews and their conclusions are often considered individually; studying them collectively over time makes it possible to characterise common themes and highlight factors that could be addressed to reduce suicide. Moreover, national averages can obscure local patterns.
Access to reviews of suicides for mental health service users in Ayrshire was granted by the Adverse Event Review Group. Relevant data were extracted for the 35 General Adult service users completing suicide between 2013 and 2015, including details of the act, demographics and clinical factors, and analysed for trends. Those with and without emotional instability as a primary diagnosis or significant problem were dichotomised to facilitate identification of statistically significant factors specific to these symptoms.
There were 35 completed suicides including three inpatients. Suicide was most common in the 25-29 and 45-54 age ranges, and 68.6% were male. Hanging accounted for 60.0% of deaths, and self-poisoning for 8.6%. Up to 62.9% of patients did not appear to have ongoing scheduled appointments on a regular basis. Diagnoses were difficult to identify – 48.6% had no clear primary diagnosis specified in the reviews, and features of depressive, anxiety, psychotic, substance misuse and personality disorders frequently overlapped and co-occurred. 22.9% had problems with emotional instability; their median age was 14 years younger, and 87.5% were female.
Small sample size precluded detailed analysis. The traditional risk profile remains relevant. However, almost 25% of those completing suicide were younger females with emotional instability, despite frequent contact with services. Given the challenges in predicting suicide, we should continue to consider how best to prevent this tragic outcome in all service users, especially in younger females with emotional instability; middle-aged males who misuse alcohol; and those with ill-defined, diffuse psychological difficulties who do not fit into discrete categories or are reviewed infrequently.
Indigenous Australians experience higher levels of psychological distress compared to the general population. Physical activity is a culturally acceptable approach, associated with reduction of depressive symptoms. The protective properties of physical activity for depressive symptoms are yet to be evaluated in older Indigenous Australians.
A two-phase study design was used, comprising a quantitative regression and moderation analysis followed by a qualitative thematic analysis.
Firstly, a total of 336 Indigenous Australians aged 60 years and over from five NSW areas participated in assessments on mental health, physical activity participation, and childhood trauma. Secondly, a focus group of seven Indigenous Australians was conducted to evaluate barriers and facilitators to physical activity.
Regression and moderation analyses examined links between depression, childhood trauma, and physical activity. Thematic analysis was conducted exploring facilitators and barriers to physical activity following the focus group.
Childhood trauma severity and intensity of physical activity predicted depressive symptoms. Physical activity did not affect the strength of the relationship between childhood trauma and depression. Family support and low impact activities facilitated commitment to physical activity. In contrast, poor mental health, trauma, and illness acted as barriers.
Physical activity is an appropriate approach for reducing depressive symptoms and is integral to maintaining health and quality of life. While situational factors, health problems and trauma impact physical activity, accessing low-impact group activities with social support was identified as helping to navigate these barriers.
The triarchic model was advanced as an integrative, trait-based framework for investigating psychopathy using different assessment methods and across developmental periods. Recent research has shown that the triarchic traits of boldness, meanness, and disinhibition can be operationalized effectively in youth, but longitudinal research is needed to realize the model's potential to advance developmental understanding of psychopathy. We report on the creation and validation of scale measures of the triarchic traits using questionnaire items available in the University of Southern California Risk Factors for Antisocial Behavior (RFAB) project, a large-scale longitudinal study of the development of antisocial behavior that includes measures from multiple modalities (self-report, informant rating, clinical-diagnostic, task-behavioral, physiological). Using a construct-rating and psychometric refinement approach, we developed triarchic scales that showed acceptable reliability, expected intercorrelations, and good temporal stability. The scales showed theory-consistent relations with external criteria including measures of psychopathy, internalizing/externalizing psychopathology, antisocial behavior, and substance use. Findings demonstrate the viability of measuring triarchic traits in the RFAB sample, extend the known nomological network of these traits into the developmental realm, and provide a foundation for follow-up studies examining the etiology of psychopathic traits and their relations with multimodal measures of cognitive-affective function and proneness to clinical problems.
Although recognized as one of the most significant cultural transformations in North America, the reintroduction of the horse to the continent after AD 1492 has been rarely addressed by archaeological science. A key contributing factor behind this limited study is the apparent absence of equine skeletal remains from early historic archaeological contexts. Here, we present a multidisciplinary analysis of a horse skeleton recovered in Lehi, Utah, originally attributed to the Pleistocene. Reanalysis of stratigraphic context and radiocarbon dating indicates a historic age for this horse (cal AD 1681–1939), linking it with Ute or other Indigenous groups, whereas osteological features demonstrate its use for mounted horseback riding—perhaps with a nonframe saddle. DNA analysis indicates that the animal was a female domestic horse, which was likely cared for as part of a breeding herd despite outliving its usefulness in transport. Finally, sequentially sampled stable carbon, oxygen, and strontium isotope values from tooth enamel (δ13C, δ18O, and 87Sr/86Sr) suggest that the horse was raised locally. These results show the utility of archaeological science as applied to horse remains in understanding Indigenous horse pastoralism, whereas consideration of the broader archaeological record suggests a pattern of misidentification of horse bones from early historic contexts.
Recently, artificial intelligence-powered devices have been put forward as potentially powerful tools for the improvement of mental healthcare. An important question is how these devices impact the physician-patient interaction.
Aifred is an artificial intelligence-powered clinical decision support system (CDSS) for the treatment of major depression. Here, we explore the use of a simulation centre environment in evaluating the usability of Aifred, particularly its impact on the physician–patient interaction.
Twenty psychiatry and family medicine attending staff and residents were recruited to complete a 2.5-h study at a clinical interaction simulation centre with standardised patients. Each physician had the option of using the CDSS to inform their treatment choice in three 10-min clinical scenarios with standardised patients portraying mild, moderate and severe episodes of major depression. Feasibility and acceptability data were collected through self-report questionnaires, scenario observations, interviews and standardised patient feedback.
All 20 participants completed the study. Initial results indicate that the tool was acceptable to clinicians and feasible for use during clinical encounters. Clinicians indicated a willingness to use the tool in real clinical practice, a significant degree of trust in the system's predictions to assist with treatment selection, and reported that the tool helped increase patient understanding of and trust in treatment. The simulation environment allowed for the evaluation of the tool's impact on the physician–patient interaction.
The simulation centre allowed for direct observations of clinician use and impact of the tool on the clinician–patient interaction before clinical studies. It may therefore offer a useful and important environment in the early testing of new technological tools. The present results will inform further tool development and clinician training materials.
To identify factors that increase the microbial load in the operating room (OR) and recommend solutions to minimize the effect of these factors.
Observation and sampling study.
Academic health center, public hospitals.
We analyzed 4 videotaped orthopedic surgeries (15 hours in total) for door openings and staff movement. The data were translated into a script denoting a representative frequency and location of movements for each OR team member. These activities were then simulated for 30 minutes per trial in a functional operating room by the researchers re-enacting OR staff-member roles, while collecting bacteria and fungi using settle plates. To test the hypotheses on the influence of activity on microbial load, an experimental design was created in which each factor was tested at higher (and lower) than normal activity settings for a 30-minute period. These trials were conducted in 2 phases.
The frequency of door opening did not independently affect the microbial load in the OR. However, a longer duration and greater width of door opening led to increased microbial load in the OR. Increased staff movement also increased the microbial load. There was a significantly higher microbial load on the floor than at waist level.
Movement of staff and the duration and width of door opening affect the OR microbial load. However, further investigation is needed to determine how the number of staff affects the microbial load and how to reduce the microbial load at the surgical table.
People living with serious mental illness (SMI) experience debilitating symptoms that worsen their physical health and quality of life. Regular physical activity (PA) may bring symptomatic improvements and enhance wellbeing. When undertaken in community-based group settings, PA may yield additional benefits such as reduced isolation. Initiating PA can be difficult for people with SMI, so PA engagement is commonly low. Designing acceptable and effective PA programs requires a better understanding of the lived experiences of PA initiation among people with SMI.
This systematic review of qualitative studies used the meta-ethnography approach by Noblit and Hare (1988). Electronic databases were searched from inception to November 2017. Eligible studies used qualitative methodology; involved adults (≥18 years) with schizophrenia, bipolar affective disorder, major depressive disorder, or psychosis; reported community-based group PA; and captured the experience of PA initiation, including key features of social support. Study selection and quality assessment were performed by four reviewers.
Sixteen studies were included in the review. We identified a “journey” that depicted a long sequence of phases involved in initiating PA. The journey demonstrated the thought processes, expectations, barriers, and support needs of people with SMI. In particular, social support from a trusted source played an important role in getting people to the activity, both physically and emotionally.
The journey illustrated that initiation of PA for people with SMI is a long, complex transition. This complex process needs to be understood before ongoing participation in PA can be addressed. Registration: The review was registered on the International Prospective Register of Systematic Reviews (PROSPERO) on 22/03/2017 (registration number CRD42017059948).
Preeclampsia (PE) and gestational hypertension (GH) are pregnancy-specific diseases that occur in around 10% of pregnancies worldwide. Increasing evidence suggests that women whose pregnancies were complicated by PE or GH, and their offspring, are at increased risk of cardiovascular disease (CVD) later in life. We hypothesised that PE and GH would associate with CVD risk factors 8–10 years after the first pregnancy in the mother and child and that differences in cardiovascular risk profile would be seen between 8- and 10-year-old male and female children. This is a follow-up study of the Adelaide SCOPE pregnancy cohort, in which 1164 nulliparous women and their babies were recruited between 2005 and 2008. Haemodynamic function was assessed using non-invasive USCOMBP+ and USCOM1A devices. Microvascular function was assessed by post-occlusive reactive hyperaemia. Of the 273 mother–child pairs followed up, 38 women had PE and 20 had GH during pregnancy. Augmentation index (Aix) and suprasystolic pulse pressure (ssPP) were increased, whereas measures of microvascular function were decreased, in children born to PE pregnancies compared with those from uncomplicated pregnancies. Female children had decreased Aix and ssPP compared to male children after in utero exposure to PE. Women who developed GH during their first pregnancy had increased systolic, diastolic and mean arterial pressures compared with women who had uncomplicated pregnancies. Our data suggest that GH is associated with increased cardiovascular risk in women 8–10 years after first pregnancy and PE is associated with increased offspring risk at 8–10 years of age, highlighting differences between these two hypertensive disorders of pregnancy.
The Community Children's Health Partnership (CCHP) was, at the time of the research project discussed in this chapter, a partnership between North Bristol National Health Service Trust (the NHS Trust) and the children's charity Barnardo’s. Working in close collaboration on service design, delivery and evaluation, the partners aimed to provide equitable and integrated care with a focus on participation and the voices of children and young people (CYP) from the outset. Along with the health services provided by the NHS Trust, CCHP included a dedicated participation service called HYPE (Helping Young People to Engage) delivered by Barnardo’s.
The CCHP was a case study for Louca-Mai's doctoral research on ‘embedding children and young people's participation in health services and research’ (Brady, 2017). The authors were involved in an action research project which sought to develop and embed participation across all CCHP services and identify learning to inform the development of CYP's participation in the development and delivery of health services. At the time Emily was the Barnardo's Participation Manager and had been involved as a key partner in the CCHP from the outset, and Felicity and Lizzy were involved through the HYPE project as young advisors. The project involved health professionals, young people and Barnardo's participation service working collaboratively to develop a strategy and framework to support children's participation in the organisation.
This chapter starts by outlining the background and methods used in the project, before exploring the lessons from this project for the involvement of CYP in health policy and services from both professional and young people's perspectives. We also consider events which took place after the completion of the project, when the CCHP was recommissioned and restructured, highlighting the risks of competitive tendering and NHS commissioning processes on the embedding of CYP's participation in health services. This chapter also draws on a previous publication by the authors (Brady et al, 2018).
The CCHP participation story
The CCHP was a partnership between the NHS Trust and Barnardo’s, contracted from 2009 to 2016 to deliver children's community health services in Bristol and South Gloucestershire. It employed over 800 staff in mental and physical health services including CAMHS (Child and Adolescent Mental Health Services), health visiting, school nursing, physiotherapy, speech and language therapy, occupational therapy, community paediatricians and seven specialist services, including an inpatient adolescent unit.
Background: Nosocomial infections cause 4%–56% mortality in newborns. Several epidemiological studies have shown that transmission of opportunistic pathogens from the sink to the patient, including Pseudomonas aeruginosa, Stenotrophomonas maltophilia, and Serratia marcescens, is associated with nosocomial infections in neonatal intensive care units (NICUs). In this project, we aimed to develop fast, accurate, and high-throughput multilocus sequence typing assays (HiMLST-Illumina) to detect opportunistic pathogens to assess their distribution in the sink environment of NICUs and their transfer to patients. Methods: Genome sequences of P. aeruginosa (n = 45), S. maltophilia (n = 23) and S. marcescens (n = 34) strains were retrieved from public genome databases to build their pangenomes, using the open-source PGAdb-builder server. The core genome was identified for each opportunistic pathogen and was searched for genes displaying the highest polymorphism. The minimal number of loci to include in a HiMLST-Illumina assay was determined by comparing topology of phylogenetic trees of concatenated loci based on genome similarity, computed as the average nucleotide identity (ANI) score. The primers used for HiMLST-Illumina schemes were designed in silico on a conserved domain and were tested on reference strains of each species. Results: Bioinformatics analyses showed that 3–4 loci (<300 base pairs per locus) distinguished strains with the same performance as ANI scores. The assays were tested using opportunistic pathogen isolates and environmental DNA originating from NICU sinks. The HiMLST-Illumina analysis of environmental DNA revealed the presence of at least 1 of the 3 studied opportunistic pathogens in 50% of sampled drains (n = 20). In a previous sampling, P. aeruginosa was isolated on selective culture media before and 48 hours after disinfection of a sink drain with chlorine. S. marcescens was also isolated from another sink 2 weeks after disinfection.
Identification of the isolates was confirmed by HiMLST-Illumina analyses, and the isolates will be typed for comparison with clinical isolates. Conclusions: Initial in silico tests predict a high discriminating power of the HiMLST-Illumina method, suggesting that it would be possible to quickly identify strains of interest in a large number of samples. A further strength of this method is that it enables molecular typing without the need for cultivation. Preliminary results suggest that sinks are readily colonized by opportunistic pathogens. This HiMLST-Illumina scheme will be applied in a 2-year intensive survey of NICUs in 3 hospitals in Montreal to evaluate the performance of new sink designs in limiting bioaerosol production and transmission of opportunistic pathogens to patients.
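The locus-selection step described above (searching the core genome for short genes with the highest polymorphism across strains) can be sketched as a simple ranking. The entropy criterion, function names, and toy alleles below are illustrative assumptions, not the authors' actual pipeline:

```python
from collections import Counter
from math import log2

def allele_diversity(alleles):
    """Shannon entropy (bits) of allele frequencies at one locus."""
    counts = Counter(alleles)
    total = sum(counts.values())
    return -sum((n / total) * log2(n / total) for n in counts.values())

def rank_candidate_loci(core_genes, max_len=300, n_loci=4):
    """Rank core-genome loci by allelic diversity across strains,
    keeping only loci short enough for a single short amplicon."""
    scored = [
        (name, allele_diversity(seqs))
        for name, seqs in core_genes.items()
        if all(len(s) <= max_len for s in seqs)
    ]
    scored.sort(key=lambda t: t[1], reverse=True)
    return scored[:n_loci]

# Toy example: three loci typed across five strains
core_genes = {
    "locusA": ["AAT", "AAT", "AAT", "AAT", "AAT"],  # monomorphic
    "locusB": ["GCA", "GCT", "GCA", "GCG", "GCT"],  # three alleles
    "locusC": ["TTG", "TTG", "TTA", "TTG", "TTA"],  # two alleles
}
print(rank_candidate_loci(core_genes, n_loci=2))
```

In the actual study, candidate loci were then validated by checking that phylogenies of the concatenated loci reproduced the ANI-based tree topology.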
To evaluate the associations of pregestational BMI, gestational weight gain (GWG) and breast-feeding at 1 month postpartum with four patterns of weight change during the first year after delivery: postpartum weight retention (PPWR), postpartum weight gain (PPWG), postpartum weight retention + gain (PPWR + WG) and return to pregestational weight.
In this secondary analysis of a prospective study, we categorised postpartum weight change into four patterns using pregestational weight and weights at 1, 6 and 12 months postpartum. We evaluated their associations with pregestational BMI, GWG and breast-feeding using multinomial logistic regression. Results are presented as relative risk ratios (RRR) and 95 % CI.
Women participating in the Programming Research in Obesity, Growth, Environment and Social Stressors pregnancy cohort.
Five hundred women were included (53 % of the cohort). Most women returned to their pregestational weight by 1 year postpartum (57 %); 8 % experienced PPWR, 14 % PPWG and 21 % PPWR + WG. Compared with normal weight, pregestational overweight (RRR 2·5, 95 % CI 1·3, 4·8) and obesity (RRR 2·2, 95 % CI 1·0, 4·7) were associated with a higher risk of PPWG. Exclusive breast-feeding, compared with no breast-feeding, was associated with a lower risk of PPWR (RRR 0·3, 95 % CI 0·1, 0·9). Excessive GWG, compared with adequate, was associated with a higher risk of PPWR (RRR 3·3, 95 % CI 1·6, 6·9) and PPWR + WG (RRR 2·4, 95 % CI 1·4, 4·2).
Targeting women with pregestational overweight or obesity and excessive GWG, as well as promoting breast-feeding, may impact the pattern of weight change after delivery and women’s long-term health.
Background: Delayed or in vitro inactive empiric antibiotic therapy may be detrimental to survival in patients with bloodstream infections (BSIs). Understanding the landscape of delayed or discordant empiric antibiotic therapy (DDEAT) across different patient, pathogen, and hospital types, as well as by their baseline resistance milieu, may enable providers, antimicrobial stewardship programs, and policy makers to optimize empiric prescribing. Methods: Inpatients with clinically suspected serious infection (based on sampling of blood cultures and receiving systemic antibiotic therapy on the same or next day) found to have BSI were identified in the Cerner Healthfacts EHR database. Patients were considered to have received DDEAT when, on culture sampling day, they received either no antibiotic(s) or none that displayed in vitro activity against the pathogenic bloodstream isolate. Antibiotic-resistant phenotypes were defined by in vitro resistance to taxon-specific prototype antibiotics (eg, methicillin/oxacillin resistance in S. aureus) and were used to estimate baseline resistance prevalence encountered by the hospital. The probability of DDEAT was examined by bacterial taxon, by time of BSI onset, and by presence versus absence of antibiotic-resistance phenotypes, sepsis or septic shock, hospital type, and baseline resistance. Results: Of 26,036 assessable patients with a BSI at 131 US hospitals between 2005 and 2014, 14,658 (56%) had sepsis, 3,623 (14%) had septic shock, 5,084 (20%) had antibiotic-resistant phenotypes, and 8,593 (33%) received DDEAT. Also, 4,428 (52%) recipients of DDEAT received no antibiotics on culture sampling day, whereas the remaining 4,165 (48%) received in vitro discordant therapy. DDEAT occurred most often in S. maltophilia (87%) and E. faecium (80%) BSIs; however, 75% of DDEAT cases and 76% of deaths among recipients of DDEAT collectively occurred among patients with S. aureus and Enterobacteriales BSIs. 
For every 8 bacteremic patients presenting with septic shock, 1 patient did not receive any antibiotics on culture day (Fig. 1A). Patients with BSIs of hospital (vs community) onset were twice as likely to receive no antibiotics on culture day, whereas those with bloodstream pathogens displaying antibiotic-resistant (vs susceptible) phenotypes were 3 times as likely to receive in vitro discordant therapy (Fig. 1B). The median proportion of DDEAT ranged between 25% (14, 37%) in eight <300-bed teaching hospitals in the lowest baseline resistance quartile and 40% (31, 50%) at five ≥300-bed teaching hospitals in the third baseline resistance quartile (Fig. 2). Conclusions: Delayed or in vitro discordant empiric antibiotic therapy is common among patients with BSI in US hospitals regardless of hospital size, teaching status, or local resistance patterns. Prompt empiric antibiotic therapy in septic shock and hospital-onset BSI needs more support. Reliable detection of S. aureus and Enterobacteriales bloodstream pathogens and their resistance patterns earlier with rapid point-of-care diagnostics may mitigate the population-level impact of DDEAT in BSI.
Funding: This study was funded in part by the National Institutes of Health Clinical Center, National Institutes of Allergy and Infectious Diseases, National Cancer Institute (NCI contract no. HHSN261200800001E) and the Agency for Healthcare Research and Quality.
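The DDEAT definition used in the study above (on the culture-sampling day, the patient received either no antibiotics or none with in vitro activity against the bloodstream isolate) reduces to a simple classification rule. This sketch is illustrative; the function name and drug sets are hypothetical, not the study's actual code:

```python
def classify_empiric_therapy(administered, active_against_isolate):
    """Classify empiric therapy on the culture-sampling day.

    administered:           set of antibiotics the patient received that day
    active_against_isolate: set of antibiotics with in vitro activity
                            against the pathogenic bloodstream isolate

    Returns "no_antibiotics" (delayed, counts as DDEAT),
    "discordant" (in vitro inactive, counts as DDEAT),
    or "concordant".
    """
    if not administered:
        return "no_antibiotics"
    if administered & active_against_isolate:
        return "concordant"
    return "discordant"

# Hypothetical MRSA bacteremia: cefazolin inactive, vancomycin active
active = {"vancomycin", "daptomycin", "linezolid"}
print(classify_empiric_therapy(set(), active))                      # delayed
print(classify_empiric_therapy({"cefazolin"}, active))              # discordant
print(classify_empiric_therapy({"cefazolin", "vancomycin"}, active))
```

Applied per patient per culture day, this rule partitions the cohort the way the abstract reports: 52% of DDEAT recipients fell in the first branch (no antibiotics) and 48% in the discordant branch.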
SHEA endorses adhering to the recommendations by the CDC and ACIP for immunizations of all children and adults. All persons providing clinical care should be familiar with these recommendations and should routinely assess immunization compliance of their patients and strongly recommend all routine immunizations to patients. All healthcare personnel (HCP) should be immunized against vaccine-preventable diseases as recommended by the CDC/ACIP (unless immunity is demonstrated by another recommended method). SHEA endorses the policy that immunization should be a condition of employment or functioning (students, contract workers, volunteers, etc) at a healthcare facility. Only recognized medical contraindications should be accepted for not receiving recommended immunizations.
Hydrocarbon contamination plagues high-resolution and analytical electron microscopy by depositing carbonaceous layers onto surfaces during electron irradiation, which can render carefully prepared specimens useless. Increased specimen thickness degrades resolution through beam broadening alongside loss of contrast. The large inelastic cross-section of carbon hampers accurate atomic species detection. Residual oxygen and water molecules can damage the lattice by chemically etching the specimen during imaging. These constraints on high-resolution and spectroscopic imaging demand clean, high-vacuum microscopes with dry pumps. Here, we present an open-hardware design of a high-vacuum manifold for transmission electron microscopy (TEM) holders to mitigate hydrocarbon and residual species exposure. We quantitatively show that TEM holders are inherently dirty and introduce a range of unwanted chemical species. Overnight storage in our manifold reduces contaminants by one to two orders of magnitude and promotes two to four times faster vacuum recovery. A built-in bakeout system further reduces contaminant partial pressures to below 10⁻¹⁰ hPa (≈10⁻¹⁰ Torr), approximately four orders of magnitude down from ambient storage, and alleviates monolayer adsorption during a typical TEM experiment. We determine that bakeout of the TEM holder with the specimen held therein is the optimal cleaning method. Our high-vacuum manifold design is published with open-source blueprints, parts, and cost list.
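The claim that pushing partial pressures below ~10⁻¹⁰ hPa alleviates monolayer adsorption over a typical experiment follows from kinetic gas theory: the impingement flux is Φ = P / √(2πmkT), and the monolayer time is the surface site density divided by the sticking-weighted flux. A rough estimate, assuming N₂-like molecules, unit sticking coefficient, and ~10¹⁹ sites/m² (all round-number assumptions, not values from the paper):

```python
from math import sqrt, pi

K_B = 1.380649e-23  # Boltzmann constant, J/K
N_A = 6.02214076e23  # Avogadro's number, 1/mol

def monolayer_time(pressure_pa, molar_mass_kg=28e-3, temp_k=293.0,
                   site_density=1e19, sticking=1.0):
    """Kinetic-theory estimate of the time (s) to adsorb one monolayer.

    Impingement flux: Phi = P / sqrt(2*pi*m*k_B*T)   [molecules / m^2 s]
    Monolayer time:   t   = site_density / (sticking * Phi)
    """
    m = molar_mass_kg / N_A  # mass per molecule, kg
    flux = pressure_pa / sqrt(2 * pi * m * K_B * temp_k)
    return site_density / (sticking * flux)

# Monolayer times stretch from seconds at 1e-6 hPa to ~10 hours at 1e-10 hPa
for p_hpa in (1e-6, 1e-8, 1e-10):
    p_pa = p_hpa * 100.0  # 1 hPa = 100 Pa
    print(f"{p_hpa:.0e} hPa -> {monolayer_time(p_pa):,.0f} s")
```

At 10⁻¹⁰ hPa the estimated monolayer time is on the order of 10 hours, longer than a typical TEM session, which is the sense in which the bakeout "alleviates monolayer adsorption".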
Sports medicine clinicians face conflicts of interest in providing medical care to athletes. Using a survey of college football players, this study evaluates whether athletes are aware of these conflicts of interest, whether these conflicts affect athlete trust in their health care providers, or whether conflicts or athletes' trust in stakeholders are associated with athletes' injury reporting behaviors.
Increasing evidence suggests that circulating factors and immune dysfunction may contribute to the pathogenesis of schizophrenia. In particular, proinflammatory cytokines, complement and autoantibodies against CNS epitopes have recently been associated with psychosis. Related concepts in previous decades led to several clinical trials of dialysis and plasmapheresis as treatments for schizophrenia. These trials may have relevance for the current understanding of schizophrenia. We aimed to identify whether dialysis or plasmapheresis are beneficial interventions in schizophrenia. We conducted a systematic search in major electronic databases for high-quality studies (double-blinded randomised trials with sham controls) applying either haemodialysis or plasmapheresis as an intervention in patients with schizophrenia, published in English from the start of records until September 2018. We found nine studies meeting inclusion criteria, reporting on 105 patients in total who received either sham or active intervention. One out of eight studies reported a beneficial effect of haemodialysis on schizophrenia, one a detrimental effect and six no effect. The sole trial of plasmapheresis found it to be ineffective. Adverse events were reported in 23% of patients. Studies were at unclear or high risk of bias. It is unlikely that haemodialysis is a beneficial treatment in schizophrenia, although the studies were of small size and could not consider potential subgroups. Plasmapheresis was only addressed by one study and warrants further exploration as a treatment modality in schizophrenia.