We describe here efforts to create and study magnetized electron–positron pair plasmas, the existence of which in astrophysical environments is well-established. Laboratory incarnations of such systems are becoming ever more possible due to novel approaches and techniques in plasma, beam and laser physics. Traditional magnetized plasmas studied to date, both in nature and in the laboratory, exhibit a host of different wave types, many of which are generically unstable and evolve into turbulence or violent instabilities. This complexity and the instability of these waves stem to a large degree from the difference in mass between the positively and the negatively charged species: the ions and the electrons. The mass symmetry of pair plasmas, on the other hand, results in unique behaviour, a topic that has been intensively studied theoretically and numerically for decades, but experimental studies are still in the early stages of development. A levitated dipole device is now under construction to study magnetized low-energy, short-Debye-length electron–positron plasmas; this experiment, as well as a stellarator device that is in the planning stage, will be fuelled by a reactor-based positron source and make use of state-of-the-art positron cooling and storage techniques. Relativistic pair plasmas with very different parameters will be created using pair production resulting from intense laser–matter interactions and will be confined in a high-field mirror configuration. We highlight the differences between and similarities among these approaches, and discuss the unique physics insights that can be gained by these studies.
Oral contraceptive use has been previously associated with an increased risk of suicidal behavior in some, but not all, samples. The use of large, representative, longitudinally-assessed samples may clarify the nature of this potential association.
We used Swedish national registries to identify women born between 1991 and 1995 (N = 216 702) and determine whether they retrieved prescriptions for oral contraceptives. We used Cox proportional hazards models to test the association between contraceptive use and first observed suicidal event (suicide attempt or death) from age 15 until the end of follow-up in 2014 (maximum age 22.4). We adjusted for covariates, including mental illness and parental history of suicide.
In a crude model, use of combination or progestin-only oral contraceptives was positively associated with suicidal behavior, with hazard ratios (HRs) of 1.73–2.78 after 1 month of use and 1.25–1.82 after 1 year of use. Accounting for sociodemographic, parental, and psychiatric variables attenuated these associations, and risks declined with increasing duration of use: adjusted HRs ranged from 1.56 to 2.13 one month after initiation of use, and from 1.19 to 1.48 one year after initiation. HRs were higher among women who ceased use during the observation period.
Young women using oral contraceptives may be at increased risk of suicidal behavior, but risk declines with increased duration of use. Analysis of former users suggests that women susceptible to depression/anxiety are more likely to cease hormonal contraceptive use. Additional studies are necessary to determine whether the observed association is attributable to a causal mechanism.
We studied the compositional turnover in infracommunities and component communities of ecto- and endoparasites infesting a bat, Miniopterus natalensis (Chiroptera, Miniopteridae), across seven sampling sites using the zeta diversity metric (measuring similarity between multiple communities) and calculating zeta decline and retention rate (both scales) and zeta decay (component communities). We asked whether the patterns of zeta diversity differ between (a) infracommunities and component communities; (b) ecto- and endoparasites and (c) subsets of communities infecting male and female bats. The pattern of compositional turnover differed between infracommunities and component communities in endoparasites only. The shape of zeta decline for infracommunities indicated that ecto- and endoparasitic species were approximately equally likely to occur on/in any bat individual within a site. The shape of zeta decline for component communities suggested that ectoparasite turnover was stochastic, whereas the turnover of endoparasites was driven by niche-based processes. Compositional turnover in component communities of ectoparasites was more spatially dependent than that of endoparasites. The spatial independence of compositional turnover in endoparasites was due to subcommunities harboured by female bats. We conclude that the patterns of compositional turnover in infracommunities were similar in ecto- and endoparasites, whereas the patterns of turnover in component communities differed between these groups.
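The zeta diversity metric used above can be made concrete with a small sketch. The community data here are hypothetical; following Hui and McGeoch's definition, zeta_k is the mean number of species shared by every subset of k communities, and the sequence zeta_1, zeta_2, … is the zeta decline:

```python
from itertools import combinations

def zeta(communities, k):
    """Mean number of species shared by every k-subset of communities."""
    subsets = list(combinations(communities, k))
    shared = [len(set.intersection(*map(set, s))) for s in subsets]
    return sum(shared) / len(shared)

# Hypothetical parasite communities at three sites (species labels illustrative)
sites = [{"A", "B", "C"}, {"A", "B", "D"}, {"A", "C", "D"}]

zeta1 = zeta(sites, 1)  # mean richness per site -> 3.0
zeta2 = zeta(sites, 2)  # mean species shared by each pair of sites -> 2.0
zeta3 = zeta(sites, 3)  # species shared by all three sites -> 1.0
```

The rate at which the sequence zeta_1, zeta_2, zeta_3 falls (here 3, 2, 1) is the zeta decline whose shape the study interprets in terms of stochastic versus niche-based turnover.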
Neurodegenerative diseases (NDDs), such as Alzheimer’s disease, frontotemporal dementia, dementia with Lewy bodies, and Huntington’s disease, inevitably lead to impairments in higher-order cognitive functions, including the perception of emotional cues and decision-making behavior. Such impairments are likely to cause risky daily life behavior, for instance, in traffic. Impaired recognition of emotional expressions, such as fear, is considered a marker of impaired experience of emotions. Lower fear experience can, in turn, be related to risk-taking behavior. The aim of our study was to investigate whether impaired emotion recognition in patients with NDD is indeed related to unsafe decision-making in risky everyday life situations, which has not been investigated yet.
Fifty-one patients with an NDD were included. Emotion recognition was measured with the Facial Expressions of Emotions: Stimuli and Test (FEEST). Risk-taking behavior was measured with driving simulator scenarios and the Action Selection Test (AST). Data from matched healthy controls were used: FEEST (n = 182), AST (n = 36), and driving simulator (n = 18).
Compared to healthy controls, patients showed significantly worse emotion recognition, particularly of anger, disgust, fear, and sadness. Furthermore, patients took significantly more risks in the driving simulator rides and the AST. Only poor recognition of fear was related to a higher amount of risky decisions in situations involving a direct danger.
To determine whether patients with an NDD are still fit to drive, it is crucial to assess their ability to make safe decisions. Measuring emotion recognition may be a valuable contribution to this judgment.
Radiocarbon (14C) ages cannot provide absolutely dated chronologies for archaeological or paleoenvironmental studies directly but must be converted to calendar age equivalents using a calibration curve compensating for fluctuations in atmospheric 14C concentration. Although calibration curves are constructed from independently dated archives, they invariably require revision as new data become available and our understanding of the Earth system improves. In this volume the international 14C calibration curves for both the Northern and Southern Hemispheres, as well as for the ocean surface layer, have been updated to include a wealth of new data and extended to 55,000 cal BP. Based on tree rings, IntCal20 now extends as a fully atmospheric record to ca. 13,900 cal BP. For the older part of the timescale, IntCal20 comprises statistically integrated evidence from floating tree-ring chronologies, lacustrine and marine sediments, speleothems, and corals. We utilized improved evaluation of the timescales and location variable 14C offsets from the atmosphere (reservoir age, dead carbon fraction) for each dataset. New statistical methods have refined the structure of the calibration curves while maintaining a robust treatment of uncertainties in the 14C ages, the calendar ages and other corrections. The inclusion of modeled marine reservoir ages derived from a three-dimensional ocean circulation model has allowed us to apply more appropriate reservoir corrections to the marine 14C data rather than the previous use of constant regional offsets from the atmosphere. Here we provide an overview of the new and revised datasets and the associated methods used for the construction of the IntCal20 curve and explore potential regional offsets for tree-ring data. 
We discuss the main differences with respect to the previous calibration curve, IntCal13, and some of the implications for archaeology and geosciences ranging from the recent past to the time of the extinction of the Neanderthals.
Non-medical opioid use (NMOU) is a growing crisis. Cancer patients at elevated risk of NMOU (+risk) are frequently underdiagnosed. The aim of this paper was to develop a nomogram to predict the probability of +risk among cancer patients receiving outpatient supportive care consultation at a comprehensive cancer center.
3,588 consecutive patients referred to a supportive care clinic were reviewed. All patients had a diagnosis of cancer and were on opioids for pain. All patients were assessed using the Edmonton Symptom Assessment Scale (ESAS), Screener and Opioid Assessment for Patients with Pain (SOAPP-14), and CAGE-AID (Cut Down-Annoyed-Guilty-Eye Opener) questionnaires. “+risk” was defined as an SOAPP-14 score of ≥7. A nomogram was devised based on the risk factors determined by the multivariate logistic regression model to estimate the probability of +risk.
731/3,588 consults were +risk. +risk was significantly associated with gender, race, marital status, smoking status, depression, anxiety, financial distress, MEDD (morphine equivalent daily dose), and CAGE-AID score. The C-index was 0.8. A nomogram was developed and can be accessed at https://is.gd/soappnomogram. For example, for a male Hispanic patient, married, never smoked, with ESAS scores for depression = 3, anxiety = 3, financial distress = 7, a CAGE-AID score of 0, and an MEDD of 20, the total score is 9 + 9 + 0 + 0 + 6 + 10 + 23 + 0 + 1 = 58. A nomogram score of 58 indicates a probability of +risk of 0.1.
We established a practical nomogram to assess +risk. Applying a nomogram based on routinely collected clinical data can help clinicians identify patients at +risk and positively impact care planning.
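A minimal sketch of the point bookkeeping in the worked nomogram example above. Assigning each point value to a variable assumes the text lists them in the same order as the sum; the mapping from total score to probability lives in the published nomogram itself:

```python
# Point contributions for the example patient (assignment of points to
# variables is our assumption, matching the order listed in the text)
points = {
    "male": 9,
    "Hispanic": 9,
    "married": 0,
    "never smoked": 0,
    "ESAS depression = 3": 6,
    "ESAS anxiety = 3": 10,
    "ESAS financial distress = 7": 23,
    "CAGE-AID = 0": 0,
    "MEDD = 20": 1,
}

total = sum(points.values())
print(total)  # 58; per the text, a total score of 58 maps to a +risk probability of 0.1
```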
Alcohol use disorder (AUD) is common and associated with increased risk of suicide.
To examine healthcare utilisation prior to suicide in persons with AUD in a large population-based cohort, which may reveal opportunities for prevention.
A national cohort study was conducted of 6 947 191 adults in Sweden in 2002, including 256 647 (3.7%) with AUD, with follow-up for suicide through 2015. A nested case–control design examined healthcare utilisation among people with AUD who died by suicide and 10:1 age- and gender-matched controls.
In 86.7 million person-years of follow-up, 15 662 (0.2%) persons died by suicide, including 2601 (1.0%) with AUD. Unadjusted and adjusted relative risks for suicide associated with AUD were 8.15 (95% CI 7.86–8.46) and 2.22 (95% CI 2.11–2.34). Of the people with AUD who died by suicide, 39.7% and 75.6% had a healthcare encounter <2 weeks or <3 months before the index date respectively, compared with 6.3% and 25.4% of controls (adjusted prevalence ratio (PR) and difference (PD), <2 weeks: PR = 3.86, 95% CI 3.50–4.25, PD = 26.4, 95% CI 24.2–28.6; <3 months: PR = 2.03, 95% CI 1.94–2.12, PD = 34.9, 95% CI 32.6–37.1). AUD accounted for more healthcare encounters within 2 weeks of suicide among men than women (P = 0.01). Of last encounters, 48.1% were in primary care and 28.9% were in specialty out-patient clinics, mostly for non-psychiatric diagnoses.
Suicide among persons with AUD is often shortly preceded by healthcare encounters in primary care or specialty out-patient clinics. Encounters in these settings are important opportunities to identify active suicidality and intervene accordingly in patients with AUD.
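The prevalence ratios and differences reported above are model-adjusted; as a hedged illustration, the crude (unadjusted) versions can be read straight off the reported encounter percentages, and differ from the adjusted estimates as expected:

```python
def crude_pr_pd(p_cases, p_controls):
    """Crude prevalence ratio and prevalence difference (percentage points)."""
    return p_cases / p_controls, p_cases - p_controls

# <2 weeks before the index date: 39.7% of AUD suicide decedents vs 6.3% of controls
pr_2wk, pd_2wk = crude_pr_pd(39.7, 6.3)   # crude PR ~ 6.3, PD ~ 33.4
# <3 months: 75.6% vs 25.4%
pr_3mo, pd_3mo = crude_pr_pd(75.6, 25.4)  # crude PR ~ 2.98, PD ~ 50.2
```

The gap between these crude values and the adjusted PR/PD in the text (3.86 and 26.4 for <2 weeks) reflects the covariate adjustment in the models.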
OBJECTIVES/GOALS: We compared the validity of an International Classification of Diseases, Clinical Modification (ICD) algorithm for identifying high-grade cervical intraepithelial neoplasia and adenocarcinoma in situ (together referred to as CIN2+) from ICD 9th revision (ICD-9) and 10th revision (ICD-10) codes. METHODS/STUDY POPULATION: Using Tennessee Medicaid data, we identified cervical diagnostic procedures in 2008-2017 among females aged 18-39 years in Davidson County, TN. Gold-standard cases were pathology-confirmed CIN2+ diagnoses validated by HPV-IMPACT, a population-based surveillance project in catchment areas of five US states. Procedures in the ICD transition year (2015) were excluded to account for implementation lag. We pre-grouped diagnosis and procedure codes by theme. We performed feature selection using least absolute shrinkage and selection operator (LASSO) logistic regression with 10-fold cross validation and validated models by ICD-9 era (2008-2014, N = 6594) and ICD-10 era (2016-2017, N = 1270). RESULTS/ANTICIPATED RESULTS: Of 7864 cervical diagnostic procedures, 880 (11%) were true CIN2+ cases. LASSO logistic regression selected the strongest features of case status: Having codes for a CIN2+ tissue diagnosis, non-specific CIN tissue diagnosis, high-grade squamous intraepithelial lesion, receiving a cervical treatment procedure, and receiving a cervical/vaginal biopsy. Features of non-case status were codes for a CIN1 tissue diagnosis, Pap test, and HPV DNA test. The ICD-9 vs ICD-10 algorithms predicted case status with 68% vs 63% sensitivity, 95% vs 94% specificity, 63% vs 64% positive predictive value, 96% vs 94% negative predictive value, 92% vs 89% accuracy, and C-indices of 0.95 vs 0.92, respectively. DISCUSSION/SIGNIFICANCE OF IMPACT: Overall, the algorithm’s validity for identifying CIN2+ case status was similar between coding versions. ICD-9 had slightly better discriminative ability. 
Results support a prior study concluding that ICD-10 implementation has not substantially improved the quality of administrative data from ICD-9.
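All of the validity metrics reported above derive from a 2×2 confusion matrix of algorithm-predicted versus gold-standard case status. The sketch below uses hypothetical cell counts (not the study's actual data) chosen so the resulting metrics approximate the ICD-9 era figures:

```python
def validity_metrics(tp, fp, fn, tn):
    """Standard diagnostic accuracy metrics from a 2x2 confusion matrix."""
    return {
        "sensitivity": tp / (tp + fn),
        "specificity": tn / (tn + fp),
        "ppv": tp / (tp + fp),          # positive predictive value
        "npv": tn / (tn + fn),          # negative predictive value
        "accuracy": (tp + tn) / (tp + fp + fn + tn),
    }

# Hypothetical counts for illustration only
m = validity_metrics(tp=68, fp=40, fn=32, tn=760)
# m["sensitivity"] -> 0.68, m["specificity"] -> 0.95, m["accuracy"] -> 0.92
```

The C-index, by contrast, summarizes discrimination across all probability thresholds and cannot be recovered from a single 2×2 table.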
Dietary fibre fermentation in humans and monogastric animals is considered to occur in the hindgut, but it may also occur in the lower small intestine. This study aimed to compare ileal and hindgut fermentation in the growing pig fed a human-type diet using a combined in vivo/in vitro methodology. Five pigs (23 (sd 1·6) kg body weight) were fed a human-type diet. On day 15, pigs were euthanised. Digesta from terminal jejunum and terminal ileum were collected as substrates for fermentation. Ileal and caecal digesta were collected for preparing microbial inocula. Terminal jejunal digesta were fermented in vitro with a pooled ileal digesta inoculum for 2 h, whereas terminal ileal digesta were fermented in vitro with a pooled caecal digesta inoculum for 24 h. The ileal organic matter fermentability (28 %) was not different from hindgut fermentation (35 %). However, the organic matter fermented was 66 % greater for ileal fermentation than hindgut fermentation (P = 0·04). Total numbers of bacteria in ileal and caecal digesta did not differ (P = 0·09). Differences (P < 0·05) were observed in the taxonomic composition. For instance, ileal digesta contained 32-fold greater number of the genus Enterococcus, whereas caecal digesta had a 227-fold greater number of the genus Ruminococcus. Acetate synthesis and iso-valerate synthesis were greater (P < 0·05) for ileal fermentation than hindgut fermentation, but propionate, butyrate and valerate synthesis was lower. SCFA were absorbed in the gastrointestinal tract location where they were synthesised. In conclusion, a quantitatively important degree of fermentation occurs in the ileum of the growing pig fed a human-type diet.
This project will work closely with existing service partners involved in street-level services and focus on testing and evaluating three approaches to street-level interventions for youth who are homeless and who have moderate or severe mental illness. Youth will be asked to choose their preferred service approach:
1. Housing First: initiatives focused on interventions designed to move youth to appropriate and available housing, with ongoing housing supports.
2. Treatment First: initiatives to provide mental health/addiction supports and treatment solutions.
3. Housing and Treatment Together: simultaneous attention to both housing and treatment.
Our primary objective is to understand the service delivery preferences of homeless youth and understand the outcomes of these choices. Our research questions include:
1. Which approaches to service are chosen by youth?
2. What are the differences and similarities between groups choosing each approach?
3. What are the critical ingredients needed to effectively implement services for homeless youth from the perspectives of youth, families and service providers?
Focus groups with staff and family members will be conducted to help us understand the nature of each service approach, changes that evolve within services, and facilitators and barriers to service delivery. This work will be important in determining which approach is chosen by youth and why. Evaluating the outcomes associated with each choice will provide valuable information about the service options chosen by youth. This will assist in identifying weaknesses in the services offered and inform further development of treatment options that youth will accept.
Determining best practices for managing free farrowing systems is crucial for uptake. Cross-fostering, the exchange of piglets between litters, is routinely performed amongst crate-housed sows. However, cross-fostering can increase fighting amongst the litter and may be more challenging within free farrowing systems as sows have more freedom to respond to cross-fostered piglets. This study compared the effect of either cross-fostering (FOS), or a control of sham-fostering (CON), of four focal piglets per litter on Day 6 postpartum in crates (CRATE) and free farrowing pens (PEN). The post-treatment behavioural responses of sows were recorded (Day 6 = 60 min; Day 7 = 300 min; n = 48), as were the average daily gain (ADG; g/day), total weight gain (TWG; kg) and body lesion scores of focal piglets and their littermates throughout lactation (Day 6, Day 8, Day 11 and Day 26; n = 539) and the post-weaning period (Day 29, Day 32 and Day 60; n = 108). On Day 6, though post-reunion latency to nursing did not differ, latency to successful nursing was longer amongst FOS than CON litters (P < 0.001), more so amongst CRATE FOS than PEN FOS (P < 0.01). On Day 7, PEN FOS sows had fewer successful nursing bouts (P < 0.05) and exhibited decreased lateral (P < 0.01) and increased ventral lying frequencies (P < 0.01) compared to all other housing and treatment combinations. Focal piglet ADG was lower for FOS than CON in the CRATE during Day 6 to Day 8 (P < 0.01) and lower in the PEN during Day 6 to Day 8 (P < 0.001), Day 8 to Day 11 (P < 0.01) and Day 11 to Day 26 (P < 0.05). The TWG of pre-weaned focal piglets (Day 6 to Day 26) was higher amongst CON than FOS litters (P = 0.01). 
Post-weaning, piglet ADG was higher for PEN than CRATE during Day 26 to Day 29 (P < 0.01) and higher for FOS than CON during Day 26 to Day 29 (P < 0.05), Day 29 to Day 32 (P < 0.001) and Day 32 to Day 60 (P < 0.01); thus, TWG was higher for FOS than CON during the weaner (P = 0.001) and the combined lactation and weaner periods (P = 0.09). In conclusion, sow behaviour was disrupted by cross-fostering in the crates and pens and continued to be disturbed on the following day amongst penned sows. FOS piglets exhibited reduced ADG after cross-fostering, which extended throughout lactation in the pens. However, the increased post-weaning weight gain of FOS piglets meant that their TWG was higher than CON piglets, irrespective of the farrowing system used.
In a study conducted in the database of a large commercial healthcare insurer, we previously demonstrated that use of a commercial pharmacogenetic assay for individuals with mood disorders was associated with decreased resource utilization and cost in the 6 month period following use compared to propensity-score matched controls. We conducted a post hoc analysis to understand variables associated with high cost savings.
The results and methods of the initial study have been described previously. Cases were individuals with mood and anxiety disorders who received a commercial pharmacogenetic assay (Genomind, King of Prussia, PA) to inform pharmacotherapy. 817 tested individuals (cases) with mood and/or anxiety disorders were matched to 2745 controls. Overall costs were estimated to be $1,948 lower in the tested group; the differences were largely the result of lower emergency room and inpatient utilization among cases. In the present analysis, the cost difference for cases compared to their matched controls was rank-ordered by decile. High cost savers were arbitrarily defined a priori as the top 20% of savers. Using multivariable modeling techniques, an ordinal logistic regression model was generated in which baseline or follow-up variables were statistically tested for independent associations with high, low, and no cost savings.
606 (74%) of cases were net cost savers compared to their controls (cost difference <0). High cost savers (n=121) saved on average $10,690 compared to their matched controls. They were statistically more likely to have been diagnosed with bipolar disorder (n=33/121) than low cost savers (n=57/485) or non-savers (n=31/211), and had a lower Charlson Comorbidity Index. High cost savers had a lower mean number of antidepressants in the baseline period (mean=3.16) than non-savers (3.73) but a higher one than low cost savers (2.72) (p<0.05 across groups). In a multivariable model, bipolar disorder, antidepressant count, outpatient visits, and inpatient visits were statistically associated with being a high cost saver; antidepressant count and all-cause inpatient and outpatient visits in the baseline period were inversely associated with cost savings.
Use of a pharmacogenetic assay was associated with cost-savings in the database of a large commercial insurer. Patients with bipolar disorder were more likely to be high cost savers than individuals with other mood and anxiety disorders.
Fear and environmental stressors may negatively affect the welfare of farm animals such as pigs. The present study investigated the effects of music and positive handling on reproductive performance of sows (n = 1014; parity 1 to 8) from a commercial pig farm practicing a batch farrowing system. Every 2 weeks, 56 sows were moved from the gestation unit to conventional-crated farrowing houses 1 week prior to expected farrowing. Treated (T; n = 299) and control (C; n = 715) sows were included in the study. In the farrowing houses, auditory enrichment (music from a radio) was provided to sows of T groups daily from 0600 to 1800 h until the end of lactation. Until the day of farrowing, T sows were additionally subjected, for 15 s per day per sow, to continuous back scratching by one member of farm staff. Litter performance and piglet mortality were recorded and analysed between T and C sows using linear mixed regression models. The number of liveborn piglets (C 13.85 v. T 13.26) and liveborn corrected for fostering (C 13.85 v. T 13.43) was significantly higher (P < 0.05) in C groups compared to the T groups. The number of stillborn piglets was 0.60 and 0.72 in T and C groups, respectively (P > 0.05). With regard to piglet mortality, a linear mixed regression model showed a significant overall effect of treatment in reducing piglet mortality (P < 0.01). Yet, the effect of treatment varied according to litter size (number of liveborn piglets) with a diminishing treatment effect in sows with a high litter size (P < 0.01). Pre-weaning survival was improved in the current study by the combined effect of daily back scratching of sows prior to farrowing and providing music to sows and piglets during lactation. Further research is needed to assess the separate effects of both interventions.
Mycoprotein is a food high in both dietary fibre and non-animal-derived protein. Global mycoprotein consumption is increasing, although its effect on human health has not yet been systematically reviewed. This study aims to systematically review the effects of mycoprotein on glycaemic control and energy intake in humans. A literature search of randomised controlled trials was performed in PubMed, Embase, Web of Science and Google Scholar, supplemented by hand searching. A total of twenty-one studies were identified, of which only five studies, totalling 122 participants, met the inclusion criteria. All five studies were acute studies, of which one reported outcomes on glycaemia and insulinaemia, two reported on energy intake and two reported on all of these outcomes. Data were extracted, and a risk-of-bias assessment was then conducted. The results did not show a clear effect of acute mycoprotein on blood glucose levels, but did show a decrease in insulin levels. Acute mycoprotein intake was also shown to decrease energy intake at an ad libitum meal and post-24 h in healthy lean, overweight and obese humans. In conclusion, the acute ingestion of mycoprotein reduces energy intake and insulinaemia, whereas its impact on glycaemia is currently unclear. However, evidence comes from a very limited number of heterogeneous studies. Further well-controlled studies are needed to elucidate the short- and long-term effects of mycoprotein intake on glycaemic control and energy intake, as well as the mechanisms underpinning these effects.
The prevalence of many diseases in pigs displays seasonal distributions. Despite growing concerns about the impacts of climate change, we do not yet have a good understanding of the role that weather factors play in explaining such seasonal patterns. In this study, national and county-level aggregated abattoir inspection data were assessed for England and Wales during 2010–2015. Seasonally-adjusted relationships were characterised between weekly ambient maximum temperature and the prevalence of both respiratory conditions and tail biting detected at slaughter. The prevalence of respiratory conditions showed cyclical annual patterns with peaks in the summer months and troughs in the winter months each year. However, there were no obvious associations with either high or low temperatures. The prevalence of tail biting generally increased as temperatures decreased, but associations were not supported by statistical evidence: across all counties there was a relative risk of 1.028 (95% CI 0.776–1.363) for every 1 °C fall in temperature. Whilst the seasonal patterns observed in this study are similar to those reported in previous studies, the lack of statistical evidence for an explicit association with ambient temperature may possibly be explained by the lack of information on date of disease onset. There is also the possibility that other time-varying factors not investigated here may be driving some of the seasonal patterns.
Studies suggest that alcohol consumption and alcohol use disorders have distinct genetic backgrounds.
We examined whether polygenic risk scores (PRS) for consumption and problem subscales of the Alcohol Use Disorders Identification Test (AUDIT-C, AUDIT-P) in the UK Biobank (UKB; N = 121 630) correlate with alcohol outcomes in four independent samples: an ascertained cohort, the Collaborative Study on the Genetics of Alcoholism (COGA; N = 6850), and population-based cohorts: Avon Longitudinal Study of Parents and Children (ALSPAC; N = 5911), Generation Scotland (GS; N = 17 461), and an independent subset of UKB (N = 245 947). Regression models and survival analyses tested whether the PRS were associated with the alcohol-related outcomes.
In COGA, AUDIT-P PRS was associated with alcohol dependence, AUD symptom count, and maximum drinks (R² = 0.47–0.68%, p = 2.0 × 10^-8 to 1.0 × 10^-10), and with increased likelihood of onset of alcohol dependence (hazard ratio = 1.15, p = 4.7 × 10^-8); AUDIT-C PRS was not an independent predictor of any phenotype. In ALSPAC, the AUDIT-C PRS was associated with alcohol dependence (R² = 0.96%, p = 4.8 × 10^-6). In GS, AUDIT-C PRS was a better predictor of weekly alcohol use (R² = 0.27%, p = 5.5 × 10^-11), while AUDIT-P PRS was more strongly associated with problem drinking (R² = 0.40%, p = 9.0 × 10^-7). Lastly, AUDIT-P PRS was associated with ICD-based alcohol-related disorders in the UKB subset (R² = 0.18%, p < 2.0 × 10^-16).
AUDIT-P PRS was associated with a range of alcohol-related phenotypes across population-based and ascertained cohorts, while AUDIT-C PRS showed less utility in the ascertained cohort. We show that AUDIT-P is genetically correlated with both use and misuse and demonstrate the influence of ascertainment schemes on PRS analyses.
In many materials development projects, scientists and research heads make decisions to guide the project direction. For example, scientists may decide which processing steps to use, what elements to include in their material selection, or from what suppliers to source their materials. Research heads may decide whether to invest development effort in reducing the environmental impact or production cost of a material. When making these decisions, it would be helpful to know how those decisions affect the achievable performance of the materials under consideration. Often, these decisions are complicated by trade-offs in performance between competing properties. This paper presents an approach for visualizing and evaluating design spaces, where a design space is defined as the set of possible materials under consideration given specified constraints. This design space visualization approach is applied to two case studies with environmental impact motivations: one in biodegradability for solvents, and the other in sustainable materials sourcing for Li-ion batteries. The results demonstrate how this visualization approach can enable data-driven, quantitative decisions for project direction.
The Supplemental Nutrition Assistance Program (SNAP) serves as the primary tool to alleviate food insecurity in the United States. Its effectiveness has been demonstrated in numerous studies, but the majority of SNAP recipients are still food insecure. One factor behind this is the difference in food prices across the country—SNAP benefits are not adjusted to reflect these differences. Using information from Feeding America's Map the Meal Gap (MMG) project, we compare the cost of a meal by county based on the Thrifty Food Plan (TFP)—which is used to set the maximum SNAP benefit—with the cost of the average meal for low-income food-secure households. We find that the cost of the latter meal is higher than the TFP meal for over 99 percent of the counties. We next consider the reduction in food insecurity if, by county, the maximum SNAP benefit level was set to the cost of the average meal for low-income food-secure households. We find that if this approach were implemented, there would be a decline of 50.9 percent in food insecurity among SNAP recipients at a cost of $23 billion.
Thermal infrared data collected by the Thermal Emission Spectrometer (TES) and Thermal Emission Imaging System (THEMIS) instruments have significantly impacted the understanding of martian surface mineralogy. Spatial/temporal variations in igneous lithologies; the discovery of quartz, carbonates, and chlorides; and the widespread identification of amorphous, silica-enriched materials reveal a planet that has experienced a diversity of primary and secondary geologic processes including igneous crustal evolution, regional sedimentation, aqueous alteration, and glacial/periglacial activity.
Describe common pathogens and antimicrobial resistance patterns for healthcare-associated infections (HAIs) that occurred during 2015–2017 and were reported to the Centers for Disease Control and Prevention’s (CDC’s) National Healthcare Safety Network (NHSN).
Data from central line-associated bloodstream infections (CLABSIs), catheter-associated urinary tract infections (CAUTIs), ventilator-associated events (VAEs), and surgical site infections (SSIs) were reported from acute-care hospitals, long-term acute-care hospitals, and inpatient rehabilitation facilities. This analysis included device-associated HAIs reported from adult location types, and SSIs among patients ≥18 years old. Percentages of pathogens with nonsusceptibility (%NS) to selected antimicrobials were calculated for each HAI type, location type, surgical category, and surgical wound closure technique.
Overall, 5,626 facilities performed adult HAI surveillance during this period, most of which were general acute-care hospitals with <200 beds. Escherichia coli (18%), Staphylococcus aureus (12%), and Klebsiella spp (9%) were the 3 most frequently reported pathogens. Pathogens varied by HAI and location type, with oncology units having a distinct pathogen distribution compared to other settings. The %NS for most pathogens was significantly higher among device-associated HAIs than SSIs. In addition, pathogens from long-term acute-care hospitals had a significantly higher %NS than those from general hospital wards.
This report provides an updated national summary of pathogen distributions and antimicrobial resistance among select HAIs and pathogens, stratified by several factors. These data underscore the importance of tracking antimicrobial resistance, particularly in vulnerable populations such as long-term acute-care hospitals and intensive care units.