It is difficult to separate an age-dependent fall in nitrogen use efficiency (NUE; N balance/N intake) in growing ruminants from a progressive decrease in animal protein requirements over time. This study examined the effect of dietary protein content on N partitioning, digestibility and N isotopic discrimination between the animal and its diet (Δ15Nanimal-diet) evaluated at two different fattening periods (early v. late). Twenty-four male Romane lambs (age: 19 ± 4.0 days; BW: 8.3 ± 1.39 kg) were equally allocated to three dietary CP treatments (15%, 17% and 20% CP on a DM basis). Lambs were reared with their mothers until weaning, and thereafter housed in individual pens until slaughter (45 kg BW). During the post-weaning period, lambs were allocated twice (early fattening (30 days post-weaning) and late fattening (60 days post-weaning)) to metabolic cages for digestibility and N balance studies. When diet CP content increased, the average daily gain of lambs increased (P < 0.05) while the age at slaughter decreased (P = 0.01), but no effect was observed on feed efficiency (P > 0.10). Diet CP content had a limited effect on lamb carcass traits. Higher fibre digestibility was observed at the early v. late fattening period (P < 0.001). The N intake and the urinary N excretion increased when diet CP content increased (P < 0.001) and when shifting from the early to the late fattening period (P < 0.001). Faecal N excretion (P = 0.14) and N balance (P > 0.10) were not affected by diet CP content. Nitrogen digestibility increased (P < 0.001) as the diet CP content increased and on average was greater at the late v. early fattening period (P = 0.02). The NUE decreased (P = 0.001) as the diet CP content increased and as the lambs became older (P < 0.001). However, the age-dependent fall in NUE was lower at high v. low dietary CP content (CP × age interaction; P = 0.04).
The Δ15Nanimal-diet was positively correlated (P < 0.05) with N intake (r = 0.59), excretion of faecal N (r = 0.41), urinary N (r = 0.69) and total manure N (r = 0.64), and negatively correlated with NUE (r = −0.57). Overall, the experiment showed that NUE was lower in older lambs and in lambs fed a high diet CP content, and that Δ15Nanimal-diet was a useful indicator not only of NUE but also of urinary N excretion, a major environmental pollution factor on farms.
Enteric illness outbreaks are complex events; consequently, outbreak investigators use many different hypothesis generation methods depending on the situation. This scoping review was conducted to describe methods used to generate a hypothesis during enteric illness outbreak investigations. The search included five databases and grey literature for articles published between 1 January 2000 and 2 May 2015. Relevance screening and article characterisation were conducted by two independent reviewers using pretested forms. There were 903 outbreaks that described hypothesis generation methods and 33 papers that focused on the evaluation of hypothesis generation methods. The most common hypothesis generation methods described were analytic studies (64.8%), descriptive epidemiology (33.7%), food or environmental sampling (32.8%) and facility inspections (27.9%). The least common methods included the use of a single interviewer (0.4%) and investigation of outliers (0.4%). Most studies reported using two or more methods to generate hypotheses (81.2%), with 29.2% of studies reporting using four or more. The use of multiple different hypothesis generation methods both within and between outbreaks highlights the complexity of enteric illness outbreak investigations. Future research should examine the effectiveness of each method and the contexts in which each is most effective in efficiently leading to source identification.
We describe the case of an 11-month-old girl with a rare cerebellar glioblastoma driven by a NACC2-NTRK2 (Nucleus Accumbens Associated Protein 2-Neurotrophic Receptor Tyrosine Kinase 2) fusion. Initial workup of our case demonstrated homozygous CDKN2A deletion, but immunohistochemistry for other driver mutations, including IDH1 R132H, BRAF V600E and H3F3A K27M, was negative, and ATRX was retained. Tissue was subsequently submitted for personalized oncogenomic analysis, including whole genome and whole transcriptome sequencing, which demonstrated an activating NTRK2 fusion, as well as high PD-L1 expression, which was subsequently confirmed by immunohistochemistry. Furthermore, H3 and IDH demonstrated wildtype status. These findings suggested the possibility of treatment with either NTRK or immune checkpoint inhibitors through active clinical trials. Ultimately, the family pursued standard treatment that involved Head Start III chemotherapy and proton radiotherapy. Notably, at the most recent follow-up, approximately two years from initial diagnosis, the patient is in disease remission and thriving, suggesting favorable biology despite histologic malignancy. This case illustrates the value of personalized oncogenomics, as the molecular profiling revealed two actionable changes that would not have been apparent through routine diagnostics. NTRK fusions are known oncogenic drivers in a range of cancer types, but this is the first report of a NACC2-NTRK2 fusion in a glioblastoma.
This presentation will enable the learner to:
1. Explore the current molecular landscape of pediatric high grade gliomas
2. Recognize the value of personalized oncogenomic analysis, particularly in rare and/or aggressive tumors
3. Discuss the current status of NTRK inhibitor clinical trials
Copy number variations (CNVs), as an important source of genetic variation, can affect a wide range of phenotypes by diverse mechanisms. The somatostatin receptor 2 (SSTR2) gene plays important roles in cell proliferation and apoptosis. Recently, this gene was mapped to a CNV region, which encompasses quantitative trait loci of cattle economic traits including body weight, marbling score, etc. Therefore, SSTR2 CNV may exhibit phenotypic effects on cattle growth traits. In the current study, distribution of SSTR2 gene CNVs was investigated in six Chinese cattle breeds (XN, QC, NY, JA, LX and PN), and the results showed higher CNV polymorphisms in XN, QC and NY cattle. Next, association analysis between growth traits and SSTR2 CNV was performed for XN, QC and NY cattle. In NY, individuals with fewer copies showed better performance than those with more copies. Further, the effects of SSTR2 CNV on the SSTR2 mRNA level were also investigated, but revealed no significant correlation in either muscle or adipose tissue of adult NY cattle. The results suggested the potential for use of SSTR2 CNV as a marker for the molecular breeding of NY cattle.
The Pain Catastrophizing Scale (PCS) measures three aspects of catastrophic cognitions about pain—rumination, magnification, and helplessness. To facilitate assessment and clinical application, we aimed to (a) develop a short version on the basis of its factorial structure and the items’ correlations with key pain-related outcomes, and (b) identify the threshold on the short form indicative of risk for depression.
Social centers for older people.
664 Chinese older adults with chronic pain.
Besides the PCS, pain intensity, pain disability, and depressive symptoms were assessed.
For the full scale, confirmatory factor analysis showed that the hypothesized 3-factor model fit the data moderately well. On the basis of the factor loadings, two items were selected from each of the three dimensions. An additional item significantly associated with pain disability and depressive symptoms, over and above these six items, was identified through regression analyses. A short-PCS composed of seven items was formed, which correlated at r=0.97 with the full scale. Subsequently, receiver operating characteristic (ROC) curves were plotted against clinically significant depressive symptoms, defined as a score of ≥12 on a 10-item version of the Center for Epidemiologic Studies-Depression Scale. This analysis showed a score of ≥7 to be the optimal cutoff for the short-PCS, with sensitivity = 81.6% and specificity = 78.3% when predicting clinically significant depressive symptoms.
The short-PCS may be used in lieu of the full scale and as a brief screen to identify individuals with serious catastrophizing.
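The cutoff-selection step (an ROC curve scanned for the best trade-off between sensitivity and specificity) can be sketched in a few lines. The scores and depression labels below are toy values constructed so that ≥7 comes out optimal; the selection rule shown is Youden's J, a common choice, though the paper may have used a different criterion.

```python
def roc_best_cutoff(scores, labels):
    """Scan every candidate cutoff c (classify 'at risk' when score >= c)
    and return the one maximising Youden's J = sensitivity + specificity - 1."""
    best = None
    for c in sorted(set(scores)):
        tp = sum(1 for s, y in zip(scores, labels) if s >= c and y == 1)
        fn = sum(1 for s, y in zip(scores, labels) if s < c and y == 1)
        tn = sum(1 for s, y in zip(scores, labels) if s < c and y == 0)
        fp = sum(1 for s, y in zip(scores, labels) if s >= c and y == 0)
        sens = tp / (tp + fn)
        spec = tn / (tn + fp)
        j = sens + spec - 1
        if best is None or j > best[0]:
            best = (j, c, sens, spec)
    return best  # (J, cutoff, sensitivity, specificity)

# Toy data: short-PCS scores with depression caseness (1 = CES-D >= 12).
scores = [2, 3, 5, 6, 7, 8, 9, 10, 11, 12, 4, 6]
labels = [0, 0, 0, 0, 1, 1, 1, 1, 1, 1, 0, 0]
j, cutoff, sens, spec = roc_best_cutoff(scores, labels)
```

In practice each candidate cutoff yields one (sensitivity, 1 − specificity) point on the ROC curve, and the reported 81.6%/78.3% pair corresponds to the point chosen at ≥7.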
Rabies is one of the major public health problems in China, and the mortality rate of rabies remains the highest among all notifiable infectious diseases. A meta-analysis was conducted to investigate the post-exposure prophylaxis (PEP) vaccination rate and risk factors for human rabies in mainland China. The PubMed, Web of Science, Chinese National Knowledge Infrastructure, Chinese Science and Technology Periodical and Wanfang databases were searched for articles on rabies vaccination status (published between 2007 and 2017). In total, 10 174 human rabies cases from 136 studies were included in this meta-analysis. Approximately 97.2% (95% confidence interval (CI) 95.1–98.7%) of rabies cases occurred in rural areas and 72.6% (95% CI 70.0–75.1%) occurred in farmers. Overall, the vaccination rate in the reported human rabies cases was 15.4% (95% CI 13.7–17.4%). However, among vaccinated individuals, 85.5% (95% CI 79.8%–83.4%) did not complete the vaccination regimen. In a subgroup analysis, the PEP vaccination rate in the eastern region (18.8%, 95% CI 15.9–22.1%) was higher than that in the western region (13.3%, 95% CI 11.1–15.8%) and this rate decreased after 2007. Approximately 68.9% (95% CI 63.6–73.8%) of rabies cases experienced category-III exposures, but their PEP vaccination rate was 27.0% (95% CI 14.4–44.9%) and only 6.1% (95% CI 4.4–8.4%) received rabies immunoglobulin. Together, these results suggested that the PEP vaccination rate among human rabies cases was low in mainland China. Therefore, standardised treatment and vaccination programs for dog bites need to be further strengthened, particularly in rural areas.
The response of soil microbial communities to soil quality changes is a sensitive indicator of soil ecosystem health. The current work investigated soil microbial communities under different fertilization treatments in a 31-year experiment using the phospholipid fatty acid (PLFA) profile method. The experiment consisted of five fertilization treatments: without fertilizer input (CK), chemical fertilizer alone (MF), rice (Oryza sativa L.) straw residue and chemical fertilizer (RF), low manure rate and chemical fertilizer (LOM), and high manure rate and chemical fertilizer (HOM). Soil samples were collected from the plough layer and results indicated that the content of PLFAs was increased in all fertilization treatments compared with the control. The iC15:0 fatty acids increased significantly in the MF treatment but decreased in RF, LOM and HOM, while aC15:0 fatty acids increased in these three treatments. Principal component (PC) analysis was conducted to determine factors defining soil microbial community structure using the 21 PLFAs detected in all treatments: the first and second PCs explained 89.8% of the total variance. All unsaturated and cyclopropyl PLFAs except C12:0 and C15:0 were highly weighted on the first PC. The first and second PCs also explained 87.1% of the total variance among all fertilization treatments. There was no difference in the first and second PCs between the RF and HOM treatments. The results indicated that long-term combined application of straw residue or organic manure with chemical fertilizer improved soil microbial community structure more than the mineral fertilizer treatment in double-cropped paddy fields in Southern China.
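The ordination step described above is a standard principal component analysis of the PLFA profiles. A minimal sketch via eigendecomposition of the covariance matrix, on synthetic data that only mimics the study's dimensions (21 PLFAs across treatment replicates), might look like:

```python
import numpy as np

# Rows = treatment replicates (CK, MF, RF, LOM, HOM), columns = individual
# PLFA markers. Values here are synthetic placeholders, not the study's data.
rng = np.random.default_rng(0)
X = rng.normal(size=(15, 21))          # 15 samples x 21 PLFAs

Xc = X - X.mean(axis=0)                # centre each PLFA column
cov = np.cov(Xc, rowvar=False)         # 21 x 21 covariance matrix
eigvals, eigvecs = np.linalg.eigh(cov) # eigh returns ascending eigenvalues
order = np.argsort(eigvals)[::-1]      # sort components by variance explained
eigvals, eigvecs = eigvals[order], eigvecs[:, order]

explained = eigvals / eigvals.sum()    # proportion of variance per PC
scores = Xc @ eigvecs[:, :2]           # sample scores on PC1 and PC2
loadings = eigvecs[:, :2]              # PLFA weights ("highly weighted" markers)
```

The "89.8% of total variance" figure in the abstract corresponds to `explained[0] + explained[1]` computed on the real PLFA matrix, and the "highly weighted" fatty acids are those with the largest absolute loadings on PC1.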
Recent studies indicate that the early postnatal period is a critical window for gut microbiota manipulation to optimise immunity and body growth. This study investigated the effects of maternal faecal microbiota orally administered to neonatal piglets after birth on growth performance, selected microbial populations, intestinal permeability and the development of the intestinal mucosal immune system. In total, 12 litters of crossbred newborn piglets were selected in this study. Litter size was standardised to 10 piglets. On day 1, 10 piglets in each litter were randomly allotted to the faecal microbiota transplantation (FMT) and control groups. Piglets in the FMT group were orally administered 2 ml of faecal suspension of their nursing sow per day from the age of 1 to 3 days; piglets in the control group were treated with the same dose of a placebo (0.1 M potassium phosphate buffer containing 10% glycerol (vol/vol)) inoculant. The experiment lasted 21 days. On days 7, 14 and 21, plasma and faecal samples were collected for the analysis of growth-related hormones and cytokines in plasma and lipocalin-2, secretory immunoglobulin A (sIgA), selected microbiota and short-chain fatty acids (SCFAs) in faeces. Faecal microbiota transplantation increased the average daily gain of piglets during week 3 and the whole experiment period. Compared with the control group, the FMT group had increased concentrations of plasma growth hormone and IGF-1 on days 14 and 21. Faecal microbiota transplantation also reduced the incidence of diarrhoea during weeks 1 and 3 and plasma concentrations of zonulin, endotoxin and diamine oxidase activities in piglets on days 7 and 14. The populations of Lactobacillus spp. and Faecalibacterium prausnitzii and the concentrations of faecal and plasma acetate, butyrate and total SCFAs in the FMT group were higher than those in the control group on day 21.
Moreover, the FMT piglets had higher concentrations of plasma transforming growth factor-β and immunoglobulin G, and faecal sIgA, than the control piglets on day 21. These findings indicate that early intervention with maternal faecal microbiota improves growth performance, decreases intestinal permeability, stimulates sIgA secretion, and modulates gut microbiota composition and metabolism in suckling piglets.
Our objective was to identify predictors of severe acute respiratory infection in hospitalised patients and understand the impact of vaccination and neuraminidase inhibitor administration on severe influenza. We analysed data from a study evaluating influenza vaccine effectiveness in two Michigan hospitals during the 2014–2015 and 2015–2016 influenza seasons. Adults admitted to the hospital with an acute respiratory infection were eligible. Through patient interview and medical record review, we evaluated potential risk factors for severe disease, defined as ICU admission, 30-day readmission, and hospital length of stay (LOS). Two hundred sixteen of 1119 participants had PCR-confirmed influenza. Frailty score, Charlson score and tertile of prior-year healthcare visits were associated with LOS. Charlson score >2 (OR 1.5 (1.0–2.3)) was associated with ICU admission. Highest tertile of prior-year visits (OR 0.3 (0.2–0.7)) was associated with decreased ICU admission. Increasing tertile of visits (OR 1.5 (1.2–1.8)) was associated with 30-day readmission. Frailty and prior-year healthcare visits were associated with 30-day readmission among influenza-positive participants. Neuraminidase inhibitors were associated with decreased LOS among vaccinated participants with influenza A (HR 1.6 (1.0–2.4)). Overall, frailty and lack of prior-year healthcare visits were predictors of disease severity. Neuraminidase inhibitors were associated with reduced severity among vaccine recipients.
Dengue is the fastest spreading mosquito-transmitted disease in the world. In China, Guangzhou City is believed to be the most important epicenter of dengue outbreaks, although the transmission patterns are still poorly understood. We developed an autoregressive integrated moving average model incorporating external regressors to examine the association between the monthly number of locally acquired dengue infections and imported cases, mosquito densities, temperature and precipitation in Guangzhou. In multivariate analysis, imported cases and minimum temperature (both at lag 0) were both associated with the number of locally acquired infections (P < 0.05). This multivariate model performed best, featuring the lowest fitting root mean squared error (RMSE) (0.7520), AIC (393.7854) and test RMSE (0.6445), as well as the best performance in validation against observed outbreaks, with a sensitivity of 1.0000, a specificity of 0.7368 and a consistency rate of 0.7917. Our findings suggest that imported cases and minimum temperature are two key determinants of local dengue transmission in Guangzhou. The modelling method can be used to predict dengue transmission in non-endemic countries and to inform dengue prevention and control strategies.
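As a rough illustration of the modelling idea (not the authors' exact ARIMA specification), the sketch below fits an autoregressive term plus two external regressors by ordinary least squares on synthetic monthly series; `imported` and `tmin` are made-up stand-ins for the study's covariates, and a full ARIMA fit would normally come from a time-series package.

```python
import numpy as np

# Synthetic monthly series: local cases depend on last month's local count
# (the autoregressive term), this month's imported cases and minimum
# temperature (the external regressors), plus noise.
rng = np.random.default_rng(1)
n = 60
imported = rng.poisson(3, n).astype(float)
tmin = 15 + 10 * np.sin(np.arange(n) * 2 * np.pi / 12)   # seasonal temperature
local = np.zeros(n)
for t in range(1, n):
    local[t] = 0.5 * local[t - 1] + 0.8 * imported[t] + 0.1 * tmin[t] + rng.normal()

# Design matrix: intercept, lagged local cases, imported cases, min temperature.
X = np.column_stack([np.ones(n - 1), local[:-1], imported[1:], tmin[1:]])
y = local[1:]
beta, *_ = np.linalg.lstsq(X, y, rcond=None)              # least-squares fit

fitted = X @ beta
rmse = float(np.sqrt(np.mean((y - fitted) ** 2)))         # fitting RMSE
```

The reported fitting and test RMSEs in the abstract are the analogues of `rmse` computed on the training and held-out portions of the real Guangzhou series.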
Introduction: Survival from cardiac arrest has been linked to the quality of resuscitation care. Unfortunately, healthcare providers frequently underperform in these critical scenarios, with a well-documented deterioration in skills weeks to months following advanced life support courses. Improving initial training and preventing decay in knowledge and skills are a priority in resuscitation education. The spacing effect has repeatedly been shown to have an impact on learning and retention. Despite its potential advantages, the spacing effect has seldom been applied to organized education training or complex motor skill learning where it has the potential to make a significant impact. The purpose of this study was to determine if a resuscitation course taught in a spaced format compared to the usual massed instruction results in improved retention of procedural skills. Methods: EMS providers (Paramedics and Emergency Medical Technicians (EMT)) were block randomized to receive a Pediatric Advanced Life Support (PALS) course in either a spaced format (four 210-minute weekly sessions) or a massed format (two sequential 7-hour days). Blinded observers used expert-developed 4-point global rating scales to assess video recordings of each learner performing various resuscitation skills before, after and 3-months following course completion. Primary outcomes were performance on infant bag-valve-mask ventilation (BVMV), intraosseous (IO) insertion, infant intubation, infant and adult chest compressions. Results: Forty-eight of 50 participants completed the study protocol (26 spaced and 22 massed). There was no significant difference between the two groups on testing before and immediately after the course. 
Three months after course completion, participants in the spaced cohort scored higher overall for BVMV (2.2 ± 0.13 versus 1.8 ± 0.14, p = 0.012), with no statistically significant differences in scores for IO insertion (3.0 ± 0.13 versus 2.7 ± 0.13, p = 0.052), intubation (2.7 ± 0.13 versus 2.5 ± 0.14, p = 0.249), infant compressions (2.5 ± 0.28 versus 2.5 ± 0.31, p = 0.831) or adult compressions (2.3 ± 0.24 versus 2.2 ± 0.26, p = 0.728). Conclusion: Procedural skills taught in a spaced format result in learning at least as good as the traditional massed format; more complex skills taught in a spaced format may show better long-term retention than traditional massed training, as there was a clear difference in BVMV and a trend toward a difference in IO insertion.
Chilling injury is an important natural stress that can threaten cotton production, especially at the sowing and seedling stages in early spring. It is therefore important for cotton production to improve chilling tolerance at these stages. The current work examines the potential for glycine betaine (GB) treatment of seeds to increase the chilling tolerance of cotton at the seedling stage. Germination under cold stress was increased significantly by GB treatment. Under low temperature, the leaves of seedlings from treated seeds exhibited a higher net photosynthetic rate (PN), higher antioxidant enzyme activity including superoxide dismutase, ascorbate peroxidase and catalase, lower hydrogen peroxide (H2O2) content and less damage to the cell membrane. Enzyme activity was correlated negatively with H2O2 content and degree of damage to the cell membrane but correlated positively with GB content. The experimental results suggested that although GB was only used to treat cotton seed, the beneficial effect caused by the preliminary treatment of GB could play a significant role during germination that persisted to at least the four-leaf seedling stage. Therefore, it is crucial that this method is employed in agricultural production to improve chilling resistance in the seedling stage by soaking the seeds in GB.
Achieving adequate fertility is essential in any dairy unit, but is compromised by genetic selection for increased yield. Selection has altered the somatotrophic axis and resulted in cows which mobilise more body tissue for milk production in early lactation, thus prolonging both the depth and duration of the post partum negative energy balance. Poor energy status is reflected in altered metabolic parameters including raised urea and decreased insulin-like growth factor-I (IGF-I) and insulin concentrations, which adversely affect ovarian cyclicity and early embryo survival. Attempts to optimise the diet in terms of energy and protein content have generally been aimed at increasing milk production further rather than improving fertility. Advances in biosensor technology now provide us with the opportunity to monitor production, fertility and health parameters of each cow. Integration of this information should improve the timing for inseminations and could assist in selecting diets more suited to the needs of the individual cow. Genetic selection may in future be used to produce cows optimised for a particular type of management system. In both cases we need a greater understanding of the rules governing nutrient partitioning at different stages of the cows' life cycle to ensure that diets selected are cost effective and achieve an appropriate balance in promoting production, reproduction and health.
Central line-associated bloodstream infections (CLABSIs) in intensive care units (ICUs) result in poor clinical outcomes and increased costs. Although frequently regarded as preventable, infection risk may be influenced by non-modifiable factors. The objectives of this study were to evaluate organisational factors associated with CLABSI in Victorian ICUs to determine the nature and relative contribution of modifiable and non-modifiable risk factors. Data captured by the Australian and New Zealand Intensive Care Society regarding ICU-admitted patients and resources were linked to CLABSI surveillance data collated by the Victorian Healthcare Associated Infection Surveillance System between 1 January 2010 and 31 December 2013. Accepted CLABSI surveillance methods were applied and hospital/patient characteristics were classified as ‘modifiable’ and ‘non-modifiable’, enabling longitudinal Poisson regression modelling of CLABSI risk. In total, 26 ICUs were studied. Annual CLABSI rates were 1·72, 1·37, 1·00 and 0·93/1000 CVC days for 2010–2013. Of non-modifiable factors, the number of non-invasively ventilated patients standardised to total ICU bed days was found to be independently associated with infection (RR 1·07; 95% CI 1·01–1·13; P = 0·030). Modelling of modifiable risk factors demonstrated the existence of a policy for mandatory ultrasound guidance for central venous catheter (CVC) localisation (RR 0·51; 95% CI 0·37–0·70; P < 0·001) and increased number of sessional specialist full-time equivalents (RR 0·52; 95% CI 0·29–0·93; P = 0·027) to be independently associated with protection against infection. Modifiable factors associated with reduced CLABSI risk include ultrasound guidance for CVC localisation and increased availability of sessional medical specialists.
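The rate-based modelling step above can be illustrated with a minimal Poisson regression with a log link and an exposure offset (log CVC-days). This is a generic sketch, not the authors' model: the data, the mandatory-ultrasound indicator and the effect size below are all synthetic, and the fit uses a hand-rolled Newton-Raphson rather than a statistics package.

```python
import numpy as np

def fit_poisson(X, y, offset, n_iter=25):
    """Newton-Raphson fit of a Poisson GLM with log link and offset."""
    beta = np.zeros(X.shape[1])
    for _ in range(n_iter):
        mu = np.exp(X @ beta + offset)          # expected counts
        grad = X.T @ (y - mu)                   # score vector
        hess = X.T @ (X * mu[:, None])          # Fisher information
        beta = beta + np.linalg.solve(hess, grad)
    return beta

rng = np.random.default_rng(2)
n = 200
ultrasound = rng.integers(0, 2, n).astype(float)  # 1 = mandatory-ultrasound policy
cvc_days = rng.uniform(500, 5000, n)              # exposure per ICU-period
true_rr = 0.5                                     # assumed protective effect
rate = 1.5 / 1000 * true_rr ** ultrasound         # infections per CVC-day
y = rng.poisson(rate * cvc_days).astype(float)

X = np.column_stack([np.ones(n), ultrasound])
beta = fit_poisson(X, y, offset=np.log(cvc_days))
rr = float(np.exp(beta[1]))                       # estimated rate ratio
```

Exponentiating a coefficient gives a rate ratio (RR) per unit of the covariate, which is how figures such as RR 0·51 for the ultrasound-guidance policy arise in the study.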
The default mode network (DMN) is vulnerable to the effects of APOE genotype. Given the reduced brain volumes and APOE ε4-related brain changes in elderly carriers, it remains unclear whether, and to what extent, these changes influence functional connectivity. This study aimed to examine the functional connectivity within the DMN, and its diagnostic value once age-related morphometric alterations were considered.
Whole-brain and seed-based resting-state functional connectivity (RSFC) analyses were conducted in cognitively normal APOE ε4 carriers and matched non-carriers (N=38). The absolute values of mean correlation coefficients (z-values) were used as a measure of functional connectivity strength (FCS) between DMN subregions, and also to estimate their diagnostic value by receiver-operating characteristic (ROC) curves.
APOE ε4 carriers demonstrated decreased interhemispheric FCS, particularly between the right hippocampal formation (R.HF) and the left inferior parietal lobule (L.IPL) (t=3.487, p<0.001). ROC analysis showed that the FCS between R.HF and L.IPL could differentiate APOE ε4 carriers from non-carriers (AUC=0.734, p=0.025). Moreover, after adjusting for the impact of morphometry, the discriminative value of the R.HF–L.IPL FCS was markedly improved (AUC=0.828, p=0.002).
Our findings suggest that the APOE ε4 allele affects functional connectivity within the posterior DMN, particularly the atrophy-corrected interhemispheric FCS, before the clinical expression of neurodegenerative disease.
The objective of the present study was to investigate live weight (LW) gain, urinary nitrogen (UN) excretion and urination behaviour of dairy heifers grazing pasture, chicory and plantain in autumn and spring. The study comprised a 35-day autumn trial (with a 7-day acclimation period) and a 28-day spring trial (with a 7-day acclimation period). For each trial, 56 Friesian × Jersey heifers were blocked into five dietary treatments balanced for their LW and breeding worth (i.e. genetic merit of a cow for production and reproduction): 1·00 perennial ryegrass–white clover pasture (PA); 1·00 chicory (CH); 1·00 plantain (PL); 0·50 pasture + 0·50 chicory (PA + CH); and 0·50 pasture + 0·50 plantain (PA + PL). A fresh allocation of the herbage was offered every 3 days with allowance calculated according to feed requirement for maintenance plus gain of 1·0 kg LW/day. In both trials, LW gain was lower on CH than on other treatments. In the spring trial, UN concentration and UN excretion were lower in CH and PL than in other treatments. In autumn, a higher urination frequency was observed over the first 6 h after forage allocation in CH and PA + CH than in other treatments. Data from the present study indicate that feeding CH alone limited heifer LW gain. However, heifers grazing swards containing chicory (CH and PA + CH) and plantain (PL and PA + PL) had the potential to lower nitrous oxide emissions and nitrate leaching from soil compared with heifers grazing PA, by reducing N loading in urine patches.
There has been widespread use of plane wave theory in industrial muffler design. However, this leads to an underestimation of acoustical performance at higher frequencies. To overcome this drawback, finite element and boundary element methods have been developed; nevertheless, the time these methods require to calculate the noise level is unacceptable. Moreover, considering acoustical performance and the space constraints common in industry, a compact design of reverse-flow mufflers that may improve acoustical efficiency is proposed.
In this paper, a numerical assessment of rectangular mufflers hybridized with straight/reverse chambers, using eigen functions, a four-pole matrix and a genetic algorithm under a space constraint, is developed. Before the optimization was performed, an accuracy check of the mathematical models for the muffler was carried out. Results reveal that the noise reduction increases as the number of chambers increases. In addition, the acoustical performance of the mufflers is inversely proportional to the diameter of the inlet/outlet tubes. Also, the TL of the mufflers improves when more target tones are used in the objective function. Consequently, a successful approach to searching for optimally shaped rectangular straight/reverse mufflers using an eigen function and a genetic algorithm within a constrained space has been demonstrated.
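A toy version of the genetic-algorithm loop can illustrate the optimisation step. The surrogate fitness below (chamber volume under an assumed 1.0 m axial space constraint) merely stands in for the four-pole transmission-loss objective at target tones; all dimensions and GA settings are invented for the sketch.

```python
import random

random.seed(0)
SPACE = 1.0                     # available axial length (m), assumed

def fitness(genes):
    l1, l2, d = genes           # two chamber lengths and a diameter
    if l1 + l2 > SPACE:         # infeasible: violates the space constraint
        return -1.0
    return (l1 + l2) * d ** 2   # surrogate for attenuation volume

def evolve(pop_size=30, gens=40):
    # Random initial population of candidate muffler shapes.
    pop = [[random.uniform(0.1, 0.6), random.uniform(0.1, 0.6),
            random.uniform(0.05, 0.3)] for _ in range(pop_size)]
    for _ in range(gens):
        pop.sort(key=fitness, reverse=True)
        elite = pop[: pop_size // 2]            # keep the fitter half
        children = []
        for _ in range(pop_size - len(elite)):
            a, b = random.sample(elite, 2)
            child = [(x + y) / 2 for x, y in zip(a, b)]   # crossover
            i = random.randrange(3)
            child[i] += random.gauss(0, 0.02)             # mutation
            children.append(child)
        pop = elite + children
    return max(pop, key=fitness)

best = evolve()
```

In the paper's setting the fitness evaluation would instead assemble the four-pole matrices of each chamber and score the resulting transmission loss at the target tones; the evolutionary loop itself is unchanged.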
This paper investigates the rupture problem of a thin micropolar liquid film under a magnetic field on a horizontal plate, using the long-wave perturbation method to derive nonlinear evolution equations with a free film interface. The governing equation is solved using a finite difference method as part of an initial value problem with spatially periodic boundary conditions. The effect of a micropolar liquid under a magnetic field on the nonlinear rupture mechanism is studied in terms of the micropolar parameter, R, the Hartmann constant, m, and the initial disturbance amplitude, H0. Modeling results indicate that the R, m and H0 parameters strongly affect the film flow. Enhancing the micropolar and magnetic effects is found to delay the rupture time. In addition, the results show that the film rupture time increases as the initial disturbance magnitude decreases. The micropolar and magnetic parameters indeed play a significant role in the film flow on a horizontal plate. Moreover, optimum conditions can be found to alter the stability of the film flow by controlling the applied magnetic field.
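The numerical approach can be illustrated with a stripped-down sketch: an explicit finite-difference march, with periodic boundary conditions, of a generic long-wave (lubrication-type) film equation h_t = −(h³h_xxx)_x. The micropolar (R) and magnetic (m) terms of the paper's full model are omitted here, and the grid, time step and disturbance amplitude are illustrative choices only.

```python
import numpy as np

# Uniform film of thickness 1 with a small initial disturbance (H0 = 0.05).
N, L = 64, 2 * np.pi
dx = L / N
x = np.arange(N) * dx
h = 1.0 + 0.05 * np.cos(x)

def d_dx(f):
    """Centred first derivative on a periodic grid."""
    return (np.roll(f, -1) - np.roll(f, 1)) / (2 * dx)

dt = 1e-5                              # small step: explicit 4th-order scheme
for _ in range(2000):
    hxxx = d_dx(d_dx(d_dx(h)))         # third spatial derivative
    flux = h ** 3 * hxxx               # lubrication flux
    h = h - dt * d_dx(flux)            # h_t = -(h^3 h_xxx)_x, conservative form

mass = h.mean()                        # film volume per length is conserved
```

Writing the update in conservative (flux-difference) form keeps the film volume constant to machine precision, which is a useful sanity check before adding the micropolar and magnetic terms that drive the rupture dynamics.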
To study the association between gastrointestinal colonization of carbapenemase-producing Enterobacteriaceae (CPE) and proton pump inhibitors (PPIs).
We analyzed 31,526 patients with prospective collection of fecal specimens for CPE screening: upon admission (targeted screening) and during hospitalization (opportunistic screening, safety net screening, and extensive contact tracing), in our healthcare network with 3,200 beds from July 1, 2011, through December 31, 2015. Specimens were collected at least once weekly during hospitalization for CPE carriers and subjected to broth enrichment culture and multiplex polymerase chain reaction.
Of 66,672 fecal specimens collected, 345 specimens (0.5%) from 100 patients (0.3%) had CPE. The number and prevalence (per 100,000 patient-days) of CPE increased from 2 (0.3) in 2012 to 63 (8.0) in 2015 (P<.001). Male sex (odds ratio, 1.91 [95% CI, 1.15–3.18], P=.013), presence of wound or drain (3.12 [1.70–5.71], P<.001), and use of cephalosporins (3.06 [1.42–6.59], P=.004), carbapenems (2.21 [1.10–4.48], P=.027), and PPIs (2.84 [1.72–4.71], P<.001) in the preceding 6 months were significant risk factors by multivariable analysis. Of 79 patients with serial fecal specimens, spontaneous clearance of CPE was noted in 57 (72.2%), with a median (range) of 30 (3–411) days. Comparing patients without use of antibiotics and PPIs, consumption of both antibiotics and PPIs after CPE identification was associated with later clearance of CPE (hazard ratio, 0.35 [95% CI, 0.17–0.73], P=.005).
Concomitant use of antibiotics and PPIs prolonged duration of gastrointestinal colonization by CPE.
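Time-to-clearance summaries like the median of 30 days above, with some carriers censored before clearing, are the kind of quantity a Kaplan-Meier estimator produces. A self-contained sketch on synthetic times (not the study's data) follows:

```python
def kaplan_meier(times, events):
    """Return [(t, S(t))] at each event time; event=0 marks censoring."""
    data = sorted(zip(times, events))
    n_at_risk = len(data)
    s, curve = 1.0, []
    i = 0
    while i < len(data):
        t = data[i][0]
        d = sum(1 for tt, e in data if tt == t and e == 1)  # clearances at t
        c = sum(1 for tt, e in data if tt == t)             # all leaving risk set
        if d > 0:
            s *= 1 - d / n_at_risk                          # KM product step
            curve.append((t, s))
        n_at_risk -= c
        i += c
    return curve

# Synthetic days to CPE clearance; event=0 means still colonised at last test.
times  = [3, 10, 30, 30, 45, 60, 90, 120, 200, 411]
events = [1, 1, 1, 1, 0, 1, 1, 0, 1, 1]
curve = kaplan_meier(times, events)
median = next(t for t, s in curve if s <= 0.5)  # first time S(t) drops to 0.5
```

A hazard ratio such as the reported 0.35 for antibiotic-plus-PPI use would then come from comparing curves like this between exposure groups, typically with a Cox model.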