The commercial Computational Fluid Dynamics (CFD) software STAR-CCM+ was used to simulate the flow and breakup characteristics of a Liquid Jet Injected into a gaseous Crossflow (LJIC) under real engine operating conditions. A reasonable computational domain geometry and the flow boundary conditions were obtained from a civil aviation engine performance model similar to the Leap-1B engine, developed using the GasTurb software, together with the preliminary design results of its low-emission combustor. The Volume of Fluid (VOF) model was applied to simulate the near-field breakup features of the LJIC. The numerical method was validated and calibrated through comparison with published test data at atmospheric conditions. The results showed that the numerical method captures most of the jet breakup structure and predicts the jet trajectory with an error not exceeding ±5%. The validated numerical method was then applied to simulate LJIC breakup at a real engine operating condition, where the breakup mode was shown to be surface shear breakup. The trajectory of the liquid jet showed good agreement with Ragucci's empirical correlation.
Objective: To stop transmission of hepatitis B virus (HBV) and hepatitis C virus (HCV) infections in association with myocardial perfusion imaging (MPI) at a cardiology clinic.
Design: Outbreak investigation and quasispecies analysis of HCV hypervariable region 1 genome.
Setting: Outpatient cardiology clinic.
Patients: Patients undergoing MPI.
Methods: Case patients met definitions for HBV or HCV infection. Cases were identified through surveillance registry cross-matching against clinic records and serological screening. Observations of clinic practices were performed.
Results: During 2012–2014, 7 cases of HCV and 4 cases of HBV occurred in 4 distinct clusters among patients at a cardiology clinic. Among 3 case patients with HCV infection who had MPI on June 25, 2014, 2 had 98.48% genetic identity of HCV RNA. Among 4 case patients with HCV infection who had MPI on March 13, 2014, 3 had 96.96%–99.24% molecular identity of HCV RNA. Also, 2 clusters of 2 patients each with HBV infection had MPI on March 7, 2012, and December 4, 2014. Clinic staff reused saline vials for >1 patient. No infection control breaches were identified at the compounding pharmacy that supplied the clinic. Patients seen in clinic through March 27, 2015, were encouraged to seek testing for HBV, HCV, and human immunodeficiency virus. The clinic switched to all single-dose medications and single-use intravenous flushes on March 27, 2015, and no further cases were identified.
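The molecular linkage evidence above rests on pairwise percent identity between aligned sequences. As a minimal illustration (the function name and toy sequences are hypothetical, not the study's actual HVR1 data), percent identity is simply the share of aligned positions at which two sequences match:

```python
# Illustrative sketch of pairwise percent identity between two aligned
# sequences of equal length. Toy strings only, not real HCV HVR1 data.
def percent_identity(a, b):
    assert len(a) == len(b), "sequences must be aligned to equal length"
    matches = sum(x == y for x, y in zip(a, b))
    return 100.0 * matches / len(a)

# 7 of 8 positions match -> 87.5% identity
print(percent_identity("ACGTACGT", "ACGTACGA"))  # 87.5
```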
Conclusions: This prolonged healthcare-associated outbreak of HBV and HCV was most likely related to breaches in injection safety. Providers should follow injection safety guidelines in all practice settings.
To evaluate the impacts of guanidinoacetic acid (GAA) and coated folic acid (CFA) on growth performance, nutrient digestion and hepatic gene expression, fifty-two Angus bulls were assigned to four groups in a 2 × 2 factorial design. CFA was supplemented at 0 or 6 mg/kg folic acid in dietary DM, in diets containing GAA at 0 (GAA−) or 0·6 g/kg DM (GAA+). Average daily gain (ADG), feed efficiency and hepatic creatine concentration increased with GAA or CFA addition, and the magnitude of these increases was greater when CFA was added to GAA− diets than to GAA+ diets. Blood creatine concentration increased with GAA or CFA addition, and a greater increase was observed when CFA was supplemented in GAA+ diets than in GAA− diets. DM intake was unchanged, but rumen total SCFA concentration and digestibilities of DM, crude protein, neutral-detergent fibre and acid-detergent fibre increased with the addition of GAA or CFA. The acetate:propionate ratio was unaffected by GAA but increased with CFA addition. Increases in blood concentrations of albumin, total protein and insulin-like growth factor-1 (IGF-1) were observed with GAA or CFA addition. Blood folate concentration was decreased by GAA but increased with CFA addition. Hepatic expression of IGF-1, phosphoinositide 3-kinase, protein kinase B, mammalian target of rapamycin and ribosomal protein S6 kinase increased with GAA or CFA addition. The results indicated that combined supplementation of GAA and CFA did not increase ADG further compared with GAA or CFA addition alone.
Introduction: There is ongoing concern about the burden placed on healthcare systems by lab tests. Although these concerns are widespread, it is difficult to quantify the extent of the problem. One approach involves use of a metric known as the Mean Abnormal Response Rate (MARR), which is the proportion of tests ordered that return an abnormal result; a higher MARR value indicates higher yield. The primary objective of this study was to calculate MARRs for tests ordered between April 2014 and March 2019 at the four adult emergency departments (EDs) covering a metropolitan population of 1.3 million. Secondary objectives included identifying tests with highest and lowest MARRs; comparison of MARRs for nurse- and physician-initiated orders; correlation of the number of tests per order requisition to MARR; and correlation of physician experience to MARR. Methods: In total, 40 laboratory tests met inclusion criteria for this study. Administrative data on these tests as ordered at the four EDs were obtained and analyzed. Multi-component test results, such as from CBC, were consolidated such that an abnormal result for any component was coded as an abnormal result for the entire test. Repeat tests ordered within a single patient visit were excluded. Physician experience was quantified for 209 ED physicians as number of years since licensure. Analyses were descriptive where appropriate for whole-population data. Risk of bias was attenuated by the focus on administrative data. Results: The population dataset comprised 33,757,004 test results on 415,665 unique patients. Of these results, 30.3% were the outcomes of nurse-initiated orders. The 5-year MARRs for the four hospitals were 38.3%, 40.0%, 40.7% and 40.9%. The highest per-test MARRs were for BNP (80.5%) and CBC (62.6%), while the lowest were for glucose (7.9%) and sodium (11.6%). MARRs were higher for nurse-initiated orders than for physician-initiated orders (44.7% vs. 38.1%), likely due to the greater order frequency of high-yield CBC in nurse-initiated orders (38.6% vs. 18.1%). The number of tests per order requisition was inversely associated with MARR (r = -0.90, p < 0.001). Finally, the number of years since licensure was modestly but significantly associated with MARR (r = 0.28, p < 0.001). Conclusion: This is the first and largest study to apply the MARR in an ED setting. As a metric, MARR effectively identifies differences in test ordering practices on per-test and per-hospital bases, which could be useful for data-informed practice optimization.
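The MARR is straightforward to compute from administrative data. A minimal Python sketch with hypothetical field names and toy data (as in the study, a multi-component test such as CBC is assumed to have already been consolidated to a single abnormal/normal flag per order):

```python
# Hypothetical sketch of the MARR calculation: the proportion of ordered
# tests that return an abnormal result, reported per test type.
from collections import defaultdict

def mean_abnormal_response_rate(results):
    """results: iterable of (test_name, is_abnormal) pairs, one per order.
    Returns a dict mapping test_name -> proportion of abnormal results."""
    ordered = defaultdict(int)
    abnormal = defaultdict(int)
    for test_name, is_abnormal in results:
        ordered[test_name] += 1
        if is_abnormal:
            abnormal[test_name] += 1
    return {t: abnormal[t] / ordered[t] for t in ordered}

# Toy data: CBC ordered 4 times with 2 abnormal results -> MARR 0.5
demo = [("CBC", True), ("CBC", False), ("CBC", True), ("CBC", False),
        ("glucose", False), ("glucose", False)]
print(mean_abnormal_response_rate(demo))  # {'CBC': 0.5, 'glucose': 0.0}
```

A per-hospital MARR is the same calculation with the hospital identifier as the key instead of the test name.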
Introduction: Prognostication and disposition among older Emergency Department (ED) patients with suspected infection remain challenging. Frailty is increasingly recognized as a predictor of poor prognosis among critically ill patients; however, its association with clinical outcomes among older ED patients with suspected infection is unknown. Methods: We conducted a multicentre prospective cohort study at two tertiary care EDs. We included older ED patients (≥ 75 years) presenting with suspected infection. Frailty at baseline (prior to the index illness) was explicitly measured for all patients by the treating physicians using the Clinical Frailty Scale (CFS). We defined frailty as a CFS score of 5-8. The primary outcome was 30-day mortality. We used multivariable logistic regression to adjust for known confounders. We also compared the prognostic accuracy of frailty against the Systemic Inflammatory Response Syndrome (SIRS) and Quick Sequential Organ Failure Assessment (qSOFA) criteria. Results: We enrolled 203 patients, of whom 117 (57.6%) were frail. Frail patients were more likely to develop septic shock (adjusted odds ratio [aOR]: 1.83, 95% confidence interval [CI]: 1.08-2.51) and more likely to die within 30 days of ED presentation (aOR 2.05, 95% CI: 1.02-5.24). Sensitivity for mortality was highest for the CFS (73.1%, 95% CI: 52.2-88.4), compared with SIRS ≥ 2 (65.4%, 95% CI: 44.3-82.8) or qSOFA ≥ 2 (38.4%, 95% CI: 20.2-59.4). Conclusion: Frailty is a highly prevalent prognostic factor that can be used to risk-stratify older ED patients with suspected infection. ED clinicians should consider screening for frailty in order to optimize disposition in this population.
Se can enhance lactation performance by improving nutrient utilization and antioxidant status. However, sodium selenite (SS) can be reduced to non-absorbable elemental Se in the rumen, thereby reducing the intestinal availability of Se. This study investigated the impacts of SS and coated SS (CSS) supplementation on lactation performance, nutrient digestibility, ruminal fermentation and microbiota in dairy cows. Sixty multiparous Holstein dairy cows were blocked by parity, daily milk yield and days in milk and randomly assigned to five treatments: control, SS addition (0.3 mg Se/kg DM as SS) or CSS addition (0.1, 0.2 and 0.3 mg Se/kg DM as CSS for low CSS (LCSS), medium CSS (MCSS) and high CSS (HCSS), respectively). The experimental period was 110 days, with 20 days of adaptation and 90 days of sample collection. Dry matter intake was higher for MCSS and HCSS compared with control. Yields of milk, milk fat and milk protein and feed efficiency were higher for MCSS and HCSS than for control, SS and LCSS. Digestibility of DM and organic matter was highest for CSS addition, followed by SS addition and then control. Digestibility of CP was higher for MCSS and HCSS than for control, SS and LCSS. Higher digestibility of ether extract, NDF and ADF was observed for SS or CSS addition. Ruminal pH decreased with dietary Se addition. The acetate to propionate ratio and ammonia N were lower, and total volatile fatty acid (VFA) concentration was greater, for SS, MCSS and HCSS than for control. Ruminal hydrogen ion concentration was highest for MCSS and HCSS and lowest for control. Activities of cellobiase, carboxymethyl-cellulase, xylanase and protease and copies of total bacteria, fungi, Ruminococcus flavefaciens, Fibrobacter succinogenes and Ruminococcus amylophilus increased with SS or CSS addition.
Activity of α-amylase; copies of protozoa, Ruminococcus albus and Butyrivibrio fibrisolvens; and serum glucose, total protein, albumin and glutathione peroxidase were higher for SS, MCSS and HCSS than for control and LCSS. Dietary SS or CSS supplementation elevated blood Se concentration and total antioxidant capacity. The data imply that milk yield was elevated owing to the increases in total-tract nutrient digestibility, total VFA concentration and microbial populations with 0.2 or 0.3 mg Se/kg DM from CSS supplementation in dairy cows. Compared with SS, HCSS addition was more efficient in promoting the lactation performance of dairy cows.
The main aim of the present studies was to determine whether, and to what extent, specific cognitive domains could differentiate the main subtypes of mood disorder in the depressed and clinically remitted states, respectively.
Three groups of patients with bipolar I (n = 92), bipolar II (n = 131) and unipolar depression (n = 293) were tested with a battery of neuropsychological tests at baseline and after 6 weeks of treatment, and compared with 202 healthy controls on cognitive performance.
In the acute depressive state, the three patient groups (bipolar I, bipolar II and unipolar depression) showed cognitive dysfunction in processing speed, memory, verbal fluency and executive function, but not attention, compared with controls. Post hoc comparisons revealed that bipolar I patients performed significantly worse than bipolar II and unipolar depression patients in verbal fluency and executive function. After treatment, clinically remitted bipolar I and bipolar II patients displayed cognitive impairment only in processing speed and visual memory relative to controls, whereas remitted unipolar depression patients showed cognitive impairment in executive function in addition to processing speed and visual memory.
Bipolar I, bipolar II and unipolar depression patients have a similar pattern of cognitive impairment during acute depressive episodes. At clinical remission, both bipolar disorder and unipolar depression patients still showed cognitive deficits in processing speed and visual memory; executive dysfunction might thus be a state marker for bipolar disorder but a trait marker for unipolar depression.
The main aim of this study was to investigate the capacity of variables from four dimensions (clinical, psychosocial, cognitive and genetic domains) to predict antidepressant treatment outcome, and to combine the predictors in one integrated regression model in order to investigate which predictor contributed most.
In a semi-naturalistic prospective cohort study of 241 fully assessed patients with major depressive disorder (MDD), the decrease in HAM-D scores from baseline to 6 weeks of treatment was used to measure antidepressant treatment outcome.
The clinical and psychosocial model (R2 = 0.451) showed that baseline HAM-D scores and the MMPI-2 paranoia scale were the best clinical and psychosocial predictors of treatment outcome, respectively. The cognitive model (R2 = 0.502) revealed that the combination of better performance on the TMT-B test and worse performance on the TOH and WAIS-R Digit Backward tests could predict a decline in HAM-D scores. The genetic analysis found only that the median percent improvement in HAM-D scores in carriers of the G allele of the GR gene BclI polymorphism (72.2%) was significantly lower than that in non-G-allele carriers (80.1%). The integrated model showed that the combination of three predictors (baseline HAM-D scores, the MMPI-2 paranoia scale and the TMT-B test) explained 57.1% of the variance.
Three markers (baseline HAM-D scores, the MMPI-2 paranoia scale and the TMT-B test) might serve as predictors of antidepressant outcome in daily psychiatric practice.
Many institutions are attempting to implement patient-reported outcome (PRO) measures. Because PROs often change clinical workflows significantly for patients and providers, implementation choices can have major impact. While various implementation guides exist, a stepwise list of decision points covering the full implementation process and drawing explicitly on a sociotechnical conceptual framework does not exist.
To facilitate real-world implementation of PROs in electronic health records (EHRs) for use in clinical practice, members of the EHR Access to Seamless Integration of Patient-Reported Outcomes Measurement Information System (PROMIS) Consortium developed structured PRO implementation planning tools. Each institution pilot tested the tools. Joint meetings led to the identification of critical sociotechnical success factors.
Three tools were developed and tested: (1) a PRO Planning Guide summarizes the empirical knowledge and guidance about PRO implementation in routine clinical care; (2) a Decision Log allows decision tracking; and (3) an Implementation Plan Template simplifies creation of a sharable implementation plan. Seven lessons learned during implementation underscore the iterative nature of planning and the importance of the clinician champion, as well as the need to understand aims, manage implementation barriers, minimize disruption, provide ample discussion time, and continuously engage key stakeholders.
Highly structured planning tools, informed by a sociotechnical perspective, enabled the construction of clear, clinic-specific plans. By developing and testing three reusable tools (freely available for immediate use), our project addressed the need for consolidated guidance and created new materials for PRO implementation planning. We identified seven important lessons that, while common to technology implementation, are especially critical in PRO implementation.
Deep-brain magnetic stimulation (DMS) is an effective therapy for various neuropsychiatric disorders, including major depressive disorder. The molecular and cellular mechanisms underlying the impacts of DMS on the brain remain unclear. Studies have reported abnormalities in the white matter of depressive brains, suggesting the involvement of myelin and oligodendrocyte pathologies in the development of major depressive disorder. In this study, we used a cuprizone-induced demyelination animal model to generate depression-like behaviours and white-matter and oligodendrocyte damage. Meanwhile, we treated the animals with DMS for 20 minutes daily during the cuprizone challenge or the recovery period. Behavioural tests, including nesting, novel object recognition, working memory and depression-like behaviours, were administered periodically. Histological staining and western blotting were used to examine the underlying mechanisms of DMS. We found that DMS reversed cuprizone-induced behavioural deficits during acute demyelination but not during the recovery period. DMS alleviated the demyelination and inflammation induced by cuprizone. During the recovery period, DMS had no impact on overall neural progenitor cell proliferation but enhanced the maturation of oligodendrocytes. These data suggest that DMS may be a promising treatment option for improving white-matter function in psychiatric disorders and neurological diseases.
Short-term peripheral venous catheter–related bloodstream infection (PVCR-BSI) rates have not been systematically studied in resource-limited countries, and data on their incidence by number of device days are not available.
A prospective surveillance study of PVCR-BSI was conducted from September 1, 2013, to May 31, 2019, in 727 intensive care units (ICUs) by members of the International Nosocomial Infection Control Consortium (INICC), from 268 hospitals in 141 cities of 42 countries across the Africa, Americas, Eastern Mediterranean, Europe, South East Asia, and Western Pacific regions. For this research, we applied the definitions and criteria of the CDC NHSN, the methodology of the INICC, and the INICC Surveillance Online System software.
We followed 149,609 ICU patients for 731,135 bed days and 743,508 short-term peripheral venous catheter (PVC) days. We identified 1,789 PVCR-BSIs, for an overall rate of 2.41 per 1,000 PVC days. Mortality was 6.67% in patients with a PVC but without PVCR-BSI, and 18% in patients with a PVC and PVCR-BSI. The length of stay was 4.83 days in patients with a PVC but without PVCR-BSI, and 9.85 days in patients with a PVC and PVCR-BSI. Among these infections, the microorganism profile showed 58% gram-negative bacteria: Escherichia coli (16%), Klebsiella spp (11%), Pseudomonas aeruginosa (6%), Enterobacter spp (4%), and others (20%) including Serratia marcescens. Staphylococcus aureus was the predominant gram-positive bacterium (12%).
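The headline incidence figure follows directly from the counts reported above, as a simple events-per-1,000-device-days calculation (the function name is illustrative):

```python
# Device-associated infection rate per 1,000 device days, as used in
# NHSN/INICC-style surveillance reporting.
def rate_per_1000(events, device_days):
    return 1000.0 * events / device_days

# 1,789 PVCR-BSIs over 743,508 PVC days
print(round(rate_per_1000(1789, 743508), 2))  # 2.41
```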
PVCR-BSI rates in INICC ICUs were much higher than rates published from industrialized countries. Infection prevention programs must be implemented to reduce the incidence of PVCR-BSIs in resource-limited countries.
Cytomegalovirus (CMV) enters latency after primary infection and can reactivate periodically, with virus excreted in body fluids, a process known as shedding. CMV shedding during the early stage of pregnancy is associated with adverse pregnancy outcomes. The shedding pattern in healthy seropositive women who plan to become pregnant has not been well characterised. Vaginal swabs, urine and blood were collected from 1262 CMV IgG-positive women who intended to become pregnant and tested for CMV DNA by fluorogenic quantitative PCR. Serum IgM was also measured. The association between sociodemographic characteristics and the prevalence of CMV shedding was analysed. Among the 1262 seropositive women, 12.8% (161/1262) were positive for CMV DNA in at least one body fluid. CMV DNA was detected more frequently, and at higher viral loads, in vaginal secretion (10.5%) than in urine (3.2%) or blood (0.6%) (P < 0.001). CMV shedding was more likely to be detected in IgM-positive women than in IgM-negative women (29.5% (13/44) vs. 12.2% (148/1218); OR 3.03, 95% CI 1.55–5.93; P = 0.001). CMV shedding in vaginal secretion was highly correlated with shedding in urine, IgM status, adverse pregnancy history and younger age. In summary, CMV shedding was more commonly detected, and at higher viral loads, in vaginal secretion than in urine or blood among healthy seropositive women of reproductive age. Further studies are needed to determine whether shedding is occasional or continuous and whether it is associated with adverse pregnancy outcomes.
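The reported OR of 3.03 can be reproduced from the shedding counts given above. A small sketch (the function name is illustrative; confidence intervals are omitted):

```python
# Odds ratio for a 2x2 table of exposure (IgM status) vs. outcome (shedding).
def odds_ratio(exposed_cases, exposed_noncases, unexposed_cases, unexposed_noncases):
    return (exposed_cases / exposed_noncases) / (unexposed_cases / unexposed_noncases)

# IgM-positive: 13 shedders of 44 (31 non-shedders)
# IgM-negative: 148 shedders of 1218 (1070 non-shedders)
print(round(odds_ratio(13, 31, 148, 1070), 2))  # 3.03
```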
A principled approach to understand networks is to formulate generative models and infer their parameters from given network data. Due to the scarcity of data in the form of multiple networks that have evolved from the same process, generative models are typically formulated to learn parameters from a single network observation, hence ignoring the natural variability of the “true” process. In this paper, we highlight the importance of variability in evaluating generative models and present two ways of quantifying the variability for a finite set of networks. The first evaluation scheme compares the statistical properties of networks in a dissimilarity space, while the other relies on data-driven entropy measures to compute variability in network populations. Using these measures, we evaluate the ability of four generative models to synthesize networks that capture the variability of the “true” process. Our empirical analysis suggests that generative models fitted for a single network observation fail to capture the variability in the network population. Our work highlights the need for rethinking the way we evaluate the goodness-of-fit of new and existing network models and devising models that are capable of matching the variability of network populations when available.
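The first evaluation scheme can be sketched very roughly in code. In this toy example (the choice of statistics and the mean-pairwise-distance summary are illustrative assumptions, not the paper's exact measures), each network is mapped to a feature vector of simple statistics, and the variability of a population is summarised as the mean pairwise distance between vectors in that dissimilarity space:

```python
# Toy illustration of a "dissimilarity space" for network populations:
# summarise each network by statistics, then measure population spread.
import math

def stats_vector(n_nodes, edges):
    """Map an undirected simple graph to (density, mean degree)."""
    m = len(edges)
    density = 2 * m / (n_nodes * (n_nodes - 1))
    mean_degree = 2 * m / n_nodes
    return (density, mean_degree)

def variability(vectors):
    """Mean pairwise Euclidean distance between feature vectors."""
    dists = [math.dist(u, v)
             for i, u in enumerate(vectors) for v in vectors[i + 1:]]
    return sum(dists) / len(dists)

# Three small "observed" networks on 4 nodes, as edge lists
nets = [[(0, 1), (1, 2)],
        [(0, 1), (1, 2), (2, 3)],
        [(0, 1), (1, 2), (2, 3), (3, 0)]]
vecs = [stats_vector(4, e) for e in nets]
print(variability(vecs))
```

Under this framing, a generative model fitted to a single observation can be judged by whether the networks it synthesises reproduce a comparable spread, not just comparable means.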
Multiple lines of evidence suggest the presence of altered neuroimmune processes in patients with schizophrenia (Sz) and severe mood disorders. Recent studies using a novel free water diffusion tensor imaging (FW DTI) approach, proposed as a putative biomarker of neuroinflammation, atrophy or edema, have shown significantly increased FW in patients with Sz. However, no studies to date have investigated the longitudinal stability of FW alterations during the early course of psychosis, nor have studies focused separately on first-episode (FE) psychosis patients with Sz or bipolar disorder (BD) with psychotic features.
The current study included 188 participants who underwent diffusion magnetic resonance imaging scanning at baseline. Sixty-four participants underwent follow-up rescanning after 12 months. DTI-based alterations in patients were calculated using voxelwise tract-based spatial statistics and region of interest analyses.
Patients with FE psychosis, both Sz and BD, exhibited increased FW at illness onset which remained unchanged over the 12-month follow-up period. Preliminary analyses suggested that antipsychotic medication exposure was associated with higher FW in gray matter that reached significance in the BD group. Higher FW in white matter correlated with negative symptom severity.
Our results support the presence of elevated FW at the onset of psychosis in both Sz and BD, which remains stable during the early course of the illness, with no evidence of either progression or remission.
We aimed to investigate the heterogeneity of seasonal suicide patterns among multiple geographically, demographically and socioeconomically diverse populations.
Weekly time-series data of suicide counts for 354 communities in 12 countries during 1986–2016 were analysed. Two-stage analysis was performed. In the first stage, a generalised linear model, including cyclic splines, was used to estimate seasonal patterns of suicide for each community. In the second stage, the community-specific seasonal patterns were combined for each country using meta-regression. In addition, the community-specific seasonal patterns were regressed onto community-level socioeconomic, demographic and environmental indicators using meta-regression.
We observed seasonal patterns in suicide, with the counts peaking in spring and declining to a trough in winter in most of the countries. However, the shape of seasonal patterns varied among countries from bimodal to unimodal seasonality. The amplitude of seasonal patterns (i.e. the peak/trough relative risk) also varied from 1.47 (95% confidence interval [CI]: 1.33–1.62) to 1.05 (95% CI: 1.01–1.1) among 12 countries. The subgroup difference in the seasonal pattern also varied over countries. In some countries, larger amplitude was shown for females and for the elderly population (≥65 years of age) than for males and for younger people, respectively. The subperiod difference also varied; some countries showed increasing seasonality while others showed a decrease or little change. Finally, the amplitude was larger for communities with colder climates, higher proportions of elderly people and lower unemployment rates (p-values < 0.05).
Despite the common features of a spring peak and a winter trough, seasonal suicide patterns were largely heterogeneous in shape, amplitude, subgroup differences and temporal changes among different populations, as influenced by climate, demographic and socioeconomic conditions. Our findings may help elucidate the underlying mechanisms of seasonal suicide patterns and aid in improving the design of population-specific suicide prevention programmes based on these patterns.
Radio-echo sounding (RES) can be used to understand ice-sheet processes, englacial flow structures and bed properties, making it one of the most popular tools in glaciological exploration. However, RES data are often subject to ‘strip noise’, caused by internal instrument noise and interference, and/or external environmental interference, which can hamper measurement and interpretation. For example, strip noise can result in reduced power from the bed, affecting the quality of ice thickness measurements and the characterization of subglacial conditions. Here, we present a method for removing strip noise based on combined wavelet and two-dimensional (2-D) Fourier filtering. First, we implement discrete wavelet decomposition on RES data to obtain multi-scale wavelet components. Then, 2-D discrete Fourier transform (DFT) spectral analysis is performed on components containing the noise. In the Fourier domain, the 2-D DFT spectrum of strip noise keeps its linear features and can be removed with a ‘targeted masking’ operation. Finally, inverse wavelet transforms are performed on all wavelet components, including strip-removed components, to restore the data with enhanced fidelity. Model tests and field-data processing demonstrate the method removes strip noise well and, incidentally, can remove the strong first reflector from the ice surface, thus improving the general quality of radar data.
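The "targeted masking" step can be illustrated with a simplified NumPy sketch. Note the assumptions: the wavelet decomposition stage is omitted, and the mask geometry below targets purely horizontal strips (constant along the trace axis), which is a simplification of the paper's filter.

```python
# Simplified sketch of 2-D Fourier "targeted masking" for strip noise.
# Horizontal strips (constant along columns) concentrate their energy in a
# single column of the shifted 2-D spectrum; that column is zeroed away from
# the spectral centre, then the spectrum is inverted.
import numpy as np

def remove_horizontal_strips(data, band=1, protect=2):
    """data: 2-D array (row = depth sample, column = trace).
    band: half-width of the masked spectral column band.
    protect: rows around the spectral centre left untouched, so the DC level
    and slowly varying background are preserved."""
    spec = np.fft.fftshift(np.fft.fft2(data))
    rows, cols = spec.shape
    cr, cc = rows // 2, cols // 2
    mask = np.ones_like(spec)
    mask[:cr - protect, cc - band:cc + band + 1] = 0
    mask[cr + protect + 1:, cc - band:cc + band + 1] = 0
    return np.real(np.fft.ifft2(np.fft.ifftshift(spec * mask)))

# Synthetic test: smooth background plus horizontal strip noise
x = np.linspace(0, 1, 64)
background = np.outer(np.sin(2 * np.pi * x), np.cos(2 * np.pi * x))
strips = 0.5 * np.sin(2 * np.pi * 8 * x)[:, None]  # varies only with row
noisy = background + strips
cleaned = remove_horizontal_strips(noisy)
# Masking should bring the data closer to the strip-free background
print(np.abs(cleaned - background).mean() < np.abs(noisy - background).mean())
```

In the full method described above, this masking would be applied only to the wavelet components that contain the noise, which limits damage to genuine reflections that share frequencies with the strips.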