Design education increasingly blends technology learning with sociotechnical challenges, but little is understood about how students simultaneously engage with both of these elements. In this preliminary study, we describe the results of two offerings of a design course focusing on disaster response at a major public research institution. We present a preliminary analysis of 52 students’ course reflections suggesting that sociotechnical challenges uniquely contextualize technology during project-based learning, presenting promising opportunities for future design education and research study.
To assess the training and the future workforce needs of paediatric cardiac critical care faculty.
Design:
REDCap surveys were sent May−August 2019 to medical directors and faculty at the 120 US centres participating in the Society of Thoracic Surgeons Congenital Heart Surgery Database. Faculty and directors were asked about personal training pathway and planned employment changes. Directors were additionally asked for current faculty numbers, expected job openings, presence of training programmes, and numbers of trainees. Predictive modelling of the workforce was performed using respondents’ data. Patient volume was projected from US Census data and compared to projected provider availability.
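As an illustration of this kind of predictive modelling (the abstract does not give the model's equations), here is a minimal sketch of a supply-versus-openings projection; the 49–63 entrants/year figure comes from the results below, while the attrition rate, current faculty count and openings are hypothetical placeholders:

```python
# Minimal sketch of a workforce supply/demand projection.
# The entrant range (49-63/year) is from the abstract; attrition rate,
# current faculty count and openings are hypothetical placeholders.

def project_workforce(current_faculty, open_positions, entrants_per_year,
                      attrition_rate, years=5):
    """Project faculty headcount and unfilled positions over a horizon."""
    faculty, openings = current_faculty, open_positions
    for year in range(1, years + 1):
        departures = faculty * attrition_rate      # retirements / job changes
        faculty += entrants_per_year - departures
        openings = max(0.0, openings + departures - entrants_per_year)
        print(f"Year {year}: faculty={faculty:.0f}, unfilled={openings:.0f}")
    return faculty, openings

# Lower and upper bounds of the entrant estimate from the abstract.
for entrants in (49, 63):
    print(f"--- {entrants} entrants/year ---")
    project_workforce(current_faculty=900, open_positions=60,
                      entrants_per_year=entrants, attrition_rate=0.03)
```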
Measurements and main results:
Sixty-six per cent (79/120) of directors and 62% (294/477) of contacted faculty responded. Most respondents had training that incorporated critical care medicine, with the majority completing training beyond categorical fellowship. Younger respondents and those in dedicated cardiac ICUs were significantly more likely to have advanced training or dual fellowships in cardiology and critical care medicine. An estimated 49–63 faculty enter the workforce annually from various training pathways. Based on modelling, these faculty will likely fill current and projected open positions over the next 5 years.
Conclusions:
Paediatric cardiac critical care training has evolved, such that the majority of faculty now have dual fellowship or advanced training. The projected number of incoming faculty will likely fill open positions within the next 5 years. Institutions with existing or anticipated training programmes should be cognisant of these data and prepare graduates for an increasingly competitive market.
Beachpea (Vigna marina) is a halophytic wild leguminous plant that occurs throughout the tropical and subtropical beaches of the world. As quantitative trait loci (QTLs) for salt tolerance in V. marina and its crossability with other Vigna species are known, the current study was undertaken to determine whether these QTLs are present in V. marina accessions alongside check varieties of pulses. Accordingly, 20 Vigna genotypes (15 accessions of V. marina collected from sea-shore areas of the Andaman and Nicobar Islands, along with five check varieties of green gram and black gram) were subjected to molecular characterization using seven simple sequence repeat (SSR) markers associated with salt tolerance. Of the markers used, only four SSR markers amplified in the studied germplasm. The number of alleles detected per primer ranged from 1 to 3, and allele sizes ranged from 100 to 325 bp. Polymorphism information content and heterozygosity values ranged from 0.305 to 0.537 and 0.375 to 0.612, respectively. Three major clusters (I, II and III) were obtained at a Jaccard's similarity coefficient of 0.48 using the unweighted pair group method with arithmetic mean (UPGMA). This grouped green gram and black gram genotypes in clusters I (4) and II (1), whereas all V. marina genotypes were grouped in cluster III (15). Principal co-ordinate analysis explained 85.9% of the genetic variation among genotypes, which was further confirmed by cluster analysis. This study indicated the effectiveness of SSR markers in separating cultivated Vigna species from wild V. marina. The findings will be useful for transferring the robust salt tolerance of V. marina into cultivated Vigna species using marker-assisted breeding.
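A minimal sketch of the clustering pipeline described above (Jaccard similarity on binary SSR band scores, then UPGMA), using an invented toy marker matrix rather than the study's 20-genotype data:

```python
# Sketch of the described analysis: Jaccard distance on binary SSR band
# scores, clustered with UPGMA (average linkage). Toy data only.
import numpy as np
from scipy.spatial.distance import pdist
from scipy.cluster.hierarchy import linkage, fcluster

# Rows = genotypes, columns = presence/absence of SSR alleles (toy values).
bands = np.array([
    [1, 0, 1, 1, 0],   # V. marina accession 1
    [1, 0, 1, 0, 0],   # V. marina accession 2
    [0, 1, 0, 1, 1],   # green gram check
    [0, 1, 0, 0, 1],   # black gram check
], dtype=bool)

dist = pdist(bands, metric="jaccard")      # 1 - Jaccard similarity
tree = linkage(dist, method="average")     # UPGMA
# Cut the tree at a similarity of 0.48, i.e. a distance of 1 - 0.48 = 0.52.
clusters = fcluster(tree, t=0.52, criterion="distance")
print(clusters)   # cluster label per genotype
```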
Dementia due to probable Alzheimer’s disease (AD) represents between 60 and 80% of all dementias. The total number of AD cases worldwide is estimated to reach 65.7 million by 2030 and 115.4 million by 2050; this represents a nearly twofold increase over those 20 years.
Magnetic resonance imaging (MRI) has been the primary tool for firmly linking hippocampal volume loss with dementia.
MRI-based volumetry has been proposed as a promising biomarker.
Hippocampal volumetry is useful not only in discriminating cognitively normal individuals from those with dementia, but also in differentiating Mild Cognitive Impairment (MCI) from various types of dementia.
Research objective:
To measure hippocampal volume in various types of dementia and to correlate it with Mini-Mental State Examination (MMSE) and Activities of Daily Living (ADL) scores in patients with dementia.
Method:
A cross-sectional study was conducted over a period of one year among 21 patients with Alzheimer’s disease, vascular dementia or amnestic mild cognitive impairment, and 20 healthy age-matched controls. The MMSE scale was used to stratify patients by cognitive function impairment; the ADL scale assessed functional status, i.e. the patient’s ability to perform activities of daily living independently in diverse settings. Hippocampal volume was measured on a 1.5 T Philips Ingenia MRI scanner using a coronal T1-weighted FFE (Fast Field Echo) 3D sequence.
Results:
Total hippocampal volume was reduced by 35% in Alzheimer’s disease, 27% in vascular dementia and 10% in amnestic mild cognitive impairment, compared with the control group.
Conclusions:
There was a moderate positive correlation between mean total hippocampal volume and MMSE scores in patients with dementia, which was statistically significant (P = 0.001).
ABSTRACT IMPACT: Our goal is to identify bacterial biomarkers of adverse Clostridioides difficile infection outcomes. OBJECTIVES/GOALS: We characterized microbiota features of Clostridioides difficile infections (CDIs) and will investigate the association between bacterial taxa and adverse outcomes, which include severe and recurrent CDIs. METHODS/STUDY POPULATION: 1,517 stool samples were collected from patients diagnosed with a CDI at the University of Michigan, along with 1,516 unformed and 910 formed stool control samples. We characterized the microbiota of the 3,943 stool samples by sequencing the V4 region of the 16S rRNA gene and used the Dirichlet Multinomial Mixtures method to cluster samples into community types. Severe CDI cases were defined using the Infectious Diseases Society of America criteria, and recurrent CDIs were defined as CDIs that occurred within 2-12 weeks of the primary CDI. We will use machine learning to examine whether specific bacterial taxa can predict severe or recurrent CDIs. We will test 5 machine learning models with an 80% training and 20% testing data split. RESULTS/ANTICIPATED RESULTS: Similar to findings from a previous study with 338 samples, we found no difference in diversity between CDI cases and unformed controls (Inverse Simpson index, p > 0.5), and samples from the 3 groups (CDIs, unformed controls, and formed controls) clustered into 12 community types. To investigate the bacterial taxa that are important for predicting adverse CDI outcomes, we will select the best machine learning model based on performance and training time and examine how much each feature contributes to performance. We anticipate that the large number of CDI cases in our cohort and robust machine learning approaches will enable us to identify more bacteria associated with adverse outcomes than other studies that have attempted to predict CDI recurrence with fewer CDI cases. DISCUSSION/SIGNIFICANCE OF FINDINGS: Adverse CDI outcomes are a significant source of the morbidity, mortality, and healthcare costs associated with CDIs. Identifying bacterial biomarkers of severe and recurrent CDIs could enhance our ability to stratify patients into risk groups and may lead to the development of more targeted therapeutics.
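A minimal sketch of the planned 80%/20% machine-learning comparison, with random placeholder arrays standing in for the 16S taxa features and outcome labels; the specific models to be tested are not named in the abstract, so two common classifiers are used here for illustration:

```python
# Sketch of the planned model comparison on an 80/20 split.
# X would hold per-sample taxa abundances; y would flag severe or
# recurrent CDI. Random data stand in for the real features here.
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.linear_model import LogisticRegression
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(0)
X = rng.random((1517, 200))          # placeholder taxa features
y = rng.integers(0, 2, 1517)         # placeholder adverse-outcome labels

X_tr, X_te, y_tr, y_te = train_test_split(
    X, y, test_size=0.2, stratify=y, random_state=42)

for model in (LogisticRegression(max_iter=1000),
              RandomForestClassifier(n_estimators=200, random_state=42)):
    model.fit(X_tr, y_tr)
    auc = roc_auc_score(y_te, model.predict_proba(X_te)[:, 1])
    print(type(model).__name__, round(auc, 3))
```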
Hidden hunger is widespread in India. Individual dietary diversity score (IDDS) is a measure of the nutrient adequacy of the diet. The FAO has set guidelines for the measurement of dietary diversity: the IDDS and the minimum dietary diversity score for women (MDD-W) to assess nutritional deficiency, but validation against nutritional biomarkers is required. Using available data among rural youth (17 years) from the Pune Maternal Nutrition Study, the validity of DDS was assessed to measure deficiencies of vitamin B12, folate and Hb. Of the 355 boys and 305 girls, 19 % were classified as underweight, 57 % as vitamin B12 deficient (<150 pmol/l) and 22 % as anaemic (<120/130 g/l). Cereals, legumes and ‘other-vegetables’ were the most frequently consumed foods. More boys than girls consumed milk, flesh, eggs and micronutrient-dense foods. Median IDDS of 4 (interquartile range (IQR) 3–4) and MDD-W of 6 (IQR 5–7) were low. Youth with vitamin B12 deficiency had a higher likelihood of an IDDS ≤ 4 (1·89; 95 % CI 1·24, 2·87) or an MDD-W ≤ 5 (1·40; 95 % CI 1·02, 1·94). Youth with anaemia were more likely to have an IDDS ≤ 4 (1·76; 95 % CI 1·01, 3·14) adjusted for socio-economic scores, BMI, energy intake and sex. Folate deficiency was low (3 %) and was not associated with either score. Youth with lowest plasma vitamin B12 and Hb infrequently or never consumed dairy products/non-vegetarian foods. These rural Indian youth were underweight, had low DDS and consumed foods low in good-quality proteins and micronutrients. Associations of DDS with circulating micronutrients indicate that DDS is a valid measure to predict vitamin B12 deficiency and anaemia.
Case fatality rate (CFR) and doubling time are important characteristics of any epidemic. For coronavirus disease 2019 (COVID-19), wide variations in the CFR and doubling time have been noted among various countries. Early in an epidemic, CFR calculations that use all patients as the denominator do not account for hospitalised patients who are still ill and will die in the future. Hence, we calculated a cumulative CFR (cCFR) using only patients whose final clinical outcomes were known at a certain time point. We also estimated the daily average doubling time. Calculating the CFR this way leads to temporal stability in the fatality rates; the cCFR stabilises at different values for different countries. The possible reasons for this are an improved outcome rate by the end of the epidemic and a wider testing strategy. The United States, France, Turkey and China had a high cCFR at the start due to low outcome rates. By 22 April, Germany, China and South Korea had a low cCFR. China and South Korea controlled the epidemic and achieved high doubling times. The doubling time in Russia did not cross 10 days during the study period.
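A worked sketch of the two quantities described above, with invented cumulative counts; the cCFR conditions only on cases whose outcomes (death or recovery) are already known:

```python
# Cumulative CFR restricted to resolved cases, plus a daily doubling-time
# estimate. The daily counts below are invented for illustration.
import math

deaths    = [10, 25, 60, 120, 180]        # cumulative deaths by day
recovered = [5, 40, 150, 400, 900]        # cumulative recoveries by day
cases     = [300, 700, 1500, 2800, 4500]  # cumulative confirmed cases

for d, r in zip(deaths, recovered):
    ccfr = d / (d + r)                    # denominator = known outcomes only
    print(f"cCFR = {ccfr:.1%}")

# Doubling time between consecutive days: t * ln 2 / ln(C_t / C_0), t = 1 day.
for c0, c1 in zip(cases, cases[1:]):
    td = math.log(2) / math.log(c1 / c0)
    print(f"doubling time ~ {td:.1f} days")
```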
This paper offers a framework for measuring global growth and inflation, built on standard index number theory, national accounts principles, and the concepts and methods for international macro-economic comparisons. Our approach provides a sound basis for purchasing power parity (PPP)- and exchange rate (XR)-based global growth and inflation measures. The Sato–Vartia index number system advocated here offers very similar results to a Fisher system but has the added advantage of allowing a complete decomposition with PPP or XR effects. For illustrative purposes, we present estimates of global growth and inflation for 141 countries over the years 2005 and 2011. The contribution of movements in XRs and PPPs to global inflation are presented. The aggregation properties of the method are also discussed.
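A minimal sketch of the standard Sato–Vartia aggregation (log-changes weighted by normalized logarithmic means of value shares), with toy country shares and growth relatives; the paper's full PPP/XR decomposition is not reproduced here:

```python
# Sketch of a Sato-Vartia aggregate: log-changes weighted by normalized
# logarithmic means of value shares. Shares and relatives are toy values.
import math

def log_mean(a, b):
    """Logarithmic mean; equals a when a == b."""
    return a if math.isclose(a, b) else (a - b) / (math.log(a) - math.log(b))

def sato_vartia(relatives, shares0, shares1):
    """Aggregate per-country relatives (e.g. PPP-deflated growth factors)."""
    lm = [log_mean(s1, s0) for s0, s1 in zip(shares0, shares1)]
    weights = [m / sum(lm) for m in lm]          # normalize to sum to 1
    log_index = sum(w * math.log(r) for w, r in zip(weights, relatives))
    return math.exp(log_index)

# Three hypothetical countries: growth relatives and value shares in
# the two periods (each period's shares sum to 1).
print(sato_vartia(relatives=[1.03, 1.08, 1.02],
                  shares0=[0.5, 0.3, 0.2],
                  shares1=[0.45, 0.35, 0.2]))
```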
To evaluate whether incorporating mandatory prior authorization for Clostridioides difficile testing into antimicrobial stewardship pharmacist workflow could reduce testing in patients with alternative etiologies for diarrhea.
Design:
Single center, quasi-experimental before-and-after study.
Setting:
Tertiary-care, academic medical center in Ann Arbor, Michigan.
Patients:
Adult and pediatric patients admitted between September 11, 2019 and December 10, 2019 were included if they had an order placed for 1 of the following: (1) C. difficile enzyme immunoassay (EIA) in patients hospitalized >72 hours who had received laxatives or oral contrast, or had tube feeds initiated, within the prior 48 hours; (2) repeat molecular multiplex gastrointestinal pathogen panel (GIPAN) testing; or (3) GIPAN testing in patients hospitalized >72 hours.
Intervention:
A best-practice alert prompting prior authorization by the antimicrobial stewardship program (ASP) for EIA or GIPAN testing was implemented. Approval required the provider to page the ASP pharmacist and discuss the rationale for testing. The provider could not proceed with the order if ASP approval was not obtained.
Results:
An average of 2.5 requests per day was received over the 3-month intervention period. The weekly rate of EIA and GIPAN orders per 1,000 patient-days decreased significantly from 6.05 ± 0.94 to 4.87 ± 0.78 (IRR, 0.72; 95% CI, 0.56–0.93; P = .010) and from 1.72 ± 0.37 to 0.89 ± 0.29 (IRR, 0.53; 95% CI, 0.37–0.77; P = .001), respectively.
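For illustration, an incidence rate ratio with a Wald confidence interval on the log scale can be computed as below. The order counts and patient-day denominators are invented (chosen to reproduce the reported pre/post rates), and the study may have modelled the weekly rates differently, e.g. with regression:

```python
# Sketch of an incidence rate ratio with a Wald CI on the log scale.
# Counts and denominators are hypothetical stand-ins.
import math

def rate_ratio(events_post, persontime_post, events_pre, persontime_pre):
    irr = (events_post / persontime_post) / (events_pre / persontime_pre)
    se = math.sqrt(1 / events_post + 1 / events_pre)   # SE of log(IRR)
    lo, hi = (math.exp(math.log(irr) + z * se) for z in (-1.96, 1.96))
    return irr, lo, hi

# e.g. EIA orders: 300 over 49,600 patient-days before vs 145 over
# 29,800 patient-days during the intervention (hypothetical counts
# matching the reported 6.05 and 4.87 per 1,000 patient-day rates).
print(rate_ratio(145, 29_800, 300, 49_600))
```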
Conclusions:
We identified an efficient, effective C. difficile and GIPAN diagnostic stewardship approval model.
Alcoholism has a high prevalence and impacts morbidity, mortality, quality of life, and the economy. Heritability estimates of alcohol dependence are 50-61%. Putative psychological, cultural, and genetic susceptibilities to alcoholism have been identified, but understanding of the genetic components is still underdeveloped.
Aim
To identify genetic vulnerabilities predisposing individuals to alcoholism and co-morbid psychiatric disorders in the largest study of its kind.
Method
12 centres including 10 trainees are currently collecting blood and clinical samples. Nearly 1700 of 2000 cases of ICD-10/DSM-IV alcohol dependence have been collected; 500 with standardized assessments of alcohol use and comorbidity; and 2000 ancestrally-matched supernormal controls from UCL/collaborators. Genomic DNA will be isolated following standard procedures. Genotyping will be performed using the Affymetrix Gene Chip Human Mapping 1M Array to type up to 1 million single nucleotide polymorphism (SNP) and copy number variant (CNV) markers. Chi-square analyses of allelic association will compare the alcohol-dependent sample with controls.
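A minimal sketch of the planned allelic association test, a 2×2 chi-square of allele counts in cases versus controls; the counts are hypothetical:

```python
# Sketch of a single-SNP allelic association test: 2x2 chi-square of
# allele counts in cases versus controls. Counts are hypothetical.
from scipy.stats import chi2_contingency

#            allele A   allele a
table = [[1450,  1950],   # alcohol-dependent cases (2 alleles/person)
         [1700,  2300]]   # supernormal controls
chi2, p, dof, expected = chi2_contingency(table)
print(f"chi2 = {chi2:.2f}, p = {p:.3g}")
```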
Results
n = 65; 57% male; mean age 45 years; mean age at onset of harmful alcohol use 19 years; mean age at onset of withdrawals 32 years; mean alcohol intake 21 units; primary depression 27%; secondary depression 49%; antisocial personality disorder 14%. The candidate gene approach in this sample has shown that the GABA receptor B1 (GABRB1) and the tachykinin receptor 1 (TACR1) genes are involved in genetic susceptibility to alcoholism. The D2 dopamine receptor is the next candidate.
Conclusion
Preliminary data suggest high psychiatric comorbidity in a clinical alcohol dependence sample and implicate candidate genes. Next steps are genome-wide marker analysis, sequencing and investigation of biological pathway/systems alterations.
Suicide is one of the leading causes of preventable death. Recent data suggest that South India is one of the regions with the highest suicide rates in the world. In 2012, 135,445 people committed suicide in India, according to statistics released by the National Crime Records Bureau.
The suicide note is one of the most important sources for understanding suicide and may be beneficial in suicide prevention. Studies of suicide notes from this part of the world are sparse.
Objective
The aim was to study the themes in suicide notes that might be useful in prevention strategies.
Materials and Methods
A descriptive study was conducted of all suicide notes, available with the Police Department, Mysore District, of individuals who committed suicide between 2010 and 2013; the notes were obtained and analysed.
Result
A total of 22 suicide notes were available. The majority of note writers were in the 16-40 year age group (86%) and most were male (59%). All suicide notes were handwritten, the majority (16) in the regional language, Kannada. The length of the notes varied from a few words to a few pages. The contents included apology, shame or guilt (80%), love for those left behind (55%) and instructions regarding practical affairs (23%). Half (50%) blamed no one for the act, 23% committed suicide to prove their innocence and 32% mentioned a last wish.
Conclusion
The majority of suicide notes contained ‘guilt’, which is a strong indicator of possible depression in the deceased. Creating awareness about suicide among the public and ensuring access to professionals trained in suicide prevention are the need of the hour in this part of the world.
The main aim of the present studies is to determine whether, and to what extent, specific cognitive domains can differentiate the main subtypes of mood disorder in the depressed and clinically remitted states, respectively.
Method
Three groups of patients with bipolar I (n = 92), bipolar II (n = 131) and unipolar depression (n = 293) were tested with a battery of neuropsychological tests at baseline and after 6 weeks of treatment, and their cognitive performance was contrasted with that of 202 healthy controls.
Results
In the acute depressive state, the three patient groups (bipolar I, bipolar II and unipolar depression) showed cognitive dysfunction in processing speed, memory, verbal fluency and executive function, but not attention, compared with controls. Post hoc comparisons revealed that bipolar I patients performed significantly worse than bipolar II and unipolar depression patients in verbal fluency and executive function. After treatment, clinically remitted bipolar I and bipolar II patients displayed cognitive impairment only in processing speed and visual memory relative to controls, while remitted unipolar depression patients showed cognitive impairment in executive function in addition to processing speed and visual memory.
Conclusion
Bipolar I, bipolar II and unipolar depression patients have a similar pattern of cognitive impairment during acute depressive episodes. In clinical remission, both bipolar disorder and unipolar depression patients still showed cognitive deficits in processing speed and visual memory; executive dysfunction might be a state-marker for bipolar disorder but a trait-marker for unipolar depression.
The main aim of this study is to investigate the capacity of variables from four dimensions (clinical, psychosocial, cognitive and genetic domains) to predict antidepressant treatment outcome, and to combine the predictors in one integrated regression model in order to investigate which predictor contributes most.
Methods
In a semi-naturalistic prospective cohort study with a total of 241 fully assessed MDD patients, decrease in HAM-D scores from baseline to after 6 weeks of treatment was used to measure the antidepressant treatment outcome.
Results
The clinical and psychosocial model (R² = 0.451) showed that the HAM-D score at baseline and the MMPI-2 paranoia scale were, respectively, the best clinical and psychosocial predictors of treatment outcome. The cognitive model (R² = 0.502) revealed that a combination of better performance on the TMT-B test and worse performance on the TOH and WAIS-R Digit Backward tests predicted the decline in HAM-D scores. The genetic analysis found only that the median percentage improvement in HAM-D scores in G-allele carriers of the GR gene BclI polymorphism (72.2%) was significantly lower than that in non-G-allele carriers (80.1%). The integrated model showed that three predictors combined, the HAM-D score at baseline, the MMPI-2 paranoia scale and the TMT-B test, explained 57.1% of the variance.
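A minimal sketch of fitting an integrated model of this form (ordinary least squares of HAM-D improvement on the three retained predictors, reporting R²), on simulated stand-in data rather than the study's cohort:

```python
# Sketch of the integrated model: an OLS fit of HAM-D improvement on the
# three retained predictors, reporting R^2. Data are simulated stand-ins.
import numpy as np
from sklearn.linear_model import LinearRegression

rng = np.random.default_rng(1)
n = 241
hamd_baseline = rng.normal(24, 5, n)
mmpi_paranoia = rng.normal(55, 10, n)
tmt_b_seconds = rng.normal(80, 20, n)
# Simulated outcome loosely tied to the predictors plus noise.
hamd_decrease = (0.5 * hamd_baseline - 0.1 * mmpi_paranoia
                 - 0.05 * tmt_b_seconds + rng.normal(0, 3, n))

X = np.column_stack([hamd_baseline, mmpi_paranoia, tmt_b_seconds])
model = LinearRegression().fit(X, hamd_decrease)
print("R^2 =", round(model.score(X, hamd_decrease), 3))
```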
Conclusion
Three markers, the HAM-D score at baseline, the MMPI-2 paranoia scale and the TMT-B test, might serve as predictors of antidepressant outcome in daily psychiatric practice.
In this study, we estimate the burden of foodborne illness (FBI) caused by five major pathogens among nondeployed US Army service members. The US Army is a unique population that is globally distributed and has its own food procurement system and a food protection system dedicated to the prevention of both unintentional and intentional contamination of food. To our knowledge, the burden of FBI caused by specific pathogens in the US Army population has not been determined. We used data from a 2015 US Army population survey, a 2015 US Army laboratory survey and data from FoodNet to create inputs for two model structures. Model type 1 scaled up case counts of Campylobacter jejuni, Shigella spp., Salmonella enterica non-typhoidal and STEC non-O157 ascertained from the Disease Reporting System internet database from 2010 to 2015. Model type 2 scaled down cases of self-reported acute gastrointestinal illness (AGI) to estimate the annual burden of Norovirus illness. We estimate that these five pathogens caused 45 600 (5%–95% range, 30 300–64 000) annual illnesses among nondeployed active duty US Army Service members. Of these pathogens, Norovirus, Campylobacter jejuni and Salmonella enterica non-typhoidal were responsible for the most illness. There is a substantial burden of AGI and FBI caused by five major pathogens among US Army Soldiers, which can have a major impact on the readiness of the force. The US Army has a robust food protection program in place, but without a specific active FBI surveillance system across the Department of Defence, it will not be possible to measure the effectiveness of modern, targeted interventions aimed at the reduction of specific foodborne pathogens.
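A minimal sketch of a "scale-up" burden model of the kind described (reported counts multiplied by under-ascertainment factors drawn from uncertainty distributions, in the spirit of Scallan-style estimates); all counts and multiplier ranges below are hypothetical:

```python
# Sketch of the scale-up burden model: reported case counts multiplied
# by under-ascertainment multipliers sampled from uncertainty ranges.
# All counts and ranges are hypothetical placeholders.
import numpy as np

rng = np.random.default_rng(2)
reported_cases = 620                           # hypothetical lab-confirmed count
underdiagnosis = rng.uniform(20, 35, 10_000)   # care-seeking/testing multiplier
underreporting = rng.uniform(1.1, 1.5, 10_000) # surveillance-capture multiplier

total = reported_cases * underdiagnosis * underreporting
lo, mid, hi = np.percentile(total, [5, 50, 95])
print(f"estimated illnesses: {mid:,.0f} (5%-95% range {lo:,.0f}-{hi:,.0f})")
```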
Throughout history, acute gastrointestinal illness (AGI) has been a significant cause of morbidity and mortality among US service members. We estimated the magnitude, distribution, risk factors and care-seeking behaviour of AGI among active duty US Army service members using a web-based survey. The survey asked about sociodemographic characteristics, dining and food procurement history and any experience of diarrhoea in the past 30 days. If respondents reported diarrhoea, additional questions were asked about concurrent symptoms, duration of illness, medical care seeking and stool sample submission. Univariable and multivariable logistic regression were used to identify the factors associated with AGI and the factors associated with seeking care and submitting a stool sample. The 30-day prevalence of AGI was 18.5% (95% CI 16.66–20.25) and the incidence rate was 2.24 AGI episodes per person-year (95% CI 2.04–2.49). Risk factors included region of residence, eating at the dining facility and eating at other on-post establishments. Individuals with AGI missed 2.7–3.7 days of work, which cost approximately $847 451 629 in paid wages. The results indicate there are more than 1 million cases of AGI per year among US Army Soldiers, which can have a major impact on readiness. We found that care-seeking behaviours for AGI differ between US Army Service Members and the general population. Army Service Members with AGI report seeking care and having a stool sample submitted less often, especially for severe (bloody) diarrhoea. Factors associated with seeking care included rank, experiencing respiratory symptoms (sore throat, cough), experiencing vomiting and missing work for the illness. Factors associated with submitting a stool sample included experiencing more than five loose stools in 24 h and not experiencing respiratory symptoms. US Army laboratory-based surveillance under-estimates the number of service members with both bloody and non-bloody diarrhoea. To our knowledge, this is the first study to estimate the magnitude, distribution, risk factors and care-seeking behaviour of AGI among Army members. We determined that Army service members’ care-seeking behaviours, AGI risk factors and stool sample submission rates differ from those of the general population, so when estimating the burden of AGI caused by specific foodborne pathogens using methods like Scallan et al. (2011), unique multipliers must be used for this subset of the population. The study not only establishes the importance of AGI in the active duty Army population but also highlights opportunities for public health leaders to engage in simple strategies to better capture the impact of AGI, so that more modern intervention strategies can be implemented to reduce the burden and indirectly improve operational readiness across the Enterprise.
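As a rough check of the reported figures, the 30-day prevalence scales to episodes per person-year as follows (this ignores episode duration and repeat episodes within a window, so it is only an approximation of the study's estimator):

```python
# Quick approximate check of the prevalence-to-incidence conversion:
# a 30-day episode prevalence scaled to episodes per person-year.
prevalence_30d = 0.185
episodes_per_person_year = prevalence_30d * 365 / 30
print(round(episodes_per_person_year, 2))   # ~2.25, near the reported 2.24
```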
Outpatient parenteral antimicrobial therapy (OPAT) programmes facilitate hospital discharge, but patients remain at risk of complications and consequent healthcare utilisation (HCU). Here we elucidated the incidence of, and risk factors associated with, HCU in OPAT patients. This was a retrospective, single-centre, case–control study of adult patients discharged on OPAT. Cases (n = 63) and controls (n = 126) were patients who did or did not utilise the healthcare system within 60 days, respectively. Characteristics associated with HCU in bivariate analysis (P ≤ 0.2) were included in a multivariable logistic regression model. Variables were retained in the final model if they were independently (P < 0.05) associated with 60-day HCU. Among all study patients, the mean age was 55 ± 16 years, 65% were men, and wound infection (22%) and cellulitis (14%) were common diagnoses. The cumulative incidence of 60-day unplanned HCU was 27%, with a disproportionately higher incidence in the first 30 days (21%). A statin at discharge (adjusted odds ratio (aOR) 0.23, 95% confidence interval (CI) 0.09–0.57), the number of prior admissions in the past 12 months (aOR 1.48, 95% CI 1.05–2.10) and a sepsis diagnosis (aOR 4.62, 95% CI 1.23–17.3) were independently associated with HCU. HCU was most commonly due to non-infection-related complications (44%) and worsening of the primary infection (31%). There are multiple risk factors for HCU in OPAT patients, and formal OPAT clinics may help to risk-stratify and target the highest-risk groups.
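A minimal sketch of the variable-selection workflow described (a bivariate screen at P ≤ 0.2, then retention at P < 0.05 in the multivariable model), run on a simulated stand-in dataset with hypothetical covariate names:

```python
# Sketch of the described selection workflow: bivariate screen at
# P <= 0.2, then backward elimination until all retained covariates
# have P < 0.05. The DataFrame is a simulated stand-in.
import numpy as np
import pandas as pd
import statsmodels.api as sm

rng = np.random.default_rng(3)
df = pd.DataFrame({
    "hcu": rng.integers(0, 2, 189),        # 60-day HCU outcome
    "statin": rng.integers(0, 2, 189),
    "prior_admits": rng.poisson(1.2, 189),
    "sepsis": rng.integers(0, 2, 189),
    "age": rng.normal(55, 16, 189),
})

# 1) Bivariate screen: keep covariates with P <= 0.2 in one-variable fits.
candidates = []
for var in ["statin", "prior_admits", "sepsis", "age"]:
    fit = sm.Logit(df["hcu"], sm.add_constant(df[[var]])).fit(disp=0)
    if fit.pvalues[var] <= 0.2:
        candidates.append(var)

# 2) Backward elimination at P < 0.05 in the multivariable model.
while candidates:
    fit = sm.Logit(df["hcu"], sm.add_constant(df[candidates])).fit(disp=0)
    worst = fit.pvalues.drop("const").idxmax()
    if fit.pvalues[worst] < 0.05:
        break
    candidates.remove(worst)
print("retained:", candidates)
```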
The nitrification potential of a water-saturated tropical vertisol was estimated during the sequential reduction of nitrate (NO3−), ferric iron (Fe3+), sulphate (SO42−) and carbon dioxide (CO2) in terminal electron-accepting processes (TEAPs). In general, the TEAPs enhanced the potential nitrification rate (PNR) of the soil. Nitrification was highest during Fe3+ reduction, followed by SO42− reduction and NO3− reduction, and lowest in the unreduced control soil. The predicted PNR correlated significantly with the observed PNR. The electron donor Fe2+ stimulated PNR, while S2− inhibited it significantly. Terminal-restriction fragment length polymorphism targeting the amoA gene of ammonia-oxidizing bacteria (AOB) and ammonia-oxidizing archaea (AOA) highlighted population dynamics during the sequential reduction of terminal electron acceptors. Only the relative abundance of AOA varied significantly during the course of soil reduction. The relative abundance of AOB correlated with NO3− and Fe2+. Linear regression models predicted PNR from the values of NO3−, Fe2+ and the relative abundance of AOA. Principal component analysis of PNR under different reducing conditions explained 72.90% of the variance by PC1 and 19.52% by PC2. The results revealed that AOA might have a significant role in nitrification under reducing conditions in the tropical flooded ecosystem of a vertisol.
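A minimal sketch of the PCA step (explained variance of the first two components from standardized variables), on simulated stand-in data rather than the study's measurements:

```python
# Sketch of the PCA step: explained variance of the first two components
# from standardized soil/nitrification variables. Data are simulated.
import numpy as np
from sklearn.decomposition import PCA
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(4)
# Columns might be PNR, NO3-, Fe2+, AOA abundance, etc. (placeholders).
X = rng.normal(size=(24, 5))
pca = PCA(n_components=2).fit(StandardScaler().fit_transform(X))
print(pca.explained_variance_ratio_)   # fractions explained by PC1, PC2
```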
We consider the time dependence of a hierarchy of scaled $L^{2m}$-norms $D_{m,\omega}$ and $D_{m,\theta}$ of the vorticity $\boldsymbol{\omega}=\nabla\times\boldsymbol{u}$ and the density gradient $\nabla\theta$, where $\theta=\log(\rho^{\ast}/\rho_{0}^{\ast})$, in a buoyancy-driven turbulent flow as simulated by Livescu & Ristorcelli (J. Fluid Mech., vol. 591, 2007, pp. 43–71). Here, $\rho^{\ast}(\boldsymbol{x},t)$ is the composition density of a mixture of two incompressible miscible fluids with fluid densities $\rho_{2}^{\ast}>\rho_{1}^{\ast}$, and $\rho_{0}^{\ast}$ is a reference normalization density. Using data from the publicly available Johns Hopkins turbulence database, we present evidence that the $L^{2}$-spatial average of the density gradient $\nabla\theta$ can reach extremely large values at intermediate times, even in flows with low Atwood number $At=(\rho_{2}^{\ast}-\rho_{1}^{\ast})/(\rho_{2}^{\ast}+\rho_{1}^{\ast})=0.05$, implying that very strong mixing of the density field at small scales can arise in buoyancy-driven turbulence. This large growth raises the possibility that the density gradient $\nabla\theta$ might blow up in a finite time.
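For context, in related work on hierarchies of vorticity moments such scaled norms are typically defined as below; the normalization frequency $\varpi_{0}$ and exponents $\alpha_{m}$ are assumed from that related literature, not quoted from this abstract, and should be checked against the paper itself:

```latex
% Typical definitions in this line of work (assumed, not from this abstract):
% \varpi_0 is a fixed normalization frequency, V the flow domain of side L.
\Omega_{m}(t) = \left( L^{-3}\int_{V} |\boldsymbol{\omega}|^{2m}\,dV \right)^{1/(2m)},
\qquad
D_{m,\omega}(t) = \bigl(\varpi_{0}^{-1}\,\Omega_{m}\bigr)^{\alpha_{m}},
\qquad
\alpha_{m} = \frac{2m}{4m-3},
```

with $D_{m,\theta}$ defined analogously from $\nabla\theta$ in place of $\boldsymbol{\omega}$.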
Among pathogens shed by cattle, Escherichia coli O157 ranks highest among those causing human illness. To date, prevalence and risk factors for O157 shedding have been assessed in feedlot cattle, but not dairy cattle. The study aimed to determine prevalence levels and risk factors for shedding of O157 atypical enteropathogenic E. coli (aEPEC) and enterohaemorrhagic E. coli (EHEC) in dairy cattle. Dairy cattle (n = 899) within the first 21 days of lactation were sampled monthly over the course of 1 year on three dry-lot dairies surrounding Fort Collins, CO. During visits, multiple factors were measured (disease history, pharmaceutical use, climate measures, etc.), and cattle faeces were collected and assessed for the presence of O157 and virulence genes. Logistic regression analysis was performed using O157 outcomes and the measured factors. The prevalence of O157 aEPEC was 3·7%, while that of EHEC was 3·0%. Many potential risk factors were highly correlated and were therefore used to build separate multivariable models. An increase in humidity was positively associated with aEPEC, while fluid faeces and history of disease showed a negative association. Meanwhile, increases in temperature and antibiotic treatment were positively associated with EHEC, while more days in milk, a higher hygiene score and cow contact were negatively associated. These results may guide mitigation strategies that reduce O157 shedding and contamination of the human food chain.