Using microsatellite loci, we assessed the mating system and genetic diversity of the dioecious tropical tree Genipa americana in a natural population (NP) and a progeny test (PT). For NP, we also estimated the paternity correlation within and among fruits and the mean pollen dispersal distance. As expected for a dioecious species, all offspring originated from outcrossing (t = 1). Mating among relatives (1 − ts) and paternity correlation (rp) varied among progenies (1 − ts = 0.03–0.19; rp = 0.04–0.40), and both were greater in NP than in PT. The fixation index (F) was generally significant and lower in adults than in offspring, indicating selection against inbred individuals. Paternity correlation was higher within (0.40) than among (0.26) fruits, indicating a lower effective number of pollen donors (Nep) within (2.5) than among (3.8) fruits. Due to the higher rp in NP, the effective size within progenies (Ne) was lower in NP (2.69) than in PT (3.27). The pollen dispersal pattern was strongly leptokurtic, suggesting long-distance pollen dispersal (mean of 179 m). The results show that both populations can be used for seed collection in environmental reforestation programmes; however, considering that PT is structured in maternal progenies, NP is more suitable for seed collection due to the lower probability of mating among related trees.
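The donor counts reported above follow from the standard mating-system identity Nep = 1/rp. A minimal Python sketch reproducing the abstract's arithmetic (the function name is ours):

```python
def effective_pollen_donors(rp: float) -> float:
    """Effective number of pollen donors: Nep = 1 / rp (paternity correlation)."""
    return 1.0 / rp

for label, rp in [("within fruits", 0.40), ("among fruits", 0.26)]:
    print(f"{label}: rp = {rp:.2f} -> Nep = {effective_pollen_donors(rp):.1f}")
# within fruits: rp = 0.40 -> Nep = 2.5
# among fruits:  rp = 0.26 -> Nep = 3.8
```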
The spatial agency bias predicts that people whose native language is rightward written will predominantly envisage action along the same direction. Two mechanisms contribute jointly to this asymmetry: (a) an embodied process related to writing/reading; (b) a linguistic regularity according to which sentence subjects (typically the agent) tend to precede objects (typically the recipient). Here we test a novel hypothesis in relation to the second mechanism, namely, that this asymmetry will be most pronounced in languages with rigid word order. A preregistered study on 14 European languages (n = 420) varying in word order flexibility confirmed a rightward bias in drawings of interactions between two people (agent and recipient). This bias was weaker in more flexible languages, confirming that embodied and linguistic features of language interact in producing it.
To identify dietary patterns associated with subclinical atherosclerosis measured as coronary artery calcification (CAC).
Design:
Cross-sectional analysis of data from the Brazilian Longitudinal Study of Adult Health. Dietary data were assessed using an FFQ, and a principal component factor analysis was used to derive the dietary patterns. The scree plot, eigenvalues > 1 and interpretability were considered when retaining factors. CAC was measured using a computed tomography scanner and an electrocardiography-gated prospective Ca score examination and was categorised into three groups based on the CAC score: 0, 1–100 and >100 Agatston units. Multinomial regression models were conducted for dietary patterns and CAC severity categories.
Setting:
Brazil, São Paulo, 2008–2010.
Participants:
Active and retired civil servants who lived in São Paulo and underwent a CAC exam were included (n 4025).
Results:
Around 10 % of participants (294 men, 97 women) had a detectable CAC (>0), 6·5 % (182 men, 73 women) had a CAC of 1–100 and 3·5 % (110 men, 23 women) had a CAC > 100. Three dietary patterns were identified: convenience food, which was positively associated with atherosclerotic calcification; plant-based and dairy food, which showed no association with CAC; and the traditional Brazilian food pattern (rice, legumes and meats), which was inversely associated with atherosclerotic calcification.
Conclusions:
Our results showed that a dietary pattern consisting of traditional Brazilian foods could be important for reducing the risk of atherosclerotic calcification and preventing future cardiovascular events, whereas a convenience dietary pattern was positively associated with this outcome.
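As a sketch of the analysis pipeline this abstract describes, the snippet below categorises Agatston scores into the three CAC groups and fits a multinomial logistic model on dietary-pattern scores. The data, variable names and pattern labels are simulated placeholders, not the study's data:

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

def cac_category(score: float) -> int:
    """0 = no calcification, 1 = CAC 1-100, 2 = CAC > 100 Agatston units."""
    if score == 0:
        return 0
    return 1 if score <= 100 else 2

rng = np.random.default_rng(0)
n = 4025
pattern_scores = rng.normal(size=(n, 3))  # convenience, plant-based/dairy, traditional
raw_cac = rng.gamma(0.5, 80.0, size=n) * (rng.random(n) < 0.10)  # ~90% zeros, as reported
cac = np.array([cac_category(s) for s in raw_cac])

model = LogisticRegression(max_iter=1000)  # lbfgs fits a multinomial model for 3 classes
model.fit(pattern_scores, cac)
print(model.coef_)  # one row of dietary-pattern coefficients per CAC category
```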
Few studies have focused on sugar consumption during the first 1000 d of life. Thus, this work modelled the pathways linking the consumption of sugary drinks in pregnancy and maternal pre-gestational BMI to the child's early exposure to products with high sugar content and to BMI z-score in the second year of life.
Design:
Data from the BRISA cohort (São Luís, Brazil) were used from baseline to the follow-up in the second year of life.
Setting:
A theoretical model was constructed, using structural equation modelling, to analyse the associations between variables from the prenatal period (socio-economic status, age, frequency of sugary drink consumption during pregnancy and pre-gestational BMI), birth weight, exclusive breast-feeding and two outcomes: calories from products with added sugar as a percentage of total daily energy intake, and BMI z-score, both at the follow-up in the second year of life.
Participants:
Data of pregnant women (n 1136) and their offspring.
Results:
Higher pre-gestational BMI (standardised coefficient (SC) = 0·100; P = 0·008) and a higher frequency of sugary drink consumption during pregnancy (SC = 0·134; P < 0·001) resulted in a higher percentage of daily calories from products with added sugar in the child's second year of life, although no effect on offspring weight was yet observed at that age.
Conclusions:
Maternal obesity and sugary drink consumption in pregnancy increased the risk of the child's early (before 2 years of age) and high exposure to added sugar, showing perpetuation of unhealthy dietary behaviours in the first 1000 d of life.
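The structural equation model described above can be approximated as a path analysis, a chain of regressions on standardised variables, which is how standardised coefficients (SC) like those reported arise. A sketch under that assumption, with simulated placeholder data and variable names:

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(1)
n = 1136
df = pd.DataFrame({
    "pregest_bmi": rng.normal(size=n),
    "sugary_drinks": rng.normal(size=n),
    "pct_added_sugar": rng.normal(size=n),
    "bmi_z": rng.normal(size=n),
})
df = (df - df.mean()) / df.std()  # standardise so coefficients are comparable to SCs

# path 1: prenatal exposures -> % of daily calories from added sugar
path1 = smf.ols("pct_added_sugar ~ pregest_bmi + sugary_drinks", data=df).fit()
# path 2: added-sugar exposure (plus maternal BMI) -> child BMI z-score
path2 = smf.ols("bmi_z ~ pct_added_sugar + pregest_bmi", data=df).fit()
print(path1.params, path2.params, sep="\n")
```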
Babies born small-for-gestational age (SGA) have an increased risk of mortality, morbidity and adverse functional consequences. Studies suggest that pre-pregnancy maternal diet may influence newborns' size. This study aimed to determine whether maternal pre-pregnancy dietary patterns (DP) are associated with delivering SGA newborns in the ProcriAr Cohort Study, São Paulo, Brazil. The pre-pregnancy DP of 299 women were investigated using factor analysis with principal components estimation, based on intake reported on a validated 110-item FFQ. Newborns were classified as SGA if their weight and/or length, adjusted by gestational age and sex, were below the 10th percentile of the INTERGROWTH-21st standards. Multivariate Poisson regression modelling with robust error variance was performed to examine associations between the different DP (in quintiles) and SGA. In a model adjusted for maternal sociodemographic and health behaviours, women who scored in the highest quintile of the DP ‘Snacks, sandwiches, sweets and soft drinks’ (relative to women who scored in the lowest quintile) were significantly more likely to deliver SGA babies (relative risk 1·92; 95 % CI 1·08, 3·39). This study verified that a pre-pregnancy dietary behaviour characterised by energy-dense, nutrient-poor food intake was a risk factor for delivering SGA newborns. Investments in education and improved access to healthful food and nutritional information before pregnancy should be prioritised due to their potential positive impact on child health. However, further studies are warranted to identify the specific metabolic pathways that may underlie these associations.
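The "Poisson regression with robust error variance" named above is the modified Poisson approach for estimating relative risks from a binary outcome. A sketch with simulated data (the quintile contrast and effect size are illustrative, not the study's):

```python
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(2)
n = 299
quintile = rng.integers(1, 6, size=n)       # dietary-pattern quintile, 1-5
top = (quintile == 5).astype(float)         # indicator for the highest quintile
X = sm.add_constant(top)
y = rng.binomial(1, 0.10 + 0.08 * top)      # SGA indicator, illustrative risk

# Poisson GLM + sandwich (robust) errors -> relative risks for a binary outcome
fit = sm.GLM(y, X, family=sm.families.Poisson()).fit(cov_type="HC0")
print(np.exp(fit.params))      # exponentiated coefficients = relative risks
print(np.exp(fit.conf_int()))  # 95% CIs on the RR scale
```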
We investigated the population dynamics of the spider crab Libinia ferreirae, focusing on the frequency distribution of individuals in size classes, the sex ratio and the influence of environmental variables (temperature, salinity, and texture and organic matter content of the sediment) on reproduction and recruitment. Monthly collections were made in the Ubatuba region from January 1998 to December 2000. A total of 222 individuals were collected, including 123 juveniles (males and females), 43 adult males, 25 non-ovigerous adult females and 31 ovigerous females. Unlike in most adult brachyurans, there was no significant size difference between the sexes, and sexual dimorphism seems to be a variable characteristic in this crab genus. The reproductive period and recruitment were continuous, with peaks that could be related to water mass dynamics and higher food availability in the Ubatuba region. In addition, our results increase knowledge about part of the life cycle of L. ferreirae, which could be useful for comparative studies.
The objectives of this study were to evaluate the cross-cultural measurement equivalence of the Healthy Eating Index (HEI) for children aged 1–2 years and to analyse the diet quality of preterm infants. This was a cross-sectional study of 106 premature infants attending two specialised outpatient clinics of university hospitals. Diet quality was analysed through an HEI adapted to the dietary recommendations for Brazilian children aged 1–2 years. Food consumption was measured by 24-h recalls. The reliability of the instrument was evaluated by internal consistency analysis and inter-observer reliability, using Cronbach's α coefficient and κ with quadratic weighting. Construct validity was evaluated by principal component analysis and by Spearman's correlation coefficient with total energy and the consumption of selected food groups. Diet quality was considered adequate when the total HEI score was over 80 points. Cronbach's α was 0·54. Regarding inter-observer reliability, ten items showed strong agreement (κ > 0·8). The item scores had low correlations with energy consumed (r ≤ 0·30), and fruit (r 0·67), meat (r 0·60) and variety of diet (r 0·57) showed positive, moderate correlations with total scores. When analysing overall diet quality, most patients' diets needed improvement (median 78·7 points), which can be attributed to low total vegetable intake and the presence of ultra-processed foods in the diet. The instrument showed promising psychometric properties for evaluating diet quality in children aged 1–2 years.
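Cronbach's α, used above to gauge internal consistency, follows a standard formula; a small self-contained computation on toy item scores (the simulated shared factor is ours):

```python
import numpy as np

def cronbach_alpha(items: np.ndarray) -> float:
    """items: rows = subjects, columns = HEI component scores."""
    k = items.shape[1]
    item_var_sum = items.var(axis=0, ddof=1).sum()  # sum of per-item variances
    total_var = items.sum(axis=1).var(ddof=1)       # variance of the total score
    return (k / (k - 1)) * (1 - item_var_sum / total_var)

rng = np.random.default_rng(3)
shared = rng.normal(size=(106, 1))                   # common 'diet quality' factor
scores = rng.normal(5, 2, size=(106, 12)) + shared   # 12 toy HEI components
print(round(cronbach_alpha(scores), 2))
```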
Governance in higher education has been described as ambiguous, elusive, and abstract. Both the concept and the practice of governance are recognized as contested, given tensions between different levels of authority and constituency interests: lay or state, academic or institutional, faculty or students. We focus on developments in public and private higher education to illuminate potentially contradictory trends of convergence and divergence in emerging governance arrangements. The chapter draws on a range of disciplinary and theoretical perspectives to interpret current governance arrangements in the field of higher education and to highlight gaps in our understanding. The first section addresses the changing landscape of higher education and public–private distinctions in particular. The second focuses on governance arrangements in the arenas of public and private higher education and at the levels of system and institutional governance. The third section discusses theories of governance and their application to public and private higher education domains. The conclusion draws the analyses together, noting gaps and pointing to directions for further research.
Trypanosoma cruzi has three biochemically and morphologically distinct developmental stages that are programmed to respond rapidly to environmental changes the parasite faces during its life cycle. Unlike those of other eukaryotes, trypanosomatid genomes contain protein-coding genes that are transcribed into polycistronic pre-mRNAs and have their expression controlled by post-transcriptional mechanisms. Transcriptome analyses comparing three stages of the T. cruzi life cycle revealed changes in gene expression that reflect the parasite's adaptation to distinct environments. Several genes encoding RNA-binding proteins (RBPs), known to act as key post-transcriptional regulatory factors, were also differentially expressed. We characterized one T. cruzi RBP, named TcZC3H12, which contains a zinc finger domain and is up-regulated in epimastigotes compared with trypomastigotes and amastigotes. TcZC3H12 knockout (KO) epimastigotes showed decreased growth rates and an increased capacity to differentiate into metacyclic trypomastigotes. Transcriptome analyses comparing wild-type and TcZC3H12 KO parasites revealed a TcZC3H12-dependent expression of epimastigote-specific genes, such as genes encoding amino acid transporters and proteins associated with differentiation (PADs). RNA immunoprecipitation assays showed that transcripts from the PAD family interact with TcZC3H12. Taken together, these findings suggest that TcZC3H12 positively regulates the expression of genes involved in epimastigote proliferation and also acts as a negative regulator of metacyclogenesis.
The aim of the current study was to identify and describe the meal and snack patterns (breakfast, mid-morning snack, lunch, mid-afternoon snack, dinner and evening snack) of public schoolchildren.
Design:
Cross-sectional study. Information on the previous day's food intake was obtained through the Web-CAAFE (Food Intake and Physical Activity of Schoolchildren), an interactive questionnaire that divides daily food consumption into three meals (breakfast, lunch and dinner) and three snacks (mid-morning, mid-afternoon and evening). Each meal section lists thirty-one food items, and the schoolchildren clicked on the items they had consumed at each meal. Factor analysis was used to identify meal and snack patterns. The descriptions of the dietary patterns (DP) were based on food items with factor loadings ≥ 0·30, which were considered representative of each DP (see the code sketch after this abstract).
Setting:
Schoolchildren, Florianopolis, Brazil.
Participants:
Children (n 1074) aged 7–13 years.
Results:
Lunch was the most consumed meal (96·0 %), followed by dinner (86·4 %), breakfast (85·3 %) and mid-afternoon snack (81·7 %). Four DP were identified for breakfast, mid-morning snack, lunch, dinner and evening snack, and three for mid-afternoon snack. Breakfast, lunch and dinner patterns included traditional Brazilian foods. DP consisting of fast foods and sugary beverages were also observed, mainly for the evening snack.
Conclusions:
The results of the current study provide important information regarding the meal and snack patterns of schoolchildren to guide the development of nutrition interventions in public health.
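As a sketch of the pattern-extraction step described in the Design section above, the snippet below runs a factor analysis on binary food-item indicators and keeps items with loadings ≥ 0·30, as the abstract specifies. The data are random placeholders, so the retained item lists are illustrative only (and may be empty with unstructured random data):

```python
import numpy as np
from sklearn.decomposition import FactorAnalysis

rng = np.random.default_rng(4)
items = rng.integers(0, 2, size=(1074, 31)).astype(float)  # 31 food items per meal

# rotation="varimax" mirrors common dietary-pattern practice
fa = FactorAnalysis(n_components=4, rotation="varimax", random_state=0).fit(items)
loadings = fa.components_.T                                # items x factors
for f in range(loadings.shape[1]):
    kept = np.where(np.abs(loadings[:, f]) >= 0.30)[0]
    print(f"pattern {f + 1}: food items {kept.tolist()}")
```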
The aim of the study was to assess the inflammatory potential of the Brazilian population's diet and its association with demographic, socio-economic and anthropometric characteristics. A cross-sectional study was performed with 34 003 individuals aged 10 years and older, evaluated by the National Diet and Nutrition Survey from the Consumer Expenditure Survey (POF 2008–2009). The Energy-adjusted Dietary Inflammatory Index (E-DII™) was determined using thirty-four dietary parameters calculated through non-consecutive 2-d dietary records. Positive scores indicate a pro-inflammatory diet, while negative scores indicate an anti-inflammatory diet. Bivariate and multivariate linear regression analyses based on a hierarchical theoretical model were performed to verify the factors associated with the E-DII. The mean E-DII was 1·04 (range −4·77 to +5·98). The highest pro-inflammatory E-DII values were found among adolescents (1·42; P < 0·001) and individuals with higher income (1·10; P < 0·001) and level of education (1·18; P < 0·001). In the final model, the E-DII was associated with higher income quartiles and was higher in the Northeast and South regions, in white people, in individuals with ≥9 years of education and in the adult and adolescent age groups. The Brazilian population consumes a diet with high inflammatory potential, especially adolescents, white people and those with higher income and level of education. Thus, the index was unevenly distributed among the population, singling out groups with higher dietary inflammatory potential. The socio-economic risk profile of a diet with higher inflammatory potential in middle-income countries differs from what is observed in high-income nations.
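A hedged sketch of how a Dietary Inflammatory Index score of the kind used above is typically computed: each intake is z-scored against a global reference, mapped to a centred percentile and weighted by the parameter's inflammatory effect score. The reference values and weights below are placeholders, not the published E-DII reference database:

```python
import numpy as np
from scipy.stats import norm

def dii(intake, ref_mean, ref_sd, effect):
    """Z-score each parameter, map to a centred percentile, weight and sum."""
    z = (np.asarray(intake) - ref_mean) / ref_sd
    centred = 2 * norm.cdf(z) - 1            # maps z onto (-1, 1)
    return float((centred * effect).sum())   # positive = pro-inflammatory

intake   = np.array([12.0, 0.5, 60.0])          # e.g., fibre, omega-3, vitamin C
ref_mean = np.array([18.8, 1.06, 118.2])        # placeholder global means
ref_sd   = np.array([4.9, 1.06, 43.46])         # placeholder global SDs
effect   = np.array([-0.663, -0.436, -0.424])   # anti-inflammatory weights
print(round(dii(intake, ref_mean, ref_sd, effect), 2))  # low intakes -> positive score
```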
The coronavirus disease 2019 (COVID-19) outbreak in China rapidly spread throughout the world, becoming a threatening pandemic with unprecedented consequences. Mobile technologies have led to a revolution in health care, and their applicability in the context of COVID-19 is promising. In this commentary, we provide an overview of the role that mobile technologies play in the COVID-19 pandemic context and discuss the main associated issues. Four main domains stand out: health communication, prevention, support and research. Strengthening local surveillance systems, geographic contact tracing, support for clinical practice and collection of real-time longitudinal data at the population level are some of the main advantages of the applications reported so far. The potential conflict with data privacy urges a discussion on their responsible use. Along with fair regulation and close monitoring of data collection and processing, data anonymisation must be a standard and personal data must be deleted after use. Preparation is key for effective containment of a public health crisis, and learning lessons on the role of mobile technologies is useful for future challenges in global health. Notably, their use must be driven by an equitable and inclusive orientation, and it must be integrated into an articulated policy to respond to the crisis.
Background: Interest in identifying risk factors for infection in surgical patients with lower-limb fractures and blood transfusions has increased in recent years. Surgical site infections (SSIs) increase hospitalization, care costs, and patient suffering. Corrective surgery for lower-limb fractures with blood transfusion is quite common among surgical procedures. The aim of this study was to describe the relationship between blood transfusion and SSI in patients undergoing orthopedic surgery on lower limbs. Methods: We conducted a prospective cohort study to identify risk factors for SSI in transfused patients undergoing fracture repair in lower-limb surgeries between February 2017 and May 2019 in 2 reference tertiary-care hospitals in Belo Horizonte, a city of 3 million people in Brazil. Data regarding patient characteristics, surgical procedures, blood transfusions, and surgical infections were collected. Patients were characterized by calculating absolute and relative frequencies for categorical variables and the mean, median, minimum, maximum, standard deviation, and coefficient of variation for quantitative variables. The incidence of surgical site infection, the risk of postoperative in-hospital death, and the total length of hospital stay were calculated by point estimates and 95% confidence intervals, identified by bilateral statistical hypothesis tests at a significance level of 5%. A multivariate analysis (logistic regression) was performed to identify SSI risk factors. Results: Patients who had an indication for blood transfusion (n = 38) but who did not receive blood (n = 4) had a significantly lower hemoglobin, comparing discharge with admission, than the group who received blood. Intraoperative transfusion was a risk factor for SSI (OR, 4.7) (Fig. 1). Among the 205 patients with no indication for transfusion, 98 received blood anyway: there was no difference in hemoglobin outcome when discharge and admission were compared, so these 98 patients were exposed to unnecessary risk. Regarding restrictive versus liberal transfusion strategies, there were differences in age (P < .001), duration of surgery (P = .003), number of comorbidities (P < .001), body mass index (BMI) (P = .027), previous hemoglobin (P < .001), and high hemoglobin (P < .001), depending on the transfusion practice employed (Fig. 2). Conclusions: Clear indications for transfusion, defined protocols, and careful evaluation of blood transfusion are critical to avoid infectious complications in orthopedic patients with lower-limb fractures.
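A sketch of the multivariate step named above: logistic regression on a binary SSI outcome, with the odds ratio recovered as an exponentiated coefficient. The covariates and simulated data are illustrative (the true effect is set so the OR lands near the reported 4.7):

```python
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(5)
n = 500
transfusion = rng.binomial(1, 0.3, size=n).astype(float)  # intraoperative transfusion
age_z = rng.normal(size=n)                                # standardised age
X = sm.add_constant(np.column_stack([transfusion, age_z]))
p = 1 / (1 + np.exp(-(-2.5 + 1.55 * transfusion)))        # true OR = exp(1.55) ~ 4.7
y = rng.binomial(1, p)                                    # SSI indicator

fit = sm.Logit(y, X).fit(disp=False)
print(np.exp(fit.params[1]))  # odds ratio for intraoperative transfusion
```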
Background: Bloodstream infection (BSI) is among the most challenging conditions in patients who undergo hematopoietic stem cell transplantation (HSCT). These infections may be related to health care, in cases of central-line–associated bloodstream infection (CLABSI), or to translocation secondary to mucosal barrier injury (MBI). In 2013, MBI surveillance was incorporated into the CDC NHSN, with the aim of increasing CLABSI diagnostic accuracy and enabling more effective preventive care measures. The objective of this study was to evaluate the impact of MBI surveillance on CLABSI incidence density in a Brazilian university hospital. Methods: The CLABSI incidence densities from the period before MBI surveillance (2007–2012) and the period after MBI surveillance was implemented (2013–2018) were analyzed and compared. Infections during the preintervention period were reclassified according to the MBI criterion to obtain an accurate CLABSI rate for the first period. The average incidence densities for the 2 periods were compared using the Student t test after testing for no autocorrelation (P > .05). Results: After reclassification, the preintervention period incidence density (10 infections per 1,000 patient days) was significantly higher than the postintervention period incidence density (6 infections per 1,000 patient days; P = .011) (Table 1). Therefore, the reclassification of nonpreventable infections (MBI) in the surveillance system made the diagnosis of CLABSI more specific. The hospital infection control service was able to introduce specific preventive measures related to the insertion and management of central lines in HSCT patient care. Conclusions: The MBI classification improved CLABSI diagnosis, which sharpened central-line prevention measures and thereby contributed to the decrease in CLABSI rates in this high-risk population.
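A minimal sketch of the before/after comparison described above: monthly CLABSI incidence densities (infections per 1,000 patient days) compared with an unpaired Student t test. The monthly series are simulated placeholders centred on the reported period means:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(6)
pre = rng.normal(10, 2.5, size=72)   # 2007-2012 monthly densities, reclassified
post = rng.normal(6, 2.5, size=72)   # 2013-2018 monthly densities

t, p = stats.ttest_ind(pre, post)    # assumes no autocorrelation, as tested above
print(f"pre = {pre.mean():.1f}, post = {post.mean():.1f} per 1,000 patient days; P = {p:.3f}")
```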
Background: In 5 hospitals in Belo Horizonte (population, 3 million) between July 2016 and June 2018, a survey was performed regarding surgical site infection (SSI). We statistically evaluated SSI incidents and optimized the power to predict SSI through pattern-recognition algorithms based on support vector machines (SVMs). Methods: Data were collected on SSIs at 5 different hospitals. The hospital infection control committees (CCIHs) of the hospitals collected all data used in the analysis during their routine SSI surveillance procedures; these data were sent to the NOIS (Nosocomial Infection Study) Project. NOIS uses SACIH software (an automated hospital infection control system) to collect data from hospitals that participate voluntarily in the project. In the NOIS, 3 procedures were performed: (1) cleaning of the collected database so that only complete records were used; (2) a statistical analysis of the profile of the participating hospitals; and (3) an assessment of the predictive power of SVMs with nonlinear separation, varying the configuration across kernel functions (Laplace, radial basis, hyperbolic tangent, and Bessel) and k-fold cross-validation resampling (k = 3, 5, 6, 7, and 10). The configurations were compared by measuring the area under the curve (AUC; range, 0–1) for each one. Results: From 13,383 records, 7,565 were usable, and SSI incidence was 2.0%. Most patients were aged 35–62 years; the average duration of surgery was 101 minutes, but 76% of surgeries lasted >2 hours. The mean hospital length of stay without SSI was 4 days versus 17 days for the SSI cases. The survey data showed that even with a low number of SSI cases, the prediction rate for this specific surgery was 0.74, which was 14% higher than the rate reported in the literature. Conclusions: Despite the high noise index of the database, it was possible to sample relevant data for the evaluation of general surgery patients. For the predictive process, our results were >0.50 and were 14% better than those reported in the literature. However, the database requires more SSI case samples because, with only 2% positive samples, the database is unbalanced. To optimize data collection and to enable other hospitals to use the SSI prediction tool, a mobile application was developed (available at www.sacihweb.com).
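A sketch of the predictive step described above: SVMs scored by k-fold cross-validated AUC. scikit-learn ships the radial-basis and sigmoid (hyperbolic tangent) kernels; a Laplace kernel can be passed as a callable, while a Bessel kernel would need a custom implementation and is omitted here. The data are simulated stand-ins for the surveillance records:

```python
import numpy as np
from sklearn.svm import SVC
from sklearn.model_selection import cross_val_score
from sklearn.metrics.pairwise import laplacian_kernel

rng = np.random.default_rng(7)
X = rng.normal(size=(1000, 8))                                    # surgery/patient features
y = (X[:, 0] + rng.normal(scale=2.0, size=1000) > 2).astype(int)  # noisy SSI label

for name, kernel in [("radial basis", "rbf"),
                     ("hyperbolic tangent", "sigmoid"),
                     ("Laplace", laplacian_kernel)]:  # callable Gram-matrix kernel
    auc = cross_val_score(SVC(kernel=kernel), X, y, cv=5, scoring="roc_auc")
    print(f"{name}: mean AUC = {auc.mean():.2f}")
```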
Background: Acute viral bronchiolitis caused by respiratory syncytial virus (RSV) may manifest with high severity in neonatal ICU (NICU) patients, with a high risk of in-hospital cross transmission and outbreaks. During the epidemic seasonal period, intense viral circulation occurs in the community; thus, transmission in the NICU is difficult to control. Objective: We describe an outbreak that occurred in a NICU in a public hospital in São Paulo state, Brazil. We also discuss the role of admitting external newborns with community-acquired virus in the incidence of these outbreaks in the NICU. Methods: In 2017 in Campinas, an RSV epidemic occurred during the seasonal period, resulting in an outbreak at the Campinas maternity hospital. A retrospective investigation was performed, and patients were analyzed for clinical and epidemiological characteristics and for risk factors for poor prognosis. We included neonates admitted to the NICU with positive nasal lavage for RSV from April to July 2017. Statistical analyses were performed with the χ2 test for categorical variables and the Student t test for continuous variables, comparing newborns infected in the community (external) with newborns infected in the hospital (internal). P < .05 was considered significant. Results: Of 44 neonates with RSV during this period, 32 were external and 12 were internal (Fig. 1). The mean gestational age of the external neonates was 38 weeks and 2 days, whereas the mean gestational age of the internal neonates was 29 weeks and 1 day (P < .001). The hospitalization time was longer in the internal group (P < .001; Table 1). One death associated with infection occurred in the internal group. Community neonates (external group) were mostly term-born, with no comorbidities, and they had a more favorable clinical course. In the literature, neonates infected with RSV at the hospital have several risk factors for poor prognosis, with a 13.5% mortality rate. Discussion: RSV outbreaks have great relevance in hospital settings, especially in the NICU, where there are a large number of vulnerable patients and a high risk of in-hospital cross transmission. Neonates infected with RSV at the hospital have several risk factors for poor prognosis, including high mortality. Therefore, it is important to discuss the exposure of this population to community-based infectious agents, mainly viral, and the risk of admitting patients from the community to the NICU.
Background: Respiratory syncytial virus (RSV) and influenza virus (flu) contribute substantially to the overall burden of severe respiratory tract infection in children. However, molecular etiological diagnostic methods for viral infection are still insufficiently accessible in public hospitals. Rapid immunochromatographic tests can add important information at the point of care, including antiviral or antibiotic indication, viral , and effective precaution measures to prevent outbreaks. The aim of this study was to evaluate this impact for pediatric patients under 5 years of age in our hospital. Methods: We conducted a retrospective, observational study of the clinical outcomes of children under 5 years of age hospitalized from 2013 to 2018 for viral respiratory disease who had positive RSV and/or flu immunochromatographic rapid test results. Results: In total, we identified 221 cases: RSV, 193; flu, 6; codetections, 19 (Table 1). The mortality rate was 1.8% (2 cases), and 88% of our patients were <1 year of age. Variables significantly associated with orotracheal intubation, the most intensive intervention, were younger age in months, comorbidities, RSV and flu codetection, and bacterial pneumonia diagnosis during hospitalization. Conclusions: In the multivariate analysis, RSV and flu codetection was associated with the least favorable clinical prognoses. Rapid test diagnosis may provide important information at the point of care, since molecular panels are not yet widely accessible in public hospitals. Hence, we believe that immunochromatographic rapid tests represent a valuable and feasible diagnostic alternative, facilitating timely evaluation and treatment.
Nutritional status (NS) monitoring is an essential step of the nutrition care process. To assess changes in NS throughout hospitalisation and their ability to predict clinical outcomes, a prospective cohort study of patients over 18 years of age was conducted. The Subjective Global Assessment (SGA) was performed within 48 h of admission and 7 d later. For each patient, decline in NS was assessed by two different methods: change in SGA category and severe weight loss alone (≥2 % during the first week of hospitalisation). Patients were followed up until discharge to assess length of hospital stay (LOS) and in-hospital mortality and contacted 6 months post-discharge to assess hospital readmission and death. Of the 601 patients assessed at admission, 299 remained hospitalised for at least 7 d; of those, 16·1 % had a decline in SGA category and 22·8 % had severe weight loss alone. In multivariable analysis, decline in SGA category was associated with 2-fold (95 % CI 1·06, 4·21) increased odds of prolonged LOS and 3·6-fold (95 % CI 1·05, 12·26) increased odds of hospital readmission at 6 months. Severe weight loss alone was associated with 2·5-fold increased odds (95 % CI 1·40, 4·64) of prolonged LOS. In conclusion, deterioration of NS was more often identified by severe weight loss than by decline in SGA category. While both methods were associated with prolonged LOS, only change in SGA category predicted hospital readmission. These findings reinforce the importance of nutritional monitoring and provide guidance for further research to prevent short-term NS deterioration from going undetected.
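A minimal sketch of the two decline definitions used above, assuming the standard SGA categories (A = well nourished, B = moderately malnourished, C = severely malnourished); function names are ours:

```python
def declined_sga(sga_admission: str, sga_day7: str) -> bool:
    """Decline = a worse SGA category at day 7 than at admission."""
    order = {"A": 0, "B": 1, "C": 2}
    return order[sga_day7] > order[sga_admission]

def severe_weight_loss(weight_admission_kg: float, weight_day7_kg: float) -> bool:
    """Severe weight loss alone: >= 2% of admission weight in the first week."""
    return (weight_admission_kg - weight_day7_kg) / weight_admission_kg >= 0.02

print(declined_sga("A", "B"), severe_weight_loss(70.0, 68.2))  # True True
```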
We conducted a quasi-experimental study to evaluate a bundle to prevent nonventilator hospital-acquired pneumonia (NV-HAP) in patients on enteral tube feeding. After the intervention, bundle compliance increased from 55.9% to 70.5% (P < .01) and overall NV-HAP rates decreased significantly (by 34%), from 5.71 to 3.77 per 1,000 admissions.
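A quick check of the arithmetic above, confirming the relative reduction implied by the pre- and post-intervention rates:

```python
pre, post = 5.71, 3.77  # NV-HAP per 1,000 admissions, before/after the bundle
print(f"{(pre - post) / pre:.0%} relative decrease")  # -> 34% relative decrease
```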