This SHEA white paper identifies knowledge gaps and challenges in healthcare epidemiology research related to COVID-19 with a focus on core principles of healthcare epidemiology. These gaps, revealed during the worst phases of the COVID-19 pandemic, are described in 10 sections: epidemiology, outbreak investigation, surveillance, isolation precaution practices, personal protective equipment (PPE), environmental contamination and disinfection, drug and supply shortages, antimicrobial stewardship, healthcare personnel (HCP) occupational safety, and return-to-work policies. Each section highlights three critical healthcare epidemiology research questions, with detailed descriptions provided in supplemental materials. This research agenda calls for translational studies, from laboratory-based basic science research to well-designed, large-scale studies and health outcomes research. Research gaps and challenges related to nursing homes and social disparities are included. Collaborations across disciplines, areas of expertise and diverse geographic locations will be critical.
We aimed to assess the validity of maternal recall of exclusive breastfeeding (EBF) at 3 months obtained 12 months after childbirth.
A population-based birth cohort study. The gold standard was maternal report of EBF at the age of 3 months (yes or no) and of the age of introduction of other foods in the infant’s diet. EBF was considered present when the mother reported that no liquid, semi-solid or solid food had been introduced up to that point. The variable to be validated was obtained 12 months after childbirth, when the mother was asked about the age of food introduction. The prevalence of EBF at 3 months, and the sensitivity, specificity, positive (PPV) and negative predictive values (NPV), and accuracy of 12-month recall, with 95 % CI, were calculated.
3700 mothers of participants of the Pelotas 2004 Birth Cohort.
The prevalence of EBF at 3 months was 27·8 % (95 % CI 26·4, 29·3) and 49·0 % (95 % CI 47·4, 50·6) according to gold standard and maternal recall, respectively. The sensitivity of maternal recall at 12 months was 98·3 % (95 % CI 97·4, 99·0), specificity 70·0 % (95 % CI 68·2, 71·7), PPV 55·8 % (95 % CI 53·4, 58·1), NPV 99·1 % (95 % CI 98·6, 99·5) and accuracy 77·9 % (95 % CI 76·6, 79·2). When the analyses were stratified by maternal and infant characteristics, the sensitivity remained around 98 %, and the specificity ranged from 64·4 to 81·8 %.
EBF recalled at the end of the infant’s first year of life is a valid measure for use in epidemiological investigations.
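The validity measures reported above all follow from a 2×2 table of recalled versus gold-standard EBF classification. A minimal sketch of the calculation is below; the cell counts are illustrative values back-calculated to be roughly consistent with the reported percentages, not the study's actual data:

```python
def diagnostic_metrics(tp, fp, fn, tn):
    """Validity of a recalled classification against a gold standard,
    from the four cells of a 2x2 table."""
    sensitivity = tp / (tp + fn)   # recalled EBF among true EBF
    specificity = tn / (tn + fp)   # recalled non-EBF among true non-EBF
    ppv = tp / (tp + fp)           # true EBF among recalled EBF
    npv = tn / (tn + fn)           # true non-EBF among recalled non-EBF
    accuracy = (tp + tn) / (tp + fp + fn + tn)
    return sensitivity, specificity, ppv, npv, accuracy

# Illustrative counts for n = 3700 with ~27.8 % gold-standard EBF prevalence
sens, spec, ppv, npv, acc = diagnostic_metrics(tp=1011, fp=802, fn=18, tn=1869)
```

With these counts, sensitivity is high (~98 %) while the PPV is modest (~56 %), mirroring the pattern in the abstract: over-reporting of EBF at 12 months inflates recalled prevalence even though true EBF is rarely missed.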
Postoperative cognitive impairment is among the most common medical complications associated with surgical interventions – particularly in elderly patients. In our aging society, there is an urgent medical need for preoperative individual risk prediction to allow more accurate cost–benefit decisions prior to elective surgeries. So far, risk prediction has been based mainly on clinical parameters. However, these parameters give only a rough estimate of the individual risk. At present, there are no molecular or neuroimaging biomarkers available to improve risk prediction, and little is known about the etiology and pathophysiology of this clinical condition. In this short review, we summarize the current state of knowledge and briefly present the recently started BioCog project (Biomarker Development for Postoperative Cognitive Impairment in the Elderly), which is funded by the European Union. It is the goal of this research and development (R&D) project, which involves academic and industry partners throughout Europe, to deliver a multivariate algorithm based on clinical assessments as well as molecular and neuroimaging biomarkers to overcome the currently unsatisfactory situation.
Hospital environmental surfaces are frequently contaminated by microorganisms. However, the role of environmental bacterial contamination as a source of transmission is still debated. This prospective study was performed to characterize the nature of multidrug-resistant organism (MDRO) transmission between the environment and patients using standard microbiological and molecular techniques.
Prospective cohort study at 2 academic medical centers.
A prospective multicenter study to characterize the nature of bacterial transfer events between patients and environmental surfaces in rooms that previously housed patients with 1 of 4 ‘marker’ MDROs: methicillin-resistant Staphylococcus aureus, vancomycin-resistant enterococci, Clostridium difficile, and MDR Acinetobacter baumannii. Environmental and patient microbiological samples were obtained on admission into a freshly disinfected inpatient room. Repeat samples from room surfaces and patients were taken on days 3 and 7 and each subsequent week that the patient stayed in the same room. The bacterial identity, antibiotic susceptibility, and molecular sequences were compared between organisms found in the environmental samples and patient sources.
We enrolled 80 patient–room admissions; 9 of these patients (11.3%) were asymptomatically colonized with MDROs at study entry. Hospital room surfaces were contaminated with MDROs despite terminal disinfection in 44 cases (55%). Microbiological Bacterial Transfer events either to the patient, the environment, or both occurred in 12 patient encounters (18.5%) from the microbiologically evaluable cohort.
Microbiological Bacterial Transfer events between patients and the environment were observed in 18.5% of patient encounters and occurred early in the admission. This study suggests that research on prevention methods beyond the standard practice of room disinfection at the end of a patient’s stay is needed to better prevent acquisition of MDROs through the environment.
Introduction: Survival from cardiac arrest has been linked to the quality of resuscitation care. Unfortunately, healthcare providers frequently underperform in these critical scenarios, with a well-documented deterioration in skills weeks to months following advanced life support courses. Improving initial training and preventing decay in knowledge and skills are priorities in resuscitation education. The spacing effect has repeatedly been shown to have an impact on learning and retention. Despite its potential advantages, the spacing effect has seldom been applied to organized educational training or complex motor skill learning, where it has the potential to make a significant impact. The purpose of this study was to determine if a resuscitation course taught in a spaced format, compared with the usual massed instruction, results in improved retention of procedural skills. Methods: EMS providers (paramedics and emergency medical technicians (EMTs)) were block randomized to receive a Pediatric Advanced Life Support (PALS) course in either a spaced format (four 210-minute weekly sessions) or a massed format (two sequential 7-hour days). Blinded observers used expert-developed 4-point global rating scales to assess video recordings of each learner performing various resuscitation skills before, after and 3 months following course completion. Primary outcomes were performance on infant bag-valve-mask ventilation (BVMV), intraosseous (IO) insertion, infant intubation, and infant and adult chest compressions. Results: Forty-eight of 50 participants completed the study protocol (26 spaced and 22 massed). There was no significant difference between the two groups on testing before and immediately after the course.
Three months following course completion, participants in the spaced cohort scored higher overall for BVMV (2.2 ± 0.13 versus 1.8 ± 0.14, p=0.012), without statistically significant differences in scores for IO insertion (3.0 ± 0.13 versus 2.7 ± 0.13, p=0.052), intubation (2.7 ± 0.13 versus 2.5 ± 0.14, p=0.249), infant compressions (2.5 ± 0.28 versus 2.5 ± 0.31, p=0.831) and adult compressions (2.3 ± 0.24 versus 2.2 ± 0.26, p=0.728). Conclusion: Procedural skills taught in a spaced format result in learning at least as good as that from the traditional massed format; more complex skills taught in a spaced format may result in better long-term retention when compared with traditional massed training, as there was a clear difference in BVMV and a trend toward a difference in IO insertion.
Two experiments were conducted to investigate the use of sorghum, cottonseed meal and millet in broiler diets and their interactions when used simultaneously. In Experiment 1, a corn-soybean meal control diet was compared with eight experimental treatments based on low-tannin sorghum (S30, S45 and S60), cottonseed meal (CM15, CM40) or both ingredients included in the same diet (S30/CM40, S45/CM25 and S60/CM15). Results showed that BW gain was not affected by the inclusion of sorghum or cottonseed meal. However, feed intake tended to be affected by the cereal type, with the highest values in sorghum-based diets. Feed conversion ratio increased (P<0.001) with sorghum-based diets compared with the control diet, whereas a combination of cottonseed meal and sorghum in the same diet did not affect the feed conversion ratio. Significant differences (P<0.001) were observed in apparent ileal digestibility (%) of protein and energy, with the cottonseed meal and sorghum/cottonseed meal-based diets having lower protein and energy digestibility compared with corn-based diets. In Experiment 2, a control diet was compared with six diets in which corn was substituted at 60%, 80% or 100% by either sorghum or millet and three other diets with simultaneous inclusion of these two ingredients (S30/M30, S40/M40, S50/M50). Single or combined inclusion of sorghum and millet resulted in feed intake and growth performance similar to the control diet. Apparent ileal digestibility of protein and energy was higher with millet-based diets (P<0.001). Total tract digestibility of protein in sorghum- and millet-based diets tended to decrease linearly with the increasing level of substitution. Sorghum-based diets resulted in lower total tract digestibility of fat compared with millet- and sorghum/millet-based diets (P<0.001). Higher total tract digestibility of starch was obtained with the control diet and millet-based diets compared with the sorghum-based treatments.
Results of the two experiments suggest that broiler growth performance was not affected by the dietary level of sorghum, millet or cottonseed meal. Nutrient digestion can, however, be affected by these feed ingredients.
In humans, maximum brain development occurs between the third trimester of gestation and 2 years of life. Nutrition during these critical windows of rapid brain development might be essential for later cognitive functioning and behaviour. In the last few years, recommended protein intakes during infancy and childhood have tended to be lower than in the past. It remains to be demonstrated that lower protein intakes among healthy infants, apart from being able to reduce obesity risk, are safe in terms of mental performance achievement. We performed secondary analyses of the EU CHOP trial, a clinical trial in which infants from five European countries were randomised to be fed a higher or a lower protein content formula during the 1st year of life. Children were assessed at the age of 8 years with a neuropsychological battery of tests that included assessments of memory (visual and verbal), attention (visual, selective, focused and sustained), visual-perceptual integration, processing speed, visual-motor coordination, verbal fluency and comprehension, impulsivity/inhibition, flexibility/shifting, working memory, reasoning, visual-spatial skills and decision making. Internalising, externalising and total behaviour problems were assessed using the Child Behaviour Checklist 4–18. Adjusted analyses considering factors that could influence neurodevelopment, such as parental education level, maternal smoking, child’s gestational age at birth and head circumference, showed no differences between feeding groups in any of the assessed neuropsychological domains or in behaviour. In summary, we report on the safety of a lower protein content in infant formulae (closer to the content of human milk) with respect to long-term mental performance.
An efficient and robust method to measure vitamin D (25-hydroxyvitamin D3 (25(OH)D3) and 25-hydroxyvitamin D2) in dried blood spots (DBS) has been developed and applied in the pan-European multi-centre, internet-based, personalised nutrition intervention study Food4Me. The method includes calibration with blood containing endogenous 25(OH)D3, spotted as DBS and corrected for haematocrit content. The methodology was validated following international standards. The performance characteristics did not reach those of the current gold standard, liquid chromatography-MS/MS in plasma, for all parameters, but were found to be very suitable for status-level determination under field conditions. DBS sample quality was very high, and 3778 measurements of 25(OH)D3 were obtained from 1465 participants. The study centre and the season within the study centre were very good predictors of 25(OH)D3 levels (P<0·001 for each case). Seasonal effects were modelled by fitting a sine function with a minimum 25(OH)D3 level on 20 January and a maximum on 21 July. The seasonal amplitude varied from centre to centre. The largest difference between winter and summer levels was found in Germany and the smallest in Poland. The model was cross-validated to determine the consistency of the predictions and the performance of the DBS method. Pearson’s correlation between the measured and the predicted values was r 0·65, and the sd of their differences was 21·2 nmol/l. This includes the analytical variation and the biological variation within subjects. Overall, DBS obtained by unsupervised sampling of the participants at home was a viable methodology for obtaining vitamin D status information in a large nutritional study.
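The seasonal model described (a sinusoid with its minimum on 20 January and its maximum on 21 July) can be sketched as below. The mean level and amplitude used here are hypothetical placeholders, since the fitted values varied from centre to centre:

```python
import math

def predicted_25ohd(day_of_year, mean_level, amplitude):
    """Seasonal 25(OH)D3 model: a cosine with its minimum at day 20
    (20 January) and its maximum roughly half a year later (21 July)."""
    return mean_level - amplitude * math.cos(
        2 * math.pi * (day_of_year - 20) / 365.0)

# Hypothetical centre: mean 50 nmol/l, seasonal amplitude 15 nmol/l
winter = predicted_25ohd(20, 50.0, 15.0)    # 20 January -> minimum, 35 nmol/l
summer = predicted_25ohd(202, 50.0, 15.0)   # 21 July    -> near maximum
```

Phasing the cosine so its trough lands on day 20 automatically places the peak about 182.5 days later, matching the 21 July maximum in the fitted model; only the per-centre mean and amplitude would need to be estimated from data.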
A meta-analysis was conducted (i) to evaluate broiler response to partial or total substitution of corn by sorghum and millet and (ii) to determine the effect of soybean meal replacement by cottonseed meal in broiler diets. The database included 190 treatments from 29 experiments published from 1990 to 2013. Bird responses to an experimental diet were calculated relative to the control (Experimental−Control) and were submitted to mixed-effect models. Results showed that diets containing millet led to performance similar to the corn-based ones for all parameters, whereas sorghum-based diets decreased growth performance. No major effect of the level of substitution was observed with millet or cottonseed meal. No effect of the level of substitution of sorghum on feed intake was found; however, growth performance decreased as the level of substitution of corn by sorghum increased. Cottonseed meal was substituted for soybean meal at up to 40% and was found to increase feed intake while reducing growth performance. Young birds were not more sensitive to these ingredients than older birds, since there was no negative effect of these ingredients on performance in the starter phase. The results obtained for sorghum point to the need for technological improvements that will increase the utilization of these feedstuffs in broiler diets. Additional work is scheduled to validate these statistical results in vivo and to evaluate the interactions induced by the simultaneous inclusion of sorghum, millet and cottonseed meal in broiler feeding.
Nanosphere lithography (NSL) is a technique capable of creating large-area arrays of small objects with tailor-made shapes. Here we present an algorithm that simulates the shape and morphology of nanoparticles produced via NSL in combination with physical vapor deposition from variable angles. The key idea is based on a ray-tracing technique. Mask clogging effects have a major influence on the shape of the resulting nano-objects and are therefore taken into account. In addition, we implemented a metaball concept for the precise description of thermally modified masks. The calculated results are compared with representative atomic force microscopy (AFM) data of experimentally fabricated nanostructures.
Thin films of nanocrystalline ceria on a Si substrate have been irradiated with 3 MeV Au⁺ ions to fluences of up to 1 × 10¹⁶ ions cm⁻², at temperatures ranging from 160 to 400 K. During the irradiation, a band of contrast is observed to form at the thin film/substrate interface. Analysis by scanning transmission electron microscopy in conjunction with energy-dispersive and electron energy loss spectroscopy techniques revealed that this band of contrast was an amorphous cerium silicate phase, with an approximate Ce:Si:O ratio of 1:1:3.
To reduce mortality among suckling piglets, lactating sows are traditionally housed in farrowing crates. Alternatively, lactating sows can be housed in farrowing pens where the sow is loose, to ensure more behavioural freedom and consequently better welfare for the sow, although under commercial conditions farrowing pens have been associated with increased piglet mortality. Most suckling piglets that die do so within the first week of life, so lactating sows potentially do not have to be restrained during the entire lactation period. Therefore, the aim of the current study was to investigate whether confinement of the sow for a limited number of days after farrowing would affect piglet mortality. A total of 210 sows (Danish Landrace × Danish Yorkshire) farrowed in specially designed swing-aside combination farrowing pens measuring 2.6 m × 1.8 m (combi-pen), where the sows could be kept loose or in a crate. The sows were either: (a) loose during the entire experimental period, (b) crated from days 0 to 4 postpartum, (c) crated from days 0 to 7 postpartum or (d) crated from introduction to the farrowing pen to day 7 postpartum. The sows and their subsequent litters were studied from introduction to the combi-pen ∼1 week before expected farrowing until 10 days postpartum. The confinement period of the sow did not affect the number of stillborn piglets; however, sows that were crated after farrowing had lower live-born piglet mortality (P < 0.001) compared with the sows that were loose during the experimental period. The increased piglet mortality among the loose sows was due to higher mortality in the first 4 days after farrowing. In conclusion, the current study demonstrated that crating the sow for 4 days postpartum was sufficient to reduce piglet mortality.
Vaterite is one of the thermodynamically less stable polymorphs of calcium carbonate. Under ambient conditions it transforms into calcite, the most stable form of calcium carbonate. Organisms are able to stabilize minerals such as vaterite by means of organic molecules. The exact mechanisms by which biomineralization proteins interact with metastable mineral phases are, however, less well understood. Many in vitro studies have been performed using calcite as a model system. A deeper understanding of the interaction of organic molecules with metastable mineral phases would make such molecules useful as a tool to control mineralization processes in vitro. In this study, we report on the co-precipitation of a natively soluble histidine-tagged GFP (green fluorescent protein) with a metastable vaterite phase and the subsequent insolubility of the fluorescent organic matrix in a 30 μl calcium carbonate precipitation assay. The intrinsic fluorescence of GFP is conserved during the interaction with the mineral phase, indicating proper folding even in the insoluble state. This experiment can be extended to obtain deeper insights into mechanistic models of biomineralization proteins by tracking native and modified GFP proteins microscopically during various stages of mineral precipitation and dissolution.
Accurate data on West Nile virus (WNV) cases help guide public health education and control activities, and impact regional WNV blood product screening procedures. During an outbreak of WNV disease in Arizona, records from patients with meningitis or encephalitis were reviewed to determine the proportion tested for WNV. Of 60 patients identified with meningitis or encephalitis, 24 (40%) were tested for WNV. Only 12 (28%) of 43 patients aged <50 years were tested for WNV compared to 12 (71%) of 17 patients aged ⩾50 years (P<0·01). Patients were also more likely to undergo WNV testing if they had clinical signs of weakness or paralysis, had elevated CSF protein, were admitted to an inpatient facility, or were discharged to a rehabilitation facility. The lack of testing in younger age groups and in those with less severe disease probably resulted in substantial underestimates of the WNV neuroinvasive disease burden.
To describe the identification, management, and clinical characteristics of hospitalized patients with influenza-like illness (ILI) during the peak period of activity of the 2009 pandemic strain of influenza A virus subtype H1N1 (2009 H1N1).
Retrospective review of electronic medical records.
Patients and Setting.
Hospitalized patients who presented to the emergency department during the period October 18 through November 14, 2009, at 4 hospitals in Cook County, Illinois, with the capacity to perform real-time reverse-transcriptase polymerase chain reaction testing for influenza.
Vital signs and notes recorded within 1 calendar day after emergency department arrival were reviewed for signs and symptoms consistent with ILI. Cases of ILI were classified as recognized by healthcare providers if an influenza test was performed or if influenza was mentioned as a possible diagnosis in the physician notes. Logistic regression was used to determine the patient attributes and symptoms that were associated with ILI recognition and with influenza infection.
We identified 460 ILI case patients, of whom 412 (90%) had ILI recognized by healthcare providers, 389 (85%) were placed under airborne or droplet isolation precautions, and 243 (53%) were treated with antiviral medication. Of 401 ILI case patients tested for influenza, 91 (23%) had a positive result. Fourteen (3%) ILI case patients and none of the case patients who tested positive for influenza had sore throat in the absence of cough.
Healthcare providers identified a high proportion of hospitalized ILI case patients. Further improvements in disease detection can be made through the use of advanced electronic health records and efficient diagnostic tests. Future studies should evaluate the inclusion of sore throat in the ILI case definition.
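The logistic-regression step described in the methods above can be sketched in miniature. The data below are entirely synthetic (a single invented binary predictor, "cough", with made-up recognition counts), and the model is fitted by plain gradient descent rather than the statistical software the study would have used:

```python
import math

def fit_logistic(xs, ys, lr=1.0, iters=5000):
    """Fit y ~ sigmoid(w*x + b) by gradient descent on the log-loss."""
    w = b = 0.0
    n = len(xs)
    for _ in range(iters):
        gw = gb = 0.0
        for x, y in zip(xs, ys):
            p = 1.0 / (1.0 + math.exp(-(w * x + b)))  # predicted probability
            gw += (p - y) * x                          # gradient w.r.t. w
            gb += (p - y)                              # gradient w.r.t. b
        w -= lr * gw / n
        b -= lr * gb / n
    return w, b

# Synthetic cohort: with cough, ILI recognized in 80/100 encounters;
# without cough, in 30/100 encounters.
xs = [1] * 100 + [0] * 100
ys = [1] * 80 + [0] * 20 + [1] * 30 + [0] * 70
w, b = fit_logistic(xs, ys)
# w estimates the log odds ratio of recognition for cough vs. no cough
```

With a single binary predictor the fitted coefficients have closed forms, w = logit(0.8) − logit(0.3) and b = logit(0.3), which makes the sketch easy to check; the study's actual models would have included multiple patient attributes and symptoms simultaneously.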
To present the auditory implant manipulator, a navigation-controlled mechanical and electronic system which enables minimally invasive (‘keyhole’) transmastoid access to the tympanic cavity.
Materials and methods:
The auditory implant manipulator is a miniaturised robotic system with five axes of movement and an integrated drill. It can be mounted on the operating table. We evaluated the surgical work field provided by the system, and the work sequence involved, using an anatomical whole head specimen.
The work field provided by the auditory implant manipulator is considerably greater than required for conventional mastoidectomy. The work sequence for a keyhole procedure included pre-operative planning, arrangement of equipment, the procedure itself and post-operative analysis.
Although system improvements are necessary, our preliminary results indicate that the auditory implant manipulator has the potential to perform keyhole insertion of implantable hearing devices.