To determine whether there were changes in the prevalence or healthiness of products carrying claims post-implementation of Standard 1.2.7: Nutrition, Health and Related Claims in the Australia New Zealand Food Standards Code.
Observational survey of claims on food packages in three categories: non-alcoholic beverages, breakfast cereals and cereal bars. Nutrient profiling was applied to products to determine their eligibility to carry health claims under Standard 1.2.7. The Standard came into effect in 2013. The proportion of products carrying claims and the proportion of those not meeting the nutrient profiling criteria were calculated. A comparative analysis was conducted to determine changes between 2011 and 2016.
Three large metropolitan stores from the three major supermarket chains in Sydney, Australia were surveyed in 2011 and 2016.
All claims on all available products in 2016 (n 1737). Nutrition composition and ingredients were collected from the packaging.
Overall in 2016, 76 % of products carried claims and there were 7367 claims identified in the three food categories. Of products in 2016 with health claims, 34 % did not meet nutrient profiling criteria. These may breach Standard 1.2.7. Comparison of 2011–2016 showed a significant increase in the number of products carrying claims (66 v. 76 %, P < 0·001).
The proportion of products carrying claims that do not meet nutrient profiling criteria, together with consumers’ tendency to infer health benefits from nutrition content claims, warrants the regulation of all claims using nutrient profiling. This would ensure consumers are not misled by claims on unhealthy food products.
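The reported 2011–2016 increase in claim prevalence (66 v. 76 %, P < 0·001) is the kind of result a standard two-proportion z-test yields. A minimal pure-Python sketch follows; the sample counts are hypothetical (the abstract reports n 1737 for 2016 only, and the 2011 denominator is assumed here for illustration):

```python
import math

def two_proportion_z_test(x1, n1, x2, n2):
    """Two-sided z-test for a difference between two independent proportions."""
    p1, p2 = x1 / n1, x2 / n2
    p_pool = (x1 + x2) / (n1 + n2)               # pooled proportion under H0
    se = math.sqrt(p_pool * (1 - p_pool) * (1 / n1 + 1 / n2))
    z = (p1 - p2) / se
    p_value = math.erfc(abs(z) / math.sqrt(2))   # two-sided p from normal tail
    return z, p_value

# Illustrative counts only: 76 % of 1737 products in 2016 vs an assumed
# equal-sized 2011 sample at 66 % -- not the study's actual 2011 data.
z, p = two_proportion_z_test(x1=1320, n1=1737, x2=1146, n2=1737)
```

With a difference this large relative to the sample sizes, the p-value falls well below 0·001, consistent with the abstract's reported significance.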
Little is known about who would benefit from Internet-based personalised nutrition (PN) interventions. This study aimed to evaluate the characteristics of participants who achieved greatest improvements (i.e. benefit) in diet, adiposity and biomarkers following an Internet-based PN intervention. Adults (n 1607) from seven European countries were recruited into a 6-month, randomised controlled trial (Food4Me) and randomised to receive conventional dietary advice (control) or PN advice. Information on dietary intake, adiposity, physical activity (PA), blood biomarkers and participant characteristics was collected at baseline and month 6. Benefit from the intervention was defined as ≥5 % change in the primary outcome (Healthy Eating Index) and secondary outcomes (waist circumference and BMI, PA, sedentary time and plasma concentrations of cholesterol, carotenoids and omega-3 index) at month 6. For our primary outcome, benefit from the intervention was greater in older participants, women and participants with lower HEI scores at baseline. Benefit was greater for individuals reporting greater self-efficacy for ‘sticking to healthful foods’ and who ‘felt weird if [they] didn’t eat healthily’. Participants benefited more if they reported wanting to improve their health and well-being. The characteristics of individuals benefiting did not differ by other demographic, health-related, anthropometric or genotypic characteristics. Findings were similar for secondary outcomes. These findings have implications for the design of more effective future PN intervention studies and for tailored nutritional advice in public health and clinical settings.
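The trial's benefit criterion (≥5 % change in an outcome at month 6) can be expressed as a small helper. This is a sketch under stated assumptions: the function name and the handling of outcome direction are illustrative, not taken from the Food4Me protocol.

```python
def benefited(baseline, month6, higher_is_better=True, threshold=0.05):
    """Return True if the relative change from baseline meets the >=5% cutoff.

    `higher_is_better` flags outcome direction (an assumption: an increase in
    Healthy Eating Index is an improvement, a decrease in waist circumference is).
    """
    if baseline == 0:
        raise ValueError("baseline must be nonzero")
    change = (month6 - baseline) / baseline
    if not higher_is_better:
        change = -change
    return change >= threshold

# e.g. a Healthy Eating Index rising from 60 to 64 is a ~6.7 % improvement,
# so that participant would be classed as having benefited.
```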
Fetal growth restriction (FGR) is defined as failure of the fetus to achieve its genetically determined growth potential due to an underlying pathological process. FGR affects approximately 10% of all pregnancies and is a major determinant of perinatal and childhood mortality and morbidity, as well as chronic disease in adulthood [2–4]. A challenge in studying FGR is the lack of a gold standard definition and clear diagnostic criteria. Small for gestational age (SGA) is often used interchangeably with FGR but fails to differentiate between the constitutionally small but healthy fetus and the pathologically growth-restricted fetus. SGA is typically defined as a baby <10th centile, but 40% of these babies are physiologically small and healthy; therefore fetal size alone cannot be used to differentiate SGA from FGR. Assessment of functional parameters has been proposed to improve diagnostic accuracy but may still miss the larger baby (>10th centile) that is also in fact growth restricted. The importance of accurately diagnosing FGR is that it identifies the potential risk of fetal demise or perinatal complications, which may be averted via appropriate monitoring and optimized delivery.
Preterm birth before 37 weeks’ gestation affects 10–15% of all births, with nearly 15 million babies born preterm every year. Prematurity is the leading cause of neonatal mortality, accounting for in excess of 75% of perinatal deaths. Infants born preterm are at high risk of both short- and long-term neurological morbidity, including developmental delay, cognitive problems, hearing loss, visual impairment, behavioral problems, and cerebral palsy. The impact of these sequelae is high, with 27.9% (IQR [interquartile range] 18.6–46.6) of preterm neonates suffering from at least one, and 8.1% (IQR 3.7–10.2) suffering multiple morbidities. Despite improvements in perinatal care the incidence of preterm birth has changed little in decades. In contrast, improvements in neonatal care mean nearly 90% of all babies born at less than 28 weeks in high-income countries survive, including babies born as early as 23 weeks’ gestation. Despite this improvement in survival, babies born at extreme preterm gestations are at the highest risk of neurological injury, with rates of cerebral palsy and severe disability in these survivors remaining static.
A quarter of Australian children are overweight or obese. Research conducted in 2010 found that fast-food children’s meals were energy-dense and nutrient-poor. Since then, menu labelling and self-regulation of marketing have been introduced in Australia. The present study aimed to: (i) investigate the nutrient composition of children’s meals offered at fast-food chains; (ii) compare these with children’s daily requirements and recommendations and the food industry’s own criteria for healthier children’s meals; and (iii) determine whether results have changed since last investigated in 2010.
An audit of nutrition information for fast-food children’s meals was conducted. Meals were compared with 30 % (recommended contribution for a meal) and 100 % of children’s daily recommendations and requirements. A comparative analysis was conducted to determine if the proportion of meals that exceeded meal requirements and recommendations, and compliance with the food industry’s own criteria, changed between 2010 and 2016.
Large Australian fast-food chains.
All possible children’s meal combinations.
Overall, 289 children’s meals were included. Most exceeded 30 % of daily recommendations and requirements for a 4-year-old’s energy, saturated fat, sugars and Na. Results were also substantial for 8- and 13-year-olds, particularly for Na. When compared with mean energy and nutrient contents from 2010, there were minimal changes overall.
Children’s meals can provide excess energy, saturated fat, sugar and Na to children’s diets. Systematic reformulation of energy, saturated fat, sugars and Na would improve the nutrient composition of the meals.
No standardized surveillance criteria exist for surgical site infection after breast tissue expander (BTE) access. This report provides a framework for defining postaccess BTE infections and identifies contributing factors to infection during the expansion period. Implementing infection prevention guidelines for BTE access may reduce postaccess BTE infections.
We used a survey to characterize contemporary infection prevention and antibiotic stewardship program practices across 64 healthcare facilities, and we compared these findings to those of a similar 2013 survey. Notable findings include decreased frequency of active surveillance for methicillin-resistant Staphylococcus aureus, frequent active surveillance for carbapenem-resistant Enterobacteriaceae, and increased support for antibiotic stewardship programs.
We compared fluorescent gel removal rates estimated using fewer high-touch surfaces (HTSs) and rooms and determined the optimum number of HTSs and rooms needed to ensure accuracy, using 2,942 HTSs in 228 rooms on 13 units. Randomly selecting 3 HTSs in 2 rooms predicted the overall removal rate.
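The subsampling idea (estimate a unit's removal rate from a few randomly chosen rooms and surfaces rather than a full audit) can be sketched as follows. The data layout, names, and parameters here are hypothetical, not the study's actual dataset:

```python
import random

def estimate_removal_rate(audit, n_rooms, n_surfaces, rng=random):
    """Estimate a unit's gel-removal rate from a random subsample.

    `audit` maps room -> {surface_name: True if the gel marker was removed}.
    This is an assumed, illustrative data structure.
    """
    rooms = rng.sample(sorted(audit), n_rooms)
    checked = removed = 0
    for room in rooms:
        names = rng.sample(sorted(audit[room]),
                           min(n_surfaces, len(audit[room])))
        for name in names:
            checked += 1
            removed += audit[room][name]   # bool counts as 0/1
    return removed / checked

# Synthetic audit: 10 rooms x 6 HTSs, each cleaned (gel removed) ~60% of the time.
random.seed(1)
audit = {f"room{i}": {f"hts{j}": random.random() < 0.6 for j in range(6)}
         for i in range(10)}
rate = estimate_removal_rate(audit, n_rooms=2, n_surfaces=3)
```

Repeating the subsample many times and comparing its mean to the full-audit rate is one way to check, as the study did, how few rooms and surfaces still give an accurate estimate.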
To ascertain opinions regarding etiology and preventability of hospital-onset bacteremia and fungemia (HOB) and perspectives on HOB as a potential outcome measure reflecting quality of infection prevention and hospital care.
Hospital epidemiologists and infection preventionist members of the Society for Healthcare Epidemiology of America (SHEA) Research Network.
A web-based, multiple-choice survey was administered via the SHEA Research Network to 133 hospitals.
A total of 89 surveys were completed (67% response rate). Overall, 60% of respondents defined HOB as a positive blood culture on or after hospital day 3. Central-line–associated bloodstream infections (CLABSIs) and intra-abdominal infections were perceived as the most frequent etiologies. Moreover, 61% thought that most HOB events are preventable, and 54% viewed HOB as a measure reflecting a hospital’s quality of care. Also, 29% of respondents’ hospitals already collect HOB data for internal purposes. Given a choice to publicly report CLABSI and/or HOB, 57% favored reporting either HOB alone (22%) or in addition to CLABSI (35%), and 34% favored CLABSI alone.
Among the majority of SHEA Research Network respondents, HOB is perceived as preventable, reflective of quality of care, and potentially acceptable as a publicly reported quality metric. Further studies on HOB are needed, including validation as a quality measure, assessment of risk adjustment, and formation of evidence-based bundles and toolkits to facilitate measurement and improvement of HOB rates.
Targeted screening for carbapenem-resistant organisms (CROs), including carbapenem-resistant Enterobacteriaceae (CRE) and carbapenemase-producing organisms (CPOs), remains limited; recent data suggest that existing policies miss many carriers.
Our objective was to measure the prevalence of CRO and CPO perirectal colonization at hospital unit admission and to use machine learning methods to predict probability of CRO and/or CPO carriage.
We performed an observational cohort study of all patients admitted to the medical intensive care unit (MICU) or solid organ transplant (SOT) unit at The Johns Hopkins Hospital between July 1, 2016 and July 1, 2017. Admission perirectal swabs were screened for CROs and CPOs. More than 125 variables capturing preadmission clinical and demographic characteristics were collected from the electronic medical record (EMR) system. We developed models to predict colonization probabilities using decision tree learning.
Evaluating 2,878 admission swabs from 2,165 patients, we found that 7.5% and 1.3% of swabs were CRO and CPO positive, respectively. Organism and carbapenemase diversity among CPO isolates was high. Despite including many characteristics commonly associated with CRO/CPO carriage or infection, overall, decision tree models poorly predicted CRO and CPO colonization (C statistics, 0.57 and 0.58, respectively). In subgroup analyses, however, models did accurately identify patients with recent CRO-positive cultures who use proton-pump inhibitors as having a high likelihood of CRO colonization.
In this inpatient population, CRO carriage was infrequent but was higher than previously published estimates. Despite including many variables associated with CRO/CPO carriage, models poorly predicted colonization status, likely due to significant host and organism heterogeneity.
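The C statistic used to evaluate the decision tree models (0.57 and 0.58, i.e. barely better than chance) is the area under the ROC curve: the probability that a randomly chosen colonized patient receives a higher predicted probability than a randomly chosen non-colonized patient. A minimal pure-Python sketch of the metric, with toy inputs rather than study data:

```python
def c_statistic(y_true, y_score):
    """C statistic (area under the ROC curve).

    Probability that a randomly chosen positive case is ranked above a
    randomly chosen negative case; score ties contribute 0.5.
    """
    pos = [s for y, s in zip(y_true, y_score) if y == 1]
    neg = [s for y, s in zip(y_true, y_score) if y == 0]
    if not pos or not neg:
        raise ValueError("need at least one positive and one negative case")
    wins = sum(1.0 if p > n else 0.5 if p == n else 0.0
               for p in pos for n in neg)
    return wins / (len(pos) * len(neg))

# Toy example with hypothetical colonization probabilities (not study data):
# a value near 0.5 indicates discrimination little better than chance.
auc = c_statistic([1, 0, 1, 0, 0], [0.6, 0.5, 0.4, 0.3, 0.7])
```

A perfectly ranking model scores 1.0; the study's 0.57–0.58 sits near the chance value of 0.5, which is why the models are described as predicting colonization poorly.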
The Food Standards Code regulates health claims on Australian food labels. General-level health claims highlight food–health relationships, e.g. ‘contains calcium for strong bones’. Food companies making claims must notify Food Standards Australia New Zealand (FSANZ) and certify that a systematic literature review (SLR) substantiating the food–health relationship has been conducted. There is no pre- or post-notification assessment of the SLR, potentially enabling the food industry to make claims based on poor-quality research. The present study assessed the rigour of self-substantiation.
Food–health relationships notified to FSANZ were monitored monthly between 2013 and 2017. These relationships were assessed by scoping published literature. Where evidence was equivocal/insufficient, the relevant government food regulatory agency was asked to investigate. If not investigated, or the response was unsatisfactory, the project team conducted an independent SLR which was provided to the government agency.
Self-substantiated food–health relationships.
There were sixty-seven relationships notified by thirty-eight food companies. Of these, thirty-three relationships (52 %) from twenty companies were deemed to have sufficient published evidence. Four were excluded as they originated in New Zealand. Three relationships were removed before investigations were initiated. The project initiated twenty-seven food–health relationship investigations. Another six relationships were withdrawn, and three relationships were awaiting government assessment.
To ensure that the SLRs underpinning food–health relationships are rigorous and to reduce the regulatory enforcement burden, pre-market approval of food–health relationships should be introduced. This would increase consumer and public health confidence in the regulatory process and prevent potentially misleading general-level health claims on food labels.
The second year of life is a period of nutritional vulnerability. We aimed to investigate the dietary patterns and nutrient intakes from 1 to 2 years of age during the 12-month follow-up period of the Growing Up Milk – Lite (GUMLi) trial. The GUMLi trial was a multi-centre, double-blinded, randomised controlled trial of 160 healthy 1-year-old children in Auckland, New Zealand and Brisbane, Australia. Dietary intakes were collected at baseline, 3, 6, 9 and 12 months post-randomisation, using a validated FFQ. Dietary patterns were identified using principal component analysis of the frequency of food item consumption per d. The effect of the intervention on dietary patterns and intake of eleven nutrients over the duration of the trial were investigated using random effects mixed models. A total of three dietary patterns were identified at baseline: ‘junk/snack foods’, ‘healthy/guideline foods’ and ‘breast milk/formula’. A significant group difference was observed in ‘breast milk/formula’ dietary pattern z scores at 12 months post-randomisation, where those in the GUMLi group loaded more positively on this pattern, suggesting more frequent consumption of breast milk. No difference was seen in the other two dietary patterns. Significant intervention effects were seen on nutrient intake between the GUMLi (intervention) and cows’ milk (control) groups, with lower protein and vitamin B12, and higher Fe, vitamin D, vitamin C and Zn intake in the GUMLi (intervention) group. The consumption of GUMLi did not affect dietary patterns, however, GUMLi participants had lower protein intake and higher Fe, vitamins D and C and Zn intake at 2 years of age.
The article discusses changes and continuities in apprenticeship practices and their social norms in Paris before and after the French Revolution. It highlights the value of quantitative analysis and of comparisons between heterogeneous sources for approaching this subject. It first addresses the ratio between the number of apprentices and the number of masters and mistresses in each trade, as well as the aspirations that apprentices and their parents may have held: with what hopes did one enter an apprenticeship? The study of individual trajectories for the eighteenth century, and of a contemporary statistical survey for the nineteenth, shows in particular that apprenticeship was far from always leading to a career in the trade learned. An examination of judicial sources then shows that the role of the courts in settling disputes over apprenticeship both grew and changed, coming to focus on compliance with the allotted term. Finally, the article illustrates the long-term persistence of widely shared norms defining good apprenticeships, alongside practices that departed markedly from those norms and varied by gender and trade, both before and after the end of the guilds.
In this systematic evaluation of fluorescent gel markers (FGM) applied to high-touch surfaces with a purpose-made metered applicator (MA) versus a generic cotton swab (CS), removal rates were 60.5% (476 of 787) for the MA and 64.3% (506 of 787) for the CS. Interpretation of MA-FGM removal was more consistent (83% vs 50% scored as not removed), possibly due to less varied application and a more adhesive gel.
To describe the process by which the 12 community-based primary health care (CBPHC) research teams worked together and fostered cross-jurisdictional collaboration, including collection of common indicators with the goal of using the same measures and data sources.
A pan-Canadian mechanism for common measurement of the impact of primary care innovations across Canada is lacking. The Canadian Institutes for Health Research and its partners funded 12 teams to conduct research and collaborate on development of a set of commonly collected indicators.
A working group representing the 12 teams was established. They undertook an iterative process to consider existing primary care indicators identified from the literature and by stakeholders. Indicators were agreed upon with the intention of addressing three objectives across the 12 teams: (1) describing the impact of improving access to CBPHC; (2) examining the impact of alternative models of chronic disease prevention and management in CBPHC; and (3) describing the structures and context that influence the implementation, delivery, cost, and potential for scale-up of CBPHC innovations.
Nineteen common indicators within the core dimensions of primary care were identified: access, comprehensiveness, coordination, effectiveness, and equity. We also agreed to collect data on health care costs and utilization within each team. Data sources include surveys, health administrative data, interviews, focus groups, and case studies. Collaboration across these teams sets the foundation for a unique opportunity for new knowledge generation, over and above any knowledge developed by any one team. Keys to success are each team’s willingness to engage and commitment to working across teams, funding to support this collaboration, and distributed leadership across the working group. Reaching consensus on collection of common indicators is challenging but achievable.
The detection and monitoring of meltwater within firn presents a significant monitoring challenge. We explore the potential of small wireless sensors (ETracer+, ET+) to measure temperature, pressure, electrical conductivity and thus the presence or absence of meltwater within firn, through tests in the dry snow zone at the East Greenland Ice Core Project site. The tested sensor platforms are small, robust and low cost, and communicate data via a VHF radio link to surface receivers. The sensors were deployed in low-temperature firn at the centre and shear margins of an ice stream for 4 weeks, and a ‘bucket experiment’ was used to test the detection of water within otherwise dry firn. The tests showed the ET+ could log subsurface temperatures and transmit the recorded data through up to 150 m dry firn. Two VHF receivers were tested: an autonomous phase-sensitive radio-echo sounder (ApRES) and a WinRadio. The ApRES can combine high-resolution imaging of the firn layers (by radio-echo sounding) with in situ measurements from the sensors, to build up a high spatial and temporal resolution picture of the subsurface. These results indicate that wireless sensors have great potential for long-term monitoring of firn processes.
Using samples collected for VRE surveillance, we evaluated unit admission prevalence of carbapenem-resistant Enterobacteriaceae (CRE) perirectal colonization and whether CRE carriers (unknown to staff) were on contact precautions for other indications. CRE colonization at unit admission was infrequent (3.9%). Most CRE carriers were not on contact precautions, representing a reservoir for healthcare-associated CRE transmission.
As described in Chernyak-Hai and Rabenu's (2018) focal article, the workplace has changed tremendously over the past few decades. These changes, undoubtedly, have affected how individuals interact and build relationships in the workplace. We live in a “networked society,” where the advances in technology and subsequent spread of communication and information have reorganized the way individuals are connected to one another (Castells, 2004; Wellman, 1999). In other words, we exist in complex networks, where underlying interconnections and interdependencies are the keys to scientific understanding. In their focal article, Chernyak-Hai and Rabenu highlight the need to adapt social exchange theories and research to incorporate the change in workplace relationships resulting from advances in technology and changes in the global market and workforce (e.g., freelancers, contract workers).