Poor diets, including excess added sugar consumption, contribute to the global burden of disease. In response, many nutrition policies have been implemented to reduce added sugar intake and improve population health, including taxes, education, labelling and environmental interventions. A potential consequence of these policy actions is the substitution of added sugars with non-nutritive sweeteners (NNS) in a variety of foods and beverages. NNS are used to reduce the energy and sugar content of foods and beverages while maintaining their palatability. Evidence of the toxicological risks of NNS is inconsistent, though concerns have been raised over the potential for ultra-processed foods containing NNS to substitute for whole foods. This review aimed to provide an overview of current NNS food supply and consumption patterns, assess added sugar-reduction policies and their impact on NNS, and determine the impact of NNS on food choice, energy intake and diet quality. NNS are widely available in a variety of products, most commonly in carbonated beverages, dairy products, confectionery, table-top sweeteners and fruit drinks. However, trends over time in different product categories, and differences across geographies and country income levels, require further study. Few studies have examined NNS consumption trends globally, though an increase in NNS consumption in beverages has been observed in some regions. Research examining how the increased availability of low-sugar, NNS-containing products affects global dietary patterns is limited, particularly in terms of potential substitution effects.
A comprehensive nutrition policy containing a broad package of cross-sector and synergistic policy actions is required to attenuate the systemic drivers of poor nutrition. The current study aims to critically analyse trends in the scope of federal nutrition policy actions in Australia between 2007 and 2018 by: (1) describing the changes in nutrition policy actions, benchmarked against an international best-practice policy framework, and (2) investigating how and why the scope of these policy actions has changed over time by examining the decision-making processes that led to the establishment of Australia’s Healthy Food Partnership (the Partnership).
Design:
Qualitative case study involving documentary analysis and key-informant interviews. Australian federal government documents (n 10) were analysed against the NOURISHING framework. Key informants (n 6) were interviewed and asked about the Partnership’s decision-making and establishment processes.
Setting:
Australia.
Participants:
Executive Committee (the Partnership’s governing body) and working group members.
Results:
From 2007 to 2018, the scope of Australian national nutrition policy fluctuated from evidence-informed recommendations for a comprehensive policy to the mostly discrete policy actions of the Partnership. Themes of ‘pragmatism and compromise’, ‘actor relationships and lobbying’ and ‘political context’ were critical drivers for establishing the Partnership.
Conclusion:
The narrowing of Australian nutrition policy reflects a response to political expediency and compromise. This political dynamic highlights a dilemma facing nutrition policy advocates: should a balance be sought, and if so how, between aspirational but possibly unrealistic goals and limited but likely deliverable outcomes during policy-making processes? These findings have relevance for developing a future comprehensive national nutrition policy.
Dietary guidelines should be underpinned by the best available evidence on relationships between diet and health, including evidence from nutrient-based, food-based and dietary patterns research. The primary aim of the present study was to analyse the systematic reviews conducted to inform the 2013 Australian Dietary Guidelines according to dietary exposure. The secondary aim was to analyse the reviews by health outcome, and design of included studies. To identify the systematic reviews, the dietary guidelines report was used as a starting point and relevant references were retrieved. The evidence report contained the data used in this analysis. Descriptive statistics were used to analyse reviews according to exposure, outcome, and design of included studies. A total of 143 systematic reviews were included in this analysis. Foods were the most common exposure (86·7 % of reviews), followed by nutrients (10·5 %) and dietary patterns (2·8 %). Chronic disease morbidity and/or mortality was the most common outcome (80·4 %), followed by chronic disease risk factors (19·6 %). Most reviews included evidence from cohort or nested case–control studies (92·3 %), many included evidence from case–control studies (61·5 %) and some included evidence from randomised controlled trials (28·7 %). These results reflect the research questions that were asked, the systematic review methods that were used, and the evidence that was available. In developing future iterations of the Australian Dietary Guidelines, there is an opportunity to review the latest evidence from dietary patterns research.
Given the significant shifts in dietary recommendations between the 2007 and 2019 Canadian dietary guidelines, such as promoting plant-based food intake, reducing highly processed food intake and advocating the practice of food skills, we compared the methods used to develop the two sets of guidelines.
Design:
Two reviewers used twenty-five guided criteria to appraise the methods used to develop the most recent dietary guidelines against those outlined in the 2014 WHO Handbook for Guideline Development.
Setting:
Canada.
Participants:
2007 and 2019 dietary guidelines.
Results:
We found that the 2019 guidelines were more evidence-based and met 80 % (20/25) of the WHO criteria. For example, systematic reviews and health organisation authoritative reports, but not industry reports, constituted the evidence base for the dietary recommendations. However, recommendations on food sustainability and food skill practice were driven primarily by stakeholders’ interests. By contrast, less information was recorded about the process used to develop the 2007 guidelines, resulting in 24 % (6/25) consistency with the WHO standards.
Conclusions:
Our analysis suggests that a more transparent and evidence-based approach was used to develop the 2019 Canadian dietary guidelines and that methodological criteria should support further incorporation of nutrition priorities (food sustainability and food skills) in future dietary guideline development.
Personal protective equipment (PPE) is worn by prehospital providers (PHPs) for protection from hazardous exposures. The ability of PHPs to perform resuscitation procedures while wearing PPE has been described in adult but not pediatric models. This study examined the effects of PPE on the ability of PHPs to perform resuscitation procedures on pediatric patients.
Methods:
This prospective study was conducted at a US simulation center. Paramedics wore normal attire at the baseline session and donned full Level B PPE for the second session. During each session, they performed timed sets of psychomotor tasks simulating clinical care of a critically ill pediatric patient. The difference in time to completion between baseline and PPE sessions per task was examined using Wilcoxon signed-rank tests.
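As a minimal sketch of this paired, non-parametric comparison, the following Python snippet runs a Wilcoxon signed-rank test on hypothetical baseline and PPE task-completion times; the values and sample size are invented for illustration and are not the study's data.

```python
# Hypothetical paired task-completion times (seconds) for the same paramedics,
# illustrating the Wilcoxon signed-rank comparison described above.
from scipy.stats import wilcoxon

baseline_s = [32.0, 41.5, 28.0, 55.0, 37.5, 44.0, 30.5, 49.0]  # normal attire
ppe_s = [36.5, 46.0, 35.0, 62.5, 41.0, 52.5, 33.0, 58.0]       # Level B PPE

# Tests whether the within-subject differences (PPE - baseline) are centred on zero.
stat, p_value = wilcoxon(ppe_s, baseline_s)
print(f"Wilcoxon statistic = {stat}, p = {p_value:.4f}")
```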
Results:
A total of 50 paramedics completed both sessions. Median times for task completion at the PPE sessions increased significantly from baseline for several procedures: tracheal intubation (+4.5 s; P = 0.01), automated external defibrillator (AED) placement (+9.5 s; P = 0.01), intraosseous line insertion (+7 s; P < 0.0001), tourniquet application (+8.5 s; P < 0.0001), intramuscular injection (+21 to 23 s; P < 0.0001), and pulse oximetry (+4 s; P < 0.0001). There was no significant increase in completion time for bag-mask ventilation or autoinjector use.
Conclusions:
Although completion times increased statistically for several tasks, the absolute delays were small; PPE did not meaningfully impair PHPs performing critical tasks while caring for a pediatric patient with a highly infectious or chemical exposure. This information may guide PHPs faced with the situation of resuscitating children while wearing Level B PPE.
We describe an ultra-wide-bandwidth, low-frequency receiver recently installed on the Parkes radio telescope. The receiver system provides continuous frequency coverage from 704 to 4032 MHz. For much of the band (~60%), the system temperature is approximately 22 K and the receiver system remains in a linear regime even in the presence of strong mobile phone transmissions. We discuss the scientific and technical aspects of the new receiver, including its astronomical objectives, as well as the feed, receiver, digitiser, and signal processor design. We describe the pipeline routines that form the archive-ready data products and how those data files can be accessed from the archives. The system performance is quantified, including the system noise and linearity, beam shape, antenna efficiency, polarisation calibration, and timing stability.
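To give a feel for what the quoted system temperature implies, here is a minimal sketch applying the idealised radiometer equation, ΔT = T_sys/√(n_pol·Δν·τ); the sub-band width and integration time are arbitrary example values, not specifications of the receiver.

```python
# Idealised radiometer-equation estimate of RMS noise for the quoted T_sys.
# Sub-band width and integration time are illustrative assumptions.
import math

T_sys_K = 22.0        # approximate system temperature over ~60% of the band
bandwidth_hz = 128e6  # example sub-band width, not a receiver specification
t_int_s = 60.0        # example integration time
n_pol = 2             # two polarisations summed

delta_T_K = T_sys_K / math.sqrt(n_pol * bandwidth_hz * t_int_s)
print(f"Ideal RMS noise: {delta_T_K * 1e3:.3f} mK")  # ~0.18 mK for these values
```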
The updated Common Rule for human subjects research requires that consents “begin with a ‘concise and focused’ presentation of the key information that will most likely help someone make a decision about whether to participate in a study” (Menikoff, Kaneshiro, Pritchard. The New England Journal of Medicine. 2017; 376(7): 613–615.). We utilized a community-engaged technology development approach to inform feature options within the REDCap software platform centered on the collection and storage of electronic consent (eConsent), to address issues of transparency, clinical trial efficiency, and regulatory compliance for informed consent (Harris, et al. Journal of Biomedical Informatics 2009; 42(2): 377–381.). eConsent may also improve recruitment and retention in clinical research studies by addressing: (1) barriers to accessing rural populations, by facilitating remote consent, and (2) cultural and literacy barriers, by including optional explanatory material (e.g., defining terms by hovering over them with the cursor) or the choice of displaying different videos/images based on a participant’s race, ethnicity, or educational level (Phillippi, et al. Journal of Obstetric, Gynecologic, & Neonatal Nursing. 2018; 47(4): 529–534.).
Methods:
We developed and pilot tested our eConsent framework to provide a personalized consent experience whereby users are guided through a consent document that utilizes avatars, contextual glossary supplements, and videos to facilitate communication of information.
Results:
The eConsent framework includes a portfolio of eight features reviewed by community stakeholders and tested at two academic medical centers.
Conclusions:
Early adoption and utilization of this eConsent framework have demonstrated acceptability. Next steps will emphasize testing efficacy of features to improve participant engagement with the consent process.
Junk-food marketing contributes significantly to childhood obesity, which in turn imposes major health and economic burdens. Despite this, political priority for addressing junk-food marketing has been weak in many countries. The competing interests, worldviews and beliefs of stakeholders involved with the issue contribute to this political inertia. Parliamentarians, who champion policy and enact legislation, are an integral group of actors for driving policy change. However, how parliamentarians interpret and portray (i.e. frame) the causes of and solutions to public health nutrition problems is poorly understood. The present study aimed to understand how Australian parliamentarians from different political parties frame the problem of junk-food marketing.
Design:
Framing analysis of transcripts from the Australian Government’s Parliamentary Hansard, involving development of a theoretical framework, data collection, coding transcripts and thematic synthesis of results.
Setting:
Australia.
Participants:
None.
Results:
Parliamentarian framing generally reflected political party ideology. Liberal parliamentarians called for minimal government regulation and greater personal responsibility, reflecting the party’s core values of liberalism and neoliberalism. Greens parliamentarians framed the issue as systemic, highlighting the need for government intervention and reflecting the core party value of social justice. Labor parliamentarians used both frames at varying times.
Conclusions:
Parliamentarians’ framing was generally consistent with their party ideology, though subject to changes over time. This project provides insights into the role of framing and ideology in shaping public health policy responses and may inform communication strategies for nutrition advocates. Advocates might consider using frames that resonate with the ideologies of different political parties and adapting these over time.
The deliberate use of chemical, biological, radiological, and nuclear (CBRN) materials in war or terrorist attacks is perceived as a great threat globally. In the event of a release of CBRN agents, protection by means of medical countermeasures (MedCMs) could reduce health vulnerability. Nonetheless, for some diseases caused by these agents, innovative MedCMs do not exist, and many of those that do might not be readily available. Inadequate research and development funding and government procurement efforts can result in adverse economic consequences (eg, lost income, cost per loss of life, medical expenses) far exceeding the costs of strong and comprehensive preparedness initiatives. By illustrating factors of demand-side rationale for CBRN MedCMs, this article aims to strengthen the integrity of policy-making associated with current demand requirements. Specifically, it outlines an approach to inspire broader assessment by compiling and adapting existing economic models and concepts to characterize both soft and hard factors that influence demand-side rationale. First, the soft-factor context is set by describing the impact of behavioral and political economics. Then, lessons learned from past public health funding models and associated collaborative access infrastructure are depicted to represent hard factors that can enhance the viability of MedCM preparedness evaluations.
To assess current performance and identify opportunities and reforms necessary for positioning a food standards programme to help protect public health against dietary risk factors.
Design
A case study design in which a food standards programme’s public health protection performance was analysed against an adapted Donabedian model for assessing health-care quality. The criteria were the food standards programme’s structure (governance arrangements and membership of its decision-making committees), process (decision-making tools, public engagement and transparency) and food standards outcomes, which provided the information base on which performance quality was inferred.
Setting
The Australia and New Zealand food standards programme.
Participants
The structure, process and outcomes of the Programme.
Results
The Programme’s structure and processes produce food standards outcomes that perform well in protecting public health from risks associated with nutrient intake excess or inadequacy. The Programme performs less well in protecting public health from the proliferation and marketing of ‘discretionary’ foods that can exacerbate dietary risks. Opportunities to set food standards to help protect public health against dietary risks are identified.
Conclusions
The structures and decision-making processes used in food standards programmes need to be reformed so they are fit for purpose in helping combat dietary risks caused by dietary excess and imbalances. Priorities include reforming the risk analysis framework, including the nutrient profiling scoring criterion, by extending their nutrition science orientation from a nutrient (reductionist) paradigm to one more inclusive of a food/diet (holistic) paradigm.
Investigate short- and long-term effects of Superstorm Sandy on multiple morbidities among the elderly.
Methods
We examined emergency department visits, outpatient visits, and hospital admissions for cardiovascular disease (CVD), respiratory disease, and injury among residents of 8 affected counties immediately, 4 months, and 12 months following Superstorm Sandy. Control groups were defined as visits/admissions during the identical time windows in the 5 years before (2007-2011) and 1 year after (2013-2014) the storm, in affected and nonaffected counties in New York. We performed Poisson regression to test whether visits/admissions increased in the periods following Superstorm Sandy while controlling for covariates.
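A minimal sketch of this kind of model follows, assuming a Poisson regression of visit counts on a post-storm indicator with a population offset; the data frame, counts, and variable names are hypothetical illustrations, not the study's data or code.

```python
# Hypothetical sketch of a Poisson regression comparing post-storm periods with
# control periods, offset by the population at risk. Not the study's data.
import numpy as np
import pandas as pd
import statsmodels.api as sm
import statsmodels.formula.api as smf

df = pd.DataFrame({
    "visits":      [120, 95, 260, 210, 110, 240],  # counts of visits/admissions
    "post_storm":  [0, 0, 1, 1, 0, 1],             # 1 = period after the storm
    "affected":    [1, 0, 1, 0, 1, 1],             # 1 = affected county
    "elderly_pop": [50000, 42000, 51000, 43000, 49500, 52000],
})

model = smf.glm("visits ~ post_storm + affected", data=df,
                family=sm.families.Poisson(),
                offset=np.log(df["elderly_pop"])).fit()
print(np.exp(model.params))  # exponentiated coefficients are rate ratios
```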
Results
We found that the risk of CVD, respiratory disease, and injury visits/admissions was more than twice as high immediately, 4 months, and 12 months after the storm as in the control periods. Women were at greater risk in all time periods for CVD (risk ratio [RR], 2.04) and respiratory disease (RRs, 1.89 to 1.92). Whites had higher risk of CVD, respiratory disease, and injury than other racial groups during each period.
Conclusion
We observed increases in CVD, respiratory disease, and injury up to a year following Superstorm Sandy. Findings demonstrate the need to incorporate short- and long-term health effects into public health recovery. (Disaster Med Public Health Preparedness. 2019;13:28-32)
Australian mosquito species significantly impact human health through nuisance biting and the transmission of endemic and exotic pathogens. Surveillance programmes designed to provide an early warning of mosquito-borne disease risk require reliable identification of mosquitoes. This study investigated the viability of Matrix-Assisted Laser Desorption/Ionization–Time-of-Flight Mass Spectrometry (MALDI-TOF MS) as a rapid and inexpensive approach to the identification of Australian mosquitoes, validated using a three-step taxonomic approach. A total of 300 mosquitoes representing 21 species were collected from south-eastern New South Wales and morphologically identified. The legs were removed from each mosquito and subjected to MALDI-TOF MS analysis. Fifty-eight mosquitoes were sequenced at the cytochrome c oxidase subunit I (cox1) gene region and their genetic relationships were analysed. We created the first MALDI-TOF MS spectral database of Australian mosquito species, covering 19 species, and clearly demonstrate the accuracy of MALDI-TOF MS for identifying Australian mosquitoes. The technique is especially useful for addressing gaps in the effectiveness of DNA barcoding by differentiating closely related taxa. Indeed, cox1 DNA barcoding could not differentiate members of the Culex pipiens group, Cx. quinquefasciatus and Cx. pipiens molestus, but these specimens were correctly identified using MALDI-TOF MS.
The triazines are one of the most widely used herbicide classes ever developed and are critical for managing weed populations that have developed resistance to other herbicides. These herbicides are traditionally valued for their residual weed control in more than 50 crops. The scientific literature suggests that atrazine, and perhaps other s-triazines, may no longer remain persistent in soils because of enhanced microbial degradation. Experiments examined the degradation rates of atrazine and two other triazine herbicides, simazine and metribuzin, in atrazine-adapted and non-history Corn Belt soils, using similar soils from each state to compare potential triazine degradation. In three soils with no history of atrazine use, the t1/2 of atrazine was at least four times greater than in three soils with a history of atrazine use. Simazine degradation in the same three sets of soils was 2.4 to 15 times more rapid in history soils than in non-history soils. Metribuzin in history soils degraded at 0.6, 0.9, and 1.9 times the rates seen in the same three non-history soils. These results indicate enhanced degradation of the symmetrical triazine simazine, but not of the asymmetrical triazine metribuzin.
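The half-life comparisons above follow from first-order decay kinetics, the model commonly fitted to herbicide dissipation data. The sketch below computes t1/2 = ln 2/k; the rate constants are hypothetical, chosen only to reproduce the roughly four-fold difference reported for atrazine.

```python
# First-order decay: C(t) = C0 * exp(-k t), so t1/2 = ln(2) / k.
# Rate constants below are hypothetical illustrations, not measured values.
import math

def half_life_days(k_per_day: float) -> float:
    """Half-life in days for a first-order decay rate constant (1/day)."""
    return math.log(2) / k_per_day

k_history = 0.058     # example: atrazine-adapted (history) soil
k_no_history = 0.014  # example: non-history soil

print(f"History soil t1/2:     {half_life_days(k_history):.1f} days")    # ~12
print(f"Non-history soil t1/2: {half_life_days(k_no_history):.1f} days") # ~50
```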
To determine whether probiotic prophylaxes reduce the odds of Clostridium difficile infection (CDI) in adults and children.
DESIGN
Individual participant data (IPD) meta-analysis of randomized controlled trials (RCTs), adjusting for risk factors.
METHODS
We searched 6 databases and 11 grey literature sources from inception to April 2016. We identified 32 RCTs (n=8,713); among them, 18 RCTs provided IPD (n=6,851 participants) comparing probiotic prophylaxis to placebo or no treatment (standard care). One reviewer prepared the IPD, and 2 reviewers extracted data, rated study quality, and graded evidence quality.
RESULTS
Probiotics reduced CDI odds in the unadjusted model (n=6,645; odds ratio [OR], 0.37; 95% confidence interval [CI], 0.25–0.55) and the adjusted model (n=5,074; OR, 0.35; 95% CI, 0.23–0.55). Using 2 or more antibiotics increased the odds of CDI (OR, 2.20; 95% CI, 1.11–4.37), whereas age, sex, hospitalization status, and high-risk antibiotic exposure did not. Adjusted subgroup analyses suggested that, compared with no probiotics, multispecies probiotics were more beneficial than single-species probiotics, as was using probiotics in clinical settings where the CDI risk is ≥5%. Of 18 studies, 14 reported adverse events; 11 of these 14 studies were retained in the adjusted model. Odds of serious adverse events were similar in both groups in the unadjusted analyses (n=4,990; OR, 1.06; 95% CI, 0.89–1.26) and adjusted analyses (n=4,718; OR, 1.06; 95% CI, 0.89–1.28). Missing outcome data for CDI ranged from 0% to 25.8%. Our analyses were robust to a sensitivity analysis for missingness.
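For readers unfamiliar with the arithmetic behind estimates such as OR 0.37 (95% CI, 0.25–0.55), the sketch below computes an unadjusted odds ratio and Wald confidence interval from a 2x2 table; the counts are hypothetical and chosen only for illustration.

```python
# Unadjusted odds ratio with a Wald 95% CI from a 2x2 table.
# Counts are hypothetical, not the meta-analysis data.
import math

cdi_prob, no_cdi_prob = 45, 3255   # probiotic arm: CDI cases, non-cases
cdi_ctrl, no_cdi_ctrl = 120, 3225  # control arm: CDI cases, non-cases

or_est = (cdi_prob / no_cdi_prob) / (cdi_ctrl / no_cdi_ctrl)
se_log_or = math.sqrt(1/cdi_prob + 1/no_cdi_prob + 1/cdi_ctrl + 1/no_cdi_ctrl)
ci_lo = math.exp(math.log(or_est) - 1.96 * se_log_or)
ci_hi = math.exp(math.log(or_est) + 1.96 * se_log_or)
print(f"OR = {or_est:.2f} (95% CI, {ci_lo:.2f}-{ci_hi:.2f})")  # ~0.37 (0.26-0.53)
```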
CONCLUSIONS
Moderate quality (ie, certainty) evidence suggests that probiotic prophylaxis may be a useful and safe CDI prevention strategy, particularly among participants taking 2 or more antibiotics and in hospital settings where the risk of CDI is ≥5%.
A field study was conducted during the 2014 and 2015 growing seasons in Arkansas, Indiana, Illinois, Missouri, Ohio, and Tennessee to determine the effect of cereal rye and either oats, radish, or annual ryegrass on the control of Amaranthus spp. when integrated with comprehensive herbicide programs in glyphosate-resistant and glufosinate-resistant soybean. Amaranthus species included redroot pigweed, waterhemp, and Palmer amaranth. The two herbicide programs were: a PRE residual herbicide followed by a POST application of foliar and residual herbicide (PRE/POST); or a PRE residual herbicide followed by a POST application of foliar and residual herbicide, followed by another POST application of residual herbicide (PRE/POST/POST). Control was not affected by the type of soybean resistance trait. At the end of the season, herbicides controlled 100 and 96% of the redroot pigweed and Palmer amaranth, respectively, versus 49 and 29% in the absence of herbicides, averaged over sites and other factors. The PRE/POST and PRE/POST/POST herbicide treatments controlled 83 and 90% of waterhemp at the end of the season, respectively, versus 14% without herbicide. Cover crop treatments affected control of waterhemp and Palmer amaranth and soybean yield only in the absence of herbicides. The rye cover crop consistently reduced Amaranthus spp. density in the absence of herbicides compared with the no-cover treatment.
A field study was conducted in 2014 and 2015 in Arkansas, Illinois, Indiana, Ohio, Tennessee, Wisconsin, and Missouri to determine the effects of tillage system and herbicide program on season-long emergence of Amaranthus species in glufosinate-resistant soybean. The tillage systems evaluated were deep tillage (fall moldboard plow followed by (fb) one pass with a field cultivator in the spring), conventional tillage (fall chisel plow fb one pass with a field cultivator in the spring), minimum tillage (one pass of a vertical tillage tool in the spring), and no-tillage (PRE application of paraquat). Each tillage system also received one of two herbicide programs: a PRE application of flumioxazin (0.09 kg ai ha–1) fb a POST application of glufosinate (0.59 kg ai ha−1) plus S-metolachlor (1.39 kg ai ha–1), or POST-only applications of glufosinate (0.59 kg ai ha−1). The deep tillage system resulted in a 62, 67, and 73% reduction in Amaranthus emergence compared with the conventional, minimum, and no-tillage systems, respectively. The residual herbicide program also resulted in an 87% reduction in Amaranthus species emergence compared with the POST-only program. Deep tillage combined with the residual program resulted in a 97% reduction in Amaranthus species emergence compared with minimum tillage combined with the POST-only program, which had the highest Amaranthus emergence. Soil cores taken before planting and herbicide application revealed that only 28% of the Amaranthus seed in the deep tillage system was within the top 5 cm of the soil profile, compared with 79, 81, and 77% in the conventional, minimum, and no-tillage systems, respectively. Overall, the use of deep tillage with a residual herbicide program provided the greatest reduction in Amaranthus species emergence, providing a useful tool for managing herbicide-resistant Amaranthus species where appropriate.