In 2015, the Victorian Salt Reduction Partnership launched a 4-year multifaceted salt reduction intervention designed to reduce salt intake by 1 g/d in children and adults living in Victoria, Australia. Child-relevant intervention strategies included a consumer awareness campaign targeting parents and food industry engagement seeking to reduce salt levels in processed foods. This study aimed to assess trends in salt intake, dietary sources of salt and discretionary salt use in primary schoolchildren pre- and post-delivery of the intervention.
Repeated cross-sectional surveys were completed at baseline (2010–2013) and follow-up (2018–2019). Salt intake was measured via 24-h urinary Na excretion, discretionary salt use behaviours by self-report and sources of salt by 24-h dietary recall. Data were analysed with multivariable-adjusted regression models.
Children aged 4–12 years
Complete 24-h urine samples were collected from 666 children at baseline and 161 at follow-up. Mean salt intake remained unchanged from baseline (6·0; se 0·1 g/d) to follow-up (6·1; 0·4 g/d) (P = 0·36), there were no clear differences in the food sources of salt, and at both time points approximately 70 % of children exceeded Na intake recommendations. At follow-up, 14 % more parents (P = 0·001) reported adding salt during cooking, but child use of table salt and inclusion of a saltshaker on the table remained unchanged.
These findings show no beneficial effect of the Victorian Salt Reduction Partnership intervention on children’s salt intake. More intensive, sustained and coordinated efforts between state and federal stakeholders are required.
There are various models for supporting students with disability and their teachers in mainstream schools. In New South Wales, each school has a learning and support teacher allocation and the New South Wales Department of Education recommends each school have a learning support team. This paper draws on in-depth interviews with school staff from 22 schools, including 16 learning and support teachers, 20 class teachers, 25 school executives and other stakeholders. We report here on the role of learning and support teachers and learning support teams in planning, implementing and evaluating adjustments and on the operation of learning support teams. Qualitative analysis of the interview transcripts revealed two kinds of learning support teams: those that focus on a particular student and those that oversee the education and resource provision for all students with disability in a school. Some teams had more of a focus on administration and resourcing, while others dealt more with educational adjustments. Similarly, some learning and support teachers were more involved in administrative and liaison roles, while others were more active in supporting teachers and providing services directly to students. The most detailed descriptions of support were provided by learning and support teachers with special education qualifications.
Self-determination skills, including competencies such as decision-making, are regarded by parents and teachers as important for students with special needs. Although not necessarily regarded as appropriate, teaching assistants often take substantial responsibility for delivering educational programs to students and little is known about their perspectives on self-determination. Perspectives of teaching assistants may impact on their support of programs to enhance self-determination that are developed by teachers. Teaching assistants in New South Wales mainstream schools (N = 320) were surveyed regarding their views on the importance and frequency of instruction of seven competencies related to self-determination of students with special needs. Consistent with previous research, assistants rated all the competencies highly in terms of importance, but frequency of implementation was more variable. Moderate correlations were found between ratings of importance and frequency of implementation, suggesting that greater instructional time was devoted to competencies viewed as more important. Limited differences were found between assistants working at primary and secondary levels. Although features of the interactions of teaching assistants that can inhibit self-determination have been often identified in previous research, it is argued that, paradoxically, assistants may be well positioned to facilitate the development of self-determination with appropriate training and supervision. Directions for future research are identified.
Plans for allocation of scarce life-sustaining resources during the coronavirus disease 2019 (COVID-19) pandemic often include triage teams, but operational details are lacking, including what patient information is needed to make triage decisions.
A Delphi study among Washington state disaster preparedness experts was performed to develop a list of patient information items needed for triage team decision-making during the COVID-19 pandemic. Experts proposed and rated their agreement with candidate information items during asynchronous Delphi rounds. Consensus was defined as ≥80% agreement. Qualitative analysis was used to describe considerations arising in this deliberation. A timed simulation was performed to evaluate feasibility of data collection from the electronic health record.
Over 3 asynchronous Delphi rounds, 50 experts reached consensus on 24 patient information items, including patients’ age, severe or end-stage comorbidities, the reason for and timing of admission, measures of acute respiratory failure, and clinical trajectory. Experts weighed complex considerations around how information items could support effective prognostication, consistency, accuracy, minimizing bias, and operationalizability of the triage process. Data collection took a median of 227 seconds (interquartile range = 205, 298) per patient.
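The consensus rule described above (≥ 80 % agreement) is simple to operationalise. The sketch below is illustrative only, with made-up item names and vote counts, not data from the study:

```python
# Sketch of the consensus rule described above: an item reaches
# consensus when >= 80% of expert ratings agree. Item names and
# ratings are illustrative, not from the study.
CONSENSUS_THRESHOLD = 0.80

def reaches_consensus(ratings):
    """ratings: list of booleans (True = expert agrees item is needed)."""
    return sum(ratings) / len(ratings) >= CONSENSUS_THRESHOLD

votes = {
    "patient_age": [True] * 45 + [False] * 5,        # 90% agree
    "insurance_status": [True] * 20 + [False] * 30,  # 40% agree
}
retained = [item for item, r in votes.items() if reaches_consensus(r)]
print(retained)  # ['patient_age']
```

With 50 raters, as in the study, the threshold corresponds to at least 40 experts agreeing on an item.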
Experts achieved consensus on patient information items that were necessary and appropriate for informing triage teams during the COVID-19 pandemic.
Early “ptychoparioid” trilobites are likely to play a crucial role in determining the currently unresolved phylogenetic relationships among the numerous Cambrian libristomate clades. However, such phylogenetic analyses are hindered by the fact that many species and genera of early “ptychoparioids” are known from very limited material and are typically defined only on cranidial features. Herein, we report new “ptychoparioid” assemblages from the Saline Valley Tongue, Harkless Formation at Clayton Ridge, Nevada. These middle to upper Dyeran Stage (Cambrian Series 2, Stage 4) assemblages contain well-preserved early “ptychoparioids” that are represented by articulated exoskeletons as well as isolated cranidia, rostral plates, librigenae, thoracic segments, and pygidia, which together provide morphological details for the complete or nearly complete exoskeleton of three new genera and four new species: Anebocephalus silverpeakensis n. gen. n. sp.; Coenoides scholteni n. gen. n. sp.; Harklessaspis rasettii n. gen. n. sp.; and H. parvigranulosus n. gen. n. sp. Pending a formal phylogenetic analysis, it is deemed preferable to assign these taxa to new genera rather than to shoehorn them into potentially ill-diagnosed existing genera. These new genera are compared to the Cambrian Series 2 taxa Crassifimbra and Eokochaspis, and the Miaolingian Series taxon Elrathina, which are similar in morphology and known from sclerites other than just cranidia.
Also documented in this study are the “ptychoparioids” Crassifimbra walcotti, Cr.? sp. A, Cr. sp. B, and “ptychoparioid” spp. A, B, C, and D from the Saline Valley Tongue and the overlying Mule Spring Limestone and lowermost Emigrant Formation.
The current study was conducted to examine the types of adjustments used to support students with special educational needs in mainstream classrooms and how schools monitored the effectiveness of the adjustments they use. A range of stakeholders were interviewed in 22 mainstream schools across New South Wales, Australia, and the interviews were analysed for key themes. Some schools had a narrow focus on a few key areas, with teaching assistants being the most commonly reported adjustment. Few schools used formal formative monitoring to evaluate the effectiveness of adjustments. Options schools could consider for improvement include examining the breadth of adjustments, establishing clear measurable goals, considering alternative strategies for the use of teaching assistants, and ensuring adjustments are monitored.
This study aimed to investigate the association between individual and combinations of macronutrients with premature death, CVD and dementia. Sex differences were investigated. Data were utilised from a prospective cohort of 120 963 individuals (57 % women) within the UK Biobank, who completed ≥ two 24-h diet recalls. The associations of macronutrients, as percentages of total energy intake, with outcomes were investigated. Combinations of macronutrients were defined using k-means cluster analysis, with clusters explored in association with outcomes. There was a higher risk of death with high carbohydrate intake (hazard ratio (HR), 95 % CI upper v. lowest third 1·13 (1·03, 1·23)), yet a lower risk with higher intakes of protein (upper v. lowest third 0·82 (0·76, 0·89)). There was a lower risk of CVD with moderate intakes (middle v. lowest third) of energy and protein (subdistribution HR (SHR), 0·87 (0·79, 0·97) and 0·87 (0·79, 0·96), respectively). There was a lower risk of dementia with moderate energy intake (SHR 0·71 (0·52, 0·96)). Sex differences were identified. The dietary cluster characterised by low carbohydrate, low fat and high protein was associated with a lower risk of death (HR 0·84 (0·76, 0·93)) compared with the reference cluster and a lower risk of CVD for men (SHR 0·83 (0·71, 0·97)). Given that associations with death, and with CVD in men, were evident both for single macronutrients and for combinations of macronutrients, we suggest that the greatest benefit from diet-related policy and interventions will come from targeting combinations of macronutrients.
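The clustering step described above can be sketched briefly. This is an illustrative toy example (not the study's code or data): four invented intake profiles, expressed as percentages of energy from carbohydrate, fat and protein, are grouped by k-means:

```python
# Illustrative sketch of defining dietary patterns by k-means
# clustering of macronutrient shares of energy. All values are
# invented for demonstration; this is not the study's analysis.
import numpy as np
from sklearn.cluster import KMeans

# Columns: % energy from carbohydrate, fat, protein
intakes = np.array([
    [55.0, 30.0, 15.0],   # higher-carbohydrate pattern
    [57.0, 28.0, 15.0],
    [38.0, 32.0, 30.0],   # low-carbohydrate, high-protein pattern
    [36.0, 33.0, 31.0],
])

labels = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(intakes)
# The two high-carbohydrate rows fall in one cluster and the two
# low-carbohydrate/high-protein rows in the other.
print(labels[0] == labels[1], labels[2] == labels[3], labels[0] != labels[2])
```

In the study itself, each cluster is then carried forward as an exposure category in the survival models, with one cluster serving as the reference.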
Most Cambrian Series 2 faunas of Laurentia are dominated by olenelline trilobites; however, non-olenelline trilobites occur with the olenellines and sometimes dominate the assemblages. Reported here are such non-olenelline trilobites from the Harkless Formation and Mule Spring Limestone at Clayton Ridge, Nevada. At the bottom of the Saline Valley Tongue, Harkless Formation, are two assemblages that are characterized by corynexochines and/or ptychoparioids, with olenellines occurring as only rare components. The corynexochines present in these assemblages include Bonnia cf. B. brennus (Walcott, 1916), Ovatoryctocara cf. O. yaxiensis Yuan et al., 2009, Protoryctocephalus? aff. P.? arcticus Geyer and Peel, 2011, Ogygopsis sp. indet., and Oryctocephalops frischenfeldi Lermontova, 1940. These assemblages are from the mid-Dyeran Stage, below the lowermost zone in the upper Dyeran (Arcuolenellus arcuatus Biozone), and can be correlated to Series 2 Stage 4 (Cambrian) assemblages in Greenland, Siberia, and South China based on the corynexochines.
Also in the Saline Valley Tongue and the overlying Mule Spring Limestone and lowermost Emigrant Formation are olenelloid-dominated assemblages that contain the corynexochines Bonnia columbensis Resser, 1936, Zacanthopsis aff. Z. levis (Walcott, 1886), and Z. sp. indet.
In order to maximize the utility of future studies of trilobite ontogeny, we propose a set of standard practices that relate to the collection, nomenclature, description, depiction, and interpretation of ontogenetic series inferred from articulated specimens belonging to individual species. In some cases, these suggestions may also apply to ontogenetic studies of other fossilized taxa.
In Victoria, Australia, a statewide salt reduction partnership was launched in 2015. The aim was to measure Na intake, food sources of Na (level of processing, purchase origin) and discretionary salt use in a cross-section of Victorian adults prior to a salt reduction initiative. In 2016/2017, participants completed a 24-h urine collection (n 338) and a subsample completed a 24-h dietary recall (n 142). Participants were aged 41·2 (sd 13·9) years, and 56 % were females. Mean 24-h urinary excretion was 138 (95 % CI 127, 149) mmol/d for Na. Salt equivalent was 8·1 (95 % CI 7·4, 8·7) g/d, equating to about 8·9 (95 % CI 8·1, 9·6) g/d after 10 % adjustment for non-urinary losses. Mean 24-h intake estimated by diet recall was 118 (95 % CI 103, 133) mmol/d for Na (salt 6·9 (95 % CI 6·0, 7·8) g/d). Leading dietary sources of Na were cereal-based mixed dishes (12 %), English muffins, flat/savoury/sweet breads (9 %), regular breads/rolls (9 %), gravies and savoury sauces (7 %) and processed meats (7 %). Over one-third (38 %) of Na consumed was derived from discretionary foods. Half of all Na consumed came from ultra-processed foods. Dietary Na derived from foods was obtained from retail stores (51 %), restaurants and fast-food/takeaway outlets (28 %) and fresh food markets (9 %). One-third (32 %) of participants reported adding salt at the table and 61 % added salt whilst cooking. This study revealed that salt intake was above recommended levels, with diverse sources of intake. Results suggest that a multifaceted salt reduction strategy focusing on the retail sector and on food reformulation would most likely benefit Victorians; these findings have been used to inform the ongoing statewide salt reduction initiative.
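The conversion behind the figures above is routine arithmetic: urinary Na in mmol/d is converted to salt (NaCl) equivalents via the molar mass of NaCl (58.44 g/mol), then scaled up by ~10 % for non-urinary losses. A minimal sketch, assuming the 10 % adjustment is applied multiplicatively (an inference from the reported figures, not stated in the abstract):

```python
# Worked conversion: urinary Na (mmol/d) -> salt (NaCl) equivalent (g/d),
# then a ~10% upward adjustment for non-urinary Na losses. The
# multiplicative form of the adjustment is inferred, not stated.
NACL_G_PER_MMOL = 58.44 / 1000  # g of NaCl per mmol of Na

def salt_g_per_day(na_mmol_per_day, nonurinary_fraction=0.10):
    excreted = na_mmol_per_day * NACL_G_PER_MMOL
    adjusted = excreted * (1 + nonurinary_fraction)
    return excreted, adjusted

excreted, adjusted = salt_g_per_day(138)
print(round(excreted, 1), round(adjusted, 1))  # 8.1 8.9, as reported above
```

Applying the same conversion to the diet-recall mean of 118 mmol/d reproduces the reported 6·9 g/d salt figure.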
Oryctocephalid trilobites are seldom abundant and often tectonically deformed, creating problems for robust species delimitation and compromising their utility in biostratigraphic and evolutionary studies. By studying more than 140 specimens recovered from the upper portion of the Combined Metals Member (Pioche Formation, Nevada; Cambrian Stage 4, Series 2), we exploit a rare opportunity to explore how morphological variation among oryctocephalid specimens is partitioned into intraspecific variation versus interspecific disparity. Qualitative and quantitative analyses reveal that two species are represented: Oryctocephalites palmeri Sundberg and McCollum, 1997 and Oryctocephalites sp. A, the latter known from a single cranidium stratigraphically below all occurrences of the former. In contrast to the conclusions of a previous study, there is no evidence of cranidial dimorphism in O. palmeri. However, that species exhibits considerable variation in cranidial shape and pygidial spine arrangement and number. Cranidial shape variation within O. palmeri is approximately one-half of the among-species disparity within the genus. Comparison of cranidial shape between noncompacted and compacted samples reveals that compaction causes significant change in mean shape and an increase in shape variation; such changes are interpretable in terms of observed fracture patterns. Nontaphonomic variation is partitioned into ontogenetic and nonallometric components. Those components share similar structure with each other and with interspecific disparity, suggesting that ontogenetic shape change might be an important source of variation available for selection. This highlights the importance of ontogenetic and taphonomic sources of variation with respect to species delimitation, morphospace occupation, and investigation of evolutionary patterns and processes.
The Ediacaran to lower Cambrian Chilhowee Group of the southern and central Appalachians records the rift-to-drift transition of the newly formed Iapetan margin of Laurentia. Body fossils are rare within the Chilhowee Group, and correlations are based almost exclusively on lithological similarities. A critical review of previous work highlights the relatively weak biostratigraphic and radiometric age constraints on the various units within the succession. Herein, we document a newly discovered fossil-bearing locality within the Murray Shale (upper Chilhowee Group) on Chilhowee Mountain, eastern Tennessee, and formally describe a nevadioid trilobite, Buenellus chilhoweensis n. sp., from that site. This trilobite indicates that the Murray Shale is of Montezuman age (provisional Cambrian Stage 3), which is older than the Dyeran (provisional late Stage 3 to early Stage 4) age suggested by the historical (mis)identification of “Olenellus sp.” from within the unit as reported by workers more than a century ago. Buenellus chilhoweensis n. sp. represents only the second known species of Buenellus, and demonstrates that the genus occupied both the Innuitian and Iapetan margins of Laurentia during the Montezuman. It is the oldest known trilobite from the Iapetan margin, and proves that the hitherto apparent absence of trilobites from that margin during the Montezuman was an artifact of inadequate sampling rather than a paleobiogeographic curiosity. The species offers a valuable biostratigraphic calibration point within a rock succession that has otherwise proven recalcitrant to refined dating.
To assess if there is a difference in salt intake (24 h urine collection and dietary recall) and dietary sources of salt (Na) on weekdays and weekend days.
A cross-sectional study of adults who provided one 24 h urine collection and one telephone-administered 24 h dietary recall.
Community-dwelling adults living in the State of Victoria, Australia.
Adults (n 598) who participated in a health survey (53·5 % women; mean age 57·1 (95 % CI 56·2, 58·1) years).
Mean (95 % CI) salt intake (dietary recall) was 6·8 (6·6, 7·1) g/d and 24 h urinary salt excretion was 8·1 (7·8, 8·3) g/d. Mean dietary and 24 h urinary salt (age-adjusted) were 0·9 (0·1, 1·6) g/d (P=0·024) and 0·8 (0·3, 1·6) g/d (P=0·0017), respectively, higher at weekends compared with weekdays. There was an indication of a greater energy intake at weekends (+0·6 (0·02, 1·2) MJ/d, P=0·06), but no difference in Na density (weekday: 291 (279, 304) mg/MJ; weekend: 304 (281, 327) mg/MJ; P=0·360). Cereals/cereal products and dishes, meat, poultry, milk products and gravy/sauces accounted for 71 % of dietary Na.
Mean salt intake (24 h urine collection) was more than 60 % above the recommended level of 5 g salt/d and 8–14 % more salt was consumed at weekends than on weekdays. Substantial reductions in the Na content of staple foods, processed meat, sauces, mixed dishes (e.g. pasta), convenience and takeaway foods are required to achieve a significant consistent reduction in population salt intake throughout the week.
The decrease in quality of Australian iron ore, coupled with the demand for more efficient energy use, means that closer monitoring and optimisation of process conditions for iron ore sinter production is required. Here, the suitability of using partial least-squares regression analysis of powder X-ray diffraction data, collected for iron ore sinter samples, for the prediction of iron ore sinter strength has been further assessed. In addition, a preliminary assessment of the effect of 2θ range on the quality of prediction has been made. For the purposes of process control, the level of correlation between predicted and actual sinter strength would inform an operator whether the process was operating within acceptable limits or whether there was a potential problem requiring further investigation or rapid intervention. Reducing the 2θ range was found to reduce the level of correlation between predicted and actual strength, to a point where the particular analysis may no longer be suitable for process control.
The electrochemical behaviour of a number of Pb-based anode alloys, under simulated electrowinning conditions, in a 1.6 M H2SO4 electrolyte at 45 °C was studied. Specifically, the evolution of PbO2 and PbSO4 surface layers was investigated by quantitative in situ synchrotron X-ray diffraction (S-XRD) and subsequent Rietveld-based quantitative phase analysis (QPA). In the context of seeking new anode alloys, this research shows that the industry standard Pb-0.08Ca-1.52Sn (wt%) anode, when exposed to a galvanostatic current and intermittent power interruptions, exhibited poor electrochemical performance relative to select custom Pb-based binary alloys: Pb–0.73Mg, Pb–5.05Ag, Pb–0.07Rh, and Pb–1.4Zn (wt%). The in situ S-XRD measurements and subsequent QPA indicated that this was linked to a lower proportion of β-PbO2, relative to PbSO4, on the Pb-0.08Ca-1.52Sn alloy at all stages of the electrochemical cycling. The best performing alloy, in terms of minimising overpotential during normal electrowinning operation and minimising the deleterious effects of repeated power interruptions – both of which are significant factors in energy consumption – was determined to be Pb–0.07Rh.
The thermal decomposition of mill scale, and the effect of mill scale addition on the formation and decomposition of Silico-Ferrite of Calcium and Aluminium (SFCA) and SFCA-I iron ore sinter bonding phases, has been investigated using in situ X-ray diffraction. Application of the external standard method of quantitative phase analysis to the in situ data collected during decomposition of the mill scale highlighted the applicability of this method for determining the nature and abundance of amorphous material in a mineral sample. Increasing mill scale addition from 2.6 to 10.6 and to 21.2 wt% in an otherwise synthetic sinter mixture composition designed to form SFCA did not significantly affect the thermal stability ranges of SFCA-I or SFCA, nor did it significantly affect the amount of SFCA or SFCA-I that formed. This was attributed to the low impurity (i.e. Mn, Mg) concentration in the mill scale, and also to the transformation to hematite, during heating, of the wüstite and magnetite present in the mill scale, with the hematite available for reaction to form SFCA and SFCA-I.
Field experiments were established at Columbus and near South Charleston, OH to determine the effects of giant ragweed population density on soybean yield and to characterize the development of giant ragweed grown in 76-cm soybean rows. An economic threshold was calculated for Ohio using a common treatment for giant ragweed control in soybean. A cost of $41/ha was estimated for a farmer to apply 0.56 kg/ha bentazon plus 0.28 kg/ha fomesafen plus COC (1.25% v/v). Assuming a soybean value of $0.22/kg, the cost of control was equivalent to 5.4 and 7.1% of the soybean yield in 1991 and 1992, respectively, which corresponded to the yield loss caused by 0.08 and 0.03 giant ragweed plants/m2. The competitiveness of giant ragweed can be at least partly attributed to its ability to initiate and maintain axillary leaves and branches within the shaded confines of the soybean canopy.
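The economic-threshold arithmetic described above is straightforward: treatment pays for itself once the weed-induced yield loss exceeds the herbicide cost divided by the crop value. A minimal sketch using the costs and price from the study; the weed-free yield figure is hypothetical:

```python
# Economic-threshold arithmetic: control is justified when yield loss
# exceeds treatment cost / crop value. Cost and price are from the
# abstract; the yield figure is hypothetical.
COST_PER_HA = 41.0    # $/ha: bentazon + fomesafen + COC
CROP_VALUE = 0.22     # $/kg soybean

break_even_loss_kg = COST_PER_HA / CROP_VALUE   # kg/ha of soybean yield
print(round(break_even_loss_kg, 1))             # 186.4 kg/ha

yield_kg_ha = 3000.0  # hypothetical weed-free yield
print(round(100 * break_even_loss_kg / yield_kg_ha, 1))  # 6.2 % of yield
```

The giant ragweed density at which expected loss reaches that percentage of yield is then the economic threshold, which the study reports as 0.08 and 0.03 plants/m2 for the two years.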
The objectives of this study were to determine how the timing of weed management treatments in winter wheat stubble affects weed control the following season and to determine if spring herbicide rates in corn can be reduced with appropriately timed stubble management practices. Field studies were conducted at two sites in Ohio between 1993 and 1995. Wheat stubble treatments consisted of glyphosate (0.84 kg ae/ha) plus 2,4-D (0.48 kg ae/ha) applied in July, August, or September, or at all three timings, and a nontreated control. In the following season, spring herbicide treatments consisted of a full rate of atrazine (1.7 kg ai/ha) plus alachlor (2.8 kg ai/ha) preemergence, a half rate of these herbicides, or no spring herbicide treatment. Across all locations, a postharvest treatment of glyphosate plus 2,4-D followed by alachlor plus atrazine at half or full rates in the spring controlled all broadleaf weeds, except giant ragweed, by at least 88%. Giant foxtail control at three locations was at least 83% when a postharvest glyphosate plus 2,4-D treatment was followed by spring applications of alachlor plus atrazine at half or full rates. Weed control in treatments without alachlor plus atrazine was variable, although broadleaf control from July and August glyphosate plus 2,4-D applications was greater than from September applications. Where alachlor and atrazine were not applied, August was generally the best timing of herbicide applications to wheat stubble for reducing weed populations the following season.
To update the estimate of mean salt intake for the Australian population made by the Australian Health Survey (AHS).
A secondary analysis of the data collected in a cross-sectional survey was conducted. Estimates of salt intake were made in Lithgow using the 24 h diet recall methodology employed by the AHS as well as using 24 h urine collections. The data from the Lithgow sample were age- and sex-weighted, to provide estimates of daily salt intake for the Australian population based upon (i) the diet recall data and (ii) the 24 h urine samples.
Lithgow, New South Wales, Australia.
Individuals aged ≥20 years residing in Lithgow and listed on the 2009 federal electoral roll.
Mean (95 % CI) salt intake estimated from the 24 h diet recalls was 6·4 (6·2, 6·7) g/d for the Lithgow population compared with a corresponding figure of 6·2 g/d for the Australian population derived from the AHS. The corresponding estimate of salt intake for Lithgow adults based upon the 24 h urine collections was 9·0 (8·6, 9·4) g/d. When the age- and sex-specific estimates of salt intake obtained from the 24 h urine collections in the Lithgow sample were weighted using Australian census data, estimated salt intake for the Australian population was 9·0 (8·6, 9·5) g/d. Further adjustment for non-urinary Na excretion brought the best estimate of daily salt intake for both Lithgow and Australia to about 9·9 g/d.
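The age- and sex-weighting step described above is a direct-standardisation calculation: stratum-specific sample means are combined using census population shares as weights. A minimal sketch with wholly illustrative numbers (the study's stratum means and census shares are not reproduced here):

```python
# Sketch of age- and sex-weighting sample means to a national
# estimate. Stratum means and census shares below are illustrative
# only, not the study's values.
strata = {
    # (sex, age band): (sample mean salt g/d, census population share)
    ("M", "20-44"): (10.2, 0.22),
    ("M", "45+"):   (9.4,  0.27),
    ("F", "20-44"): (8.1,  0.23),
    ("F", "45+"):   (7.6,  0.28),
}

weighted_mean = sum(mean * share for mean, share in strata.values())
total_share = sum(share for _, share in strata.values())
print(round(weighted_mean / total_share, 1))  # 8.8
```

Dividing by the summed shares guards against weights that do not add exactly to 1 after rounding.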
The dietary recall method used by the AHS likely substantially underestimated mean population salt consumption in Australia.