The EAT-Lancet Commission on Food, Planet, Health promulgated a universal reference diet. Subsequently, researchers constructed an EAT-Lancet diet score (0–14 points), with lower-bound intake values for various dietary components set at 0 g/d, and reported inverse associations with risks of major health outcomes in a high-income population. We assessed associations between EAT-Lancet diet scores, without or with (>0 g/d) minimum intake values, and the Mean Probability of Micronutrient Adequacy (MPA) in food- and nutrition-insecure women of reproductive age (WRA) from low- and middle-income countries (LMICs). We analysed single 24-h diet recall data (n=1,950) from studies in rural Democratic Republic of the Congo, Ecuador, Kenya, Sri Lanka, and Vietnam. Associations between EAT-Lancet diet scores and MPA were assessed by fitting linear mixed-effects models with random intercept and slope. EAT-Lancet diet scores (mean ± SD) were 8.8 ± 1.3 and 1.9 ± 1.1 without or with minimum intake values, respectively. Furthermore, pooled MPA was 0.58 ± 0.22 and total energy intake was 2521 ± 1100 kcal/d. A one-point increase in the EAT-Lancet diet score without minimum intake values was associated with a 2.6 ± 0.7 percentage-point decrease in MPA (P<0.001). In contrast, a one-point increase in the EAT-Lancet diet score with minimum intake values was associated with a 2.4 ± 1.3 percentage-point increase in MPA (P=0.07). Further analysis indicated positive associations between EAT-Lancet diet scores and MPA adjusted for total energy intake (P<0.05). Our findings indicate that the EAT-Lancet diet score requires minimum intake values for nutrient-dense dietary components to avoid positively scoring non-consumption of food groups and subsequently predicting lower MPA of diets, when applied to rural WRA in LMICs.
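To make the scoring contrast above concrete, the minimal sketch below shows how a binary diet score behaves with and without a >0 g/d minimum intake rule for nutrient-dense components. The component names, reference ranges, and intakes are illustrative placeholders, not the scoring rules used in the study.

```python
# Illustrative sketch of a binary EAT-Lancet-style diet score.
# Component names and reference ranges (g/d) are placeholders, not the study's cut-offs.

# Nutrient-dense components: reference range [0, upper]; with a 0 g/d lower bound,
# non-consumption still earns a point unless a minimum intake is enforced.
NUTRIENT_DENSE = {"whole_grains": 232.0, "vegetables": 600.0,
                  "fruits": 300.0, "legumes": 100.0, "nuts": 75.0}
# 'Limit' components: a point is earned when intake stays at or below the cap.
LIMIT = {"red_meat": 28.0, "added_sugar": 31.0}

def diet_score(intakes, require_consumption=False):
    """Score a diet; require_consumption enforces a >0 g/d minimum for nutrient-dense groups."""
    score = 0
    for group, upper in NUTRIENT_DENSE.items():
        amount = intakes.get(group, 0.0)
        if require_consumption and amount <= 0.0:
            continue                      # non-consumption earns no point
        if amount <= upper:
            score += 1                    # within the [0, upper] reference range
    for group, cap in LIMIT.items():
        if intakes.get(group, 0.0) <= cap:
            score += 1
    return score

diet = {"vegetables": 150.0, "red_meat": 0.0, "added_sugar": 10.0}
print(diet_score(diet))                            # unconsumed nutrient-dense groups still score
print(diet_score(diet, require_consumption=True))  # only consumed nutrient-dense groups score
```

Run on the example diet, the unmodified score rewards every food group that was not eaten at all, which is the behaviour the abstract argues against.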
Introduction: A critical component for successful implementation of any innovation is an organization's readiness for change. Competence by Design (CBD) is the Royal College's major change initiative to reform the training of medical specialists in Canada. The purpose of this study was to measure readiness to implement CBD among the 2019 launch disciplines. Methods: An online survey was distributed to program directors of the 2019 CBD launch disciplines one month prior to implementation. Questions were developed based on the R = MC² framework for organizational readiness. They addressed program motivation to implement CBD, general capacity for change, and innovation-specific capacity. Questions related to motivation and general capacity were scored using a 5-point scale of agreement. Innovation-specific capacity was measured by asking participants whether they had completed 33 key pre-implementation tasks (yes/no) in preparation for CBD. Bivariate correlations were computed to examine the relationships among motivation, general capacity and innovation-specific capacity. Results: The survey response rate was 42% (n = 79). Positive correlations were found between all three domains of readiness (motivation and general capacity, r = 0.73, p < 0.01; motivation and innovation-specific capacity, r = 0.52, p < 0.01; general capacity and innovation-specific capacity, r = 0.47, p < 0.01). Most respondents agreed that successful launch of CBD was a priority (74%). Fewer felt that CBD was a move in the right direction (58%) and that implementation was a manageable change (53%). While most programs indicated that their leadership (94%) and faculty and residents (87%) were supportive of change, 42% did not have experience implementing large-scale innovation and 43% indicated concerns about adequate support staff. Programs had completed an average of 72% of pre-implementation tasks. No difference was found between disciplines (p = 0.11). Activities related to curriculum mapping, competence committees and programmatic assessment had been completed by >90% of programs, while <50% of programs had engaged off-service rotations. Conclusion: Measuring readiness for change aids in the identification of factors that promote or inhibit successful implementation. These results highlight several areas where programs struggle in preparation for CBD launch. Emergency medicine training programs can use these data to target additional implementation support and ongoing faculty development initiatives.
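As a small illustration of the correlational analysis described above, the sketch below computes pairwise Pearson correlations between the three readiness domains. The domain names come from the abstract; the scores themselves are fabricated for demonstration only.

```python
# Minimal sketch of the bivariate correlations between readiness domains.
# Domain scores below are fabricated for illustration only.
import pandas as pd
from scipy.stats import pearsonr

responses = pd.DataFrame({
    "motivation":          [4.2, 3.8, 4.5, 2.9, 3.6],   # mean of 5-point agreement items
    "general_capacity":    [4.0, 3.5, 4.4, 3.0, 3.3],
    "innovation_capacity": [0.85, 0.60, 0.90, 0.55, 0.70],  # share of 33 tasks completed
})

for a, b in [("motivation", "general_capacity"),
             ("motivation", "innovation_capacity"),
             ("general_capacity", "innovation_capacity")]:
    r, p = pearsonr(responses[a], responses[b])
    print(f"{a} vs {b}: r = {r:.2f}, p = {p:.3f}")
```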
US Latinos have higher rates of substance use disorders (SUDs) than Latinas, but Latinas face substantial barriers to treatment and tend to enter care with higher SUD severity. Immigrant Latinas may face greater barriers to care than native-born despite lower overall SUD prevalence. This study aimed to identify how SUD treatment needs of Latinos are addressed depending on patient gender and immigrant status within an urban healthcare system serving a diverse population.
Data from electronic health records of adult Latino/a primary care patients (n = 29,887 person-years) were used to identify rates of SUD treatment in primary and specialty care. Treatment characteristics and receipt of adequate care were compared by gender and immigrant status.
Tobacco was the most frequently treated substance followed by alcohol and other drugs. Forty-six percent of SUD patients had a comorbid psychiatric condition. Treatment rates ranged from 2.52% (female non-immigrants) to 8.38% (male immigrants). Women had lower treatment rates than men, but male and female immigrants had significantly higher treatment rates than their non-immigrant counterparts. Receipt of minimally adequate outpatient care varied significantly by gender and immigrant status (female non-immigrants 12.5%, immigrants 28.57%; male non-immigrants 13.46%, immigrants 17.09%) in unadjusted and adjusted analyses.
Results indicate overall low prevalence of SUD treatment in the healthcare system. Low rates of minimally adequate care evidence the challenge of delivering integrated behavioral healthcare for Latinos with SUD. Results also demonstrate gender and immigrant status disparities in an unexpected direction, with immigrant women receiving the highest rates of adequate care.
A major threat to the Scotch whisky industry is the sale of counterfeit single malt whiskies with purported distillation years in the 19th and early- to mid-20th centuries. However, these are often much more recent spirits, distilled in the latter part of the 20th or first part of the 21st centuries. These sales impinge upon the reputation of auction houses, retailers, brand owners and distillers. The atmospheric testing of nuclear weapons in the 1950s and early 1960s has enabled a precise calibration curve to be created; however, there are several reasons why this may not be appropriate for establishing the year of whisky distillation. We have created a 14C calibration curve derived from known-age single malt whiskies for the period 1950–2015 that enables the distillation year of whisky distilled from 1955 onwards to be determined to within 1–3 years for certain periods. However, because of the shape of the curve, two possible age ranges often result. The correct range can often be determined from a further plot of δ13C values against distillation year, which shows a trend of decreasing values through time. Several examples are given of the determination of both genuine and fake products.
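A minimal sketch of how such a curve could be used in practice: match a measured F14C value against a whisky-derived calibration curve and report every contiguous range of distillation years consistent with it. The curve values below are invented placeholders standing in for the published curve, and the 1-sigma matching rule is an assumption.

```python
# Sketch of matching a measured F14C value to a whisky-derived calibration curve.
import numpy as np

# Placeholder curve: F14C of known-age single malts vs distillation year.
# The shape crudely mimics the bomb pulse; real values come from the published curve.
cal_years = np.arange(1955, 2016)
cal_f14c = np.interp(cal_years, [1955, 1964, 1980, 2015], [1.00, 1.90, 1.25, 1.02])

def candidate_year_ranges(f14c_measured, f14c_error):
    """Return contiguous ranges of distillation years consistent with the measurement."""
    within = np.abs(cal_f14c - f14c_measured) <= f14c_error
    ranges, start = [], None
    for year, ok in zip(cal_years, within):
        if ok and start is None:
            start = int(year)
        elif not ok and start is not None:
            ranges.append((start, int(year) - 1))
            start = None
    if start is not None:
        ranges.append((start, int(cal_years[-1])))
    return ranges

# A value on both the rising and falling limbs of the pulse gives two candidate ranges;
# the delta-13C trend can then help decide which range is correct.
print(candidate_year_ranges(1.30, 0.02))
```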
The bovine appeasing substance (BAS) is expected to have calming effects in cattle experiencing stressful situations. Therefore, this study investigated the impacts of BAS administration during two of the most stressful events within beef production systems: weaning and feedlot entry. In experiment 1, 186 Bos indicus-influenced calves (73 heifers, 113 bulls) were weaned at 211 ± 1 days of age (day 0). At weaning, calves were ranked by sex and BW, and assigned to receive BAS (Nutricorp, Araras, SP, Brazil; n = 94) or water (CON; n = 92). Treatments (5 ml) were topically applied to the nuchal skin area of each animal. Calf BW was recorded and samples of blood and tail-switch hair were collected on days 0, 15 and 45. Calves that received BAS had greater (P < 0.01) BW gain from day 0 to 15 compared with CON. Overall BW gain (days 0 to 45) and BW on days 15 and 45 were also greater (P ≤ 0.03) in BAS v. CON. Plasma haptoglobin concentration was less (P < 0.01) in BAS v. CON on day 15, whereas cortisol concentrations in plasma and tail-switch hair did not differ between treatments (P ≥ 0.13). In experiment 2, 140 B. indicus-influenced bulls (∼27 months of age) from 2 different pasture-based systems (70 bulls/origin) were transported to a commercial feedlot (≤ 200-km transport; day -1). On day 0, bulls were ranked by source and BW, and assigned to receive BAS (n = 70) or CON (n = 70) and the same sampling procedures as in experiment 1. Bulls receiving BAS had greater (P = 0.04) BW gain from day 0 to 15, but less (P < 0.01) BW gain from day 15 to 45 compared to CON. No other treatment effects were detected (P > 0.14). Therefore, BAS administration to beef calves alleviated the haptoglobin response associated with weaning, and improved calf growth during the subsequent 45 days. Administration of BAS to beef bulls at feedlot entry improved BW gain during the initial 15 days, but these benefits were not sustained throughout the 45-day experiment.
In the emotionally intense field of healthcare, the ability to peacefully inhabit one's body, maintain good boundaries, and be fully present during care is essential. This study aimed to validate the recently developed Mindful Self-Care Scale (MSCS) among hospice and healthcare professionals and develop a brief version of the 33-item MSCS.
A sample of hospice and healthcare professionals from all 50 states (n = 858) was used. A confirmatory factor analysis was run using a rigorous methodology for validation and item reduction to develop a brief version of the 33-item MSCS. The brief MSCS (B-MSCS) was developed by identifying items for exclusion through examination of conceptual overlap, descriptive statistics, and sources of improvement in model fit detected using confirmatory factor analysis. Model modifications were done sequentially and with regard to theoretical considerations.
The existing model, the 33-item MSCS with six subscales, had good fit to the data, with all indicators in acceptable ranges (chi-square/df = 3.08, df = 480, p < 0.01; root mean square error of approximation = 0.059; comparative fit index = 0.915; Tucker–Lewis index = 0.907). Nine items were excluded on the basis of very low loadings and conceptual and empirical overlap with other items.
Significance of results
The final 24-item B-MSCS model was consistent with the original conceptual model and had a closer fit to the data (chi-square/df = 1.85, df = 215, p < 0.01; root mean square error of approximation = 0.041; comparative fit index = 0.961; Tucker–Lewis index = 0.955). In addition, the reliability, construct validity, and concurrent validity of the MSCS and B-MSCS were in the acceptable and good ranges, respectively. The means and standard deviations of the MSCS and B-MSCS scores were similar; B-MSCS mean scores closely approximated the MSCS scores. Informal mindful self-care, practiced in the course of everyday life, was undertaken more regularly than formal mind-body practices and was associated with increased wellness and reduced burnout risk.
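For readers unfamiliar with these fit indices, the sketch below computes chi-square/df, RMSEA, CFI, and TLI from their standard formulas. The null-model chi-square and degrees of freedom are not reported in the abstract, so the inputs are illustrative placeholders and the printed values will not reproduce the paper's figures.

```python
# Standard approximate-fit formulas behind the indices reported above.
from math import sqrt

def fit_indices(chi2, df, n, chi2_null, df_null):
    """Return chi-square/df, RMSEA, CFI, and TLI for a fitted CFA model."""
    rmsea = sqrt(max(chi2 - df, 0.0) / (df * (n - 1)))
    cfi = 1.0 - max(chi2 - df, 0.0) / max(chi2_null - df_null, chi2 - df, 1e-12)
    tli = ((chi2_null / df_null) - (chi2 / df)) / ((chi2_null / df_null) - 1.0)
    return {"chi2/df": chi2 / df, "RMSEA": rmsea, "CFI": cfi, "TLI": tli}

# Example with made-up numbers in the same ballpark as a 24-item model on n = 858;
# the null-model values (chi2_null, df_null) are invented for illustration.
print(fit_indices(chi2=400.0, df=215, n=858, chi2_null=5200.0, df_null=253))
```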
Each of the laboratory intercomparisons (from ICS onwards) has included wood samples, many of them dendrochronologically dated. In the early years, as a result of the majority of laboratories being radiometric, these samples were typically blocks of 20–40 rings, but more recently (SIRI), they have been single ring samples. The sample ages have spanned background through to modern. In some intercomparisons, we have examined different wood pretreatment effects, in others the focus has been on background samples. In this paper, we illustrate what we have learned from these extensive intercomparisons involving wood samples and how the results contribute to the global IntCal effort.
Over the past 30 years, the format of the radiocarbon (14C) intercomparison studies has changed; however, the selection of sample types used in these studies has remained constant, namely natural and routinely dated materials that could subsequently be used as in-house reference materials. One such material is peat, which has been used 12 times, starting with the ICS in 1988. Peat from Iceland (TIRI), Ellanmore (TIRI), Letham Moss (ICS, VIRI, and SIRI), and St Bees, UK (FIRI and VIRI) has been used, as well as a near-background peat from Siberia. In the main, these peat samples have been provided as the humic acid fraction, the main advantage being that the humic acid is extracted in solution and then precipitated (the solution phase providing the homogenisation), which is a key requirement for a reference material. In this paper, we revisit the peat results and explore their findings. In addition, for the last 8 years, the Letham Moss sample has been used in the SUERC 14C laboratory as an in-house standard or reference material. This has resulted in several thousand measurements. Such a rich data set is explored to illustrate the benefits arising from the intercomparison program.
Clostridium difficile, the most common cause of hospital-associated diarrhoea in developed countries, presents major public health challenges. The high clinical and economic burden from C. difficile infection (CDI) relates to the high frequency of recurrent infections caused by either the same or different strains of C. difficile. An interval of 8 weeks after index infection is commonly used to classify recurrent CDI episodes. We assessed strains of C. difficile in a sample of patients with recurrent CDI in Western Australia from October 2011 to July 2017. The performance of different intervals between initial and subsequent episodes of CDI was investigated. Of 4612 patients with CDI, 1471 (32%) were identified with recurrence. PCR ribotyping data were available for initial and recurrent episodes for 551 patients. Relapse (recurrence with same ribotype (RT) as index episode) was found in 350 (64%) patients and reinfection (recurrence with new RT) in 201 (36%) patients. Our analysis indicates that 8- and 20-week intervals failed to adequately distinguish reinfection from relapse. In addition, living in a non-metropolitan area modified the effect of age on the risk of relapse. Where molecular epidemiological data are not available, we suggest that applying an 8-week interval to define recurrent CDI requires more consideration.
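The classification rule described above can be expressed in a few lines: a subsequent episode within the chosen interval counts as a recurrence, and the ribotype comparison then labels it a relapse or a reinfection. The sketch below is illustrative; the field names, records, and window parameter are assumptions, not the study's code.

```python
# Sketch of classifying CDI episodes as relapse vs reinfection.
from datetime import date

def classify_recurrence(index_episode, next_episode, window_weeks=8):
    """Classify a subsequent CDI episode relative to the index episode."""
    gap_days = (next_episode["onset"] - index_episode["onset"]).days
    if gap_days > window_weeks * 7:
        return "new infection (outside recurrence window)"
    if next_episode["ribotype"] == index_episode["ribotype"]:
        return "relapse (same ribotype)"
    return "reinfection (different ribotype)"

index_ep = {"onset": date(2015, 3, 1), "ribotype": "RT014"}
second_ep = {"onset": date(2015, 4, 10), "ribotype": "RT002"}
print(classify_recurrence(index_ep, second_ep, window_weeks=8))   # reinfection
print(classify_recurrence(index_ep, second_ep, window_weeks=20))  # same label; the window only affects recurrence status
```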
The best method for quantifying the marine reservoir effect (MRE) using the global IntCal Marine13 calibration curve remains unresolved. Archaeologists frequently quantify uncertainty on MRE values as errors computed from single pairs of marine-terrestrial radiocarbon ages, which we argue significantly overstates their accuracy and precision. Here, we review the assumptions, methods, and applications of estimating MRE via an estimate of the additional regional offset between the marine and terrestrial calibration curves (ΔR) for the Prince Rupert Harbour (PRH) region of British Columbia, Canada. We acknowledge the influence on ΔR of MRE variation as (1) a dynamic oceanographic process, (2) its variable expression in biochemical and geochemical pathways, and (3) compounding errors in sample selection, measurement, and calculation. We examine a large set of marine-terrestrial pairs (n = 63) from PRH to compare a common archaeological practice of estimating uncertainty from means, which generates an uncertainty value of ±49 years, with a revised, more appropriate error estimate of ±230 years. However, we argue that the use of multiple-pair samples estimates the PRH ΔR as 273 ± 38 years for the last 5,000 years. Calculations of error that do not consider these issues may generate inaccurate age estimates with unjustifiable precision.
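To illustrate why the two uncertainty figures differ so sharply, the sketch below contrasts the standard error of the mean of per-pair ΔR estimates with the scatter among those estimates. The offsets are fabricated numbers; a full treatment would also propagate each pair's measurement errors, which this sketch omits.

```python
# Two ways to quote uncertainty on Delta-R from marine-terrestrial pairs:
# the standard error of the mean (which can look deceptively precise) versus
# the scatter among the individual offsets. Offsets below are fabricated.
import numpy as np

# Per-pair Delta-R estimates in 14C years.
pair_offsets = np.array([310, 150, 420, 260, 90, 380, 230, 340, 180, 470])

mean_dr = pair_offsets.mean()
sem = pair_offsets.std(ddof=1) / np.sqrt(len(pair_offsets))  # error of the mean
scatter = pair_offsets.std(ddof=1)                           # spread among pairs

print(f"Delta-R = {mean_dr:.0f} +/- {sem:.0f} yr (error of the mean)")
print(f"Delta-R = {mean_dr:.0f} +/- {scatter:.0f} yr (scatter among pairs)")
```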
This article describes a Clostridium difficile infection (CDI) outbreak in a long-term care (LTC) facility in which molecular typing techniques and whole-genome sequencing identified widespread environmental dissemination of a clonal strain, which was successfully removed after terminal cleaning.
This study was conducted in a long-term care facility in Texas.
A recently hospitalized LTC patient was diagnosed with CDI, followed shortly thereafter by 7 subsequent CDI cases. A stool specimen was obtained from each patient for culturing and typing. An environmental point-prevalence study was conducted before and after terminal cleaning of the facility to assess environmental contamination. Cultured isolates were typed using ribotyping, multilocus variant analysis, and whole-genome sequencing.
Stool samples were available for 5 of 8 patients; of these specimens, 4 grew toxigenic C. difficile ribotype 027. Of 50 environmental swab samples collected throughout the facility prior to the facility-wide terminal cleaning, 19 (38%) grew toxigenic C. difficile (most commonly ribotype 027, 79%). The terminal cleaning was effective at reducing C. difficile spores in the environment and at eradicating the ribotype 027 strain (P<.001). Multilocus variant analysis and whole-genome sequencing showed that clinical and environmental strains were highly related and, in some cases, identical.
Using molecular typing techniques, we demonstrated reduced environmental contamination with toxigenic C. difficile and the eradication of a ribotype 027 clone. These techniques may help direct infection control efforts and decrease the burden of CDI in the healthcare system.
Febrile seizure (FS) in children is a common complication of infections with respiratory viruses and hand, foot and mouth disease (HFMD). We conducted a retrospective ecological time-series analysis to determine the temporal relationship between hospital attendances for FS and HFMD or respiratory virus infections. Epilepsy attendance was used as a control. Data from 2004 to 2012 on FS and epilepsy hospital attendances, HFMD notifications to the Ministry of Health, and laboratory-confirmed viral respiratory infections among KK Women's and Children's Hospital inpatients were used. A multivariate linear regression analysis was conducted to evaluate the relationship between FS and the virus time series. Relative risks of FS by age were calculated using Bayesian statistical methods. Paediatric accident and emergency (A&E) attendances for FS were found to be associated with influenza A (an extra 0.47 FS per influenza A case), influenza B (an extra 0.32 per influenza B case) and parainfluenza 3 (an extra 0.35 per parainfluenza type 3 case). However, other viruses were not significantly associated with FS. None of the viruses were associated with epileptic seizure attendance. Influenza A, B and parainfluenza 3 viruses contributed to the burden of FS resulting in A&E attendance. Children at risk of FS should be advised to receive seasonal influenza vaccination.
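A minimal sketch of the kind of regression described above: weekly FS attendances regressed on weekly counts of each virus, with each fitted coefficient read as the extra FS attendances per reported case. The series are simulated (with the abstract's coefficients baked in as the true values), and statsmodels OLS stands in for whatever modelling framework the study actually used.

```python
# Sketch of regressing weekly febrile-seizure (FS) attendances on virus counts.
# Data are simulated, not the study's series.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(0)
weeks = 200
X = np.column_stack([
    rng.poisson(30, weeks),   # influenza A cases per week
    rng.poisson(15, weeks),   # influenza B cases per week
    rng.poisson(10, weeks),   # parainfluenza 3 cases per week
])
fs = 20 + 0.47 * X[:, 0] + 0.32 * X[:, 1] + 0.35 * X[:, 2] + rng.normal(0, 3, weeks)

model = sm.OLS(fs, sm.add_constant(X)).fit()
print(model.params)   # intercept plus per-virus coefficients (extra FS per case)
```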
OBJECTIVES/SPECIFIC AIMS: Central neuropathic pain is a severely disabling consequence of conditions that cause tissue damage in the central nervous system (CNS), such as multiple sclerosis (MS) and neuromyelitis optica (NMO). It impacts mood, mobility and quality of life, but is often refractory to common treatments. Scrambler Therapy is an emerging non-invasive pain-modifying technique that utilizes transcutaneous electrical stimulation of nociceptive fibers with the intent of re-organizing maladaptive signaling pathways. It has been examined for treatment of peripheral neuropathy with favorable safety and efficacy outcomes, but its use in central neuropathic pain has not been reported. We aim to explore the acceptability and safety of Scrambler Therapy through a Phase II sham-controlled trial in NMO, and to describe its use to date in central neuropathic pain. METHODS/STUDY POPULATION: Two patients with longstanding central neuropathic pain who had failed multiple drug trials were treated as proof-of-concept, supporting the recent launch of a Phase II randomized controlled trial in NMO in which patients receive 10 daily Scrambler treatments versus sham. Safety and acceptability from those recruited to date will be reported. Acceptability is measured by adherence and responses to patient surveys. RESULTS/ANTICIPATED RESULTS: We plan to recruit 22 patients, randomized 1:1 into experimental and sham arms. We will present acceptability and safety data for Scrambler use in patients with NMO who have been recruited by the time of this conference, as well as effectiveness data from two cases that have been completed outside of the trial. One case involved a 65-year-old woman with a 4-year history of central neuropathic pain following a C3-C5 transverse myelitis (TM). Her numerical rating scale (NRS) pain score was reduced to 0/10 from a baseline score of 5/10. The second case involved a 52-year-old woman with a 13-year history of pain following a medullary cavernoma bleed. Her baseline NRS pain score was 9/10, which was reduced to 0.5/10 post-treatment. No adverse events were reported. Pain relief was sustained at 30 days post-treatment. DISCUSSION/SIGNIFICANCE OF IMPACT: We are investigating the acceptability and efficacy of Scrambler Therapy for central neuropathic pain treatment in NMO. Proof-of-concept was supported by two patients whose pain scores improved considerably more in response to this treatment than with previous pharmacologic and non-pharmacologic interventions. Results from this trial may support future investigation in other disorders that cause damage in the CNS, including MS and TM.
Vaccination is increasingly being recognised as a potential tool to supplement ‘stamping out’ for controlling foot-and-mouth disease (FMD) outbreaks in non-endemic countries. Infectious disease simulation models provide the opportunity to determine how vaccination might be used in the face of an FMD outbreak. Previously, consistent relative benefits of specific vaccination strategies across different FMD simulation modelling platforms have been demonstrated, using a UK FMD outbreak scenario. We extended this work to assess the relative effectiveness of selected vaccination strategies in five countries: Australia, New Zealand, the USA, the UK and Canada. A comparable, but not identical, FMD outbreak scenario was developed for each country with initial seeding of Pan Asia type O FMD virus into an area with a relatively high density of livestock farms. A series of vaccination strategies (in addition to stamping out (SO)) was selected to evaluate key areas of interest from a disease response perspective, including timing of vaccination, species considerations (e.g. vaccination of only those farms with cattle), risk-area vaccination and resources available for vaccination. The study found that vaccination used with SO was effective in reducing epidemic size and duration in a severe outbreak situation. Early vaccination and unconstrained resources for vaccination consistently outperformed other strategies. Vaccination of only those farms with cattle produced comparable results, with some countries demonstrating that this could be as effective as all-species vaccination. Restriction of vaccination to higher-risk areas was less effective than other strategies. This study demonstrates consistency in the relative effectiveness of selected vaccination strategies under different outbreak start-up conditions, conditional on the assumption that each of the simulation models provides a realistic estimation of FMD virus spread. Preferred outbreak management approaches must, however, balance the principles identified in this study, working to clearly defined outbreak management objectives, while having a good understanding of logistic requirements and the socio-economic implications of different control measures.
With increased regulations regarding the use of feed-grade antimicrobials in livestock systems, alternative strategies to enhance growth and immunity of feedlot cattle are warranted. Hence, this experiment compared performance, health and physiological responses of cattle supplemented with feed-grade antibiotics or alternative feed ingredients during the initial 60 days in the feedlot. Angus × Hereford calves (63 steers+42 heifers) originating from two cow–calf ranches were weaned on day −3, obtained from an auction yard on day −2 and road-transported (800 km; 12 h) to the feedlot. Upon arrival on day −1, shrunk BW was recorded. On day 0, calves were ranked by sex, source and shrunk BW, and allocated to one of 21 pens. Pens were assigned to receive (7 pens/treatment) a free-choice total mixed ration containing: (1) lasalocid (360 mg/calf daily of Bovatec; Zoetis, Florham Park, NJ, USA)+chlortetracycline (350 mg/calf of Aureomycin at cycles of 5-day inclusion and 2-day removal from diet; Zoetis) from days 0 to 32, and monensin only (360 mg/calf daily of Rumensin; Elanco Animal Health, Greenfield, IN, USA) from days 33 to 60 (PC), (2) sodium saccharin-based sweetener (Sucram at 0.04 g/kg of diet dry matter; Pancosma SA; Geneva, Switzerland)+plant extracts containing eugenol, cinnamaldehyde and capsicum (800 mg/calf daily of XTRACT Ruminants 7065; Pancosma SA) from days 0 to 32 and XTRACT only (800 mg/calf daily) from days 33 to 60 (EG) or (3) no supplemental ingredients (CON; days 0 to 60). Calves were assessed for bovine respiratory disease (BRD) signs, and dry matter intake was recorded from each pen daily. Calves were vaccinated against BRD pathogens on days 0 and 22. Shrunk BW was recorded on day 61, and blood samples were collected on days 0, 6, 11, 22, 33, 43 and 60. Calf ADG was greater (P=0.04) in PC v. EG and tended (P=0.09) to be greater in PC v. CON. Feed efficiency also tended (P=0.09) to be greater in PC v. CON, although the main treatment effect for this response was not significant (P=0.23). Mean serum titers against bovine respiratory syncytial virus were greater in EG v. PC (P=0.04) and CON (tendency; P=0.08). Collectively, the inclusion of alternative feed ingredients prevented the decrease in feed efficiency when chlortetracycline and ionophores were not added to the initial feedlot diet, and improved antibody response to vaccination against the bovine respiratory syncytial virus in newly weaned cattle.
This experiment evaluated the impacts of supplementing a yeast-derived product (Celmanax; Church & Dwight Co., Inc., Princeton, NJ, USA) on productive and health responses of beef steers, and was divided into a preconditioning (days 4 to 30) and feedlot receiving phase (days 31 to 69). In all, 84 Angus × Hereford steers were weaned on day 0 (BW=245±2 kg; age=186±2 days), and maintained in a single group from days 0 to 3. On day 4, steers were allocated according to weaning BW and age to a 21-pen drylot (4 steers/pen). Pens were randomly assigned to (n=7 pens/treatment): (1) no Celmanax supplementation during the study, (2) Celmanax supplementation (14 g/steer daily; as-fed) from days 14 to 69 or (3) Celmanax supplementation (14 g/steer daily; as-fed) from days 31 to 69. Steers had free-choice access to grass-alfalfa hay, and were also offered a corn-based concentrate beginning on day 14. Celmanax was mixed daily with the concentrate. On day 30, steers were road-transported for 1500 km (24 h). On day 31, steers returned to their original pens for the 38-day feedlot receiving. Shrunk BW was recorded on days 4, 31 and 70. Feed intake was evaluated daily (days 14 to 69). Steers were observed daily (days 4 to 69) for bovine respiratory disease (BRD) signs. Blood samples were collected on days 14, 30, 31, 33, 35, 40, 45, 54 and 69, and analyzed for plasma cortisol, haptoglobin, IGF-I, and serum fatty acids. Preconditioning results were analyzed by comparing pens that received (CELM) or not (CONPC) Celmanax during the preconditioning phase. Feedlot receiving results were analyzed by comparing pens that received Celmanax from days 14 to 69 (CELPREC), days 31 to 69 (CELRECV) or no Celmanax supplementation (CON). During preconditioning, BRD incidence was less (P=0.03) in CELM v. CONPC. During feedlot receiving, average daily gain (ADG) (P=0.07) and feed efficiency (P=0.08) tended to be greater in CELPREC and CELRECV v. CON, whereas dry matter intake was similar (P⩾0.29) among treatments. No other treatment effects were detected (P⩾0.20). Collectively, Celmanax supplementation reduced BRD incidence during the 30-day preconditioning. Moreover, supplementing Celmanax tended to improve ADG and feed efficiency during the 38-day feedlot receiving, independently of whether supplementation began during preconditioning or after feedlot entry. These results suggest that Celmanax supplementation benefits preconditioning health and feedlot receiving performance in beef cattle.
The discovery of the first electromagnetic counterpart to a gravitational wave signal has generated follow-up observations by over 50 facilities world-wide, ushering in the new era of multi-messenger astronomy. In this paper, we present follow-up observations of the gravitational wave event GW170817 and its electromagnetic counterpart SSS17a/DLT17ck (IAU label AT2017gfo) by 14 Australian telescopes and partner observatories as part of Australian-based and Australian-led research programs. We report early- to late-time multi-wavelength observations, including optical imaging and spectroscopy, mid-infrared imaging, radio imaging, and searches for fast radio bursts. Our optical spectra reveal that the transient source emission cooled from approximately 6 400 K to 2 100 K over a 7-d period and produced no significant optical emission lines. The spectral profiles, cooling rate, and photometric light curves are consistent with the expected outburst and subsequent processes of a binary neutron star merger. Star formation in the host galaxy probably ceased at least a Gyr ago, although there is evidence for a galaxy merger. Binary pulsars with short (100 Myr) decay times are therefore unlikely progenitors, but pulsars like PSR B1534+12 with its 2.7 Gyr coalescence time could produce such a merger. The displacement (~2.2 kpc) of the binary star system from the centre of the main galaxy is not unusual for stars in the host galaxy or stars originating in the merging galaxy, and therefore any constraints on the kick velocity imparted to the progenitor are poor.
We evaluated the extent to which providing training and technical assistance to early childcare centre (ECC) directors, faculty and staff in the implementation of evidence-based nutrition strategies improved the nutrition contexts, policies and practices of ECC serving racially and ethnically diverse, low-income children in Broward County, Florida, USA. The nutrition strategies targeted snack and beverage policies and practices, consistent with Caring for Our Children National Standards.
We used the nutrition observation and document review portions of the Environment and Policy Assessment and Observation (EPAO) instrument to observe ECC as part of a one-group pre-test/post-test evaluation design.
ECC located within areas of Broward County, Florida, USA, with high rates of poverty, diabetes and minority representation and a high unhealthy food index.
Eighteen ECC enrolled, with a mean of 112·9 (sd 53·4) children aged 2–5 years, 12·3 (sd 7·2) staff members and 10·2 (sd 4·6) children per staff member at each centre.
We found significant improvements in centres’ overall nutrition contexts, as measured by total EPAO nutrition scores (P=0·01). ECC made specific significant gains within written nutrition policies (P=0·03) and nutrition training and education (P=0·01).
Our findings support training ECC directors, faculty and staff in evidence-based nutrition strategies to improve the nutrition policies and practices of ECC serving racially and ethnically diverse children from low-income families. The intervention resulted in improvements in some nutrition policies and practices, but not others. There remains a need to further develop the evidence base concerning the effectiveness of policy and practice interventions within ECC serving children in high-need areas.
Collagen associated with bone samples is frequently used for radiocarbon (14C) dating of bones recovered from archaeological sites. However, submersion and exposure to moisture favor the degradation of collagen, which leads to difficulty in reliably dating bones from tropical, humid, or previously submerged archaeological sites. In this paper, we characterized the preservation state of a series of bones through parameters such as %C, %N, C/N ratio, and collagen recovery. We performed 14C analyses of three collagen fractions obtained through the pretreatment steps (total, ultrafiltered, and insoluble collagen) in order to link the preservation state and the reproducibility of 14C values obtained from the three fractions. Collagen ultrafiltration resulted in a decrease in the C/N ratio, although collagen yield was reduced. When two or three collagen fractions were obtained, ages were reproducible and consistent with expected values, according to archaeological or hydrogeological criteria. The pretreatment steps were monitored by infrared spectroscopy in order to analyze the collagen fractions at the molecular level. The presence of collagen in the total and insoluble fractions was confirmed. Since many of the Mexican samples had poor ultrafiltered collagen yield (<3%) or nonexistent yield, our results show that if additional contextual information is carefully considered, the remnant collagen in the total and insoluble fractions can be dated, especially from sites where no other datable fraction exists.
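As a small illustration of the quality parameters mentioned above, the sketch below converts weight-percent C and N into an atomic C/N ratio and flags samples by collagen yield. The acceptance thresholds (a C/N range of 2.9–3.6, and a 3% yield cut-off echoing the ultrafiltered-yield figure in the text) are commonly cited screening values used here for illustration, not the paper's criteria.

```python
# Sketch of screening bone-collagen quality from %C, %N, and collagen yield.
# Thresholds are illustrative, not the paper's acceptance rules.

def atomic_cn_ratio(pct_c, pct_n):
    """Convert weight-percent C and N into an atomic C/N ratio."""
    return (pct_c / 12.011) / (pct_n / 14.007)

def screen_collagen(pct_c, pct_n, yield_pct):
    cn = atomic_cn_ratio(pct_c, pct_n)
    return {
        "C/N": round(cn, 2),
        "C/N in 2.9-3.6 range": 2.9 <= cn <= 3.6,
        "yield >= 3%": yield_pct >= 3.0,
    }

# Example: acceptable C/N but poor collagen yield.
print(screen_collagen(pct_c=40.2, pct_n=14.5, yield_pct=1.8))
```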