Interfacility patient movement plays an important role in the dissemination of antimicrobial-resistant organisms throughout healthcare systems. We evaluated how 3 alternative measures of interfacility patient sharing were associated with C. difficile infection incidence in Ontario acute-care facilities.
The cohort included adult acute-care facility stays of ≥3 days between April 2003 and March 2016. We measured 3 facility-level metrics of patient sharing: general patient importation, incidence-weighted patient importation, and C. difficile case importation. Each of the 3 patient-sharing metrics was examined against the incidence of C. difficile infection in the facility per 1,000 stays, using Poisson regression models.
The analyzed cohort included 6.70 million stays at risk of C. difficile infection across 120 facilities. Over the 13-year period, we included 62,189 new cases of healthcare-associated C. difficile infection (CDI; incidence, 9.3 per 1,000 stays). After adjustment for facility characteristics, general importation was not strongly associated with C. difficile infection incidence (risk ratio [RR] per doubling, 1.10; 95% confidence interval [CI], 0.97–1.24; proportional change in variance [PCV], −2.0%). Incidence-weighted importation (RR per doubling, 1.18; 95% CI, 1.06–1.30; PCV, −8.4%) and C. difficile case importation (RR per doubling, 1.43; 95% CI, 1.29–1.58; PCV, −30.1%) were strongly associated with C. difficile infection incidence.
In this 13-year study of acute-care facilities in Ontario, interfacility variation in C. difficile infection incidence was associated with importation of patients from other high-incidence acute-care facilities or specifically of patients with a recent history of C. difficile infection. Regional infection control strategies should consider the potential impact of importation of patients at high risk of C. difficile shedding from outside facilities.
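The facility-level analysis described above — Poisson regression of CDI counts on a log-transformed importation metric, with stays at risk as the exposure offset — can be sketched on synthetic data. Everything below (rates, effect sizes, facility counts) is illustrative rather than the study's data, and the small IRLS fitter is a minimal stand-in for a statistics package:

```python
import numpy as np

def poisson_irls(X, y, offset, n_iter=50):
    """Fit log E[y] = X @ beta + offset by iteratively reweighted least
    squares (Newton's method for the Poisson log-likelihood)."""
    beta = np.zeros(X.shape[1])
    for _ in range(n_iter):
        eta = X @ beta + offset
        mu = np.exp(eta)
        z = (eta - offset) + (y - mu) / mu   # working response
        W = mu                               # working weights
        beta = np.linalg.solve(X.T @ (W[:, None] * X), X.T @ (W * z))
    return beta

# Illustrative data: 120 facilities, a baseline rate of 9 CDI cases per
# 1,000 stays, and a true RR of 1.4 per doubling of the importation metric.
rng = np.random.default_rng(0)
n = 120
log2_import = rng.normal(5.0, 1.0, n)            # log2 importation metric
stays = rng.integers(20_000, 80_000, n)          # stays at risk per facility
mu_true = 0.009 * 1.4 ** (log2_import - 5.0) * stays
cases = rng.poisson(mu_true).astype(float)

X = np.column_stack([np.ones(n), log2_import - 5.0])
beta = poisson_irls(X, cases, np.log(stays))
rr_per_doubling = np.exp(beta[1])                # recovers ~1.4
```

Centering the log2 metric makes the intercept interpretable as the log baseline rate, and exponentiating the slope gives the risk ratio per doubling, matching how the abstract reports its effects.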
Nudging in microbiology is an antimicrobial stewardship strategy to influence decision making through the strategic reporting of microbiology results while preserving prescriber autonomy. The purpose of this scoping review was to identify the evidence that demonstrates the effectiveness of nudging strategies in susceptibility result reporting to improve antimicrobial use.
A search for studies in Ovid MEDLINE, Embase, PsycINFO, and All EBM Reviews was conducted. Simulated and vignette studies were excluded. Two reviewers independently performed screening and data extraction.
Of a total of 1,346 citations screened, 15 relevant studies were identified. Study types included pre- and postintervention (n = 10), retrospective cohort (n = 4), and a randomized controlled trial (n = 1). Most studies were performed in acute-care settings (n = 13), and the remainder were in primary care (n = 2). Most studies used a strategy to alter the default antibiotic choices on the antibiotic report. All studies reported at least 1 outcome of antimicrobial use: utilization (n = 9), appropriateness (n = 7), de-escalation (n = 2), and cost (n = 1). Moreover, 12 studies reported an overall benefit in antimicrobial use outcomes associated with nudging, and 4 studies evaluated the association of nudging strategy with subsequent antimicrobial resistance, with 2 studies noting overall improvement.
The number of heterogeneous studies evaluating the impact of applying nudging strategies to susceptibility result reports is small; however, most strategies show promise in altering prescribers’ antibiotic selection. Selective and cascade reporting of targeted agents in hospital settings accounts for the majority of current research. Gaps and opportunities for future research identified by our scoping review include performing prospective randomized controlled trials and evaluating approaches other than selective reporting.
Antimicrobial stewardship program (ASP) interventions, such as prospective audit and feedback (PAF), have been shown to reduce antimicrobial use and improve patient outcomes. However, the optimal approach to PAF is unknown.
We examined the impact of high-intensity, interdisciplinary rounds-based PAF compared to low-intensity PAF on antimicrobial use on internal medicine wards in a 400-bed community hospital.
Prior to the intervention, ASP pharmacists performed low-intensity PAF with a focus on targeted antibiotics. Recommendations were made directly to the internist for each patient. High-intensity, rounds-based PAF was then introduced sequentially to 5 internal medicine wards. This PAF format included twice-weekly interdisciplinary rounds, with a review of all internal medicine patients receiving any antimicrobial agent. Antibiotic use and clinical outcomes were measured before and after the transition to high-intensity PAF. An interrupted time-series analysis was performed adjusting for seasonal and secular trends.
With the transition from low-intensity to high-intensity PAF, overall usage fell from 483 defined daily doses (DDD) per 1,000 patient days (PD) during the low-intensity phase to 442 DDD/1,000 PD in the high-intensity phase (difference, −42; 95% confidence interval [CI], −74 to −9). The reduction in usage was more pronounced in the adjusted analysis, in the latter half of the high-intensity period, and for targeted agents. There were no differences in clinical outcomes in the adjusted analysis.
High-intensity PAF was associated with a reduction in antibiotic use compared to a low-intensity approach, without any adverse impact on patient outcomes. A decision to implement a high-intensity PAF approach should be weighed against the increased workload required.
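An interrupted time-series analysis of this kind is, at its core, a segmented regression with level-change and slope-change terms at the transition point. A minimal sketch on hypothetical monthly usage data (the numbers below are invented, loosely echoing the reported magnitudes, and seasonal adjustment is omitted for brevity):

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical monthly usage: 24 months of low-intensity PAF at ~483
# DDD/1,000 PD, then 24 months of high-intensity PAF ~41 units lower.
t = np.arange(48, dtype=float)
post = (t >= 24).astype(float)
y = 483 - 41 * post + rng.normal(0, 8, t.size)

# Segmented regression: y = b0 + b1*t + b2*post + b3*(t - 24)*post,
# where b2 is the immediate level change and b3 the slope change.
X = np.column_stack([np.ones_like(t), t, post, (t - 24) * post])
beta, *_ = np.linalg.lstsq(X, y, rcond=None)
level_change = beta[2]                           # ~ -41 DDD/1,000 PD
```

A full analysis would add seasonal terms (e.g., month-of-year indicators or harmonics) and account for autocorrelation in the residuals, as the abstract's adjusted model does.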
This study investigated the characteristics of subjective memory complaints (SMCs) and their association with current and future cognitive functions.
A cohort of 209 community-dwelling individuals without dementia aged 47–90 years old was recruited for this 3-year study. Participants underwent neuropsychological and clinical assessments annually. Participants were divided into SMCs and non-memory complainers (NMCs) using a single question at baseline and a memory complaints questionnaire following baseline, to evaluate differential patterns of complaints. In addition, comprehensive assessment of memory complaints was undertaken to evaluate whether severity and consistency of complaints differentially predicted cognitive function.
SMC and NMC individuals were significantly different on various features of SMCs. Greater overall severity (but not consistency) of complaints was significantly associated with current and future cognitive functioning.
SMC individuals present distinctive features of memory complaints as compared to NMCs. Further, the severity of complaints was a significant predictor of future cognition. However, SMC did not significantly predict change over time in this sample. These findings warrant further research into the specific features of SMCs that may portend subsequent neuropathological and cognitive changes when screening individuals at increased future risk of dementia.
Clostridium difficile spores play an important role in transmission and can survive in the environment for several months. Optimal methods for measuring environmental C. difficile are unknown. We sought to determine whether increased sample surface area improved detection of C. difficile from environmental samples.
Samples were collected from 12 patient rooms in a tertiary-care hospital in Toronto, Canada.
Samples represented small surface-area and large surface-area floor and bedrail pairs from single-bed rooms of patients with low (without prior antibiotics), medium (with prior antibiotics), and high (C. difficile infected) shedding risk. Presence of C. difficile in samples was measured using quantitative polymerase chain reaction (qPCR) with targets on the 16S rRNA and toxin B genes and using enrichment culture.
Of the 48 samples, 64.6% were positive by 16S qPCR (geometric mean, 13.8 spores); 39.6% were positive by toxin B qPCR (geometric mean, 1.9 spores); and 43.8% were positive by enrichment culture. By 16S qPCR, each 10-fold increase in sample surface area yielded 6.6 times (95% CI, 3.2–13) more spores. Floor surfaces yielded 27 times (95% CI, 4.9–181) more spores than bedrails, and rooms of C. difficile–positive patients yielded 11 times (95% CI, 0.55–164) more spores than those of patients without prior antibiotics. Toxin B qPCR and enrichment culture returned analogous findings.
Clostridium difficile spores were identified in most floor and bedrail samples, and increased surface area improved detection. Future research aiming to understand the role of environmental C. difficile in transmission should prefer samples with large surface areas.
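The reported dose-response (6.6 times more spores per 10-fold increase in surface area) corresponds to a sublinear power law on the log-log scale, which makes it easy to project the expected yield for other area ratios. A small illustration (the projections are implied by the reported slope, not additional study data):

```python
import math

# Reported effect: each 10-fold increase in sampled surface area yields
# 6.6x more detected spores, i.e. a power law with exponent log10(6.6).
slope = math.log10(6.6)                 # ~0.82: recovery is sublinear in area

def yield_ratio(area_ratio):
    """Expected fold change in spore yield for a given ratio of sampled
    surface areas, under the fitted log-log relationship."""
    return area_ratio ** slope

double_area = yield_ratio(2.0)          # ~1.77x the spores for 2x the area
hundredfold = yield_ratio(100.0)        # 6.6**2 = 43.56x for 100x the area
```

Because the exponent is below 1, spore yield grows more slowly than area, but larger samples still detect substantially more, consistent with the study's recommendation.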
Antibiotic use varies widely between hospitals, but the influence of antimicrobial stewardship programs (ASPs) on this variability is not known. We aimed to determine the key structural and strategic aspects of ASPs associated with differences in risk-adjusted antibiotic utilization across facilities.
Observational study of acute-care hospitals in Ontario, Canada.
A survey was sent to hospitals asking about both structural (8 elements) and strategic (32 elements) components of their ASP. Antibiotic use from hospital purchasing data was acquired for January 1 to December 31, 2014. Crude and adjusted defined daily doses per 1,000 patient days, accounting for hospital and aggregate patient characteristics, were calculated across facilities. Rate ratios (RR) of defined daily doses per 1,000 patient days were compared for hospitals with and without each antimicrobial stewardship element of interest.
Of 127 eligible hospitals, 73 (57%) participated in the study. There was a 7-fold range in antibiotic use across these facilities (minimum, 253 defined daily doses per 1,000 patient days; maximum, 1,872 defined daily doses per 1,000 patient days). The presence of designated funding or resources for the ASP (adjusted RR, 0.87; 95% CI, 0.75–0.99), prospective audit and feedback (adjusted RR, 0.80; 95% CI, 0.67–0.96), and intravenous-to-oral conversion policies (adjusted RR, 0.79; 95% CI, 0.64–0.99) were associated with lower risk-adjusted antibiotic use.
Wide variability in antibiotic use across hospitals may be partially explained by both structural and strategic ASP elements. The presence of funding and resources, prospective audit and feedback, and intravenous-to-oral conversion should be considered priority elements of a robust ASP.
To study the antibody response to tetanus toxoid and measles by age following vaccination in children aged 4 months to 6 years in Entebbe, Uganda. Serum samples were obtained from 113 children aged 4–15 months at the Mother-Child Health Clinic (MCHC), Entebbe Hospital, and from 203 of the 206 children aged between 12 and 75 months recruited through the Outpatients Department (OPD). Antibodies to measles were quantified by plaque reduction neutralisation test (PRNT) and with a Siemens IgG EIA. A VaccZyme IgG EIA was used to quantify anti-tetanus antibodies. Sera from 96 of 113 (85.0%) children attending the MCHC contained measles PRNT titres below the protective level (120 mIU/ml). Sera from 24 of 203 (11.8%) children attending the OPD contained PRNT titres <120 mIU/ml. There was no detectable decline in anti-measles antibody concentrations between 1 and 6 years. The anti-tetanus antibody titres in all 113 children attending the MCHC and in 189 of 203 (93.1%) children attending the OPD were >0.15 IU/ml by EIA, a level considered protective. The overall concentration of anti-tetanus antibody was sixfold higher in children under 12 months than in the older children, with geometric mean concentrations of 3.15 IU/ml and 0.49 IU/ml, respectively. For each doubling in age between 4 and 64 months, the anti-tetanus antibody concentration declined by 50%. As the time since administration of the third DTP vaccination doubled, the anti-tetanus antibody concentration declined by 39%. The low measles antibody prevalence in the children presenting at the MCHC is consistent with current measles epidemiology in Uganda, where a significant number of measles cases occur in children under 1 year of age, and earlier vaccination may be indicated. The consistent fall in anti-tetanus antibody titre over time following vaccination supports the need for a further vaccine booster at age 4–5 years, as recommended by the WHO.
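The reported declines are log-linear: a 50% fall per doubling of age corresponds to a log2-log2 slope of −1, and a 39% fall per doubling of time since the third dose to a slope of log2(0.61) ≈ −0.71. A small sketch of the implied power-law decay (the computed values are projections from the reported slopes, not study data):

```python
import math

# Reported declines: 50% per doubling of age, and 39% per doubling of the
# time since the third DTP dose. On a log2-log2 scale these are slopes of
# log2(0.50) = -1 and log2(0.61) ~ -0.71, i.e. power-law decay.
def titre_ratio(x_ratio, decline_per_doubling):
    """Multiplicative change in antibody titre when age (or time since
    dose) changes by the given ratio."""
    return x_ratio ** math.log2(1.0 - decline_per_doubling)

age_fall = titre_ratio(64 / 4, 0.50)    # 16-fold age span -> 0.5**4 = 1/16
time_fall = titre_ratio(8.0, 0.39)      # 8x the elapsed time -> 0.61**3
```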
A major cause of death and long-term disability after head injury and related pathologic conditions is elevation of intracranial pressure (ICP) due to vascular compromise and secondary edema. ICP measurements before and after injury in a completely closed-head environment have significant research value, particularly in the acute postinjury period. With current technology, a tethered fiberoptic probe penetrates the brain and can therefore remain implanted only for relatively short periods. Use of the probe can also cause complications such as infection and hemorrhage, and it precludes both immediate (at the time of injury) and long-term measurement of ICP. A small, fully embedded, wireless ICP device may simplify clinical management and research protocols by offering a means for semi-invasive, long-term ICP measurement following brain injury. In this chapter, a new digital wireless ICP (DICP) device is described. The dynamic ICP measurement performance of both the analog ICP (AICP) devices (described in Chapter 2) and the DICP devices is evaluated in a swine model of traumatic brain injury (TBI) with closed-head rotational injury.
In Chapter 2, a prototype of an AICP device operating in the industrial-scientific-medical (ISM) band at 2.4 GHz was described that successfully simplified the surgical procedure by reducing the infection rate, the risk of hemorrhage, and the degree of tissue injury.
The AICP device was implanted in a canine model only for a static test, in which hypo- and hyperventilation were used to induce variations in ICP. Dynamic ICP variations resulting from TBI in a completely closed-head environment are of paramount importance for understanding the development of prolonged postconcussion syndrome and for instituting the correct treatment at different stages, particularly in the acute postinjury period. Currently, in experimental (animal) models of TBI, a tethered fiberoptic probe (if inserted before the injury) must be removed before the injury is induced in order to avoid significant focal damage at the insertion point. Moreover, reinsertion of the probe is possible only after the animal's vital signs have stabilized; yet the act of breaching the cranium after the injury affects the fidelity of the ICP measurements. In addition, proposed noninvasive ICP (NICP) solutions, such as the pulsatility index method based on transcranial Doppler, have been argued by Figaji et al. to be insufficient for accurate ICP estimation.
Alexandre Fréchette, Department of Computer Science, University of British Columbia
Neil Newman, Department of Computer Science, University of British Columbia
Kevin Leyton-Brown, Department of Computer Science, University of British Columbia
Over 13 months in 2016–17, the US government held an innovative “incentive auction” for radio spectrum, in which television broadcasters were paid to relinquish broadcast rights via a “reverse auction”, remaining broadcasters were repacked into a narrower band of spectrum, and the cleared spectrum was sold to telecommunications companies. The stakes were enormous: the auction was forecast to net the government tens of billions of dollars, as well as creating massive economic value by reallocating spectrum to more socially beneficial uses (Congressional Budget Office 2015). As a result of both its economic importance and its conceptual novelty, the auction has been the subject of considerable recent study by the research community, mostly focusing on elements of the auction design (Bazelon, Jackson, and McHenry 2011; Kwerel, LaFontaine, and Schwartz 2012; Milgrom et al. 2012; Calamari et al. 2012; Marcus 2013; Milgrom and Segal 2014; Dütting, Gkatzelis, and Roughgarden 2014; Vohra 2014; Nguyen and Sandholm 2014; Kazumori 2014). After considerable study and discussion, the FCC has selected an auction design based on a descending clock (FCC 2014c; 2014a). Such an auction offers each participating station a price for relinquishing its broadcast rights, with this price offer falling for a given station as long as it remains repackable. A consequence of this design is that the auction must (sequentially!) solve hundreds of thousands of such repacking problems. This is challenging, because the repacking problem is NP-complete. It also makes the performance of the repacking algorithm extremely important, as every failure to solve a single, feasible repacking problem corresponds to a lost opportunity to lower a price offer. Given the scale of the auction, individual unsolved problems can cost the government millions of dollars each.
This chapter shows how the station repacking problem can be solved exactly and reliably at the national scale. It describes the results of an extensive, multi-year investigation into the problem, which culminated in a solver that we call SATFC.
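At its core, each repacking feasibility check asks whether every remaining station can be assigned a channel from its allowed set so that no interfering pair collides — a list-coloring problem. The sketch below is a toy backtracking checker on an invented instance, not SATFC itself (which encodes the full problem, including adjacent-channel interference constraints, as SAT and applies a portfolio of solvers):

```python
def repackable(domains, interference):
    """Decide whether every station can get a channel from its domain with
    no interfering pair sharing a channel (a list-coloring feasibility
    check; the real problem adds adjacent-channel constraints and is
    NP-complete). Returns an assignment dict, or None if infeasible."""
    stations = sorted(domains, key=lambda s: len(domains[s]))  # most constrained first
    assignment = {}

    def extend(i):
        if i == len(stations):
            return True
        s = stations[i]
        for ch in sorted(domains[s]):
            if all(assignment.get(t) != ch for t in interference.get(s, ())):
                assignment[s] = ch
                if extend(i + 1):
                    return True
                del assignment[s]
        return False

    return dict(assignment) if extend(0) else None

# Toy instance: three mutually interfering stations and two shared channels
# is infeasible; widening one station's domain restores feasibility.
domains = {"A": {14, 15}, "B": {14, 15}, "C": {14, 15}}
interference = {"A": {"B", "C"}, "B": {"A", "C"}, "C": {"A", "B"}}
infeasible = repackable(domains, interference)   # None
domains["C"] = {14, 15, 16}
feasible = repackable(domains, interference)     # {"A": 14, "B": 15, "C": 16}
```

In the descending-clock auction, a check like this runs every time a price offer would drop: if the station can still be repacked alongside the exited stations, the offer falls; if the solver cannot prove feasibility in time, that price reduction is forgone.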
The Taipan galaxy survey (hereafter simply ‘Taipan’) is a multi-object spectroscopic survey starting in 2017 that will cover 2π steradians over the southern sky (δ ≲ 10°, |b| ≳ 10°), and obtain optical spectra for about two million galaxies out to z < 0.4. Taipan will use the newly refurbished 1.2-m UK Schmidt Telescope at Siding Spring Observatory with the new TAIPAN instrument, which includes an innovative ‘Starbugs’ positioning system capable of rapidly and simultaneously deploying up to 150 spectroscopic fibres (and up to 300 with a proposed upgrade) over the 6° diameter focal plane, and a purpose-built spectrograph operating in the range from 370 to 870 nm with resolving power R ≳ 2000. The main scientific goals of Taipan are (i) to measure the distance scale of the Universe (primarily governed by the local expansion rate, H0) to 1% precision, and the growth rate of structure to 5%; (ii) to make the most extensive map yet constructed of the total mass distribution and motions in the local Universe, using peculiar velocities based on improved Fundamental Plane distances, which will enable sensitive tests of gravitational physics; and (iii) to deliver a legacy sample of low-redshift galaxies as a unique laboratory for studying galaxy evolution as a function of dark matter halo and stellar mass and environment. The final survey, which will be completed within 5 yrs, will consist of a complete magnitude-limited sample (i ⩽ 17) of about 1.2 × 106 galaxies supplemented by an extension to higher redshifts and fainter magnitudes (i ⩽ 18.1) of a luminous red galaxy sample of about 0.8 × 106 galaxies. Observations and data processing will be carried out remotely and in a fully automated way, using a purpose-built automated ‘virtual observer’ software and an automated data reduction pipeline. 
The Taipan survey is deliberately designed to maximise its legacy value by complementing and enhancing current and planned surveys of the southern sky at wavelengths from the optical to the radio; it will become the primary redshift and optical spectroscopic reference catalogue for the local extragalactic Universe in the southern sky for the coming decade.
OBJECTIVES/SPECIFIC AIMS: A major limitation of cardiac stem cell transplantation following myocardial infarction (MI) is poor retention of cells in the ischemic microenvironment. Our study aims to better understand and promote the survival and differentiation of human cardiosphere-derived cells (hCDCs) in anoxia, a feature of infarcted myocardium. METHODS/STUDY POPULATION: We previously demonstrated that TGFβ1 and heparin-containing hydrogels (TH-hydrogel) can promote murine CDC survival. In this study, hCDCs were incubated in either normoxia or anoxia for 8 hours with and without TH-hydrogel. In addition, hCDCs without TH-hydrogel were assessed in 16 hours of anoxia. Following incubation, hCDCs were assayed for viability using calcein dye and immunostained for CD31, a marker of endothelial differentiation. RESULTS/ANTICIPATED RESULTS: hCDCs incubated for 8 hours in anoxia in both models equally demonstrated increased survival up to 30% when compared with cells incubated in normoxia. However, in contrast to hCDCs alone, hCDCs with TH-hydrogel additionally demonstrated increased differentiation into endothelial cells in both anoxia and normoxia. We found that hCDCs alone were able to upregulate CD31 only when subjected to 16 hours of anoxia. DISCUSSION/SIGNIFICANCE OF IMPACT: We demonstrate a new, previously unknown response of hCDCs to anoxia. This induces increased viability and differentiation of hCDCs into endothelial cells. The differentiation in anoxia was time dependent and could be expedited with use of TH-hydrogel. Anoxic preconditioning of hCDCs together with the TH-hydrogel system may improve the therapeutic potential of stem cell transplantation following MI.
Contemporary state authorities in the United Kingdom and elsewhere have increasingly sought to regulate the use of public space. This paper explores through a doctrinal and socio-legal analysis how recently introduced Public Spaces Protection Orders (PSPOs) are being used in England and Wales to enforce majoritarian sensibilities at the expense of due process and civil liberties. PSPOs were introduced in October 2014. These orders grant considerable discretion to local authorities to use the threat of criminal sanction to regulate activities in public spaces that they regard as being detrimental to the quality of life of residents. This paper provides the first comprehensive critique of how these orders are used to target minority and vulnerable groups, while curtailing fundamental freedoms. The paper includes suggestions for reforms to make the PSPO function in a manner that is more compatible with a rights-based approach.
To examine variation in antibiotic coverage and detection of resistant pathogens in community-onset pneumonia.
A total of 128 hospitals in the Veterans Affairs health system.
Hospitalizations with a principal diagnosis of pneumonia from 2009 through 2010.
We examined proportions of hospitalizations with empiric antibiotic coverage for methicillin-resistant Staphylococcus aureus (MRSA) and Pseudomonas aeruginosa (PAER) and with initial detection in blood or respiratory cultures. We compared lowest- versus highest-decile hospitals, and we estimated adjusted probabilities (AP) for patient- and hospital-level factors predicting coverage and detection using hierarchical regression modeling.
Among 38,473 hospitalizations, empiric coverage varied widely across hospitals (MRSA lowest vs highest, 8.2% vs 42.0%; PAER lowest vs highest, 13.9% vs 44.4%). Detection rates also varied (MRSA lowest vs highest, 0.5% vs 3.6%; PAER lowest vs highest, 0.6% vs 3.7%). Whereas coverage was greatest among patients with recent hospitalizations (AP for anti-MRSA, 54%; AP for anti-PAER, 59%) and long-term care (AP for anti-MRSA, 60%; AP for anti-PAER, 66%), detection was greatest in patients with a previous history of a positive culture (AP for MRSA, 7.9%; AP for PAER, 11.9%) and in hospitals with a high prevalence of the organism in pneumonia (AP for MRSA, 3.9%; AP for PAER, 3.2%). Low hospital complexity and rural setting were strong negative predictors of coverage but not of detection.
Hospitals demonstrated widespread variation in both coverage and detection of MRSA and PAER, but probability of coverage correlated poorly with probability of detection. Factors associated with empiric coverage (eg, healthcare exposure) were different from those associated with detection (eg, microbiology history). Providing microbiology data during empiric antibiotic decision making could better align coverage to risk for resistant pathogens and could promote more judicious use of broad-spectrum antibiotics.
Increasingly, archaeological research in Amazonia is revealing complex precolonial occupation in areas around riverine confluences. In 2014, the first site-based archaeological investigations were undertaken in Gurupá, Pará, Brazil, a municipality that spans the region of the Xingu-Amazon confluence. The Portuguese controlled access to Amazonia from 1623 onward through a network of settlements organized around Gurupá. Results from extensive excavations of terra preta sites, landscape archaeology, and analysis of ceramic evidence suggest that this was also a precolonial crossroads. Carrazedo, once a booming historical town (Arapijó), sits atop a significantly larger terra preta site. Excavations in historical and precolonial sectors of Carrazedo found well-preserved remains, including a precolonial house terrace complex. The extent of terra preta and earthworks at Carrazedo indicates that the precolonial occupation was more intensive than the colonial-historical period occupation. Regional survey revealed colonial-historical period sites consistently overlying expansive precolonial sites, the density and extent of which suggest a major precolonial center at the Xingu-Amazon confluence. Overall, ecological and landscape modifications appear to have been more intense in the precolonial past than during later periods. Short- and long-distance settlement networks also differed during the two periods. This as-yet understudied region promises to shed new light on deep-time human-environment interactions and spatial organization in the humid tropics of Amazonia.
The γ-ray burst brightness distribution is inhomogeneous and the distribution on the sky is nearly isotropic. These features argue against an association of γ-ray bursts with those Galactic objects that are known to exhibit a strong concentration toward the Galactic center or plane. The observed statistical properties indicate a cosmological origin. Circumstantial evidence suggests that neutron stars are involved in the burst phenomenon. Here we consider Population II neutron stars in an extended Galactic Halo (EGH) as an alternative to cosmological scenarios. The BATSE data indicate a small deviation from isotropy near the 2σ level of statistical significance. If confirmed for an increasing number of bursts, these anisotropies could rule out cosmological scenarios. On the other hand, EGH models require small anisotropies like those observed by BATSE. We consider simple distribution models to determine the generic properties such halos must have to be consistent with the observations and discuss the implications of the corresponding distance scale on burst models.
Subject headings: gamma rays: bursts — stars: neutron — stars: Population II — stars: statistics
Hospital-acquired infections (HAIs) develop rapidly after brief and transient exposures, and ecological exposures are central to their etiology. However, many studies of HAI risk do not correctly account for the timing of outcomes relative to exposures, and they ignore ecological factors. We aimed to describe statistical practice in the most cited HAI literature as it relates to these issues, and to demonstrate how to implement models that can be used to account for them.
We conducted a literature search to identify 8 frequently cited articles having primary outcomes that were incident HAIs, were based on individual-level data, and used multivariate statistical methods. Next, using an inpatient cohort of incident Clostridium difficile infection (CDI), we compared 3 valid strategies for assessing risk factors for incident infection: a cohort study with time-fixed exposures, a cohort study with time-varying exposures, and a case-control study with time-varying exposures.
Of the 8 studies identified in the literature scan, 3 did not adjust for time-at-risk, 6 did not assess the timing of exposures in a time-window prior to outcome ascertainment, 6 did not include ecological covariates, and 6 did not account for the clustering of outcomes in time and space. Our 3 modeling strategies yielded similar risk-factor estimates for CDI risk.
Several common statistical methods can be used to augment standard regression methods to improve the identification of HAI risk factors.
Infect. Control Hosp. Epidemiol. 2016;37(4):411–419
To investigate biomarkers of nutrition associated with chronic disease absence for an Aboriginal cohort.
Screening for nutritional biomarkers was completed at baseline (1995). Evidence of chronic disease (diabetes, CVD, chronic kidney disease or hypertension) was sought from primary health-care clinics, hospitals and death records over 10 years of follow-up. Principal components analysis was used to group baseline nutritional biomarkers and logistic regression modelling used to investigate associations between the principal components and chronic disease absence.
Three Central Australian Aboriginal communities.
Aboriginal people (n 444, 286 of whom were without chronic disease at baseline) aged 15–82 years.
Principal components analysis grouped twelve nutritional biomarkers into four components: ‘lipids’; ‘adiposity’; ‘dietary quality’; and ‘habitus with inverse quality diet’. For the 286 individuals free of chronic disease at baseline, lower adiposity, lower lipids and better dietary quality components were each associated with the absence at follow-up of most chronic diseases examined, with the exception of chronic kidney disease. Low ‘adiposity’ component was associated with absence of diabetes, hypertension and CVD at follow-up. Low ‘lipid’ component was associated with absence of hypertension and CVD, and high ‘dietary quality’ component was associated with absence of CVD at follow-up.
Lowering or maintenance of the factors related to ‘adiposity’ and ‘lipids’ to healthy thresholds and increasing access to a healthy diet appear useful targets for chronic disease prevention for Aboriginal people in Central Australia.
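The pipeline described above — principal components analysis to group biomarkers into components, then logistic regression of chronic disease absence on the component scores — can be sketched end-to-end on synthetic data. All numbers below are invented stand-ins (a single latent "adiposity"-like axis driving 12 biomarkers), not the cohort's data:

```python
import numpy as np

rng = np.random.default_rng(2)

# Synthetic stand-in: 12 correlated nutritional biomarkers driven by one
# latent axis, measured on 444 people.
n, p = 444, 12
latent = rng.normal(size=(n, 1))
X = latent @ rng.normal(size=(1, p)) + rng.normal(0, 0.8, size=(n, p))

# Principal components of the standardised biomarkers (via SVD)
Z = (X - X.mean(axis=0)) / X.std(axis=0)
U, S, Vt = np.linalg.svd(Z, full_matrices=False)
pc = Z @ Vt[:2].T
pc /= pc.std(axis=0)                       # component scores in SD units

# Outcome: a lower PC1 score raises the odds of remaining disease-free
y = rng.binomial(1, 1 / (1 + np.exp(0.8 * pc[:, 0])))

# Logistic regression of disease absence on the component scores (Newton)
A = np.column_stack([np.ones(n), pc])
beta = np.zeros(3)
for _ in range(25):
    mu = 1 / (1 + np.exp(-A @ beta))
    W = mu * (1 - mu)
    beta += np.linalg.solve(A.T @ (W[:, None] * A), A.T @ (y - mu))

or_per_sd = np.exp(beta[1])                # odds ratio per SD of PC1 (< 1)
```

Standardising the component scores makes the exponentiated coefficient directly interpretable as the odds ratio of disease absence per standard-deviation increase in the component, the usual way such associations are reported.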
The spectral and temporal behavior of exoplanet host stars is a critical input to models of the chemistry and evolution of planetary atmospheres. High-energy photons (X-ray to NUV) from these stars regulate the atmospheric temperature profiles and photochemistry on orbiting planets, influencing the production of potential “biomarker” gases. We report first results from the MUSCLES Treasury Survey, a study of time-resolved UV and X-ray spectroscopy of nearby M and K dwarf exoplanet host stars. This program uses contemporaneous Hubble Space Telescope and Chandra (or XMM) observations to characterize the time variability of the energetic radiation field incident on the habitable zones of planetary systems at d ≲ 20 pc. We find that all exoplanet host stars observed to date exhibit significant levels of chromospheric and transition region UV emission. M dwarf exoplanet host stars display 30–7000% UV emission line amplitude variations on timescales of minutes to hours. The relative flare/quiescent UV flux amplitudes on weakly active planet-hosting M dwarfs are comparable to those of active flare stars (e.g., AD Leo), despite their weak optical activity indices (e.g., Ca II H and K equivalent widths). We also detect similar UV flare behavior on a subset of our K dwarf exoplanet host stars. We conclude that strong flares and stochastic variability are common, even on “optically inactive” M dwarfs hosting planetary systems. These results argue that the traditional assumption of weak UV fields and low flare rates on older low-mass stars needs to be revised.