Current adolescent substance use risk models have inadequately predicted use for African Americans, offering limited knowledge about differential predictability as a function of developmental period. Among a sample of 500 African American youth (ages 11–21), four risk indices (i.e., social risk, attitudinal risk, intrapersonal risk, and racial discrimination risk) were examined in the prediction of alcohol, marijuana, and cigarette initiation during early (ages 11–13), mid (ages 16–18), and late (ages 19–21) adolescence. Results showed that when developmental periods were combined, racial discrimination was the only index that predicted initiation for all three substances. However, when risk models were stratified based on developmental period, variation was found within and across substance types. Results highlight the importance of racial discrimination in understanding substance use initiation among African American youth and the need for tailored interventions based on developmental stage.
The Paleolithic diet excludes two major sources of fibre, grains and legumes. However, it is not known whether this results in changes to resistant starch (RS) consumption. Serum trimethylamine-N-oxide (TMAO) is produced mainly from colonic fermentation and hepatic conversion of animal protein and is implicated in CVD, but changes in RS intake may alter concentrations. We aimed to determine whether intake of RS and serum concentrations of TMAO varied in response to either the Paleolithic or the Australian Guide to Healthy Eating (AGHE) diets and whether this was related to changes in food group consumption. A total of thirty-nine women (mean age 47 (sd 13) years, BMI 27 (sd 4) kg/m2) were randomised to AGHE (n 17) or Paleolithic diets (n 22) for 4 weeks. Serum TMAO concentrations were measured using liquid chromatography–MS; food groups, fibre and RS intake were estimated from weighed food records. The change in TMAO concentrations between groups (Paleolithic 3·39 μm v. AGHE 1·19 μm, P = 0·654) did not reach significance despite greater red meat and egg consumption in the Paleolithic group (0·65 serves/d; 95 % CI 0·2, 1·1; P < 0·01 and 0·22 serves/d; 95 % CI 0·1, 0·4; P < 0·05, respectively). RS intake was significantly lower on the Paleolithic diet (P < 0·01) and was not associated with TMAO concentrations. However, the limited data for RS and the small sample size may have influenced these findings. While there were no significant changes in TMAO concentrations, increased meat consumption and reduced RS intake warrant further research to examine the markers of gastrointestinal health of Paleolithic diet followers and to update Australian food databases to include additional fibre components.
The role of vegetable and fruit intake in reducing falls risk in elderly populations is uncertain. This study examined the associations of vegetable and fruit intake with falls-related hospitalisations in a prospective cohort study of elderly women (n 1429, ≥70 years), including effects on muscular function, which represented a potential causal pathway. Muscular function, measured using grip strength and timed-up-and-go (TUG), and vegetable and fruit intake, quantified using a validated FFQ, were assessed at baseline (1998). Incident falls-related hospitalisation over 14·5-year follow-up was captured by the Hospital Morbidity Data Collection, linked via the Western Australian Data Linkage System. Falls-related hospitalisation occurred in 568 (39·7 %) of the women. In multivariable-adjusted models, falls-related hospitalisations were lower in participants consuming more vegetables (hazard ratio (HR) per 75 g serve: 0·90 (95 % CI 0·82, 0·99)), but not in those consuming more fruit (per 150 g serve: 1·03 (95 % CI 0·93, 1·14)). Only total cruciferous vegetable intake was inversely associated with falls-related hospitalisation (HR per 20 g serve: 0·90 (95 % CI 0·83, 0·97)). Higher total vegetable intake was associated with lower odds of poor grip strength (OR: 0·87 (95 % CI 0·77, 0·97)) and slow TUG (OR: 0·88 (95 % CI 0·78, 0·99)). Including grip strength and TUG in the multivariable-adjusted model attenuated the association between total vegetable intake and falls-related hospitalisations. In conclusion, elderly women with higher total and cruciferous vegetable intake had lower injurious falls risk, which may be explained in large part by better physical function. Falls reduction may be considered an additional benefit of higher vegetable intake in older women.
Chondrules contain ferromagnetic minerals that may retain a record of the magnetic field environments in which they cooled. Paleomagnetic experiments on separated chondrules can potentially reveal the presence of remanent magnetization from the time of chondrule formation. The existence of such a magnetization places quantitative bounds on the frequency of interchondrule collisions, while the intensity of magnetization may be used to infer the strength of nebular magnetic fields and thereby constrain the mechanism of chondrule formation. Recent advances in laboratory instrumentation and techniques have permitted the isolation of nebular remanent magnetization in chondrules, providing the potential basis to probe the formation environments of chondrules from a range of chondrite classes.
BACKGROUND: Meningiomas are the most common primary benign brain tumors in adults. Given the extended life expectancy of most meningioma patients, consideration of quality of life (QOL) is important when selecting the optimal management strategy. There is currently a dearth of meningioma-specific QOL tools in the literature. OBJECTIVE: In this systematic review, we analyze the prevailing themes and propose steps toward building a meningioma-specific QOL assessment tool. METHODS: A systematic search was conducted, and only original studies based on adult patients were considered. QOL tools used in the various studies were analyzed for identification of prevailing themes in the qualitative analysis. The quality of the studies was also assessed. RESULTS: Sixteen articles met all inclusion criteria. Fifteen different QOL assessment tools assessed social and physical functioning and psychological and emotional well-being. Patient perceptions and support networks had a major impact on QOL scores. Surgery negatively affected social functioning in younger patients, while radiation therapy had a variable impact. Any intervention appeared to have a greater negative impact on physical functioning compared to observation. CONCLUSION: Younger patients with meningiomas appear to be more vulnerable within social and physical functioning domains. All of these findings must be interpreted with great caution due to great clinical heterogeneity, limited generalizability, and risk of bias. For meningioma patients, the ideal QOL questionnaire would present outcomes that can be easily measured, presented, and compared across studies. Existing scales can be the foundation upon which a comprehensive, standard, and simple meningioma-specific survey can be prospectively developed and validated.
The study of extremely metal-poor (EMP; [Fe/H] < −3.0) and ultra metal-poor (UMP; [Fe/H] < −4.0) stars is crucial for better understanding first-star nucleosynthesis and for constraining the initial mass function in the early Universe. However, only ~25 UMP stars have been discovered in the past 25 years. A few recent theoretical studies have pointed out that large numbers of EMP and UMP stars are likely to exist in the periphery of the Galactic halo, at distances exceeding 30–50 kpc. We present identifications of several new EMP/UMP stars and introduce a survey to expedite the discovery of hundreds to thousands of EMP/UMP stars in the outermost halo (as well as in the local volume) over the next few years, which could revolutionize chemical-evolution studies of the Galaxy.
Narrow-band photometric surveys, such as the Javalambre Photometric Local Universe Survey (J-PLUS), provide not only a means of pre-selection for high-resolution follow-up, but open a new era of precision photometric stellar parameter determination. Using a family of machine learning algorithms known as Artificial Neural Networks (ANNs), we have obtained photometric estimates of effective temperature (Teff) and metallicity ([Fe/H]) across a wide parameter range of temperature and metallicity (4000 K < Teff < 7000 K; −3.5 < [Fe/H] < 0.0) for a number of stars in the J-PLUS Early Data Release. With this methodology, we expect to increase the number of known Carbon-enhanced Metal-poor (CEMP; [C/Fe] > +0.7) stars by several orders of magnitude, as well as constrain the metallicity distribution function of the Milky Way Halo system.
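As a rough illustration of the ANN-based regression described above, the following pure-Python sketch trains a one-hidden-layer network on synthetic "colour" inputs to recover a stand-in stellar parameter. This is not the authors' network, the J-PLUS filter set, or a realistic colour–temperature relation; the toy mapping and all numbers are invented for illustration only.

```python
import math, random

random.seed(0)

def make_star():
    # two synthetic narrow-band "colours" c1, c2 and a toy target parameter t
    # (a normalised stand-in for Teff); the linear mapping is invented
    c1 = random.uniform(0.0, 1.0)
    c2 = random.uniform(0.0, 1.0)
    t = 0.7 * c1 - 0.4 * c2 + 0.1
    return (c1, c2), t

train = [make_star() for _ in range(200)]

H = 4                                  # hidden units
w1 = [[random.uniform(-0.5, 0.5) for _ in range(2)] for _ in range(H)]
b1 = [0.0] * H
w2 = [random.uniform(-0.5, 0.5) for _ in range(H)]
b2 = 0.0

def forward(x):
    # tanh hidden layer, linear output
    h = [math.tanh(sum(w1[j][i] * x[i] for i in range(2)) + b1[j]) for j in range(H)]
    y = sum(w2[j] * h[j] for j in range(H)) + b2
    return h, y

lr = 0.05
for epoch in range(500):
    for x, t in train:
        h, y = forward(x)
        err = y - t                    # derivative of 0.5 * squared error
        for j in range(H):
            dh = err * w2[j] * (1 - h[j] ** 2)   # backprop through tanh
            w2[j] -= lr * err * h[j]
            for i in range(2):
                w1[j][i] -= lr * dh * x[i]
            b1[j] -= lr * dh
        b2 -= lr * err

mse = sum((forward(x)[1] - t) ** 2 for x, t in train) / len(train)
print(round(mse, 5))
```

The same idea scales up in the survey setting: many narrow-band colours in, (Teff, [Fe/H]) out, trained against stars with spectroscopic labels.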
The effectiveness of practice bundles on reducing ventilator-associated pneumonia (VAP) has been questioned.
OBJECTIVE
To implement a comprehensive program that included a real-time bundle compliance dashboard to improve compliance and reduce ventilator-associated complications.
DESIGN
Before-and-after quasi-experimental study with interrupted time-series analysis.
SETTING
Academic medical center.
METHODS
In 2007 a comprehensive institutional ventilator bundle program was developed. To assess bundle compliance and stimulate instant course correction of noncompliant parameters, a real-time computerized dashboard was developed. Program impact in 6 adult intensive care units (ICUs) was assessed. Bundle compliance was noted as an overall cumulative bundle adherence assessment, reflecting the percentage of time all elements were concurrently in compliance for all patients.
RESULTS
The VAP rate in all ICUs combined decreased from 19.5 to 9.2 VAPs per 1,000 ventilator-days following program implementation (P<.001). Bundle compliance significantly increased, from a score of 23% in August 2007 to 83% in June 2011 (P<.001). Implementation was associated with a significant immediate decrease in the overall ICU VAP rate of 3.28/1,000 ventilator-days (95% CI, 2.64–3.92/1,000 ventilator-days). Following the intervention, the VAP rate continued to decrease significantly at a rate of 0.20/1,000 ventilator-days per month (95% CI, 0.14–0.30/1,000 ventilator-days per month). Among all adult ICUs combined, improved bundle compliance was moderately correlated with monthly VAP rate reductions (Pearson correlation coefficient, −0.32).
CONCLUSION
A prevention program using a real-time bundle adherence dashboard was associated with significant sustained decreases in VAP rates and an increase in bundle compliance among adult ICU patients.
Infect. Control Hosp. Epidemiol. 2015;36(11):1261–1267
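The interrupted time-series analysis used in the study above is typically a segmented regression with a level-change and a slope-change term at the intervention month. The sketch below fits such a model by ordinary least squares on synthetic monthly rates; the data, intervention month, and effect sizes are invented and are not the study's.

```python
# Segmented regression for an interrupted time series (illustrative only).
# Columns of the design matrix: intercept, time, post-intervention indicator
# (level change), and months since intervention (slope change).

def ols(X, y):
    # solve (X^T X) b = X^T y by Gaussian elimination with partial pivoting
    k = len(X[0])
    A = [[sum(X[r][i] * X[r][j] for r in range(len(X))) for j in range(k)]
         for i in range(k)]
    v = [sum(X[r][i] * y[r] for r in range(len(X))) for i in range(k)]
    for i in range(k):
        p = max(range(i, k), key=lambda r: abs(A[r][i]))
        A[i], A[p] = A[p], A[i]
        v[i], v[p] = v[p], v[i]
        for r in range(i + 1, k):
            f = A[r][i] / A[i][i]
            for c in range(i, k):
                A[r][c] -= f * A[i][c]
            v[r] -= f * v[i]
    b = [0.0] * k
    for i in reversed(range(k)):
        b[i] = (v[i] - sum(A[i][c] * b[c] for c in range(i + 1, k))) / A[i][i]
    return b

start = 24                      # hypothetical intervention month
months = range(48)
# synthetic monthly VAP rates: flat before, an immediate drop plus a
# continuing downward slope after the intervention
rates = [19.5 if t < start else 12.0 - 0.2 * (t - start) for t in months]

X = [[1.0, t, 1.0 if t >= start else 0.0, max(0, t - start)] for t in months]
intercept, trend, level_change, slope_change = ols(X, rates)
print(round(level_change, 2), round(slope_change, 2))
```

With these synthetic rates the fit recovers the built-in immediate drop (−7.5) and post-intervention slope change (−0.2/month) exactly; real data would add noise and, usually, autocorrelation corrections.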
Influenza A (H1N1) pdm09 became the predominant circulating strain in the United States during the 2013–2014 influenza season. Little is known about the epidemiology of severe influenza during this season.
METHODS
A retrospective cohort study of severely ill patients with influenza infection in intensive care units in 33 US hospitals from September 1, 2013, through April 1, 2014, was conducted to determine risk factors for mortality present on intensive care unit admission and to describe patient characteristics, spectrum of disease, management, and outcomes.
RESULTS
A total of 444 adults and 63 children were admitted to an intensive care unit in a study hospital; 93 adults (20.9%) and 4 children (6.3%) died. By logistic regression analysis, the following factors were significantly associated with mortality among adult patients: older age (>65 years, odds ratio, 3.1 [95% CI, 1.4–6.9], P=.006 and 50–64 years, 2.5 [1.3–4.9], P=.007; reference age 18–49 years), male sex (1.9 [1.1–3.3], P=.031), history of malignant tumor with chemotherapy administered within the prior 6 months (12.1 [3.9–37.0], P<.001), and a higher Sequential Organ Failure Assessment score (for each increase by 1 in score, 1.3 [1.2–1.4], P<.001).
CONCLUSION
Risk factors for death among US patients with severe influenza during the 2013–2014 season, the first postpandemic season in which influenza A (H1N1) pdm09 was again the predominant circulating strain, shifted toward those of a more typical epidemic influenza season.
Infect. Control Hosp. Epidemiol. 2015;36(11):1251–1260
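The "odds ratio [95% CI]" figures reported above come from a multivariable logistic regression. As a simpler, hedged illustration of where such numbers come from, the snippet below computes a crude (univariable) odds ratio and its Woolf confidence interval from a 2x2 table of invented counts; it is not the study's model or data.

```python
import math

def odds_ratio_ci(a, b, c, d, z=1.96):
    # a: exposed deaths, b: exposed survivors,
    # c: unexposed deaths, d: unexposed survivors
    or_ = (a * d) / (b * c)
    se = math.sqrt(1/a + 1/b + 1/c + 1/d)      # SE of log(OR), Woolf method
    lo = math.exp(math.log(or_) - z * se)
    hi = math.exp(math.log(or_) + z * se)
    return or_, lo, hi

# hypothetical counts: 30/100 deaths among exposed, 15/100 among unexposed
or_, lo, hi = odds_ratio_ci(30, 70, 15, 85)
print(round(or_, 2), round(lo, 2), round(hi, 2))
```

A multivariable model produces adjusted ORs by exponentiating regression coefficients, but the interpretation of the OR and CI is the same.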
Beveled retouch on stone projectile points has often been considered a device to spin and stabilize a projectile. A recent paper showed that a beveled point will spin a small shaft under tightly controlled laboratory conditions. However, this experiment has little relevance for real projectiles such as atlatl darts, which flex dramatically and spin unevenly in flight, quite independent of point form. The spinning is related to the flexibility of the dart, which is necessary for spearthrower function. A beveled point cannot spin a dart in the air, but is likely to cause some rotation when encountering a solid target like flesh. Beveled points are probably not related to spinning either darts or arrows in flight and present a good example of why we need to have both theoretical understanding and experimental observations of details of projectile behavior before interpreting artifacts. Spinning in a carcass could make beveled points more lethal, but the suggestion that beveling mostly results from sharpening and other modification of stone points remains the best explanation.
Pronunciation variation is under-studied in infant-directed speech, particularly for consonants. Regressive place assimilation involves a word-final alveolar stop taking the place of articulation of a following word-initial consonant. We investigated pronunciation variation in word-final alveolar stop consonants in storybooks read by forty-eight mothers in adult-directed or infant-directed style to infants aged approximately 0;3, 0;9, 1;1, or 1;8. We focused on phonological environments where regressive place assimilation could occur, i.e., when the stop preceded a word-initial labial or velar consonant. Spectrogram, waveform, and perceptual evidence was used to classify tokens into four pronunciation categories: canonical, assimilated, glottalized, or deleted. Results showed a reliable tendency for canonical variants to occur in infant-directed speech more often than in adult-directed speech. However, the otherwise very similar distributions of variants across addressee and age group suggested that infants largely experience statistical distributions of non-canonical consonantal pronunciation variants that mirror those experienced by adults.
This chapter describes crossover trials and their applications in neurology. Crossover trials could be used to study aspects of many common neurological and psychiatric disorders. To illustrate the efficiency of crossover designs, the chapter presents sample size estimates for two placebo-controlled parallel designs and one crossover design for a trial examining the efficacy of donepezil in treating dementia in patients with Parkinson's disease. It also describes approaches to mitigate carryover effects. Alternatives to the 2 x 2 design are used to increase efficiency, provide unbiased estimates in the presence of carryover effects, and compare more than two treatments. The chapter reviews response-adaptive designs, matching, and N-of-1 trials, along with several recent innovations in design. Simple carryover depends only on the treatment given in the period immediately preceding the one in which the carryover occurs. Crossover trials have logistical challenges beyond the careful planning and implementation that accompany any successful clinical trial.
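The efficiency argument for crossover designs can be made concrete with approximate normal-theory sample size formulas: a parallel trial compares between-subject variability, while a 2 x 2 crossover compares within-subject differences, whose variance shrinks with the within-subject correlation. The sketch below uses invented effect size, SD, and correlation values, not the donepezil trial's figures.

```python
import math

def z(p):
    # inverse standard normal CDF via bisection on math.erf
    lo, hi = -10.0, 10.0
    for _ in range(200):
        mid = (lo + hi) / 2
        if 0.5 * (1 + math.erf(mid / math.sqrt(2))) < p:
            lo = mid
        else:
            hi = mid
    return (lo + hi) / 2

def n_parallel(delta, sd, alpha=0.05, power=0.8):
    # total subjects for a two-arm parallel trial, continuous outcome
    za, zb = z(1 - alpha / 2), z(power)
    per_arm = 2 * (za + zb) ** 2 * sd ** 2 / delta ** 2
    return 2 * math.ceil(per_arm)

def n_crossover(delta, sd, rho, alpha=0.05, power=0.8):
    # total subjects for a 2x2 crossover; rho is within-subject correlation
    za, zb = z(1 - alpha / 2), z(power)
    var_within = 2 * sd ** 2 * (1 - rho)   # variance of within-subject difference
    return math.ceil((za + zb) ** 2 * var_within / delta ** 2)

# hypothetical numbers: detect a 2-point difference, SD 5, correlation 0.6
total_parallel = n_parallel(delta=2.0, sd=5.0)
total_crossover = n_crossover(delta=2.0, sd=5.0, rho=0.6)
print(total_parallel, total_crossover)
```

With these assumptions the crossover needs a fraction (1 − rho)/2 of the parallel trial's total sample, which is the efficiency gain the chapter illustrates; the saving evaporates if carryover effects bias the within-subject comparison.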