The availability of automated powder diffractometers (APDs) has revolutionized the collection of diffraction data and enabled many improvements in the analysis of those data. The biggest change is the ease of digitizing the diffraction trace rather than recording it as a strip chart on paper with an analog recorder. When the data are collected properly, the trace is a digitized record of intensity versus 2θ: an accurate representation of the diffraction pattern that contains the sample information along with the spectral and instrument aberrations.
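Because the digitized trace is simply paired intensity and 2θ values, it can be processed directly. Here is a minimal sketch (Python), assuming a hypothetical two-column text file `pattern.xy` of 2θ and counts and an arbitrary prominence threshold, of reading such a trace and locating reflections:

```python
# Minimal sketch: read a digitized powder diffraction trace
# (two-column text: 2-theta in degrees, counts) and locate peaks.
# The file name and prominence threshold are illustrative assumptions.
import numpy as np
from scipy.signal import find_peaks

two_theta, intensity = np.loadtxt("pattern.xy", unpack=True)

# Flag reflections that rise clearly above the background.
peaks, _ = find_peaks(intensity, prominence=50)

for i in peaks:
    print(f"2-theta = {two_theta[i]:6.3f} deg, I = {intensity[i]:.0f}")
```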
To examine the relationship between protein intake and the risk of incident premenstrual syndrome (PMS).
Nested case–control study. FFQ were completed every 4 years during follow-up. Our main analysis assessed protein intake 2–4 years before PMS diagnosis (for cases) or reference year (for controls). Baseline (1991) protein intake was also assessed.
Nurses’ Health Study II (NHS2), a large prospective cohort study of registered female nurses in the USA.
Participants were premenopausal women between the ages of 27 and 44 years (mean: 34 years), without diagnosis of PMS at baseline, without a history of cancer, endometriosis, infertility, irregular menstrual cycles or hysterectomy. Incident cases of PMS (n 1234) were identified by self-reported diagnosis during 14 years of follow-up and validated by questionnaire. Controls (n 2426) were women who did not report a diagnosis of PMS during follow-up and confirmed experiencing minimal premenstrual symptoms.
In logistic regression models adjusting for smoking, BMI, B-vitamins and other factors, total protein intake was not associated with PMS development. For example, the OR for women with the highest intake of total protein 2–4 years before their reference year (median: 103·6 g/d) v. those with the lowest (median: 66·6 g/d) was 0·94 (95 % CI 0·70, 1·27). Additionally, intakes of specific protein sources and amino acids were not associated with PMS. Furthermore, results substituting carbohydrates and fats for protein were also null.
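As a rough illustration of how such an odds ratio and confidence interval arise, the following sketch (Python, with invented counts rather than the study's data) shows the unadjusted calculation; the published estimate additionally adjusts for smoking, BMI, B-vitamins and other covariates within the logistic model:

```python
# Minimal sketch of an unadjusted odds ratio with a Wald 95% CI from a
# 2x2 table. All counts are invented for illustration, not study data.
import math

cases_hi, cases_lo = 240, 250        # highest vs lowest quintile, cases
controls_hi, controls_lo = 480, 470  # same contrast, controls

or_hat = (cases_hi * controls_lo) / (cases_lo * controls_hi)
se_log = math.sqrt(1 / cases_hi + 1 / cases_lo
                   + 1 / controls_hi + 1 / controls_lo)
lo = math.exp(math.log(or_hat) - 1.96 * se_log)
hi = math.exp(math.log(or_hat) + 1.96 * se_log)
print(f"OR = {or_hat:.2f} (95% CI {lo:.2f}, {hi:.2f})")
```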
Overall, protein consumption was not associated with risk of developing PMS.
The rapid and massive adoption of mobile money transfer (MMT) services in East Africa, particularly in Kenya, stands in stark contrast to historically low use of formal financial systems on the continent. The ‘fertile grounds’ of this adoption therefore require in-depth analysis to understand the implications for African financial systems. This paper argues for the need to examine the underlying conceptual environment that enables MMT adoption by low-income and poor people. It innovatively combines anthropological with ethnolinguistic analytical approaches to distinguish two repertoires around resource exchange. The first is a relational financial repertoire, in which relationships are developed and consolidated to create support and ‘upliftment’. The second, a contrasting resource-focused repertoire, is more like that of the formal financial sector. Identifying the conceptual features of relationality, the study offers a new perspective on the adoption and use of MMT in Africa and highlights the potential for disjunctures with policy efforts to increase financial inclusion.
In 2018, the Clostridium difficile LabID event methodology changed so that hospitals using 2-step testing, a nucleic acid amplification test (NAAT) followed by an enzyme immunoassay (EIA), had their risk adjustment modified to reflect EIA-based testing, and only positive final tests (eg, EIA) were counted in the numerator. We report the immediate impact of this methodological change at 3 Milwaukee hospitals.
Studies of the Supreme Court of Canada (SCC) focus largely on its policy-making role and its interpretation of the Charter of Rights. However, less studied are the Court's decisions in earlier periods, especially in comparison to the Charter years and in cases beyond civil rights and liberties. This study fills a gap in the scholarship by analyzing the universe of decisions from 1945 to 2005 in criminal, tax and tort cases. Utilizing Baum's (1988, 1989) method to examine policy change, I explore policy trends on the Supreme Court. The findings suggest that, for the most part, the SCC has remained a stable, consistent body over the course of its modern history. It appears that most of the variation in judicial output across time is due to issue change with some shifts due to personnel and membership change.
We developed a decision analytic model to evaluate the impact of a preoperative Staphylococcus aureus decolonization bundle on surgical site infections (SSIs), health-care–associated costs (HCACs), and deaths due to SSI.
Our model population comprised US adults undergoing elective surgery. We evaluated 3 self-administered preoperative strategies: (1) the standard of care (SOC), consisting of 2 disinfectant soap showers; (2) the “test-and-treat” strategy, consisting of the decolonization bundle (chlorhexidine gluconate [CHG] soap, CHG mouth rinse, and mupirocin nasal ointment for 5 days) if S. aureus was found at any of 4 screened sites (nasal, throat, axillary, perianal area), and otherwise the SOC; and (3) the “treat-all” strategy, consisting of the decolonization bundle for all patients, without S. aureus screening. Model parameters were derived primarily from a randomized controlled trial that measured the efficacy of the decolonization bundle for eradicating S. aureus.
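To make the decision-tree logic concrete, here is a minimal sketch (Python). The carriage prevalence and eradication rates echo the companion trial reported later in this collection; every cost and SSI-risk input is an invented placeholder, not a published model parameter:

```python
# Minimal sketch of the decision-tree logic: expected SSI probability and
# expected cost per operation under each strategy. Carriage prevalence and
# eradication rates echo the companion trial; all cost and SSI-risk inputs
# below are invented placeholders, not the published model parameters.
P_CARRIER = 0.297          # S. aureus carriage prevalence (trial screening)
P_SSI_COLONIZED = 0.04     # assumed SSI risk if still colonized at surgery
P_SSI_CLEAR = 0.01         # assumed SSI risk otherwise
SSI_COST = 25_000          # assumed health-care-associated cost per SSI

# Assumed per-patient regimen costs (screening, bundle, SOC showers).
COST = {
    "SOC": 5.0,
    "test-and-treat": 40.0 + P_CARRIER * 25.0 + (1 - P_CARRIER) * 5.0,
    "treat-all": 25.0,
}
# Probability that a carrier is decolonized before surgery.
ERADICATION = {"SOC": 0.245, "test-and-treat": 0.719, "treat-all": 0.719}

def expected_outcomes(strategy):
    p_colonized = P_CARRIER * (1 - ERADICATION[strategy])
    p_ssi = p_colonized * P_SSI_COLONIZED + (1 - p_colonized) * P_SSI_CLEAR
    return p_ssi, COST[strategy] + p_ssi * SSI_COST

for s in ("SOC", "test-and-treat", "treat-all"):
    p_ssi, cost = expected_outcomes(s)
    print(f"{s:>14}: P(SSI) = {p_ssi:.4f}, expected cost = ${cost:,.2f}")
```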
Under base-case assumptions, the treat-all strategy yielded the fewest SSIs and the lowest HCACs, followed by the test-and-treat strategy. In contrast, the SOC yielded the most SSIs and the highest HCACs. Consequently, relative to the SOC, the average savings per operation was $217 for the treat-all strategy and $123 for the test-and-treat strategy, and the average savings per SSI prevented was $21,929 for the treat-all strategy and $15,166 for the test-and-treat strategy. All strategies were sensitive to the probability of acquiring an SSI and to the increased risk of SSI if the patient was colonized with S. aureus.
We predict that the treat-all strategy would be the most effective and cost-saving strategy for preventing SSIs. However, because this strategy might select more extensively for mupirocin-resistant S. aureus and cause more medication adverse effects than the test-and-treat approach or the SOC, additional studies are needed to define its comparative benefits and harms.
High protein intake in young children is associated with excess gains in weight and body fat, but the specific role of different protein sources has yet to be described. The study aimed to investigate the role of different types of protein in the post-weaning stage on weight, BMI and overweight/obesity at 60 months. Intakes of animal, dairy and plant protein and a dietary pattern characterising variation in protein types at 21 months of age were estimated using a 3-d diet diary in a cohort of 2154 twins; weight and height were recorded every 3 months from birth to 60 months. Longitudinal mixed-effect models investigated the associations between sources of protein intake or dietary pattern scores and BMI, weight and overweight/obesity from 21 months up to 60 months. Adjusting for confounders, dairy protein intake at 21 months was positively associated with greater weight (46 g; 95 % CI 21, 71) and BMI (0·04 kg/m2; 95 % CI 0·004, 0·070) up to 60 months, and with the odds of overweight/obesity at 3 years (OR 1·12; 95 % CI 1·00, 1·24). Milk showed associations of similar magnitude. A dietary pattern low in dairy protein and high in plant protein was associated with lower weight gain up to 60 months, but not with overweight/obesity. Intake of dairy products in early childhood is more strongly associated with weight gain than other protein sources. A dietary pattern characterised by lower protein intake and greater protein source diversity at 2 years may confer a lower risk of excess weight gain.
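As an illustration of the modelling approach, here is a minimal sketch (Python/statsmodels) of a longitudinal mixed-effects fit with a random intercept per child. The file and column names are hypothetical, and the published analysis additionally handles twin clustering and a fuller set of confounders:

```python
# Minimal sketch of a longitudinal mixed-effects model: repeated BMI
# measurements regressed on protein-source intakes at 21 months, with a
# random intercept per child. File and column names are hypothetical.
import pandas as pd
import statsmodels.formula.api as smf

df = pd.read_csv("growth_long.csv")  # one row per child per visit

model = smf.mixedlm(
    "bmi ~ dairy_protein_g + plant_protein_g + age_months + sex",
    data=df,
    groups=df["child_id"],  # random intercept for each child
)
result = model.fit()
print(result.summary())
```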
To determine the efficacy of a 5-day preoperative decolonization bundle, compared to 2 disinfectant soap showers, in eradicating Staphylococcus aureus (SA) carriage, with both regimens self-administered at home.
Open label, single-center, randomized clinical trial.
Ambulatory orthopedic, urologic, neurologic, colorectal, cardiovascular, and general surgery clinics at a tertiary-care referral center in the United States.
Patients at the University of Minnesota Medical Center planning to have elective surgery and not on antibiotics.
Consenting participants were screened for SA colonization using nasal, throat, axillary, and perianal swab cultures. Carriers of SA were randomized, stratified by methicillin resistance status, to a decolonization bundle group (5 days of nasal mupirocin, chlorhexidine gluconate [CHG] bathing, and CHG mouthwash) or control group (2 preoperative showers with antiseptic soap). Colonization status was reassessed preoperatively. The primary endpoint was absence of SA at all 4 screened body sites.
Of 427 participants screened between August 31, 2011, and August 9, 2016, 127 participants (29.7%) were SA carriers. Of these, 121 were randomized and 110 were eligible for efficacy analysis (57 decolonization bundle group, 53 control group). Overall, 90% of evaluable participants had methicillin-susceptible SA strains. Eradication of SA at all body sites was achieved for 41 of 57 participants (71.9%) in the decolonization bundle group and for 13 of 53 participants (24.5%) in the control group, a difference of 47.4% (95% confidence interval [CI], 29.1%–65.7%; P<.0001).
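The reported effect size can be checked arithmetically. A minimal sketch (Python) using a Wald approximation reproduces the 47.4% point estimate, though the published interval was presumably computed with a different method:

```python
# Minimal sketch checking the reported effect size: difference in
# eradication proportions with a Wald 95% CI. The published interval
# may have been computed with a different (e.g. exact) method.
import math

p1, n1 = 41 / 57, 57   # decolonization bundle group
p2, n2 = 13 / 53, 53   # control group

diff = p1 - p2
se = math.sqrt(p1 * (1 - p1) / n1 + p2 * (1 - p2) / n2)
lo, hi = diff - 1.96 * se, diff + 1.96 * se
print(f"difference = {diff:.1%} (Wald 95% CI {lo:.1%} to {hi:.1%})")
# difference = 47.4% (Wald 95% CI 31.0% to 63.8%)
```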
An outpatient preoperative antiseptic decolonization bundle aimed at 4 body sites was significantly more effective in eradicating SA than the usual disinfectant showers (ie, the control).
We aimed to explore multiple perspectives regarding barriers to and facilitators of advance care planning (ACP) among African Americans to identify similarities or differences that might have clinical implications.
Qualitative study with health disparities experts (n = 5), community members (n = 9), and seriously ill African American patients and caregivers (n = 11). Using template analysis, interviews were coded to identify intrapersonal, interpersonal, and systems-level themes in accordance with a social ecological framework.
Participants identified seven primary factors that influence ACP for African Americans: religion and spirituality; trust and mistrust; family relationships and experiences; patient-clinician relationships; prognostic communication; care preferences; and preparation and control. These influences echo those described in the existing literature; however, our data highlight consistent differences by group in the degree to which these factors positively or negatively affect ACP. Expert participants reinforced common themes from the literature, for example, that African Americans were not interested in prognostic information because of mistrust and religion. Seriously ill patients were more likely to express trust in their clinicians and to desire prognostic communication; they and community members expressed a desire to prepare for and control the end of life. Religious belief did not appear to negate these desires.
Significance of results
The literature on ACP in African Americans may not accurately reflect the experience of seriously ill African Americans. What are commonly understood as barriers to ACP may in fact not be barriers at all. We propose reframing stereotypical barriers to ACP, such as religion and spirituality or family, as cultural assets that should be engaged to enhance ACP. Although further research can inform best practices for engaging African American patients in ACP, our findings suggest that respectful, rapport-building communication may facilitate it. Clinicians are encouraged to engage in early ACP using respectful, rapport-building communication practices, including open-ended questions.
Approximately 8–20 % of reproductive-aged women experience premenstrual syndrome (PMS), which substantially impacts quality of life. Women with PMS are encouraged to reduce fat intake to alleviate symptoms; however, its role in PMS development is unclear. We evaluated the association between dietary fat intake and PMS development among a subset of the prospective Nurses’ Health Study II cohort. We compared 1257 women reporting clinician-diagnosed PMS, confirmed by premenstrual symptom questionnaire, with 2463 matched controls with no or minimal premenstrual symptoms. Intakes of total fat, fat subtypes and fatty acids were assessed via FFQ. After adjustment for age, BMI, smoking, Ca and other factors, intakes of total fat, MUFA, PUFA and trans-fat measured 2–4 years before diagnosis were not associated with PMS. High SFA intake was associated with lower PMS risk (relative risk (RR) quintile 5 (median=28·1 g/d) v. quintile 1 (median=15·1 g/d)=0·75; 95 % CI 0·58, 0·98; Ptrend=0·07). This association was largely attributable to stearic acid intake, with women in the highest quintile (median=7·4 g/d) having a RR of 0·75 v. those with the lowest intake (median=3·7 g/d) (95 % CI 0·57, 0·97; Ptrend=0·03). Individual PUFA and MUFA, including n-3 fatty acids, were not associated with risk. Overall, fat intake was not associated with higher PMS risk. High intake of stearic acid may be associated with a lower risk of developing PMS. Additional prospective research is needed to confirm this finding.
To determine the impact of recurrent Clostridium difficile infection (RCDI) on patient behaviors following illness.
Using a computer algorithm, we searched the electronic medical records of 7 Chicago-area hospitals to identify patients with RCDI (2 episodes of CDI within 15 to 56 days of each other). RCDI was validated by medical record review. Patients were asked to complete a telephone survey. The survey included questions regarding general health, social isolation, symptom severity, emotional distress, and prevention behaviors.
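The recurrence window lends itself to a simple check. Here is a minimal sketch (Python) of the rule, with hypothetical data structures standing in for the actual EMR query output:

```python
# Minimal sketch of the recurrence rule described above: a patient is
# flagged as RCDI when two CDI episodes fall 15-56 days apart. The data
# structures are hypothetical stand-ins for the EMR query output.
from datetime import date
from itertools import combinations

def has_rcdi(episode_dates):
    """True if any two CDI episodes are 15-56 days apart."""
    return any(
        15 <= abs((b - a).days) <= 56
        for a, b in combinations(episode_dates, 2)
    )

print(has_rcdi([date(2015, 3, 1), date(2015, 3, 31)]))  # True (30 days)
print(has_rcdi([date(2015, 3, 1), date(2015, 3, 10)]))  # False (9 days)
```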
In total, 119 patients completed the survey (32%). On average, respondents were 57.4 years old (standard deviation, 16.8); 57% were white, and ~50% reported hospitalization for CDI. At the time of their most recent illness, patients rated their diarrhea as high severity (58.5%) and their exhaustion as extreme (30.7%). Respondents indicated that they were very worried about getting sick again (41.5%) and about infecting others (31%). Almost 50% said that they have washed their hands more frequently (47%) and have increased their use of soap and water (45%) since their illness. Some of these patients (22%–32%) reported eating out less, avoiding certain medications and public areas, and increasing probiotic use. Most behavioral changes were unrelated to disease severity.
Having had RCDI appears to increase prevention-related behaviors in some patients. While some behaviors are appropriate (eg, handwashing), others are not supported by evidence of decreased risk and may negatively impact patient quality of life. Providers should discuss appropriate prevention behaviors with their patients and should clarify that other behaviors (eg, eating out less) will not affect their risk of future illness.
Objectives: Human immunodeficiency virus (HIV) disproportionately affects Hispanics/Latinos in the United States, yet little is known about neurocognitive impairment (NCI) in this group. We compared the rates of NCI in large well-characterized samples of HIV-infected (HIV+) Latinos and (non-Latino) Whites, and examined HIV-associated NCI among subgroups of Latinos. Methods: Participants included English-speaking HIV+ adults assessed at six U.S. medical centers (194 Latinos, 600 Whites). For the overall group: age M=42.65 years (SD=8.93); 86% male; education M=13.17 years (SD=2.73); 54% had acquired immunodeficiency syndrome. NCI was assessed with a comprehensive test battery with normative corrections for age, education and gender. Covariates examined included HIV-disease characteristics, comorbidities, and genetic ancestry. Results: Compared with Whites, Latinos had higher rates of global NCI (54% vs. 42%), and domain NCI in executive function, learning, recall, working memory, and processing speed. Latinos also fared worse than Whites on current and historical HIV-disease characteristics, and nadir CD4 partially mediated ethnic differences in NCI. Yet, Latinos continued to have more global NCI [odds ratio (OR)=1.59; 95% confidence interval (CI)=1.13–2.23; p<.01] after adjusting for significant covariates. Higher rates of global NCI were observed with Puerto Rican (n=60, 71%) versus Mexican (n=79, 44%) origin/descent; this disparity persisted in models adjusting for significant covariates (OR=2.40; CI=1.11–5.29; p=.03). Conclusions: HIV+ Latinos, especially of Puerto Rican (vs. Mexican) origin/descent, had increased rates of NCI compared with Whites. Differences in rates of NCI were not completely explained by worse HIV-disease characteristics, neurocognitive comorbidities, or genetic ancestry. Future studies should explore culturally relevant psychosocial, biomedical, and genetic factors that might explain these disparities and inform the development of targeted interventions. (JINS, 2018, 24, 163–175)
Objectives: The present study examined differences in neurocognitive outcomes among non-Hispanic Black and White stroke survivors using the NIH Toolbox-Cognition Battery (NIHTB-CB), and investigated the roles of healthcare variables in explaining racial differences in neurocognitive outcomes post-stroke. Methods: One hundred seventy adults (91 Black; 79 White) who participated in a multisite study were included (age: M=56.4, SD=12.6; education: M=13.7, SD=2.5; 50% male; years post-stroke: 1–18; stroke type: 72% ischemic, 28% hemorrhagic). Neurocognitive function was assessed with the NIHTB-CB, using demographically corrected norms. Participants completed measures of socio-demographic characteristics, health literacy, and healthcare use and access. Stroke severity was assessed with the Modified Rankin Scale. Results: An independent samples t test indicated that Blacks showed more neurocognitive impairment (NIHTB-CB Fluid Composite T-score: M=37.63, SD=11.67) than Whites (Fluid T-score: M=42.59, SD=11.54; p=.006). This difference remained significant after adjusting for reading level (NIHTB-CB Oral Reading), and when stratified by stroke severity. Blacks also scored lower on health literacy, reported differences in insurance type, and reported decreased confidence in the doctors treating them. Multivariable models adjusting for reading level and injury severity showed that health literacy and insurance type were statistically significant predictors of the Fluid cognitive composite (p<.001 and p=.02, respectively) and significantly mediated racial differences in neurocognitive impairment. Conclusions: We replicated prior work showing that Blacks are at increased risk for poorer neurocognitive outcomes post-stroke than Whites. Health literacy and insurance type might be important modifiable factors influencing these differences. (JINS, 2017, 23, 640–652)
The Tiskilwa Till Member of the Wedron Formation represents deposition by basal melt-out in the marginal area of the Laurentide ice sheet during the Woodfordian (late-Wisconsinan) in Illinois. Distinctive characteristics include: a very thick, homogeneous till; relatively little ablation till; red color; sandy texture; illite content that is relatively low with respect to other Woodfordian tills; and the presence of discontinuous basal zones of differing composition.
Erosion and entrainment of debris from both distant and local source areas are evident in the Tiskilwa Till. Basal thermal regime is suggested as a major controlling factor on the location of the zones of entrainment. The debris was homogenized en route to the margin and eventually was deposited as basal melt-out till near the margin. Deposition occurred within an interval of 6 ka or more during the first half of the Woodfordian.
Eastern hemlock [Tsuga canadensis (L.) Carrière] is a valuable component of Allegheny Plateau forests in northwestern Pennsylvania and western New York. Since the 1950s, hemlock forests throughout the Central Appalachians have been under threat from a nonnative forest insect pest, the hemlock woolly adelgid (Adelges tsugae Annand). In 2012, to address this threat at the most meaningful scale, the United States Forest Service and The Nature Conservancy organized a diverse partnership to develop a strategy for landscape-level conservation of hemlock on the High Allegheny Unglaciated Plateau. The main goal of the partnership was to locate hemlock across the landscape regardless of land ownership and prioritize it for monitoring and protection from the adelgid. The priority Hemlock Conservation Areas identified by this partnership provide a guide for focusing limited financial and personnel resources, with the goal of protecting at least a portion of these areas from the impacts of the adelgid until more long-term management techniques are identified. To protect the important hemlock forests identified in this prioritization, a partnership of private and public land managers is forming a Cooperative Pest Management Area to continue this important collaboration, allocate scarce resources across the area, and allow private partners access to public funding for protection of priority hemlock on their lands.
In this chapter, you will learn about the unique challenges of learning game design, the necessary multidisciplinary makeup of learning game design teams, and ways to improve team efficiency and effectiveness through communication. Learning games combine content and context to create a meaningful interaction between players’ experience and learning. They often employ an experiential learning strategy and have been called “designed experiences” (Squire, 2006). When learning games are viewed in this light, designing them becomes quite a challenge for several reasons: 1) many variables must be manipulated to achieve the right kind of learning experience at the right time; 2) learning game design has characteristics of ill-structured problem solving; 3) as an ill-structured problem, it requires learning game designers with a high level of expertise; and 4) the solution requires input from multiple disciplines. Having a highly skilled multidisciplinary design team raises another set of challenges, including the development of a shared mental model. Research has shown that when team members think similarly, they are more likely to work effectively together (Cannon-Bowers & Salas, 1998; Guzzo & Salas, 1995; Hackman, 1990). When team members understand their differences and take measures to leverage them, learning game design teams are strengthened, leading to a more efficient and effective design process. Research indicates that multidisciplinary learning game design team members think differently about: 1) design goals; 2) authenticity requirements; 3) feedback design; 4) the integration of fun within the learning experience; 5) term definitions; and 6) documentation contents. Current design models do not include steps to mitigate these differences and to build a team’s shared mental model. Therefore, we provide specific actions that should be integrated into a learning game design model to support the critical communications among learning game design team members.