Machu Picchu, in Cuzco, is one of the most famous archaeological sites in South America. The precise dating of the monumental complex, however, relies largely on documentary sources. Samples of bone and teeth from individuals buried in caves at four cemeteries around Machu Picchu form the basis for a new programme of AMS radiocarbon-dating. The results show that the site was occupied from c. AD 1420–1532, with activity beginning two decades earlier than suggested by the textual sources that associate the site with Emperor Pachacuti's rise to power in AD 1438. The new AMS dates—the first large set published for Machu Picchu—therefore have implications for the wider understanding of Inca chronology.
Background: As the world prepared for and responded to the COVID-19 pandemic in early 2020, a rapid increase in demand for personal protective equipment (PPE) led to severe shortages worldwide. Acquisition of PPE in the general market was an integral part of pandemic response, along with the safeguarding of hospital supplies. We sought to quantify the difference in cost per unit (CPU) of PPE during the first wave of COVID-19 compared to prepandemic prices. Methods: We performed a retrospective review of market prices for PPE during the first surge of the pandemic in Chicago. The cost of PPE was tabulated and compared with prepandemic prices. The maximum CPU of PPE was tabulated for each week, and the average cost throughout the pandemic was calculated. Disposable gowns, washable gowns, N95 respirators, face masks, and gloves were included in our analysis. Results: PPE prices were significantly higher during the pandemic compared to prepandemic prices (Figure 1). Disposable gown CPU peaked at $12 during the first week of March, 13.7 times higher than prepandemic prices, and the average gown CPU was 7.5 times higher than prepandemic prices. N95 respirators had a peak CPU of $12, and average CPU was 8 times higher than prepandemic prices. Face-mask CPU peaked at $0.55, 11 times higher than the prepandemic price, and averaged 9 times higher. Gloves averaged 2.5 times the prepandemic CPU. Conclusions: Market prices for PPE were significantly elevated during the first weeks of the pandemic and remained high throughout the first wave of COVID-19. Multiple factors likely contributed to high prices, including demand shock, disrupted supply chains, and a rush to acquire supplies by healthcare systems and the general population alike. The impact of COVID-19 on prices highlights the importance of supply chains and national stockpiles for pandemic preparedness.
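To make the price comparison above concrete, here is a minimal sketch of the peak and average CPU calculation against a prepandemic baseline. All figures in the snippet (the weekly gown prices and the $0.88 baseline) are hypothetical stand-ins, not the study's purchasing data.

```python
# Illustrative sketch: computing peak and average cost-per-unit (CPU)
# ratios against a prepandemic baseline. All values are hypothetical
# stand-ins, not the study's underlying purchasing data.
weekly_gown_cpu = [4.00, 9.50, 12.00, 10.25, 8.75, 6.50]  # max observed CPU per week ($)
prepandemic_gown_cpu = 0.88  # assumed prepandemic baseline CPU ($)

peak_cpu = max(weekly_gown_cpu)
avg_cpu = sum(weekly_gown_cpu) / len(weekly_gown_cpu)

print(f"Peak CPU: ${peak_cpu:.2f} "
      f"({peak_cpu / prepandemic_gown_cpu:.1f}x prepandemic)")
print(f"Average CPU: ${avg_cpu:.2f} "
      f"({avg_cpu / prepandemic_gown_cpu:.1f}x prepandemic)")
```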
Background: The disease caused by SARS-CoV-2, COVID-19, has caused a pandemic leading to strained healthcare systems worldwide and an unprecedented public health crisis. Lower respiratory tract infections (LRTIs) and hypoxia caused by COVID-19 have led to an increase in hospitalizations. We sought to define the impact of COVID-19 on antimicrobial use and antimicrobial resistance (AMR) in an urban safety-net community hospital. Methods: Retrospective review of antimicrobial use and AMR in a 151-bed urban community hospital. Antimicrobial use was calculated in days of therapy per 1,000 patient days (DOT/1,000 PD) for ceftriaxone, piperacillin-tazobactam, and meropenem during calendar years 2019 and 2020. AMR was assessed by comparing the carbapenem-resistant Enterobacteriaceae (CRE) infection incidence rate per 1,000 patient days between 2019 and 2020. Results: The average quarterly DOT/1,000 PD increased from 359.5 in 2019 to 394.25 in 2020, with the highest increase in the second and fourth quarters of 2020, which temporally correspond to the first and second waves of COVID-19. Ceftriaxone and meropenem use increased during the first and second waves of COVID-19. Piperacillin-tazobactam use increased during the first wave and declined thereafter (Figure 1). Rates of CRE increased from a quarterly average of 0.57 to 0.68 per 1,000 patient days (Figure 2). Conclusions: Antimicrobial pressure increased during the first and second waves of COVID-19. Ceftriaxone was the most commonly used antimicrobial, reflecting internal guidelines and ASP interventions. CRE rates increased during COVID-19. This finding may be due to an overall increase in antimicrobial pressure in the community and in critically ill patients. Antibiotics are a precious resource, and antimicrobial stewardship remains important during the COVID-19 pandemic. Appropriate use of antimicrobials is critical to preventing AMR.
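The DOT/1,000 PD metric used in this and the following abstract normalizes antimicrobial consumption by hospital census. A minimal sketch of the calculation, using invented figures (the 1,450 days of therapy and 10,200 patient-days are assumptions for illustration):

```python
# Minimal sketch of the days-of-therapy metric:
# DOT/1,000 PD = total days of therapy / total patient-days * 1000.
def dot_per_1000_pd(days_of_therapy: int, patient_days: int) -> float:
    """Days of therapy normalized per 1,000 patient-days."""
    return days_of_therapy / patient_days * 1000

# Example: 1,450 ceftriaxone days of therapy over a quarter with
# 10,200 patient-days (hypothetical figures, not study data).
print(round(dot_per_1000_pd(1450, 10200), 1))  # ~142.2 DOT/1,000 PD
```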
Background: The disease caused by SARS-CoV-2, COVID-19, has caused a pandemic leading to strained healthcare systems worldwide and an unprecedented public health crisis. The hallmark of severe COVID-19 is lower respiratory tract infection (LRTI) and hypoxia requiring hospitalization. A paucity of data on bacterial coinfection and a lack of therapeutic options for COVID-19 during the first surge of cases have increased pressure on antimicrobial use and challenged antimicrobial stewardship programs (ASPs). We implemented a multimodal approach to antimicrobial stewardship in an urban safety-net community hospital targeting selection and duration of therapy. Methods: Retrospective review of cases during the first wave of COVID-19 in a 151-bed urban safety-net community hospital from March to June 2020. EMR order sets (Figure 1) and prospective audit and feedback by the ASP targeting empiric antimicrobial selection and duration were implemented as part of the COVID-19 response. Hospitalized patients with COVID-19 were reviewed retrospectively. Demographic information was collected. Data on antimicrobial use were tabulated, including selection and duration of antimicrobials (Figure 1). Results: In total, 302 patients were reviewed, of whom 221 (73%) received empiric antimicrobials. The most commonly used antimicrobials were ceftriaxone and azithromycin (Figure 2). Days of therapy per 1,000 patient days (DOT/1,000 PD) for ceftriaxone increased from 71 in the quarter prior to 113 during the study period. Average duration of therapy was 6 days; in the ICU, average duration was 8 days compared to 5 days in non-ICU settings. Average durations of parenteral therapy were 5.54 days in the ICU and 3.36 days in non-ICU settings. Procalcitonin was obtained in 37 cases (17%) and ranged from 0.09 to 12.57 ng/mL, with an average of 1.21 ng/mL. No cases had documented bacterial coinfection (Figure 1). Conclusions: Antimicrobials were commonly prescribed during the first wave of COVID-19 in a safety-net community hospital. Procalcitonin did not guide therapy, nor did the lack of documented coinfection change physician behavior. With limited resources, the ASP successfully guided clinicians toward IDSA guideline recommendations for selection and duration, as evidenced by antimicrobial use. During this unprecedented surge of LRTIs, a multimodal approach to antimicrobial stewardship guided clinicians toward early transition to oral agents and shorter durations of therapy.
Adolescence into young adulthood represents a sensitive period in which brain development significantly diverges by sex. Regular cannabis use by young people is associated with neuropsychological vulnerabilities, but the potential impact of sex on these relationships is unclear.
In a cross-sectional study, we examined sex differences in multi-domain neuropsychological functioning using the Cambridge Neuropsychological Test Automated Battery (CANTAB) and tested whether sex moderated the relationships between cognitive performance and age of initiation, frequency of cannabis use, amount of cannabis use, and withdrawal symptoms in adolescent and young adult cannabis users who used at least weekly (n = 171; aged 13–25 years; 46.2% female).
Male cannabis users had poorer visual recognition memory and female cannabis users showed worse attention and executive functions, with medium to large effect sizes. These sex effects persisted when controlling for age, IQ, amount of alcohol and nicotine use, mood and anxiety symptoms, emotional stability, and impulsive behavior. Earlier age of initiation and greater amounts of use were associated with worse attentional functions in females, but not in males. Greater use was more strongly associated with worse episodic memory in males than in females and was associated with poorer learning in males only.
Domain-specific patterns of neuropsychological performance were found by sex, such that males showed poorer visual memory and females showed worse performance on measures of attention (sustained visual attention, multitasking) and executive functioning (spatial planning/working memory subdomains). Larger studies including healthy controls are needed to determine whether the observed sex differences are exaggerated relative to those in non-users.
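Moderation tests of the kind described above are typically implemented as a regression with a sex-by-use interaction term. The sketch below illustrates that general approach on synthetic data; the variable names, covariate set, and model specification are assumptions for demonstration, not the study's actual analysis.

```python
# Hedged sketch of a sex-moderation analysis: a significant
# use_freq:sex interaction would indicate that sex moderates the
# association between cannabis use and cognitive performance.
# All data below are synthetic.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
n = 171
df = pd.DataFrame({
    "attention": rng.normal(0, 1, n),   # cognitive outcome (z-score)
    "use_freq": rng.poisson(4, n),      # cannabis use days per week
    "sex": rng.choice(["F", "M"], n),
    "age": rng.uniform(13, 25, n),
})

model = smf.ols("attention ~ use_freq * C(sex) + age", data=df).fit()
print(model.summary().tables[1])  # inspect the use_freq:C(sex) term
```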
ABSTRACT IMPACT: This study examines gray matter volume differences resulting from the bilingual experience in children and adults, allowing us to better understand the brains of the over half of the world’s population that speaks more than one language. OBJECTIVES/GOALS: Literature is mixed regarding a bilingual advantage in executive control (EC). While it has been shown that young adult bilinguals have greater gray matter volume (GMV) than monolinguals in EC regions, behavioral evidence suggests that such differences would be more pronounced in children. METHODS/STUDY POPULATION: Using SPM12 to test this hypothesis, we used a whole-brain t-test to compare GMV in 35 English-speaking monolingual and 20 Spanish-English early (learned both languages before 6 years old) bilingual children. Next, we submitted both groups of children to an ANOVA with 42 English-speaking monolingual and 26 Spanish-English bilingual adults to test for an interaction of Language Experience by Age Group at the level of the whole brain. RESULTS/ANTICIPATED RESULTS: The between-group comparison of bilingual and monolingual children revealed more GMV in bilingual compared to monolingual children in regions associated with EC (right middle and inferior frontal gyri, superior parietal lobule, and precuneus). Our second analysis, an ANOVA comparing bilingual and monolingual children and adults, revealed an interaction in which bilingual>monolingual GMV in children was greater than any bilingual>monolingual GMV (or bilingual=monolingual GMV) in the adult groups in the right superior parietal lobule (BA1). No regions indicated that bilingual>monolingual GMV was more pronounced in adults. DISCUSSION/SIGNIFICANCE OF FINDINGS: These results provide further evidence for GMV differences in early bilinguals in regions associated with EC and indicate that more GMV differences exist between bilingual and monolingual children than between bilingual and monolingual adults.
In the era of the Schengen Area (at least in the days before Covid-19), travel from Munich to Bozen/Bolzano or Ljubljana to Trst/Trieste is a decidedly unremarkable, albeit beautiful, adventure. Just as meaningful as the lack of border controls is what travellers find upon arrival in Bozen/Bolzano: all public signage in both Italian and German (and sometimes Ladin, too). Signs in the streets of Trst/Trieste less reliably have Slovene alongside the Italian, but assistance with translation can be found with little difficulty. The Italian autonomous regions ‘with special statutes’ in which these cities reside – Trentino-Alto Adige (South Tyrol) and Friuli Venezia Giulia (the Julian March) – are multilingual territories that, at least on an official level, embrace a multiethnic heritage and reality. In fact, Trentino-Alto Adige's consociational democracy is widely regarded among political scientists as an international role model for how states can successfully protect and give voice to minority populations. Those unfamiliar with the more recent history of these regions might be surprised to learn of these avowedly multiethnic political and cultural structures. For much of the first half of the twentieth century, the regions’ two states – Austria-Hungary until 1919 and thereafter Italy – employed the ‘nationality principle’ to define policies and populations in these territories. As in most of Europe at the time, sovereignty was increasingly predicated on the contemporary ideal of the nation state, in which borders, ethnicity, language and citizenship were all bound together. Of course, as a multiethnic empire, Austria-Hungary was much more concerned with centralising state authority (and then fighting a world war) than with national homogeneity, while Italy's nationalisation campaign in the interwar period became fundamental to its presence in the new provinces. Still, both states sought to classify and ultimately to control their border populations by privileging ethnolinguistic categories of citizenship.
Strategies for pandemic preparedness and response are urgently needed for all settings. We describe our experience using inverted classroom methodology (ICM) for COVID-19 pandemic preparedness in a small hospital with limited infection prevention staff. ICM for pandemic preparedness was feasible and contributed to an increase in COVID-19 knowledge and comfort.
Background: Water management programs (WMPs) are needed to minimize the growth and transmission of opportunistic pathogens in healthcare facility water systems. In 2017, the Centers for Medicare & Medicaid Services (CMS) began requiring that certified hospitals in the United States have water management policies and procedures; in response, the National Healthcare Safety Network (NHSN) Annual Hospital Survey included new, voluntary questions on practices regarding water management and monitoring. Of 4,929 hospitals surveyed in 2017, 3,821 (77.5%) reported having a WMP. Of these 3,821 facilities, 86.9% reported regular monitoring of water temperature; 66.2% monitored disinfectant (eg, residual chlorine); 63.1% used specific tests for Legionella; and 35.6% performed heterotrophic plate counts (HPCs). We analyzed new 2018 hospital survey data to assess further progress toward meeting CMS requirements for WMPs. Methods: We analyzed 2018 NHSN Annual Hospital Survey responses for facilities that reported on WMPs in 2017. Responses included information regarding risk assessments for Legionella and other waterborne pathogens as well as details regarding WMP teams and water-monitoring practices. WMP team members were categorized as administrative (hospital administrator, compliance officer, risk or quality management), epidemiology or infection control (epidemiologist or infection preventionist, other clinical), or environmental or facilities (consultant, facility manager or engineer, equipment or chemical supplier, maintenance). Statistical significance was assessed using the McNemar test, where appropriate. Results: Of hospitals reporting on WMPs in 2017, 4,087 of 4,929 (83%) responded again in 2018. The proportion of facilities that reported having a WMP increased from 3,258 of 4,087 (79.7%) in 2017 to 3,647 of 4,087 (89.2%) in 2018 (P < .0001). Of the 3,647 hospitals that reported having a WMP in 2018, 95.9% had conducted a risk assessment for waterborne pathogens; 67.3% of these facilities had most recently done so within 1 year of the survey. WMP teams had representation from environmental or facilities staff at 98.8% of hospitals, epidemiology or infection control staff at 89.8% of hospitals, and administrative staff at 71.7% of hospitals. Of facilities with WMPs in 2018, 90.5% reported regular monitoring of water temperature, 72.2% disinfectant, 67.4% tests for Legionella, and 48.8% HPCs. Conclusions: More hospitals reported having a WMP in 2018 than 2017. However, ~1 in 10 respondents lacked a WMP. Differences in water monitoring practices across facilities potentially reflect a lack of standardization in how WMPs are implemented. Some hospital WMPs do not incorporate routine monitoring of water temperature and disinfectant, which is a basic practice. CDC continues to develop tools, resources, and training to support facility WMP teams in meeting CMS requirements and protecting patients from water-associated pathogens.
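A sketch of the paired comparison implied by the McNemar test mentioned above, using statsmodels. Only the marginal totals echo the reported survey figures (3,258 of 4,087 hospitals with a WMP in 2017 and 3,647 in 2018); the discordant-pair cell counts below are hypothetical, since the abstract does not report them.

```python
# McNemar test on the same hospitals' 2017 vs 2018 WMP status.
# Margins match the reported totals; the interior cells are assumed.
from statsmodels.stats.contingency_tables import mcnemar

#                  2018: WMP   2018: no WMP
table = [[3208,  50],    # 2017: WMP    (row sum = 3,258)
         [ 439, 390]]    # 2017: no WMP (column 1 sum = 3,647)
result = mcnemar(table, exact=False, correction=True)
print(f"statistic={result.statistic:.1f}, p={result.pvalue:.2e}")
```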
The forties are a time of irregular and sometimes heavy menstrual loss due to fluctuating levels of sex hormones. Some months are characterized by low estrogen secretion and anovulation and others by extremely high levels of estradiol (E2). In a 20-year-old woman, E2 usually peaks at 500–1000 pmol/L. In contrast, some perimenopausal women may have cycles where E2 levels peak at around 5000 pmol/L. These high levels of estrogen are often not followed by ovulation and so progesterone is either not secreted at all, or levels are too low to counter these high-estrogen months. As we will see shortly, many gynecologic pathologies are driven by these unbalanced sex hormone levels.
Polysaccharide-based nanoparticles, such as those made from pectin, have long been of great interest because of pectin's excellent solubility and mucoadhesive nature, which make them highly suitable for oral drug delivery. In this study, we used commercially available pectin samples selected by their degree of esterification, and nanoparticles were fabricated by the ionotropic gelation method using magnesium (Mg²⁺) as the divalent cross-linker. We conducted a comparative analysis of the three pectin NPs—high methoxylated pectin (HMP), low methoxylated pectin (LMP), and amidated LMP (AMP)—to examine differences in characteristics such as shape, size, and biocompatibility. HMP and AMP were found to be similar in size (~850 nm), whereas LMP was ~700 nm. The three NPs were also tested for their biocompatibility toward THP-1 cells. All three NPs were found to have potential as nanocarriers of therapeutic and preventive drugs, especially via the oral route.
A defining feature of the contemporary workforce is its diversity. Indeed, changes in worker demographics have spurred substantial scholarship and management practice. In this chapter, we draw from population-based statistics to describe and discuss the nature of change as it refers to gender, race/ethnicity, and age diversity. We further discuss existing data and theory on change in the workforce participation of people from understudied demographic groups, such as people with children, multiracial individuals, immigrants, religious minorities, gender and sexual minorities, individuals with disabilities, and people of differing socioeconomic status. In so doing, this review prompts important new directions for theory and practice.
To study the impact of chronic, life-threatening stressors, in the form of daily missile attacks sustained over five consecutive years, on pregnancy outcomes.
Charts of deliveries from two neighboring towns in the south of Israel, covering the years 2000 and 2003–2008, were reviewed retrospectively. One town had been exposed to missile attacks, while the other had not. For each year, 100 charts were chosen at random.
A significant association was found between exposure to stress and the frequency of pregnancy complications (P = 0.047) and premature membrane rupture (P = 0.029). A more detailed analysis, based on dividing the stressful years into three distinct periods (early, 2003–2004; intermediate, 2005–2006; late, 2007–2008), revealed that preterm deliveries were significantly more frequent during the intermediate period (P = 0.044), as was premature membrane rupture during the late period (P = 0.014).
Exposure to chronic life-threatening stress resulted in more pregnancy complications and, in particular, more premature membrane ruptures. The impact was most significant during the middle period of the 5-year exposure to the stressor. Hence, it seems that duration and habituation may play a role in the impact of chronic, life-threatening stressors on pregnancy.
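Comparisons of complication frequency between the exposed and unexposed towns, as reported above, are commonly performed with a chi-square test of independence. A minimal sketch with invented counts (the abstract does not give the underlying contingency tables):

```python
# Chi-square test of pregnancy-complication frequency in the exposed
# vs unexposed town. Counts are hypothetical, for illustration only.
from scipy.stats import chi2_contingency

#            complications  no complications
observed = [[35, 265],   # exposed town
            [22, 278]]   # unexposed town
chi2, p, dof, expected = chi2_contingency(observed)
print(f"chi2={chi2:.2f}, p={p:.3f}")
```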
To explore associations of whole grain and cereal fibre intake with CVD risk factors in Australian adults.
Cross-sectional analysis. Intakes of whole grain and cereal fibre were examined in association with BMI, waist circumference (WC), systolic blood pressure (BP), serum lipid concentrations, C-reactive protein, fasting glucose and HbA1c.
Australian Health Survey 2011–2013.
A population-representative sample of 7665 participants over 18 years old.
Highest whole grain consumers (T3) had lower BMI (T0 26·8 kg/m2, T3 26·0 kg/m2, P < 0·0001) and WC (T0 92·2 cm, T3 90·0 cm, P = 0·0005) compared with non-consumers (T0), although only WC remained significant after adjusting for dietary and lifestyle factors, including cereal fibre intake (P = 0·03). Whole grain intake was marginally inversely associated with fasting glucose (P = 0·048) and HbA1c (P = 0·03) after adjusting for dietary and lifestyle factors, including cereal fibre intake. Cereal fibre intake was inversely associated with BMI (P < 0·0001) and WC (P < 0·0008) and tended to be inversely associated with total cholesterol, LDL-cholesterol and apo-B concentrations, although associations were attenuated after further adjusting for BMI and lipid-lowering medication use.
The extent to which cereal fibre is responsible for the CVD-protective associations of whole grains may vary depending on the mediators involved. Longer-term intervention studies directly comparing whole grain and non-whole grain diets of similar cereal fibre contents (such as through the use of bran or added-fibre refined grain products) are needed to confirm independent effects.
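The covariate adjustment described in these analyses can be sketched as a linear regression of a CVD risk factor on whole grain intake tertile, with cereal fibre and lifestyle variables as covariates. Everything below (the synthetic data, variable names, and covariate set) is an illustrative assumption, not the study's actual model.

```python
# Regression sketch: waist circumference on whole grain tertile,
# adjusted for cereal fibre and other covariates. Synthetic data.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(1)
n = 7665
df = pd.DataFrame({
    "waist_cm": rng.normal(91, 12, n),
    "wg_tertile": rng.choice(["T0", "T1", "T2", "T3"], n),  # T0 = non-consumers
    "cereal_fibre_g": rng.gamma(3, 2, n),
    "age": rng.uniform(18, 85, n),
    "energy_kj": rng.normal(8700, 2000, n),
})

# Non-consumers (T0) serve as the reference category.
model = smf.ols(
    "waist_cm ~ C(wg_tertile, Treatment('T0')) + cereal_fibre_g + age + energy_kj",
    data=df,
).fit()
print(model.params)
```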
To investigate how intakes of whole grains and cereal fibre are associated with risk factors for CVD in UK adults.
Cross-sectional analyses examined associations between whole grain and cereal fibre intakes and adiposity measurements, serum lipid concentrations, C-reactive protein, systolic blood pressure, fasting glucose, HbA1c, homocysteine and a combined CVD relative risk score.
The National Diet and Nutrition Survey (NDNS) Rolling Programme 2008–2014.
A nationally representative sample of 2689 adults.
Participants in the highest quartile (Q4) of whole grain intake had lower waist–hip ratio (Q1 0·872; Q4 0·857; P = 0·04), HbA1c (Q1 5·66 %; Q4 5·47 %; P = 0·01) and homocysteine (Q1 9·95 µmol/l; Q4 8·76 µmol/l; P = 0·01) compared with participants in the lowest quartile (Q1), after adjusting for dietary and lifestyle factors, including cereal fibre intake. Whole grain intake was inversely associated with C-reactive protein using multivariate analysis (P = 0·02), but this was not significant after final adjustment for cereal fibre. Cereal fibre intake was also inversely associated with waist–hip ratio (P = 0·03) and homocysteine (P = 0·002) in multivariate analysis.
Similar inverse associations of whole grain and cereal fibre intakes with CVD risk factors suggest the relevance of cereal fibre to the protective effects of whole grains. However, whole grain associations often remained significant after adjusting for cereal fibre intake, suggesting that additional constituents may be relevant. Intervention studies are needed to compare cereal fibre intake from non-whole grain sources with whole grain intake.
To describe pathogen distribution and rates for central-line–associated bloodstream infections (CLABSIs) from different acute-care locations during 2011–2017 to inform prevention efforts.
CLABSI data from the Centers for Disease Control and Prevention (CDC) National Healthcare Safety Network (NHSN) were analyzed. Percentages and pooled mean incidence density rates were calculated for a variety of pathogens and stratified by acute-care location groups (adult intensive care units [ICUs], pediatric ICUs [PICUs], adult wards, pediatric wards, and oncology wards).
From 2011 to 2017, 136,264 CLABSIs were reported to the NHSN by adult and pediatric acute-care locations; adult ICUs and wards reported the most CLABSIs: 59,461 (44%) and 40,763 (30%), respectively. In 2017, the most common pathogens were Candida spp/yeast in adult ICUs (27%) and Enterobacteriaceae in adult wards, pediatric wards, oncology wards, and PICUs (23%–31%). Most pathogen-specific CLABSI rates decreased over time; the exceptions were Candida spp/yeast rates in adult ICUs and Enterobacteriaceae rates in oncology wards, which increased, and Staphylococcus aureus rates in pediatric locations, which did not change.
The pathogens associated with CLABSIs differ across acute-care location groups. Pathogen-targeted prevention efforts, such as those aimed at preventing Candida spp/yeast and Enterobacteriaceae CLABSIs, might augment current prevention strategies and further reduce national rates.
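The pooled mean incidence density rate used in these NHSN analyses is total events divided by total central-line days, expressed per 1,000 line-days. A short sketch of the calculation, with assumed counts (not NHSN data):

```python
# Pooled mean incidence density rate:
# CLABSIs per 1,000 central-line days, pooled across locations.
def pooled_mean_rate(events: int, device_days: int) -> float:
    """Infections per 1,000 device-days, pooled across a location group."""
    return events / device_days * 1000

# Example: 120 CLABSIs over 150,000 central-line days across all
# reporting adult ICUs (hypothetical figures).
print(f"{pooled_mean_rate(120, 150_000):.2f} per 1,000 line-days")  # 0.80
```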
Whole grain intake is associated with lower CVD risk in epidemiological studies. It is unclear to what extent cereal fibre, located primarily within the bran, is responsible. This review aimed to evaluate associations between intakes of whole grain, cereal fibre and bran and CVD risk. Academic databases were searched for human studies published before March 2018. Observational studies reporting whole grain and cereal fibre or bran intake in association with any CVD-related outcome were included. Studies were separated into those defining whole grain using a recognised definition (containing the bran, germ and endosperm in their natural proportions) (three studies, seven publications) and those using an alternative definition, such as including added bran as a whole grain source (eight additional studies, thirteen publications). Intakes of whole grain, cereal fibre and bran were similarly associated with lower risk of CVD-related outcomes. Within the initial analysis, where studies used the recognised whole grain definition, results were less likely to show attenuation after adjustment for cereal fibre content. The fibre component of grain foods therefore appears to play an important role in the protective effects of whole grains. Because associations remained after adjusting for fibre content, additional components within the whole grain, and the bran component, may contribute to the cardio-protective association. The limited number of studies and considerable discrepancies in defining and calculating whole grain intake limit the conclusions that can be drawn. Future research should utilise a consistent definition and a methodical approach to calculating whole grain intake, to contribute to a greater body of consistent evidence surrounding whole grains.