Three potentially competing bear species inhabit tropical Asia: the sloth bear (Melursus ursinus), sun bear (Helarctos malayanus), and Asiatic black bear (Ursus thibetanus). Sun bears (30–80 kg), the smallest species of bear in the world, are about half the size of black bears (65–150 kg) and sloth bears (55–145 kg). What factors generate the geographic separation of sloth bears from black and sun bears? What factors facilitate the extensive sympatry of black bears and sun bears? How are these patterns structured by evolutionary history and competition between bear species, and what mechanisms facilitate their coexistence or maintain their separation? Have current forest loss and degradation benefited one species over another? If so, has interspecific competition played a part? These questions are the focus of this chapter.
Background: Psychotropic medications are sometimes used off-label and inappropriately. This may cause harm to adolescents with intellectual disability. However, few studies have analysed off-label or inappropriate prescribing to this group.
Aims: To examine the appropriateness of psychotropic prescribing to adolescents with intellectual disability living in the community in south-east Queensland, Australia.
Method: Off-label medication use was determined based on whether the recorded medical condition treated was approved by the Australian Therapeutic Goods Administration. Clinical appropriateness of medication use was determined based on published guidelines and the clinical opinion of two authors who specialise in developmental disability medicine (J.N.T. and D.H.).
Results: We followed 429 adolescents for a median of 4.2 years. A total of 107 participants (24.9%) were prescribed psychotropic medications on at least one occasion. Of these, 88 (82.2%) were prescribed their medication off-label or inappropriately at least once. Off-label or inappropriate use was most commonly associated with challenging behaviours.
Conclusions: Off-label or inappropriate use of psychotropic medications was common, especially for the management of challenging behaviours. Clinical decision-making accounts for individual patient factors and draws on clinical experience as well as scientific evidence, whereas label indications are developed for regulatory purposes and, although appropriate at a population level, cannot encompass the foregoing considerations. Education for clinicians and other staff caring for people with intellectual disability, together with a patient-centred approach to prescribing that involves families, should encourage appropriate prescribing. The effect of the National Disability Insurance Scheme on the appropriateness of psychotropic medication prescribing should be investigated.
Traditionally, Roman temples and shrines in Britain have been contextualised in relation to wider ‘Roman’ religious practices. Until recently, considerations of architectural form and named deities dominated discussions. The wider turn in archaeological discourse recognising ritual in everyday contexts has highlighted the importance of lived experience and landscape practice in shaping belief. Here we reflect on the implications of such ideas when approaching ritual practice at Roman temples, using a recently excavated example from Wiltshire, southern Britain, as a case study. The exceptional artefactual assemblages from the site demonstrate the importance of local and regional landscape practices and belief in shaping ritual practice in a sacred space. In addition, geophysical survey and analysis of Portable Antiquities Scheme (PAS) finds suggest that those occupying the landscape had long-term access to wealth. Deposition in the temple itself indicates the continuing importance attached to prehistoric objects in the Roman period, but also to the adoption of new votive practices of miniaturisation, mutilation and sacrifice. These rituals, although part of wider grammars of religious behaviour, had their roots in specific local contexts. Our detailed analyses provide a picture of a temple dedicated to a previously unknown local god, Bregneus, framed against that of an active community involved in farming, iron processing, quarrying, hunting and woodland management.
Background: This is the first report on the association between trauma exposure and depression from the Advancing Understanding of RecOvery afteR traumA (AURORA) multisite longitudinal study of adverse post-traumatic neuropsychiatric sequelae (APNS) among participants seeking emergency department (ED) treatment in the aftermath of a traumatic life experience.
Methods: We focus on participants presenting at EDs after a motor vehicle collision (MVC), which characterizes most AURORA participants, and examine associations of participant socio-demographics and MVC characteristics with 8-week depression as mediated through peritraumatic symptoms and 2-week depression.
Results: Eight-week depression prevalence was relatively high (27.8%) and was associated with several MVC characteristics (being a passenger v. driver; injuries to other people). Peritraumatic distress was associated with 2-week but not 8-week depression. Most of these associations held when controlling for peritraumatic symptoms and, to a lesser degree, for depressive symptoms at 2 weeks post-trauma.
Conclusions: These observations, coupled with substantial variation in the relative strength of the mediating pathways across predictors, raise the possibility of diverse and potentially complex underlying biological and psychological processes. These remain to be elucidated in more in-depth analyses of the rich and evolving AURORA database, with the aim of finding new targets for intervention and new tools for risk-based stratification following trauma exposure.
Calculus students are taught that an indefinite integral is defined only up to an additive constant, and as a consequence generations of students have assiduously added ‘+C’ to their calculus homework. Although ubiquitous, these constants rarely garner much attention, and typically loiter without intent around the ends of equations, feeling neglected. There is, however, useful work they can do, work which is particularly relevant in the contexts of integral tables and computer algebra systems. We begin, therefore, with a discussion of the context, before returning to coax the constants out of the shadows and assign them their tasks.
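As a worked illustration of the point (a standard textbook example, not taken from the article itself), integrating the same function three ways yields antiderivatives that differ only by constants, which is exactly why two integral tables, or two computer algebra systems, can disagree and yet both be correct:

\[ \int 2\sin x\cos x\,dx \;=\; \sin^2 x + C_1 \;=\; -\cos^2 x + C_2 \;=\; -\tfrac{1}{2}\cos 2x + C_3. \]

These are consistent because \( \sin^2 x = -\cos^2 x + 1 = -\tfrac{1}{2}\cos 2x + \tfrac{1}{2} \), so the three answers agree provided \( C_2 = C_1 + 1 \) and \( C_3 = C_1 + \tfrac{1}{2} \). Tracking such relations between constants is precisely the sort of work the ‘+C’ can be assigned.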
Objective: To assess the associations of nutrient intake and dietary patterns with different sarcopenia definitions in older men.
Design: Sarcopenia was defined using the Foundation for the National Institutes of Health (FNIH), European Working Group on Sarcopenia in Older People (EWGSOP) and revised EWGSOP (EWGSOP2) definitions. Dietary adequacy of fourteen nutrients was assessed by comparing participants’ intakes with the Nutrient Reference Values (NRV). Attainment of the NRV was incorporated into a variable, ‘poor’ (meeting ≤9) v. ‘good’ (meeting ≥10), using the cut-point method. Two dietary patterns (the monounsaturated:saturated fat ratio and the n-6:n-3 fatty acid ratio) and individual nutrients were also used as predictor variables.
Participants: A total of 794 men aged ≥75 years participated in this study.
Results: The prevalence of sarcopenia by the FNIH, EWGSOP and EWGSOP2 definitions was 12·9 %, 12·9 % and 19·6 %, respectively. After adjustment, poor nutrient intake was significantly associated with FNIH-defined sarcopenia (OR: 2·07 (95 % CI 1·16, 3·67)), but not with the EWGSOP and EWGSOP2 definitions. The lowest and second-lowest quartiles of protein, Mg and Ca intakes, and the lowest quartiles of n-6 PUFA and n-3 PUFA intakes, were significantly associated with FNIH-defined sarcopenia. Each unit decrease in the n-6:n-3 ratio was significantly associated with a 9 % increase in the risk of FNIH-defined sarcopenia (OR: 1·09 (95 % CI 1·04, 1·16)).
Conclusions: Inadequate intakes of nutrients are associated with FNIH-defined sarcopenia in older men, but not with the other two sarcopenia definitions. Further studies are required to understand these relationships.
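A minimal sketch of the cut-point dichotomisation and an unadjusted logistic model of the kind implied by the design above. All data are simulated and all names are illustrative stand-ins; this is not the study’s code, and the published ORs came from adjusted models.

```python
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(0)
n = 794  # cohort size reported above; the data below are simulated

# Hypothetical indicator matrix: 1 = intake meets the NRV for that nutrient
meets_nrv = rng.integers(0, 2, size=(n, 14))
nutrients_met = meets_nrv.sum(axis=1)

# Cut-point method: 'poor' = meeting <=9 NRVs v. 'good' = meeting >=10
poor_intake = (nutrients_met <= 9).astype(float)

# Simulated FNIH-defined sarcopenia outcome, loosely tied to poor intake
p = 1 / (1 + np.exp(-(-2.2 + 0.7 * poor_intake)))
sarcopenia = rng.binomial(1, p)

# Unadjusted logistic regression; the exponentiated coefficient is the OR
fit = sm.Logit(sarcopenia, sm.add_constant(poor_intake)).fit(disp=0)
print("OR (poor v. good intake):", float(np.exp(fit.params[1])))
```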
Background: Carbapenem-resistant Enterobacteriaceae (CRE) are endemic in the Chicago region. We assessed the regional impact of a CRE control intervention targeting high-prevalence facilities, that is, long-term acute-care hospitals (LTACHs) and ventilator-capable skilled nursing facilities (vSNFs). Methods: In July 2017, an academic–public health partnership launched a regional CRE prevention bundle: (1) identifying patient CRE status by querying Illinois’ XDRO registry and by periodic point-prevalence surveys reported to public health; (2) cohorting or private rooms with contact precautions for CRE patients; (3) hand hygiene adherence monitoring, combined with general infection control education and guidance from project coordinators and public health; and (4) daily chlorhexidine gluconate (CHG) bathing. Informed by epidemiology and modeling, we targeted LTACHs and vSNFs within a 13-mile radius of the coordinating center. Illinois mandates CRE reporting to the XDRO registry, which can also be manually queried or generate automated alerts to facilitate interfacility communication. The regional intervention promoted increased automation of alerts to hospitals. The prespecified primary outcome was incident clinical CRE cultures reported to the XDRO registry in Cook County by month, analyzed by segmented regression modeling. A secondary outcome was colonization prevalence, measured by serial point-prevalence surveys for carbapenemase-producing organism colonization in LTACHs and vSNFs. Results: All eligible LTACHs (n = 6) and vSNFs (n = 9) participated in the intervention. One vSNF declined CHG bathing, and vSNFs that implemented CHG bathing typically bathed residents 2–3 times per week rather than daily. Overall, there were significant gaps in infection control practices, especially in vSNFs. In addition, 75 Illinois hospitals adopted automated alerts (56 during the intervention period). Mean CRE incidence in Cook County decreased from 59.0 cases per month at baseline to 40.6 cases per month during the intervention (P < .001). In a segmented regression model, there was an average reduction of 10.56 cases per month during the 24-month intervention period (P = .02) (Fig. 1), and an estimated 253 incident CRE cases were averted. Mean CRE incidence also decreased among the stratum of vSNF/LTACH intervention facilities (P = .03). However, evidence of ongoing CRE transmission, particularly in vSNFs, persisted, and CRE colonization prevalence remained high at intervention facilities (Table 1). Conclusions: A resource-intensive regional public health CRE intervention that included enhanced interfacility communication and targeted infection prevention was implemented. There was a significant decline in incident clinical CRE cases in Cook County, despite persistently high CRE colonization prevalence in intervention facilities. vSNFs, where understaffing and underresourcing were common and lengths of stay ranged from months to years, faced a major prevalence challenge, underscoring the need for aggressive infection control improvements in these facilities.
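A sketch of the segmented (interrupted time-series) regression named in the methods, fitted to simulated monthly counts. The study’s actual model specification and data are not reproduced here, so the series and coefficients below are illustrative only.

```python
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(1)

# 24 baseline months followed by a 24-month intervention (simulated counts)
months = np.arange(48)
post = (months >= 24).astype(float)            # 1 during the intervention
time_since = np.where(post == 1, months - 24, 0.0)

# Simulated monthly incident CRE cases: level near 59, dropping after launch
cases = 59 - 12 * post - 0.3 * time_since + rng.normal(0, 4, 48)

# Segmented regression: baseline trend, level change, and slope change
X = sm.add_constant(np.column_stack([months, post, time_since]))
fit = sm.OLS(cases, X).fit()
print(fit.params)  # [intercept, baseline slope, level change, slope change]
```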
Funding: The Centers for Disease Control and Prevention (SHEPheRD Contract No. 200-2011-42037)
Disclosures: M.Y.L. has received research support in the form of contributed product from OpGen and Sage Products (now part of Stryker Corporation), and has received an investigator-initiated grant from CareFusion Foundation (now part of BD).
Background: Proper care and maintenance of central lines is essential to prevent central-line–associated bloodstream infections (CLABSI). Our facility implemented a hospital-wide central-line maintenance bundle based on CLABSI prevention guidelines. The objective of this study was to determine whether maintenance bundle adherence was influenced by nursing shift or day of the week. Methods: A central-line maintenance bundle was implemented in April 2018 at a 1,266-bed academic medical center. The maintenance bundle components included alcohol-impregnated disinfection caps on all ports and infusion tubing; infusion tubing dated; dressings not damp or soiled; no oozing at the insertion site greater than the size of a quarter; dressings occlusive with all edges intact; transparent dressing changes recorded within 7 days; and no gauze dressings in place for >48 hours. To monitor bundle compliance, 4 non–unit-based nurse observers were trained to audit central lines. Observations were collected between August 2018 and October 2019, during all shifts and 7 days per week. Just-in-time feedback was provided for noncompliant central lines. Nursing shifts were defined as day (7:00 a.m. to 3:00 p.m.), evening (3:00 p.m. to 11:00 p.m.), and night (11:00 p.m. to 7:00 a.m.). Central-line bundle compliance between shifts was compared using multinomial logistic regression. Bundle compliance on weekdays versus weekends was compared using Mantel-Haenszel χ2 analysis. Results: Of the 25,902 observations collected, 11,135 (42.9%) occurred on the day shift, 11,559 (44.6%) on the evening shift, and 3,208 (12.4%) on the night shift. Overall, 22,114 (85.4%) observations occurred on a weekday versus 3,788 (14.6%) on a Saturday or Sunday (median observations per day of the week, 2,570; range, 1,680–6,800). In total, 4,599 central lines (17.8%) were noncompliant with ≥1 bundle component. The most common reasons for noncompliance were dressing not dated (n = 1,577; 44.0%) and dressings not occlusive with all edges intact (n = 1,340; 37.4%). The noncompliance rates for central-line observations by shift were 12.8% (1,430 of 11,135) on day shift, 20.4% (2,361 of 11,559) on evening shift, and 25.2% (808 of 3,208) on night shift. Compared to day shift, evening shift (OR, 1.74; 95% CI, 1.62–1.87; P < .001) and night shift (OR, 2.29; 95% CI, 2.07–2.52; P < .001) were more likely to have a noncompliant central line. Compared to a weekday, observations on weekend days were more likely to find a noncompliant central line: 914 of 3,788 (24.4%) on weekend days versus 3,685 of 22,114 (16.7%) on weekdays (P < .001). Conclusions: Noncompliance with the central-line maintenance bundle was more likely on evening and night shifts and during weekends.
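The unadjusted shift odds ratios can be recovered directly from the counts reported above; the short check below reproduces them closely (small differences from the published ORs reflect the fitted multinomial model):

```python
# Recompute the unadjusted odds ratios from the counts reported above
noncompliant = {"day": 1430, "evening": 2361, "night": 808}
total = {"day": 11135, "evening": 11559, "night": 3208}

def odds(shift: str) -> float:
    bad = noncompliant[shift]
    return bad / (total[shift] - bad)   # odds = noncompliant / compliant

for shift in ("evening", "night"):
    print(f"{shift} v. day OR = {odds(shift) / odds('day'):.2f}")
# evening v. day OR = 1.74; night v. day OR = 2.28 (published: 1.74, 2.29)
```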
Background: Delayed or in vitro inactive empiric antibiotic therapy may be detrimental to survival in patients with bloodstream infections (BSIs). Understanding the landscape of delayed or discordant empiric antibiotic therapy (DDEAT) across different patient, pathogen, and hospital types, as well as by baseline resistance milieu, may enable providers, antimicrobial stewardship programs, and policy makers to optimize empiric prescribing. Methods: Inpatients with clinically suspected serious infection (based on sampling of blood cultures and receipt of systemic antibiotic therapy on the same or next day) found to have BSI were identified in the Cerner Healthfacts EHR database. Patients were considered to have received DDEAT when, on the culture sampling day, they received either no antibiotic(s) or none that displayed in vitro activity against the pathogenic bloodstream isolate. Antibiotic-resistant phenotypes were defined by in vitro resistance to taxon-specific prototype antibiotics (eg, methicillin/oxacillin resistance in S. aureus) and were used to estimate the baseline resistance prevalence encountered by the hospital. The probability of DDEAT was examined by bacterial taxon, time of BSI onset, and presence versus absence of antibiotic-resistant phenotypes, sepsis or septic shock, hospital type, and baseline resistance. Results: Of 26,036 assessable patients with a BSI at 131 US hospitals between 2005 and 2014, 14,658 (56%) had sepsis, 3,623 (14%) had septic shock, 5,084 (20%) had antibiotic-resistant phenotypes, and 8,593 (33%) received DDEAT. Of DDEAT recipients, 4,428 (52%) received no antibiotics on the culture sampling day, whereas the remaining 4,165 (48%) received in vitro discordant therapy. DDEAT occurred most often in S. maltophilia (87%) and E. faecium (80%) BSIs; however, 75% of DDEAT cases and 76% of deaths among DDEAT recipients occurred among patients with S. aureus and Enterobacteriales BSIs. For every 8 bacteremic patients presenting with septic shock, 1 did not receive any antibiotics on the culture day (Fig. 1A). Patients with BSIs of hospital (vs community) onset were twice as likely to receive no antibiotics on the culture day, whereas those with bloodstream pathogens displaying antibiotic-resistant (vs susceptible) phenotypes were 3 times as likely to receive in vitro discordant therapy (Fig. 1B). The median proportion of DDEAT ranged from 25% (14%–37%) in eight <300-bed teaching hospitals in the lowest baseline resistance quartile to 40% (31%–50%) in five ≥300-bed teaching hospitals in the third baseline resistance quartile (Fig. 2). Conclusions: Delayed or in vitro discordant empiric antibiotic therapy is common among patients with BSI in US hospitals regardless of hospital size, teaching status, or local resistance patterns. Prompt empiric antibiotic therapy in septic shock and hospital-onset BSI needs more support. Earlier, reliable detection of S. aureus and Enterobacteriales bloodstream pathogens and their resistance patterns with rapid point-of-care diagnostics may mitigate the population-level impact of DDEAT in BSI.
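The DDEAT definition above reduces to a simple classification rule; the hypothetical sketch below encodes it (the set-based inputs and example antibiotics are illustrative, not the study’s data model):

```python
def received_ddeat(antibiotics_given: set, active_antibiotics: set) -> bool:
    """Classify delayed or discordant empiric therapy (DDEAT) per the rule above.

    antibiotics_given: antibiotics administered on the culture sampling day
    active_antibiotics: antibiotics with in vitro activity against the isolate
    Both sets are illustrative; the study's actual data model is not shown here.
    """
    if not antibiotics_given:              # delayed: no antibiotics given
        return True
    return not (antibiotics_given & active_antibiotics)  # discordant: none active

# Hypothetical example: vancomycin given, but the isolate is susceptible
# only to meropenem -> in vitro discordant, hence DDEAT
print(received_ddeat({"vancomycin"}, {"meropenem"}))  # True
```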
Funding: This study was funded in part by the National Institutes of Health Clinical Center, the National Institute of Allergy and Infectious Diseases, the National Cancer Institute (NCI contract no. HHSN261200800001E) and the Agency for Healthcare Research and Quality.
SHEA endorses adherence to the CDC and ACIP recommendations for immunization of all children and adults. All persons providing clinical care should be familiar with these recommendations, routinely assess the immunization compliance of their patients, and strongly recommend all routine immunizations. All healthcare personnel (HCP) should be immunized against vaccine-preventable diseases as recommended by the CDC/ACIP (unless immunity is demonstrated by another recommended method). SHEA endorses the policy that immunization should be a condition of employment or of functioning (students, contract workers, volunteers, etc.) at a healthcare facility. Only recognized medical contraindications should be accepted as grounds for not receiving recommended immunizations.
Objective: To examine changes in micronutrient intake over 3 years and to identify associations of socio-economic, health, lifestyle and meal-related factors with these changes among older men.
Design: Dietary adequacy of each micronutrient was compared with the estimated average requirement from the Nutrient Reference Values (NRV). Attainment of the NRV for twelve micronutrients was incorporated into a dichotomised variable, ‘not meeting’ (meeting ≤6) v. ‘meeting’ (meeting ≥7), and change over 3 years was classified into four categories. Multinomial logistic regression analyses were conducted to model predictors of changes in micronutrient intake.
Participants: Seven hundred and ninety-four men participated in a detailed diet history interview at the third wave (baseline nutrition) and 718 men participated at the fourth wave (3-year follow-up).
Results: The mean age was 81 years (range 75–99 years). Median intakes of the majority of micronutrients decreased significantly over the 3-year follow-up. Inadequate intake (relative to the NRV) of thiamine, dietary folate, Zn, Mg, Ca and I was significantly more common at the 3-year follow-up than at baseline. The incidence of newly inadequate micronutrient intake at the 3-year follow-up was 21 %, and 16·4 % of participants had persistently inadequate intake. Changes in micronutrient intake were significantly associated with being born in the UK or Italy, low levels of physical activity, having ≥2 medical conditions and using meal services.
Conclusions: Micronutrient intake decreases with age in older men. Our results suggest that strategies to improve suboptimal micronutrient intakes might need to be developed and implemented for older men.
The symptoms of functional neurological disorder (FND) are a product of its pathophysiology. The pathophysiology of FND is reflective of dysfunction within and across different brain circuits that, in turn, affects specific constructs. In this perspective article, we briefly review five constructs that are affected in FND: emotion processing (including salience), agency, attention, interoception, and predictive processing/inference. Examples of underlying neural circuits include salience, multimodal integration, and attention networks. The symptoms of each patient can be described as a combination of dysfunction in several of these networks and related processes. While we have gained a considerable understanding of FND, there is more work to be done, including determining how pathophysiological abnormalities arise as a consequence of etiologic biopsychosocial factors. To facilitate advances in this underserved and important area, we propose a pathophysiology-focused research agenda to engage government-sponsored funding agencies and foundations.
Cyclonic storms (often called hurricanes, typhoons, or cyclones) often cause population declines in vulnerable bird species, and the intensity of these storms appears to be increasing due to climate change. Prior studies have reported short-term impacts of hurricanes on avifauna, but few have examined long-term impacts. Over two decades (1993–2018), we periodically surveyed a subspecies of West Indian Woodpecker Melanerpes superciliaris nyeanus on San Salvador, a small island in The Bahamas, to determine its distribution on the island, habitat use, and the effects of hurricanes on abundance and population size. We conducted passive and playback surveys, supplemented with mist-netting. Woodpeckers were found only in the northern part of San Salvador, despite extensive surveys throughout other accessible areas of the island. Birds occupied areas of taller coppice adjacent to sabal palm Sabal palmetto groves, which were used for nesting. After hurricanes with >160 kph winds passed over San Salvador, woodpecker densities declined to 35–40% of pre-hurricane levels, but generally recovered to pre-hurricane densities within 2–3 years. Based on the estimated density of woodpeckers within a ~1,400 ha occupied area, we calculated a population size of approximately 240 individuals (CI = 68–408). However, the population declined to far lower numbers immediately following hurricanes. Under IUCN Red List criteria, M. s. nyeanus qualifies as ‘Critically Endangered’ and could be especially sensitive to future hurricanes if they occur at a frequency or intensity high enough to prevent the population from rebounding. Given the small size, isolation, and vulnerability of this population, we recommend preservation of the core habitat, continued monitoring, and further research. Our study shows that small, threatened bird populations can be resilient to the effects of hurricanes, but increased hurricane intensity, in combination with other threats, may limit this resilience in the future.
The scarcity of Romano-British human remains from north-west England has hindered understanding of burial practice in this region. Here, we report on the excavation of human and non-human animal remains and material culture from Dog Hole Cave, Haverbrack. Foetal and neonatal infants had been interred alongside a horse burial and puppies, lambs, calves and piglets in the very latest Iron Age to early Romano-British period, while the mid- to late Roman period is characterised by burials of older individuals with copper-alloy jewellery and beads. This material culture is more characteristic of urban sites, while isotope analysis indicates that the later individuals were largely from the local area. We discuss these results in terms of burial ritual in Cumbria and rural acculturation. Supplementary material is available online (https://doi.org/10.1017/S0068113X20000136) and contains further information about the site and excavations, small finds, zooarchaeology, human osteology, site taphonomy, the palaeoenvironment, isotope methods and analysis, and finds listed in Benson and Bland 1963.
An important component of reintroduction is acclimatization to the release site. Movement parameters and breeding are common metrics used to infer the end of the acclimatization period, but the time taken to locate preferred food items is another important measure. We studied the diet of a reintroduced population of brushtail possums Trichosurus vulpecula in semi-arid South Australia over a 12-month period, investigating changes over time as well as the general diet. We used next-generation DNA sequencing to determine the contents of 253 scat samples, after creating a local plant reference library. Vegetation surveys were conducted monthly to account for availability. After availability was accounted for, dietary diversity and richness decreased significantly with time since release. We used Jacobs' index to assess selectivity; just 13.4% of available plant genera were significantly preferred overall, relative to availability. The mean proportion of preferred plant genera within individual samples increased significantly with time since release, but the frequency of occurrence of preferred plants did not. Five genera (Eucalyptus, Petalostylis, Maireana, Zygophyllum and Callitris) were present in more than half of the samples. There was no difference in dietary preferences between the sexes (Pianka overlap = 0.73). Our results suggest that acclimatization periods may be longer than those estimated from reproduction, changes in mass and movement parameters, but that under suitable conditions a changeable diet should not negatively affect reintroduction outcomes. Reintroduction projects should aim to extend post-release monitoring beyond the dietary acclimatization period and, in dry climates, diet should be monitored through a drought period.
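For readers unfamiliar with the selectivity measure named above, Jacobs' (1974) index D compares use with availability; a small sketch follows, with made-up proportions rather than the study’s data:

```python
def jacobs_index(r: float, p: float) -> float:
    """Jacobs' (1974) selectivity index D, ranging from -1 to 1.

    r: proportion of a plant genus in the diet (e.g. from scat DNA reads)
    p: proportion of that genus available in the vegetation surveys
    D > 0 indicates preference; D < 0 indicates avoidance.
    """
    return (r - p) / (r + p - 2 * r * p)

# Hypothetical example: a genus forming 20% of the diet but only 5% of the
# available vegetation is strongly preferred
print(jacobs_index(0.20, 0.05))  # ~0.65
```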