In April 2019, the U.S. Fish and Wildlife Service (USFWS) released its recovery plan for the jaguar Panthera onca after several decades of discussion, litigation and controversy about the status of the species in the USA. The USFWS estimated that potential habitat, south of the Interstate-10 highway in Arizona and New Mexico, had a carrying capacity of c. six jaguars, and so focused its recovery programme on areas south of the USA–Mexico border. Here we present a systematic review of the modelling and assessment efforts over the last 25 years, with a focus on areas north of Interstate-10 in Arizona and New Mexico, outside the recovery unit considered by the USFWS. Despite differences in data inputs, methods, and analytical extent, the nine previous studies found support for potential suitable jaguar habitat in the central mountain ranges of Arizona and New Mexico. Applying slightly modified versions of the USFWS model and recalculating an Arizona-focused model over both states provided additional confirmation. Extending the area of consideration also substantially raised the carrying capacity of habitats in Arizona and New Mexico, from six to 90 or 151 adult jaguars, using the modified USFWS models. This review demonstrates the crucial ways in which choosing the extent of analysis influences the conclusions of a conservation plan. More importantly, it opens a new opportunity for jaguar conservation in North America that could help address threats from habitat losses, climate change and border infrastructure.
High dietary phosphorus (P), particularly from soluble salts, may contribute to the development of chronic kidney disease in cats. The aim of the present study was to assess the safety of P supplied at 1 g/1000 kcal (4184 kJ) from a highly soluble P salt in P-rich dry-format feline diets. Seventy-five healthy adult cats (n 25/group) were fed either a low-P control diet (1·4 g/1000 kcal (4184 kJ); Ca:P ratio 0·97) or one of two test diets (4 g/1000 kcal (4184 kJ), Ca:P 1·04; or 5 g/1000 kcal (4184 kJ), Ca:P 1·27), both incorporating 1 g/1000 kcal (4184 kJ) sodium tripolyphosphate (STPP), for a period of 30 weeks in a randomised parallel-group study. Health markers in blood and urine, glomerular filtration rate, renal ultrasound and bone density were assessed at baseline and at regular time points. At the end of the test period, responses following transition to a commercial diet (total P 2·34 g/1000 kcal (4184 kJ), Ca:P 1·3) for a 4-week washout period were also assessed. No adverse effects on general, kidney or bone (skeletal) function and health were observed. P and Ca balance, some serum biochemistry parameters and regulatory hormones were increased in cats fed the test diets from week 2 onwards (P ≤ 0·05). Data from the washout period suggest that the increased serum creatinine and urea values observed in the two test diet groups were influenced by dietary differences during the test period and were not indicative of changes in renal function. The present data suggest a no-observed-adverse-effect level for feline diets containing 1 g P/1000 kcal (4184 kJ) from STPP and a total P level of up to 5 g/1000 kcal (4184 kJ) when fed for 30 weeks.
On January 29, 2020, a total of 195 US citizens were evacuated from the coronavirus disease 2019 (COVID-19) epidemic in Wuhan, China, to March Air Reserve Base in Riverside, California, and entered the first federally mandated quarantine in over 50 years. With less than 1 day's notice, a multidisciplinary team from Riverside County and Riverside University Health System, in conjunction with local and federal agencies, established on-site 24-hour medical care and behavioral health support. This report details the coordinated efforts by multiple teams to provide care for the passengers and to support the surrounding community.
To describe epidemiologic and genomic characteristics of a severe acute respiratory syndrome coronavirus 2 (SARS-CoV-2) outbreak in a large skilled-nursing facility (SNF), and the strategies that controlled transmission.
Design, setting, and participants:
This cohort study was conducted during March 22–May 4, 2020, among all staff and residents at a 780-bed SNF in San Francisco, California.
Contact tracing and symptom screening guided targeted testing of staff and residents; respiratory specimens were also collected through serial point prevalence surveys (PPSs) in units with confirmed cases. Cases were confirmed by real-time reverse transcription–polymerase chain reaction testing for SARS-CoV-2, and whole-genome sequencing (WGS) was used to characterize viral isolate lineages and relatedness. Infection prevention and control (IPC) interventions included restricting from work any staff who had close contact with a confirmed case; restricting movement between units; implementing surgical face masking facility-wide; and the use of recommended PPE (ie, isolation gown, gloves, N95 respirator and eye protection) for clinical interactions in units with confirmed cases.
Of 725 staff and residents tested through targeted testing and serial PPSs, 21 (3%) were SARS-CoV-2 positive: 16 (76%) staff and 5 (24%) residents. Fifteen cases (71%) were linked to a single unit. Targeted testing identified 17 cases (81%), and PPSs identified 4 cases (19%). Most cases (71%) were identified before IPC interventions could be implemented. WGS was performed on SARS-CoV-2 isolates from 4 staff and 4 residents: 5 were of Santa Clara County lineage and the 3 others were distinct lineages.
Early implementation of targeted testing, serial PPSs, and multimodal IPC interventions limited SARS-CoV-2 transmission within the SNF.
Critical shortages of personal protective equipment, especially N95 respirators, during the coronavirus disease 2019 (COVID-19) pandemic continue to be a source of concern. Novel methods of N95 filtering facepiece respirator decontamination that can be scaled up for in-hospital use can help address this concern and keep healthcare workers (HCWs) safe.
A multidisciplinary pragmatic study was conducted to evaluate the use of an ultrasonic room high-level disinfection system (HLDS) that generates aerosolized peracetic acid (PAA) and hydrogen peroxide for decontamination of large numbers of N95 respirators. A cycle duration that consistently achieved disinfection of N95 respirators (defined as ≥6 log10 reductions in bacteriophage MS2 and Geobacillus stearothermophilus spores inoculated onto respirators) was identified. The treated masks were assessed for changes to their hydrophobicity, material structure, strap elasticity, and filtration efficiency. PAA and hydrogen peroxide off-gassing from treated masks were also assessed.
The PAA room HLDS was effective for disinfection of bacteriophage MS2 and G. stearothermophilus spores on respirators in a 2,447-cubic-foot (69.6-cubic-meter) room with an aerosol deployment time of 16 minutes and a dwell time of 32 minutes. The total cycle time was 1 hour and 16 minutes. After 5 treatment cycles, no adverse effects were detected on filtration efficiency, structural integrity, or strap elasticity. There was no detectable off-gassing of PAA or hydrogen peroxide from the treated masks at 20 and 60 minutes, respectively, after the disinfection cycle.
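The ≥6-log10 disinfection criterion used in this study is simple arithmetic on before/after organism counts. A minimal sketch, using hypothetical counts rather than the study's data:

```python
import math

def log10_reduction(initial_count: float, final_count: float) -> float:
    """Log10 reduction = log10(initial) - log10(final)."""
    return math.log10(initial_count) - math.log10(final_count)

# Hypothetical inoculum of 1e7 PFU reduced to 10 PFU after treatment:
# a 6-log10 reduction, which meets the >=6-log10 disinfection criterion.
reduction = log10_reduction(1e7, 10)
```

Each log10 unit corresponds to a further 10-fold reduction, so a ≥6-log10 reduction means at least a million-fold decrease in viable organisms.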
The PAA room disinfection system provides a rapidly scalable solution for in-hospital decontamination of large numbers of N95 respirators during the COVID-19 pandemic.
The perinatal period is a vulnerable time for the development of psychopathology, particularly mood and anxiety disorders. In the study of maternal anxiety, important questions remain regarding the association between maternal anxiety symptoms and subsequent child outcomes. This study examined the association between depressive and anxiety symptoms (namely social anxiety, panic, and agoraphobia symptoms) during the perinatal period and maternal perception of child behavior, specifically different facets of development and temperament. Participants (N = 104) were recruited during pregnancy from a community sample. Participants completed clinician-administered and self-report measures of depressive and anxiety symptoms during the third trimester of pregnancy and at 16 months postpartum; child behavior and temperament outcomes were assessed at 16 months postpartum. Child development areas included gross and fine motor skills, language and problem-solving abilities, and personal/social skills. Child temperament domains included surgency, negative affectivity, and effortful control. Hierarchical multiple regression analyses demonstrated that elevated prenatal social anxiety symptoms significantly predicted more negative maternal report of child behavior across most measured domains. Elevated prenatal social anxiety and panic symptoms predicted more negative maternal report of child effortful control. Depressive and agoraphobia symptoms were not significant predictors of child outcomes. Elevated anxiety symptoms appear to have a distinct association with maternal report of child development and temperament. Considering the relative influence of anxiety symptoms, particularly social anxiety, on maternal report of child behavior and temperament can help to identify potential difficulties early in mother–child interactions as well as inform interventions for women and their families.
This article explores the influence of East Asia's economic growth on the evolution of American neoconservative thought in the 1970s and 1980s. It traces how prominent neoconservative thinkers—Nathan Glazer, Peter L. Berger, Herman Kahn, Michael Novak, and Lawrence E. Harrison—developed the claim that the region's prosperity stemmed from its alleged Confucian tradition. Drawing in part from East Asian leaders and scholars, they argued that the region's growth demonstrated that tradition had facilitated, rather than hampered, the development of a distinct East Asian capitalist modernity. The article argues that this Confucian thesis helped American neoconservatives articulate their conviction that “natural” social hierarchies, religious commitment, and traditional families were necessary for healthy and free capitalist societies. It then charts how neoconservatives mobilized this interpretation of Confucian East Asia against postcolonial critiques of capitalism, especially dependency theory. East Asia, they claimed, demonstrated that poverty and wealth were determined not by patterns of welfare, structural exploitation, or foreign assistance, but by values and culture. The concept of Confucian capitalism, the article shows, was central to neoconservatives’ broad ideological agenda of protecting political, economic, and racial inequality under the guise of values, culture, and tradition.
Economic hardship (EH) may be linked to poorer child diet; however, whether this association is due to resource limitations or to effects on family functioning is unknown. This study examines whether parenting stress mediates the association between EH and child consumption of foods high in saturated fats and added sugars (SFAS).
Data were collected from the Fragile Families and Child Wellbeing study. EH was assessed using eight items collected when children were between 1–9 years old. Mothers reported parenting stress and frequency of child consumption of high SFAS foods when children were 9 years old. Latent growth curve modelling (LGCM) and structural equation modelling tested direct associations between the starting level/rate of change in EH and high SFAS food consumption, and parenting stress as a mediator of the association.
Twenty US cities.
Mothers and children (n 3846) were followed from birth through age 9 years, with oversampling of ‘high-risk’, unmarried mothers.
LGCM indicated a curvilinear trend in EH from ages 1–9, with steeper increases from ages 3–9 years. EH did not directly predict the frequency of high SFAS foods. Average EH at 3 and 5 years and change in EH from ages 1–9 predicted higher parenting stress, which in turn predicted more frequent consumption of high SFAS foods.
Findings suggest it may be important to consider parenting stress in early prevention efforts given potential lasting effects of early life EH on child consumption of high SFAS foods. Future research should explore how supports and resources may buffer effects of EH-related stress on parents and children.
Species distribution models (SDMs) are statistical tools used to develop continuous predictions of species occurrence. ‘Integrated SDMs’ (ISDMs) elaborate on this approach by allowing the joint use of opportunistically collected presence-only data and site-occupancy data from planned surveys. These models also account for survey bias and imperfect detection through a hierarchical modelling framework that separately estimates the species–environment response and the detection process. This is particularly helpful for conservation applications and predictions for rare species, where data are often limited and prediction errors may have significant management consequences. Despite this potential importance, ISDMs remain largely untested under a variety of scenarios. We explored key modelling decisions and assumptions in an ISDM using the endangered Baird’s tapir (Tapirus bairdii) as a test species. We found that site area had the strongest effect on the magnitude of population estimates and the underlying intensity surface, driven by estimates of the model intercepts. Selecting a site area that accounted for the individual movements of the species within an average home range led to population estimates that coincided with expert estimates. ISDMs that do not account for the individual movements of species will likely produce less accurate estimates of species intensity (number of individuals per unit area) and thus of overall population size. This bias could be severe and highly detrimental to conservation actions if uninformed ISDMs are used to estimate global populations of threatened and data-deficient species, particularly those that lack natural history and movement information.
However, the ISDM was consistently the most accurate model compared to other approaches, which demonstrates the importance of this new modelling framework and the ability to combine opportunistic data with systematic survey data. Thus, we recommend researchers use ISDMs with conservative movement information when estimating population sizes of rare and data-deficient species. ISDMs could be improved by using a similar parameterization to spatial capture–recapture models that explicitly incorporate animal movement as a model parameter, which would further remove the need for spatial subsampling prior to implementation.
Objective: Few studies have investigated the assessment and functional impact of egocentric and allocentric neglect among stroke patients. This pilot study aimed to determine (1) whether allocentric and egocentric neglect could be dissociated among a sample of stroke patients using eye tracking; (2) the specific patterns of attention associated with each subtype; and (3) the nature of the relationship between neglect subtype and functional outcome. Method: Twenty acute stroke patients were administered neuropsychological assessment batteries, a pencil-and-paper Apples Test to measure neglect subtype, and an adaptation of the Apples Test with an eye tracking measure. To test clinical discriminability, twenty age- and education-matched control participants were administered the eye tracking measure of neglect. Results: The eye tracking measure identified a greater number of individuals as having egocentric and/or allocentric neglect than the pencil-and-paper Apples Test. Classification of neglect subtype based on eye tracking performance was a significant predictor of functional outcome beyond that accounted for by the neuropsychological test performance and Apples Test neglect classification. Preliminary evidence suggests that patients with no neglect symptoms had superior functional outcomes compared with patients with neglect. Patients with combined egocentric and allocentric neglect had poorer functional outcomes than those with either subtype. Functional outcomes of patients with either allocentric or egocentric neglect did not differ significantly. The applications of our findings, to improve neglect detection, are discussed. Conclusion: Results highlight the potential clinical utility of eye tracking for the assessment and identification of neglect subtype among stroke patients to predict functional outcomes. (JINS, 2019, 25, 479–489)
OBJECTIVES/SPECIFIC AIMS: To evaluate the ability of various techniques to track changes in body fluid volumes before and after a rapid infusion of saline. METHODS/STUDY POPULATION: Eight healthy participants (5 M; 3 F) completed baseline measurements of (1) total body water, using ethanol dilution and bioelectrical impedance analysis (BIA), and (2) blood volume, plasma volume and red blood cell (RBC) volume, using the carbon monoxide rebreathing technique and I-131 albumin dilution. Subsequently, 30 mL saline/kg body weight was administered intravenously over 20 minutes, after which BIA and ethanol dilution were repeated. RESULTS/ANTICIPATED RESULTS: On average, 2.29±0.35 L saline was infused, with an average increase in net fluid input-output (I/O) of 1.56±0.29 L. BIA underestimated measured I/O by −3.4±7.9%, while ethanol dilution did not demonstrate a measurable change in total body water. Carbon monoxide rebreathing differed from I-131 albumin dilution measurements of blood, plasma and RBC volumes by +0.6±2.8%, −5.4±3.6%, and +11.0±4.7%, respectively. DISCUSSION/SIGNIFICANCE OF IMPACT: BIA is capable of tracking modest changes in total body water. Carbon monoxide rebreathing appears to be a viable alternative to the I-131 albumin dilution technique for determining blood volume. Together, these two techniques may be useful in monitoring fluid status in patients with impaired fluid regulation.
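The method-comparison figures above are signed percent differences of one technique relative to a reference. A minimal sketch with hypothetical values (not the study's raw data):

```python
def percent_difference(method_value: float, reference_value: float) -> float:
    """Signed percent difference of a method relative to a reference:
    negative values indicate the method underestimates the reference."""
    return (method_value - reference_value) / reference_value * 100.0

# Hypothetical: BIA estimates a 1.51 L fluid gain against a measured
# net input-output of 1.56 L, i.e. an underestimate of roughly -3.2%,
# the same order as the mean bias reported above.
diff = percent_difference(1.51, 1.56)
```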
Understanding factors that are associated with more adaptive death attitudes and competencies can inspire future health-promoting palliative care strategies and inform approaches to training and development for health professionals. The potential importance of meaning, purpose, quality, and values in life for promoting adaptive death attitudes has been highlighted, but there is limited research in this area, particularly in relation to death competence. The purpose of this cross-sectional study was to develop an understanding of demographic and life-related factors associated with perceived death competence, such as meaning in life and quality of life.
During the enrollment period of a Massive Open Online Course about death and dying, 277 participants completed questionnaires on death competence, meaning in life, quality of life, and sociodemographic background.
Findings indicated that greater presence of meaning in life, quality of life, age, death experience, and carer experience were each statistically significant unique predictors of death competence scores. Life-related variables were more strongly associated with death competence than demographic variables. Bereavement experience and experience caring for the dying were associated with greater death competence, but there were no differences in death competence between health professionals and the general community. Above all other factors, the presence of meaning in life was the strongest predictor of higher perceived competence in coping with death.
Significance of results
The findings demonstrate important interconnections between our attitudes about life and death. Knowledge of factors associated with poorer death competence may help identify those at risk of greater distress when facing death, and might prove useful additions to bereavement risk assessments. Understanding factors associated with greater death competence in health professionals and volunteers may help predict or prevent burnout and compassion fatigue, and help identify who would benefit from additional training and support. Future longitudinal studies including both health professionals and the general community are needed to determine the effect adaptive attitudes toward meaning in life can potentially have on bolstering subsequent adaptive coping and competence regarding death and dying.
OBJECTIVES/SPECIFIC AIMS: Community health coalitions (CHC) aim to improve local cultures of health, health behaviors, and health outcomes. However, challenges in sustaining partnerships and activities limit CHC impact. Traditional CHC evaluations survey members about perceived effectiveness, failing to capture underlying network structures and community health outcomes. Thus, we applied a mixed-methods evaluation in eight rural Indiana CHCs, triangulating social network analysis (SNA; conducted in 2017), functioning effectiveness (Coalition Self-Assessment Survey (CSAS), also 2017), and the latest county health statistics (2015–2016) to assess existing CHC building efforts, inform best practices, and facilitate the adoption of evidence-based programming. METHODS/STUDY POPULATION: Across the eight rural Indiana CHCs, relationships between the three evaluation components were analyzed using Pearson’s correlations. We are now collaborating with Purdue’s Nutrition Education Program Community Wellness Coordinators to scale up evaluation efforts throughout Indiana. RESULTS/ANTICIPATED RESULTS: CHC effectiveness was positively correlated with the average number of connections CHC members held in the network (mean indegree) and negatively correlated with the presence of a network broker (eigenvector centrality). However, effective leadership was positively correlated with opioid deaths and treatment, food insecurity, smoking during pregnancy, lack of healthcare coverage, and fair/poor health status, and negatively correlated with prenatal care. Effective operating norms were positively correlated with smoking during pregnancy and preterm births, and negatively correlated with prenatal care. Effective action outcomes were positively correlated with opioid deaths and treatments, smoking during pregnancy, preterm births, and fair/poor health status, and negatively correlated with respondents reporting that they had no personal doctor.
DISCUSSION/SIGNIFICANCE OF IMPACT: Interestingly, CHC effectiveness was positively correlated with poor county health outcomes related to infant well-being. Thus, CHC may develop in counties with a high unmet need for effective pregnancy and infant services. Alternatively, the prevalent CHC focus on obesity prevention may eclipse programmatic efforts to improve infant well-being. Longitudinal evaluations and scaling up evaluation efforts across Indiana are being pursued to clarify trajectories and inform best practices, which in turn should provide recommendations for network structures to improve CHC effectiveness and county health.
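The two network measures used in the coalition evaluation above (mean indegree and eigenvector centrality) can be sketched in plain Python. The four-member network and its ties below are invented for illustration only:

```python
# Hypothetical directed coalition network: ties[a] lists the members
# that a reported working with (a -> b means a nominated b).
ties = {"A": ["B", "C"], "B": ["C"], "C": ["A"], "D": ["C"]}
members = sorted(ties)

# Mean indegree: average number of incoming nominations per member.
indegree = {m: 0 for m in members}
for src in members:
    for dst in ties[src]:
        indegree[dst] += 1
mean_indegree = sum(indegree.values()) / len(members)  # 5 ties / 4 members

# Eigenvector centrality via power iteration on the symmetrised network:
# members connected to well-connected members score higher, which helps
# flag brokers who bridge otherwise separate parts of the coalition.
neighbours = {m: set() for m in members}
for src in members:
    for dst in ties[src]:
        neighbours[src].add(dst)
        neighbours[dst].add(src)

score = {m: 1.0 for m in members}
for _ in range(100):
    new = {m: sum(score[n] for n in neighbours[m]) for m in members}
    top = max(new.values())
    score = {m: v / top for m, v in new.items()}
```

In this toy network, member C receives the most nominations and ends up with the highest eigenvector centrality; a real evaluation would compute the same quantities on the surveyed coalition ties.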
Community stakeholders often participate in community research training curricula development. There is limited information describing how their input informs curricula. This paper describes input solicitation methods, input received, and examples of its integration.
From June 2014 to June 2016, community members (CMs) and community-based organizations (CBOs) guided curricula development tailored for CMs and CBOs, respectively. Engagement methods included a strategic planning retreat, surveys, a listening session, workgroup meetings, and community engagement studios. Descriptive statistics were used to summarize survey input. For other methods, input was extracted and compiled from facilitator notes.
CMs (n=37), including patients and caregivers, and CBOs (n=83), including advocacy, community service, and faith-based organizations, provided input. The major feedback categories were training topic priorities, format (e.g., face-to-face vs. online), logistics (e.g., training frequency), and compensation (e.g., appropriateness). Input directly guided the design of the CBO and CM curricula (e.g., additional time devoted to specific topics based on feedback) or helped to finalize logistics.
Multiple quantitative and qualitative methods can be used to elicit input from community stakeholders to inform the development of community research training curricula. This input is essential for the development of training curricula that are culturally relevant and acceptable.
Japan's remarkable postwar growth spurt in the 1960s would not have been possible without its alliance with the United States. Policy makers, political scientists, economists, historians, and journalists on both sides of the Pacific have made this claim, but no study has yet tested it with modern statistical methods. In this article, we compare the economic growth trajectories of Japan and a statistically constructed “synthetic” Japan, which had a similar profile until the late 1950s but did not experience the consolidation of the US–Japan alliance, a process that began in 1958 and culminated with the signing of a formal defense pact in January 1960. We find that Japan's per capita gross domestic product (GDP) grew much faster than the synthetic Japan's from 1958 to 1968. We substantiate these results with in-depth historical analyses of how the United States facilitated Japan's economic miracle.
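The "synthetic" comparison unit described above is, in essence, a weighted combination of donor countries chosen so that the weighted pre-treatment trajectory matches the treated country's. A minimal two-donor sketch with invented GDP-per-capita numbers (not the paper's data):

```python
# Pre-treatment outcome paths (hypothetical indexed GDP per capita):
treated_pre = [100, 104, 109]
donors_pre = {"d1": [98, 101, 105], "d2": [103, 108, 114]}

def fit_error(w: float) -> float:
    """Squared pre-treatment gap between the treated unit and the
    synthetic unit w*d1 + (1-w)*d2 (nonnegative weights summing to 1)."""
    synth = [w * a + (1 - w) * b
             for a, b in zip(donors_pre["d1"], donors_pre["d2"])]
    return sum((t - s) ** 2 for t, s in zip(treated_pre, synth))

# With two donors there is one free weight; a grid search suffices here.
best_w = min((i / 1000 for i in range(1001)), key=fit_error)

# Post-treatment, the estimated effect is the gap between the treated
# unit's observed outcome and the synthetic unit's weighted outcome.
```

Real synthetic control applications fit weights over many donors and predictor variables with constrained optimisation; the grid search above only illustrates the idea of matching the pre-treatment path.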
To investigate the association of policy, systems and environmental factors with improvement in household food security among low-income Indiana households with children after a Supplemental Nutrition Assistance Program-Education (SNAP-Ed) direct nutrition education intervention.
Household food security scores, measured by the eighteen-item US Household Food Security Survey Module in a longitudinal, randomised and controlled SNAP-Ed intervention study conducted from August 2013 to April 2015, were the response variable. Metrics quantifying environmental factors (classification of urban or rural county status; the number of SNAP-authorized stores, food pantries and recreational facilities; average fair-market housing rental price; and natural amenity rank) were collected from government websites and data sets covering the years 2012–2016 and used as covariates in mixed multiple linear regression modelling.
Thirty-seven Indiana counties, USA, 2012–2016.
SNAP-Ed eligible adults from households with children (n 328).
None of the environmental factors investigated were significantly associated with changes in household food security in this exploratory study.
SNAP-Ed improves food security regardless of urban or rural location or the environmental factors investigated. Expansion of SNAP-Ed in rural areas may support food access among the low-income population and reduce the prevalence of food insecurity in rural compared with urban areas. Further investigation into the policy, systems and environmental factors of the Social Ecological Model is warranted to better understand their relationship with direct SNAP-Ed and their impact on diet-related behaviours and food security.