After the 2016 election upheaval and polarized public discourse in the United States, and the rise of radical-right and populist parties across the globe, a new phenomenon in online charitable giving has emerged – donating motivated by rage. This Element defines this phenomenon and situates it within current research on emotions and charitable giving; it then discusses the implications of viral fundraising and increased social media use by both donors and nonprofit organizations, the intersectionality of rage giving and its meaning for practitioners and nonprofit organizations, giving as a form of civic engagement, and philanthropy as a tool for social movements and social change. Previous research shows contextual variation in charitable giving motivations; however, giving motivated by feelings of anger and rage is an unstudied behavioral shift in online giving.
COVID-19 has markedly impacted the provision of neurodevelopmental care. In response, the Cardiac Neurodevelopmental Outcome Collaborative established a Task Force to assess the telehealth practices of cardiac neurodevelopmental programmes during COVID-19, including adaptation of services, test protocols and interventions, and perceived obstacles, disparities, successes, and training needs.
Study Design:
A 47-item online survey was sent to 42 Cardiac Neurodevelopmental Outcome Collaborative member sites across North America within a 3-week timeframe (22 July to 11 August 2020) to collect cross-sectional data on practices.
Results:
Of the 30 participating sites (71.4% response rate), all were providing at least some clinical services at the time of the survey, and 24 sites (80%) reported using telehealth. All but one of these sites were offering new telehealth services in response to COVID-19, with the most striking change being the capacity to offer new intervention services for children and their caregivers. Only a third of sites were able to carry out standardised, performance-based neurodevelopmental testing with children and adolescents using telehealth, and none had completed comparable testing with infants and toddlers. Barriers associated with language, child ability, and access to technology were identified as contributing to disparities in telehealth access.
Conclusions:
Telehealth has enabled continuation of at least some cardiac neurodevelopmental services during COVID-19, despite the challenges experienced by providers, children, families, and health systems. The Cardiac Neurodevelopmental Outcome Collaborative provides a unique platform for sharing challenges and successes across sites, as we continue to shape an evidence-based, efficient, and consistent approach to the care of individuals with CHD.
Capacity development is critical to long-term conservation success, yet we lack a robust and rigorous understanding of how well its effects are being evaluated. A comprehensive summary of who is monitoring and evaluating capacity development interventions, what is being evaluated and how, would help in the development of evidence-based guidance to inform design and implementation decisions for future capacity development interventions and evaluations of their effectiveness. We built an evidence map by reviewing peer-reviewed and grey literature published since 2000, to identify case studies evaluating capacity development interventions in biodiversity conservation and natural resource management. We used inductive and deductive approaches to develop a coding strategy for studies that met our criteria, extracting data on the type of capacity development intervention, evaluation methods, data and analysis types, categories of outputs and outcomes assessed, and whether the study had a clear causal model and/or used a systems approach. We found that almost all studies assessed multiple outcome types: most frequent was change in knowledge, followed by behaviour, then attitude. Few studies evaluated conservation outcomes. Less than half included an explicit causal model linking interventions to expected outcomes. Half of the studies considered external factors that could influence the efficacy of the capacity development intervention, and few used an explicit systems approach. We used framework synthesis to situate our evidence map within the broader literature on capacity development evaluation. Our evidence map (including a visual heat map) highlights areas of low and high representation in investment in research on the evaluation of capacity development.
We implemented a parent–teacher Vanderbilt agreement program to increase return rates of Vanderbilt assessment scales for children in our primary care practice, and compared the assessment return rate before and after agreement signature.
Methods
We retrospectively reviewed children diagnosed with attention-deficit/hyperactivity disorder (ADHD) who had a signed Vanderbilt agreement and were under continuous care at our clinic. Return rates were compared 1 year before and 1 year after the agreement date.
Results
Among 195 children, prior to the agreement, 71% returned teacher assessments, and 59% returned parent forms; after the intervention, assessment rates were not significantly different (76%, p = .255; and 65%, p = .185, respectively). The median number of returned assessments increased after the agreement.
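For readers who want to reproduce this kind of before/after comparison, the sketch below shows a simple two-proportion test in Python. The counts are reconstructed from the reported percentages (71% vs. 76% of 195 children) and are therefore approximate, and the test treats the two periods as independent samples, whereas the paired design described here would more properly call for a McNemar-type test.

```python
# Illustrative two-proportion comparison of teacher-form return rates before
# vs. after the agreement. Counts are reconstructed from the reported
# percentages and are approximate; the paired design would more properly
# use a McNemar-type test.
from scipy.stats import chi2_contingency

n = 195
returned_before = round(0.71 * n)   # ~138 teacher forms returned pre-agreement
returned_after = round(0.76 * n)    # ~148 teacher forms returned post-agreement

table = [
    [returned_before, n - returned_before],
    [returned_after, n - returned_after],
]
chi2, p, dof, _ = chi2_contingency(table)
print(f"chi-square = {chi2:.2f}, p = {p:.3f}")
```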
Conclusions
Lack of documented parent and teacher Vanderbilt assessments remains a barrier to appropriate management of ADHD. Improving the rate of returned assessments is an important goal for treating ADHD in the primary care setting.
Seeman, Morris, and Summers misrepresent or misunderstand the arguments we have made, as well as their own previous work. Here, we correct these inaccuracies. We also reiterate our support for hypothesis-driven and evidence-based research.
In April 2019, the U.S. Fish and Wildlife Service (USFWS) released its recovery plan for the jaguar Panthera onca after several decades of discussion, litigation and controversy about the status of the species in the USA. The USFWS estimated that potential habitat, south of the Interstate-10 highway in Arizona and New Mexico, had a carrying capacity of c. six jaguars, and so focused its recovery programme on areas south of the USA–Mexico border. Here we present a systematic review of the modelling and assessment efforts over the last 25 years, with a focus on areas north of Interstate-10 in Arizona and New Mexico, outside the recovery unit considered by the USFWS. Despite differences in data inputs, methods, and analytical extent, the nine previous studies found support for potential suitable jaguar habitat in the central mountain ranges of Arizona and New Mexico. Applying slightly modified versions of the USFWS model and recalculating an Arizona-focused model over both states provided additional confirmation. Extending the area of consideration also substantially raised the carrying capacity of habitats in Arizona and New Mexico, from six to 90 or 151 adult jaguars, using the modified USFWS models. This review demonstrates the crucial ways in which choosing the extent of analysis influences the conclusions of a conservation plan. More importantly, it opens a new opportunity for jaguar conservation in North America that could help address threats from habitat losses, climate change and border infrastructure.
High dietary phosphorus (P), particularly soluble salts, may contribute to chronic kidney disease development in cats. The aim of the present study was to assess the safety of P supplied at 1 g/1000 kcal (4184 kJ) from a highly soluble P salt in P-rich dry format feline diets. Seventy-five healthy adult cats (n 25/group) were fed either a low-P control diet (1·4 g/1000 kcal [4184 kJ]; Ca:P ratio 0·97) or one of two test diets (4 g/1000 kcal [4184 kJ], Ca:P 1·04; or 5 g/1000 kcal [4184 kJ], Ca:P 1·27), both incorporating 1 g P/1000 kcal (4184 kJ) from sodium tripolyphosphate (STPP), for a period of 30 weeks in a randomised parallel-group study. Health markers in blood and urine, glomerular filtration rate, renal ultrasound and bone density were assessed at baseline and at regular time points. At the end of the test period, responses following transition to a commercial diet (total P 2·34 g/1000 kcal [4184 kJ], Ca:P 1·3) for a 4-week washout period were also assessed. No adverse effects on general, kidney or bone (skeletal) function and health were observed. P and Ca balance, some serum biochemistry parameters and regulatory hormones were increased in cats fed the test diets from week 2 onwards (P ≤ 0·05). Data from the washout period suggest that the increased serum creatinine and urea values observed in the two test diet groups were influenced by dietary differences during the test period, and were not indicative of changes in renal function. The present data suggest a no-observed-adverse-effect level for feline diets containing 1 g P/1000 kcal (4184 kJ) from STPP and a total P level of up to 5 g/1000 kcal (4184 kJ) when fed for 30 weeks.
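Because nutrient densities in this abstract are quoted per 1000 kcal with the kilojoule equivalent in parentheses, a tiny helper makes the energy-basis arithmetic explicit (1 kcal = 4.184 kJ, so 1000 kcal = 4184 kJ = 4.184 MJ). The values plugged in below are the total-P levels of the three diets; the g/MJ figures are simply a unit conversion, not data from the study.

```python
# Tiny helper for the energy-basis units used in this abstract:
# densities are quoted per 1000 kcal, and 1 kcal = 4.184 kJ,
# so 1000 kcal = 4184 kJ = 4.184 MJ.
KCAL_TO_KJ = 4.184

def per_1000kcal_to_per_mj(grams_per_1000kcal: float) -> float:
    """Convert a nutrient density from g/1000 kcal to g/MJ."""
    return grams_per_1000kcal / KCAL_TO_KJ   # because 1000 kcal = 4.184 MJ

for total_p in (1.4, 4.0, 5.0):              # total-P levels of the three diets
    print(f"{total_p} g/1000 kcal  ->  {per_1000kcal_to_per_mj(total_p):.2f} g/MJ")
```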
On January 29, 2020, a total of 195 US citizens were evacuated from the coronavirus disease 2019 (COVID-19) epidemic in Wuhan, China, to March Air Reserve Base in Riverside, California, and entered the first federally mandated quarantine in over 50 years. With less than 1 day's notice, a multidisciplinary team from Riverside County and Riverside University Health System, in conjunction with local and federal agencies, established on-site 24-hour medical care and behavioral health support. This report details the coordinated efforts of multiple teams to provide care for the passengers and to support the surrounding community.
To describe epidemiologic and genomic characteristics of a severe acute respiratory syndrome coronavirus 2 (SARS-CoV-2) outbreak in a large skilled-nursing facility (SNF), and the strategies that controlled transmission.
Design, setting, and participants:
This cohort study was conducted during March 22–May 4, 2020, among all staff and residents at a 780-bed SNF in San Francisco, California.
Methods:
Contact tracing and symptom screening guided targeted testing of staff and residents; respiratory specimens were also collected through serial point prevalence surveys (PPSs) in units with confirmed cases. Cases were confirmed by real-time reverse transcription–polymerase chain reaction testing for SARS-CoV-2, and whole-genome sequencing (WGS) was used to characterize viral isolate lineages and relatedness. Infection prevention and control (IPC) interventions included restricting from work any staff who had close contact with a confirmed case; restricting movement between units; implementing surgical face masking facility-wide; and using recommended PPE (ie, isolation gown, gloves, N95 respirator, and eye protection) for clinical interactions in units with confirmed cases.
Results:
Of 725 staff and residents tested through targeted testing and serial PPSs, 21 (3%) were SARS-CoV-2 positive: 16 (76%) staff and 5 (24%) residents. Fifteen cases (71%) were linked to a single unit. Targeted testing identified 17 cases (81%), and PPSs identified 4 cases (19%). Most cases (71%) were identified before IPC interventions could be implemented. WGS was performed on SARS-CoV-2 isolates from 4 staff and 4 residents: 5 were of Santa Clara County lineage and the other 3 were of distinct lineages.
Conclusions:
Early implementation of targeted testing, serial PPSs, and multimodal IPC interventions limited SARS-CoV-2 transmission within the SNF.
Critical shortages of personal protective equipment, especially N95 respirators, during the coronavirus disease 2019 (COVID-19) pandemic continue to be a source of concern. Novel methods of N95 filtering facepiece respirator decontamination that can be scaled up for in-hospital use can help address this concern and keep healthcare workers (HCWs) safe.
Methods:
A multidisciplinary pragmatic study was conducted to evaluate the use of an ultrasonic room high-level disinfection system (HLDS) that generates aerosolized peracetic acid (PAA) and hydrogen peroxide for decontamination of large numbers of N95 respirators. A cycle duration that consistently achieved disinfection of N95 respirators (defined as ≥6 log10 reductions in bacteriophage MS2 and Geobacillus stearothermophilus spores inoculated onto respirators) was identified. The treated masks were assessed for changes to their hydrophobicity, material structure, strap elasticity, and filtration efficiency. Off-gassing of PAA and hydrogen peroxide from the treated masks was also assessed.
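As an illustration of the disinfection criterion used here, the short sketch below computes a log10 reduction from before/after recoverable organism counts. The counts are invented for illustration only and are not data from this study.

```python
# Illustrative log10-reduction calculation for the ">= 6 log10 reduction"
# disinfection criterion. The counts below are made-up examples, not study data.
import math

def log10_reduction(count_before: float, count_after: float,
                    detection_limit: float = 1.0) -> float:
    """Log10 reduction; counts below the detection limit are set to the limit."""
    count_after = max(count_after, detection_limit)
    return math.log10(count_before / count_after)

inoculum = 1e7          # organisms recovered from an untreated respirator
post_treatment = 5.0    # organisms recovered after one decontamination cycle
lr = log10_reduction(inoculum, post_treatment)
print(f"log10 reduction = {lr:.1f} -> {'pass' if lr >= 6 else 'fail'}")
```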
Results:
The PAA room HLDS was effective for disinfection of bacteriophage MS2 and G. stearothermophilus spores on respirators in a 2,447 cubic-foot (69.6 cubic-meter) room with an aerosol deployment time of 16 minutes and a dwell time of 32 minutes. The total cycle time was 1 hour and 16 minutes. After 5 treatment cycles, no adverse effects were detected on filtration efficiency, structural integrity, or strap elasticity. There was no detectable off-gassing of PAA and hydrogen peroxide from the treated masks at 20 and 60 minutes after the disinfection cycle, respectively.
Conclusion:
The PAA room disinfection system provides a rapidly scalable solution for in-hospital decontamination of large numbers of N95 respirators during the COVID-19 pandemic.
The perinatal period is a vulnerable time for the development of psychopathology, particularly mood and anxiety disorders. In the study of maternal anxiety, important questions remain regarding the association between maternal anxiety symptoms and subsequent child outcomes. This study examined the association between depressive and anxiety symptoms, namely social anxiety, panic, and agoraphobia disorder symptoms during the perinatal period and maternal perception of child behavior, specifically different facets of development and temperament. Participants (N = 104) were recruited during pregnancy from a community sample. Participants completed clinician-administered and self-report measures of depressive and anxiety symptoms during the third trimester of pregnancy and at 16 months postpartum; child behavior and temperament outcomes were assessed at 16 months postpartum. Child development areas included gross and fine motor skills, language and problem-solving abilities, and personal/social skills. Child temperament domains included surgency, negative affectivity, and effortful control. Hierarchical multiple regression analyses demonstrated that elevated prenatal social anxiety symptoms significantly predicted more negative maternal report of child behavior across most measured domains. Elevated prenatal social anxiety and panic symptoms predicted more negative maternal report of child effortful control. Depressive and agoraphobia symptoms were not significant predictors of child outcomes. Elevated anxiety symptoms appear to have a distinct association with maternal report of child development and temperament. Considering the relative influence of anxiety symptoms, particularly social anxiety, on maternal report of child behavior and temperament can help to identify potential difficulties early on in mother–child interactions as well as inform interventions for women and their families.
This article explores the influence of East Asia's economic growth on the evolution of American neoconservative thought in the 1970s and 1980s. It traces how prominent neoconservative thinkers—Nathan Glazer, Peter L. Berger, Herman Kahn, Michael Novak, and Lawrence E. Harrison—developed the claim that the region's prosperity stemmed from its alleged Confucian tradition. Drawing in part from East Asian leaders and scholars, they argued that the region's growth demonstrated that tradition had facilitated, rather than hampered, the development of a distinct East Asian capitalist modernity. The article argues that this Confucian thesis helped American neoconservatives articulate their conviction that “natural” social hierarchies, religious commitment, and traditional families were necessary for healthy and free capitalist societies. It then charts how neoconservatives mobilized this interpretation of Confucian East Asia against postcolonial critiques of capitalism, especially dependency theory. East Asia, they claimed, demonstrated that poverty and wealth were determined not by patterns of welfare, structural exploitation, or foreign assistance, but by values and culture. The concept of Confucian capitalism, the article shows, was central to neoconservatives’ broad ideological agenda of protecting political, economic, and racial inequality under the guise of values, culture, and tradition.
Economic hardship (EH) may be linked to poorer child diet; however, whether this association is due to resource limitations or to effects on family functioning is unknown. This study examines whether parenting stress mediates the association between EH and child consumption of foods high in saturated fats and added sugars (SFAS).
Design:
Data were collected from the Fragile Families and Child Wellbeing study. EH was assessed using eight items collected when children were between 1 and 9 years old. Mothers reported parenting stress and the frequency of child consumption of high SFAS foods when children were 9 years old. Latent growth curve modelling (LGCM) and structural equation modelling tested direct associations of both the starting level and the rate of change in EH with high SFAS food consumption, and tested parenting stress as a mediator of the association.
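As a rough illustration of the mediation logic described above, the sketch below estimates a product-of-coefficients indirect effect (EH → parenting stress → high-SFAS consumption) with a bootstrap confidence interval on simulated, single-timepoint data. This is not the latent growth curve/structural equation model used in the study; all variable names and effect sizes are hypothetical.

```python
# Simplified product-of-coefficients mediation sketch on simulated data.
# NOT the study's LGCM/SEM analysis; names and effect sizes are hypothetical.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(0)
n = 500
eh = rng.normal(size=n)                    # economic hardship (standardised)
stress = 0.4 * eh + rng.normal(size=n)     # parenting stress
sfas = 0.3 * stress + rng.normal(size=n)   # frequency of high-SFAS foods

def indirect_effect(eh, stress, sfas):
    # a-path: EH -> stress; b-path: stress -> SFAS, adjusting for EH
    a = sm.OLS(stress, sm.add_constant(eh)).fit().params[1]
    b = sm.OLS(sfas, sm.add_constant(np.column_stack([eh, stress]))).fit().params[2]
    return a * b

point = indirect_effect(eh, stress, sfas)
boot = []
for _ in range(1000):                      # nonparametric bootstrap CI
    idx = rng.integers(0, n, n)
    boot.append(indirect_effect(eh[idx], stress[idx], sfas[idx]))
low, high = np.percentile(boot, [2.5, 97.5])
print(f"indirect effect = {point:.3f}, 95% bootstrap CI [{low:.3f}, {high:.3f}]")
```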
Setting:
Twenty US cities.
Participants:
Mothers and children (n 3846) were followed from birth through age 9 years; the sample oversampled ‘high-risk’, unmarried mothers.
Results:
LGCM indicated a curvilinear trend in EH from ages 1–9, with steeper increases from ages 3–9 years. EH did not directly predict the frequency of high SFAS foods. Average EH at 3 and 5 years and change in EH from ages 1–9 predicted higher parenting stress, which in turn predicted more frequent consumption of high SFAS foods.
Conclusions:
Findings suggest it may be important to consider parenting stress in early prevention efforts given potential lasting effects of early life EH on child consumption of high SFAS foods. Future research should explore how supports and resources may buffer effects of EH-related stress on parents and children.
Species distribution models (SDMs) are statistical tools used to develop continuous predictions of species occurrence. ‘Integrated SDMs’ (ISDMs) are an elaboration of this approach with potential advantages that allow for the dual use of opportunistically collected presence-only data and site-occupancy data from planned surveys. These models also account for survey bias and imperfect detection through the use of a hierarchical modelling framework that separately estimates the species–environment response and detection process. This is particularly helpful for conservation applications and predictions for rare species, where data are often limited and prediction errors may have significant management consequences. Despite this potential importance, ISDMs remain largely untested under a variety of scenarios. We performed an exploration of key modelling decisions and assumptions on an ISDM using the endangered Baird’s tapir (Tapirus bairdii) as a test species. We found that site area had the strongest effect on the magnitude of population estimates and underlying intensity surface and was driven by estimates of model intercepts. Selecting a site area that accounted for the individual movements of the species within an average home range led to population estimates that coincided with expert estimates. ISDMs that do not account for the individual movements of species will likely lead to less accurate estimates of species intensity (number of individuals per unit area) and thus overall population estimates. This bias could be severe and highly detrimental to conservation actions if uninformed ISDMs are used to estimate global populations of threatened and data-deficient species, particularly those that lack natural history and movement information. However, the ISDM was consistently the most accurate model compared to other approaches, which demonstrates the importance of this new modelling framework and the ability to combine opportunistic data with systematic survey data. Thus, we recommend researchers use ISDMs with conservative movement information when estimating population sizes of rare and data-deficient species. ISDMs could be improved by using a similar parameterization to spatial capture–recapture models that explicitly incorporate animal movement as a model parameter, which would further remove the need for spatial subsampling prior to implementation.
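To make the ISDM idea concrete, the sketch below fits a minimal integrated model on simulated data: presence-only records (treated as an inhomogeneous Poisson process over a grid) and repeat-visit detection/non-detection data share one log-linear intensity, and the assumed site area enters through ψ = 1 − exp(−λ × area), which is the mechanism behind the site-area sensitivity reported above. This is an illustrative toy parameterisation, not the model or software used in the study.

```python
# Minimal integrated SDM (ISDM) sketch on simulated data. Illustrative only;
# not the study's model, data, or parameterisation.
import numpy as np
from scipy.optimize import minimize
from scipy.special import expit

rng = np.random.default_rng(1)

# --- simulate a landscape and the two data types ---
n_cells, cell_area = 2000, 1.0                 # grid cells of unit area
x = rng.normal(size=n_cells)                   # one environmental covariate
beta_true = np.array([-3.0, 1.0])              # intercept, slope of log-intensity
lam = np.exp(beta_true[0] + beta_true[1] * x)  # individuals per unit area

po_counts = rng.poisson(lam * cell_area)       # presence-only records per cell
po_x = np.repeat(x, po_counts)                 # covariate at each PO record

n_sites, n_visits, site_area, p_true = 150, 3, 4.0, 0.5
site_x = rng.normal(size=n_sites)
psi_true = 1.0 - np.exp(-np.exp(beta_true[0] + beta_true[1] * site_x) * site_area)
occupied = rng.random(n_sites) < psi_true
y = rng.binomial(n_visits, p_true * occupied)  # detections per surveyed site

# --- joint negative log-likelihood ---
def nll(theta):
    b0, b1, logit_p = theta
    p = expit(logit_p)
    # presence-only part: log-intensity at records minus integrated intensity
    ll_po = np.sum(b0 + b1 * po_x) - np.sum(np.exp(b0 + b1 * x) * cell_area)
    # occupancy part: psi derived from the SAME intensity and the assumed site area
    psi = 1.0 - np.exp(-np.exp(b0 + b1 * site_x) * site_area)
    det = y > 0
    ll_occ = np.sum(np.log(psi[det]) + y[det] * np.log(p)
                    + (n_visits - y[det]) * np.log(1.0 - p))
    ll_occ += np.sum(np.log(psi[~det] * (1.0 - p) ** n_visits + 1.0 - psi[~det]))
    return -(ll_po + ll_occ)

fit = minimize(nll, x0=np.zeros(3), method="L-BFGS-B",
               bounds=[(-10, 5), (-5, 5), (-5, 5)])
b0_hat, b1_hat, p_hat = fit.x[0], fit.x[1], expit(fit.x[2])
print(f"estimated intercept={b0_hat:.2f}, slope={b1_hat:.2f}, detection p={p_hat:.2f}")
```

Changing `site_area` in this toy model shifts the fitted intercept and hence any derived abundance estimate, which mirrors the site-area sensitivity the abstract describes.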
OBJECTIVES/SPECIFIC AIMS:
To evaluate the ability of various techniques to track changes in body fluid volumes before and after a rapid infusion of saline.
METHODS/STUDY POPULATION:
Eight healthy participants (5 M; 3 F) completed baseline measurements of (1) total body water using ethanol dilution and bioelectrical impedance analysis (BIA) and (2) blood volume, plasma volume, and red blood cell (RBC) volume using the carbon monoxide rebreathing technique and I-131 albumin dilution. Subsequently, 30 mL saline/kg body weight was administered intravenously over 20 minutes, after which BIA and ethanol dilution were repeated.
RESULTS/ANTICIPATED RESULTS:
On average, 2.29±0.35 L saline was infused, with an average increase in net fluid input-output (I/O) of 1.56±0.29 L. BIA underestimated measured I/O by −3.4±7.9%, while ethanol dilution did not demonstrate a measurable change in total body water. Carbon monoxide rebreathing differed from I-131 albumin dilution measurements of blood, plasma, and RBC volumes by +0.6±2.8%, −5.4±3.6%, and +11.0±4.7%, respectively.
DISCUSSION/SIGNIFICANCE OF IMPACT:
BIA is capable of tracking modest changes in total body water. Carbon monoxide rebreathing appears to be a viable alternative to the I-131 albumin dilution technique for determining blood volume. Together, these two techniques may be useful in monitoring fluid status in patients with impaired fluid regulation.
Objective: Few studies have investigated the assessment and functional impact of egocentric and allocentric neglect among stroke patients. This pilot study aimed to determine (1) whether allocentric and egocentric neglect could be dissociated among a sample of stroke patients using eye tracking; (2) the specific patterns of attention associated with each subtype; and (3) the nature of the relationship between neglect subtype and functional outcome. Method: Twenty acute stroke patients were administered neuropsychological assessment batteries, a pencil-and-paper Apples Test to measure neglect subtype, and an adaptation of the Apples Test with an eye tracking measure. To test clinical discriminability, twenty age- and education-matched control participants were administered the eye tracking measure of neglect. Results: The eye tracking measure identified a greater number of individuals as having egocentric and/or allocentric neglect than the pencil-and-paper Apples Test. Classification of neglect subtype based on eye tracking performance was a significant predictor of functional outcome beyond that accounted for by the neuropsychological test performance and Apples Test neglect classification. Preliminary evidence suggests that patients with no neglect symptoms had superior functional outcomes compared with patients with neglect. Patients with combined egocentric and allocentric neglect had poorer functional outcomes than those with either subtype. Functional outcomes of patients with either allocentric or egocentric neglect did not differ significantly. The applications of our findings, to improve neglect detection, are discussed. Conclusion: Results highlight the potential clinical utility of eye tracking for the assessment and identification of neglect subtype among stroke patients to predict functional outcomes. (JINS, 2019, 25, 479–489)
Understanding factors that are associated with more adaptive death attitudes and competencies can inspire future health-promoting palliative care strategies and inform approaches to training and development for health professionals. The potential importance of meaning, purpose, quality, and values in life for promoting adaptive death attitudes has been highlighted, but there is limited research in this area, particularly in relation to death competence. The purpose of this cross-sectional study was to develop an understanding of demographic and life-related factors associated with perceived death competence, such as meaning in life and quality of life.
Method
During the course enrollment period of a Massive Open Online Course about death and dying, 277 participants completed questionnaires on death competence, meaning in life, quality of life, and sociodemographic background.
Result
Findings indicated that greater presence of meaning in life, quality of life, age, death experience, and carer experience were each statistically significant unique predictors of death competence scores. Life-related variables were more strongly associated with death competence than demographic variables. Bereavement experience and experience caring for the dying were associated with greater death competence, but there were no differences in death competence between health professionals and the general community. Above all other factors, the presence of meaning in life was the strongest predictor of higher perceived competence in coping with death.
Significance of results
The findings demonstrate important interconnections between our attitudes about life and death. Knowledge of factors associated with poorer death competence may help identify those at risk of greater distress when facing death and might be a useful addition to bereavement risk assessments. Understanding factors associated with greater death competence in health professionals and volunteers may help predict or prevent burnout and compassion fatigue, and help identify who would benefit from additional training and support. Future longitudinal studies including both health professionals and the general community are needed to determine the effect that adaptive attitudes toward meaning in life may have on bolstering subsequent adaptive coping and competence regarding death and dying.