Human behavioral ecology (HBE) applies the principles of evolutionary theory and optimization to the study of human behavioral and cultural diversity. Among other things, HBE attempts to explain variation in behavior as adaptive solutions to the competing life-history demands of growth, development, reproduction, parental care, and mate acquisition. This book is a comprehensive introduction to the theoretical orientation and specific findings of HBE, consolidating the insights of evolution and human behavior into a single volume that reflects the current state and future of the field. It brings together leading scholars from across the evolutionary social sciences to provide a thought-provoking review of the topic. Throughout, the authors explain the latest developments in theory and highlight critical debates in the literature, while also engaging readers with the ethnographic insights and field-based studies that remain at the core of human behavioral ecology.
Pediatric patients transferred by Emergency Medical Services (EMS) from urgent care (UC) and office-based physician practices to the emergency department (ED) following activation of the 9-1-1 EMS system are an under-studied population, and literature on outcomes for these children is scarce. The objectives of this study were to describe this population, explore EMS level-of-care transport decisions, and examine ED outcomes.
This was a retrospective review of patients zero to <15 years of age transported by EMS from UC and office-based physician practices to the ED of two pediatric receiving centers from January 2017 through December 2019. Variables included reason for transfer, level of transport, EMS interventions and medications, ED medications/labs/imaging ordered in the first hour, ED procedures, ED disposition, and demographics. Data were analyzed with descriptive statistics, the χ² test, point biserial correlation, two-sample z test, Mann-Whitney U test, and two-way ANOVA.
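For orientation, the sketch below shows how analyses of this shape are commonly run in Python with scipy; the counts and variables are hypothetical stand-ins, not the study's data.

```python
# Illustrative only: hypothetical counts and simulated variables in place
# of the study's dataset, which is not reproduced in the abstract.
import numpy as np
from scipy import stats

# Chi-square test of association, e.g. level of transport (ALS/BLS)
# versus ED disposition, as a 2x2 contingency table of counts.
table = np.array([[120, 262],   # ALS: admitted, discharged (made-up)
                  [15,  53]])   # BLS: admitted, discharged (made-up)
chi2, p, dof, expected = stats.chi2_contingency(table)

# Point-biserial correlation, e.g. a binary ALS/BLS flag vs. patient age.
rng = np.random.default_rng(0)
als_flag = rng.integers(0, 2, 100)     # 1 = ALS, 0 = BLS
age_years = rng.uniform(0, 15, 100)
r_pb, p_pb = stats.pointbiserialr(als_flag, age_years)

# Mann-Whitney U test, e.g. comparing ages between ALS and BLS runs.
u_stat, p_u = stats.mannwhitneyu(age_years[als_flag == 1],
                                 age_years[als_flag == 0])
print(p, p_pb, p_u)
```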
A total of 450 EMS transports were included in this study: 382 Advanced Life Support (ALS) runs and 68 Basic Life Support (BLS) runs. The median patient age was 2.66 years, 60.9% were male, and 60.7% had private insurance. Overall, 48.9% of patients were transported from an office-based physician practice and 25.1% were transported from UC. Almost one-half (48.7%) of ALS patients received an EMS intervention or medication, as did 4.41% of BLS patients. Respiratory distress was the most common reason for transport (46.9%). Supplemental oxygen was the most common EMS intervention and albuterol was the most administered EMS medication. There was no significant association between level of transport and ED disposition (P = .23). The inpatient admission rate for transported patients was significantly higher than the general ED admission rate (P <.001).
This study demonstrates that pediatric patients transferred via EMS after activation of the 9-1-1 system from UC and medical offices are more acutely ill than the general pediatric ED population and are likely sicker than the general pediatric EMS population. Paramedics appear to be making appropriate level-of-care transport decisions.
Although offspring of women exposed to childhood trauma exhibit elevated rates of psychopathology, many children demonstrate resilience to these intergenerational impacts. Among the variety of factors that likely contribute to resilience, epigenetic processes have been suggested to play an important role. The current study used a prospective design to test the novel hypothesis that offspring epigenetic aging – a measure of methylation differences that are associated with infant health outcomes – moderates the relationship between maternal exposure to childhood adversity and offspring symptomatology. Maternal childhood adversity was self-reported during pregnancy via the Adverse Childhood Experiences (ACEs) survey and the Childhood Trauma Questionnaire (CTQ), which assessed total childhood trauma as well as maltreatment subtypes (i.e., emotional, physical, and sexual abuse). Offspring blood samples were collected at or shortly after birth and assayed on a DNA methylation microarray, and offspring symptomatology was assessed with the Child Behavior Checklist (CBCL/1.5–5) when offspring were 2–4 years old. Results indicated that maternal childhood trauma, particularly sexual abuse, was predictive of offspring symptoms (ps = 0.003–0.03). However, the associations between maternal sexual abuse and offspring symptomatology were significantly attenuated in offspring with accelerated epigenetic aging. These findings further our understanding of how epigenetic processes may contribute to and attenuate the intergenerational link between stress and psychopathology.
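The moderation hypothesis described here is typically tested as an interaction term in a regression model. A minimal sketch with simulated data and hypothetical variable names (the study's actual covariates and scoring are not reproduced):

```python
# Illustrative sketch of a moderation (interaction) analysis; all data
# are simulated and column names are hypothetical.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
n = 200
df = pd.DataFrame({
    "maternal_sexual_abuse": rng.normal(size=n),  # e.g. a CTQ subscale
    "epi_age_acceleration": rng.normal(size=n),   # epigenetic age measure
    "cbcl_symptoms": rng.normal(size=n),          # offspring CBCL score
})

# Moderation appears as the product term: a significant coefficient on
# maternal_sexual_abuse:epi_age_acceleration would indicate the
# trauma-symptom association changes with epigenetic aging.
model = smf.ols(
    "cbcl_symptoms ~ maternal_sexual_abuse * epi_age_acceleration",
    data=df,
).fit()
print(model.summary())
```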
Globally, burns are responsible for around 11 million injuries and 180 000 burn-related deaths yearly. Unfortunately, 9 of 10 burn injuries and deaths happen in low- and middle-income countries (LMICs) such as Pakistan. One in three people admitted to hospitals with burn injuries dies within three weeks, and survivors face serious lifelong physical, emotional and psychosocial problems. This may result in anxiety, depression, post-traumatic stress disorder, increased mortality and social disintegration. This study aims to evaluate whether implementation of a culturally adapted multidisciplinary rehabilitation programme for burn survivors is clinically and cost-effective, sustainable and scalable across Pakistan.
- To understand the lived experiences of burn survivors, families, and other stakeholders, including the experience of care and the impact of burns
- To work together with key stakeholders (such as burn survivors and family members) to adapt a culturally appropriate, affordable burn rehabilitation programme
- To undertake social media campaigns to promote burn prevention and risk assessment in communities, workplaces/industries/households; improve first aid; and address burn-related stigma
- To work with policy makers/parliamentarians to develop national guidelines for burns care and prevention in Pakistan
There are six work-packages (WPs). WP1 will co-adapt a culturally appropriate burn care and rehabilitation programme. WP2 will develop and implement a national burn registry based on WHO's initiative. WP3 is a cluster randomised controlled trial to determine the programme's clinical and cost-effectiveness in Pakistan. WP4 will evaluate social media campaigns to promote burn prevention and reduce stigma. WP5 involves working with key stakeholders on burns-related care and policy, and WP6 builds sustainable capacity and capability for burns treatment and rehabilitation.
A clinically and cost-effective burn care and rehabilitation programme has huge potential to save lives and deliver health and socio-economic benefits for patients, families, and the healthcare system in Pakistan. The nation-wide implementation and the involvement of burn centres across all provinces offer an excellent opportunity to overcome the problems of burn care access experienced in LMICs.
To date, burns prevention, care and rehabilitation have not received sufficient attention in policy initiatives in Pakistan and other LMICs. This study is an excellent opportunity to evaluate culturally adapted burn care and rehabilitation programmes that can be implemented across LMICs. We will disseminate our findings widely, using a variety of approaches, supported by our stakeholder and patient advisory groups.
Airway management is a vital component of administering anesthesia, allowing for the exchange of gases between the patient and the surrounding atmosphere. Difficult or unsuccessful management of the airway is a significant source of anesthesia-related morbidity and mortality. As such, it is important for anesthesia providers to be adept at all aspects of managing the airway. A thorough understanding of the pertinent anatomy and physiology, the ability to use clinical evaluation to identify potential difficulties, and a mastery of interventional techniques and procedures are crucial to safe and effective airway management. This chapter presents a comprehensive overview of the elements related to effective airway management.
We investigated a decrease in antibiotic prescribing for respiratory illnesses in 2 academic urgent-care clinics during the coronavirus disease 2019 (COVID-19) pandemic using semistructured clinician interviews.
We conducted a quality-improvement project from November 2020 to May 2021. We investigated provider antibiotic decision making using a mixed-methods explanatory design including interviews. We analyzed transcripts using a thematic framework approach to identify emergent themes. Our performance measure was antibiotic prescribing rate (APR) for encounters with respiratory diagnosis billing codes. We extracted billing and prescribing data from the electronic medical record and assessed differences using run charts, p charts and generalized linear regression.
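To make the performance measure and modeling concrete, here is a sketch of an APR calculation and a relative-risk estimate via a log-link binomial GLM; the encounter data and variable names are hypothetical, not the project's records.

```python
# Illustrative sketch: simulated encounter-level data standing in for
# billing and prescribing records extracted from the EMR.
import numpy as np
import pandas as pd
import statsmodels.api as sm
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
n = 1000
df = pd.DataFrame({
    "antibiotic": rng.integers(0, 2, n),  # 1 = antibiotic prescribed
    "pandemic": rng.integers(0, 2, n),    # 1 = pandemic-period encounter
})

# Antibiotic prescribing rate (APR) per period: prescriptions divided by
# respiratory-diagnosis encounters.
apr = df.groupby("pandemic")["antibiotic"].mean()

# A binomial GLM with a log link yields the relative risk (RR) of
# prescribing during vs. before the pandemic.
rr_model = smf.glm(
    "antibiotic ~ pandemic", data=df,
    family=sm.families.Binomial(link=sm.families.links.Log()),
).fit()
print(apr)
print("RR:", np.exp(rr_model.params["pandemic"]))
```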
We observed significant reductions in the APR early during the COVID-19 pandemic (relative risk [RR], 0.20; 95% confidence interval [CI], 0.17–0.25), which were maintained over the study period (P < .001). The average APRs were 14% before the COVID-19 pandemic, 4% during the QI project, and 7% after the project. All providers prescribed fewer antibiotics for respiratory encounters during COVID-19, but only 25% felt their practice had changed. Themes from provider interviews included changing patient expectations and provider approach to respiratory encounters during COVID-19, the impact of increased telemedicine encounters, and the changing epidemiology of non–COVID-19 respiratory infections.
Our findings suggest that the decrease in APR was likely multifactorial. The average APR decreased significantly during the pandemic. Although the APR was slightly higher after the QI project, it did not reach prepandemic levels. Future studies should explore how these factors, including changing patient expectations, can be leveraged to improve urgent-care antibiotic stewardship.
Global farmed finfish production increased from 9 to 56 million tonnes between 1990 and 2019. Although finfishes are now widely recognised as sentient beings, production is still being quantified as biomass rather than number of individuals (in contrast to farmed mammals and birds). Here, we estimate the global number of farmed finfishes slaughtered using FAO aquaculture production tonnages (1990–2019 data) and estimates of individual weight at killing (determined from internet searches at species and country level where possible). We relate these numbers to knowledge on humane slaughter, animal welfare law, and certification schemes. Since 1990, farmed finfish numbers killed annually for food have increased nine-fold, to 124 billion (1.24 × 1011, range 78–171 billion) in 2019. This figure does not represent the total number farmed (due to mortalities during rearing and non-food production) and is expected to increase as aquaculture expands. Our estimates indicate that farmed finfishes now outnumber the 80 billion farmed birds and mammals killed globally each year for food. The majority are produced in Asia. Inhumane slaughter practices cause suffering for most farmed finfishes. Most, 70–72%, have no legal welfare protection, and less than 1% have any fish-specific legal protection, at slaughter. The main global certification schemes in 2013–2015 accounted for 2% of slaughtered farmed finfishes. Fishes for which species-specific parameters for automated humane stunning are published comprise 20–24%. As the dominant taxa of farmed vertebrates, finfishes would benefit from better welfare if species-specific humane slaughter was defined and incorporated into laws and certification schemes.
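The estimation approach reduces to simple arithmetic: production tonnage divided by estimated mean individual weight at killing, summed across species. A worked sketch with made-up figures (not the paper's species-level estimates):

```python
# Illustrative only: tonnages and mean weights below are invented to
# show the calculation, not the study's species-level inputs.
species_data = {
    # species: (production in tonnes, mean weight at killing in kg)
    "Atlantic salmon": (2_600_000, 4.5),
    "Nile tilapia":    (4_500_000, 0.6),
    "Common carp":     (4_200_000, 1.5),
}

total = 0
for species, (tonnes, mean_kg) in species_data.items():
    individuals = tonnes * 1000 / mean_kg  # tonnes -> kg, then divide
    total += individuals
    print(f"{species}: ~{individuals / 1e9:.2f} billion individuals")

print(f"Total: ~{total / 1e9:.2f} billion individuals")
```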
To analyze the frequency and rates of community respiratory virus infections detected in patients at the National Institutes of Health Clinical Center (NIHCC) between January 2015 and March 2021, comparing the trends before and during the coronavirus disease 2019 (COVID-19) pandemic.
We conducted a retrospective study comparing frequency and rates of community respiratory viruses detected in NIHCC patients between January 2015 and March 2021. Test results from nasopharyngeal swabs and washes, bronchoalveolar lavages, and bronchial washes were included in this study. Results from viral-challenge studies and repeated positives were excluded. A quantitative data analysis was completed using cross tabulations. Comparisons were performed using mixed models, applying the Dunnett correction for multiplicity.
The frequency of all respiratory pathogens declined from an annual range of 0.88%–1.97% between January 2015 and March 2020 to 0.29% between April 2020 and March 2021. Individual viral pathogens declined sharply in frequency during the same period, with no cases of influenza A/B or parainfluenza and 1 case of respiratory syncytial virus (RSV). Rhino/enterovirus detection continued, but with a substantially lower frequency of 4.27% between April 2020 and March 2021, compared with an annual range of 8.65%–18.28% between January 2015 and March 2020.
The decrease in viral respiratory infections detected in NIHCC patients during the pandemic was likely due to the layered COVID-19 prevention and mitigation measures implemented in the community and the hospital. Hospitals should consider continuing the use of nonpharmaceutical interventions in the future to prevent nosocomial transmission of respiratory viruses during times of high community viral load.
The National Institutes of Health launched the NIH Centers for Accelerated Innovation and the Research Evaluation and Commercialization Hubs programs to develop approaches and strategies to promote academic entrepreneurship and translate research discoveries into products and tools to help patients. The two programs collectively funded 11 sites at individual research institutions or consortia of institutions around the United States. Sites provided funding, project management, and coaching to funded investigators and commercialization education programs open to their research communities.
We implemented an evaluation program that included longitudinal tracking of funded technology development projects and commercialization outcomes; interviews with site teams, funded investigators, and relevant institutional and innovation-ecosystem stakeholders; and analysis and review of administrative data.
As of May 2021, interim results for 366 funded projects show that the technologies have received nearly $1.7 billion in follow-on funding to date. There were 88 start-ups formed, a 40% success rate for Small Business Innovation Research/Small Business Technology Transfer (SBIR/STTR) applications, and 17 licenses with small and large businesses. Twelve technologies are currently in clinical testing and three are on the market.
Best practices used by the sites included leadership teams using milestone-based project management; external advisory boards that evaluated funding applications for commercial as well as scientific merit; sustained engagement with the academic community to shift attitudes about commercialization; application processes synced with education programs; and the provision of project managers with private-sector product development expertise to coach funded investigators.
To summarize existing literature on the mental health impact of the Flint Water Crisis.
In March 2020, we searched 5 databases for literature exploring the psychological consequences of the crisis. Main findings were extracted.
A total of 132 citations were screened, and 11 were included in the review. Results suggest a negative psychological effect caused by the water crisis, including anxiety and health worries, exacerbated by lowered trust in public health officials, uncertainty about the long-term impacts of the crisis, financial hardships, stigma, and difficulties seeking help. There was evidence that concerns about tap water continued even after the state of emergency was lifted.
Given the possible compounding effect of the recent COVID-19 pandemic on residents of Flint, the results highlight the need for more resources for psychological health interventions in Flint, as well as a need for local governments and health authorities to regain the trust of those affected by the Flint Water Crisis.
To assess the relationship between food insecurity, sleep quality, and days with mental and physical health issues among college students.
An online survey was administered. Food insecurity was assessed using the ten-item Adult Food Security Survey Module. Sleep was measured using the nineteen-item Pittsburgh Sleep Quality Index (PSQI). Mental health and physical health were measured using three items from the Healthy Days Core Module. Multivariate logistic regression was conducted to assess the relationship between food insecurity, sleep quality, and days with poor mental and physical health.
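A minimal sketch of this kind of model, with simulated data and hypothetical variable names; the study's actual covariate adjustments are not reproduced. Adjusted odds ratios are the exponentiated logistic regression coefficients.

```python
# Illustrative sketch only: simulated survey responses, simplified to a
# single binary predictor and outcome.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
n = 500
df = pd.DataFrame({
    "food_insecure": rng.integers(0, 2, n),  # USDA module classification
    "poor_sleep": rng.integers(0, 2, n),     # e.g. PSQI global score > 5
})

fit = smf.logit("poor_sleep ~ food_insecure", data=df).fit(disp=False)
print(np.exp(fit.params))      # odds ratios
print(np.exp(fit.conf_int()))  # 95% confidence intervals
```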
Twenty-two higher education institutions.
College students (n 17 686) enrolled at one of twenty-two participating universities.
Compared with food-secure students, those classified as food insecure (43·4 %) had higher PSQI scores indicating poorer sleep quality (P < 0·0001) and reported more days with poor mental (P < 0·0001) and physical (P < 0·0001) health as well as days when mental and physical health prevented them from completing daily activities (P < 0·0001). Food-insecure students had higher adjusted odds of having poor sleep quality (adjusted OR (AOR): 1·13; 95 % CI 1·12, 1·14), days with poor physical health (AOR: 1·01; 95 % CI 1·01, 1·02), days with poor mental health (AOR: 1·03; 95 % CI 1·02, 1·03) and days when poor mental or physical health prevented them from completing daily activities (AOR: 1·03; 95 % CI 1·02, 1·04).
College students report high rates of food insecurity, which is associated with poorer sleep quality and worse mental and physical health. Multi-level policy changes and campus wellness programmes are needed to prevent food insecurity and improve student health-related outcomes.
Psychosocial stress in childhood and adolescence is linked to stress system dysregulation, although few studies have examined the relative impacts of parental harshness and parental disengagement. This study prospectively tested whether parental harshness and disengagement show differential associations with overall cortisol output in adolescence. Associations between overall cortisol output and adolescent mental health problems were tested concurrently. Adolescents from the Fragile Families and Child Wellbeing Study (FFCWS) provided hair samples for cortisol assay at 15 years (N = 171). Caregivers reported on parental harshness and disengagement experiences at 1, 3, 5, 9, and 15 years, and adolescents reported at 15 years. Both parent and adolescent reported depressive and anxiety symptoms and antisocial behaviors at 15. Greater parental harshness from 1–15 years, and harshness reported at 15 years in particular, was associated with higher overall cortisol output at 15. Greater parental disengagement from 1–15 years, and disengagement at 1 year specifically, was associated with lower cortisol output. There were no significant associations between cortisol output and depressive symptoms, anxiety symptoms, or antisocial behaviors. These results suggest that the unique variances of parental harshness and disengagement may have opposing associations with cortisol output at 15 years, with unclear implications for adolescent mental health.
Background: When control mechanisms such as water temperature and biocide level are insufficient, Legionella, the causative bacteria of Legionnaires’ disease, can proliferate in water distribution systems in buildings. Guidance and oversight bodies are increasingly prioritizing water safety programs in healthcare facilities to limit Legionella growth. However, ensuring optimal implementation in large buildings is challenging. Much is unknown, and sometimes assumed, about whether building and campus characteristics influence Legionella growth. We used an extensive real-world environmental Legionella data set in the Veterans Health Administration (VHA) healthcare system to examine infrastructure characteristics and Legionella positivity.

Methods: VHA medical facilities across the country perform quarterly potable water sampling of healthcare buildings for Legionella detection as part of a comprehensive water safety program. Results are reported to a standardized national database. We did an exploratory univariate analysis of facility-reported Legionella data from routine potable water samples taken in 2015 to 2018, in conjunction with infrastructure characteristics available in a separate national data set. This review examined the following characteristics: building height (number of floors), building age (reported construction year), and campus acreage.

Results: The final data set included 201,936 water samples from 819 buildings. Buildings with 1–5 floors (n = 634) had a Legionella positivity rate of 5.3%, 6–10 floors (n = 104) had a rate of 6.4%, 11–15 floors (n = 36) had a rate of 8.1%, and 16–22 floors (n = 9) had a rate of 8.8%. All rates were significantly different from each other except 11–15 floors and 16–22 floors (P < .05, χ2). The oldest buildings (1800s) had significantly lower (P < .05, χ2) Legionella positivity than those built between 1900 and 1939 and between 1940 and 1979, but they were no different than the newest buildings (Fig. 1). In newer buildings (1980–2019), all decades had buildings with Legionella positivity (Fig. 1 inset). Campus acreage varied from ~3 acres to almost 500 acres. Although significant differences were found in Legionella positivity for different campus sizes, there was no clear trend, and campus acreage may not be a suitable proxy for the extent or complexity of water systems feeding buildings.

Conclusions: The analysis of this large, real-world data set supports the assumption that taller buildings are more likely to be associated with Legionella detection, perhaps as a result of more extensive piping. In contrast, the assumption that newer buildings are less associated with Legionella was not fully supported. These results demonstrate the variability in Legionella positivity in buildings, and they provide evidence that can inform implementation of water safety programs.
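For illustration, pairwise comparisons of positivity rates like those reported above can be run as a chi-square test on a 2x2 table. The sample-size split below is hypothetical, chosen only to match the published positivity rates:

```python
# Illustrative only: per-stratum sample counts are not given in the
# abstract, so these are invented to match ~5.3% and ~8.8% positivity.
import numpy as np
from scipy.stats import chi2_contingency

table = np.array([[530, 9470],  # 1-5 floors: positives, negatives
                  [88,   912]]) # 16-22 floors: positives, negatives
chi2, p, dof, _ = chi2_contingency(table)
print(f"chi2 = {chi2:.2f}, p = {p:.4f}")
```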
Disclosures: Chetan Jinadatha. Principal Investigator/Co-I, research: NIH/NINR, AHRQ, NSF. Principal Investigator, research: Xenex Healthcare Services; funds provided to institution. Inventor: methods for organizing the disinfection of one or more items contaminated with biological agents; owner: Department of Veterans Affairs; licensed to Xenex Disinfection System, San Antonio, TX.
OBJECTIVES/GOALS: The detection of liver fibrotic changes at an early and reversible stage is essential to prevent progression to end-stage cirrhosis and hepatocellular carcinoma. Liver biopsy, the current gold standard for fibrosis assessment, carries several complications due to its invasive nature, in addition to sampling errors and reader variability. In this study, we evaluate the use of quantitative parameters extracted from hybrid ultrasound and photoacoustic imaging to detect and monitor fibrotic changes in a DEN rat model.

METHODS/STUDY POPULATION: Liver fibrotic changes were induced in 34 Wistar male rats by oral administration of diethylnitrosamine (DEN) for 12 weeks. Twenty-two rats were imaged with B-mode ultrasound at three time points (baseline, 10 weeks, and 13 weeks) to monitor liver texture changes. Texture features studied included tissue echointensity (liver brightness normalized to kidney brightness) and tissue heterogeneity. Twelve rats were imaged with photoacoustic imaging at four time points (baseline, 5 weeks, 10 weeks, and 13 weeks) to examine changes in tissue oxygenation. Hemoglobin oxygen saturation (sO2A) and hemoglobin concentration (HbT) in the right and left lobes of the liver were measured. Eight rats were used as controls. Liver tissue samples were obtained 13 weeks after DEN start for METAVIR histopathology staging of fibrosis.

RESULTS/ANTICIPATED RESULTS: The texture features studied increased with time in DEN rats. Normalized echointensity increased from 0.28 ± 0.06 at baseline to 0.46 ± 0.10 at 10 weeks (p < 0.0005) and 0.53 ± 0.15 at 13 weeks (p < 0.0005). In the control rats, echointensity remained at an average of 0.25 ± 0.05 (p = 0.31). Tissue heterogeneity increased over time in the DEN-exposed rats from a baseline of 208.7 ± 58.3 to 344.6 ± 52.9 at 10 weeks (p < 0.0005) and 376.8 ± 54.9 at 13 weeks (p = 0.06), whereas it stayed constant at 225.7 ± 37.6 in control rats (p = 0.58). Quantitative analysis of the photoacoustic signals showed that blood oxygen saturation significantly increased with time. At 5 weeks, sO2AvT increased by 53.83% (± 0.25) and HbT by 35.31% (± 0.07); following 10 weeks of DEN, sO2AvT increased by 92.04% (± 0.29) and HbT by 55.24% (± 0.1). All increases were significant (p < 0.05). In the 13th week the values of these parameters were lower than those in the 10th week, but the decrease was not statistically significant.

DISCUSSION/SIGNIFICANCE OF IMPACT: Quantitative features from B-mode ultrasound and photoacoustic imaging consistently increased over time as hepatic damage, inflammation, and fibrosis progressed. The use of this hybrid imaging method in clinical practice could help meet the significant need for noninvasive assessment of liver fibrosis.
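A minimal sketch of the two B-mode texture features as described, assuming liver and kidney regions of interest have already been segmented; the arrays are synthetic, and the study's exact heterogeneity definition may differ.

```python
# Illustrative sketch with synthetic grayscale intensities standing in
# for segmented B-mode regions of interest.
import numpy as np

rng = np.random.default_rng(0)
liver_roi = rng.uniform(0, 255, size=(64, 64))
kidney_roi = rng.uniform(0, 255, size=(64, 64))

# Echointensity: mean liver brightness normalized to kidney brightness.
echointensity = liver_roi.mean() / kidney_roi.mean()

# Heterogeneity: variance of liver intensities is one simple proxy for
# texture heterogeneity (assumed here, not the study's stated formula).
heterogeneity = liver_roi.var()
print(echointensity, heterogeneity)
```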
A scanning attachment has been designed for the Corinth electron microscope enabling either standard transmission images of thin specimens (TEM) or reflection scanning images of bulk specimens (SEM) to be obtained from the one instrument.
At the heart of the modified instrument is a lens which acts both as the final projector lens for the TEM and as the objective lens for the SEM, with a working distance of 5 mm. Thus even when used as an SEM the whole length of the Corinth column is traversed by the beam, and this presents the choice of two possible modes of operation: a two-lens mode in which only the first condenser and final projector are excited, or a three-lens mode in which the TEM objective lens is also excited. The three-lens mode provides a range of much higher Gaussian demagnifications.
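As a generic electron-optics illustration (assumed relations, not measured parameters of the Corinth instrument), the advantage of the three-lens mode follows from how successive demagnifications compound:

```latex
% Generic probe-forming relations, assumed for illustration; M_i and
% d_source are not values quoted for this instrument.
d_{\mathrm{probe}} \approx \frac{d_{\mathrm{source}}}{M_1 M_2}
  \quad \text{(two-lens mode)}, \qquad
d_{\mathrm{probe}} \approx \frac{d_{\mathrm{source}}}{M_1 M_2 M_3}
  \quad \text{(three-lens mode)}
```

Exciting the TEM objective contributes the extra factor M3, which is why the three-lens mode reaches much smaller Gaussian probe sizes.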
The illumination system of the conventional transmission electron microscope is designed to provide an adequate level of coherent illumination over the field of view being examined. The additional requirement of providing the maximum possible current into very small (sub-micron) focussed spots, as is required by X-ray microanalysis for example, necessitates a probe-forming lens with a very short working distance; this requirement has previously been met by using either a mini-lens or the prefield of the objective lens when used in the Single-Field Condenser-Objective (SFCO) mode.
The SFCO gives the smallest spot size and aberration parameters but suffers from the disadvantage that the illumination and imaging fields are not independently controllable.