To determine how engagement of the hospital and/or vendor with performance improvement strategies, combined with an automated hand hygiene monitoring system (AHHMS), influences hand hygiene (HH) performance rates.
The study was conducted in 58 adult and pediatric inpatient units located in 10 hospitals.
Methods:
HH performance rates were estimated using an AHHMS. Rates were expressed as the number of soap and alcohol-based hand rub portions dispensed divided by the number of room entries and exits. Each hospital self-assigned to one of the following intervention groups: AHHMS alone (control group), AHHMS plus clinician-based vendor support (vendor-only group), AHHMS plus hospital-led unit-based initiatives (hospital-only group), or AHHMS plus clinician-based vendor support and hospital-led unit-based initiatives (vendor-plus-hospital group). Each hospital unit produced 1–2 months of baseline HH performance data immediately after AHHMS installation before implementing initiatives.
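As a minimal illustration of the rate definition above (not part of the study), the calculation can be sketched as follows; the unit counts are hypothetical.

```python
# Illustrative sketch only: HH performance rate as dispensing events divided by
# room entries plus exits, per the definition in the Methods above.
def hh_performance_rate(dispenses: int, entries: int, exits: int) -> float:
    """Return dispenses per HH opportunity, where opportunities = entries + exits."""
    opportunities = entries + exits
    if opportunities == 0:
        return 0.0
    return dispenses / opportunities

# Hypothetical unit-month of AHHMS data
print(hh_performance_rate(dispenses=4250, entries=3900, exits=3800))  # ~0.55
```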
Results:
Hospital units in the vendor-plus-hospital group had a statistically significant increase of at least 46% in HH performance compared with units in the other 3 groups (P ≤ .006). Units in the hospital-only group achieved a 1.3% increase in HH performance compared with units that had AHHMS alone (P = .950). Units with AHHMS plus other initiatives each had a larger change in HH performance rates over their baseline than those in the AHHMS-alone group (P < .001).
Conclusions:
AHHMS combined with clinician-based vendor support and hospital-led unit-based initiatives resulted in the greatest improvements in HH performance. These results illustrate the value of a collaborative partnership between the hospital and the AHHMS vendor.
One mechanism for airway closure in the lung is the surface-tension-driven instability of the mucus layer which lines the airway wall. We study the instability of an axisymmetric layer of viscoplastic Bingham liquid coating the interior of a rigid tube, which is a simple model for an airway that takes into account the yield stress of mucus. An evolution equation for the thickness of the liquid layer is derived using long-wave theory, from which we also derive a simpler thin-film evolution equation. In the thin-film case we show that two branches of marginally yielded static solutions of the evolution equation can be used to both predict the size of the initial perturbation required to trigger instability and quantify how increasing the capillary Bingham number (a parameter measuring yield stress relative to surface tension) reduces the final deformation of the layer. Using numerical solutions of the long-wave evolution equation, we quantify how the critical layer thickness required to form a liquid plug in the tube increases as the capillary Bingham number is increased. We discuss the significance of these findings for modelling airway closure in obstructive conditions such as cystic fibrosis, where the mucus layer is often thicker and has a higher yield stress.
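The abstract does not reproduce the evolution equation itself. As a rough, assumption-laden sketch of the kind of computation involved, the following time-steps a classical Newtonian thin-film equation for a layer coating the inside of a tube; the yield-stress (Bingham) modification derived in the paper is omitted, and all parameter values are arbitrary.

```python
import numpy as np

# Rough illustration only: explicit finite differences for a Newtonian thin-film
# equation,
#     dh/dt + d/dz[ (h**3 / 3) * d/dz( h + d2h/dz2 ) ] = 0,
# for a layer of thickness h(z, t) coating the inside of a tube. The Bingham
# model in the paper additionally limits the flux wherever the stress falls
# below the yield value; that effect is not included here.

L = 4 * np.pi                                # axial domain length (periodic)
n = 128                                      # grid points
dz = L / n
z = np.arange(n) * dz
h = 0.5 + 0.05 * np.cos(2 * np.pi * z / L)   # mean thickness plus a long-wave perturbation

def ddz(f):
    """Centred first derivative, periodic boundaries."""
    return (np.roll(f, -1) - np.roll(f, 1)) / (2 * dz)

def d2dz2(f):
    """Centred second derivative, periodic boundaries."""
    return (np.roll(f, -1) - 2 * f + np.roll(f, 1)) / dz**2

dt = 5e-5                       # explicit stepping of a fourth-order PDE needs a small step
for _ in range(200_000):        # integrate to t = 10
    curv = h + d2dz2(h)               # linearised interface curvature (azimuthal + axial)
    flux = (h**3 / 3) * ddz(curv)     # depth-integrated axial flux
    h = h - dt * ddz(flux)            # conservative update

print("film thickness range:", h.min(), h.max())
```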
Democratic cooperation is a particularly complex type of arrangement that requires attendant institutions to ensure that the problems inherent in collective action do not subvert the public good. It is perhaps due to this complexity that historians, political scientists, and others generally associate the birth of democracy with the emergence of so-called states and center it geographically in the “West,” where it then diffused to the rest of the world. We argue that the archaeological record of the American Southeast provides a case to examine the emergence of democratic institutions and to highlight the distinctive ways in which such long-lived institutions were—and continue to be—expressed by Native Americans. Our research at the Cold Springs site in northern Georgia, USA, provides important insight into the earliest documented council houses in the American Southeast. We present new radiocarbon dating of these structures along with dates for the associated early platform mounds that place their use as early as cal AD 500. This new dating makes the institution of the Muskogean council, whose active participants have always included both men and women, at least 1,500 years old, and therefore one of the most enduring and inclusive democratic institutions in world history.
From 2014 to 2020, we compiled radiocarbon ages from the lower 48 states, creating a database of more than 100,000 archaeological, geological, and paleontological ages that will be freely available to researchers through the Canadian Archaeological Radiocarbon Database. Here, we discuss the process used to compile ages, general characteristics of the database, and lessons learned from this exercise in “big data” compilation.
Two introduced carnivores, the European red fox Vulpes vulpes and domestic cat Felis catus, have had extensive impacts on Australian biodiversity. In this study, we collate information on consumption of Australian birds by the fox, paralleling a recent study reporting on birds consumed by cats. We found records of consumption by foxes of 128 native bird species (18% of the non-vagrant bird fauna and 25% of those species within the fox’s range), a smaller tally than for cats (343 species, including 297 within the fox’s Australian range, a subset of that of the cat). Most (81%) bird species eaten by foxes are also eaten by cats, suggesting that predation impacts are compounded. As with consumption by cats, birds that nest or forage on the ground are most likely to be consumed by foxes. However, there is also some partitioning, with records of consumption by foxes but not cats for 25 bird species, indicating that impacts of the two predators may also be complementary. Bird species ≥3.4 kg were more likely to be eaten by foxes, and those <3.4 kg by cats. Our compilation provides an inventory and describes characteristics of Australian bird species known to be consumed by foxes, but we acknowledge that records of predation do not imply population-level impacts. Nonetheless, there is sufficient information from other studies to demonstrate that fox predation has significant impacts on the population viability of some Australian birds, especially larger birds, and those that nest or forage on the ground.
The purpose of this study was to pilot safety and tolerability of a 1-week aerobic exercise program during the post-acute phase of concussion (14–25 days post-injury) by examining adherence, symptom response, and key functional outcomes (e.g., cognition, mood, sleep, postural stability, and neurocognitive performance) in young adults.
Method:
A randomized, non-blinded pilot clinical trial was performed to compare the effects of aerobic versus non-aerobic exercise (placebo) in concussion patients. The study enrolled three groups: 1) patients with concussion/mild traumatic brain injury (mTBI) randomized to an aerobic exercise intervention performed daily for 1 week, 2) patients with concussion/mTBI randomized to a non-aerobic (stretching and calisthenics) exercise program performed daily for 1 week, and 3) a non-injured, no-intervention reference group.
Results:
Mixed-model analysis of variance results indicated a significant decrease in symptom severity scores from pre- to post-intervention (mean difference = −7.44, 95% CI [−12.37, −2.20]) for both concussion groups. However, the pre- to post-change was not different between groups. Secondary outcomes all showed improvements by post-intervention, but no differences in trajectory between the groups. By three months post-injury, all outcomes in the concussion groups were within ranges of the non-injured reference group.
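For readers unfamiliar with the analysis named above, a hedged sketch of a comparable pre/post mixed model on simulated data is shown below; the variable names, group sizes, and effect sizes are assumptions, not the study's data or code.

```python
# Illustrative sketch only: a repeated-measures style mixed model of symptom
# severity with time (pre/post) and group (aerobic vs non-aerobic) effects and
# random intercepts per participant, fitted to simulated data.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
rows = []
for group in ("aerobic", "non_aerobic"):
    for subj in range(20):
        baseline = rng.normal(30, 8)                        # hypothetical pre-intervention severity
        for time, drop in (("pre", 0.0), ("post", 7.4)):    # ~7-point improvement in both groups
            rows.append({
                "subject": f"{group}_{subj}",
                "group": group,
                "time": time,
                "severity": baseline - drop + rng.normal(0, 4),
            })
df = pd.DataFrame(rows)

# The time:group interaction tests whether the pre-to-post change differs by group.
model = smf.mixedlm("severity ~ time * group", data=df, groups=df["subject"])
print(model.fit().summary())
```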
Conclusions:
Results from this study indicate that the feasibility and tolerability of administering aerobic exercise via stationary cycling in the post-acute period following concussion (14–25 days post-injury) are tentatively favorable. Aerobic exercise does not appear to negatively impact recovery trajectories of neurobehavioral outcomes; however, tolerability may be poorer for patients with high symptom burden.
We examined whether a preadmission history of depression is associated with fewer delirium/coma-free (DCF) days, worse 1-year depression severity, and worse cognitive impairment.
Design and measurements:
A health proxy reported history of depression. Separate models examined the effect of preadmission history of depression on: (a) intensive care unit (ICU) course, measured as DCF days; (b) depression symptom severity at 3 and 12 months, measured by the Beck Depression Inventory-II (BDI-II); and (c) cognitive performance at 3 and 12 months, measured by the Repeatable Battery for the Assessment of Neuropsychological Status (RBANS) global score.
Setting and participants:
Patients admitted to the medical/surgical ICU services were eligible.
Results:
Of 821 subjects eligible at enrollment, 261 (33%) had a preadmission history of depression. After adjusting for covariates, preadmission history of depression was not associated with fewer DCF days (OR 0.78; 95% CI, 0.59–1.03; p = 0.077). A prior history of depression was associated with higher BDI-II scores at 3 and 12 months (3 months: OR 2.15; 95% CI, 1.42–3.24; p < 0.001; 12 months: OR 1.89; 95% CI, 1.24–2.87; p = 0.003). We did not observe an association between preadmission history of depression and cognitive performance at either 3 or 12 months (3 months: beta coefficient −0.04; 95% CI, −2.70 to 2.62; p = 0.97; 12 months: beta coefficient 1.5; 95% CI, −1.26 to 4.26; p = 0.28).
Conclusion:
Patients with a depression history prior to ICU stay exhibit a greater severity of depressive symptoms in the year after hospitalization.
In this retrospective cohort study of patients presenting to a national direct-to-consumer medical practice, we found that provider geographic location is a stronger driver of antibiotic prescribing than patient location. Physicians in the Northeast and South are significantly more likely than physicians in the West to prescribe antibiotics for upper respiratory infection and bronchitis.
Healthcare personnel with severe acute respiratory syndrome coronavirus 2 (SARS-CoV-2) infection were interviewed to describe activities and practices in and outside the workplace. Among 2,625 healthcare personnel, workplace-related factors that may increase infection risk were more common among nursing-home personnel than hospital personnel, whereas selected factors outside the workplace were more common among hospital personnel.
The present study explored the influence of romantic love on the expression of several obsessive–compulsive disorder (OCD) characteristics, including symptom severity, symptom dimensions, age at onset, sensory phenomena (SP), and developmental course, as well as other related comorbid disorders. It was hypothesized that love-precipitated OCD would be associated with a set of distinct characteristics and exhibit greater rates of comorbid disorders.
Methods
The analyses were performed using a large sample (n = 981) of clinical patients with a primary diagnosis of OCD (67.3% female; mean age = 35.31 years).
Results
Love-precipitated OCD was associated with greater severity of SP and later age at onset of obsessions. However, symptom severity, symptom dimension, developmental course, and psychiatric comorbidities were not associated with love-precipitated OCD.
Conclusion
It was concluded that romantic love does shape the expression of OCD, especially with regard to SP and onset age. These findings encourage further exploration to determine the clinical significance of love-precipitated OCD as a phenotype.
Salt marshes are valuable but complex biophysical systems with associated ecosystems. This presents numerous challenges when trying to understand and predict their behaviour and evolution, which is essential to facilitate their continued and sustainable use, conservation and management [1]. Detailed understanding of the hydrodynamics, sediment dynamics, and ecology that control the system is required, as well as of their numerous interactions [2,3], but is complicated by spatial and temporal heterogeneity at a range of scales [4,5]. These complex interactions and feedbacks between the physical, biological, and chemical processes can be investigated in situ following natural, unintentional, or intentional manipulation [6], but the mechanistic basis of any observations is confounded by the presence of collinear variables. Hence, laboratory investigations can be beneficial, as they provide the opportunity for systematic testing of subsets of coastal processes, mechanisms, or conditions typical of salt marsh systems, in the absence of confounding variables. With appropriate scaling, this allows a better understanding of the overall function of salt marshes and better predictions of their evolution.
Limited information exists about the prevalence of psychiatric illness for Indigenous Australians. This study examines the prevalence of diagnosed psychiatric disorders in Indigenous Australians and compares this to non-Indigenous Australians. The aims were to: (1) determine prevalence rates for psychiatric diagnoses for Indigenous Australians admitted to hospital; and (2) examine whether the profile of psychiatric diagnoses for Indigenous Australians was different compared with non-Indigenous Australians.
Methods
A birth cohort design was adopted, with the population consisting of 45 141 individuals born in the Australian State of Queensland in 1990 (6.3% Indigenous). Linked administrative data from Queensland Health hospital admissions were used to identify psychiatric diagnoses from age 4/5 to 23/24 years. Crude lifetime prevalence rates of psychiatric diagnoses for Indigenous and non-Indigenous individuals were derived from the hospital admissions data. The cumulative incidence of psychiatric diagnoses was modelled separately for Indigenous and non-Indigenous individuals. Logistic regression was used to model differences between Indigenous and non-Indigenous psychiatric presentations while controlling for sociodemographic characteristics.
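A minimal sketch of the kind of covariate-adjusted logistic regression described above, fitted to simulated data; the column names and covariates are assumptions, not the study's variables.

```python
# Illustrative sketch only: logistic regression of any psychiatric diagnosis on
# Indigenous status, adjusting for hypothetical sociodemographic covariates.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(1)
n = 5000
df = pd.DataFrame({
    "indigenous": rng.binomial(1, 0.063, n),   # ~6.3% of the simulated cohort
    "male": rng.binomial(1, 0.5, n),
    "disadvantage": rng.normal(0, 1, n),       # hypothetical area-level index
})
# Simulated outcome: diagnosis risk rises with socioeconomic disadvantage
logit_p = -2.8 + 0.9 * df["disadvantage"] + 0.1 * df["male"]
df["diagnosed"] = rng.binomial(1, 1 / (1 + np.exp(-logit_p)))

model = smf.logit("diagnosed ~ indigenous + male + disadvantage", data=df).fit()
print(np.exp(model.params))   # odds ratios
```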
Results
There were 2783 (6.2%) individuals in the cohort with a diagnosed psychiatric disorder from a hospital admission. The prevalence of any psychiatric diagnosis at age 23/24 years was 17.2% (491) for Indigenous Australians compared with 5.4% (2292) for non-Indigenous Australians. Indigenous individuals were diagnosed earlier, with overrepresentation in psychiatric illness becoming more pronounced with age. Indigenous individuals were overrepresented in almost all categories of psychiatric disorder and this was most pronounced for substance use disorders (SUDs) (12.2 v. 2.6% of Indigenous and non-Indigenous individuals, respectively). Differences between Indigenous and non-Indigenous Australians in the likelihood of psychiatric disorders were not statistically significant after controlling for sociodemographic characteristics, except for SUDs.
Conclusions
There is significant inequality in psychiatric morbidity between Indigenous and non-Indigenous Australians across most forms of psychiatric illness that is evident from an early age and becomes more pronounced with age. SUDs are particularly prevalent, highlighting the importance of appropriate interventions to prevent and address these problems. Inequalities in mental health may be driven by socioeconomic disadvantage experienced by Indigenous individuals.
A violent event in the Democratic Republic of the Congo and the loss of a friend created a path for re-engaging with, and renewing love for, the field of applied theatre. In a singularly loveless world, theatre practitioners, performance scholars, and activists need to renew a sense of passion, joy, and commitment to their work.
Hernando de Soto's expedition through the southeastern United States between 1539 and 1543 is often regarded as a watershed moment for the collapse of Indigenous societies across the region. Historical narratives have proposed that extreme depopulation as a result of early contact destabilized Indigenous economies, politics, networks, and traditions. Although processes of depopulation and transformation were certainly set in motion by this and earlier colonial encounters, the timing, temporality, and heterogeneous rhythms of postcontact Indigenous histories remain unclear. Through the integration of radiocarbon and archaeological data from the Mississippian earthen platform mound at Dyar (9GE5) in central Georgia, we present a case of Indigenous endurance and resilience in the Oconee Valley that has long been obfuscated by materially based chronologies and typologies. Bayesian chronological modeling suggests that Indigenous Mississippian traditions persisted for up to 130 years beyond contact with European colonizers. We argue that advances in modeling radiocarbon dates, along with meaningful consultation/collaboration with descendant communities, can contribute to efforts that move us beyond a reliance on materially based chronologies that can distort and erase Indigenous histories.
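The paper's Bayesian chronological model cannot be reproduced from the abstract. As a hedged illustration of its basic building block, the sketch below calibrates a single radiocarbon determination against a calibration curve; the file name, column layout, and example determination are assumptions made for illustration.

```python
# Illustrative sketch only: calibrating one radiocarbon determination by
# evaluating its likelihood on a calendar-year grid from a calibration curve.
import numpy as np

# Assumed calibration-curve file with columns: cal yr BP, 14C age (yr BP), curve 1-sigma error
cal_bp, c14_age, c14_err = np.loadtxt(
    "intcal20.14c", delimiter=",", usecols=(0, 1, 2), unpack=True, comments="#"
)

def calibrate(det_age, det_err, cal_bp, c14_age, c14_err):
    """Posterior probability (flat prior) over calendar grid points for one determination."""
    var = det_err**2 + c14_err**2
    like = np.exp(-0.5 * (det_age - c14_age) ** 2 / var) / np.sqrt(var)
    return like / like.sum()

# Hypothetical determination: 310 +/- 20 14C yr BP
post = calibrate(310, 20, cal_bp, c14_age, c14_err)
print("posterior mode (cal yr BP):", cal_bp[np.argmax(post)])
```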
Cultures around the world are converging as populations become more connected. On the one hand this increased connectedness can promote the recombination of existing cultural practices to generate new ones, but on the other it may lead to the replacement of traditional practices and global WEIRDing. Here we examine the process and causes of changes in cultural traits concerning wild plant knowledge in Mbendjele BaYaka hunter–gatherers from Congo. Our results show that the BaYaka who were born in town reported knowing and using fewer plants than the BaYaka who were born in forest camps. Plant uses lost in the town-born BaYaka related to medicine. Unlike the forest-born participants, the town-born BaYaka preferred Western medicine over traditional practices, suggesting that the observed decline of plant knowledge and use is the result of replacement of cultural practices with the new products of cumulative culture.
Childhood maltreatment (CM) plays an important role in the development of major depressive disorder (MDD). The aim of this study was to examine whether CM severity and type are associated with MDD-related brain alterations, and how they interact with sex and age.
Methods
Within the ENIGMA-MDD network, severity and subtypes of CM were assessed using the Childhood Trauma Questionnaire, and structural magnetic resonance imaging data from patients with MDD and healthy controls were analyzed in a mega-analysis comprising a total of 3872 participants aged between 13 and 89 years. Cortical thickness and surface area were extracted at each site using FreeSurfer.
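A hedged sketch of a mega-analytic linear model of the kind described above (CM severity with age and sex interactions and a site term, predicting regional thickness), fitted to simulated data; the column names and simulated effects are assumptions, not the ENIGMA-MDD pipeline.

```python
# Illustrative sketch only: regional cortical thickness regressed on childhood
# maltreatment (CM) severity with age and sex interactions and a site covariate.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(2)
n = 1000
df = pd.DataFrame({
    "cm_severity": rng.integers(25, 126, n),        # CTQ total score range (25-125)
    "age": rng.uniform(13, 89, n),
    "male": rng.binomial(1, 0.4, n),
    "site": rng.integers(0, 10, n).astype(str),
})
# Hypothetical regional thickness (mm) with small CM and age effects
df["thickness"] = (2.6 - 0.003 * df["cm_severity"] - 0.004 * df["age"]
                   + rng.normal(0, 0.1, n))

model = smf.ols("thickness ~ cm_severity * age + cm_severity * male + C(site)", data=df).fit()
print(model.params.filter(like="cm_severity"))      # main effect and interactions
```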
Results
CM severity was associated with reduced cortical thickness in the banks of the superior temporal sulcus and supramarginal gyrus as well as with reduced surface area of the middle temporal lobe. Participants reporting both childhood neglect and abuse had a lower cortical thickness in the inferior parietal lobe, middle temporal lobe, and precuneus compared to participants not exposed to CM. In males only, regardless of diagnosis, CM severity was associated with higher cortical thickness of the rostral anterior cingulate cortex. Finally, a significant interaction between CM and age in predicting thickness was seen across several prefrontal, temporal, and temporo-parietal regions.
Conclusions
Severity and type of CM may impact cortical thickness and surface area. Importantly, CM may influence age-dependent brain maturation, particularly in regions related to the default mode network, perception, and theory of mind.
Obesity remains a major public health concern, and intermittent fasting is a popular weight-loss strategy that may also present independent health benefits. However, the number of diet books advising how fasting can be incorporated into our daily lives is several orders of magnitude greater than the number of trials examining whether fasting should be encouraged at all. This review will consider the state of current understanding regarding various forms of intermittent fasting (e.g. 5:2, time-restricted feeding and alternate-day fasting). The efficacy of these temporally defined approaches appears broadly equivalent to that of standard daily energy restriction, although many of these models of intermittent fasting do not involve fed-fasted cycles every other 24 h sleep–wake cycle and/or permit some limited energy intake outside of prescribed feeding times. Accordingly, the intervention period may not regularly alternate, may not span all or even most of any given day, and may not even involve absolute fasting. This is important because potentially advantageous physiological mechanisms may only be initiated if a post-absorptive state is sustained by uninterrupted fasting for a more prolonged duration than applied in many trials. Indeed, promising effects on fat mass and insulin sensitivity have been reported when fasting duration is routinely extended beyond sixteen consecutive hours. Further progress will require such models to be tested with appropriate controls to isolate whether any possible health effects of intermittent fasting are primarily attributable to regularly protracted post-absorptive periods, or simply to the net negative energy balance indirectly elicited by any form of dietary restriction.
The objective of this research was to evaluate producers’ perspectives of four key precision agriculture technologies (variable rate fertilizer application, precision soil sampling, guidance and autosteer, and yield monitoring) in terms of the benefits they provide to their farms (increased yield, reduced production costs, and increased convenience) using a best-worst scaling choice experiment. Results indicate that farmers’ perceptions of the benefits derived from various precision agriculture technologies are heterogeneous. To better understand farmers’ adoption decisions, or lack thereof, it is important to first understand their perceptions of the benefits precision agriculture technologies provide.
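As a hedged illustration of best-worst scaling scoring (not the study's analysis), the sketch below computes simple best-minus-worst counts for hypothetical choice data.

```python
# Illustrative sketch only: count-based best-worst scaling scores, i.e. the
# number of times an item was chosen "best" minus the number of times it was
# chosen "worst". Item names and choices are hypothetical.
from collections import Counter

items = ["increased yield", "reduced production costs", "increased convenience"]
# Each tuple: (item chosen best, item chosen worst) in one choice task
choices = [
    ("increased yield", "increased convenience"),
    ("reduced production costs", "increased convenience"),
    ("increased yield", "reduced production costs"),
]

best = Counter(b for b, _ in choices)
worst = Counter(w for _, w in choices)
for item in items:
    score = best[item] - worst[item]
    print(f"{item}: best={best[item]} worst={worst[item]} B-W score={score}")
```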