Loneliness is linked to many negative health outcomes and places strain on the economy and the National Health Service in the United Kingdom. To combat these issues, the determinants of loneliness need to be fully understood. Although friendships have been shown to be particularly important in relation to loneliness in older adults, this association has thus far not been explored more closely. Our exploratory study examines the relationship between number of friends and loneliness, depression, anxiety and stress in older adults. Data were obtained from 335 older adults via completion of an online survey. Measures included loneliness (UCLA Loneliness Scale version 3) and depression, anxiety and stress (Depression Anxiety Stress Scales, DASS-21). Participants also reported their number of close friends. Regression analyses revealed an inverse curvilinear relationship between number of friends and each of the measures tested. Breakpoint analyses demonstrated a threshold for the effect of number of friends on each of the measures (loneliness = 4, depression = 2, anxiety = 3, stress = 2). The results suggest that there is a limit to the benefit of increasing the number of friends in older adults for each of these measures. The elucidation of these optimal thresholds can inform the practice of those involved in loneliness interventions for older adults. These interventions can become more targeted, focusing on establishing four close friendships, increasing the emotional closeness of existing friendships, or concentrating resources on other determinants of loneliness in this population.
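The breakpoint analysis described above can be sketched as a grid search over candidate breakpoints for a two-segment (piecewise-linear) fit. This is only an illustrative reconstruction on synthetic data, not the study's actual analysis; all variable names and the simulated relationship are assumptions.

```python
import numpy as np

def fit_segmented(x, y, candidates):
    """Grid-search a breakpoint for a two-segment linear fit.

    For each candidate breakpoint c, fit y ~ 1 + x + max(x - c, 0)
    by least squares and keep the c with the lowest residual error.
    """
    best_sse, best_c = np.inf, None
    for c in candidates:
        X = np.column_stack([np.ones_like(x), x, np.maximum(x - c, 0.0)])
        beta, *_ = np.linalg.lstsq(X, y, rcond=None)
        sse = np.sum((y - X @ beta) ** 2)
        if sse < best_sse:
            best_sse, best_c = sse, c
    return best_c

# Synthetic example: loneliness declines up to 4 friends, then plateaus.
rng = np.random.default_rng(0)
friends = rng.integers(0, 11, size=300).astype(float)
loneliness = np.where(friends < 4, 60 - 5 * friends, 40) + rng.normal(0, 1, 300)

bp = fit_segmented(friends, loneliness, candidates=range(1, 10))
print(bp)  # recovers the simulated threshold of 4 friends
```

The same search, applied separately to each outcome, yields one threshold per measure, mirroring the per-measure breakpoints reported above.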
There is a developing body of research suggesting that distinct categories of patients may explain the relationship between psychosis and antisocial behaviours. Specifically, three pathways of offending, antisocial behaviour and psychosis have been described, and there is an evolving empirical evidence base suggesting that these pathways are aetiologically distinct. Firstly, there is a pathway for early-start offenders, who have been identified as those with psychosis preceded by Conduct Disorder (SZ+CD). Secondly, there is a group that starts to display antisocial behaviours in parallel with the onset of psychosis (SZ-AS). The third group involves those with a long history of a psychotic disorder and no history of antisocial behaviours, who will present to services following a first conviction for non-violent or violent crime (SZ). The authors hypothesise that each typology will utilise services differently throughout the clinical trajectory. This pilot study aimed to (i) examine the concurrent validity of the antisocial behaviour and psychosis typologies, and (ii) examine differences in the service utilisation patterns of patients between these groups.
The sample consisted of adult male patients admitted to low and medium secure forensic hospitals within the Northwest of England. A total of 90 patients were included.
A categorisation checklist was developed, and the typology of patients determined from data collected from electronic health records. Data were collected on patient demographics, psychiatric diagnosis, aetiological factors, and service utilisation. Two researchers reviewed the data and determined the typology. Statistical analysis aimed to assess the difference in aetiological variables between the typologies and examine the relationship with how each typology utilised services.
This study provided further evidence of distinguishing characteristics emphasising typology heterogeneity.
The SZ+CD group were more likely to have utilised mental health services before the age of 18 years (70%, p = 0.062) and to have used services preceding a diagnosis of psychosis (60%, p = 0.011). Following the onset of a psychotic disorder, the SZ-AS and SZ groups had a higher proportion who used general adult psychiatry services (p = 0.031), with the SZ+CD group coming into contact with forensic psychiatry and criminal justice services earlier and more frequently.
This study demonstrates that each typology has a different clinical trajectory through mental health services. This provides further empirical evidence for distinct clinical typologies and trajectories among individuals with psychosis and antisocial behaviour. Understanding more about how these typologies utilise services will enable clinicians to introduce interventions and develop effective management plans that address the distinct characteristics of each typology of offender with psychosis.
My immediate aim in this lecture is to contribute something to the apt characterization of our representation and knowledge of the specifically human life form, as I will put it—and, to some extent, of things ‘human’ more generally. In particular I want to argue against an exaggerated empiricism about such cognition. Meditation on these themes might be pursued as having a kind of interest of its own, an epistemological and in the end metaphysical interest, but my own purpose in the matter is practical-philosophical. I want to employ my theses to make room for a certain range of doctrines in ethical theory and the theory of practical rationality—doctrines, namely, of natural normativity or natural goodness, as we may call them. I am not proposing to attempt a positive argument for any such ‘neo-Aristotelian’ position, but merely to defend such views against certain familiar lines of objection; and even here my aims will be limited, as will be seen.
OBJECTIVES/GOALS: We assessed the relationship between C-peptide preservation and a serum exocrine pancreatic enzyme (trypsin) in a recently concluded clinical trial. We hypothesized that immunomodulatory treatment resulting in improved beta-cell function would be associated with improved trypsin levels in subjects with recent-onset type 1 diabetes (T1D). METHODS/STUDY POPULATION: In a three-arm, randomized, double-masked, placebo-controlled trial, ‘Antithymocyte Globulin (ATG) and pegylated granulocyte colony stimulating factor (GCSF) in New Onset Type 1 Diabetes’, 89 subjects with recent-onset T1D (duration <100 days) were enrolled and randomized to 3 groups: low-dose ATG (2.5 mg/kg IV) followed by pegylated GCSF (6 mg subcutaneously every 2 weeks for 6 doses), low-dose ATG alone, and placebo. We compared longitudinal serum levels of an exocrine enzyme (trypsin) in a subset of responders to therapy (defined as subjects with at least 60% of baseline area under the curve (AUC) C-peptide levels at 96 weeks, n=4) versus placebo ‘responders’ (n=2) and non-responders (n=25), and treated (n=19) versus placebo (n=12) subjects at baseline, 2 weeks, and 6 months after treatment. RESULTS/ANTICIPATED RESULTS: There was no observed difference in treated (n=20) versus placebo (n=12) longitudinal trends in trypsin levels when compared to baseline levels. However, responders to immunotherapy (n=4) had 6-month trypsin levels that were 114% of baseline, whereas placebo ‘responders’ (n=2), placebo subjects (n=10), and non-responders to immunotherapy (n=15) had trypsin levels that were 81-93% of baseline (unpaired t-test, p=0.05). Overall, we found that serum trypsin, a marker of exocrine pancreatic function, had a normal upward trend in new-onset T1D subjects who responded clinically to immunotherapy but declined in subjects who did not respond or who were not treated.
These results approached but did not reach statistical significance, likely due to the small sample size. DISCUSSION/SIGNIFICANCE: An improvement in trypsin, a marker of exocrine function, after response to immunotherapy in new-onset T1D may be due to a direct impact on exocrine function versus an indirect effect from improved beta-cell function. Future studies will be needed to confirm our findings in a larger sample and to evaluate the mechanism for improved exocrine function.
Adults who had non-edematous severe acute malnutrition (SAM) during infancy (i.e., marasmus) have worse glucose tolerance and beta-cell function than survivors of edematous SAM (i.e., kwashiorkor). We hypothesized that wasting and/or stunting in SAM is associated with lower glucose disposal rate (M) and insulin clearance (MCR) in adulthood.
We recruited 40 nondiabetic adult SAM survivors (20 marasmus survivors (MS) and 20 kwashiorkor survivors (KS)) and 13 matched community controls. We performed 150-minute hyperinsulinaemic, euglycaemic clamps to estimate M and MCR. We also measured serum adiponectin, anthropometry, and body composition. Data on wasting (weight-for-height) and stunting (height-for-age) were abstracted from the hospital records.
Children with marasmus had lower weight-for-height z-scores (WHZ) (−3.8 ± 0.9 vs. −2.2 ± 1.4; P < 0.001) and lower height-for-age z-scores (HAZ) (−4.6 ± 1.1 vs. −3.4 ± 1.5; P = 0.0092) than those with kwashiorkor. As adults, mean age (SD) of participants was 27.2 (8.1) years; BMI was 23.6 (5.0) kg/m². SAM survivors and controls had similar body composition. MS, KS and controls had similar M (9.1 ± 3.2, 8.7 ± 4.6 and 6.9 ± 2.5 mg·kg⁻¹·min⁻¹, respectively; P = 0.3) and MCR. WHZ and HAZ were not associated with M, MCR or adiponectin, even after adjusting for body composition.
Wasting and stunting during infancy are not associated with insulin sensitivity and insulin clearance in lean, young, adult survivors of SAM. These data are consistent with the finding that glucose intolerance in malnutrition survivors is mostly due to beta-cell dysfunction.
From 2014 to 2020, we compiled radiocarbon ages from the lower 48 states, creating a database of more than 100,000 archaeological, geological, and paleontological ages that will be freely available to researchers through the Canadian Archaeological Radiocarbon Database. Here, we discuss the process used to compile ages, general characteristics of the database, and lessons learned from this exercise in “big data” compilation.
To achieve the elimination of the hepatitis C virus (HCV), sustained and sufficient levels of HCV testing are critical. The purpose of this study was to assess trends in testing and evaluate the effectiveness of strategies to diagnose people living with HCV. Data were from 12 primary care clinics in Victoria, Australia, that provide targeted services to people who inject drugs (PWID) alongside general health care. This ecological study spanned 2009–2019 and included analyses of trends in annual numbers of HCV antibody tests among individuals with no previous positive HCV antibody test recorded and in annual test yield (positive HCV antibody tests/all HCV antibody tests). Generalised linear models estimated the association between count outcomes (HCV antibody tests and positive HCV antibody tests) and time, and a χ² test assessed the trend in test yield. A total of 44 889 HCV antibody tests were conducted during 2009–2019; test numbers increased by 6% annually on average [95% confidence interval (CI) 4–9]. Test yield declined from 2009 (21%) to 2019 (9%) (χ² P < 0.01). In more recent years (2013–2019), annual test yield remained relatively stable. Modest increases in HCV antibody testing and a stable but high test yield within clinics delivering services to PWID highlight that current testing strategies are resulting in people being diagnosed; however, further increases in the testing of people at risk of HCV or living with HCV may be needed to reach Australia's HCV elimination goals.
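The annual trend in test counts reported above can be illustrated with a simple log-linear fit, a stand-in for the generalised linear (Poisson-family) models used in the study. The counts below are invented for illustration and constructed to grow roughly 6% per year, matching the reported average increase.

```python
import numpy as np

# Hypothetical annual HCV antibody test counts, 2009-2019 (illustrative only).
years = np.arange(2009, 2020)
tests = np.round(3000 * 1.06 ** (years - 2009)).astype(int)

# Log-linear trend: log(count) = a + b*year, so exp(b) - 1 is the
# average annual proportional change in testing.
t = years - years.min()
b = np.polyfit(t, np.log(tests), 1)[0]
annual_change = np.exp(b) - 1
print(f"{annual_change:.1%}")  # prints 6.0%
```

A full Poisson GLM with a log link (e.g., via statsmodels) would additionally provide the confidence interval on the annual rate of change.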
Two introduced carnivores, the European red fox Vulpes vulpes and domestic cat Felis catus, have had extensive impacts on Australian biodiversity. In this study, we collate information on consumption of Australian birds by the fox, paralleling a recent study reporting on birds consumed by cats. We found records of consumption by foxes of 128 native bird species (18% of the non-vagrant bird fauna and 25% of those species within the fox’s range), a smaller tally than for cats (343 species, including 297 within the fox’s Australian range, a subset of that of the cat). Most (81%) bird species eaten by foxes are also eaten by cats, suggesting that predation impacts are compounded. As with consumption by cats, birds that nest or forage on the ground are most likely to be consumed by foxes. However, there is also some partitioning, with records of consumption by foxes but not cats for 25 bird species, indicating that impacts of the two predators may also be complementary. Bird species ≥3.4 kg were more likely to be eaten by foxes, and those <3.4 kg by cats. Our compilation provides an inventory and describes characteristics of Australian bird species known to be consumed by foxes, but we acknowledge that records of predation do not imply population-level impacts. Nonetheless, there is sufficient information from other studies to demonstrate that fox predation has significant impacts on the population viability of some Australian birds, especially larger birds and those that nest or forage on the ground.
The coronavirus disease 2019 (COVID-19) pandemic has significantly increased depression rates, particularly in emerging adults. The aim of this study was to examine longitudinal changes in depression risk before and during COVID-19 in a cohort of emerging adults in the U.S. and to determine whether prior drinking or sleep habits could predict the severity of depressive symptoms during the pandemic.
Participants were 525 emerging adults from the National Consortium on Alcohol and NeuroDevelopment in Adolescence (NCANDA), a five-site community sample including moderate-to-heavy drinkers. Poisson mixed-effect models evaluated changes in the Center for Epidemiological Studies Depression Scale (CES-D-10) from before to during COVID-19, also testing for sex and age interactions. Additional analyses examined whether alcohol use frequency or sleep duration measured in the last pre-COVID assessment predicted pandemic-related increase in depressive symptoms.
The prevalence of risk for clinical depression tripled due to a substantial and sustained increase in depressive symptoms during COVID-19 relative to pre-COVID years. Effects were strongest for younger women. Frequent alcohol use and short sleep duration during the closest pre-COVID visit predicted a greater increase in COVID-19 depressive symptoms.
The sharp increase in depression risk among emerging adults heralds a public health crisis with alarming implications for their social and emotional functioning as this generation matures. In addition to the heightened risk for younger women, the role of alcohol use and sleep behavior should be tracked through preventive care aiming to mitigate this looming mental health crisis.
The Chinese Communist Party's (CCP) ideology, rooted in its foundational struggles, explicitly denounces “bureaucratism” (guanliaozhuyi) as an intrinsic ailment of bureaucracy. Yet while the revolutionary Party has blasted bureaucratism, its revolutionary regime has had to find a way to coexist with bureaucracy, which is a requisite for effective governance. An anti-bureaucratic ghost thus dwells in the machinery of China's bureaucratic state. We analyse the CCP's anti-bureaucratism through two steps. First, we perform a historical analysis of the Party's anti-bureaucratic ideology, teasing out its substance and emphasizing its roots in and departures from European Marxism and Leninism. Second, we trace both the continuity and evolution in the Party's anti-bureaucratic rhetoric, taking an interactive approach that combines close reading with computational analysis of the entire corpus of the People's Daily (1947–2020). We find striking endurance as well as subtle shifts in the substance of the CCP's anti-bureaucratic ideology. We show that bureaucratism is an umbrella term that expresses the revolutionary Party's anxiety about losing its popular legitimacy. Yet the substance of the Party's concern evolved from commandism and revisionism under Mao, to corruption and formalism during reform. The Party's ongoing critiques of bureaucratism and formalism unfold in parallel fashion with its efforts to standardize, regularize and institutionalize the state.
The COVID-19 pandemic has disrupted lives and livelihoods, and people already experiencing mental ill health may have been especially vulnerable.
We aimed to quantify mental health inequalities in disruptions to healthcare, economic activity and housing.
We examined data from 59 482 participants in 12 UK longitudinal studies with data collected before and during the COVID-19 pandemic. Within each study, we estimated the association between psychological distress assessed pre-pandemic and disruptions since the start of the pandemic to healthcare (medication access, procedures or appointments), economic activity (employment, income or working hours) and housing (change of address or household composition). Estimates were pooled across studies.
Across the analysed data-sets, 28% to 77% of participants experienced at least one disruption, with 2.3–33.2% experiencing disruptions in two or more domains. We found that 1 s.d. higher pre-pandemic psychological distress was associated with (a) increased odds of any healthcare disruption (odds ratio (OR) 1.30, 95% CI 1.20–1.40), with fully adjusted ORs ranging from 1.24 (95% CI 1.09–1.41) for disruption to procedures to 1.33 (95% CI 1.20–1.49) for disruptions to prescriptions or medication access; (b) loss of employment (OR 1.13, 95% CI 1.06–1.21) and income (OR 1.12, 95% CI 1.06–1.19), and reductions in working hours/furlough (OR 1.05, 95% CI 1.00–1.09); and (c) increased likelihood of experiencing a disruption in at least two domains (OR 1.25, 95% CI 1.18–1.32) or in one domain (OR 1.11, 95% CI 1.07–1.16), relative to no disruption. There were no associations with housing disruptions (OR 1.00, 95% CI 0.97–1.03).
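Pooling estimates across studies, as described above, is commonly done with inverse-variance weighting of log odds ratios. The sketch below is a minimal fixed-effect version on invented per-study odds ratios (the study itself may have used random-effects pooling); all numbers are illustrative assumptions, not the study's data.

```python
import numpy as np

def pool_fixed_effect(ors, ci_los, ci_his):
    """Inverse-variance fixed-effect pooling of odds ratios.

    Standard errors are recovered from the 95% CIs on the log scale:
    se = (log(hi) - log(lo)) / (2 * 1.96).
    """
    log_or = np.log(ors)
    se = (np.log(ci_his) - np.log(ci_los)) / (2 * 1.96)
    w = 1.0 / se**2
    pooled = np.sum(w * log_or) / np.sum(w)
    pooled_se = np.sqrt(1.0 / np.sum(w))
    return (np.exp(pooled),
            np.exp(pooled - 1.96 * pooled_se),
            np.exp(pooled + 1.96 * pooled_se))

# Hypothetical per-study odds ratios for healthcare disruption.
or_pooled, lo, hi = pool_fixed_effect(
    np.array([1.25, 1.35, 1.28]),
    np.array([1.05, 1.10, 1.02]),
    np.array([1.49, 1.66, 1.61]),
)
print(round(or_pooled, 2))  # ≈ 1.29 for these inputs
```

More precise studies (narrower CIs) receive larger weights, so the pooled estimate is dominated by the best-powered cohorts.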
People experiencing psychological distress pre-pandemic were more likely to experience healthcare and economic disruptions, and clusters of disruptions across multiple domains during the pandemic. Failing to address these disruptions risks further widening mental health inequalities.
Herbicide resistance is an increasing issue in many weed species, including rigid ryegrass (Lolium rigidum Gaudin), a major weed of winter cropping systems in southern Australia. Recently, this weed has also been found in summer crops in the southeastern region of Australia. Effective control of this herbicide-resistant weed across southeastern Australia requires alternative management strategies. These strategies can be informed by analyses of the interaction of germinable seeds with their regional environments and by identifying the differences between populations of varying herbicide-resistance levels. In this study, we explore how various environmental factors differentially affect the seed germination and seedling emergence of three L. rigidum populations, including one glyphosate-resistant population (GR), one glyphosate-susceptible population (GS), and one population of unknown resistance status (CC04). Germination was greater than 90% for all populations at each temperature regime, except 15/5 C. Populations germinated at a lower rate under 15/5 C, ranging from 74% to 87% germination. Salt stress had a similar effect on the germination of all populations, with 0% germination occurring at 250 mM salt stress. Population GS had greater tolerance to osmotic stress, with 65% germination at −0.4 MPa compared with 47% and 43% germination for CC04 and GR, respectively; however, germination was inhibited at −0.8 and −1.6 MPa for all populations. All populations had lower germination when placed in complete darkness as opposed to alternating light/dark. Germination in darkness was lower for CC04 (69%) than GR (83%) and GS (83%). Seedling emergence declined with increasing burial depth, with the lowest emergence occurring at 8 cm (37%) when averaged over the populations. These results indicate that L. rigidum can survive under a range of environmental conditions and that the extent of survival differs among populations; however, there was no difference based on herbicide-resistance status.
Targeted drug development efforts in patients with CHD are needed to standardise care, improve outcomes, and limit adverse events in the post-operative period. To identify major gaps in knowledge that can be addressed by drug development efforts and provide a rationale for current clinical practice, this review evaluates the evidence behind the most common medication classes used in the post-operative care of children with CHD undergoing cardiac surgery with cardiopulmonary bypass.
We systematically searched PubMed and EMBASE from 2000 to 2019 using a controlled vocabulary and keywords related to diuretics, vasoactives, sedatives, analgesics, pulmonary vasodilators, coagulation system medications, antiarrhythmics, steroids, and other endocrine drugs. We included studies of drugs given post-operatively to children with CHD undergoing repair or palliation with cardiopulmonary bypass.
We identified a total of 127 studies with 51,573 total children across medication classes. Most studies were retrospective cohorts at single centres. There is significant age- and disease-related variability in drug disposition, efficacy, and safety.
In this review, we identified major gaps in knowledge for each medication class and highlighted areas for future research. Advances in data collection through electronic health records, novel trial methods, and collaboration can aid drug development efforts in standardising care, improving outcomes, and limiting adverse events in the post-operative period.
The peoples of southern Mesoamerica, including the Classic period Maya, are often claimed to exhibit a distinct type of spatial organization relative to contemporary urban systems. Here, we use the settlement scaling framework and properties of settlements recorded in systematic, full-coverage surveys to examine ways in which southern Mesoamerican settlement systems were both similar to and different from contemporary systems. We find that the population-area relationship in these settlements differs greatly from that reported for other agrarian settlement systems, but that more typical patterns emerge when one considers a site epicenter as the relevant social interaction area, and the population administered from a given center as the relevant interacting population. Our results imply that southern Mesoamerican populations mixed socially at a slower temporal rhythm than is typical of contemporary systems. Residential locations reflected the need to balance energetic and transport costs of farming with lower-frequency costs of commuting to central places. Nevertheless, increasing returns in activities such as civic construction were still realized through lower-frequency social mixing. These findings suggest that the primary difference between low-density urbanism and contemporary urban systems lies in the spatial and temporal rhythms of social mixing.
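The settlement scaling framework referenced above typically estimates a power-law exponent relating settlement population to interaction area, A ∝ N^β, via regression on log-log axes. The sketch below uses synthetic data generated with β = 5/6, a value commonly expected for agrarian settlements; the data and parameters are illustrative assumptions, not the surveyed Mesoamerican settlements.

```python
import numpy as np

# Hypothetical settlement populations and epicenter areas (illustrative),
# generated with area proportional to population^(5/6) plus lognormal noise.
rng = np.random.default_rng(1)
pop = rng.uniform(100, 10000, 50)
area = 0.5 * pop ** (5 / 6) * np.exp(rng.normal(0, 0.05, 50))

# Scaling exponent beta from an OLS fit on log-log axes.
beta = np.polyfit(np.log(pop), np.log(area), 1)[0]
print(round(beta, 2))  # close to the generating exponent 5/6
```

Exponents below 1 indicate economies of scale in space use: larger interacting populations occupy proportionally less area per person.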
The rocky shores of the north-east Atlantic have long been studied. Our focus is from Gibraltar to Norway, plus the Azores and Iceland. Phylogeographic processes shape biogeographic patterns of biodiversity. Long-term and broadscale studies have shown the responses of biota to past climate fluctuations and more recent anthropogenic climate change. Inter- and intra-specific species interactions along sharp local environmental gradients shape distributions and community structure and hence ecosystem functioning. Shifts from domination by fucoids in shelter to barnacles/mussels in exposure are mediated by grazing by patellid limpets. Further south, fucoids become increasingly rare, with species disappearing or restricted to estuarine refuges, caused by greater desiccation and grazing pressure. Mesoscale processes influence bottom-up nutrient forcing and larval supply, hence affecting species abundance and distribution, and can be proximate factors setting range edges (e.g., the English Channel, the Iberian Peninsula). Impacts of invasive non-native species are reviewed. Knowledge gaps, such as work on rockpools and host–parasite dynamics, are also outlined.
Accurate low-dimensional models for the dynamics of falling liquid films subject to localized or time-varying heating are essential for applications that involve patterning or control. However, existing modelling methodologies either fail to respect fundamental thermodynamic properties or else do not accurately capture the effects of advection and diffusion on the temperature profile. We argue that the best-performing long-wave models are those that give the surface temperature implicitly as the solution of an evolution equation in which the wall temperature alone (and none of its derivatives) appears as a source term. We show that, for both flat and non-uniform films, such a model can be rationally derived by expanding the temperature field about its free-surface values. We test this model in linear and nonlinear regimes, and show that its predictions are in remarkable quantitative agreement with full Navier–Stokes calculations regarding the surface temperature, the internal temperature field and the surface displacement that would result from temperature-induced Marangoni stresses.
The rate of failing to apply a tourniquet remains high.
The study objective was to examine whether early advanced training under conditions that approximate combat conditions and provide stress inoculation improve competency, compared to the current educational program of non-medical personnel.
This was a randomized controlled trial. Male recruits of the armored corps were included in the study. During Combat Lifesaver training, recruits apply the tourniquet 12 times. This educational program was used as the control group. The combat stress inoculation (CSI) group also included 12 tourniquet applications, albeit some of them under combat conditions such as low light and physical exertion. Three parameters, measured by the simulator, defined success: (1) applied pressure ≥200 mmHg; (2) time to stop bleeding ≤60 seconds; and (3) placement up to 7.5 cm above the amputation.
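The three success criteria above amount to a simple conjunction of thresholds, sketched below. The function and parameter names are illustrative, not from the study's simulator software.

```python
def tourniquet_success(pressure_mmHg, time_to_stop_s, placement_cm):
    """Success per the study's three simulator criteria:
    pressure >= 200 mmHg, bleeding stopped within 60 s,
    and placement no more than 7.5 cm above the amputation."""
    return (pressure_mmHg >= 200
            and time_to_stop_s <= 60
            and placement_cm <= 7.5)

print(tourniquet_success(220, 45, 5.0))   # True: all three criteria met
print(tourniquet_success(180, 45, 5.0))   # False: insufficient pressure
```

Because success requires all three criteria simultaneously, an application can fail on placement alone even with adequate pressure and timing, which is consistent with the high rate of placement errors reported in the results.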
Of the 305 participants, 138 were assigned to the control group and 167 to the CSI group. The overall failure rate was 80.33% (81.90% in the control group versus 79.00% in the CSI group; P value = .565; 95% confidence interval, 0.677 to 2.122). Differences in pressure, time to stop bleeding, or placement were not significant (95% confidence intervals, −17.283 to 23.404, −1.792 to 6.105, and 0.932 to 2.387, respectively). Tourniquet placement was incorrect in most of the applications (62.30%).
This study found high rates of failure in tourniquet application immediately after successful completion of tourniquet training. These rates did not improve with tourniquet training, including CSI. The results may indicate that better tourniquet training methods should be pursued.
Tsur AM, Binyamin Y, Koren L, Ohayon S, Thompson P, Glassberg E. High tourniquet failure rates among non-medical personnel do not improve with tourniquet training, including combat stress inoculation: a randomized controlled trial. Prehosp Disaster Med. 2019;34(3):282–287.