Sperm are highly specialized cells, evolved to function as vehicles for the transport of the paternal genome to the oocyte. The sperm cell is characterized by a distinct head, mid-piece and tail, structured for a streamlined function. The sperm head consists of the haploid paternal genome (23 chromosomes), packed in a specific tight manner with the help of specialized proteins called protamines. The mid-piece consists of the centrosome and mitochondria, organelles that provide energy for sperm propulsion from the tail. The unique sperm structure, complemented with its motility, helps the sperm to swim through the male and female reproductive tract and penetrate the egg. Therefore, the primary function of the sperm is to successfully deliver the paternal genome to the oocyte.
This study aimed to investigate general factors associated with prognosis regardless of the type of treatment received, for adults with depression in primary care.
We searched Medline, Embase, PsycINFO and Cochrane Central (inception to 12/01/2020) for RCTs that included the most commonly used comprehensive measure of depressive and anxiety disorder symptoms and diagnoses in primary care depression RCTs (the Revised Clinical Interview Schedule: CIS-R). Two-stage random-effects meta-analyses were conducted.
Twelve (n = 6024) of thirteen eligible studies (n = 6175) provided individual patient data. There was a 31% (95% CI: 25 to 37) difference in depressive symptoms at 3–4 months per standard deviation increase in baseline depressive symptoms. Four additional factors (the duration of anxiety, the duration of depression, comorbid panic disorder, and a history of antidepressant treatment) were also independently associated with poorer prognosis. There was evidence that the difference in prognosis when these factors were combined could be of clinical importance. Adding these variables improved the amount of variance explained in 3–4 month depressive symptoms from 16% using depressive symptom severity alone to 27%. Risk of bias (assessed with QUIPS) was low in all studies and quality (assessed with GRADE) was high. Sensitivity analyses did not alter our conclusions.
When adults seek treatment for depression clinicians should routinely assess for the duration of anxiety, duration of depression, comorbid panic disorder, and a history of antidepressant treatment alongside depressive symptom severity. This could provide clinicians and patients with useful and desired information to elucidate prognosis and aid the clinical management of depression.
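The two-stage approach above fits a model within each study and then pools the study-level estimates with a random-effects model. A minimal sketch of the second (pooling) stage using the DerSimonian-Laird estimator; the effect estimates and variances below are invented purely for illustration, not values from the study:

```python
import math

def random_effects_pool(estimates, variances):
    """Pool per-study estimates with a DerSimonian-Laird random-effects model.

    estimates: study-level effect estimates (first-stage results).
    variances: corresponding within-study variances.
    Returns (pooled_estimate, standard_error, tau_squared).
    """
    k = len(estimates)
    w = [1.0 / v for v in variances]  # fixed-effect (inverse-variance) weights
    fixed = sum(wi * yi for wi, yi in zip(w, estimates)) / sum(w)
    # Cochran's Q and the DL estimate of between-study variance tau^2
    q = sum(wi * (yi - fixed) ** 2 for wi, yi in zip(w, estimates))
    c = sum(w) - sum(wi ** 2 for wi in w) / sum(w)
    tau2 = max(0.0, (q - (k - 1)) / c)
    # random-effects weights add tau^2 to each within-study variance
    w_re = [1.0 / (v + tau2) for v in variances]
    pooled = sum(wi * yi for wi, yi in zip(w_re, estimates)) / sum(w_re)
    se = math.sqrt(1.0 / sum(w_re))
    return pooled, se, tau2

# Hypothetical first-stage estimates from three studies (not study data)
est, se, tau2 = random_effects_pool([0.30, 0.35, 0.25], [0.004, 0.006, 0.005])
ci_low, ci_high = est - 1.96 * se, est + 1.96 * se
```

When the studies are homogeneous (Cochran's Q below its degrees of freedom), tau² is truncated at zero and the pooled estimate coincides with the fixed-effect one.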
In the study of electoral politics and political behavior in the developing world, India is often considered to be an exemplar of the centrality of contingency in distributive politics, the role of ethnicity in shaping political behavior, and the organizational weakness of political parties. Whereas these axioms have some empirical basis, the massive changes in political practices, the vast variation in political patterns, and the burgeoning literature on subnational dynamics in India mean that such generalizations are not tenable. In this article, we consider research on India that compels us to rethink the contention that India neatly fits the prevailing wisdom in the comparative politics literature. Our objective is to elucidate how the many nuanced insights about Indian politics can improve our understanding of electoral behavior both across and within other countries, allowing us to question core assumptions in theories of comparative politics.
In-patient treatment is a complex system of recursively interacting components. Patient characteristics interact with caregiver characteristics, home context and ward factors. Quality improvement requires primary focus on the interacting factors over which the ward itself potentially has influence. Ward practice has to integrate the demands of the hospital owner, the legal framework for treatment and what we know facilitates effective treatment plans. We describe how we have implemented a quality improvement system that addresses these interplaying influences in acute adolescent psychiatry in Norway. The process involved with this system (developed in the UK for child and adolescent psychiatric units) is independent of the organisational structure of the department and the alternative resources it has to rely on. It is independent of the characteristics of the patient population, although specific standards can be developed for local requirements.
Primatologists use data collected by GPS devices to answer a wide variety of scientific questions. GPS data on locations where individuals were recorded as present or absent can provide insight into primate genetic diversity, dispersal patterns, densities, and habitat suitability (e.g., Guschanski et al. 2009; Hickey et al. 2012; Junker et al. 2012; Kouakou et al. 2009). GPS data on locations of primates’ daily travel paths provide an even wider range of information. Knowing how locations change over time can inform us about disease transmission probabilities, the impact of seasonality in food availability, or differences in social organization (e.g., Lehmann & Boesch 2005; Olupot et al. 1997; Walsh et al. 2005). Calculations of travel distances reveal indices of energy expenditure (e.g., Steudel 2000), while calculations of travel speed provide information on vigilance behavior, levels of food competition, and anticipation of food finding (e.g., Janmaat et al. 2006; Noser & Byrne 2009; Pochron 2001). In addition, travel shape (e.g., linearity of or directional changes in the travel path) can help reveal cognitive abilities, such as spatio-temporal memory or planning skills (Milton 2000; Noser & Byrne 2007; Valero & Byrne 2007). Lastly, knowledge about directional changes improves our understanding of the importance of specific locations in the habitat, such as fruit trees (Asensio et al. 2011; Byrne et al. 2009). Within this large number of studies, very few reported that GPS devices make errors that can affect the scientific conclusions that are drawn. Even fewer studies investigated how we can limit or correct these errors. In this chapter, we therefore discuss the issues we encountered when using a handheld commercial GPS device (Garmin GPSMAP® 60CSx) to estimate travel locations of wild chimpanzees (Pan troglodytes verus) in a West African rain forest.
We present methods we used for testing the accuracy of the GPS device and provide primatologists with ideas on how to clean and smooth track data.
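One common cleaning step for handheld-GPS tracks is to drop fixes that imply biologically implausible travel speeds. A minimal Python sketch under assumed values: the 2.5 m/s speed ceiling and the coordinates below are hypothetical illustrations, not figures from the chapter:

```python
import math

def haversine_m(p, q):
    """Great-circle distance in metres between two (lat, lon) points."""
    r = 6371000.0  # mean Earth radius in metres
    lat1, lon1, lat2, lon2 = map(math.radians, (*p, *q))
    a = (math.sin((lat2 - lat1) / 2) ** 2
         + math.cos(lat1) * math.cos(lat2) * math.sin((lon2 - lon1) / 2) ** 2)
    return 2 * r * math.asin(math.sqrt(a))

def drop_speed_outliers(track, max_speed=2.5):
    """Drop fixes implying travel faster than max_speed (m/s).

    track: time-ordered list of (timestamp_seconds, lat, lon) tuples.
    Each fix is compared against the last fix that was kept.
    """
    cleaned = [track[0]]
    for t, lat, lon in track[1:]:
        t0, lat0, lon0 = cleaned[-1]
        dt = t - t0
        if dt > 0 and haversine_m((lat0, lon0), (lat, lon)) / dt <= max_speed:
            cleaned.append((t, lat, lon))
    return cleaned

# Hypothetical one-minute fixes; the third implies a ~5.5 km jump in 60 s
track = [
    (0,   5.8500, -7.3400),
    (60,  5.8501, -7.3401),
    (120, 5.9000, -7.3401),  # spurious fix: implied speed ~90 m/s
    (180, 5.8502, -7.3402),
]
cleaned = drop_speed_outliers(track)
```

Comparing each candidate against the last *kept* fix (rather than the previous raw fix) prevents a single spurious point from also invalidating the legitimate fix that follows it.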
Wilderness medicine is plagued by myths and dogmatic teachings not supported by evidence. This article focuses particularly on those teachings and tools that would be most likely used in archaeological fieldwork. It lays out 10 of the most common and concerning myths taught in wilderness medicine and wilderness emergency medical services, both in terms of first aid and preparation of medical kits. The myths described provide a structure for the main purpose of the article: to explain interventions and medical kit contents that are more evidence based and supported by modern understandings of wilderness medicine and fieldwork risk management. The list of top 10 myths includes (1) the use of medications other than epinephrine for anaphylaxis, (2) the availability and proper use of epinephrine auto-injectors, (3) the use of suction devices and tourniquets for snakebites, (4) the use of spinal immobilization for neck injuries, (5) the identification and treatment of heat illnesses, (6) the use of CPR in remote areas, (7) the appropriateness of dislocation reduction in remote areas, (8) the use and choice of tourniquets for arterial bleeding, (9) the initial definition and management of drowning patients, and (10) wound management myths.
This study aimed to provide a systematic review on survival outcome based on Pittsburgh T-staging for patients with primary external auditory canal squamous cell carcinoma.
This study was a systematic review in compliance with Preferred Reporting Items for Systematic Reviews and Meta-Analyses guidelines performed until January 2018; pertinent studies were screened. Quality of evidence was assessed using the Grading of Recommendations, Assessment, Development and Evaluation (GRADE) working group system.
Eight articles were chosen that reported on 437 patients with external auditory canal carcinoma. The 5-year overall survival rate was 53.0 per cent. The pooled proportion of survivors at 5 years for T1 tumours was 88.4 per cent and for T2 tumours was 88.6 per cent. For the combined population of T1 and T2 cancer patients, it was 84.5 per cent. For T3 and T4 tumours, it was 53.3 per cent and 26.8 per cent, respectively, whereas for T3 and T4 tumours combined, it was 40.4 per cent. Individual analysis of 61 patients with presence of cervical nodes showed a poor survival rate.
From this review, no significant difference in survival outcome was found between T1 and T2 tumours. A practical classification incorporating nodal status that accurately stratifies patients was proposed.
To describe the neuroimaging and other methods for assessing vascular contributions to neurodegeneration in the Comprehensive Assessment of Neurodegeneration and Dementia (COMPASS-ND) study, a Canadian multi-center, prospective longitudinal cohort study, including reliability and feasibility in the first 200 participants.
COMPASS-ND includes persons with Alzheimer’s disease (AD; n = 150), Parkinson’s disease (PD) and Lewy body dementias (LBDs) (200), mixed dementia (200), mild cognitive impairment (MCI; 400), subcortical ischemic vascular MCI (V-MCI; 200), subjective cognitive impairment (SCI; 300), and cognitively intact elderly controls (660). Magnetic resonance imaging (MRI) was acquired according to the validated Canadian Dementia Imaging Protocol and visually reviewed by either of two experienced readers blinded to clinical characteristics. Other relevant assessments include history of vascular disease and risk factors, blood pressure, height and weight, cholesterol, glucose, and hemoglobin A1c.
Analyzable data were obtained in 197/200 participants, of whom 18 were clinically diagnosed with V-MCI or mixed dementia. The overall prevalence of infarcts was 24.9%, of microbleeds 24.6%, and of high white matter hyperintensity (WMH) burden 31.0%. MRI evidence of a potential vascular contribution to neurodegeneration was seen in 12.9%–40.0% of participants clinically diagnosed with another condition such as AD. Inter-rater reliability was good to excellent.
COMPASS-ND will be a useful platform to study vascular brain injury and its association with risk factors, biomarkers, and cognitive and functional decline across multiple age-related neurodegenerative diseases. Initial findings show that MRI-defined vascular brain injury is common in all cognitive syndromes and is under-recognized clinically.
Most patients with World Federation of Neurological Surgeons (WFNS) grade 5 subarachnoid hemorrhage (SAH) have poor outcomes. Accurate assessment of prognosis is important for treatment decisions and conversations with families regarding goals of care. Unjustified pessimism may lead to “self-fulfilling prophecy,” where withdrawal of life-sustaining measures (WLSM) is invariably followed by death.
We performed a cohort study involving consecutive patients with WFNS grade 5 SAH to identify variables with ≥ 90% and ≥ 95% positive predictive value (PPV) for poor outcome (1-year modified Rankin Score ≥ 4), as well as findings predictive of WLSM.
Of 140 patients, 38 (27%) had favorable outcomes. Predictors with ≥ 95% PPV for poor outcome included unconfounded 72-hour Glasgow Coma Scale motor score ≤ 4, absence of ≥ 1 pupillary light reflex (PLR) at 24 hours, and intraventricular hemorrhage (IVH) score of ≥ 20 (volume ≥ 54.6 ml). Intracerebral hemorrhage (ICH) volume ≥ 53 ml had PPV of 92%. Variables associated with WLSM decisions included a poor motor score (p < 0.0001) and radiographic evidence of infarction (p = 0.02).
We identified several early predictors with high PPV for poor outcome. Of these, lack of improvement in motor score during the initial 72 hours had the greatest potential for confounding from “self-fulfilling prophecy.” Absence of PLR at 24 hours, IVH score ≥ 20, and ICH volume ≥ 53 ml predicted poor outcome without a statistically significant effect on WLSM decisions. More research is needed to validate prognostic variables in grade 5 SAH, especially among patients who do not undergo WLSM.
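A positive predictive value is simply the proportion of predictor-positive patients who had the outcome, usually reported with a confidence interval for the proportion. A minimal Python sketch with hypothetical counts, not the study's data:

```python
import math

def ppv(true_pos, false_pos):
    """PPV: share of predictor-positive patients with a poor outcome."""
    return true_pos / (true_pos + false_pos)

def wilson_interval(successes, n, z=1.96):
    """Wilson score 95% confidence interval for a proportion."""
    p = successes / n
    denom = 1 + z ** 2 / n
    centre = (p + z ** 2 / (2 * n)) / denom
    half = z * math.sqrt(p * (1 - p) / n + z ** 2 / (4 * n ** 2)) / denom
    return centre - half, centre + half

# Hypothetical: of 40 predictor-positive patients, 38 had a poor outcome
value = ppv(38, 2)
lo, hi = wilson_interval(38, 40)
```

The Wilson interval is preferable to the naive normal approximation here because PPVs near 100% with small denominators sit at the boundary of the parameter space, where the normal approximation misbehaves.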
Liben Lark Heteromirafra archeri is a ‘Critically Endangered’ species threatened by the loss and degradation of grassland at the Liben Plain, southern Ethiopia, one of only two known sites for the species. We use field data from nine visits between 2007 and 2019 and satellite imagery to quantify changes over time in the species’ abundance and in the extent and quality of its habitat. We estimate that the population fell from around 279 singing males (95% CL: 182–436) in 2007 to around 51 (14–144) in 2013, after which too few birds were recorded to estimate population size. Arable cultivation first appeared on the plain in the early 1990s and by 2019 more than a third of the plain had been converted to crops. Cultivation was initially confined to the fertile black soils but from 2008 began to spread into the less fertile red soils that cover most of the plain. Liben Larks strongly avoided areas with extensive bare ground or trees and bushes, but the extent of these did not change significantly over the survey period. A plausible explanation for the species’ decline is that grassland degradation, caused before 2007 by continuous high-pressure grazing by livestock, reduced its rates of reproduction or survival to a level that could not support its previous population. Since 2015, communal kalos (grazing exclosures) have been established to generate forage and other resources in the hope of also providing breeding habitat for Liben Larks. Grass height and density within four grassland kalos in 2018 greatly exceeded that in the surrounding grassland, indicating that the plain retains the potential to recover rapidly if appropriately managed. Improvement of grassland structure through the restitution of traditional and sustainable rangeland management regimes and the reversion of cereal agriculture to grassland are urgently needed to avert the species’ extinction.
The first demonstration of laser action in ruby was made in 1960 by T. H. Maiman of Hughes Research Laboratories, USA. Many laboratories worldwide began the search for lasers using different materials, operating at different wavelengths. In the UK, academia, industry and the central laboratories took up the challenge from the earliest days to develop these systems for a broad range of applications. This historical review looks at the contribution the UK has made to the advancement of the technology, the development of systems and components and their exploitation over the last 60 years.
Treatment with antipsychotics is associated with an increased risk of type 2 diabetes mellitus (T2D), and increased levels of inflammatory biomarkers are present in patients with T2D. We previously demonstrated that the glucagon-like peptide-1 receptor agonist liraglutide significantly reduced glucometabolic disturbances and body weight in prediabetic, overweight/obese schizophrenia-spectrum disorder patients treated with clozapine or olanzapine. This study aims to assess the involvement of cytokines in the therapeutic effects of liraglutide.
Serum concentrations of 10 cytokines (interferon-γ [IFN-γ], tumor necrosis factor-α, interleukin 1β [IL-1β], IL-2, IL-4, IL-6, IL-8, IL-10, IL-12p70, and IL-13) from fasting prediabetic and normal glucose-tolerant (NGT) patients with schizophrenia-spectrum disorders were measured using multiplexed immunoassays. Prediabetic patients were randomized to 16 weeks of treatment with liraglutide or placebo, and cytokines were measured again at the end of the treatment.
IFN-γ (1.98 vs 1.17 pg/ml, P = .001), IL-4 (0.02 vs 0.01 pg/ml, P < .001), and IL-6 (0.73 vs 0.46 pg/ml, P < .001) were significantly higher in prediabetic (n = 77) vs NGT patients (n = 31). No significant changes in cytokine levels following treatment with liraglutide (n = 37) vs placebo (n = 40) were found.
Prediabetic vs NGT patients with schizophrenia treated with clozapine or olanzapine had increased serum levels of several proinflammatory cytokines, further substantiating the link between inflammation and T2D. Treatment with liraglutide did not affect the investigated cytokines. Further testing of these findings in larger numbers of individuals is needed.
New guidelines for peanut allergy prevention in high-risk infants recommend introducing peanut during infancy but do not address breastfeeding or maternal peanut consumption. We assessed the independent and combined association of these factors with peanut sensitization in the general population CHILD birth cohort (N = 2759 mother–child dyads). Mothers reported peanut consumption during pregnancy, timing of first infant peanut consumption, and breastfeeding duration. Child peanut sensitization was determined by skin prick testing at 1, 3, and 5 years. Overall, 69% of mothers regularly consumed peanuts and 36% of infants were fed peanut in the first year (20% while breastfeeding and 16% after breastfeeding cessation). Infants who were introduced to peanut early (before 1 year) after breastfeeding cessation had a 66% reduced risk of sensitization at 5 years compared to those who were not (1.9% vs. 5.8% sensitization; aOR 0.34, 95% CI 0.14–0.68). This risk was further reduced if mothers introduced peanut early while breastfeeding and regularly consumed peanut themselves (0.3% sensitization; aOR 0.07, 0.01–0.25). In longitudinal analyses, these associations were driven by a higher odds of outgrowing early sensitization and a lower odds of late-onset sensitization. There was no apparent benefit (or harm) from maternal peanut consumption without breastfeeding. Taken together, these results suggest the combination of maternal peanut consumption and breastfeeding at the time of peanut introduction during infancy may help to decrease the risk of peanut sensitization. Mechanistic and clinical intervention studies are needed to confirm and understand this “triple exposure” hypothesis.
A person’s epistemic goals sometimes clash with pragmatic ones. At times, rational agents will degrade the quality of their epistemic process in order to satisfy a goal that is knowledge-independent (for example, to gain status or at least keep the peace with friends). This is particularly so when the epistemic quest concerns an abstract political or economic theory, where evidence is likely to be softer and open to interpretation. Before wide-scale adoption of the Internet, people sought out or stumbled upon evidence related to a proposition in a more random way. And it was difficult to aggregate the evidence of friends and other similar people to the exclusion of others, even if one had wanted to. Today, by contrast, the searchable Internet allows people to simultaneously pursue social and epistemic goals.
This essay shows that the selection effect caused by a merging of social and epistemic activities will cause both polarization in beliefs and devaluation of expert testimony. This will occur even if agents are rational Bayesians and have moderate credences before talking to their peers. What appears to be rampant dogmatism could just as well be explained by the nonrandom walk in evidence-gathering. This explanation better matches the empirical evidence on how people behave on social media platforms. It also helps clarify why media outlets (not just the Internet platforms) might have their own pragmatic reasons to compromise their epistemic goals in today’s competitive and polarized information market. Yet, it also makes policy intervention much more difficult, since we are unlikely to neatly separate individuals’ epistemic goals from their social ones.
Nutrition during the periconceptional period influences postnatal cardiovascular health. We determined whether in vitro embryo culture and transfer, which are manipulations of the nutritional environment during the periconceptional period, dysregulate postnatal blood pressure and blood pressure regulatory mechanisms. Embryos were either transferred to an intermediate recipient ewe (ET) or cultured in vitro in the absence (IVC) or presence of human serum (IVCHS) and a methyl donor (IVCHS+M) for 6 days. Basal blood pressure was recorded at 19–20 weeks after birth. Mean arterial pressure (MAP) and heart rate (HR) were measured before and after varying doses of phenylephrine (PE). mRNA expression of signaling molecules involved in blood pressure regulation was measured in the renal artery. Basal MAP did not differ between groups. Baroreflex sensitivity, set point, and upper plateau were also maintained in all groups after PE stimulation. mRNA expression of the adrenergic receptors alpha-1A (αAR1A) and alpha-1B (αAR1B) and the angiotensin II receptor type 1 (AT1R) in the renal artery did not differ from controls. These results suggest there is no programmed effect of ET or IVC on basal blood pressure or the baroreflex control mechanisms in adolescence, but future studies are required to determine the impact of ET and IVC on these mechanisms later in the life course when developmental programming effects may be unmasked by age.
Scientific interest in the therapeutic effects of classical psychedelics has increased in the past two decades. The psychological effects of these substances outside the period of acute intoxication have not been fully characterized. This study aimed to: (1) quantify the effects of psilocybin, ayahuasca, and lysergic acid diethylamide (LSD) on psychological outcomes in the post-acute period; (2) test moderators of these effects; and (3) evaluate adverse effects and risk of bias.
We conducted a systematic review and meta-analysis of experimental studies (single-group pre-post or randomized controlled trials) that involved administration of psilocybin, ayahuasca, or LSD to clinical or non-clinical samples and assessed psychological outcomes ⩾24 h post-administration. Effects were summarized by study design, timepoint, and outcome domain.
A total of 34 studies (24 unique samples, n = 549, mean longest follow-up = 55.34 weeks) were included. Classical psychedelics showed significant within-group pre-post and between-group placebo-controlled effects on a range of outcomes including targeted symptoms within psychiatric samples, negative and positive affect-related measures, social outcomes, and existential/spiritual outcomes, with large between-group effects in these domains (Hedges' gs = 0.84 to 1.08). Moderator tests suggest some effects may be larger in clinical samples. Evidence of effects on big five personality traits and mindfulness was weak. There was no evidence of post-acute adverse effects.
High risk of bias in several domains, heterogeneity across studies, and indications of publication bias for some models highlight the need for careful, large-scale, placebo-controlled randomized trials.
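The between-group effects above are expressed as Hedges' g, the standardized mean difference with a small-sample bias correction. A minimal sketch of its computation, with invented group summaries purely for illustration:

```python
import math

def hedges_g(mean1, mean2, sd1, sd2, n1, n2):
    """Hedges' g: Cohen's d with a small-sample bias correction."""
    # pooled standard deviation across the two groups
    sp = math.sqrt(((n1 - 1) * sd1 ** 2 + (n2 - 1) * sd2 ** 2)
                   / (n1 + n2 - 2))
    d = (mean1 - mean2) / sp
    # approximate correction factor J shrinks d toward zero for small samples
    j = 1 - 3 / (4 * (n1 + n2) - 9)
    return d * j

# Hypothetical group summaries (not taken from the review)
g = hedges_g(15.0, 10.0, 5.0, 5.0, 20, 20)
```

With 20 participants per arm the correction is small (about 2%), but for the pilot-sized samples typical of early psychedelic trials it meaningfully reduces the upward bias of Cohen's d.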
Physical health outcomes in severe mental illness are worse than in the general population. Routine physical health check completion in this group is poor.
To quantitatively and qualitatively evaluate the impact of point of care (POC) blood testing on physical health check completion in community mental health services.
In a prospective cohort design, we equipped an early intervention service (EIS) and a community mental health team (CMHT) with a POC blood testing device for 6 months. We compared rates of blood test and full physical health check completion in the intervention teams with a matched EIS and CMHT, historically and during the intervention. We explored attitudes to POC testing using thematic analysis of semi-structured interviews with patients and clinicians.
Although the CMHT scarcely used the POC device and saw no change in outcomes, direct comparison of testing rates in the intervention period showed increased physical health check completion in the EIS with the device (rate ratio RR = 5.18; 95% CI 2.54–12.44; P < 0.001) compared with usual care. The rate was consistent with the EIS's increasing rate of testing over time (RR = 0.45; 95% CI 0.09–2.08; P = 0.32). Similar trends were seen in blood test completion. POC testing was acceptable to patients but clinicians reported usability, provision and impact on the therapeutic relationship as barriers to uptake.
POC testing was beneficial and acceptable to patients and may increase physical health check uptake. Further research, accounting for clinician barriers, is needed to evaluate its clinical and cost-effectiveness.
Eighty percent of all patients suffering from major depressive disorder (MDD) relapse at least once in their lifetime. Thus, understanding the neurobiological underpinnings of the course of MDD is of utmost importance. A detrimental course of illness in MDD was most consistently associated with superior longitudinal fasciculus (SLF) fiber integrity. As similar associations were, however, found between SLF fiber integrity and acute symptomatology, this study attempts to disentangle associations attributed to current depression from long-term course of illness.
A total of 531 patients suffering from acute (N = 250) or remitted (N = 281) MDD from the FOR2107-cohort were analyzed in this cross-sectional study using tract-based spatial statistics for diffusion tensor imaging. First, the effects of disease state (acute v. remitted), current symptom severity (BDI-score) and course of illness (number of hospitalizations) on fractional anisotropy (FA), mean diffusivity (MD), radial diffusivity (RD), and axial diffusivity were analyzed separately. Second, disease state and BDI-scores were analyzed in conjunction with the number of hospitalizations to disentangle their effects.
Disease state (pFWE < 0.042) and number of hospitalizations (pFWE < 0.032) were associated with decreased FA and increased MD and RD in the bilateral SLF. A trend was found for the BDI-score (pFWE > 0.067). When analyzed simultaneously only the effect of course of illness remained significant (pFWE < 0.040) mapping to the right SLF.
Decreased FA and increased MD and RD values in the SLF are associated with more hospitalizations when controlling for current psychopathology. SLF fiber integrity could reflect cumulative illness burden at a neurobiological level and should be targeted in future longitudinal analyses.
Antarctica's ice shelves modulate the grounded ice flow, and weakening of ice shelves due to climate forcing will decrease their ‘buttressing’ effect, causing a response in the grounded ice. While the processes governing ice-shelf weakening are complex, uncertainties in the response of the grounded ice sheet are also difficult to assess. The Antarctic BUttressing Model Intercomparison Project (ABUMIP) compares ice-sheet model responses to decrease in buttressing by investigating the ‘end-member’ scenario of total and sustained loss of ice shelves. Although unrealistic, this scenario enables gauging the sensitivity of an ensemble of 15 ice-sheet models to a total loss of buttressing, hence exhibiting the full potential of marine ice-sheet instability. All models predict that this scenario leads to multi-metre (1–12 m) sea-level rise over 500 years from present day. West Antarctic ice sheet collapse alone leads to a 1.91–5.08 m sea-level rise due to the marine ice-sheet instability. Mass loss rates are a strong function of the sliding/friction law, with plastic laws causing a further destabilization of the Aurora and Wilkes Subglacial Basins, East Antarctica. Improvements to marine ice-sheet models have greatly reduced variability between modelled ice-sheet responses to extreme ice-shelf loss, e.g. compared to the SeaRISE assessments.