To analyse the results of treatment for nasolabial cysts according to whether an intraoral sublabial or endoscopic transnasal approach was used, and to determine the recent surgical trend in our hospital.
Twenty-four patients with a histopathologically and radiologically confirmed nasolabial cyst, diagnosed between January 2010 and December 2017, were enrolled in this study.
Nasolabial cysts were predominant in females (91.7 per cent) and on the left side (54.2 per cent). Treatment involved an intraoral sublabial approach in 12 cases (48.0 per cent) and a transnasal endoscopic approach in 13 cases (52.0 per cent). In 13 cases (52.0 per cent) surgery was performed under local anaesthesia, while in 12 cases (48.0 per cent) it was conducted under general anaesthesia. The most common post-operative complication was numbness of the upper lip or teeth (n = 9, 36.0 per cent). Only one patient (4.0 per cent), who underwent a transnasal endoscopic approach, experienced a recurrence.
Surgical resection through an intraoral sublabial or transnasal endoscopic approach is the best treatment for a nasolabial cyst, showing very good results and a low recurrence rate. The recent surgical trend in our hospital is to treat nasolabial cysts using a transnasal endoscopic approach under local anaesthesia.
This study aimed to determine the knowledge of first year health sciences students at a South African university regarding hearing loss and symptoms attributable to personal listening devices and their practices concerning the use of personal listening devices.
This was a cross-sectional study carried out using an anonymous self-administered questionnaire.
Of 336 students, 269 (80.1 per cent) completed the questionnaire. While most participants could identify symptoms that could be caused by extensive use of personal listening devices, almost 30 per cent did not know that it could cause permanent hearing loss. Personal listening devices were used by 90.7 per cent of participants, with 77.8 per cent having used them for more than five years. Use was at a high volume in 14.9 per cent of participants and for more than 2 hours per day in 52.7 per cent.
The findings indicate the need for an educational programme to inform students as to safe listening practices when using personal listening devices.
Presenteeism, or working while ill, by healthcare personnel (HCP) experiencing influenza-like illness (ILI) puts patients and coworkers at risk. However, hospital policies and practices may not consistently facilitate HCP staying home when ill.
Objective and methods:
We conducted a mixed-methods survey in March 2018 of Emerging Infections Network infectious diseases physicians, describing institutional experiences with and policies for HCP working with ILI.
Of 715 physicians, 367 (51%) responded. Of 367, 135 (37%) were unaware of institutional policies. Of the remaining 232 respondents, 206 (89%) reported institutional policies regarding work restrictions for HCP with influenza or ILI, but only 145 (63%) said these were communicated at least annually. More than half of respondents (124, 53%) reported that adherence to work restrictions was not monitored or enforced. Work restrictions were most often not perceived to be enforced for physicians-in-training and attending physicians. Nearly all (223, 96%) reported that their facility tracked laboratory-confirmed influenza (LCI) in patients; 85 (37%) reported tracking ILI. For employees, 109 (47%) reported tracking of LCI and 53 (23%) reported tracking ILI. For independent physicians, not employed by the facility, 30 (13%) reported tracking LCI and 11 (5%) ILI.
More than one-third of respondents were unaware of whether their institutions had policies to prevent HCP with ILI from working; among those with knowledge of institutional policies, dissemination, monitoring, and enforcement of these policies was highly variable. Improving communication about work-restriction policies, as well as monitoring and enforcement, may help prevent the spread of infections from HCP to patients.
Abnormal effort-based decision-making represents a potential mechanism underlying motivational deficits (amotivation) in psychotic disorders. Previous research identified effort allocation impairment in chronic schizophrenia and focused mostly on physical effort modality. No study has investigated cognitive effort allocation in first-episode psychosis (FEP).
Cognitive effort allocation was examined in 40 FEP patients and 44 demographically matched healthy controls using the Cognitive Effort-Discounting (COGED) paradigm, which quantified participants’ willingness to expend cognitive effort in terms of explicit, continuous discounting of monetary rewards based on parametrically varied cognitive demands (levels N of the N-back task). The relationship between reward-discounting and amotivation was investigated. Group differences in reward-magnitude and effort-cost sensitivity, and differential associations of these sensitivity indices with amotivation, were explored.
Patients displayed significantly greater reward-discounting than controls. In particular, such discounting was most pronounced in patients with high levels of amotivation, even when N-back performance and reward base amount were taken into consideration. Moreover, patients exhibited reduced reward-benefit and effort-cost sensitivity relative to controls, and decreased sensitivity to reward-benefit, but not effort-cost, was correlated with diminished motivation. Reward-discounting and sensitivity indices were generally unrelated to other symptom dimensions, antipsychotic dose and cognitive deficits.
This study provides the first evidence of cognitive effort-based decision-making impairment in FEP, and indicates that decreased effort expenditure is associated with amotivation. Our findings further suggest that abnormal effort allocation and amotivation might primarily be related to blunted reward valuation. Prospective research is required to clarify the utility of effort-based measures in predicting amotivation and functional outcome in FEP.
A better understanding of the interplay among symptoms, cognition and functioning in first-episode psychosis (FEP) is crucial to promoting functional recovery. Network analysis is a promising data-driven approach to elucidating complex interactions among psychopathological variables in psychosis, but it has not been applied in FEP.
This study employed network analysis to examine inter-relationships among a wide array of variables encompassing psychopathology, premorbid and onset characteristics, cognition, subjective quality-of-life and psychosocial functioning in 323 adult FEP patients in Hong Kong. The graphical Least Absolute Shrinkage and Selection Operator (LASSO) combined with extended Bayesian information criterion (EBIC) model selection was used for network construction. The importance of individual nodes in the generated network was quantified by centrality analyses.
Our results showed that amotivation played the most central role and had the strongest associations with other variables in the network, as indexed by node strength. Amotivation and diminished expression displayed differential relationships with other nodes, supporting the validity of the two-factor negative symptom structure. Psychosocial functioning was most strongly connected with amotivation and was weakly linked to several other variables. Within the cognitive domain, digit span demonstrated the highest centrality and was connected with most of the other cognitive variables. Exploratory analysis revealed no significant gender differences in network structure and global strength.
Our results suggest the pivotal role of amotivation in psychopathology network of FEP and indicate its critical association with psychosocial functioning. Further research is required to verify the clinical significance of diminished motivation on functional outcome in the early course of psychotic illness.
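The node-strength centrality on which the findings above rest is simply the sum of the absolute weights of a node's edges in the estimated network. A minimal sketch of the computation, using a small hypothetical adjacency matrix (the labels and weights below are illustrative assumptions, not the study's data):

```python
import numpy as np

# Hypothetical symmetric partial-correlation network over four nodes;
# values are illustrative only, not taken from the study.
labels = ["amotivation", "dim_expression", "functioning", "digit_span"]
W = np.array([
    [0.00, 0.25, 0.40, 0.10],
    [0.25, 0.00, 0.05, 0.00],
    [0.40, 0.05, 0.00, 0.15],
    [0.10, 0.00, 0.15, 0.00],
])

# Node strength: sum of absolute edge weights incident to each node.
strength = np.abs(W).sum(axis=0)
most_central = labels[int(np.argmax(strength))]
print(dict(zip(labels, strength.round(2))), most_central)
```

In practice the weighted adjacency matrix would come from a regularized estimator such as the graphical LASSO with EBIC model selection, as described in the methods.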
Several life-threatening diseases of the kidney have their origins in mutational events that occur during embryonic development. In this study, we investigate the role of the Wolffian duct (WD), the earliest embryonic epithelial progenitor of renal tubules, in the etiology of autosomal dominant polycystic kidney disease (ADPKD). ADPKD is associated with a germline mutation of one of the two Pkd1 alleles. For the disease to occur, a second event that disrupts the expression of the other inherited Pkd1 allele must occur. We postulated that this secondary event can occur in the pronephric WD. Using Cre-Lox recombination, mice with WD-specific deletion of one or both Pkd1 alleles were generated. Homozygous Pkd1-targeted deletion in WD-derived tissues resulted in mice with large cystic kidneys and serologic evidence of renal failure. In contrast, heterozygous deletion of Pkd1 in the WD led to kidneys that were phenotypically indistinguishable from control in the early postnatal period. High-throughput sequencing, however, revealed underlying gene and microRNA (miRNA) changes in these heterozygous mutant kidneys that suggest a strong predisposition toward developing ADPKD. Bioinformatic analysis of this data demonstrated an upregulation of several miRNAs that have been previously associated with PKD; pathway analysis further demonstrated that the differentially expressed genes in the heterozygous mutant kidneys were overrepresented in signaling pathways associated with maintenance and function of the renal tubular epithelium. These results suggest that the WD may be an early epithelial target for the genetic or molecular signals that can lead to cyst formation in ADPKD.
In this study, the pull-in phenomenon of a nano-actuator is investigated employing a nonlocal Bernoulli-Euler beam model with clamped-clamped boundary conditions. The model accounts for viscous damping, residual stresses, the van der Waals (vdW) force and electrostatic forces with nonlocal effects. The hybrid differential transformation/finite difference method (HDTFDM) is used to analyze the nonlocal effects on a graphene sheet nanobeam, which is electrostatically actuated under the influence of the coupling effect, the von Kármán nonlinear strains and the fringing field effect. The pull-in voltage calculated by the presented model deviates by no more than 0.29% from values reported in previous literature, verifying the validity of the HDTFDM. Furthermore, the nonlocal nonlinear behavior of the electrostatically actuated nanobeam is investigated, and the effects of viscous damping, residual stresses, and length-gap ratio are examined in detail. Overall, the results reveal that small scale effects significantly influence the characteristics of the graphene sheet nanobeam actuator.
The COllaborative project of Development of Anthropometrical measures in Twins (CODATwins) project is a large international collaborative effort to analyze individual-level phenotype data from twins in multiple cohorts from different environments. The main objective is to study factors that modify genetic and environmental variation of height, body mass index (BMI, kg/m2) and size at birth, and additionally to address other research questions such as long-term consequences of birth size. The project started in 2013 and is open to all twin projects in the world having height and weight measures on twins with information on zygosity. Thus far, 54 twin projects from 24 countries have provided individual-level data. The CODATwins database includes 489,981 twin individuals (228,635 complete twin pairs). Since many twin cohorts have collected longitudinal data, there is a total of 1,049,785 height and weight observations. For many cohorts, we also have information on birth weight and length, own smoking behavior and own or parental education. We found that the heritability estimates of height and BMI systematically changed from infancy to old age. Remarkably, only minor differences in the heritability estimates were found across cultural–geographic regions, measurement time and birth cohort for height and BMI. In addition to genetic epidemiological studies, we looked at associations of height and BMI with education, birth weight and smoking status. Within-family analyses examined differences within same-sex and opposite-sex dizygotic twins in birth size and later development. The CODATwins project demonstrates the feasibility and value of international collaboration to address gene-by-exposure interactions that require large sample sizes and address the effects of different exposures across time, geographical regions and socioeconomic status.
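The heritability estimates discussed above come from classical twin modelling. As a simplified, hedged illustration of the underlying logic (Falconer's approximation rather than the variance-component models a project like CODATwins would fit, and with made-up correlations), heritability can be approximated from monozygotic (MZ) and dizygotic (DZ) twin-pair correlations:

```python
def falconer_estimates(r_mz: float, r_dz: float) -> dict:
    """Approximate variance components from twin-pair correlations.

    h2 = 2*(r_mz - r_dz)   additive genetic variance (heritability)
    c2 = 2*r_dz - r_mz     shared-environment variance
    e2 = 1 - r_mz          unique-environment variance
    """
    h2 = 2.0 * (r_mz - r_dz)
    c2 = 2.0 * r_dz - r_mz
    e2 = 1.0 - r_mz
    return {"h2": h2, "c2": c2, "e2": e2}

# Hypothetical correlations for height in adult twins (illustrative only).
est = falconer_estimates(r_mz=0.85, r_dz=0.45)
print(est)
```

The three components sum to 1 by construction; comparing such estimates across age groups and regions is, in spirit, how systematic changes in heritability from infancy to old age can be detected.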
Introduction: Individualizing risk for stroke following a transient ischemic attack (TIA) is a topic of intense research, as existing scores are context-dependent or have not been well validated. The Canadian TIA Score stratifies the risk of subsequent stroke into low, moderate and high risk. Our objective was to prospectively validate the Canadian TIA Score in a new cohort of emergency department (ED) patients. Methods: We conducted a prospective cohort study in 14 Canadian EDs over 4 years. We enrolled consecutive adult patients with an ED visit for TIA or nondisabling stroke. Treating physicians recorded standardized clinical variables onto data collection forms. Given the ability of prompt emergency carotid endarterectomy (CEA) to prevent stroke (NNT = 3) in high-risk patients, our primary outcome was the composite of subsequent stroke or CEA ≤7 days. We conducted telephone follow-up using the validated Questionnaire for Verifying Stroke Free Status at 7 and 90 days. Outcomes were adjudicated by panels of 3 local stroke experts, blinded to the index ED data collection form. Based on prior work, we estimated that a sample size of 5,004 patients, including 93 subsequent strokes, would yield 95% confidence bands of ±10% for sensitivity and likelihood ratio (LR). Our analyses assessed interval LRs (iLRs) with 95% CIs. Results: We prospectively enrolled 7,569 patients (mean age 68.4 ± 14.7 years; 52.4% female), of whom 107 (1.4%) had a subsequent stroke and 74 (1.0%) CEA ≤7 days (total outcomes = 181). We enrolled 81.2% of eligible patients; missed patients were similar to those enrolled. The Canadian TIA Score stratified the risk of stroke/CEA ≤7 days as: Low (probability <0.2%; iLR 0.20 [95% CI 0.091–0.44]), Moderate (probability 1.3%; iLR 0.79 [0.68–0.92]) and High (probability 2.6%; iLR 2.2 [1.9–2.6]). Sensitivity analysis for stroke alone ≤7 days yielded similar results: Low iLR 0.17 [95% CI 0.056–0.52], Moderate iLR 0.89 [0.75–1.1], High iLR 2.0 [1.6–2.4].
Conclusion: The Canadian TIA Score accurately identifies TIA patients' risk of stroke/CEA ≤7 days. Patients classified as low risk can be safely discharged following a careful ED assessment, with elective follow-up. Patients at moderate risk can undergo additional testing in the ED, have antithrombotic therapy optimized, and be offered early stroke specialist follow-up. Patients at high risk should in most cases be fully investigated and managed, ideally in consultation with a stroke specialist, during their index ED visit.
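The interval likelihood ratios (iLRs) reported in this validation are the proportion of outcome cases falling in a risk stratum divided by the proportion of non-cases falling in that stratum. A minimal sketch of the calculation, using hypothetical counts rather than the study's data:

```python
def interval_lr(cases_in_stratum: int, total_cases: int,
                noncases_in_stratum: int, total_noncases: int) -> float:
    """Interval likelihood ratio for one risk stratum:
    P(stratum | outcome) / P(stratum | no outcome)."""
    return (cases_in_stratum / total_cases) / (noncases_in_stratum / total_noncases)

# Hypothetical counts: 10 of 100 outcome cases, but 500 of 1,000
# non-cases, fall in the low-risk stratum -> iLR = 0.1 / 0.5 = 0.2.
ilr_low = interval_lr(10, 100, 500, 1000)
print(ilr_low)
```

An iLR well below 1 (as for the low-risk stratum here) shifts the post-test probability of the outcome downward, which is the basis for safe early discharge of low-risk patients.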
While much research has focused on crop damage following foliar exposure to auxin herbicides, reports documenting the risk posed by exposure via root uptake of irrigation water are lacking. Herbicide residues circulated in tailwater recovery systems may pose threats of cross-crop impacts to nonresistant cultivars with known sensitivity to auxins. An auxin-susceptible soybean [Glycine max (L.) Merr.] cultivar was grown in a controlled growth chamber environment and exposed to dicamba dissolved in irrigation water applied to the soil surface, simulating furrow irrigation. Five herbicide treatment concentrations, ranging from 0.05 to 5.0 mg L−1 and encompassing estimated field doses of 3.1 to 310 g ha−1, were applied to the soil of potted soybean plants at V3/V4 or R1 growth stages. Plant injury (0% to 100%), dry mass, height, number of pods, and number of pod-bearing nodes were measured. Kruskal-Wallis and logistic regression analyses were performed to determine treatment differences and examine dose effects. Yield losses were projected using (1) plant injury assessments 14 d after treatment, based on injury–yield relationships described for foliar exposures, and (2) pod counts. Dicamba concentration was the main significant factor affecting all growth response metrics, and growth stage was a significant explanatory variable only for the height response metric. A nonlinear response to dicamba dose was observed, with the threshold response dose required to affect 50% of plants being three times greater for 40% crop injury compared with 20% injury. Yield projections derived from plant response to root uptake compared with foliar exposure indicate that soybean may express both magnitude of injury and specific symptomology differently when exposure occurs via root uptake. Drift exposure–based models may therefore be unsuitable for predicting soybean yield loss when injury results from irrigation.
Data are needed to develop correlations for predicting yield losses based on field-scale exposure to dicamba in irrigation water, as well as assessment of real-world concentrations of herbicide residues in tailwater recovery systems.
Identifying routes of transmission among hospitalized patients during a healthcare-associated outbreak can be tedious, particularly among patients with complex hospital stays and multiple exposures. Data mining of the electronic health record (EHR) has the potential to rapidly identify common exposures among patients suspected of being part of an outbreak.
We retrospectively analyzed 9 hospital outbreaks that occurred during 2011–2016 and that had previously been characterized both according to transmission route and by molecular characterization of the bacterial isolates. We determined (1) the ability of data mining of the EHR to identify the correct route of transmission, (2) how early the correct route was identified during the timeline of the outbreak, and (3) how many cases in the outbreaks could have been prevented had the system been running in real time.
Correct routes were identified for all outbreaks at the second patient, except for one outbreak involving >1 transmission route that was detected at the eighth patient. Up to 40 or 34 infections (78% or 66% of possible preventable infections, respectively) could have been prevented if data mining had been implemented in real time, assuming the initiation of an effective intervention within 7 or 14 days of identification of the transmission route, respectively.
Data mining of the EHR was accurate for identifying routes of transmission among patients who were part of the outbreak. Prospective validation of this approach using routine whole-genome sequencing and data mining of the EHR for both outbreak detection and route attribution is ongoing.
A 1108.6 m long core was recovered at Site U1457, located on the Indus Fan in the Laxmi Basin of the eastern Arabian Sea, during IODP Expedition 355. Shipboard examinations defined five lithologic units (I to V) of the lower Paleocene to Holocene sedimentary sequence. In this study, δ13C values of sedimentary organic matter (SOM) confirm the differentiation of the lithologic units and further divide units III and IV into two subunits (1 and 2). Based on the underlying assumption that the SOM composition is determined primarily by a mixture of marine and terrestrial origins, δ13CSOM values at Site U1457 provide information on the terrestrial catchment conditions since late Miocene time. Low δ13CSOM values from late Miocene to late Pleistocene times are similar (c. −22.0 ‰) for the most part, reflecting a consistent contribution of terrestrial organic matter from catchment areas characterized by dominant C3 land plants. Significantly lower δ13CSOM values (c. −24.0 ‰) in Unit III-2 (∼8 to ∼7 Ma) might be due to a greater input of C3 terrestrial organic matter. The increase in δ13CSOM values at ∼7 Ma and the appearance of high δ13CSOM values (c. −18.0 ‰) within Unit III-1 (∼7 to ∼2 Ma) indicate that C4 biomass overwhelmed the terrestrial catchment environment as a result of enhanced terrestrial aridity in the Himalayan foreland. The three-end-member simple mixing model, estimating the relative contributions of SOM from terrestrial C3 and C4 plants and marine phytoplankton, supports our interpretation of the distribution of C3 and C4 land plants in the terrestrial catchment environment.
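A three-end-member mixing model of the kind invoked above can be posed as a small linear system: with δ13C plus a second tracer (C/N is commonly used in such models), the three source fractions are fully determined by two tracer balances and the constraint that the fractions sum to one. A hedged sketch; the end-member and sample values below are hypothetical assumptions, not those used in the study:

```python
import numpy as np

# End members: marine phytoplankton, terrestrial C3 plants, terrestrial C4 plants.
# delta13C (per mil) and C/N ratios are illustrative assumptions.
d13c = np.array([-20.5, -27.0, -13.0])   # marine, C3, C4
cn   = np.array([  7.0,  14.0,  16.0])

# Mixing equations: observed d13C, observed C/N, fractions sum to 1.
A = np.vstack([d13c, cn, np.ones(3)])
b = np.array([-22.0, 11.0, 1.0])         # hypothetical sample values

f_marine, f_c3, f_c4 = np.linalg.solve(A, b)
print(f"marine={f_marine:.2f}, C3={f_c3:.2f}, C4={f_c4:.2f}")
```

With more tracers than unknowns, an over-determined system of this form would instead be fitted by least squares rather than solved exactly.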
Polyphenol oxidase (PPO) in red clover (RC) has been shown to reduce both lipolysis and proteolysis in silo and implicated (in vitro) in the rumen. However, all in vivo comparisons have compared RC with other forages, typically with lower levels of PPO, which brings in other confounding factors as to the cause for the greater protection of dietary nitrogen (N) and C18 polyunsaturated fatty acids (PUFA) on RC silage. This study compared two RC silages with contrasting PPO activities at ensiling (RC+ and RC−) against a control of perennial ryegrass silage (PRG) to ascertain the effect of PPO activity on dietary N digestibility and PUFA biohydrogenation. Two studies were performed: the first investigated rumen and duodenal flow in six Hereford×Friesian steers prepared with rumen and duodenal cannulae, and the second investigated whole-tract N balance in six Holstein-Friesian non-lactating dairy cows. All diets were offered at a restricted level based on animal live weight, with each experiment consisting of two 3×3 Latin squares using big bale silages ensiled in 2010 and 2011, respectively. For the first experiment, digesta flow at the duodenum was estimated using a dual-phase marker system with ytterbium acetate and chromium ethylenediaminetetraacetic acid as particulate and liquid phase markers, respectively. Total N intake was higher on the RC silages in both experiments and higher on RC− than RC+. Rumen ammonia-N reflected intake, with ammonia-N per unit of N intake lower on RC+ than RC−. Microbial N duodenal flow was comparable across all silage diets, with non-microbial N higher on RC than on PRG but with no difference between RC+ and RC−, even when reported on a N intake basis. C18 PUFA biohydrogenation was lower on RC silage diets than PRG, but with no difference between RC+ and RC−. The N balance trial showed a greater retention of N on RC+ over RC−; however, this response is likely related to the difference in N intake rather than any PPO-driven protection.
The lack of difference between RC silages, despite contrasting levels of PPO, may reflect a similar level of protein-bound-phenol complexing determined in each RC silage. Previously, this complexing has been associated with PPO's protection mechanism; however, this study has shown that protection is not related to total PPO activity.
To assess variability in antimicrobial use and associations with infection testing in pediatric ventilator-associated events (VAEs).
Descriptive retrospective cohort with nested case-control study.
Pediatric intensive care units (PICUs), cardiac intensive care units (CICUs), and neonatal intensive care units (NICUs) in 6 US hospitals.
Children ≤18 years ventilated for ≥1 calendar day.
We identified patients with pediatric ventilator-associated conditions (VACs), pediatric VACs with antimicrobial use for ≥4 days (AVACs), and possible ventilator-associated pneumonia (PVAP, defined as pediatric AVAC with a positive respiratory diagnostic test) according to previously proposed criteria.
Among 9,025 ventilated children, we identified 192 VAC cases, 43 in CICUs, 70 in PICUs, and 79 in NICUs. AVAC criteria were met in 79 VAC cases (41%) (58% CICU; 51% PICU; and 23% NICU), and varied by hospital (CICU, 20–67%; PICU, 0–70%; and NICU, 0–43%). Type and duration of AVAC antimicrobials varied by ICU type. AVAC cases in CICUs and PICUs received broad-spectrum antimicrobials more often than those in NICUs. Among AVAC cases, 39% had respiratory infection diagnostic testing performed; PVAP was identified in 15 VAC cases. Also, among AVAC cases, 73% had no associated positive respiratory or nonrespiratory diagnostic test.
Antimicrobial use is common in pediatric VAC, with variability in spectrum and duration of antimicrobials within hospitals and across ICU types, while PVAP is uncommon. Prolonged antimicrobial use despite low rates of PVAP or positive laboratory testing for infection suggests that AVAC may provide a lever for antimicrobial stewardship programs to improve utilization.
Evidence from animal models indicates that exposure to an obesogenic or hyperglycemic intrauterine environment adversely impacts offspring kidney development and renal function. However, evidence from human studies has not been evaluated systematically. Therefore, the aim of this systematic review was to synthesize current research in humans that has examined the relationship between gestational obesity and/or diabetes and offspring kidney structure and function. Systematic electronic database searches were conducted of five relevant databases (CINAHL, Cochrane, EMBASE, MEDLINE and Scopus). Preferred Reporting Items for Systematic Reviews and Meta-analysis guidelines were followed, and articles screened by two independent reviewers generated nine eligible papers for inclusion. Six studies were assessed as being of ‘neutral’ quality, two of ‘negative’ and one ‘positive’ quality. Observational studies suggest that offspring exposed to a hyperglycemic intrauterine environment are more likely to display markers of renal dysfunction and are at higher risk of end-stage renal disease. There was limited and inconsistent evidence for a link between exposure to an obesogenic intrauterine environment and offspring renal outcomes. Offspring renal outcome measures across studies were diverse, with a large variation in offspring age at follow-up, limiting comparability across studies. The collective current body of evidence suggests that intrauterine exposure to maternal obesity and/or diabetes adversely impacts renal programming in offspring, with an increased risk of kidney disease in adulthood. Further high-quality, longitudinal, prospective cohort studies that measure indicators of offspring renal development and function, including fetal kidney volume and albuminuria, at standardized follow-up time points, are warranted.
A cluster of Salmonella Paratyphi B variant L(+) tartrate(+) infections with indistinguishable pulsed-field gel electrophoresis patterns was detected in October 2015. Interviews initially identified nut butters, kale, kombucha, chia seeds and nutrition bars as common exposures. Epidemiologic, environmental and traceback investigations were conducted. Thirteen ill people infected with the outbreak strain were identified in 10 states with illness onset during 18 July–22 November 2015. Eight of 10 (80%) ill people reported eating Brand A raw sprouted nut butters. Brand A conducted a voluntary recall. Raw sprouted nut butters are a novel outbreak vehicle, though contaminated raw nuts, nut butters and sprouted seeds have all caused outbreaks previously. Firms producing raw sprouted products, including nut butters, should consider a kill step to reduce the risk of contamination. People at greater risk for foodborne illness may wish to consider avoiding raw products containing raw sprouted ingredients.
Chlamydia trachomatis (CT) infections remain highly prevalent. CT reinfection occurs frequently within months after treatment, likely contributing to sustaining the high CT infection prevalence. Sparse studies have suggested CT reinfection is associated with a lower organism load, but it is unclear whether CT load at the time of treatment influences CT reinfection risk. In this study, women presenting for treatment of a positive CT screening test were enrolled, treated and returned for 3- and 6-month follow-up visits. CT organism loads were quantified at each visit. We evaluated for an association of CT bacterial load at initial infection with reinfection risk and investigated factors influencing the CT load at baseline and follow-up in those with CT reinfection. We found no association of initial CT load with reinfection risk. We found a significant decrease in the median log10 CT load from baseline to follow-up in those with reinfection (5.6 CT/ml vs. 4.5 CT/ml; P = 0.015). Upon stratification of reinfected subjects based upon presence or absence of a history of CT infections prior to their infection at the baseline visit, we found a significant decline in the CT load from baseline to follow-up (5.7 CT/ml vs. 4.3 CT/ml; P = 0.021) exclusively in patients with a history of CT infections prior to our study. Our findings suggest repeated CT infections may lead to possible development of partial immunity against CT.
Decades of fetal programming research indicates that we may be able to map the origins of many physical, psychological, and medical variations and morbidities before the birth of the child. While great strides have been made in identifying associations between prenatal insults, such as undernutrition or psychosocial stress, and negative developmental outcomes, far less is known about how adaptive responses to adversity regulate the developing phenotype to match stressful conditions. As the application of epigenetic methods to human behavior has exploded in the last decade, research has begun to shed light on the role of epigenetic mechanisms in explaining how prenatal conditions shape later susceptibilities to mental and physical health problems. In this review, we describe and attempt to integrate two dominant fetal programming models: the cumulative stress model (a disease-focused approach) and the match–mismatch model (an evolutionary–developmental approach). In conjunction with biological sensitivity to context theory, we employ these two models to generate new hypotheses regarding epigenetic mechanisms through which prenatal and postnatal experiences program child stress reactivity and, in turn, promote development of adaptive versus maladaptive phenotypic outcomes. We conclude by outlining priority questions and future directions for the fetal programming field.
There is limited evidence on ethnic differences in personality disorder prevalence rates. We compared rates of people with personality disorder admitted to hospital in East London from 2007 to 2013.
Of all people admitted to hospital, 9.7% had a personality disorder diagnosis. The admission rate for personality disorder has increased each year. Compared with White subjects, personality disorder was significantly less prevalent among Black and other minority ethnic (BME) groups. Personality disorder was diagnosed in 20% of forensic, 11% of general adult, 8% of adolescent and 2% of old-age in-patients.
The increasing number of personality disorder diagnoses year on year indicates the increasing impact of personality disorder on in-patient services. Given the significant strain these patients place on resources, it is important to identify and appropriately manage patients with a personality disorder diagnosis. The reasons for fewer admissions of BME patients may reflect alternative service use, a truly lower prevalence rate or under-detection.