The mental health of third-level students is of major societal concern, with the gap between the demand for services and the supports offered now at crisis level. In Ireland, as elsewhere, colleges have responded to this need in vastly differing ways: student counselling services are available in all institutions, and student health departments and sessional psychiatry in some of the larger ones, but none operates as a single multidisciplinary service. There is increasing recognition of the need for a more systematised approach, reflected in the establishment of international networks, charters, and frameworks. These advocate a whole-institution approach to student mental health, together with the development of an integrated system of supports with effective pathways to appropriate care. This paper, by members of the Youth and Student Special Interest Group of the College of Psychiatrists of Ireland, contextualises student mental health as it currently stands and describes future directions for this emerging field. It is a call to action to develop a structure that supports the needs of students with mental health problems across the full spectrum, from mild to severe.
Water-filled boreholes in cold ice refreeze in hours to days, and prior attempts to keep them open with antifreeze resulted in a plug of slush effectively freezing the hole even faster. Thus, antifreeze as a method to stabilize hot-water boreholes has largely been abandoned. In the hot-point drilling case, no external water is added to the hole during drilling, so earlier antifreeze injection is possible while the drill continues melting downward. Here, we use a cylindrical Stefan model to explore slush formation within the parameter space representative of hot-point drilling. We find that earlier injection timing creates an opportunity to avoid slush entirely by injecting sufficient antifreeze to dissolve the hole past the drilled radius. As in the case of hot-water drilling, the alternative is to force mixing in the hole after antifreeze injection to ensure that ice refreezes onto the borehole wall instead of within the solution as slush.
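For intuition about why water-filled holes in cold ice close within hours to days, a quasi-steady energy balance can be integrated numerically: latent heat released at the shrinking hole wall must be conducted radially outward into the cold far-field ice. The sketch below is a toy model under assumed generic ice properties and an assumed far-field temperature, not the cylindrical Stefan model used in the paper.

```python
import numpy as np

# Quasi-steady toy model of a water-filled borehole refreezing in cold ice.
# Latent heat released at the shrinking hole wall (radius r) is conducted
# radially outward into ice at far-field temperature T_far (< 0 C).
# Sketch for intuition only, not the paper's full Stefan model.

k_ice = 2.1        # thermal conductivity of ice, W/(m K)  (assumed)
rho_ice = 917.0    # ice density, kg/m^3                   (assumed)
L_fus = 3.34e5     # latent heat of fusion, J/kg           (assumed)
T_melt, T_far = 0.0, -20.0   # hole wall at melting point; cold far-field ice
r0, R_far = 0.10, 2.0        # initial hole radius and far-field radius, m

def drdt(r):
    """Interface velocity from a quasi-steady radial conduction balance."""
    return -k_ice * (T_melt - T_far) / (rho_ice * L_fus * r * np.log(R_far / r))

# Forward-Euler integration until the hole closes.
r, t, dt = r0, 0.0, 60.0   # 1-minute steps
while r > 1e-3:
    r += drdt(r) * dt
    t += dt

print(f"Toy closure time: {t / 3600:.1f} hours")  # order hours, consistent with the text
```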
With human influences driving populations of apex predators into decline, more information is required on how biological and anthropogenic factors affect these species at national and global scales. However, camera-trap studies are seldom executed at a broad spatial scale. We demonstrate how uniting fine-scale studies and utilizing camera-trap data on non-target species can provide an effective approach to broad-scale assessments, through a case study of the brown hyaena Parahyaena brunnea. We collated camera-trap data from 25 protected and unprotected sites across South Africa into the largest detection/non-detection dataset collected on the brown hyaena, and investigated the influence of biological and anthropogenic factors on brown hyaena occupancy. Spatial autocorrelation had a significant effect on the data and was corrected using a Bayesian Gibbs sampler. We show that brown hyaena occupancy is driven by specific co-occurring apex predator species and by human disturbance. The relative abundance of spotted hyaenas Crocuta crocuta and of people on foot had a negative effect on brown hyaena occupancy, whereas the relative abundance of leopards Panthera pardus and of vehicles had a positive influence. We estimated that brown hyaenas occur across 66% of the surveyed camera-trap station sites. Occupancy varied geographically, with lower estimates in eastern and southern South Africa. Our findings suggest that brown hyaena conservation depends upon a multi-species approach focussed on implementing conservation policies that better facilitate coexistence between people and hyaenas. We also validate the conservation value of pooling fine-scale datasets and utilizing bycatch data to examine species trends at broad spatial scales.
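For readers unfamiliar with occupancy modelling, the core likelihood can be sketched in a few lines: each site is occupied with probability psi, and occupied sites are detected on each survey with probability p, so all-zero detection histories mix the two possibilities. The minimal example below uses synthetic data and omits covariates and the Bayesian Gibbs-sampler correction for spatial autocorrelation that the study applied.

```python
import numpy as np
from scipy.optimize import minimize
from scipy.special import expit

# Minimal single-season occupancy model: psi = Pr(site occupied),
# p = Pr(detection | occupied), fit to detection/non-detection histories.

rng = np.random.default_rng(0)
n_sites, n_surveys, psi_true, p_true = 25, 20, 0.66, 0.3
z = rng.random(n_sites) < psi_true                            # latent occupancy
y = (rng.random((n_sites, n_surveys)) < p_true) & z[:, None]  # detections

def neg_log_lik(theta):
    psi, p = expit(theta)                       # keep parameters in (0, 1)
    det = y.sum(axis=1)
    # Occupied-site likelihood, plus the (1 - psi) term for all-zero histories.
    ll_occ = np.log(psi) + det * np.log(p) + (n_surveys - det) * np.log(1 - p)
    ll = np.where(det > 0, ll_occ, np.logaddexp(ll_occ, np.log(1 - psi)))
    return -ll.sum()

fit = minimize(neg_log_lik, x0=[0.0, 0.0])
print("psi-hat, p-hat:", expit(fit.x))
```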
Background: Updated IDSA-SHEA guidelines recommend different diagnostic approaches to C. difficile depending on whether there are pre-agreed institutional criteria for patient stool submission. If stool submission criteria are in place, nucleic acid amplification testing (NAAT) alone may be used. If not, a multistep algorithm is suggested, incorporating various combinations of toxin enzyme immunoassay (EIA), glutamate dehydrogenase (GDH), and NAAT, with discordant results adjudicated by NAAT. At our institution, we developed a multistep algorithm leading with NAAT, with reflex to EIA for toxin testing if the NAAT is positive. This algorithm resulted in a significant proportion of patients with discordant results (NAAT positive and toxin EIA negative) whom some experts have categorized as possible carriers or C. difficile colonized. In this study, we describe the impact of a multistep algorithm on hospital-onset, community-onset, and healthcare-facility–associated C. difficile infection (HO-CDI, CO-CDI, and HCFA-CDI, respectively) rates and on the management of possible carriers. Methods: The study setting was a 399-bed, tertiary-care VA Medical Center in Richmond, Virginia. A retrospective chart review was conducted. The multistep C. difficile testing algorithm was implemented June 4, 2019 (Fig. 1). C. difficile testing results and possible carriers were reviewed for the 5 months before and 4 months after implementation (January 2019 to September 2019). Results: In total, 587 NAATs were performed in the inpatient and outpatient settings (mean, 58.7 per month). Overall, 123 NAATs (21%) were positive: 59 in the preintervention period and 63 in the postintervention period. In the postintervention period, 23 positive NAATs (26%) had a positive toxin EIA. Based on LabID events, the mean rate of HO+CO+HCFA CDI cases per 10,000 bed days of care (BDOC) decreased significantly, from 9.49 in the preintervention period to 1.15 in the postintervention period (P = .019) (Fig. 2). Also, 9 of the possible carriers (22%) were treated for CDI based on high clinical suspicion, and 6 of the possible carriers (14%) had a previous history of CDI; of these, 5 (83%) were treated for CDI. In addition, 1 patient (2%) converted from possible carrier to positive toxin EIA within 14 days. The infectious diseases team was consulted for 11 possible carriers (27%). Conclusions: Implementation of a 2-step C. difficile algorithm leading with NAAT was associated with a lower rate of HO+CO+HCFA CDI per 10,000 BDOC. A considerable proportion (22%) of possible carriers were treated for CDI but did not count as LabID events. Only 2% of the possible carriers in our study converted to a positive toxin EIA.
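To make the reflex logic concrete, the two-step algorithm can be written out as a small decision function. This is an illustrative sketch of the testing categories named in the abstract, not the institution's order set.

```python
def classify_cdi_test(naat_positive, toxin_eia_positive=None):
    """Two-step C. difficile algorithm: NAAT first, reflex toxin EIA if positive.

    Illustrative sketch of the reflex logic described in the abstract;
    actual management also involved chart review and ID consultation.
    """
    if not naat_positive:
        return "negative"                         # no reflex EIA performed
    if toxin_eia_positive:
        return "CDI (NAAT+/toxin EIA+)"           # counts as a LabID event
    return "possible carrier (NAAT+/toxin EIA-)"  # discordant; clinical correlation

print(classify_cdi_test(True, False))  # -> possible carrier (NAAT+/toxin EIA-)
```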
This review aimed to identify factors influencing the prescribing of opioids as regular pain-management medication for older people.
Chronic pain occurs in 45%–85% of older people, but appears to be under-recognised and under-treated. However, strong opioid prescribing is more prevalent in older people, and is increasing at the fastest rate in this age group.
This review included all study types, published 1990–2017, which focused on opioid prescribing for pain management among older adults. Arksey and O’Malley’s framework was used to scope the literature. PubMed, EBSCO Host, the UK Drug Database, and Google Scholar were searched. Data extraction, carried out by two researchers, included factors explaining opioid prescribing patterns and prescribing trends.
A total of 613 papers were identified and 53 were included in the final review, comprising 35 research papers, 10 opinion pieces, and 8 grey-literature sources. Factors associated with prescribing patterns were categorised as patient-related, prescriber-driven, or system-driven. Patient factors included age, gender, race, and cognition; prescriber factors included attitudes towards opioids and judgements about ‘normal’ pain; and policy/system factors related to the changing policy landscape over the last three decades, particularly in the USA.
A large number of context-dependent factors appeared to influence opioid prescribing for chronic pain management in older adults, but the findings were inconsistent. Gaps remain in the literature concerning the UK healthcare system, the prescriber and patient perspectives, and the context of multi-morbidity and treatment burden.
When it comes to electing the chief executive of the United States, the presidential debates play an important role in shaping public opinion and the choices facing voters. Having a fair process in place to determine who is eligible to participate in the debates and to guarantee that the debates are conducted neutrally is crucial to ensuring the integrity of the electoral process as a whole. In the past, controversies have arisen concerning which candidates should be invited to participate, which political parties should be represented, and whether the debates have been conducted in a way that is fair and neutral. Most of these controversies have never been resolved satisfactorily. Today, much more work needs to be done to ensure that our presidential primary and general election debates live up to their potential to provide truly diverse policy views to the public and are conducted in a manner that is wholly free from bias. Gender bias in terms of the questions asked of the candidates was evident in 2016, and other kinds of biases may appear in the future. Problematically, the eligibility rules for the general presidential debates have remained unchanged for decades. Meanwhile, government oversight of the debates remains virtually non-existent.
The Eating Assessment in Toddlers FFQ (EAT FFQ) has been shown to have good reliability and comparative validity for ranking nutrient intakes in young children. With the addition of food items (n 4), we aimed to re-assess the validity of the EAT FFQ and estimate calibration factors in a sub-sample of children (n 97) participating in the Growing Up Milk – Lite (GUMLi) randomised controlled trial (2015–2017). Participants completed the ninety-nine-item GUMLi EAT FFQ and record-assisted 24-h recalls (24HR) on two occasions. Energy and nutrient intakes were assessed at months 9 and 12 post-randomisation and calibration factors calculated to determine predicted estimates from the GUMLi EAT FFQ. Validity was assessed using Pearson correlation coefficients, weighted kappa (κ) and exact quartile categorisation. Calibration was calculated using linear regression models on 24HR, adjusted for sex and treatment group. Nutrient intakes were significantly correlated between the GUMLi EAT FFQ and 24HR at both time points. Energy-adjusted, de-attenuated Pearson correlations ranged from 0·3 (fibre) to 0·8 (Fe) at 9 months and from 0·3 (Ca) to 0·7 (Fe) at 12 months. Weighted κ for the quartiles ranged from 0·2 (Zn) to 0·6 (Fe) at 9 months and from 0·1 (total fat) to 0·5 (Fe) at 12 months. Exact agreement ranged from 30 to 74 %. Calibration factors predicted up to 56 % of the variation in the 24HR at 9 months and 44 % at 12 months. The GUMLi EAT FFQ remained a useful tool for ranking nutrient intakes with similar estimated validity compared with other FFQ used in children under 2 years.
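The de-attenuation step mentioned above corrects an observed FFQ-recall correlation for day-to-day variation in the 24HR, using the within- and between-person variance of the replicate recalls. The sketch below illustrates the standard Willett-style correction on synthetic data; the variable names and values are placeholders, not the GUMLi data.

```python
import numpy as np

# De-attenuating an FFQ vs 24-h-recall correlation for within-person
# day-to-day variation in the recalls (Willett-style correction):
#   r_true ~= r_obs * sqrt(1 + lambda / n),  lambda = s2_within / s2_between,
# with n replicate recalls per child (n = 2 here). Synthetic data for illustration.

rng = np.random.default_rng(1)
n_children, n_recalls = 97, 2
usual = rng.normal(100, 15, n_children)                         # true usual intake
recalls = usual[:, None] + rng.normal(0, 20, (n_children, n_recalls))
ffq = 0.6 * usual + rng.normal(0, 18, n_children)

r_obs = np.corrcoef(ffq, recalls.mean(axis=1))[0, 1]
s2_within = recalls.var(axis=1, ddof=1).mean()
s2_between = recalls.mean(axis=1).var(ddof=1) - s2_within / n_recalls
r_deatt = r_obs * np.sqrt(1 + (s2_within / s2_between) / n_recalls)
print(f"observed r = {r_obs:.2f}, de-attenuated r = {r_deatt:.2f}")
```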
Decisions to treat large-vessel occlusion with endovascular therapy (EVT) or intravenous alteplase depend on how physicians weigh benefits against risks when considering patients’ comorbidities. We explored EVT/alteplase decision-making by stroke experts in the setting of comorbidity/disability.
In an international multi-disciplinary survey, experts chose treatment approaches under current resources and under assumed ideal conditions for 10 of 22 randomly assigned case scenarios. Five included comorbidities (cancer, cardiac/respiratory/renal disease, mild cognitive impairment [MCI], physical dependence). We examined scenario/respondent characteristics associated with EVT/alteplase decisions using multivariable logistic regressions.
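As a sketch of the analytic approach, a multivariable logistic regression returning odds ratios with 95% confidence intervals might look like the following; the predictors and data are synthetic placeholders rather than the actual survey variables.

```python
import numpy as np
import statsmodels.api as sm

# Sketch of the kind of multivariable logistic regression used to relate
# EVT decisions to scenario/respondent characteristics. Columns and data
# are synthetic placeholders, not the survey dataset.

rng = np.random.default_rng(2)
n = 607
X = np.column_stack([
    rng.integers(0, 2, n),   # comorbidity-related scenario (1 = yes)
    rng.integers(0, 2, n),   # respondent performs EVT cases (1 = yes)
    rng.normal(70, 10, n),   # patient age in scenario
])
logit_p = -0.5 + 0.3 * X[:, 0] + 0.6 * X[:, 1] - 0.03 * (X[:, 2] - 70)
y = (rng.random(n) < 1 / (1 + np.exp(-logit_p))).astype(float)  # chose EVT

model = sm.Logit(y, sm.add_constant(X)).fit(disp=False)
odds_ratios = np.exp(model.params)
ci = np.exp(model.conf_int())          # 95% CIs on the odds-ratio scale
print(np.column_stack([odds_ratios, ci]))
```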
Among 607 physicians (38 countries), EVT was chosen less often in comorbidity-related scenarios (79.6% under current resources, 82.7% assuming ideal conditions) versus six “level-1A” scenarios for which EVT/alteplase was clearly indicated by current guidelines (91.1% and 95.1%, respectively, odds ratio [OR] [current resources]: 0.38, 95% confidence interval 0.31–0.47). However, EVT was chosen more often in comorbidity-related scenarios compared to all other 17 scenarios (79.6% versus 74.4% under current resources, OR: 1.34, 1.17–1.54). Responses favoring alteplase for comorbidity-related scenarios (e.g. 75.0% under current resources) were comparable to level-1A scenarios (72.2%) and higher than all others (60.4%). No comorbidity independently diminished EVT odds when considering all scenarios. MCI and dependence carried higher alteplase odds; cancer and cardiac/respiratory/renal disease had lower odds. Being older/female carried lower EVT odds. Relevant respondent characteristics included performing more EVT cases/year (higher EVT-, lower alteplase odds), practicing in East Asia (higher EVT odds), and in interventional neuroradiology (lower alteplase odds vs neurology).
Moderate-to-severe comorbidities did not consistently deter experts from EVT, suggesting equipoise about withholding EVT on the basis of comorbidities. However, alteplase was often forgone when respondents chose EVT. Differences in decision-making by patient age/sex merit further study.
Characterizing non-lethal damage within dry seeds may allow us to detect early signs of ageing and accurately predict longevity. We compared RNA degradation and viability loss in seeds exposed to stressful conditions to quantify relationships between degradation rates and stress intensity or duration. We subjected recently harvested (‘fresh’) ‘Williams 82’ soya bean seeds to moisture, temperature and oxidative stresses, and measured time to 50% viability (P50) and rate of RNA degradation, the former using standard germination assays and the latter using the RNA Integrity Number (RIN). RIN values from fresh seeds were also compared with those from accessions of the same cultivar harvested in the 1980s and 1990s and stored in a refrigerator (5°C), a freezer (−18°C) or in vapour above liquid nitrogen (−176°C). Rates of viability loss (P50⁻¹) and RNA degradation (RIN·d⁻¹) were highly correlated in soya bean seeds exposed to a broad range of temperatures [holding relative humidity (RH) constant at about 30%]. However, the correlation weakened when fresh seeds were maintained at high RH (holding temperature constant at 35°C) or exposed to oxidizing agents. Both the P50⁻¹ and RIN·d⁻¹ parameters exhibited breaks in Arrhenius behaviour near 50°C, suggesting that constrained molecular mobility regulates degradation kinetics in dry systems. We conclude that the kinetics of ageing reactions at RH near 30% can be simulated by temperatures up to 50°C and that RNA degradation can indicate ageing prior to, and independently of, seed death.
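A break in Arrhenius behaviour is diagnosed by plotting the log of a rate constant (here, P50⁻¹ or RIN·d⁻¹) against inverse absolute temperature and checking whether a single slope, proportional to the activation energy, fits all temperatures. The sketch below constructs synthetic rates with an assumed change in activation energy near 50°C and recovers the two slopes; all numbers are illustrative, not the study's estimates.

```python
import numpy as np

# Arrhenius check: ln(k) vs 1/T is linear with slope -Ea/R when a single
# mechanism operates; a change in slope marks a break. Synthetic rates here.

R = 8.314                                   # gas constant, J/(mol K)
T_c = np.array([20, 30, 40, 50, 60, 70])    # storage temperatures, deg C
T_k = T_c + 273.15
Ea_low, Ea_high = 120e3, 40e3               # assumed activation energies, J/mol
T_break = 50 + 273.15

lnk_low = 30 - Ea_low / (R * T_k)
lnk_high = (30 - Ea_low / (R * T_break)) - (Ea_high / R) * (1 / T_k - 1 / T_break)
lnk = np.where(T_c <= 50, lnk_low, lnk_high)  # continuous at the break

# Fit each regime separately; the slope recovers the activation energy.
for label, mask in [("<= 50 C", T_c <= 50), (">= 50 C", T_c >= 50)]:
    slope, _ = np.polyfit(1 / T_k[mask], lnk[mask], 1)
    print(f"{label}: Ea ~ {-slope * R / 1e3:.0f} kJ/mol")
```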
OBJECTIVES/GOALS: Access to pediatric subspecialty care varies by sociodemographic factors. Providers for gender diverse youth (GDY) are rare, and GDY face health disparities, stigma, and discrimination. We examined the association between GDY access to medical and mental health care and rurality, race, parental education, and other GDY-specific factors. METHODS/STUDY POPULATION: We surveyed parents of GDY (<18 years old) across the United States. Participants were recruited through online communities and listservs specific to parents of GDY. We determined associations between access to gender-specific medical or mental health providers and rurality, race, and parental education, as well as other GDY-specific factors including age, time since telling their parent their gender identity, parent-adolescent communication, parent stress, and gender identity, using chi-square or Fisher’s exact tests. We calculated adjusted odds ratios using logistic regression models. RESULTS/ANTICIPATED RESULTS: We surveyed 166 parents and caregivers from 31 states. The majority (73.2%) identified as white, 66.5% had earned a bachelor’s degree or higher, and 7.6% lived in a zip code designated rural by the Federal Office of Rural Health Policy. We found no evidence of an association between reported GDY access to medical or mental health care and race, parental education, or rurality. We did find a significant univariate association between access to mental health care and feminine (either female or transfeminine/transfemale) gender identity (p = 0.033, OR 2.60, 95% CI 1.06–6.36). After controlling for parent-adolescent communication in a backwards-elimination logistic regression model, this association was no longer significant (p = 0.137, OR 2.05, 95% CI 0.80–5.25). DISCUSSION/SIGNIFICANCE OF IMPACT: Although rurality, race, and parental education affect access to pediatric subspecialty care generally, we did not find these associations among GDY accessing gender care. There is a need to better understand structural and societal barriers to care for this population, including the impact of stigma and discrimination.
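For the sparse subgroup counts typical of a survey of this size, Fisher's exact test on a 2×2 table is the workhorse univariate comparison. The example below uses invented counts purely to show the computation; it is not the study data.

```python
from scipy.stats import fisher_exact

# Fisher's exact test on a 2x2 table, as used for the sparse subgroup
# comparisons in this survey. Counts below are made up for illustration.
#                  access   no access
table = [[40, 20],          # feminine gender identity
         [35, 45]]          # other gender identities
odds_ratio, p_value = fisher_exact(table)
print(f"OR = {odds_ratio:.2f}, p = {p_value:.3f}")
```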
OBJECTIVES/GOALS: Sodium (Na) intake can elevate blood pressure and is a factor in developing chronic kidney disease (CKD). Twenty-four-hour urinary Na (24hUNa) is the gold standard for assessing Na intake but is burdensome to collect. Validated equations estimate 24hUNa (e24hUNa) from a spot urine sample, but these estimates have not been validated against a known Na intake in CKD. METHODS/STUDY POPULATION: The current study is a secondary analysis of a 9-day controlled feeding study in patients with moderate CKD matched to healthy adults. Only the CKD patients were used for the current analyses (n = 8). Participants consumed a controlled diet for 9 days, providing ~2400 mg Na/d as determined by inductively coupled plasma optical emission spectroscopy (ICP). On days 7 and 8, participants collected all urine in an inpatient setting, beginning with a fasting sample on day 7. Urine mineral analyses were performed by ICP and urinary creatinine by the Jaffe reaction. The day 7 fasting urine sample was used to calculate e24hUNa using 6 published equations. Log-transformed Na intake, measured 24hUNa, and e24hUNa were compared by repeated-measures ANOVA with planned contrasts using SAS. RESULTS/ANTICIPATED RESULTS: Fifty percent of the CKD patients (n = 4) were female; 63% (n = 5) were white and 37% (n = 3) were black. On average, participants were aged 56.6 ± 13.8 y, with a BMI of 31.7 ± 9.4 kg/m2 and an eGFR of 40.7 ± 7.9 mL/min. Based on actual food intake, average Na intake on day 7 was 2024 ± 388 mg. Average measured 24hUNa was 2529 ± 1334 mg. The main ANOVA was significant (p = 0.02). The planned contrasts found that e24hUNa from the SALTED cohort, an equation developed specifically for CKD patients, was significantly higher than both Na intake (p < 0.001) and measured 24hUNa (p = 0.007). For the remaining 5 equations, e24hUNa differed significantly from neither measured 24hUNa nor dietary Na intake. DISCUSSION/SIGNIFICANCE OF IMPACT: Our results suggest that e24hUNa calculated using most published equations may provide a reliable and low-burden method of assessing dietary Na intake in patients with moderate CKD. These findings should be confirmed in larger samples. Additional studies are needed to validate or dispute the use of the SALTED equation for estimating Na intake.
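As one concrete example of a spot-urine estimating equation (the abstract does not name the six used), the Tanaka et al. (2002) equation is often cited; the coefficients below are as commonly quoted in the literature and should be verified against the original publication before any use.

```python
def tanaka_e24h_una(spot_na_meq_l, spot_cr_mg_dl, age_y, weight_kg, height_cm):
    """Estimate 24-h urinary Na (mg/day) from a spot sample (Tanaka et al., 2002).

    Coefficients as commonly quoted in the literature; verify against the
    original publication before use. The six equations applied in the study
    are not named in the abstract.
    """
    # Predicted 24-h urinary creatinine excretion, mg/day
    pr_cr = -2.04 * age_y + 14.89 * weight_kg + 16.14 * height_cm - 2244.45
    x_na = spot_na_meq_l / (spot_cr_mg_dl * 10) * pr_cr  # mg/dL -> mg/L
    e24h_na_meq = 21.98 * x_na ** 0.392
    return e24h_na_meq * 23.0                            # 1 mEq Na = 23 mg

# Example: a 57-year-old, 90 kg, 170 cm patient (hypothetical values)
print(f"{tanaka_e24h_una(100, 120, 57, 90, 170):.0f} mg/day")
```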
Registry-based trials have emerged as a potentially cost-saving study methodology. Early estimates of cost savings, however, conflated the benefits associated with registry utilisation and those associated with other aspects of pragmatic trial designs, which might not all be as broadly applicable. In this study, we sought to build a practical tool that investigators could use across disciplines to estimate the ranges of potential cost differences associated with implementing registry-based trials versus standard clinical trials.
We built Markov simulation models to compare the costs of data acquisition, cleaning, and linkage under a registry-based trial design versus a standard clinical trial. We conducted one-way, two-way, and probabilistic sensitivity analyses, varying study characteristics over broad ranges, to determine thresholds at which investigators might optimally select each trial design.
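A probabilistic sensitivity analysis of this kind can be sketched as Monte Carlo sampling over study characteristics, comparing manual abstraction of every data element against a fixed registry linkage cost plus abstraction of only the missing elements. The cost structure and distributions below are illustrative assumptions, not the authors' model.

```python
import numpy as np

# Toy probabilistic sensitivity analysis comparing per-study data costs:
# standard trial = manual abstraction of every element; registry trial =
# fixed linkage/cleaning cost plus abstraction of elements the registry lacks.

rng = np.random.default_rng(3)
n_sim = 10_000
patients = rng.integers(200, 2000, n_sim)
elements_per_pt = rng.integers(50, 500, n_sim)         # data elements per patient
frac_in_registry = rng.uniform(0.2, 0.9, n_sim)        # share available in registry
sec_per_field = rng.uniform(3, 60, n_sim)              # manual abstraction speed
wage_per_hr = rng.uniform(25, 60, n_sim)               # coordinator wage, $/h
registry_fixed = rng.uniform(10_000, 150_000, n_sim)   # linkage + cleaning, $

def abstraction_cost(n_fields):
    return n_fields * sec_per_field / 3600 * wage_per_hr

standard = abstraction_cost(patients * elements_per_pt)
registry = registry_fixed + abstraction_cost(
    patients * elements_per_pt * (1 - frac_in_registry))

savings = standard - registry
print(f"registry cheaper in {np.mean(savings > 0):.1%} of simulations")
print(f"savings range: ${np.percentile(savings, 2.5):,.0f} "
      f"to ${np.percentile(savings, 97.5):,.0f}")
```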
Registry-based trials were more cost-effective than standard clinical trials 98.6% of the time. Data-related cost savings ranged from $4300 to $600,000 with variation in study characteristics. Cost differences were most sensitive to the number of patients in a study, the number of data elements per patient available in a registry, and the speed with which research coordinators could manually abstract data. Registry incorporation resulted in cost savings when as few as 3768 independent data elements were available and when manual data abstraction took as little as 3.4 seconds per data field.
Registries offer important resources for investigators. When available, their broad incorporation may help the scientific community reduce the costs of clinical investigation. We offer here a practical tool for investigators to assess potential cost savings.
Introduction: Emergency department (ED) buprenorphine/naloxone inductions for opioid use disorder are an effective and safe way to initiate addictions care in the ED. Kelowna General Hospital's ED buprenorphine/naloxone (KEDSS) program was implemented in September 2018 to respond to a community need for accessible, evidence-based addictions care. The objective of our program evaluation study was to examine the first five months of the KEDSS program by evaluating patient characteristics and service outcomes. Methods: The KEDSS treatment pathway consists of a standardized protocol (pre-printed order set) to facilitate buprenorphine/naloxone induction and stabilization in the acute care setting (ED and inpatient wards) at Kelowna General Hospital, a community academic hospital. All patients referred to the outpatient addictions clinic via the order set between September 2018 and January 2019 (the first 5 months) were included in the study population. A retrospective descriptive chart review was completed. Outcome measures included population characteristics (sociodemographic information, clinical characteristics) and service outcomes (number of patients initiated, patient follow-up). Descriptive statistics and bivariate analyses using t-tests or Pearson's χ2 statistic, as appropriate, were conducted to compare the ED-initiated group with the inpatient-initiated group. Results: During the first five months of the KEDSS program, 35 patients (26% female, mean age 36.6 years, 54% homeless) were started on the treatment pathway, 16 (46%) of them in the ED. Compared with the inpatient-initiated group, the ED-initiated group had fewer psychiatric comorbidities on average (ED 1.0 vs. inpatient 1.5, p = 0.002), were less likely to require methadone or sustained-release oral morphine (ED 13% vs. inpatient 37%, p = 0.048), and were less likely to have attended follow-up (ED 56% vs. inpatient 84%, p = 0.004). Conclusion: This study provides a preliminary look at a new opioid agonist therapy (OAT) treatment pathway (KEDSS) at Kelowna General Hospital and gives insight into the population accessing the program. We found that the majority of patients started on buprenorphine/naloxone in the ED were seen in follow-up at the addictions clinic. Future work will examine ongoing follow-up and OAT adherence rates in the study population to quantify the program's impact on improving access to addictions treatment within this community hospital setting.
Early growth pattern is increasingly recognized as a determinant of later obesity. This study aimed to identify the association between weight gain in early life and anthropometry, adiposity, leptin, and fasting insulin levels in adolescence. A cross-sectional study was conducted in 366 school children aged 11–13 years. Weight, height, and waist circumference (WC) were measured. Fat mass (FM) was assessed using bioelectrical impedance analysis. Blood was drawn after a 12-h fast for insulin and leptin assays. Birth weight and weight at 6 months and at 18 months were extracted from Child Health Development Records. An increase in weight SD score (SDS) of ≥0.67 was defined as accelerated weight gain. Linear mixed-effects modeling was used to predict anthropometry, adiposity, and metabolic outcomes, with sex, pubertal status, and accelerated weight gain as fixed factors; age, birth weight, and family income as fixed covariates; and school as a random factor. Children with accelerated weight gain between birth and 18 months had significantly higher body mass index (BMI) SDS, WC SDS, height SDS, %FM, fat mass index (FMI), fat-free mass index (FFMI), and serum leptin levels in adolescence. Accelerated weight gain between 6 and 18 months was associated with higher BMI SDS, WC SDS, %FM, and FMI, but not with height SDS or FFMI. Accelerated weight gain at 0–6 months in children with low birth weight was associated with higher height SDS, BMI SDS, WC SDS, %FM, and FMI; in children with normal birth weight, it was associated with BMI SDS, WC SDS, height SDS, and FFMI, but not with %FM or FMI. The effects of accelerated weight gain in early life on anthropometry and adiposity in adolescence varied across growth windows. Accelerated weight gain during 6–18 months was associated with higher FM rather than linear growth. Effects of accelerated weight gain between 0 and 6 months varied with birth weight.
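The model structure described above translates directly into a random-intercept linear mixed model. The sketch below, on synthetic data with assumed variable names, shows the form of the analysis rather than reproducing the study's fitted model.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

# Random-intercept linear mixed model mirroring the analysis structure:
# fixed effects for accelerated weight gain, sex, and covariates, with
# school as a random intercept. Synthetic placeholder data throughout.

rng = np.random.default_rng(4)
n = 366
df = pd.DataFrame({
    "school": rng.integers(0, 8, n),
    "accel_gain": rng.integers(0, 2, n),   # SDS increase >= 0.67, birth-18 mo
    "sex": rng.integers(0, 2, n),
    "age": rng.uniform(11, 13, n),
    "birth_wt": rng.normal(3.0, 0.5, n),
})
school_effect = rng.normal(0, 0.3, 8)[df["school"]]
df["bmi_sds"] = (0.5 * df["accel_gain"] + 0.1 * df["sex"]
                 + school_effect + rng.normal(0, 1, n))

model = smf.mixedlm("bmi_sds ~ accel_gain + sex + age + birth_wt",
                    data=df, groups=df["school"]).fit()
print(model.summary())
```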
There is limited knowledge of how individuals reflect on their involuntary admission.
To investigate, at one year after an involuntary admission:
(i) people's perceptions of the necessity of their involuntary admission;
(ii) the enduring impact on their relationships with family and their consultant psychiatrist, and on their employment prospects;
(iii) readmission rates and risk factors for readmission.
People who were admitted involuntarily over a 15-month period were re-interviewed one year after discharge.
Sixty-eight people were re-interviewed at one year, a follow-up rate of 84%. Prior to discharge, 72% of people reported that their involuntary admission had been necessary; this reduced to 60% after one year. Over one third of people changed their views, and the majority of these came to regard their involuntary admission negatively.
One quarter of people continued to experience a negative impact on their relationship with a family member and with their consultant psychiatrist one year after an involuntary admission, while 13% reported a positive impact. A similar proportion perceived negative consequences for their employment.
Within one year, 43% of all patients involuntarily admitted in the study period were readmitted to hospital, and half of these readmissions were involuntary. Involuntary readmission was associated with a ‘sealing-over’ recovery style.
People's perceptions of the necessity of their involuntary admission change significantly over time. Involuntary admission can have a lasting negative impact on relationships with family members and the treating consultant psychiatrist.
The use of physical coercion and involuntary admission are among the most controversial practices in medicine. It is now understood that perceived coercion is multidimensional, associated with procedural justice and perceived pressures rather than simply with the legal status of the patient.
We sought to determine the rate of physical coercion used, the perceived pressures, and the procedural justice experienced by the person at the time of involuntary admission, and whether these influenced future engagement with the mental health services.
Over a 15-month period, people admitted involuntarily were interviewed prior to discharge and at one-year follow-up.
Eighty-one people participated in the study and 81% were interviewed at one-year follow-up. At the time of involuntary admission, over half of participants experienced at least one form of physical coercion, and the level of procedural justice experienced was unrelated to the use of physically coercive measures. A total of 20% of participants did not intend to engage voluntarily with the mental health services upon discharge; they were more likely to have experienced lower levels of procedural justice at the time of admission. At one year following discharge, 65% of participants were adherent with outpatient appointments and 18% had been readmitted involuntarily. Insight was associated with future engagement with the mental health services; however, the level of procedural justice experienced at admission did not predict future engagement.
This study demonstrates that the use of physically coercive measures is distinct from procedural justice and perceived pressures.
Impaired insight is commonly seen in psychosis, and some studies have proposed that it is a biologically based deficit. Support for this view comes from the excess of neurological soft signs (NSS) observed in patients with psychoses, whose neural correlates overlap to a degree with the regions of interest implicated in neuroimaging studies of insight. Our aim was to examine the relationship between NSS and insight in a sample of 241 first-episode psychosis patients.
Total scores and subscale scores from three insight measures and two NSS scales were correlated, together with factors representing overall insight and NSS that we created using principal component analysis.
There were only four significant associations when we controlled for symptoms. “Softer” Condensed Neurological Evaluation (CNE) signs were associated with our overall insight factor (r = 0.19, P = 0.02), with the total Birchwood score (r = −0.24, P < 0.01), and with the Birchwood subscales recognition of mental illness (r = −0.24, P < 0.01) and need for treatment (r = −0.18, P = 0.02). The total Neurological Evaluation Scale (NES) score and recognition of the achieved effects of medication were also weakly correlated (r = 0.14, P = 0.04).
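Controlling a correlation for symptoms, as above, amounts to a partial correlation: residualise both scores on the symptom covariate and correlate the residuals. The sketch below demonstrates this on synthetic data; the effect sizes are placeholders, not the study's results.

```python
import numpy as np
from scipy import stats

# Partial correlation between an NSS score and an insight score, controlling
# for symptom severity: residualise both on the covariate, then correlate
# the residuals. Synthetic data stand in for the patient sample.

rng = np.random.default_rng(5)
n = 241
symptoms = rng.normal(0, 1, n)
nss = 0.4 * symptoms + rng.normal(0, 1, n)       # soft-signs score
insight = -0.3 * symptoms + rng.normal(0, 1, n)  # insight measure

def residuals(y, covariate):
    X = np.column_stack([np.ones_like(covariate), covariate])
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    return y - X @ beta

r, p = stats.pearsonr(residuals(nss, symptoms), residuals(insight, symptoms))
print(f"partial r = {r:.2f}, P = {p:.3f}")
```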
This study does not support a direct link between neurological dysfunction and insight in psychosis. Our understanding of insight as a concept remains in its infancy.