MD–PhD training programs train physician-scientists to pursue careers involving both clinical care and research, but decreasing numbers of physician-scientists stay engaged in clinical research. We sought to identify current clinical research training methods utilized by MD–PhD programs and to assess how effective they are in promoting self-efficacy for clinical research.
US MD–PhD students were surveyed in April–May 2018. Students identified the clinical research training methods in which they had participated, and self-efficacy in clinical research was assessed using a modified 12-item Clinical Research Appraisal Inventory.
Responses were received from 647 MD–PhD students at 61 of 108 MD–PhD institutions, spanning all years of training. Clinical research training ranged from none to various combinations of didactics, mentored clinical research, and a clinical research practicum. Students with didactics plus mentored clinical research had self-efficacy similar to those with didactics plus a clinical research practicum. Training activities that differentiated students who did and did not have the clinical research practicum experience, and that were associated with higher self-efficacy, included exposure to Institutional Review Boards and participation in human subject recruitment.
A clinical research practicum was found to be an effective option for MD–PhD students conducting basic science research to gain experience in clinical research skills. Clinical research self-efficacy was correlated with the amount of clinical research training and specific clinical research tasks, which may inform curriculum development for a variety of clinical and translational research training programs, for example, MD–PhD, TL1, and KL2.
Substantial clinical heterogeneity of major depressive disorder (MDD) suggests it may group together individuals with diverse aetiologies. Identifying distinct subtypes should lead to more effective diagnosis and treatment, while providing more useful targets for further research. Genetic and clinical overlap between MDD and schizophrenia (SCZ) suggests an MDD subtype may share underlying mechanisms with SCZ.
The present study investigated whether a neurobiologically distinct subtype of MDD could be identified by SCZ polygenic risk score (PRS). We explored interactive effects between SCZ PRS and MDD case/control status on a range of cortical, subcortical and white matter metrics among 2370 male and 2574 female UK Biobank participants.
There was a significant SCZ PRS by MDD interaction for rostral anterior cingulate cortex (RACC) thickness (β = 0.191, q = 0.043). This was driven by a positive association between SCZ PRS and RACC thickness among MDD cases (β = 0.098, p = 0.026), compared to a negative association among controls (β = −0.087, p = 0.002). MDD cases with low SCZ PRS showed thinner RACC, although the opposite difference for high-SCZ-PRS cases was not significant. There were nominal interactions for other brain metrics, but none remained significant after correcting for multiple comparisons.
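The crossover reported above corresponds to a linear model with a PRS × case-status product term. A minimal sketch with simulated data (the variable names, sample size and effect sizes below are illustrative assumptions, not values from UK Biobank):

```python
import numpy as np

rng = np.random.default_rng(0)
n = 4000
prs = rng.standard_normal(n)       # standardized SCZ polygenic risk score
mdd = rng.integers(0, 2, n)        # 1 = MDD case, 0 = control

# Simulate cortical thickness with opposite PRS slopes in cases vs
# controls, mimicking the reported crossover interaction.
thickness = (2.5 - 0.087 * prs + 0.185 * prs * mdd
             + 0.05 * rng.standard_normal(n))

# Design matrix: intercept, PRS, case status, PRS x status interaction
X = np.column_stack([np.ones(n), prs, mdd, prs * mdd])
beta, *_ = np.linalg.lstsq(X, thickness, rcond=None)

print(beta)  # beta[1]: slope in controls; beta[3]: interaction term
```

The interaction coefficient (`beta[3]`) captures how the PRS–thickness slope differs between cases and controls; the case slope is recovered as `beta[1] + beta[3]`.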
Our significant results indicate that MDD case-control differences in RACC thickness vary as a function of SCZ PRS. Although this was not the case for most other brain measures assessed, our specific findings still provide some further evidence that MDD in the presence of high genetic risk for SCZ is subtly neurobiologically distinct from MDD in general.
Reductions in insulin sensitivity in periparturient dairy cows develop as a means to support lactation; however, excessive mobilization of fatty acids (FA) increases the risk for peripartal metabolic disorders. Our objectives were to investigate the effect of prepartum body condition score (BCS) on systemic glucose and insulin tolerance, and to compare direct and indirect measurements of insulin sensitivity in peripartal lean and overweight dairy cows. Fourteen multiparous Holstein cows were allocated into two groups according to their BCS at day −28 prepartum: lean (n = 7; BCS ≤ 3.0) or overweight (n = 7; BCS ≥ 4.0). Liver biopsies were performed on days −27, −14 and 4, relative to expected parturition. Intravenous insulin or glucose tolerance tests were performed following each liver biopsy. Relative to lean cows, overweight cows exhibited lower dry matter intake, lost more BCS, and displayed increased plasma FA and β-hydroxybutyrate concentrations and elevated liver lipid content during the peripartum period. Glucose clearance rate was lower for all cows postpartum. Prepartum BCS had minimal effects on insulin and glucose tolerance; however, the ability of the cow to restore blood glucose levels following an insulin challenge was suppressed by increased BCS. Glucose-dependent parameters of insulin and glucose tolerance were not correlated with surrogate indices of insulin sensitivity. We conclude that prepartum BCS had minimal effect on systemic insulin sensitivity following parturition. The observed inconsistency between surrogate indices of insulin sensitivity and direct measurements of insulin and glucose tolerance adds support to growing concerns regarding their usefulness as tools to estimate systemic insulin action in periparturient cows.
Determining infectious cross-transmission events in healthcare settings involves manual surveillance of case clusters by infection control personnel, followed by strain typing of clinical/environmental isolates suspected in said clusters. Recent advances in genomic sequencing and cloud computing now allow for the rapid molecular typing of infecting isolates.
To facilitate rapid recognition of transmission clusters, we aimed to assess infection control surveillance using whole-genome sequencing (WGS) of microbial pathogens to identify cross-transmission events for epidemiologic review.
Clinical isolates of Staphylococcus aureus, Enterococcus faecium, Pseudomonas aeruginosa, and Klebsiella pneumoniae were obtained prospectively at an academic medical center, from September 1, 2016, to September 30, 2017. Isolate genomes were sequenced, followed by single-nucleotide variant analysis; a cloud-computing platform was used for whole-genome sequence analysis and cluster identification.
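As a sketch of the cluster-identification step, pairwise SNV distances between aligned isolate genomes can be thresholded and merged by single linkage. The isolate names, toy sequences, and 2-SNV threshold below are illustrative assumptions, not the study's actual pipeline or cutoff:

```python
from itertools import combinations

def snv_distance(seq_a, seq_b):
    """Count single-nucleotide differences between two aligned sequences."""
    return sum(1 for a, b in zip(seq_a, seq_b) if a != b)

def cluster_isolates(genomes, max_snvs):
    """Single-linkage clustering via union-find: isolates within
    `max_snvs` SNVs of any member join the same putative cluster."""
    names = list(genomes)
    parent = {n: n for n in names}

    def find(x):
        while parent[x] != x:
            parent[x] = parent[parent[x]]  # path halving
            x = parent[x]
        return x

    for a, b in combinations(names, 2):
        if snv_distance(genomes[a], genomes[b]) <= max_snvs:
            parent[find(a)] = find(b)

    groups = {}
    for n in names:
        groups.setdefault(find(n), set()).add(n)
    return list(groups.values())

isolates = {                     # toy aligned core-genome fragments
    "SA01": "ACGTACGTAC",
    "SA02": "ACGTACGTAT",        # 1 SNV from SA01
    "SA03": "TTTTTTTTTT",        # unrelated
}
print(cluster_isolates(isolates, max_snvs=2))
```

With the toy threshold, SA01 and SA02 fall into one putative transmission cluster while SA03 remains a singleton.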
Most strains of the 4 studied pathogens were unrelated, and 34 potential transmission clusters were present. The characteristics of the potential clusters were complex and likely not identifiable by traditional surveillance alone. Notably, only 1 cluster had been suspected by routine manual surveillance.
Our work supports the assertion that integration of genomic and clinical epidemiologic data can augment infection control surveillance for both the identification of cross-transmission events and the inclusion of missed and exclusion of misidentified outbreaks (ie, false alarms). The integration of clinical data is essential to prioritize suspect clusters for investigation, and for existing infections, a timely review of both the clinical and WGS results holds promise for reducing healthcare-associated infections (HAIs). A richer understanding of cross-transmission events within healthcare settings will require the expansion of current surveillance approaches.
Studies were conducted to determine sweetpotato tolerance and Palmer amaranth control with a premix of flumioxazin and pyroxasulfone applied pretransplant (PREtr) followed by (fb) irrigation. Greenhouse studies were conducted in a factorial arrangement of four herbicide rates (flumioxazin/pyroxasulfone PREtr at 105/133 and 57/72 g ai ha–1, S-metolachlor PREtr at 803 g ai ha–1, and nontreated) by three irrigation timings [2, 5, and 14 d after transplanting (DAP)]. Field studies were conducted in a factorial arrangement of seven herbicide treatments (flumioxazin/pyroxasulfone PREtr at 40/51, 57/72, 63/80, and 105/133 g ha–1; 107 g ha–1 flumioxazin PREtr fb 803 g ha–1 S-metolachlor 7 to 10 DAP; and season-long weedy and weed-free checks) by three 1.9-cm irrigation timings (0 to 2, 3 to 5, or 14 DAP). In greenhouse studies, flumioxazin/pyroxasulfone reduced sweetpotato vine length and shoot and storage root fresh biomass compared to the nontreated check and S-metolachlor. Irrigation timing had no influence on vine length or root fresh biomass. In field studies, Palmer amaranth control was ≥91% season-long regardless of flumioxazin/pyroxasulfone rate or irrigation timing. At 38 DAP, sweetpotato injury was ≤37% and ≤9% at locations 1 and 2, respectively. Visual estimates of sweetpotato injury from flumioxazin/pyroxasulfone were greater when irrigation was delayed to 3 to 5 or 14 DAP (22 and 20%, respectively) compared to 0 to 2 DAP (7%) at location 1, but similar at location 2. Irrigation timing did not influence no. 1, jumbo, or marketable yields or root length-to-width ratio. With the exception of 105/133 g ha–1, all rates of flumioxazin/pyroxasulfone resulted in marketable sweetpotato yield and root length-to-width ratio similar to flumioxazin fb S-metolachlor or the weed-free checks.
In conclusion, flumioxazin/pyroxasulfone PREtr at 40/51, 57/72, and 63/80 g ha–1 has potential for use in sweetpotato for Palmer amaranth control without causing significant crop injury and yield reduction.
Field studies were conducted in 2015 and 2016 in North Carolina to determine the response of ‘Covington’ and ‘Murasaki-29’ sweetpotato cultivars to four rates of linuron (420, 560, 840, and 1,120 g ai ha–1) alone or with S-metolachlor (803 g ai ha–1) applied 7 or 14 d after transplanting (DAP). Injury (chlorosis/necrosis and stunting) to both cultivars was greater when linuron was applied with S-metolachlor as compared to linuron applied alone. Herbicide application at 14 DAP caused greater injury (chlorosis/necrosis and stunting) to both cultivars than when applied at 7 DAP. At 4 wk after treatment (WAT), stunting of Covington and Murasaki-29 (hereafter Murasaki) from linuron at 420 to 1,120 g ha–1 increased from 27% to 50% and 25% to 53%, respectively. At 7 or 8 WAT, crop stunting of 8% or less and 0% was observed in Covington and Murasaki, respectively, regardless of application rate and timing. Murasaki root yields were similar in the linuron alone or with S-metolachlor treatments, and were lower than the nontreated check. In 2016, no. 1 and marketable sweetpotato yields of Covington were similar for the nontreated check, linuron alone, or linuron plus S-metolachlor treatments, but not in 2015. Decreases in no. 1 and marketable root yields were observed when herbicides were applied 14 DAP compared to 7 DAP for Covington in 2015 and for Murasaki in both years. No. 1 and marketable yields of Covington were similar for 420 to 1,120 g ha–1 linuron and nontreated check except marketable root yields in 2015. No. 1 and marketable sweetpotato yields of Murasaki decreased as application rates increased.
Treatment for hoarding disorder is typically performed by mental health professionals, potentially limiting access to care in underserved areas.
We aimed to conduct a non-inferiority trial of group peer-facilitated therapy (G-PFT) and group psychologist-led cognitive–behavioural therapy (G-CBT).
We randomised 323 adults with hoarding disorder to 15 weeks of G-PFT or 16 weeks of G-CBT and assessed them at baseline, post-treatment and longitudinally (≥3 months post-treatment: mean 14.4 months, range 3–25). Predictors of treatment response were examined.
G-PFT (effect size 1.20) was as effective as G-CBT (effect size 1.21; between-group difference 1.82 points, t = −1.71, d.f. = 245, P = 0.04). More homework completion and ongoing help from family and friends resulted in lower severity scores at longitudinal follow-up (t = 2.79, d.f. = 175, P = 0.006; t = 2.89, d.f. = 175, P = 0.004).
Peer-led groups were as effective as psychologist-led groups, providing a novel treatment avenue for individuals without access to mental health professionals.
Declaration of interest
C.A.M. has received grant funding from the National Institutes of Health (NIH) and travel reimbursement and speakers’ honoraria from the Tourette Association of America (TAA), as well as honoraria and travel reimbursement from the NIH for serving as an NIH Study Section reviewer. K.D. receives research support from the NIH and honoraria and travel reimbursement from the NIH for serving as an NIH Study Section reviewer. R.S.M. receives research support from the National Institute of Mental Health, National Institute of Aging, the Hillblom Foundation, Janssen Pharmaceuticals (research grant) and the Alzheimer's Association. R.S.M. has also received travel support from the National Institute of Mental Health for Workshop participation. J.Y.T. receives research support from the NIH, Patient-Centered Outcomes Research Institute and the California Tobacco Related Research Program, and honoraria and travel reimbursement from the NIH for serving as an NIH Study Section reviewer. All other authors report no conflicts of interest.
Predictive analytics in health is a complex, transdisciplinary field requiring collaboration across diverse scientific and stakeholder groups. We piloted a participatory research approach to foster team science in predictive analytics through a partnered symposium and funding competition. In total, 85 stakeholders were engaged across diverse translational domains, with a significant increase in the perceived importance of early inclusion of patients and communities in research. Participatory research approaches may be an effective model for engaging broad stakeholder groups in predictive analytics.
The unique phenotypic and genetic aspects of obsessive-compulsive disorder (OCD) and attention-deficit/hyperactivity disorder (ADHD) among individuals with Tourette syndrome (TS) are not well characterized. Here, we examine symptom patterns and heritability of OCD and ADHD in TS families.
OCD and ADHD symptom patterns were examined in TS patients and their family members (N = 3494) using exploratory factor analyses (EFA) for OCD and ADHD symptoms separately, followed by latent class analyses (LCA) of the resulting OCD and ADHD factor sum scores jointly; heritability and clinical relevance of the resulting factors and classes were assessed.
EFA yielded a 2-factor model for ADHD and an 8-factor model for OCD. Both ADHD factors (inattentive and hyperactive/impulsive symptoms) were genetically related to TS, ADHD, and OCD. The doubts, contamination, need for sameness, and superstitions factors were genetically related to OCD, but not ADHD or TS; symmetry/exactness and fear-of-harm were associated with TS and OCD while hoarding was associated with ADHD and OCD. In contrast, aggressive urges were genetically associated with TS, OCD, and ADHD. LCA revealed a three-class solution: few OCD/ADHD symptoms (LC1), OCD & ADHD symptoms (LC2), and symmetry/exactness, hoarding, and ADHD symptoms (LC3). LC2 had the highest psychiatric comorbidity rates (⩾50% for all disorders).
Symmetry/exactness, aggressive urges, fear-of-harm, and hoarding show complex genetic relationships with TS, OCD, and ADHD, and, rather than being specific subtypes of OCD, transcend traditional diagnostic boundaries, perhaps representing an underlying vulnerability (e.g. failure of top-down cognitive control) common to all three disorders.
Individuals with childhood-onset coronary artery anomalies are at increased risk of lifelong complications. Although pregnancy is thought to confer additional risk, few data are available regarding outcomes in this group of women. We sought to define outcomes of pregnancy in this unique population.
We performed a retrospective survey of women with paediatric-onset coronary anomalies and pregnancy in our institution, combined with a systematic review of published cases. We defined paediatric-onset coronary artery anomalies as congenital coronary anomalies and inflammatory arteriopathies of childhood that cause coronary aneurysms. Major cardiovascular events were defined as pulmonary oedema, sustained arrhythmia requiring treatment, stroke, myocardial infarction, cardiac arrest, or death.
A total of 25 surveys were mailed, and 20 were returned (80% response rate). We included 46 articles from the literature, which described cardiovascular outcomes in 82 women (138 pregnancies). These data were amalgamated for a total of 102 women and 194 pregnancies; 59% of women were known to have paediatric-onset coronary artery anomalies before pregnancy. In 23%, the anomaly was unmasked during or shortly after pregnancy. The remaining 18% were diagnosed later in life. Major cardiovascular events occurred in 14 women (14%) and included heart failure (n=5, 5%), myocardial infarction (n=7, 7%), maternal death (n=2, 2%), cardiac arrest secondary to ventricular fibrillation (n=1, 1%), and stroke (n=1, 1%). The majority of maternal events (13/14, 93%) occurred in women with no previous diagnosis of coronary disease.
Women with paediatric-onset coronary artery anomalies have a 14% risk of adverse cardiovascular events in pregnancy, indicating the need for careful assessment and close follow-up. Prospective, multicentre studies are required to better define risk and predictors of complications during pregnancy.
This study evaluated the psychometric properties of the Strengths and Difficulties Questionnaire Self-Report (SDQ-S) in South African adolescents, and compared findings with data from the UK, Australia and China.
A sample of 3451 South African adolescents in grade 8, the first year of secondary school (mean age = 13.7 years), completed the SDQ-S in Afrikaans, English or isiXhosa. Means, group differences and internal consistency were analysed using SPSS V22, and confirmatory factor analyses were conducted using Mplus V7.
In the South African sample, significant gender differences were found for four of the five sub-scale means and for total difficulties, but gender differences in alpha scores were negligible. The internal consistency of the total difficulties, prosocial behaviour and emotional symptoms sub-scales was fair. UK cut-off values for caseness (set to identify the top 10% of scores in a UK sample) led to a higher proportion of South African adolescents classified in the ‘abnormal’ range on emotional and peer difficulties and a lower proportion classified in the ‘abnormal’ range for hyperactivity. South African cut-offs were therefore generated. The cross-country comparison with UK, Australian and Chinese data showed that South African adolescent boys and girls had the highest mean scores on total difficulties as well as on the subscales of emotional symptoms and conduct problems. In contrast, South African boys and girls had the lowest mean scores for hyperactivity/inattention. UK boys and girls had the highest mean scores for hyperactivity/inattention, while the Australian sample had the highest scores for prosocial behaviours. Chinese boys had the highest peer problem mean scores, and Chinese boys and girls had the lowest means on prosocial behaviours. Confirmatory factor analyses showed significant item loadings, with loadings higher than 0.40 for the emotional and prosocial behaviour sub-scales on the five-factor model, but not for all relevant items on the other three domains.
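The internal-consistency results above rest on Cronbach's alpha. A minimal sketch of the computation (the toy item matrix is invented for illustration; it mimics the SDQ's 0–2 item scoring but is not study data):

```python
import numpy as np

def cronbach_alpha(items):
    """Cronbach's alpha for an (n_respondents x k_items) score matrix:
    alpha = k/(k-1) * (1 - sum(item variances) / variance(total score))."""
    items = np.asarray(items, dtype=float)
    k = items.shape[1]
    item_vars = items.var(axis=0, ddof=1).sum()
    total_var = items.sum(axis=1).var(ddof=1)
    return (k / (k - 1.0)) * (1.0 - item_vars / total_var)

# Toy 5-item sub-scale scores for 6 respondents (illustrative only)
scores = [[0, 1, 1, 0, 1],
          [2, 2, 1, 2, 2],
          [1, 1, 1, 1, 0],
          [0, 0, 0, 0, 1],
          [2, 1, 2, 2, 2],
          [1, 2, 1, 1, 1]]
print(round(cronbach_alpha(scores), 3))
```

Values near 0.7 or above are conventionally read as acceptable internal consistency; "fair" sub-scales sit below that.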
Findings support the potential usefulness of the SDQ-S in a South African setting, but suggest that the SDQ-S should not be used with UK cut-off values, and indicate the need for further validation and standardisation work in South African adolescents. We recommend that in-country cut-offs for ‘caseness’ should be used for clinical purposes in South Africa, that cross-country comparisons should be made with caution, and that further examination of naturalistic clusters and factors of the SDQ should be performed in culturally and contextually diverse settings.
Evidence from the Ross embayment, Antarctica, suggests an abrupt cooling and a concomitant increase in sea-ice cover at about 6000 BP (6 ka). Stable-isotope (δD) concentrations in the Taylor Dome ice core, at the western edge of the Ross embayment, decline rapidly after 6 ka, and continue to decline through the late Holocene. Methanesulfonic acid concentrations at Taylor Dome show opposite trends to δD. Sediment cores from the western Ross Sea show a percentage minimum for the sea-ice diatom Fragilariopsis curta between 9 and 6 ka, when Taylor Dome δD values are highest, followed by an increase through the late Holocene. Radiocarbon dates from raised beach deposits indicate that the retreat of ice shelves in the Ross embayment ceased at about 6 ka, coincident with the environmental changes inferred from the sediment and ice-core records. The similarity in timing suggests an important role for climate in controlling the evolution of ice-shelf margins following the end of the last glaciation.
Genetic–epidemiological studies that estimate the contributions of genetic factors to variation in tic symptoms are scarce. We estimated the extent to which genetic and environmental influences contribute to tics, employing various phenotypic definitions ranging between mild and severe symptomatology, in a large population-based adult twin-family sample.
In an extended twin-family design, we analysed lifetime tic data reported by adult mono- and dizygotic twins (n = 8323) and their family members (n = 7164; parents and siblings) from 7311 families in the Netherlands Twin Register. We measured tics by the abbreviated version of the Schedule for Tourette and Other Behavioral Syndromes. Heritability was estimated by genetic structural equation modeling for four tic disorder definitions: three dichotomous and one trichotomous phenotype, characterized by increasingly strictly defined criteria.
Prevalence rates of the different tic disorders in our sample varied between 0.3 and 4.5% depending on tic disorder definition. Tic frequencies decreased with increasing age. Heritability estimates varied between 0.25 and 0.37, depending on phenotypic definitions. None of the phenotypes showed evidence of assortative mating, effects of shared environment or non-additive genetic effects.
Heritabilities of mild and severe tic phenotypes were estimated to be moderate. Overlapping confidence intervals of the heritability estimates suggest overlapping genetic liabilities between the various tic phenotypes. The most lenient phenotype (defined only by tic characteristics, excluding criteria B, C and D of DSM-IV) rendered sufficiently reliable heritability estimates. These findings have implications in phenotypic definitions for future genetic studies.
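The heritability estimates above come from genetic structural equation modeling; as a rough back-of-envelope counterpart, the classic Falconer formulas recover ACE components directly from MZ and DZ twin correlations. The correlations below are illustrative, chosen only so the result lands in the reported 0.25–0.37 range:

```python
def falconer_ace(r_mz, r_dz):
    """Classic Falconer decomposition from twin correlations.
    Returns additive genetic (A), shared-environment (C) and
    unique-environment (E) variance components. A crude approximation
    to full structural-equation twin modeling; inputs here are
    illustrative, not the study's estimates."""
    a2 = 2.0 * (r_mz - r_dz)   # A = 2(rMZ - rDZ)
    c2 = 2.0 * r_dz - r_mz     # C = 2rDZ - rMZ
    e2 = 1.0 - r_mz            # E = 1 - rMZ
    return a2, c2, e2

# e.g. twin correlations of 0.30 (MZ) and 0.15 (DZ):
a2, c2, e2 = falconer_ace(0.30, 0.15)
print(a2, c2, e2)   # heritability ~0.30, no shared-environment effect
```

Note the toy C component comes out at zero, consistent with the abstract's finding of no shared-environment effects.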
Goats make up the largest group of ruminant livestock in Nigeria and are strategic in bridging the animal protein supply gap and improving the economy of rural households. The hypervariable region 1 (HVR1) of the caprine mitochondrial genome was investigated to better understand genetic diversity important for improving selection in animal breeding and conservation programs. We sequenced and analysed the mitochondrial DNA (mtDNA) HVR1 in 291 unrelated indigenous Nigerian goats (West African Dwarf (WAD), Red Sokoto (RSO) and Sahel (SAH)), randomly sampled from around the country, and compared them with the HVR1 sequences of 336 Indian goats and 12 other sequences from five different species in the genus Capra (C. falconeri, C. ibex nubiana, C. aegagrus, C. cylindricornis and C. sibirica). A total of 139 polymorphic sites from 291 individuals were captured in 204 haplotypes. Within- and among-population variation was 77.25% and 22.74%, respectively. Nigerian goats showed high genetic diversity (0.87) and high FST values, separating them from Indian goats and the other wild species. Haplogroups in WAD separate it from RSO and SAH, concomitant with a different demographic history. Clear genetic structure was found among Nigerian goat breeds, with appreciable variation in the mtDNA HVR1 region. This study grouped Nigerian goat breeds into two major groups, suggesting two different demographic origins for the Northern and Southern breeds. The high degree of genetic admixture denotes different maternal origins, in contrast to evidence from goats of the Levant and Central Asia, where goats were originally domesticated.
The removal of organics by photoelectrocatalytic oxidation offers a viable option for removing contaminants at low concentrations. In this paper, we propose BiVO4 thin films synthesized via spray pyrolysis for the photoelectrocatalytic oxidation of phenol under solar light. We compare the properties of BiVO4 with those of the commonly used photocatalyst TiO2. In addition, BiVO4 films with gradient W doping were fabricated and tested to improve the photocatalytic performance of BiVO4. X-ray diffraction, atomic force microscopy, incident photon-to-current efficiency and spectrophotometry measurements were conducted on BiVO4 films of different thicknesses, as well as on TiO2. Electrochemical impedance spectroscopy and dark conductivity measurements were also conducted. Phenol removal was measured for both the TiO2 and BiVO4 samples. The best performance was found for a 300 nm undoped BiVO4 film, which reduced the phenol concentration to 30.0% of its initial value in four hours.
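The final figure above implies an apparent degradation rate. Assuming pseudo-first-order kinetics, C(t) = C0·exp(−kt) — an assumption of this sketch, not a claim of the paper — the drop to 30.0% of the initial phenol concentration over four hours corresponds to:

```python
import math

C_ratio = 0.30   # final / initial phenol concentration (from the abstract)
t_hours = 4.0    # irradiation time (from the abstract)

# Pseudo-first-order assumption: C(t) = C0 * exp(-k t)
k = -math.log(C_ratio) / t_hours   # apparent rate constant, h^-1
half_life = math.log(2.0) / k      # time to halve the concentration, h

print(f"k = {k:.3f} h^-1, t_half = {half_life:.2f} h")
```

This kind of back-calculation gives a single comparable number (k, or the half-life) for ranking photocatalyst performance across samples.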
Deciphering the folding pathways and predicting the structures of complex three-dimensional biomolecules is central to elucidating biological function. RNA is single-stranded, which gives it the freedom to fold into complex secondary and tertiary structures. These structures endow RNA with the ability to perform complex chemistries and functions ranging from enzymatic activity to gene regulation. Given that RNA is involved in many essential cellular processes, it is critical to understand how it folds and functions in vivo. Within the last few years, methods have been developed to probe RNA structures in vivo and genome-wide. These studies reveal that RNA often adopts very different structures in vivo and in vitro, and provide profound insights into RNA biology. Nonetheless, both in vitro and in vivo approaches have limitations: studies in the complex and uncontrolled cellular environment make it difficult to obtain insight into RNA folding pathways and thermodynamics, and studies in vitro often lack direct cellular relevance, leaving a gap in our knowledge of RNA folding in vivo. This gap is being bridged by biophysical and mechanistic studies of RNA structure and function under conditions that mimic the cellular environment. To date, most artificial cytoplasms have used various polymers as molecular crowding agents and a series of small molecules as cosolutes. Studies under such in vivo-like conditions are yielding fresh insights, such as cooperative folding of functional RNAs and increased activity of ribozymes. These observations are accounted for in part by molecular crowding effects and interactions with other molecules. In this review, we report milestones in RNA folding in vitro and in vivo and discuss ongoing experimental and computational efforts to bridge the gap between these two conditions in order to understand how RNA folds in the cell.
Fall panicum is the most troublesome annual grass weed in sugarcane in Florida. The critical timing of fall panicum removal in sugarcane, or the maximum amount of early season interference that sugarcane can tolerate before it suffers irrecoverable yield loss, is not known. Field studies were conducted from 2012 to 2015 in Belle Glade, FL to determine the critical timing of fall panicum removal and season-long interference in sugarcane. The effect of season-long fall panicum interference and the critical timing of removal based on 5 and 10% acceptable yield loss (AYL) levels were determined by fitting a log-logistic equation to percentage millable stalk, cane, and sucrose yield loss data. Millable stalk, cane, and sucrose yields decreased as the duration of fall panicum interference increased. Season-long interference of fall panicum resulted in 34 to 60%, 34 to 62%, and 44 to 60% millable stalk, cane, and sucrose yield loss, respectively. The critical timing of fall panicum removal based on 5 and 10% AYL for millable stalks was 5 to 9 wk after sugarcane emergence (WAE). At 5 and 10% AYL, the critical timing of fall panicum removal ranged from 5 to 9 WAE and 6 to 8 WAE for cane and sucrose yield loss, respectively. These results show that fall panicum is competitive with sugarcane early in the season, demonstrating the need for timely early-season control to reduce negative effects on yield.
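Deriving a critical removal time from a fitted log-logistic curve is a closed-form inversion: with YL(t) = D / (1 + exp[b(ln t − ln e)]), setting YL equal to the acceptable yield loss gives t = e·(D/AYL − 1)^(1/b). The parameter values below are illustrative placeholders, not the fitted values from this study:

```python
def critical_removal_time(D, b, e, ayl):
    """Invert the log-logistic yield-loss curve
    YL(t) = D / (1 + exp(b * (ln t - ln e)))
    for the interference duration t at which loss reaches `ayl`.
    D: upper yield-loss limit (%), b: slope (negative for a curve
    rising with t), e: inflection point (wk), ayl: acceptable loss (%)."""
    return e * (D / ayl - 1.0) ** (1.0 / b)

# Illustrative parameters only (NOT the study's fitted values):
D, b, e = 55.0, -2.0, 12.0
t5 = critical_removal_time(D, b, e, 5.0)    # removal deadline at 5% AYL
t10 = critical_removal_time(D, b, e, 10.0)  # later deadline at 10% AYL
print(round(t5, 2), round(t10, 2))
```

As expected, tolerating a larger yield loss pushes the removal deadline later, matching the pattern of the 5% vs 10% AYL windows reported above.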
Obsessive-compulsive disorder (OCD) is associated with an abnormally large error-related negativity (ERN), an electrophysiological measure of error monitoring in response to performance errors, but it is unclear if hoarding disorder (HD) also shows this abnormality. This study aimed to determine whether the neurophysiological mechanisms underlying error monitoring are similarly compromised in HD and OCD.
We used a visual flanker task to assess ERN in response to performance errors in 14 individuals with HD, 27 with OCD, 10 with HD+OCD, and 45 healthy controls (HC). Age-corrected performance and ERN amplitudes were examined using analyses of variance and planned pairwise group comparisons.
A main effect of hoarding on ERN (p = 0.031) was observed, indicating ERN amplitudes were attenuated in HD relative to non-HD subjects. A group × age interaction effect on ERN was also evident. In HD-positive subjects, ERN amplitude deficits were significantly greater in younger individuals (r = −0.479, p = 0.018), whereas there were no significant ERN changes with increasing age in OCD and HC participants.
The reduced ERN in HD relative to OCD and HC provides evidence that HD is neurobiologically distinct from OCD, and suggests that deficient error monitoring may be a core pathophysiological feature of HD. This effect was particularly prominent in younger HD participants, further suggesting that deficient error monitoring manifests most strongly early in the illness course and/or in individuals with a relatively early illness onset.