Sleep quantity and quality are associated with executive function (EF) in experimental studies, and in individuals with sleep disorders. With advancing age, sleep quantity and quality decline, as does the ability to perform EF tasks, suggesting that sleep disruption may contribute to age-related EF declines. This cross-sectional cohort study tested the hypothesis that poorer sleep quality (i.e., the frequency and duration of awakenings) and/or quantity may partly account for age-related EF deficits.
Community-dwelling older adults (N = 184) completed actigraphic sleep monitoring followed by a range of EF tasks. Two EF factors were extracted using exploratory structural equation modeling. Sleep variables did not mediate the relationship between age and EF factors. Post hoc moderated mediation analyses were conducted to test whether cognitive reserve compensates for sleep-related EF deficits, using years of education as a proxy measure of cognitive reserve.
We found a significant interaction between cognitive reserve and the number and frequency of awakenings, explaining a small (approximately 3%), but significant amount of variance in EF. Specifically, in individuals with fewer than 11 years of education, greater sleep disturbance was associated with poorer EF, but sleep did not impact EF in those with more education. There was no association between age and sleep quantity.
This study highlights the role of cognitive reserve in the sleep–EF relationship, suggesting individuals with greater cognitive reserve may be able to counter the impact of disturbed sleep on EF. Therefore, improving sleep may confer some protection against EF deficits in vulnerable older adults.
Optical tracking systems typically trade off between astrometric precision and field of view. In this work, we showcase a networked approach to optical tracking, using very wide field-of-view imagers with relatively low astrometric precision, applied to the scheduled OSIRIS-REx slingshot manoeuvre around Earth on 22 Sep 2017. As part of a trajectory designed to take OSIRIS-REx to NEO 101955 Bennu, this flyby event was viewed from 13 remote sensors spread across Australia and New Zealand to enable triangulation of the observations. Each observatory in this portable network was constructed to be as lightweight and portable as possible, with hardware based on the successful design of the Desert Fireball Network. Over a 4-h collection window, we gathered 15 439 images of the night sky in the predicted direction of the OSIRIS-REx spacecraft. Using a specially developed streak detection and orbit determination data pipeline, we detected 2 090 line-of-sight observations. Our fitted orbit was determined to be within about 10 km of orbital telemetry along the observed 109 262 km length of the OSIRIS-REx trajectory, demonstrating the impressive capability of a networked approach to Space Surveillance and Tracking.
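The orbit fit described above rests on combining many simultaneous line-of-sight observations from spatially separated stations. As an illustration only (not the study's actual pipeline), triangulating a single point from several station/direction pairs by least squares can be sketched as:

```python
import numpy as np

def triangulate(stations, directions):
    """Least-squares point closest to several lines of sight.

    stations: (N, 3) observatory positions; directions: (N, 3)
    line-of-sight vectors. Each line constrains the target to lie on
    p = station + t * direction; we solve for the point minimising the
    summed squared perpendicular distance to all lines.
    """
    A = np.zeros((3, 3))
    b = np.zeros(3)
    for s, d in zip(stations, directions):
        d = d / np.linalg.norm(d)
        P = np.eye(3) - np.outer(d, d)  # projector perpendicular to the line
        A += P
        b += P @ s
    return np.linalg.solve(A, b)

# Two stations observing a target at (0, 0, 100); numbers are illustrative.
target = np.array([0.0, 0.0, 100.0])
stations = np.array([[-50.0, 0.0, 0.0], [50.0, 0.0, 0.0]])
dirs = target - stations
print(triangulate(stations, dirs))  # ≈ [0, 0, 100]
```

In practice each observatory contributes a time series of bearings rather than one static sighting, and the fit is over orbital elements rather than a single point, but the geometric principle is the same.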
Shock control bumps can be used to control and weaken the shock waves that form on engine intakes at high angles of attack. In this paper, it is demonstrated how shock control bumps applied to an engine intake can reduce or eliminate shock-induced separation at high incidence, and also increase the incidence at which critical separation occurs. Three-dimensional Reynolds-averaged Navier–Stokes (RANS) simulations are used to model the flow through a large civil aircraft engine intake at high incidence. The variation in shock strength and separation with incidence is first studied, along with the flow distribution around the nacelle. An optimisation process is then employed to design shock control bumps that reduce shock strength and separation at a fixed high incidence condition. The bump geometry is allowed to vary in shape, size, streamwise position and circumferential direction around the nacelle. This is shown to be key to the success of the shock control geometry. A further step is then taken, using the optimisation methodology to design bumps that can increase the incidence at which critical separation occurs. It is shown that, by using this approach, the operating range of the engine intake can be increased by at least three degrees.
Gravitational waves from coalescing neutron stars encode information about nuclear matter at extreme densities, inaccessible by laboratory experiments. The late inspiral is influenced by the presence of tides, which depend on the neutron star equation of state. Neutron star mergers are expected to often produce rapidly rotating remnant neutron stars that emit gravitational waves. These will provide clues to the extremely hot post-merger environment. This signature of nuclear matter in gravitational waves contains most information in the 2–4 kHz frequency band, which is outside of the most sensitive band of current detectors. We present the design concept and science case for a Neutron Star Extreme Matter Observatory (NEMO): a gravitational-wave interferometer optimised to study nuclear physics with merging neutron stars. The concept uses high circulating laser power, quantum squeezing, and a detector topology specifically designed to achieve the high-frequency sensitivity necessary to probe nuclear matter using gravitational waves. Above 1 kHz, the proposed strain sensitivity is comparable to full third-generation detectors at a fraction of the cost. Such sensitivity changes the expected event rates for detection of post-merger remnants from approximately one per few decades with two A+ detectors to a few per year, and would potentially allow for the first gravitational-wave observations of supernovae, isolated neutron stars, and other exotica.
Maintaining nutritional adequacy contributes to successful ageing. B vitamins involved in one-carbon metabolism regulation (folate, riboflavin, vitamins B6 and B12) are critical nutrients contributing to homocysteine and epigenetic regulation. Although cross-sectional B vitamin intake in ageing populations is characterised, longitudinal changes are infrequently reported. This systematic review explores age-related changes in dietary adequacy of folate, riboflavin, vitamins B6 and B12 in community-dwelling older adults (≥65 years at follow-up). Following Preferred Reporting Items for Systematic Reviews and Meta-Analyses guidelines, databases (MEDLINE, Embase, BIOSIS, CINAHL) were systematically screened, yielding 1579 records; eight studies were included (n 3119 participants, 2–25 years of follow-up). Quality assessment (modified Newcastle–Ottawa quality scale) rated all studies as being of moderate–high quality. The estimated average requirement (EAR) cut-point method was used to estimate the baseline and follow-up population prevalence of dietary inadequacy. Riboflavin inadequacy (seven studies, n 1953) progressively increased with age; the prevalence of inadequacy increased from baseline by up to 22·6 and 9·3 % in males and females, respectively. Dietary folate adequacy (three studies, n 2321) improved in two studies (by up to 22·4 %), but the third showed increasing (8·1 %) inadequacy. Evidence was similarly limited (two studies, respectively) and inconsistent for vitamins B6 (n 559; −9·9 to 47·9 %) and B12 (n 1410; −4·6 to 7·2 %). This review emphasises the scarcity of evidence regarding micronutrient intake changes with age, highlighting the demand for improved reporting of longitudinal changes in nutrient intake that can better direct micronutrient recommendations for older adults. This review was registered with PROSPERO (CRD42018104364).
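The EAR cut-point method used above reduces to a simple proportion: the prevalence of inadequacy is estimated as the fraction of individuals whose usual intake falls below the estimated average requirement. A minimal sketch, with purely illustrative intakes and a hypothetical EAR value (not figures from the review):

```python
def ear_cutpoint_prevalence(intakes, ear):
    """EAR cut-point method: estimate the population prevalence of
    dietary inadequacy as the percentage of individuals whose usual
    intake falls below the Estimated Average Requirement (EAR)."""
    below = sum(1 for x in intakes if x < ear)
    return 100.0 * below / len(intakes)

# Illustrative riboflavin intakes (mg/day) against a hypothetical EAR of 1.1
baseline = [1.5, 1.3, 0.9, 1.8, 1.0, 1.4]
follow_up = [1.2, 1.0, 0.8, 1.6, 0.9, 1.1]
print(ear_cutpoint_prevalence(baseline, 1.1))   # 2 of 6 below the EAR
print(ear_cutpoint_prevalence(follow_up, 1.1))  # 3 of 6 below the EAR
```

Comparing the baseline and follow-up percentages gives the kind of change-in-prevalence figure reported in the review.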
The Tailored Activity Program (TAP) is an evidence-based occupational therapist-led intervention for people living with dementia and their care partners at home, developed in the USA. This study sought to understand its acceptability to people living with dementia, their care partners, and health professionals, and factors that might influence willingness to participate prior to its implementation in Australia.
This study used qualitative descriptive methods. Semi-structured interviews were conducted with people living with dementia in the community (n = 4), their care partners (n = 13), and health professionals (n = 12). People living with dementia were asked about health professionals coming to their home to help them engage in activities they enjoy, whereas care partners’ and health professionals’ perspectives of TAP were sought, after it was described to them. Interviews were conducted face-to-face or via telephone. All interviews were recorded and transcribed. Framework analysis was used to identify key themes.
Analysis identified four key themes labelled: (i) TAP sounds like a good idea; (ii) the importance of enjoyable activities; (iii) benefits for care partners; and (iv) weighing things up. Findings suggest the broad, conditional acceptability of TAP from care partners and health professionals, who also recognised challenges to its use. People living with dementia expressed willingness to receive help to continue engaging in enjoyable activities, if offered.
While TAP appeared generally acceptable, a number of barriers were identified that must be considered prior to, and during its implementation. This study may inform implementation of non-pharmacological interventions more broadly.
Basal melt of ice shelves is not only an important part of Antarctica's ice sheet mass budget, but it is also the origin of platelet ice, one of the most distinctive types of sea ice. In many coastal Antarctic regions, ice crystals form and grow in supercooled plumes of Ice Shelf Water. They usually rise towards the surface, becoming trapped under an ice shelf as marine ice or forming a semi-consolidated layer, known as the sub-ice platelet layer, below an overlying sea ice cover. In the latter, sea ice growth consolidates loose crystals to form incorporated platelet ice. These phenomena have numerous and profound impacts on the physical properties, biological processes and biogeochemical cycles associated with Antarctic fast ice: platelet ice contributes to sea ice mass balance and may indicate the extent of ice-shelf basal melting. It can also host a highly productive and uniquely adapted ecosystem. This paper clarifies the terminology and reviews platelet ice formation, observational methods as well as the geographical and seasonal occurrence of this ice type. The physical properties and ecological implications are presented in a way understandable for physicists and biologists alike, thereby providing the background for much needed interdisciplinary research on this topic.
Background: Carbapenemase-producing Enterobacterales (CPE) have rapidly become a global health concern and are associated with substantial morbidity and mortality due to limited treatment options. Travel to endemic areas, especially healthcare exposure in these areas, is an important risk factor for acquisition. We describe the evolving epidemiology, molecular features, and outcomes of CPE in Canada through surveillance by the Canadian Nosocomial Infection Surveillance Program (CNISP). Methods: CNISP has conducted surveillance for CPE among inpatients and outpatients of all ages since 2010. Participating acute-care facilities submit eligible specimens to the National Microbiology Laboratory for detection of carbapenemase production, and epidemiological data are collected. Incidence rates per 10,000 patient days are calculated based on inpatient data. Results: In total, 59 CNISP hospitals in 10 Canadian provinces representing 21,789 beds and 6,785,013 patient days participated in this surveillance. From 2010 to 2018, 118 (26%) CPE-infected and 547 (74%) CPE-colonized patients were identified. Few pediatric cases were identified (n = 18). Infection incidence rates remain low and stable (0.02 per 10,000 patient days in 2010 to 0.03 per 10,000 patient days in 2018), and colonization incidence rates have increased by 89% over the surveillance period. Overall, 92% of cases were acquired in a healthcare facility: 61% (n = 278) in a Canadian healthcare facility and 31% (n = 142) in a healthcare facility outside Canada. Of the 8% of cases not acquired in a healthcare facility, 50% (16 of 32) reported travel outside of Canada in the 12 months prior to positive culture. The distribution of carbapenemases varied by region; New Delhi metallo-β-lactamase (NDM) was dominant (59%) in western Canada and Klebsiella pneumoniae carbapenemase (KPC) (66%) in central Canada.
NDM and class D carbapenemase OXA-48 were more commonly identified among those who traveled outside of Canada, whereas KPC was more commonly identified among patients without travel. In addition, 30-day all-cause mortality was 14% (25 of 181) among CPE-infected patients and 32% (14 of 44) among those with bacteremia. Conclusions: CPE rates remain low in Canada; however, national surveillance data suggest that the increase in CPE in Canada is now being driven by local nosocomial transmission as well as travel and healthcare within endemic areas. Changes in screening practices may have contributed to the increase in colonizations; however, these data are currently lacking and will be collected moving forward. These data highlight the need to intensify surveillance and coordinate infection control measures to prevent further spread of CPE in Canadian acute-care hospitals.
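The incidence rates quoted in this surveillance report are straightforward to compute from case counts and patient-day denominators. A minimal sketch (the case count below is an illustrative back-calculation, not a figure from the surveillance data):

```python
def incidence_rate(cases, patient_days, per=10_000):
    """Incidence rate expressed per `per` patient days."""
    return cases / patient_days * per

# Order-of-magnitude check against the abstract: roughly 20 infections
# over ~6.8 million patient days gives a rate near 0.03 per 10,000
# patient days (illustrative back-calculation only).
print(round(incidence_rate(20, 6_785_013), 3))  # → 0.029
```

The same function with `per=1_000` and line-day denominators gives the device-associated rates used in the CLABSI report below.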
Susy Hota reports contracted research for Finch Therapeutics. Allison McGeer reports funds to her institution for projects for which she is the principal investigator from Pfizer and Merck, as well as consulting fees from the following companies: Sanofi-Pasteur, Sunovion, GSK, Pfizer, and Cidara.
Background: Nosocomial central-line–associated bloodstream infections (CLABSIs) are an important cause of morbidity and mortality in hospitalized patients. CLABSI surveillance establishes rates for internal and external comparison, identifies risk factors, and allows assessment of interventions. Objectives: To determine the frequency of CLABSIs among adult patients admitted to intensive care units (ICUs) in CNISP hospitals and evaluate trends over time. Methods: CNISP is a collaborative effort of the Canadian Hospital Epidemiology Committee, the Association of Medical Microbiology and Infectious Disease (AMMI) Canada and the Public Health Agency of Canada. Since 1995, CNISP has conducted hospital-based sentinel surveillance of healthcare-associated infections. Overall, 55 CNISP hospitals participated in ≥1 year of CLABSI surveillance. Adult ICUs are categorized as mixed ICUs or cardiovascular (CV) surgery ICUs. Data were collected using standardized definitions and collection forms. Line-day denominators for each participating ICU were collected. Negative-binomial regression was used to test for linear trends, with robust standard errors to account for clustering by hospital. We used the Fisher exact test to compare binary variables. Results: Each year, 28–42 adult ICUs participated in surveillance (27–37 mixed, 6–8 CV surgery). In both mixed ICUs and CV-ICUs, rates remained relatively stable between 2011 and 2018 (Fig. 1). In mixed ICUs, CLABSI rates were 1.0 per 1,000 line days in 2011 and 1.0 per 1,000 line days in 2018 (test for linear trend, P = .66). In CV-ICUs, CLABSI rates were 1.1 per 1,000 line days in 2011 and 0.8 per 1,000 line days in 2018 (P = .19). Case age and gender distributions were consistent across the surveillance period. The 30-day all-cause mortality rate was 29% in both 2011 and 2018 (annual range, 29%–35%).
Between 2011 and 2018, the percentage of isolated microorganisms that were coagulase-negative staphylococci (CONS) decreased from 31% to 18% (P = .004). The percentage of other gram-positive organisms increased from 32% to 37% (P = .34); Bacillus increased from 0% to 4% of isolates, and methicillin-susceptible Staphylococcus aureus from 2% to 6%. The gram-negative organisms increased from 21% to 27% (P = .19). Yeast represented 16% in 2011 and 18% in 2018; however, the percentage of yeast that were Candida albicans decreased over time (58% of yeast in 2011 and 30% in 2018; P = .04). Between 2011 and 2018, the most commonly identified species of microorganism in each year were CONS (18% in 2018) and Enterococcus spp (18% in 2018). Conclusions: Ongoing CLABSI surveillance has shown stable rates of CLABSI in adult ICUs from 2011 to 2018. The causative microorganisms have changed, with CONS decreasing from 31% to 18%.
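The binary comparisons above use the Fisher exact test. A self-contained sketch of the standard two-sided test on a 2×2 table, using only the standard library (the isolate counts are illustrative, not CNISP's actual denominators):

```python
from math import comb

def fisher_exact_two_sided(a, b, c, d):
    """Two-sided Fisher exact test for a 2x2 table [[a, b], [c, d]],
    summing hypergeometric probabilities no larger than the observed
    table's (the standard 'sum of small p-values' definition)."""
    row1, row2, col1, n = a + b, c + d, a + c, a + b + c + d
    denom = comb(n, col1)

    def p(x):  # P(first cell == x) under the null hypothesis
        return comb(row1, x) * comb(row2, col1 - x) / denom

    p_obs = p(a)
    lo, hi = max(0, col1 - row2), min(col1, row1)
    # Tiny tolerance guards against floating-point ties.
    return sum(p(x) for x in range(lo, hi + 1) if p(x) <= p_obs * (1 + 1e-9))

# Illustrative: 31 of 100 isolates CONS in 2011 vs 18 of 100 in 2018
print(fisher_exact_two_sided(31, 69, 18, 82))
```

In practice `scipy.stats.fisher_exact` provides the same test; the explicit version above just makes the hypergeometric enumeration visible.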
Funding: CNISP is funded by the Public Health Agency of Canada.
Disclosures: Allison McGeer reports funds to her for studies, for which she is the principal investigator, from Pfizer and Merck, as well as consulting fees from Sanofi-Pasteur, Sunovion, GSK, Pfizer, and Cidara.
A new high time resolution observing mode for the Murchison Widefield Array (MWA) is described, enabling full polarimetric observations with up to MHz of bandwidth and a time resolution of s. This mode makes use of a polyphase synthesis filter to ‘undo’ the polyphase analysis filter stage of the standard MWA Voltage Capture System observing mode. Sources of potential error in the reconstruction of the high time resolution data are identified and quantified, with the loss induced by the back-to-back system not exceeding dB for typical noise-dominated samples. The system is further verified by observing three pulsars with known structure on microsecond timescales.
The criteria for objective memory impairment in mild cognitive impairment (MCI) are vaguely defined. Aggregating the number of abnormal memory scores (NAMS) is one way to operationalise memory impairment, which we hypothesised would predict progression to Alzheimer’s disease (AD) dementia.
As part of the Australian Imaging, Biomarkers and Lifestyle Flagship Study of Ageing, 896 older adults who did not have dementia were administered a psychometric battery including three neuropsychological tests of memory, yielding 10 indices of memory. We calculated the number of memory scores corresponding to z ≤ −1.5 (i.e., NAMS) for each participant. Incident diagnosis of AD dementia was established by consensus of an expert panel after 3 years.
Of the 722 (80.6%) participants who were followed up, 54 (7.5%) developed AD dementia. There was a strong correlation between NAMS and probability of developing AD dementia (r = .91, p = .0003). Each abnormal memory score conferred an additional 9.8% risk of progressing to AD dementia. The area under the receiver operating characteristic curve for NAMS was 0.87 [95% confidence interval (CI) .81–.93, p < .01]. The odds ratio for NAMS was 1.67 (95% CI 1.40–2.01, p < .01) after correcting for age, sex, education, estimated intelligence quotient, subjective memory complaint, Mini-Mental State Exam (MMSE) score and apolipoprotein E ϵ4 status.
Aggregation of abnormal memory scores may be a useful way of operationalising objective memory impairment, predicting incident AD dementia and providing prognostic stratification for individuals with MCI.
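The NAMS index described above is a simple count of memory indices falling at or below the abnormality cutoff of z = −1.5. A minimal sketch with illustrative z-scores (the cutoff and the 10-index battery come from the abstract; the participant data are invented):

```python
def nams(z_scores, cutoff=-1.5):
    """Number of Abnormal Memory Scores: count of memory indices with
    z <= cutoff (the study uses z <= -1.5 across 10 indices)."""
    return sum(1 for z in z_scores if z <= cutoff)

# Illustrative participant with 10 memory indices
participant = [0.2, -1.6, -0.4, -2.1, 0.8, -1.5, -0.9, 0.1, -1.2, -3.0]
print(nams(participant))  # 4 scores at or below -1.5
```

Per the abstract, each additional abnormal score conferred roughly 9.8% additional risk of progression to AD dementia, so the count itself serves as a prognostic stratifier.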
The coronavirus disease 2019 (COVID-19) pandemic has greatly impacted health-care systems worldwide, leading to an unprecedented rise in demand for health-care resources. In anticipation of an acute strain on established medical facilities in Dallas, Texas, federal officials worked in conjunction with local medical personnel to convert a convention center into a Federal Medical Station capable of caring for patients affected by COVID-19. A 200,000 square foot event space was designated as a direct patient care area, with surrounding spaces repurposed to house ancillary services. Given the highly transmissible nature of the novel coronavirus, the donning and doffing of personal protective equipment (PPE) was of particular importance for personnel staffing the facility. Furthermore, nationwide shortages in the availability of PPE necessitated the reuse of certain protective materials. This article seeks to delineate the procedures implemented regarding PPE in the setting of a COVID-19 disaster response shelter, including workspace flow, donning and doffing procedures, PPE conservation, and exposure event protocols.
Differential susceptibility theory (DST) posits that individuals differ in their developmental plasticity: some children are highly responsive to both environmental adversity and support, while others are less affected. According to this theory, “plasticity” genes that confer risk for psychopathology in adverse environments may promote superior functioning in supportive environments. We tested DST using a broad measure of child genetic liability (based on birth parent psychopathology), adoptive home environmental variables (e.g., marital warmth, parenting stress, and internalizing symptoms), and measures of child externalizing problems (n = 337) and social competence (n = 330) in 54-month-old adopted children from the Early Growth and Development Study. This adoption design is useful for examining DST because children are placed at birth or shortly thereafter with nongenetically related adoptive parents, naturally disentangling heritable and postnatal environmental effects. We conducted a series of multivariable regression analyses that included Gene × Environment interaction terms and found little evidence of DST; rather, interactions varied depending on the environmental factor of interest, in both significance and shape. Our mixed findings suggest further investigation of DST is warranted before tailoring screening and intervention recommendations to children based on their genetic liability or “sensitivity.”
Diet has a major influence on the composition and metabolic output of the gut microbiome. Higher-protein diets are often recommended for older consumers; however, the effect of high-protein diets on the gut microbiota and faecal volatile organic compounds (VOC) of elderly participants is unknown. The purpose of the study was to establish if the faecal microbiota composition and VOC in older men are different after a diet containing the recommended dietary intake (RDA) of protein compared with a diet containing twice the RDA (2RDA). Healthy males (74⋅2 (sd 3⋅6) years; n 28) were randomised to consume the RDA of protein (0⋅8 g protein/kg body weight per d) or 2RDA, for 10 weeks. Dietary protein was provided via whole foods rather than supplementation or fortification. The diets were matched for dietary fibre from fruit and vegetables. Faecal samples were collected pre- and post-intervention for microbiota profiling by 16S ribosomal RNA amplicon sequencing and VOC analysis by headspace solid-phase microextraction GC–MS. After correcting for multiple comparisons, no significant differences in the abundance of faecal microbiota or VOC associated with protein fermentation were evident between the RDA and 2RDA diets. Therefore, in the present study, a twofold difference in dietary protein intake did not alter gut microbiota or VOC indicative of altered protein fermentation.
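The abstract corrects for multiple comparisons without naming the procedure; the Benjamini–Hochberg false-discovery-rate step-up is one common choice for microbiota abundance testing and can be sketched as follows (the p-values are illustrative, and this is not necessarily the method the study used):

```python
def benjamini_hochberg(pvals, alpha=0.05):
    """Benjamini-Hochberg FDR step-up: returns a boolean 'significant'
    flag per p-value. Sorted p-values p_(1) <= ... <= p_(m) are compared
    against rank/m * alpha; all hypotheses up to the largest rank k that
    passes are rejected."""
    m = len(pvals)
    order = sorted(range(m), key=lambda i: pvals[i])
    k = 0
    for rank, i in enumerate(order, start=1):
        if pvals[i] <= rank / m * alpha:
            k = rank
    significant = [False] * m
    for rank, i in enumerate(order, start=1):
        if rank <= k:
            significant[i] = True
    return significant

print(benjamini_hochberg([0.001, 0.02, 0.04, 0.30]))
```

With these inputs only the two smallest p-values survive correction, illustrating how raw "significant" taxa can drop out once the family of tests is accounted for.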
OBJECTIVES/GOALS: Oligodendrocytes (OL) are glial cells of the central nervous system (CNS) responsible for the energy demanding task of generating myelin sheaths during development and remyelination after demyelinating injury. One metabolite shown to significantly increase ATP production in OL is the nitrogenous organic acid, creatine. Creatine plays an essential role in ATP buffering within tissues with highly fluctuating energy demands such as brain and muscle. Interestingly, mature OL, which are the cells capable of myelin production, are the main cells in the CNS expressing the rate-limiting enzyme for creatine synthesis, guanidinoacetate methyltransferase (Gamt). Patients with mutations in Gamt display intellectual disabilities, impaired myelination and seizures. Therefore, we hypothesize that creatine may be essential for developmental myelination and improve remyelination. METHODS/STUDY POPULATION: To investigate these hypotheses, we developed a new transgenic mouse model with LoxP sites flanking exons 2-6 of the Gamt gene where excision leads to expression of a green fluorescent tag allowing us to track the cells normally expressing Gamt. RESULTS/ANTICIPATED RESULTS: In this mouse model, we show a 95% (±0.47%, n = 3) co-localization of Gamt within mature OL at postnatal (P) day 14 (P14). Next, we show that knocking out Gamt leads to a significant reduction in OL in the major CNS white matter tract, the corpus callosum, at P14 and P21 (P14: 0.007, n = 3; P21: 0.04, n = 3). Here, we also investigate whether dietary creatine can enhance remyelination in the cuprizone model of toxic demyelination. DISCUSSION/SIGNIFICANCE OF IMPACT: These studies highlight the important role creatine plays in developmental myelination and investigate whether creatine can provide a therapeutic value during a CNS demyelinating insult.
Raw milk cheeses are commonly consumed in France and are also a common source of foodborne outbreaks (FBOs). Both an FBO surveillance system and a laboratory-based surveillance system aim to detect Salmonella outbreaks. In early August 2018, five familial FBOs due to Salmonella spp. were reported to a regional health authority. Investigation identified common exposure to a raw goats' milk cheese, from which Salmonella spp. were also isolated, leading to an international product recall. Three weeks later, on 22 August, a national increase in Salmonella Newport ST118 was detected through laboratory surveillance. Concomitantly, isolates from the earlier familial clusters were confirmed as S. Newport ST118. Interviews with a selection of the laboratory-identified cases revealed exposure to the same cheese, including exposure to batches not included in the previous recall, leading to an expansion of the recall. The outbreak affected 153 cases, including six cases in Scotland. S. Newport was detected in the cheese and in the milk of one of the producer's goats. The differences between the two alerts generated by this outbreak highlight the timeliness of the FBO system and the precision of the laboratory-based surveillance system. It is also a reminder of the risks associated with raw milk cheeses.
Research using single-word paradigms has established that forced language switching incurs processing costs for some bilinguals; yet, less research has addressed this phenomenon at the utterance level or considered real-world applications. The current study examined the impacts of forced language switching on spoken output and stress using a simulated virtual meeting. Twenty Spanish–English heritage bilinguals responded to general work-oriented questions in monolingual English (control) or language-switching (experimental) conditions. Responses were analyzed for mean length of utterance (MLU) and type–token ratio (TTR). Multilevel modeling revealed an interaction effect of Condition (control vs. experimental) and question order on MLU, such that participants in the experimental condition produced significantly shorter utterances by the end of the task. Participants also had significantly lower lexical variation (TTR) overall in the experimental than the control condition. A 2 × 2 ANOVA revealed a significant effect of Condition and an interaction of Task (pre- vs. posttask) and Condition, such that participants in the control condition reported significantly lower stress after the activity. Results demonstrated the impact of a forced switching condition on production at the utterance level. Findings have implications for theory and scenarios in which heritage bilinguals are asked to use multiple languages in the workplace.
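The two outcome measures here, MLU and TTR, are simple to compute from transcribed utterances. A minimal word-based sketch (MLU is also often computed in morphemes rather than words, and the sample sentences are invented):

```python
def mlu(utterances):
    """Mean length of utterance, in words (a common approximation;
    morpheme counts are also widely used)."""
    return sum(len(u.split()) for u in utterances) / len(utterances)

def ttr(utterances):
    """Type-token ratio: unique word forms divided by total words."""
    tokens = [w.lower() for u in utterances for w in u.split()]
    return len(set(tokens)) / len(tokens)

sample = ["I joined the meeting late", "the meeting ran long"]
print(mlu(sample))            # (5 + 4) / 2 = 4.5
print(round(ttr(sample), 3))  # 7 unique words / 9 tokens
```

Note that TTR falls mechanically as samples get longer, which is one reason studies compare it across matched conditions, as done here.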
Psychiatric morbidity was assessed in 55 HIV seropositive women who were attending either an HIV centre in Paris (n = 30) or a genitourinary clinic in London (n = 25). Demographic data and information concerning HIV disease, openness about diagnosis, counselling received, social and family support, sexual behaviour and attitudes towards fertility and pregnancy were recorded using a semi-structured interview. Moderate or severe levels of psychiatric distress were found in 60% of the women in Paris and 28% of those in London. Overall, these rates are higher than those found in comparable studies of HIV seropositive men. Psychiatric disorder was associated with a past history of intravenous drug use and older age. Over half of the women were in regular sexual relationships, but safe sex precautions were frequently not used. Sixteen subjects among those of childbearing age were prepared to consider having children.
Compulsive behavior is a core symptom of both obsessive-compulsive disorder (OCD) and cocaine addiction (CA). Across both pathologies, one can identify a priori goal-directed actions (purportedly anxiolytic checking or washing in OCD, and pleasure-seeking drug use in addiction) that turn into rigid, ritualized and repetitive behaviors over which the patient loses control. One possible psychopathological mechanism underlying compulsivity is behavioral inflexibility, namely a deficit in the aptitude to dynamically adapt to novel contexts and changing reward rules. The probabilistic reversal learning paradigm allows objective assessment of behavioral flexibility by challenging participants with a task in which they must learn through trial and error which of two stimuli is the most often rewarded, while adjusting to sudden, inconspicuous contingency reversals. We therefore hypothesized that both OCD and CA would be associated with impaired cognitive flexibility, as measured through the perseverative response rate following contingency reversals in this task. Interestingly, impulsivity may also be assessed within this task via the tendency of participants to switch from one stimulus to the other following probabilistic errors. To investigate cognitive inflexibility in relation to CA and OCD respectively, we first compared the performance in a probabilistic reversal learning task of cocaine users, ex-cocaine users (abstinent for 2 months or more), and controls, as well as that of participants from the general population whose obsessive-compulsive traits were assessed using the OCI-R, a well-validated self-report questionnaire. Our task yielded results similar to those found in the literature: cocaine addicts changed their responses more often, and learned less effectively. Ex-cocaine addicts performed better than current addicts but worse than controls, suggesting that addicts' poor results may be partly explained by reversible cognitive consequences of addiction.
Addicts with fewer cognitive impairments may also be less likely to relapse. Regarding the relationship of flexibility to subclinical OCD traits, we found no link between OCI-R score and perseveration, or between impulsiveness and excessive switching.
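A probabilistic reversal learning task of the kind described can be simulated in a few lines. The sketch below uses a simple win-stay/lose-shift agent and illustrative parameters; it is not the study's actual task or participant model, just a demonstration of how perseverative responses are counted after an unsignalled reversal:

```python
import random

def reversal_learning(n_trials=200, p_reward=0.8, reverse_at=100, seed=0):
    """Minimal probabilistic reversal learning simulation: the 'correct'
    stimulus is rewarded with probability p_reward (the other with
    1 - p_reward), contingencies reverse without warning at reverse_at,
    and a win-stay/lose-shift agent responds. Returns the number of
    perseverative responses: post-reversal choices of the previously
    correct stimulus."""
    rng = random.Random(seed)
    correct = 0                      # index of the most-often-rewarded stimulus
    choice = rng.randint(0, 1)
    perseverations = 0
    for t in range(n_trials):
        if t == reverse_at:
            correct = 1 - correct    # unsignalled contingency reversal
        rewarded = rng.random() < (p_reward if choice == correct else 1 - p_reward)
        if t >= reverse_at and choice != correct:
            perseverations += 1
        if not rewarded:             # lose-shift; win-stay keeps the choice
            choice = 1 - choice
    return perseverations

print(reversal_learning())
```

Real analyses fit richer models (and, as in this study, separate perseveration from error-driven switching as an impulsivity index), but the counting logic is the same.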