Attentional bias to threat has been implicated as a cognitive mechanism in youth anxiety disorders. Yet, prior studies documenting this bias have largely relied on a method with questionable reliability (i.e. the dot-probe task) and small samples, few of which included adolescents. The current study sought to address such limitations by examining relations between anxiety – both clinically diagnosed and dimensionally rated – and attentional bias to threat.
The study included a community sample of adolescents and employed eye-tracking methodology intended to capture possible biases across the full range of both automatic (i.e. vigilance bias) and controlled attentional processes (i.e. avoidance bias, maintenance bias). We examined both dimensional anxiety (across the full sample; n = 215) and categorical anxiety in a subset case-control analysis (n = 100) as predictors of biases.
Findings indicated that participants with an anxiety disorder oriented more slowly to angry faces than matched controls. Results did not suggest a greater likelihood of initial orienting to angry faces among our participants with anxiety disorders or those with higher dimensional ratings of anxiety. Greater anxiety severity was associated with greater dwell time to neutral faces.
This is the largest study to date examining eye-tracking metrics of attention to threat among healthy and anxious youth. Findings did not support the notion that anxiety is characterized by heightened vigilance or avoidance/maintenance of attention to threat. All effects detected were extremely small. Links between attention to threat and anxiety among adolescents may be subtle and highly dependent on experimental task dimensions.
For many years, archaeologists have relied on Munsell Soil Color Charts (MSCC) as tools for standardizing the recording of soil and sediment colors in the field and artifacts such as pottery in the lab. Users have identified multiple potential sources of discrepancy in results, such as differences in inter-operator perception, light source, or moisture content of samples. In recent years, researchers have developed inexpensive digital methods for color identification, but these typically cannot be used in real time. Now, a field-ready digital color-matching instrument is marketed to archaeologists as a replacement for MSCC, but the accuracy and overall suitability of this device for archaeological research have not been demonstrated. Through three separate field and laboratory trials, we found systematic mismatches in the results obtained via the device, including variable accuracy against standardized MSCC chips, which should represent ideal samples. At the same time, the instrument was consistent in its readings. This leads us to question whether the “subjective” human eye or the “objective” digital eye is preferable for recording color data. We discuss how project goals and limitations should be considered when deciding which color-recording method to employ in field and laboratory settings, and we identify optimal procedures.
Impairment in reciprocal social behavior (RSB), an essential component of early social competence, clinically defines autism spectrum disorder (ASD). However, the behavioral and genetic architecture of RSB in toddlerhood, when ASD first emerges, has not been fully characterized. We analyzed data from a quantitative video-referenced rating of RSB (vrRSB) in two toddler samples: a community-based volunteer research registry (n = 1,563) and an ethnically diverse, longitudinal twin sample ascertained from two state birth registries (n = 714). Variation in RSB was continuously distributed, temporally stable, significantly associated with ASD risk at age 18 months, and only modestly explained by sociodemographic and medical factors (r2 = 9.4%). Five latent RSB factors were identified and corresponded to aspects of social communication or restricted repetitive behaviors, the two core ASD symptom domains. Quantitative genetic analyses indicated substantial heritability for all factors at age 24 months (h2 ≥ .61). Genetic influences strongly overlapped across all factors, with a social motivation factor showing evidence of newly-emerging genetic influences between the ages of 18 and 24 months. RSB constitutes a heritable, trait-like competency whose factorial and genetic structure is generalized across diverse populations, demonstrating its role as an early, enduring dimension of inherited variation in human social behavior. Substantially overlapping RSB domains, measurable when core ASD features arise and consolidate, may serve as markers of specific pathways to autism and anchors to inform determinants of autism's heterogeneity.
Background: Automated testing instruments (ATIs) are commonly used by clinical microbiology laboratories to perform antimicrobial susceptibility testing (AST), whereas public health laboratories may use established reference methods such as broth microdilution (BMD). We investigated discrepancies in carbapenem minimum inhibitory concentrations (MICs) among Enterobacteriaceae tested by clinical laboratory ATIs and by reference BMD at the CDC. Methods: During 2016–2018, we conducted laboratory- and population-based surveillance for carbapenem-resistant Enterobacteriaceae (CRE) through the CDC Emerging Infections Program (EIP) sites (10 sites by 2018). We defined an incident case as the first isolation of Enterobacter spp. (E. cloacae complex or E. aerogenes), Escherichia coli, Klebsiella pneumoniae, K. oxytoca, or K. variicola resistant to doripenem, ertapenem, imipenem, or meropenem from normally sterile sites or urine, identified from a resident of the EIP catchment area in a 30-day period. Cases had isolates that were determined to be carbapenem-resistant by clinical laboratory ATI MICs (MicroScan, BD Phoenix, or VITEK 2) or by other methods, using current Clinical and Laboratory Standards Institute (CLSI) criteria. A convenience sample of these isolates was tested by reference BMD at the CDC according to CLSI guidelines. Results: Overall, 1,787 isolates from 112 clinical laboratories were tested by BMD at the CDC. Of these, clinical laboratory ATI MIC results were available for 1,638 (91.7%); 855 (52.2%) from 71 clinical laboratories did not confirm as CRE at the CDC. Nonconfirming isolates had been tested on a MicroScan (235 of 462; 50.9%), BD Phoenix (249 of 411; 60.6%), or VITEK 2 (371 of 765; 48.5%). Lack of confirmation was most common among E. coli (62.2% of E. coli isolates tested) and Enterobacter spp. (61.4% of Enterobacter isolates tested) (Fig. 1A), and among isolates testing resistant to ertapenem by the clinical laboratory ATI (52.1%, Fig. 1B).
Of the 1,388 isolates resistant to ertapenem in the clinical laboratory, 1,006 (72.5%) were resistant only to ertapenem. Of the 855 nonconfirming isolates, 638 (74.6%) were resistant only to ertapenem based on clinical laboratory ATI MICs. Conclusions: Nonconfirming isolates were widespread across laboratories and ATIs. Lack of confirmation was most common among E. coli and Enterobacter spp. Among nonconfirming isolates, most were resistant only to ertapenem. These findings may suggest that ATIs overcall resistance to ertapenem or that isolate transport and storage conditions affect ertapenem resistance. Further investigation into this lack of confirmation is needed, and CRE case identification in public health surveillance may need to account for this phenomenon.
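The case definition above hinges on interpreting MICs against CLSI breakpoints. As a rough illustration only (not the surveillance system's actual code), the classification step can be sketched as below; the breakpoint values follow the revised CLSI M100 resistant breakpoints for Enterobacteriaceae and should be verified against the current M100 edition.

```python
# Illustrative sketch of MIC interpretation against CLSI-style resistant
# breakpoints (values per the revised CLSI M100 criteria for
# Enterobacteriaceae; verify against the current edition before use).
RESISTANT_BREAKPOINTS = {  # MIC (ug/mL) at or above which an isolate is resistant
    "ertapenem": 2.0,
    "imipenem": 4.0,
    "meropenem": 4.0,
    "doripenem": 4.0,
}

def is_resistant(drug: str, mic: float) -> bool:
    """True if the MIC meets or exceeds the resistant breakpoint."""
    return mic >= RESISTANT_BREAKPOINTS[drug]

def meets_cre_definition(mics: dict) -> bool:
    """An isolate counts as CRE if resistant to any carbapenem tested."""
    return any(is_resistant(drug, mic) for drug, mic in mics.items())

# An ertapenem-only-resistant isolate -- the most common nonconfirming pattern:
print(meets_cre_definition({"ertapenem": 2.0, "meropenem": 0.25}))  # True
```

Because ertapenem resistance alone satisfies such a definition, small MIC shifts around the ertapenem breakpoint (e.g. from transport or storage) are enough to flip an isolate between confirming and nonconfirming, consistent with the pattern reported here.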
OBJECTIVES/GOALS: In 2017, new guidelines recommended multi-step algorithms for CDI diagnosis, and clinical centers rapidly implemented changes despite limited pediatric data. We assessed a multi-step algorithm using NAAT followed by EIA for its ability to differentiate symptomatic CDI from colonization in children. METHODS/STUDY POPULATION: We prospectively enrolled pediatric patients with cancer, cystic fibrosis, or inflammatory bowel disease who were not being tested or treated for CDI and obtained a stool sample for NAAT. If positive by NAAT (colonized), EIA was performed. Children with symptomatic CDI who tested positive by NAAT via the clinical laboratory were also enrolled, and EIA was performed on residual stool. A functional cell cytotoxicity neutralization assay (CCNA) was also performed. RESULTS/ANTICIPATED RESULTS: Of the 138 asymptomatic children enrolled, 24 (17%) were colonized. An additional 37 children with symptomatic CDI were enrolled. Neither EIA positivity (41% versus 21%, P = 0.11) nor CCNA positivity (49% versus 46%, P = 0.84) differed significantly between symptomatic and colonized children. When both EIA and CCNA were positive, children were more commonly symptomatic than colonized (33% versus 13%, P = 0.04). DISCUSSION/SIGNIFICANCE OF IMPACT: A multi-step testing algorithm with NAAT and EIA failed to differentiate symptomatic CDI from colonization in our pediatric cohort. As multi-step algorithms move into clinical care, pediatric providers will need to be aware of the continued limitations of diagnostic testing.
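The reflex testing scheme evaluated above amounts to a small piece of decision logic: EIA is run only on NAAT-positive samples. A minimal sketch of that logic (the function name and result labels are ours, not a clinical standard):

```python
def two_step_result(naat_positive, eia_positive=None):
    """Interpret one stool sample under a NAAT-then-EIA algorithm.

    EIA is reflexed only when NAAT is positive, so eia_positive stays
    None for NAAT-negative samples. Labels are illustrative only.
    """
    if not naat_positive:
        return "negative"           # no C. difficile detected
    if eia_positive:
        return "toxin-positive"     # NAAT+/EIA+: more consistent with CDI
    return "toxin-negative"         # NAAT+/EIA-: toxin not detected; possible colonization

print(two_step_result(True, False))  # toxin-negative
```

The study's finding is precisely that this NAAT+/EIA− branch failed to separate colonized from symptomatic children in their cohort, which is why the labels above cannot be read as diagnoses.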
Introduction: Emergency care serves as an important health resource for First Nations (FN) persons. Previous reporting shows that FN persons visit emergency departments at almost double the rate of non-FN persons. Working collaboratively with FN partners, academic researchers, and health authority staff, we investigated FN emergency care patient visit statistics in Alberta over a five-year period. Methods: Through a population-based retrospective cohort study for the period from April 1, 2012 to March 31, 2017, patient demographics and emergency care visit characteristics for status FN patients in Alberta were analyzed and compared to non-FN statistics. Frequencies and percentages (%) describe patients and visits by categorical variables (e.g., Canadian Triage Acuity Scale (CTAS)). Means and standard deviations (medians and interquartile ranges (IQR)) describe continuous variables (e.g., distances) as appropriate for the data distribution. These descriptions are reported for the FN and non-FN populations separately. Results: The data set contains 11,686,288 emergency facility visits by 3,024,491 unique persons. FN people make up 4.8% of unique patients and 9.4% of emergency care visits. FN persons live farther from emergency facilities than their non-FN counterparts (FN median 6 km, IQR 1-24; vs. non-FN median 4 km, IQR 2-8). FN visits arrive more often by ground ambulance (15.3% vs. 10%). FN visits are more commonly triaged as less acute (59% CTAS levels 4 and 5, compared to non-FN 50.4%). More FN visits end in leaving without completing treatment (6.7% vs. 3.6%). FN visits occur more often in the evening, 4:01pm to 12:00am (43.6% vs. 38.1%). Conclusion: In a collaborative validation session, FN Elders and health directors contextualized emergency care presentation in evenings and receiving less acute triage scores as related to difficulties accessing primary care.
They explained presentation in evenings, arrival by ambulance, and leaving without completing treatment in terms of issues accessing transport to and from emergency facilities. Many factors interact to determine FN patients’ emergency care visit characteristics and outcomes. Further research needs to separate the impact of FN identity from factors such as reasons for visiting emergency facilities, distance traveled to care, and the size of facility where care is provided.
We investigated the contribution of polymorphisms known to moderate transcription of the serotonin transporter (5HTT) and monoamine oxidase A (MAOA) genes to the development of violence, and tested for gene × environment interactions. To do so, a cohort of 184 adult male volunteers referred for forensic assessment was assigned to a violent or a non-violent group. The low-activity, short MAOA allele was carried by 45% of violent but only 30% of non-violent individuals. Low-function 5HTT variants were found in 77% of the violent group, compared with 59% of the non-violent group. Logistic regression was performed, and the best-fitting model revealed significant, independent effects of childhood environment and MAOA genotype, as well as a significant interaction between childhood environment and 5HTT genotype (Fig. 1). MAOA thus appears to be independently associated with violent crime, whereas 5HTT shows a relevant gene × environment interaction.
The Murchison Widefield Array (MWA) is an open-access telescope dedicated to studying the low-frequency (80–300 MHz) southern sky. Since beginning operations in mid-2013, the MWA has opened a new observational window in the southern hemisphere, enabling many science areas. The driving science objectives of the original design were to observe 21 cm radiation from the Epoch of Reionisation (EoR), explore the radio time domain, perform Galactic and extragalactic surveys, and monitor solar, heliospheric, and ionospheric phenomena. Altogether, these programs have recorded 20 000 h of observations, producing 146 papers to date. In 2016, the telescope underwent a major upgrade resulting in alternating compact and extended configurations. Other upgrades, including digital back-ends and a rapid-response triggering system, have been developed since the original array was commissioned. In this paper, we review the major results from the prior operation of the MWA and then discuss the new science paths enabled by the improved capabilities. We group these science opportunities by the four original science themes but also include ideas for directions outside these categories.
The science of studying diamond inclusions to understand Earth history has developed significantly over the past decades, with new instrumentation and techniques applied to diamond sample archives revealing the stories contained within them. This chapter reviews what diamonds can tell us about the deep carbon cycle over the course of Earth’s history: how the geochemistry of diamonds and their inclusions informs us about the deep carbon cycle, the origin of diamonds in Earth’s mantle, and the evolution of diamonds through time.
Background: Cervical spondylotic myelopathy (CSM) may present with neck and arm pain. This study investigates the change in neck/arm pain post-operatively in CSM. Methods: This ambispective study allocated 402 patients through the Canadian Spine Outcomes and Research Network. Outcome measures were the visual analogue scales for neck and arm pain (VAS-NP and VAS-AP) and the neck disability index (NDI). The thresholds for minimum clinically important differences (MCIDs) for VAS-NP and VAS-AP were determined to be 2.6 and 4.1, respectively. Results: VAS-NP improved from a mean of 5.6±2.9 to 3.8±2.7 at 12 months (P<0.001). VAS-AP improved from 5.8±2.9 to 3.5±3.0 at 12 months (P<0.001). The MCIDs for VAS-NP and VAS-AP were also reached at 12 months. Based on the NDI, patients were grouped into those with mild pain/no pain (33%) versus moderate/severe pain (67%). At 3 months, a significantly higher proportion of patients with moderate/severe pain (45.8%) demonstrated an improvement into mild/no pain, whereas 27.2% with mild/no pain demonstrated worsening into moderate/severe pain (P<0.001). At 12 months, 17.4% with mild/no pain experienced worsening of their NDI (P<0.001). Conclusions: This study suggests that neck and arm pain responds to surgical decompression in patients with CSM, reaching the MCIDs for VAS-AP and VAS-NP at 12 months.
Measurements in the infrared wavelength domain allow direct assessment of the physical state and energy balance of cool matter in space, enabling the detailed study of the processes that govern the formation and evolution of stars and planetary systems in galaxies over cosmic time. Previous infrared missions revealed a great deal about the obscured Universe, but were hampered by limited sensitivity.
SPICA takes the next step in infrared observational capability by combining a large, 2.5-meter diameter telescope, cooled to below 8 K, with instruments employing ultra-sensitive detectors. A combination of passive cooling and mechanical coolers will be used to cool both the telescope and the instruments. With mechanical coolers, the mission lifetime is not limited by the supply of cryogen. With the combination of low telescope background and instruments with state-of-the-art detectors, SPICA provides a huge advance on the capabilities of previous missions.
SPICA instruments offer spectral resolving power ranging from R ~50 through 11 000 in the 17–230 μm domain, and R ~28 000 spectroscopy between 12 and 18 μm. SPICA will provide efficient 30–37 μm broad-band mapping, and small-field spectroscopic and polarimetric imaging at 100, 200 and 350 μm. SPICA will provide infrared spectroscopy with an unprecedented sensitivity of ~5 × 10−20 W m−2 (5σ/1 h), over two orders of magnitude better than earlier missions. This exceptional performance leap will open entirely new domains in infrared astronomy: galaxy evolution and metal production over cosmic time, dust formation and evolution from very early epochs onwards, and the formation history of planetary systems.
A novel, alloy-agnostic, nanofunctionalization process has been utilized to produce metal matrix composites (MMCs) via additive manufacturing, providing new geometric freedom for MMC design. MMCs were produced with the addition of tungsten carbide nanoparticles to commercially available AlSi10Mg alloy powder. Tungsten carbide was chosen due to the potential for coherent crystallographic phases that were identified utilizing a lattice-matching approach to promote wetting and increase dislocation interactions. Structures were produced with evenly distributed strengthening phases leading to tensile strengths >385 MPa and a 50% decrease in wear rate over the commercially available AlSi10Mg alloy at only 1 vol% loading of tungsten carbide.
Outpatient parenteral antimicrobial therapy (OPAT) programmes facilitate hospital discharge, but patients remain at risk of complications and consequent healthcare utilisation (HCU). Here we elucidated the incidence of and risk factors associated with HCU in OPAT patients. This was a retrospective, single-centre, case–control study of adult patients discharged on OPAT. Cases (n = 63) and controls (n = 126) were patients who did or did not, respectively, utilise the healthcare system within 60 days. Characteristics associated with HCU in bivariate analysis (P ≤ 0.2) were included in a multivariable logistic regression model. Variables were retained in the final model if they were independently (P < 0.05) associated with 60-day HCU. Among all study patients, the mean age was 55 ± 16 years, 65% were men, and wound infection (22%) and cellulitis (14%) were common diagnoses. The cumulative incidence of 60-day unplanned HCU was 27%, with a disproportionately higher incidence in the first 30 days (21%). A statin at discharge (adjusted odds ratio (aOR) 0.23, 95% confidence interval (CI) 0.09–0.57), the number of prior admissions in the past 12 months (aOR 1.48, 95% CI 1.05–2.10), and a sepsis diagnosis (aOR 4.62, 95% CI 1.23–17.3) were independently associated with HCU. HCU was most commonly due to non-infection-related complications (44%) and worsening of the primary infection (31%). There are multiple risk factors for HCU in OPAT patients, and formal OPAT clinics may help to risk-stratify and target the highest-risk groups.
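The two-stage model-building procedure described (bivariate screen at P ≤ 0.2, then retention at P < 0.05 in the multivariable model) can be sketched as follows. The variable names and p-values are illustrative placeholders, not the study's screening results, and the statin coefficient is back-calculated so that exp(β) matches the reported aOR of 0.23:

```python
import math

# Stage 1: bivariate screen. Hypothetical p-values for illustration only.
bivariate_p = {
    "statin_at_discharge": 0.03,
    "prior_admissions_12mo": 0.08,
    "sepsis_diagnosis": 0.15,
    "age": 0.45,  # fails the P <= 0.2 screen, so excluded from the model
}
candidates = [var for var, p in bivariate_p.items() if p <= 0.2]

# Stage 2: fit a multivariable logistic model on the candidates and retain
# terms with P < 0.05 (fitting itself omitted here). An adjusted odds
# ratio is exp(coefficient) for the fitted term:
beta_statin = -1.47  # illustrative coefficient; exp(-1.47) ~= 0.23
aor_statin = math.exp(beta_statin)

print(candidates)            # ['statin_at_discharge', 'prior_admissions_12mo', 'sepsis_diagnosis']
print(round(aor_statin, 2))  # 0.23
```

An aOR below 1 (as for the statin here) indicates lower odds of 60-day HCU after adjustment for the other retained variables; aORs above 1 (prior admissions, sepsis) indicate higher odds.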
Exercise and physical training are known to affect gastrointestinal function and digestibility in horses and can lead to inaccurate estimates of nutrient and energy digestibility when markers are used. The effect of exercise on apparent nutrient digestibility and faecal recoveries of ADL and TiO2 was studied in six Welsh pony geldings subjected to either a low- (LI) or high-intensity (HI) exercise regime according to a cross-over design. Ponies performing LI exercise were walked once per day for 45 min in a horse walker (5 km/h) for 47 consecutive days. Ponies subjected to HI exercise were gradually trained over the same 47 days according to a standardized protocol. Throughout the experiment, the ponies received a fixed level of feed, and the daily rations consisted of 4.7 kg DM of grass hay and 0.95 kg DM of concentrate. The diet was supplemented with minerals, vitamins and TiO2 (3.0 g Ti/day). Total tract digestibility of DM, organic matter (OM), CP, crude fat, NDF, ADF, starch, sugar and energy was determined with the total faeces collection (TFC) method. In addition, DM and OM digestibility was estimated using internal ADL and the externally supplemented Ti as markers. Urine was collected on the final 2 days of each experimental period. Exercise did not affect apparent digestibility of CP, crude fat, starch and sugar. Digestibility of DM (DMD), OM (OMD), ADF and NDF tended to be lower, and DE was decreased, when ponies received the HI exercise regime. For all treatments combined, mean faecal recoveries of ADL and Ti were 87.8±1.7% and 99.3±1.7%, respectively. Ti was not detected in the urine, indicating that intestinal integrity was maintained with exercise. Dry matter digestibility estimated with the TFC, ADL and Ti for ponies subjected to LI exercise was 66.3%, 60.3% and 64.8%, respectively, while DMD for HI ponies was 64.2%, 60.3% and 65.2%, respectively.
In conclusion, physical exercise influences the energy digestibility of feed in ponies provided with equivalent levels of feed intake. In addition, comparison of the two markers used to estimate apparent DMD and OMD indicates that externally supplemented Ti is a suitable marker for determining nutrient digestibility in horses performing exercise, whereas dietary ADL is not.
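The marker-based estimates above rest on the standard indigestible-marker ratio equation (a textbook method, not specific to this paper), which assumes complete faecal recovery of the marker. A sketch with made-up concentrations:

```python
def apparent_digestibility(marker_diet, marker_feces,
                           nutrient_feces=1.0, nutrient_diet=1.0):
    """Apparent digestibility (%) by the indigestible-marker ratio method.

    Concentrations are per kg DM. With the nutrient arguments left at 1.0
    this reduces to DM digestibility. Assumes complete faecal recovery of
    the marker; the equation is the standard ratio method, and the inputs
    below are illustrative, not the study's data.
    """
    return 100.0 * (1.0 - (marker_diet / marker_feces)
                    * (nutrient_feces / nutrient_diet))

# Example: Ti at 0.5 g/kg DM in the diet and 1.4 g/kg DM in the faeces
dmd = apparent_digestibility(marker_diet=0.5, marker_feces=1.4)
print(round(dmd, 1))  # 64.3
```

Incomplete faecal recovery lowers the marker concentration measured in faeces and therefore deflates the computed digestibility, which is consistent with the ADL-based DMD estimates above (87.8% recovery) falling below the TFC values while the Ti-based estimates (99.3% recovery) track them closely.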
An internationally approved and globally used classification scheme for the diagnosis of CHD has long been sought. The International Paediatric and Congenital Cardiac Code (IPCCC), which was produced and has been maintained by the International Society for Nomenclature of Paediatric and Congenital Heart Disease (the International Nomenclature Society), is used widely, but has spawned many “short list” versions that differ in content depending on the user. Thus, efforts to have a uniform identification of patients with CHD using a single up-to-date and coordinated nomenclature system continue to be thwarted, even if a common nomenclature has been used as a basis for composing various “short lists”. In an attempt to solve this problem, the International Nomenclature Society has linked its efforts with those of the World Health Organization to obtain a globally accepted nomenclature tree for CHD within the 11th iteration of the International Classification of Diseases (ICD-11). The International Nomenclature Society has submitted a hierarchical nomenclature tree for CHD to the World Health Organization that is expected to serve increasingly as the “short list” for all communities interested in coding for congenital cardiology. This article reviews the history of the International Classification of Diseases and of the IPCCC, and outlines the process used in developing the ICD-11 congenital cardiac disease diagnostic list and the definitions for each term on the list. An overview of the content of the congenital heart anomaly section of the Foundation Component of ICD-11, published herein in its entirety, is also included. Future plans for the International Nomenclature Society include linking again with the World Health Organization to tackle procedural nomenclature as it relates to cardiac malformations. 
By doing so, the Society will continue its role in standardising nomenclature for CHD across the globe, thereby promoting research and better outcomes for fetuses, children, and adults with congenital heart anomalies.
OBJECTIVE
To determine the scope, source, and mode of transmission of a multifacility outbreak of extensively drug-resistant (XDR) Acinetobacter baumannii.
SETTING AND PARTICIPANTS
Residents and patients in skilled nursing facilities, long-term acute-care hospital, and acute-care hospitals.
METHODS
A case was defined as the incident isolate from clinical or surveillance cultures of XDR Acinetobacter baumannii resistant to imipenem or meropenem and nonsusceptible to all but 1 or 2 antibiotic classes in a patient in an Oregon healthcare facility during January 2012–December 2014. We queried clinical laboratories, reviewed medical records, oversaw patient and environmental surveillance surveys at 2 facilities, and recommended interventions. Pulsed-field gel electrophoresis (PFGE) and molecular analysis were performed.
RESULTS
We identified 21 cases, highly related by PFGE or healthcare facility exposure. Overall, 17 patients (81%) were admitted to long-term acute-care hospital A (n=8), skilled nursing facility A (n=8), or both (n=1) prior to XDR A. baumannii isolation. Interfacility communication of patient or resident XDR status was not performed during transfer between facilities. The rare plasmid-encoded carbapenemase gene blaOXA-237 was present in 16 outbreak isolates. Contact precautions, chlorhexidine baths, enhanced environmental cleaning, and interfacility communication were implemented for cases to halt transmission.
CONCLUSIONS
Interfacility transmission of XDR A. baumannii carrying the rare blaOXA-237 was facilitated by transfer of affected patients without communication to receiving facilities.
Objectives: This study examined whether individuals with Parkinson’s disease (PD) are at increased vulnerability for vascular-related cognitive impairment relative to controls. This hypothesis draws on the concept of brain reserve and on evidence that both PD and vascular risk factors impair similar fronto-executive cognitive systems. Methods: The sample included 67 PD patients and 61 older controls (total N=128). Participants completed neuropsychological measures of executive functioning, processing speed, verbal delayed recall/memory, language, and auditory attention. Cardiovascular risk was assessed with the Framingham Cardiovascular Risk index. Participants underwent brain imaging (T1 and T2 FLAIR). Trained raters measured total and regional leukoaraiosis (periventricular, deep subcortical, and infracortical). Results: Hierarchical regressions revealed that more severe cardiovascular risk was related to worse executive functioning, processing speed, and delayed verbal recall in both PD patients and controls. More severe cardiovascular risk was related to worse language functioning in the PD group, but not in controls. In contrast, leukoaraiosis was related to both cardiovascular risk and executive functioning in controls, but not in the PD group. Conclusions: Overall, results revealed that PD and cardiovascular risk factors are independent risk factors for cognitive impairment. Generally, the influence of cardiovascular risk factors on cognition is similar in PD patients and controls. (JINS, 2017, 23, 322–331)
Puumala virus (PUUV) causes many human infections in large parts of Europe and can lead to mild to moderate disease. The bank vole (Myodes glareolus) is the only reservoir of PUUV in Central Europe. A commercial PUUV rapid field test for rodents was validated for bank-vole blood samples collected in two PUUV-endemic regions in Germany (North Rhine-Westphalia and Baden-Württemberg). A comparison of the results of the rapid field test and standard ELISAs indicated a test efficacy of 93–95%, largely independent of the origin of the antigens used in the ELISA. In ELISAs, reactivity for the German PUUV strain was higher compared to the Swedish strain but not compared to the Finnish strain, which was used for the rapid field test. In conclusion, the use of the rapid field test can facilitate short-term estimation of PUUV seroprevalence in bank-vole populations in Germany and can aid in assessing human PUUV infection risk.
The nutrient choline is necessary for membrane synthesis and methyl donation, with increased requirements during lactation. The majority of immune development occurs postnatally, but the importance of choline supply for immune development during this critical period is unknown. The objective of this study was to determine the importance of the maternal supply of choline during suckling for immune function in the offspring, using a rodent model. At parturition, Sprague–Dawley dams were randomised to either a choline-devoid (ChD; n 7) or choline-sufficient (ChS, 1 g/kg choline; n 10) diet, with their offspring euthanised at 3 weeks of age. In a second experiment, offspring were weaned to a ChS diet until 10 weeks of age (ChD-ChS, n 5 and ChS-ChS, n 9). Splenocytes were isolated, and parameters of immune function were measured. The ChD offspring received less choline in breast milk and had lower final body and organ weights compared with ChS offspring (P<0·05), but this effect disappeared by week 10 with choline supplementation from weaning. ChD offspring had a higher proportion of T cells expressing activation markers (CD71 or CD28) and a lower proportion of total B cells (CD45RA+), and responded less to T cell stimulation (lower stimulation index and less IFN-γ production) ex vivo (P<0·05). ChD-ChS offspring had a lower proportion of total and activated CD4+ T cells, and produced less IL-6 after mitogen stimulation compared with cells from ChS-ChS offspring (P<0·05). Our study suggests that choline is required in the suckling diet to facilitate immune development, and that choline deprivation during this critical period has lasting effects on T cell function later in life.