Abnormal effort-based decision-making represents a potential mechanism underlying motivational deficits (amotivation) in psychotic disorders. Previous research identified effort-allocation impairment in chronic schizophrenia and focused mostly on the physical effort modality. No study has investigated cognitive effort allocation in first-episode psychosis (FEP).
Cognitive effort allocation was examined in 40 FEP patients and 44 demographically matched healthy controls using the Cognitive Effort-Discounting (COGED) paradigm, which quantifies participants' willingness to expend cognitive effort in terms of explicit, continuous discounting of monetary rewards based on parametrically varied cognitive demands (levels N of the N-back task). The relationship between reward-discounting and amotivation was investigated. Group differences in reward-magnitude and effort-cost sensitivity, and the differential associations of these sensitivity indices with amotivation, were explored.
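The general logic of this kind of effort discounting can be illustrated with a toy computation: the subjective value of an effortful offer is the indifference amount (the smaller easy-task payment judged equal to the offer) divided by the offer's face value, and steeper discounting means more reward given up to avoid load. All numbers below are illustrative assumptions, not the study's data or its exact fitting procedure.

```python
# Hypothetical indifference amounts: the payment for the easy 1-back task
# a participant judges equal to a fixed $2.00 offer for each harder
# N-back level. Values are made up for illustration.
base_reward = 2.00
indifference = {2: 1.60, 3: 1.20, 4: 0.90, 5: 0.70, 6: 0.55}

# Subjective value of each effortful offer, as a fraction of face value.
subjective_value = {n: amt / base_reward for n, amt in indifference.items()}

# Overall discounting index: mean proportion of reward given up to avoid
# the higher cognitive load (larger = steeper discounting).
discounting = 1 - sum(subjective_value.values()) / len(subjective_value)
print(round(discounting, 3))  # 0.505
```

On this sketch, a patient with steeper discounting would show indifference amounts that fall off faster with N, yielding a larger index.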
Patients displayed significantly greater reward-discounting than controls. Such discounting was most pronounced in patients with high levels of amotivation, even when N-back performance and reward base amount were taken into consideration. Moreover, patients exhibited reduced reward-benefit and effort-cost sensitivity relative to controls; decreased sensitivity to reward-benefit, but not effort-cost, was correlated with diminished motivation. Reward-discounting and sensitivity indices were generally unrelated to other symptom dimensions, antipsychotic dose and cognitive deficits.
This study provides the first evidence of cognitive effort-based decision-making impairment in FEP, and indicates that decreased effort expenditure is associated with amotivation. Our findings further suggest that abnormal effort allocation and amotivation might primarily be related to blunted reward valuation. Prospective research is required to clarify the utility of effort-based measures in predicting amotivation and functional outcome in FEP.
A better understanding of the interplay among symptoms, cognition and functioning in first-episode psychosis (FEP) is crucial to promoting functional recovery. Network analysis is a promising data-driven approach to elucidating complex interactions among psychopathological variables in psychosis, but it has not been applied in FEP.
This study employed network analysis to examine inter-relationships among a wide array of variables encompassing psychopathology, premorbid and onset characteristics, cognition, subjective quality-of-life and psychosocial functioning in 323 adult FEP patients in Hong Kong. The graphical Least Absolute Shrinkage and Selection Operator (LASSO) combined with extended Bayesian information criterion (EBIC) model selection was used for network construction. The importance of individual nodes in the generated network was quantified by centrality analyses.
Our results showed that amotivation played the most central role and had the strongest associations with other variables in the network, as indexed by node strength. Amotivation and diminished expression displayed differential relationships with other nodes, supporting the validity of the two-factor negative symptom structure. Psychosocial functioning was most strongly connected with amotivation and was weakly linked to several other variables. Within the cognitive domain, digit span demonstrated the highest centrality and was connected with most of the other cognitive variables. Exploratory analysis revealed no significant gender differences in network structure or global strength.
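Node strength, the centrality index referenced above, is simply the sum of the absolute weights of a node's edges in the estimated network. A minimal sketch, using a small hypothetical adjacency matrix (not the network estimated in the study):

```python
import numpy as np

# Illustrative weighted adjacency matrix for a 4-node symptom network,
# as might be estimated by graphical LASSO (values are made up).
W = np.array([
    [0.0, 0.4, 0.2, 0.3],   # amotivation
    [0.4, 0.0, 0.0, 0.1],   # diminished expression
    [0.2, 0.0, 0.0, 0.0],   # psychosocial functioning
    [0.3, 0.1, 0.0, 0.0],   # digit span
])

# Node strength: sum of absolute edge weights attached to each node.
strength = np.abs(W).sum(axis=1)
labels = ["amotivation", "diminished expression",
          "psychosocial functioning", "digit span"]
most_central = labels[int(np.argmax(strength))]
print(most_central)  # amotivation (strength 0.9, the highest here)
```

In practice the network is estimated from data with a regularized inverse-covariance method before strengths are computed; this toy matrix only shows how the centrality index itself works.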
Our results suggest a pivotal role of amotivation in the psychopathology network of FEP and indicate its critical association with psychosocial functioning. Further research is required to verify the clinical significance of diminished motivation for functional outcome in the early course of psychotic illness.
The COllaborative project of Development of Anthropometrical measures in Twins (CODATwins) project is a large international collaborative effort to analyze individual-level phenotype data from twins in multiple cohorts from different environments. The main objective is to study factors that modify genetic and environmental variation of height, body mass index (BMI, kg/m2) and size at birth, and additionally to address other research questions such as long-term consequences of birth size. The project started in 2013 and is open to all twin projects in the world having height and weight measures on twins with information on zygosity. Thus far, 54 twin projects from 24 countries have provided individual-level data. The CODATwins database includes 489,981 twin individuals (228,635 complete twin pairs). Since many twin cohorts have collected longitudinal data, there is a total of 1,049,785 height and weight observations. For many cohorts, we also have information on birth weight and length, own smoking behavior and own or parental education. We found that the heritability estimates of height and BMI systematically changed from infancy to old age. Remarkably, only minor differences in the heritability estimates were found across cultural–geographic regions, measurement time and birth cohort for height and BMI. In addition to genetic epidemiological studies, we looked at associations of height and BMI with education, birth weight and smoking status. Within-family analyses examined differences within same-sex and opposite-sex dizygotic twins in birth size and later development. The CODATwins project demonstrates the feasibility and value of international collaboration to address gene-by-exposure interactions that require large sample sizes and address the effects of different exposures across time, geographical regions and socioeconomic status.
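The intuition behind twin-based heritability estimates like those above can be conveyed with Falconer's classic formula: heritability is roughly twice the difference between monozygotic (MZ) and dizygotic (DZ) twin correlations. CODATwins itself uses full variance-component models rather than this shortcut, and the correlations below are illustrative only.

```python
# Falconer's formula: h^2 = 2 * (r_MZ - r_DZ).
# MZ twins share ~100% of segregating genes, DZ twins ~50%, so the
# excess MZ similarity, doubled, approximates additive heritability.
def falconer_h2(r_mz, r_dz):
    return 2 * (r_mz - r_dz)

# Hypothetical within-pair height correlations (not CODATwins estimates):
print(round(falconer_h2(0.85, 0.45), 2))  # 0.8
```

Changes in such estimates across age groups are what the project describes as heritability "systematically changing from infancy to old age."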
Little is known about long-term employment outcomes for patients with first-episode schizophrenia-spectrum (FES) disorders who received early intervention services.
We compared the 10-year employment trajectories of patients with FES who received early intervention services with those of patients who received standard care. Factors differentiating the employment trajectories were explored.
Patients with FES (N = 145) who received early intervention services in Hong Kong between 1 July 2001 and 30 June 2002 were matched with patients who entered standard care 1 year previously. We used hierarchical clustering analysis to explore 10-year employment clusters in both groups, a mixed model to compare cluster memberships, and piecewise regression analysis to compare the employment trajectories of the two groups.
There were significantly more patients who received the early intervention service in the good employment cluster (early intervention: N = 98 [67.6%]; standard care: N = 76 [52.4%]; P = 0.009). In the poor employment cluster, there was a significant difference in the longitudinal pattern between early intervention and standard care for years 1–5 (P < 0.0001). The number of relapses during the first 3 years, months of full-time employment during the first year and years of education were significant in differentiating the clusters of the early intervention group.
Results suggest there was an overall long-term benefit of early intervention services on employment. However, the benefit was not sustained for all patients. Personalisation of the duration of the early intervention service with a focus on relapse prevention and early vocational reintegration should be considered for service enhancement.
Declaration of interests
No relevant conflicts of interests reported by C.L.M.H., Y.N.S., P.S., H.H.P. and K.K.Y. S.K.W.C., W.C.C. and E.H.M.L. report that they are members of the working group of the Early Assessment Service for Young People with Psychosis (EASY) programme of the Hospital Authority in Hong Kong. E.Y.H.C. is the convener of the working group of the EASY programme of the Hospital Authority in Hong Kong.
We investigated whether neurobehavioral markers of risk for emotion dysregulation were evident among newborns, as well as whether the identified markers were associated with prenatal exposure to maternal emotion dysregulation. Pregnant women (N = 162) reported on their emotion dysregulation prior to a laboratory assessment. The women were then invited to the laboratory to assess baseline respiratory sinus arrhythmia (RSA) and RSA in response to an infant cry. Newborns were assessed after birth via the NICU Network Neurobehavioral Scale. We identified two newborn neurobehavioral factors—arousal and attention—via exploratory factor analysis. Low arousal was characterized by less irritability, excitability, and motor agitation, while low attention was related to a lower threshold for auditory and visual stimulation, less sustained attention, and poorer visual tracking abilities. Pregnant women who reported higher levels of emotion dysregulation had newborns with low arousal levels and less attention. Larger decreases in maternal RSA in response to cry were also related to lower newborn arousal. We provide the first evidence that a woman's emotion dysregulation while pregnant is associated with risks for dysregulation in her newborn. Implications for intergenerational transmission of emotion dysregulation are discussed.
Introduction: Many drugs, including cannabis and alcohol, cause impairment and contribute to motor vehicle collisions (MVCs). Policy makers require knowledge of the prevalence of drug use in crash-involved drivers, and of the types of drugs used, in order to develop effective prevention programs. This issue is particularly relevant with the recent legalization of cannabis. We aim to study the prevalence of alcohol, cannabis, sedating medications, and other drugs in injured drivers from 4 Canadian provinces. Methods: This prospective cohort study obtained excess clinical blood samples from consecutive injured drivers who attended a participating Canadian trauma centre following an MVC. Blood samples were analyzed using a broad-spectrum toxicology screen capable of detecting cannabinoids, cocaine, amphetamines (including their major analogues), and opioids, as well as psychotropic pharmaceuticals (including antihistamines, benzodiazepines, other hypnotics, and sedating antidepressants). Alcohol and cannabinoids were quantified. Health records were reviewed to extract demographic, medical, and MVC information using a standardized data collection tool. Results: This study has been collecting data in 4 trauma centres in British Columbia (BC) since 2011 and was launched in 2 trauma centres in Alberta (AB), 1 in Saskatchewan (SK), and 2 in Ontario (ON) in 2018. In preliminary results from BC (n = 2412), 8% of injured drivers tested positive for THC and 13% for alcohol. Preliminary results from the other provinces (n = 301) suggest regional variation in the prevalence of drivers testing positive for THC (10%–27%), alcohol (17%–29%), and other drugs. By May 2018, an estimated 4500 cases from BC, 600 from AB, 150 from SK, and 650 from ON will have been analyzed. We will report the prevalence of positive tests for alcohol, THC, other recreational drugs, and sedating medications, pre- and post-cannabis legalization.
The number of cases with alcohol and/or THC levels above Canadian per se limits will also be reported. Results will be reported by province, driver sex, age, single- vs. multi-vehicle crashes, and requirement for hospital admission. Conclusion: This will be among the largest international datasets on drug use by injured drivers. Our findings will reveal patterns of drug and alcohol impairment in 4 Canadian provinces pre- and post-cannabis legalization. The significance of these findings and their implications for impaired-driving policy and prevention programs in Canada will be discussed.
Introduction: Individualizing the risk of stroke following a transient ischemic attack (TIA) is a topic of intense research, as existing scores are context-dependent or have not been well validated. The Canadian TIA Score stratifies the risk of subsequent stroke into low, moderate and high risk. Our objective was to prospectively validate the Canadian TIA Score in a new cohort of emergency department (ED) patients. Methods: We conducted a prospective cohort study in 14 Canadian EDs over 4 years. We enrolled consecutive adult patients with an ED visit for TIA or nondisabling stroke. Treating physicians recorded standardized clinical variables onto data collection forms. Given the ability of prompt emergency carotid endarterectomy (CEA) to prevent stroke (NNT = 3) in high-risk patients, our primary outcome was the composite of subsequent stroke or CEA ≤7 days. We conducted telephone follow-up using the validated Questionnaire for Verifying Stroke-Free Status at 7 and 90 days. Outcomes were adjudicated by panels of 3 local stroke experts, blinded to the index ED data collection form. Based on prior work, we estimated that a sample size of 5,004 patients, including 93 subsequent strokes, would yield 95% confidence bands of +/−10% for sensitivity and likelihood ratio (LR). Our analyses assessed interval LRs (iLRs) with 95% CIs. Results: We prospectively enrolled 7,569 patients (mean age 68.4 +/− 14.7 years; 52.4% female), of whom 107 (1.4%) had a subsequent stroke and 74 (1.0%) CEA ≤7 days (total outcomes = 181). We enrolled 81.2% of eligible patients; missed patients were similar to those enrolled. The Canadian TIA Score stratified the risk of stroke/CEA ≤7 days as: Low (probability <0.2%, iLR 0.20 [95% CI 0.091–0.44]); Moderate (probability 1.3%, iLR 0.79 [0.68–0.92]); High (probability 2.6%, iLR 2.2 [1.9–2.6]). Sensitivity analysis for stroke ≤7 days alone yielded similar results: Low iLR 0.17 [95% CI 0.056–0.52], Medium iLR 0.89 [0.75–1.1], High iLR 2.0 [1.6–2.4].
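An interval likelihood ratio of the kind reported above compares how often patients with the outcome fall into a given risk stratum versus how often outcome-free patients do. A minimal sketch, with illustrative counts rather than the study's data:

```python
# Interval likelihood ratio for one stratum of a risk score:
# iLR = P(stratum | outcome) / P(stratum | no outcome).
def interval_lr(cases_in_stratum, total_cases,
                noncases_in_stratum, total_noncases):
    return ((cases_in_stratum / total_cases)
            / (noncases_in_stratum / total_noncases))

# e.g. a hypothetical high-risk stratum capturing 90 of 181 outcomes
# among 2,500 of 7,388 outcome-free patients:
print(round(interval_lr(90, 181, 2500, 7388), 2))  # 1.47
```

An iLR above 1 means the stratum is enriched for outcomes (raising post-test probability), below 1 means it is depleted, which is why a low-risk iLR of 0.20 supports safe discharge.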
Conclusion: The Canadian TIA Score accurately identifies TIA patients' risk of stroke/CEA ≤7 days. Patients classified as low risk can be safely discharged following a careful ED assessment with elective follow-up. Patients at moderate risk can undergo additional testing in the ED, have antithrombotic therapy optimized, and be offered early stroke specialist follow-up. Patients at high risk should in most cases be fully investigated and managed, ideally in consultation with a stroke specialist, during their index ED visit.
To determine the factors related to multiple ventilation tube insertions in children with otitis media with effusion.
A retrospective review was performed of 126 ears of 81 children aged less than 12 years who had undergone insertion of a Paparella type 1 ventilation tube for the first time between August 2012 and March 2018.
Mean age at the first operation was 4.0 ± 2.2 years, and the mean duration of otitis media with effusion before the first ventilation tube insertion was 5.4 ± 4.5 months. Among 126 ears, 80 (63.5 per cent) had a single ventilation tube insertion and 46 (36.5 per cent) had multiple insertions. On multivariate logistic regression, tympanic membrane retraction, serous middle-ear discharge, and early recurrence of otitis media with effusion were independent predictive factors of multiple ventilation tube insertions.
Tympanic membrane retraction, serous middle-ear discharge, and early recurrence of otitis media with effusion after the first tube extrusion are associated with multiple ventilation tube insertions.
A new ESCA (electron spectroscopy for chemical analysis) instrument has been developed to provide high sensitivity and efficient operation for laboratory analysis of composition and chemical bonding in very thin surface layers of solid samples. High sensitivity is achieved by means of the high-intensity, efficient X-ray source described by Davies and Herglotz at the 1968 Denver X-Ray Conference, in combination with the new electron energy analyzer described by Lee at the 1972 Pittsburgh Conference on Analytical Chemistry and Applied Spectroscopy. A sample chamber designed for rapid introduction and replacement of samples has adequate facilities for various sample treatments and conditioning, followed immediately by ESCA analysis of the sample.
Examples of application are presented, demonstrating the sensitivity and resolution achievable with this instrument. Its usefulness in trace surface analysis is shown and some “chemical shifts” measured by the instrument are compared with those obtained by X-ray spectroscopy.
Identifying routes of transmission among hospitalized patients during a healthcare-associated outbreak can be tedious, particularly among patients with complex hospital stays and multiple exposures. Data mining of the electronic health record (EHR) has the potential to rapidly identify common exposures among patients suspected of being part of an outbreak.
We retrospectively analyzed 9 hospital outbreaks that occurred during 2011–2016 and that had previously been characterized both according to transmission route and by molecular characterization of the bacterial isolates. We determined (1) the ability of data mining of the EHR to identify the correct route of transmission, (2) how early the correct route was identified during the timeline of the outbreak, and (3) how many cases in the outbreaks could have been prevented had the system been running in real time.
Correct routes were identified for all outbreaks by the second patient, except for one outbreak involving >1 transmission route, which was detected at the eighth patient. Up to 40 infections (78% of possible preventable infections) or 34 infections (66%) could have been prevented had data mining been implemented in real time, assuming the initiation of an effective intervention within 7 or 14 days of identification of the transmission route, respectively.
Data mining of the EHR was accurate for identifying routes of transmission among patients who were part of the outbreak. Prospective validation of this approach using routine whole-genome sequencing and data mining of the EHR for both outbreak detection and route attribution is ongoing.
Research suggests intergenerational links between childhood abuse and neglect and subsequent parenting quality, but little is known about the potential mechanisms underlying intergenerational continuities in parenting. Adult romantic functioning may be one plausible mechanism, given its documented associations with both adverse caregiving in childhood and parenting quality in adulthood. The present study used data from the Minnesota Longitudinal Study of Risk and Adaptation to (a) investigate prospective associations between childhood experiences of abuse and neglect and multiple parenting outcomes in adulthood, and (b) evaluate the degree to which adult romantic functioning mediates those associations. Information regarding childhood abuse and neglect was gathered prospectively from birth through age 17.5 years. Multimethod assessments of romantic functioning were collected repeatedly through early adulthood (ages 20 to 32 years), and parenting quality was assessed as participants assumed a parenting role (ages 21 to 38 years). As expected, childhood abuse and neglect experiences predicted less supportive parenting (observed and interview rated) and higher likelihood of self-reported Child Protective Services involvement. The association with interview-rated supportive parenting was partially mediated by lower romantic competence, whereas the association with Child Protective Services involvement was partially mediated by more relational violence in adult romantic relationships. Implications of these novel prospective findings for research and clinical intervention are discussed.
The white-backed planthopper, Sogatella furcifera (Horváth) (Hemiptera: Delphacidae), has emerged as a serious rice pest in Asia. In the present study, 12 microsatellite markers were employed to investigate the genetic structure, diversity and migration routes of 43 populations sampled from seven Asian countries (Bangladesh, China, Korea, Laos, Nepal, Thailand and Vietnam). In the isolation-by-distance analysis, a significant positive correlation was observed between genetic and geographic distances by the Mantel test (r² = 0.4585, P = 0.01), indicating a role of geographic isolation in the genetic structure of S. furcifera. A population assignment test using the first-generation migrant detection method (threshold α = 0.01) revealed southern China and northern Vietnam as the main sources of S. furcifera in Korea. Nepal and Bangladesh might be additional potential sources via interconnection with Vietnamese populations. This paper provides useful data on the migration routes and origin of S. furcifera in Korea and will contribute to planthopper resistance management.
A cluster of Salmonella Paratyphi B variant L(+) tartrate(+) infections with indistinguishable pulsed-field gel electrophoresis patterns was detected in October 2015. Interviews initially identified nut butters, kale, kombucha, chia seeds and nutrition bars as common exposures. Epidemiologic, environmental and traceback investigations were conducted. Thirteen ill people infected with the outbreak strain were identified in 10 states with illness onset during 18 July–22 November 2015. Eight of 10 (80%) ill people reported eating Brand A raw sprouted nut butters. Brand A conducted a voluntary recall. Raw sprouted nut butters are a novel outbreak vehicle, though contaminated raw nuts, nut butters and sprouted seeds have all caused outbreaks previously. Firms producing raw sprouted products, including nut butters, should consider a kill step to reduce the risk of contamination. People at greater risk for foodborne illness may wish to consider avoiding raw products containing raw sprouted ingredients.
Giardia duodenalis is the most common intestinal parasite of humans in the USA, but the risk factors for sporadic (non-outbreak) giardiasis are not well described. The Centers for Disease Control and Prevention and the Colorado and Minnesota public health departments conducted a case-control study to assess risk factors for sporadic giardiasis in the USA. Cases (N = 199) were patients with non-outbreak-associated laboratory-confirmed Giardia infection in Colorado and Minnesota, and controls (N = 381) were matched by age and site. Identified risk factors included international travel (aOR = 13.9; 95% CI 4.9–39.8), drinking water from a river, lake, stream, or spring (aOR = 6.5; 95% CI 2.0–20.6), swimming in a natural body of water (aOR = 3.3; 95% CI 1.5–7.0), male–male sexual behaviour (aOR = 45.7; 95% CI 5.8–362.0), having contact with children in diapers (aOR = 1.6; 95% CI 1.01–2.6), taking antibiotics (aOR = 2.5; 95% CI 1.2–5.0) and having a chronic gastrointestinal condition (aOR = 1.8; 95% CI 1.1–3.0). Eating raw produce was inversely associated with infection (aOR = 0.2; 95% CI 0.1–0.7). Our results highlight the diversity of risk factors for sporadic giardiasis and the importance of non-international-travel-associated risk factors, particularly those involving person-to-person transmission. Prevention measures should focus on reducing risks associated with diaper handling, sexual contact, swimming in untreated water, and drinking untreated water.
Simulation models are used widely in pharmacology, epidemiology and health economics (HE). However, there have been no attempts to incorporate models from these disciplines into a single integrated model. Accordingly, we explored this linkage to evaluate the epidemiological and economic impact of oseltamivir dose optimisation in supporting pandemic influenza planning in the USA. An HE decision-analytic model was linked to a pharmacokinetic/pharmacodynamic (PK/PD)–dynamic transmission model simulating the impact of pandemic influenza with low virulence and low transmissibility, and with high virulence and high transmissibility. The cost-utility analysis was conducted from the payer and societal perspectives, comparing oseltamivir 75 and 150 mg twice daily (BID) with no treatment over a 1-year time horizon. Model parameters were derived from published studies. Outcomes were measured as cost per quality-adjusted life year (QALY) gained. Sensitivity analyses were performed to examine the integrated model's robustness. Under both pandemic scenarios, compared with no treatment, the use of oseltamivir 75 or 150 mg BID led to a significant reduction in influenza episodes and influenza-related deaths, translating into substantial savings of QALYs. Overall drug costs were offset by the reduction in both direct and indirect costs, making these two interventions cost-saving from both perspectives. The results were sensitive to the proportion of inpatient presentations at the emergency visit and to patients' quality of life. Integrating PK/PD, epidemiological and HE models is achievable. Whilst further refinement of this novel linkage model is needed to more closely mimic reality, the current study has generated useful insights to support influenza pandemic planning.
Arthropod communities in the tropics are increasingly impacted by rapid changes in land use. Because species showing distinct seasonal patterns of activity are thought to be at higher risk of climate-related extirpation, global warming is generally considered a lower threat to arthropod biodiversity in the tropics than in temperate regions. To examine changes associated with land use and weather variables in tropical arthropod communities, we deployed Malaise traps in three major anthropogenic forest types (secondary reserve forest, oil palm forest, and urban ornamental forest (UOF)) in Peninsular Malaysia and collected arthropods continuously for 12 months. We used metabarcoding protocols to characterize the diversity within weekly samples. We found that changes in the composition of arthropod communities were significantly associated with maximum temperature in all three forests, but the shifts were reversed in the UOF compared with the other forests. This suggests arthropods in the forests of Peninsular Malaysia face a double threat: community shifts and biodiversity loss due to the exploitation and disturbance of forests, which consequently puts species at further risk from global warming. We highlight the positive feedback between land use and temperature, which poses a threat to arthropod communities and has further implications for ecosystem functioning and human well-being. Consequently, conservation and mitigation plans are urgently needed.
A substantial proportion of persons with mental disorders seek treatment from complementary and alternative medicine (CAM) professionals. However, data on how CAM contacts vary across countries, mental disorders and their severity, and health care settings are largely lacking. The aim was therefore to investigate the prevalence of contacts with CAM providers in a large cross-national sample of persons with 12-month mental disorders.
In the World Mental Health Surveys, the Composite International Diagnostic Interview was administered to determine the presence of past 12-month mental disorders in 138 801 participants aged 18–100 years, drawn from representative general population samples. Participants were recruited between 2001 and 2012. Rates of self-reported CAM contacts for each of the 28 surveys across 25 countries and 12 mental disorder groups were calculated for all persons with past 12-month mental disorders. Mental disorders were grouped into mood disorders, anxiety disorders or behavioural disorders, and further divided by severity level. Satisfaction with conventional care was also compared with satisfaction with CAM contacts.
An estimated 3.6% (standard error 0.2%) of persons with a past 12-month mental disorder reported a CAM contact, which was two times higher in high-income countries (4.6%; standard error 0.3%) than in low- and middle-income countries (2.3%; standard error 0.2%). CAM contacts were largely comparable across disorder types, but particularly high in persons receiving conventional care (8.6–17.8%). CAM contacts increased with increasing mental disorder severity. Among persons receiving specialist mental health care, CAM contacts were reported by 14.0% for severe mood disorders, 16.2% for severe anxiety disorders and 22.5% for severe behavioural disorders. Satisfaction with care was comparable for CAM contacts (78.3%) and conventional care (75.6%) in persons who received both.
CAM contacts are common in persons with severe mental disorders, in high-income countries, and in persons receiving conventional care. Our findings support the notion of CAM as largely complementary, and contrast with suggestions that CAM use concerns only persons with mild, transient complaints. There was no indication that persons were less satisfied with CAM visits than with conventional care. We encourage health care professionals in conventional settings to openly discuss the care patients are receiving, whether conventional or not, and their reasons for doing so.
Little is known about the combined use of benzodiazepines and antidepressants in older psychiatric patients. This study examined the prescription pattern of concurrent benzodiazepines in older adults treated with antidepressants in Asia, and explored its demographic and clinical correlates.
The data of 955 older adults with any type of psychiatric disorders were extracted from the database of the Research on Asian Psychotropic Prescription Patterns for Antidepressants (REAP-AD) project. Demographic and clinical characteristics were recorded using a standardized protocol and data collection procedure. Both univariate and multiple logistic regression analyses were performed.
The proportion of patients receiving a benzodiazepine–antidepressant combination in this cohort was 44.3%. Multiple logistic regression analysis revealed that higher antidepressant doses, younger age (<65 years), inpatient status, treatment in a public hospital, major comorbid medical conditions, antidepressant type, and country/territory were significantly associated with more frequent co-prescription of benzodiazepines with antidepressants.
Nearly half of the older adults treated with antidepressants in Asia are prescribed concurrent benzodiazepines. Given the potentially adverse effects of benzodiazepines, the rationale for co-prescribing benzodiazepines with antidepressants needs to be revisited.
It has not been well established whether dietary folate intake reduces the risk of developing diabetes. We aimed to clarify the prospective association between dietary folate intake and type 2 diabetes (T2D) risk among 7333 Korean adults aged 40 years or older who were included in the Multi-Rural Communities Cohort. Dietary folate intake was estimated from all 106 food items listed on a FFQ, not including folate intake from supplements. Two measurements of dietary folate intake were used: baseline consumption and average consumption from baseline until just before the end of follow-up. The association between folate intake and T2D risk was determined with a modified Poisson regression model with a robust error estimator, controlling for potential confounders. Over 29 745 person-years, 319 cases of diabetes were ascertained. In multivariable analyses, dietary folate intake was inversely associated with the risk of T2D in women but not in men. For women, the incidence rate ratio of diabetes in the third tertile compared with the first tertile was 0·57 (95 % CI 0·38–0·87, P for trend = 0·0085) in the baseline consumption model and 0·64 (95 % CI 0·43–0·95, P for trend = 0·0244) in the average consumption model. These inverse associations were found in both the normal fasting blood glucose and impaired fasting glucose groups among women. Among non-users of multinutrient and vitamin supplements, the significant inverse association remained. Thus, higher dietary folate intake is prospectively associated with a lower risk of diabetes in women.