To assess the Framingham risk score as a prognostic tool for idiopathic sudden sensorineural hearing loss patients.
Medical records were reviewed for unilateral idiopathic sudden sensorineural hearing loss patients between January 2010 and October 2017. The 10-year risk of developing cardiovascular disease was calculated. Patients were subdivided into groups: group 1 – Framingham risk score of less than 10 per cent (n = 28); group 2 – score of 10 to less than 20 per cent (n = 6); and group 3 – score of 20 per cent or higher (n = 5).
Initial pure tone average and Framingham risk score were not significantly associated (p = 0.32). Thirteen patients in group 1 recovered completely (46.4 per cent), but none in groups 2 and 3 showed complete recovery. Initial pure tone average and Framingham risk score were significantly associated in multivariable linear regression analysis (R2 = 0.36). The regression coefficient was 0.33 (p = 0.003) for initial pure tone average and −0.67 (p = 0.005) for Framingham risk score.
Framingham risk score may be useful in predicting outcomes for idiopathic sudden sensorineural hearing loss patients, as those with a higher score showed poorer hearing recovery.
We recently identified an association between the GRIN2B rs2058878 variant and abstinence length in acamprosate-treated alcoholics (Karpyak et al. 2014). Here we present results of additional analyses exploring associations in the same sample (225 alcoholics treated with acamprosate for three months) at the gene and gene-set levels, covering 12 genes involved in glycine signaling, 4 genes involved in glutamate reuptake, synthesis, and degradation, and 7 genes encoding NMDA receptor subunits.
After adjustment for relevant covariates, gene-level tests were performed using principal components (PC) analysis. Gene-set analyses were performed using the PC-Gamma approach with varying soft truncation threshold (STT) for the Gamma method for combining gene-level p-values.
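A minimal sketch of the Gamma method for combining gene-level p-values: each p-value is passed through the inverse survival function of a Gamma(shape, 1) distribution, and under the null the sum of the transformed values follows Gamma(n × shape, 1). The shape parameter (which governs the soft truncation threshold) and the example p-values below are illustrative assumptions, not values or settings from the study.

```python
import numpy as np
from scipy import stats

def gamma_method(p_values, shape=0.5):
    """Combine independent p-values with the Gamma method.

    Smaller shape parameters correspond to lower soft truncation
    thresholds, i.e. more weight placed on small p-values.
    """
    p = np.asarray(p_values, dtype=float)
    scores = stats.gamma.isf(p, a=shape)              # inverse survival transform
    return stats.gamma.sf(scores.sum(), a=shape * p.size)

# Illustrative gene-level p-values (hypothetical, not from the study)
print(gamma_method([0.024, 0.016, 0.039, 0.067, 0.060]))
```

A run of small p-values yields a small combined p-value, while uniform p-values do not.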
Shorter abstinence was associated with increased intensity of alcohol craving and lower number of days between last drink and initiation of acamprosate treatment. After adjustment for covariates, we observed nominally significant association of abstinence length with variation in the AMT (p=0.024), GRIN3A (p=0.016) and SHMT2 (p=0.039) genes, and marginally significant evidence for association with the GRIN2B (p=0.067) and GLRB (p=0.060) genes. At the gene-set level, association of abstinence length with variation in the glycine pathway was nominally significant (p=0.042 with STT=0.37). Marginal evidence of association with abstinence length was also observed for variation in the NMDA-receptor subunits (p<0.1 for STT<0.15).
Our findings suggest association of abstinence length in acamprosate-treated alcoholics with variation in the glycine signaling pathway and genes encoding NMDA receptor subunits. Investigation of the mechanisms underlying these associations and their usefulness for individualized treatment selection should follow.
Psychiatric comorbidities and alcohol craving are known contributors to differences in alcohol consumption patterns.
Univariate and multivariable linear regression models were used to examine associations and interactions of the Inventory of Drug Taking Situations (IDTS) negative, positive, and temptation sub-scale scores, sex, and comorbid depression and anxiety (determined by the Psychiatric Research Interview of Substance and Mood Disorders, PRISM) with alcohol consumption measured by Time Line Follow Back (TLFB) over the preceding 90 days in 287 males and 156 females meeting DSM-IV criteria for alcohol dependence.
IDTS positive, negative and temptation scores were strongly associated with increased alcohol consumption measures including the number of drinks per day and number of drinking days per week (P < 0.0001). Male sex was associated with higher amount of alcohol consumption per drinking day (P < 0.001), but not with the number of drinking days per week (P > 0.05). In men, lifetime history of depression was associated with fewer drinking days (P = 0.0084) and fewer hazardous drinking days (P = 0.0214) but not with differences in daily alcohol consumption. In women, depression history was not significantly associated with alcohol consumption measures. Post-hoc sex-stratified analyses suggested that the association of the negative IDTS score with total amount of alcohol consumed by men may be modified (decreased) by lifetime depression history. We found no associations of alcohol consumption measures with anxiety or substance-induced depression.
Decreased frequency of drinking in male alcoholics with lifetime depression history is unexpected. This finding emphasizes the complex relationships between alcoholism and depression, which require further investigation.
Although a number of studies have examined the relationship between depression and obesity, the evidence remains insufficient to establish the specific pattern of the relationship between depression and body mass index (BMI) categories. Thus, this study aimed to investigate the relationship between depression and BMI categories.
A cross-sectional study was conducted in a cohort of 159,390 Koreans from the Kangbuk Samsung Health Study (KSHS). Study participants were classified into 5 groups by Asian-specific BMI cut-offs (18.5, 23, 25, and 30 kg/m2). The presence of depression was determined by Center for Epidemiologic Studies-Depression scale (CES-D) scores of ≥16 and ≥25. Adjusted odds ratios (ORs) for depression were estimated by multiple logistic regression analysis, with the 5 BMI categories as the independent variable and depression as the dependent variable. Subgroup analyses were conducted by gender and age.
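A sketch of how category-specific odds ratios against a reference group arise. The counts below are synthetic illustrations, not the KSHS data, and this shows only the unadjusted case; a full analysis would use multiple logistic regression with covariates.

```python
import numpy as np

# Synthetic counts (depressed, not depressed) per BMI category —
# illustrative only, not the study data.
counts = {
    "underweight":  (130, 870),
    "normal":       (100, 900),   # reference group
    "overweight":   (95, 905),
    "obese":        (101, 899),
    "severe obese": (125, 875),
}

def odds(dep, not_dep):
    return dep / not_dep

ref = odds(*counts["normal"])
for group, (dep, not_dep) in counts.items():
    if group == "normal":
        continue
    or_ = odds(dep, not_dep) / ref     # unadjusted odds ratio vs. normal
    print(f"{group}: OR = {or_:.2f}")
```

With counts shaped like these, the ORs trace the U-shaped pattern reported in the results: elevated at both extremes of BMI, near 1 in between.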
When the normal-weight group was set as the reference, the adjusted ORs for depression formed a U-shaped pattern across BMI categories [underweight: 1.31 (1.14–1.50), overweight: 0.94 (0.85–1.04), obese: 1.01 (0.91–1.12), severely obese: 1.28 (1.05–1.54)]. This pattern was more prominent in women and in the younger age group than in men and the elderly subgroup. The BMI range with the lowest likelihood of depression was 18.5–25 kg/m2 in women and 23–25 kg/m2 in men.
There was a U-shaped relationship between depression and BMI categories. This finding suggests that both underweight and severe obesity are associated with an increased risk of depression.
The area of dry-season rice (Oryza sativa L.) has rapidly increased in Cambodia owing to the large-scale development of irrigation infrastructure, but little is known about its potential productivity and adaptive crop management. The objective of our study was to evaluate the potential yield and nutrient requirements of dry-season rice in Cambodia, and the economic feasibility of the soil-specific management recommended by the government. Field experiments were conducted on four soil types (Bakan, equivalent to Alfisol; Krakor, Inceptisol; Prateah Lang, Plinthustalfs; and Toul Samroung, Endoaqualfs) in four provinces (Battambang, Kampong Thom, Pursat, and Siem Reap) during the 2016 and 2017 dry seasons to compare 14 (2016) and 8 (2017) N-P-K combinations. Grain yield ranged from 1.0 to 5.5 t ha−1 in 2016 and from 1.3 to 6.7 t ha−1 in 2017. Potential yield from the experiments was 6–7 t ha−1 on Toul Samroung soil, 5–6 t ha−1 on Bakan soil, and 3–5 t ha−1 on Prateah Lang and Krakor soils. A rate of 140-60-60 kg ha−1 of N-P2O5-K2O was more than enough to achieve the best yields on any soil group. On the other hand, the modest application rates in soil-specific management (44–78 kg ha−1 of N, 23–28 kg ha−1 of P2O5, 0–30 kg ha−1 of K2O) proved reasonable for resource-poor farmers in Cambodia, since this treatment always provided >75 % of the highest economic profit achieved in high-input plots.
Rice is widely grown in rainfed lowlands during the wet season in the Mekong region. Limited nutrient availability is a common constraint on crop yield, and the optimal rate of fertilizer application depends on the soil type. The objective of our study was to evaluate rice productivity and the economic feasibility of various nutrient management regimes in Cambodia. We conducted field experiments on three soil types (Prey Khmer, Prateah Lang, and Toul Samroung, equivalent to Psamments, Plinthustalfs, and Endoaqualfs, respectively) in four provinces (Battambang, Kampong Thom, Pursat, and Siem Reap) during the 2016 and 2017 wet seasons to compare nine (2016) and seven (2017) N–P–K combinations. Grain yield ranged from 0.9 to 4.8 t ha−1 in 2016 and from 1.0 to 5.2 t ha−1 in 2017, depending on soil type and nutrient management. The Prey Khmer soil contained around 80% sand, and rice yield responded most weakly to nutrient management. The moderate fertilizer input in the current soil-specific recommendation was effective on this soil type. However, on more fertile soils with a higher clay content and a higher cation-exchange capacity (Toul Samroung and Prateah Lang), an additional 20 kg N ha−1 combined with adding 15 kg ha−1 of P2O5 or 20 kg ha−1 of K2O significantly increased yield and economic return. Although P and K use during Cambodia’s wet season is uncommon, our results demonstrate the importance of these nutrients in improving the country’s rice production.
Late-life depression, falls, and fall worry are public health problems. While previous research confirms the cross-sectional relationship between depression and fall worry, few longitudinal studies have examined whether changes in fall worry are associated with changes in depressive illness and vice versa. This study examined longitudinal relationships between probable major depression (PMD) and activity-limiting fall worry (ALW).
Design, Setting, Participants, Measurements:
This longitudinal panel observational study used data from the National Health and Aging Trends Study (NHATS) waves 5 (referred to as T1 in this study) and 6 (T2), conducted in 2015 and 2016, respectively (N = 6,299, aged 65 and older). We examined associations of new and continued ALW between T1 and T2 with T2 PMD, controlling for T1 PMD; and associations of new and continued PMD between T1 and T2 with T2 ALW, controlling for T1 ALW. We used χ2 and t tests for descriptive statistics and logistic regression for multivariable analysis.
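The lagged logistic model described above (T2 outcome regressed on new/continued exposure indicators, controlling for the T1 outcome) can be sketched with a small Newton-Raphson logistic fit. All data below are simulated for illustration, not NHATS records, and the effect sizes are assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 20000

# Simulated panel (illustrative only):
# group 0 = no ALW at either wave, 1 = new ALW at T2, 2 = ALW at both waves
group = rng.choice(3, size=n, p=[0.80, 0.15, 0.05])
new_alw = (group == 1).astype(float)
cont_alw = (group == 2).astype(float)
pmd_t1 = rng.binomial(1, 0.10, n).astype(float)   # baseline depression

# True model used to simulate the T2 outcome
lin = -2.5 + 1.0 * new_alw + 0.8 * cont_alw + 1.5 * pmd_t1
pmd_t2 = rng.binomial(1, 1.0 / (1.0 + np.exp(-lin)))

# Newton-Raphson fit of the lagged logistic model:
# T2 outcome on exposure indicators, controlling for the T1 outcome
X = np.column_stack([np.ones(n), new_alw, cont_alw, pmd_t1])
beta = np.zeros(X.shape[1])
for _ in range(25):
    p = 1.0 / (1.0 + np.exp(-X @ beta))
    grad = X.T @ (pmd_t2 - p)
    hess = X.T @ (X * (p * (1.0 - p))[:, None])
    beta += np.linalg.solve(hess, grad)

print("adjusted ORs (new ALW, continued ALW, T1 PMD):", np.exp(beta[1:]))
```

Exponentiating the fitted coefficients gives the adjusted odds ratios, which recover the simulated effects.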
Those with new ALW at T2 had significantly greater odds of T2 PMD compared to those without ALW at both time points (AOR = 2.64, 95% CI = 1.98−3.51), and those with new PMD at T2 had significantly greater odds of T2 ALW (AOR = 2.42, 95% CI = 1.66−3.52). Those with continued PMD also had greater odds of T2 ALW compared to those without PMD at either time point (AOR = 2.31, 95% CI = 1.62−3.29).
The findings add to knowledge about bidirectional (mutually reinforcing) relationships between depression and activity-limiting fall worry. Innovative interventions are needed to reduce both late-life depression and activity-limiting fall worry.
Background: For robot-assisted telesurgery, the workstation, in particular the haptic handcontroller (itself a robot), is paramount to the performance of surgery. Based on the requirements for microsurgery, a novel haptic handcontroller, Excalibur, has been developed. Methods: Thirty-two surgeons performed a peg-in-hole task (simulating micromanipulation) with Excalibur and two commercially available handcontrollers (Sigma 7 and PHANToM Premium 3.0). A modified Kuka end-effector with bipolar forceps and a Leica microscope completed the remote robotic site. Comparisons were made on training time, task completion time, and number of errors. All participants completed a questionnaire. Results: Repeated-measures ANOVA demonstrated significance for task completion time (p = 0.004), training time (p = 0.021), and number of errors (p = 0.004). Surgeons were faster with Excalibur (72 s) than with Sigma (96 s, p = 0.005) and PHANToM (96 s, p = 0.036). Training time was shorter with Excalibur than with PHANToM (210 s vs 310 s, p = 0.013), and users made fewer errors (0.7 vs 2.1, p = 0.008). Differences in training time for Sigma (285 s) and in its number of errors (1.3) were not significant. The surgeons found Excalibur smoother, more comfortable, less tiring, and easier to maneuver, with more realistic force feedback and superior movement fidelity. Conclusions: Surgical performance was superior with Excalibur compared with the other handcontrollers. This may reflect the microsurgical requirements and unique design architecture of Excalibur.
Recently there has been significant work in the social sciences involving ensembles of social networks, that is, multiple, independent, social networks such as students within schools or employees within organizations. There remains, however, very little methodological work on exploring these types of data structures. We present methods for clustering social networks with observed nodal class labels, based on statistics of walk counts between the nodal classes. We extend this method to consider only non-backtracking walks, and introduce a method for normalizing the counts of long walk sequences using those of shorter ones. We then present a method for clustering networks based on these statistics to explore similarities among networks. We demonstrate the utility of this method on simulated network data, as well as on advice-seeking networks in education.
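The walk-count statistics can be sketched with adjacency-matrix powers: with a class-indicator matrix C, the number of length-k walks between nodal classes is C^T A^k C. This toy sketch omits the non-backtracking restriction and the normalization step described above.

```python
import numpy as np

def class_walk_counts(A, labels, k):
    """Count walks of length k between each pair of nodal classes.

    A: (n, n) adjacency matrix; labels: length-n array of class ids.
    Returns a (c, c) matrix whose (i, j) entry is the number of
    length-k walks starting in class i and ending in class j.
    """
    classes = np.unique(labels)
    C = (labels[:, None] == classes[None, :]).astype(float)  # indicator matrix
    return C.T @ np.linalg.matrix_power(A, k) @ C

# Toy network: a 4-cycle with alternating class labels
A = np.array([[0, 1, 0, 1],
              [1, 0, 1, 0],
              [0, 1, 0, 1],
              [1, 0, 1, 0]], dtype=float)
labels = np.array([0, 1, 0, 1])
print(class_walk_counts(A, labels, 2))
```

On the 4-cycle, every length-2 walk returns to the class it started in, so the off-diagonal counts are zero; comparing such per-class count matrices across networks is the basis for the clustering step.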
Infrared signal measurements of a micro-turbojet engine were conducted to characterize engine performance and infrared signature as the exhaust nozzle configuration was varied. A cone-type nozzle and five rectangular nozzles with aspect ratios from one to five were used for the experiments. The results confirm that the thrust and fuel consumption rate of the engine do not change greatly with exhaust nozzle shape. At an aspect ratio of 5, the specific fuel consumption increased by about 3% relative to the reference cone nozzle, but the infrared signal was reduced by up to 14%. Measurements of the plume gas temperature distribution clarify the correlation between infrared signal and plume temperature. With the cone nozzle, the plume had a circular distribution and its high-temperature core persisted far downstream. With the rectangular nozzles, the temperature distribution retained the rectangular shape as the aspect ratio increased, the average temperature decreased sharply, and the plume spread more widely.
OBJECTIVES/SPECIFIC AIMS: The purpose of the study was to describe patient characteristics associated with subsequent development of bowel ischemia. Primary outcomes were survival to discharge, 30-day and 1-year survival in patients with LVAD who subsequently develop bowel ischemia. Secondary outcomes included characteristics of patients who survive to discharge after bowel ischemia and those who do not. These included markers of patient condition prior to surgical/endoscopic intervention such as lactate levels, ICU admission, ventilator dependence, vasopressor and renal replacement requirements, as well as presence of sepsis. Of these, we predicted that lactate levels and white blood cell count would be significantly elevated pre- and post-operatively in patients who do not recover from bowel ischemic event. We used Mann-Whitney U Test to examine lactate levels between the two groups as our sample size was <30 and therefore necessitated the use of non-parametric testing. METHODS/STUDY POPULATION: In this single-center retrospective study, we analyzed all patients who underwent durable, CF-LVAD implantation at Duke University Medical Center (DUMC) between January 2008 and November 2018. Patients were screened using CPT codes for abdominal surgical exploration or ICD codes for intestinal vascular insufficiency. Final cohort was selected with confirmed diagnosis of intestinal ischemia based on surgical exploration or endoscopic intervention. Patient characteristics including pre-LVAD comorbidities, indication for LVAD implant, and clinical picture prior to bowel ischemic event were collected. Specific characteristics related to bowel ischemia were summarized, including diagnostic imaging, time from imaging study to operative intervention, and intraoperative details. Patient outcomes including survival to discharge, 30-day-, and 1-year survival were summarized. Patients were stratified based on survival to discharge status. 
Continuous variables were reported as median and interquartile range and compared using Mann-Whitney U test. Categorical variables were reported as proportions and compared using Fisher’s exact test as appropriate. RESULTS/ANTICIPATED RESULTS: A total of 754 patients underwent durable, CF-LVAD implant at DUMC, of which 21 subsequently developed intestinal ischemia (incidence 2.8%). The majority were male (81%) and treated as destination therapy (76.2%). Ten patients (50%) survived to discharge (one remains hospitalized). The proportions of patients receiving HeartMate II (60% vs. 50%, p=1.0), HeartMate III (20% vs. 10%, p=1.0), and HeartWare (20% vs. 40%, p=0.6) were not significantly different between patients who survived to discharge and patients who did not. Median time from LVAD implant to diagnosis of bowel ischemia did not vary significantly between the patient groups (11.5 days, IQR 34.75 vs. 16.5 days, IQR 173.8; p=0.40), nor did the median time from diagnosis to surgical intervention (264.5 minutes, IQR 497.8 vs. 323 minutes, IQR 440, p=0.82). In the 48 hours leading to diagnosis and intervention, renal replacement therapy (50% vs. 0%, p=0.033) was more prevalent in patients who did not survive to discharge. Differences in pre- and post-operative lactate levels were not significantly different in patient groups. A similar pattern of diagnostic study preference emerged from both groups, with CT being the most common (76.2%) followed by KUB (42.9%). Upper endoscopy/colonoscopy was performed in 7 patients (33.3%), of which 5 also had operative exploration. A total of 19 patients underwent abdominal exploration (90.5%). Nine had large bowel resection (42.9%) while 14 had small bowel resection (66.7% with average 75cm removed). Overall survival at 1-year was 33%. For those making it to discharge (n=10), one year survival was 60%. 
DISCUSSION/SIGNIFICANCE OF IMPACT: This is the first institutional study to our knowledge to describe intestinal ischemia in patients receiving CF-LVAD therapy. Intestinal ischemia in patients receiving CF-LVAD therapy is associated with high mortality and morbidity. Diagnosis of bowel ischemia should be considered in patients presenting with clinical symptoms of bowel ischemia in addition to requirement of renal replacement therapy. Imaging modalities used were dependent on the clinical situation and were not always necessary prior to intervention. Further investigation is warranted to identify predictors of this morbid complication.
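The two comparisons named in the methods (Mann-Whitney U for continuous variables given the small sample, Fisher's exact test for proportions) can be sketched with SciPy. All values below are synthetic stand-ins, not patient data.

```python
from scipy import stats

# Synthetic lactate levels (mmol/L) for survivors vs. non-survivors —
# illustrative values only, not the study data.
survivors = [1.2, 1.8, 2.1, 1.5, 2.4, 1.9, 1.3, 2.0, 1.7, 1.6]
non_survivors = [2.8, 3.5, 4.1, 2.9, 5.2, 3.8, 3.0, 4.4, 3.3, 3.9]

# Samples of n < 30 call for a non-parametric test
u_stat, p_mw = stats.mannwhitneyu(survivors, non_survivors,
                                  alternative="two-sided")

# Fisher's exact test for a categorical comparison, e.g. renal
# replacement therapy (yes/no) by survival status (hypothetical counts)
table = [[5, 5],     # RRT yes: non-survivors, survivors
         [5, 15]]    # RRT no:  non-survivors, survivors
odds_ratio, p_fisher = stats.fisher_exact(table)

print(f"Mann-Whitney p = {p_mw:.5f}, Fisher OR = {odds_ratio}, p = {p_fisher:.4f}")
```

With small, non-overlapping samples like these the exact Mann-Whitney p-value is very small, while the 2x2 table yields the sample odds ratio directly.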
The aim of this study was to examine the extent to which exposure to disaster is associated with changes in health behaviors.
Federal disaster declarations were matched at the county-level to self-reported behaviors for participants in the Health and Retirement Study (HRS), 2000-2014. Multivariable logistic regression was used to evaluate the relationship between disaster and change in physical activity, body mass index (BMI), and cigarette smoking.
The sample included 20,671 individuals and 59,450 interviews; 1,451 unique disasters were declared in counties in which HRS respondents lived during the study period. Exposure to disaster was significantly associated with weight gain (unadjusted RRR=1.19; 95% CI, 1.11-1.27; adjusted RRR=1.21; 95% CI, 1.13-1.30). Vigorous physical activity was significantly lower among those who had experienced a disaster compared to those who had not (unadjusted OR=0.89; 95% CI, 0.84-0.95; adjusted OR=0.84; 95% CI, 0.79-0.89). No significant difference in cigarette smoking was found.
This study found an increase in weight gain and decrease in physical activity among older adults after disaster exposure. Adverse health behaviors such as these can contribute to functional decline among older adults.
Bell SA, Choi H, Langa KM, Iwashyna TJ. Health Risk Behaviors after Disaster Exposure Among Older Adults. Prehosp Disaster Med. 2019;34(1):95–97.
Given the rapid increase in prescription and illicit drug poisoning deaths in the 50+ age group, we examined precipitating/risk factors and toxicology results associated with poisoning deaths classified as suicides compared to intent-undetermined death (UnD) among decedents aged 50+.
Data were from the 2005–2015 US National Violent Death Reporting System (N = 15,453). χ2 tests and multinomial logistic regression models were used to compare three groups of decedents: suicide decedent who left a suicide note, suicide decedent who did not leave a note, and UnD cases.
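For a single binary predictor without covariates, the relative risk ratios reported by a multinomial logistic model reduce to ratios of category-vs-reference odds between exposure groups. A sketch with synthetic counts (hypothetical, not NVDRS data):

```python
import numpy as np

# Synthetic decedent counts by opioid-toxicology status (hypothetical):
# columns = suicide with note (reference), suicide without note, undetermined
counts = {
    "opioid_positive": np.array([200, 260, 530]),
    "opioid_negative": np.array([800, 740, 470]),
}

def rrr(exposed, unexposed, category, reference=0):
    """Unadjusted relative risk ratio: odds of landing in `category`
    rather than `reference`, compared between exposure groups."""
    odds_exp = exposed[category] / exposed[reference]
    odds_unexp = unexposed[category] / unexposed[reference]
    return odds_exp / odds_unexp

print("RRR (undetermined vs. note-leaving suicide, opioid positive):",
      rrr(counts["opioid_positive"], counts["opioid_negative"], 2))
```

An RRR above 1 means opioid-positive decedents are disproportionately classified as undetermined rather than note-leaving suicide; the study's adjusted RRRs come from full multinomial regression with covariates.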
Compared to suicide decedents without a note (37.7% of the sample), those with a note (29.4%) were more likely to have been depressed and had physical health problems and other life stressors, while UnD cases (32.9%) were less likely to have had mental health problems and other life stressors but more likely to have had substance use and health problems. UnD cases were also more likely to be opioid (RRR = 2.65, 95% CI = 2.42–2.90) and cocaine (RRR = 2.59, 95% CI = 2.09–3.21) positive but less likely to be antidepressant positive. Blacks were more than twice as likely as non-Hispanic Whites to be UnDs. Results from separate regression models in the highest UnD states (Maryland and Utah) and in states other than Maryland/Utah were similar.
Many UnDs may be more correctly classified as unintentional overdose deaths. Along with more accurate determination processes for intent/manner of death, substance use treatment and approaches to curbing opioid and other drug use problems are needed to prevent intentional and unintentional poisoning deaths.
While studies suggest that nutritional supplementation may reduce aggressive behavior in children, few have examined their effects on specific forms of aggression. This study tests the primary hypothesis that omega-3 (ω-3), both alone and in conjunction with social skills training, will have particular post-treatment efficacy for reducing childhood reactive aggression relative to baseline.
In this randomized, double-blind, stratified, placebo-controlled, factorial trial, a clinical sample of 282 children with externalizing behavior aged 7–16 years was randomized into ω-3 only, social skills only, ω-3 + social skills, and placebo control groups. Treatment duration was 6 months. The primary outcome measure was reactive aggression collected at 0, 3, 6, 9, and 12 months, with antisocial behavior as a secondary outcome.
Children in the ω-3-only group showed a short-term reduction (at 3 and 6 months) in self-report reactive aggression, and also a short-term reduction in overall antisocial behavior. Sensitivity analyses and a robustness check replicated significant interaction effects. Effect sizes (d) were small, ranging from 0.17 to 0.31.
Findings provide some initial support for the efficacy of ω-3 in reducing reactive aggression over and above standard care (medication and parent training), but yield only preliminary and limited support for the efficacy of ω-3 in reducing overall externalizing behavior in children. Future studies could test further whether ω-3 shows promise in reducing more reactive, impulsive forms of aggression.
Mentorship is perceived to be an important component of residency education. However, evidence of the impact of mentorship on professional development in Emergency Medicine (EM) is lacking.
Online survey distributed to attending physician members of the Canadian Association of Emergency Physicians (CAEP), using a modified Dillman method. Survey contained questions about mentorship during residency training, and perceptions of the impact of mentorship on career development.
The response rate was 23.5% (309/1314). 63.6% of respondents reported having at least one mentor during residency. The proportion of participants whose residency included a formal mentorship component was higher among those with mentors (44.5%) than among those without mentors (8.0%, p<0.001). The most common topics discussed with mentors were career planning and work-life balance; the least common were research and finances. While many participants consulted their mentor regarding their first job (56.5%), fewer consulted their mentor regarding subspecialty training (45.1%) and research (41.1%). 71.8% chose to work in a similar centre to their mentor, but few completed the same subspecialty (24.8%) or performed similar research (30.4%). 94.1% stated that mentorship was important to success during residency. Participants in a formal mentorship program did not rate their experience of mentorship higher than those without a formal program.
Among academic EM physicians with an interest in mentorship, mentorship during EM residency may have a greater association with location of practice than academic scholarship or subspecialty choice. Formal mentorship programs increase the likelihood of obtaining a mentor, but do not appear to improve reported mentorship experiences.
US suicide rates among older women have substantially increased over the past decade. We examined potential differences in sociodemographic and risk/precipitating factors among older female suicide decedents who died by drug overdose versus firearms, hanging/suffocation, and other means, and postmortem toxicology results by suicide means.
Data are from the 2005 to 2015 US National Violent Death Reporting System (N = 12,401 female decedents aged 50 years and over). We used three logistic regression models, with overdose versus firearms, overdose versus hanging/suffocation, and overdose versus “other” means as the dependent variables, to examine associations between suicide means and sociodemographic and risk/precipitating factors. χ2 tests were used to examine positive toxicology of prescription and illicit drugs by suicide means.
Compared to firearm users, overdose users were younger and had higher odds of having had previous suicide attempts/intent disclosures, mental disorders (e.g. depression/dysthymia: AOR = 1.18, 95% CI = 1.05–1.34), and substance abuse other than alcohol, but lower odds of having had relationship problems and any crisis. Compared to hanging/suffocation, overdose declined (AOR = 0.95, 95% CI = 0.93–0.97) during the study period and was less prevalent among Hispanic and Asian women and those with job/finance/housing problems. Toxicology reports showed that 47%, 43%, and 45% of overdose users were antidepressant, opiate, and benzodiazepine positive, respectively. Firearm users had the lowest rates of positive toxicology results for these drugs.
Suicide prevention should include limiting access to large quantities of prescription medications and firearms for those at risk of suicide. More effective mental health/substance abuse treatment and chronic illness management support are also needed.
Compared to their non-using age peers, older marijuana users are known to have lower marijuana risk perceptions. We examined associations of older marijuana users’ risk perceptions with their marijuana use patterns and substance use disorders.
Data are from 2013 to 2015 National Survey of Drug Use and Health (N = 24,057 respondents aged 50+ years). Bivariate logistic regression was used to compare risk perceptions among never users, former users, and past-year users aged 50+ years. Multivariable logistic regression was used to test associations between risk perception and marijuana use status and between risk perception and marijuana use patterns.
Among the total sample, former (AOR = 0.30, 95% CI = 0.27–0.32) and past-year (AOR = 0.05, 95% CI = 0.04–0.06) marijuana users had significantly lower odds of moderate/great risk perception (as opposed to no/slight risk perception) than never users. Among past-year users, odds of moderate/great risk perception were lower among those who used marijuana more frequently (AOR = 0.14, 95% CI = 0.07–0.28 for 300+ days of use compared to 1–11 days of use) and who reported any medical marijuana use (AOR = 0.27, 95% CI = 0.14–0.51). However, those who had marijuana use disorder were 3.5 times more likely to report moderate/great risk perception (AOR = 3.50, 95% CI = 1.62–7.58). Those who had a college education, had higher incomes, and resided in states with medical marijuana laws also had lower risk perceptions.
Public health education on scientific evidence about marijuana's benefits and harms and age-appropriate treatment for older adults with substance use problems are needed. Research on risk perception formation using longitudinal data among older adults is also needed.
It has not been well established whether dietary folate intake reduces the risk of diabetes development. We aimed to clarify the prospective association between dietary folate intake and type 2 diabetes (T2D) risk among 7333 Korean adults aged 40 years or older who were included in the Multi-Rural Communities Cohort. Dietary folate intake was estimated from all 106 food items listed on a FFQ, not including folate intake from supplements. Two different measurements of dietary folate intake were used: the baseline consumption and the average consumption from baseline until just before the end of follow-up. The association between folate intake and T2D risk was determined through a modified Poisson regression model with a robust error estimator, controlling for potential confounders. Over 29 745 person-years, 319 cases of diabetes were ascertained. In multivariable analyses, dietary folate intake was inversely associated with risk of T2D for women, but not for men. For women, the incidence rate ratio of diabetes in the third tertile compared with the first tertile was 0·57 (95 % CI 0·38–0·87, P for trend=0·0085) in the baseline consumption model and 0·64 (95 % CI 0·43–0·95, P for trend=0·0244) in the average consumption model. These inverse associations were found in both the normal fasting blood glucose and impaired fasting glucose groups among women. Among non-users of multinutrient and vitamin supplements, the significant inverse association remained. Thus, higher dietary intake of folate is prospectively associated with lower risk of diabetes for women.
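A sketch of modified Poisson regression: a log-link Poisson model fit to a binary outcome, with a sandwich (robust) variance estimator to correct the standard errors, in the spirit of the approach described above. The data are simulated, and the binary folate grouping and effect size are assumptions for illustration.

```python
import numpy as np

rng = np.random.default_rng(1)
n = 10000

# Simulated cohort: high vs. low folate intake and a binary T2D outcome
high_folate = rng.binomial(1, 0.33, n).astype(float)
risk = np.exp(-3.0 - 0.45 * high_folate)     # true IRR = exp(-0.45) ≈ 0.64
t2d = rng.binomial(1, risk).astype(float)

# IRLS (Newton) fit of the log-link Poisson model on the binary outcome
X = np.column_stack([np.ones(n), high_folate])
beta = np.zeros(2)
for _ in range(50):
    mu = np.exp(X @ beta)
    grad = X.T @ (t2d - mu)
    hess = X.T @ (X * mu[:, None])
    beta += np.linalg.solve(hess, grad)

# Sandwich (robust) variance: bread^{-1} @ meat @ bread^{-1}
mu = np.exp(X @ beta)
bread_inv = np.linalg.inv(X.T @ (X * mu[:, None]))
meat = X.T @ (X * ((t2d - mu) ** 2)[:, None])
robust_se = np.sqrt(np.diag(bread_inv @ meat @ bread_inv))

print("IRR:", np.exp(beta[1]), "robust SE (log scale):", robust_se[1])
```

Exponentiating the slope gives the incidence rate ratio directly, which is why this model is preferred over logistic regression when the ratio of risks, rather than of odds, is the target.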