Few personalised medicine investigations have been conducted for mental health. We aimed to generate and validate a risk tool that predicts adult attention-deficit/hyperactivity disorder (ADHD).
Using logistic regression models, we generated a risk tool in a representative population cohort (ALSPAC – UK, 5113 participants, followed from birth to age 17) using childhood clinical and sociodemographic data, with internal validation. Predictors included sex, socioeconomic status, single-parent family, ADHD symptoms, comorbid disruptive disorders, childhood maltreatment, depressive symptoms, mother’s depression and intelligence quotient. The outcome was defined as a categorical diagnosis of ADHD in young adulthood, without requiring age-at-onset criteria. We also tested machine learning approaches for developing the risk models: random forest, stochastic gradient boosting and artificial neural networks. The risk tool was externally validated in the E-Risk cohort (UK, 2040 participants, birth to age 18), the 1993 Pelotas Birth Cohort (Brazil, 3911 participants, birth to age 18) and the MTA clinical sample (USA, 476 children with ADHD and 241 controls followed for 16 years, from a minimum age of 8 to a maximum of 26).
The overall prevalence of adult ADHD ranged from 8.1% to 12% in the population-based samples, and was 28.6% in the clinical sample. The internal performance of the model in the generating sample was good, with an area under the curve (AUC) for predicting adult ADHD of 0.82 (95% confidence interval (CI) 0.79–0.83). Calibration plots showed good agreement between predicted and observed event frequencies from 0 to 60% probability. In the UK birth cohort test sample, the AUC was 0.75 (95% CI 0.71–0.78). In the Brazilian birth cohort test sample, the AUC was significantly lower, at 0.57 (95% CI 0.54–0.60). In the clinical trial test sample, the AUC was 0.76 (95% CI 0.73–0.80). The risk model did not predict adult anxiety or major depressive disorder. Machine learning approaches did not outperform logistic regression models. A free, open-source risk calculator was generated for clinical use and is available online at https://ufrgs.br/prodah/adhd-calculator/.
The risk tool, based on childhood characteristics, specifically predicts adult ADHD in European and North American population-based and clinical samples, with discrimination comparable to that of commonly used clinical tools in internal medicine and higher than that of most previous attempts for mental and neurological disorders. However, its use in middle-income settings requires caution.
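The two quantities at the heart of the abstract above, a logistic risk score and its discrimination (AUC), can be sketched in a few lines of plain Python. This is an illustrative sketch, not the published model: the coefficients and predictor values below are hypothetical placeholders.

```python
import math

def logistic_risk(intercept, coefs, x):
    """Predicted probability from a logistic model: 1 / (1 + e^-(b0 + b.x))."""
    z = intercept + sum(b * xi for b, xi in zip(coefs, x))
    return 1.0 / (1.0 + math.exp(-z))

def auc(scores, labels):
    """Area under the ROC curve via the Mann-Whitney statistic: the
    probability that a randomly chosen case receives a higher score than
    a randomly chosen non-case (ties count as 0.5)."""
    pos = [s for s, y in zip(scores, labels) if y == 1]
    neg = [s for s, y in zip(scores, labels) if y == 0]
    wins = sum(1.0 if p > n else 0.5 if p == n else 0.0
               for p in pos for n in neg)
    return wins / (len(pos) * len(neg))

# Hypothetical predictors: [ADHD symptom score, depressive symptom score, IQ z-score]
risk = logistic_risk(-2.0, [0.8, 0.3, -0.2], [1.5, 0.5, -1.0])
```

Under this reading, the AUC of 0.82 reported for the generating sample means a randomly selected future ADHD case would receive a higher risk score than a randomly selected non-case about 82% of the time.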
Introduction: Low acuity patients have been controversially tagged as a source of emergency department (ED) misuse. Authorities in many Canadian health regions have set up policies so that these patients preferably present to walk-in clinics (WIC). We compared the cost and quality of the care given to low acuity patients in an academic ED and a WIC of Québec City during fiscal year 2015-16. Methods: We conducted an ambidirectional (prospective and retrospective) cohort study using a time-driven activity-based costing method. This method uses the duration of care processes (e.g., triage) to allocate to patient care all direct costs (e.g., personnel, consumables), overheads (e.g., building maintenance) and physician charges. We included consecutive adult patients who were ambulatory at all times and discharged from the ED or WIC with a diagnosis of upper respiratory tract infection (URTI), urinary tract infection (UTI) or low back pain. Mean cost [95% CI] per patient per condition was compared between settings after risk adjustment for age, sex, vital signs, number of regular medications and comorbidities using generalized log-gamma regression models. Proportions [95% CI] of antibiotic prescription and chest X-ray use in URTI, compliance with provincial guidelines on the use of antibiotics in UTI, and spinal X-ray use in low back pain were compared between settings using Pearson chi-square tests. Results: A total of 409 patients were included. The ED and WIC groups were similar in terms of age, sex and vital signs on presentation, but ED patients had a greater burden of comorbidities. The adjusted mean cost (2016 CAN$) of care was significantly higher in the ED than in the WIC (p < 0.0001) for URTI (78.42 [64.85-94.82] vs. 59.43 [50.43-70.06]), UTI (78.88 [69.53-89.48] vs. 53.29 [43.68-65.03]) and low back pain (87.97 [68.30-113.32] vs. 61.71 [47.90-79.51]). For URTI, antibiotics were prescribed more frequently in the WIC (44.1% [34.3-54.3] vs. 5.8% [1.2-16.0]; p < 0.0001) and chest X-rays were used more frequently in the ED (26.9% [15.6-41.0] vs. 13.7% [7.7-22.0]; p = 0.05). No significant differences were observed in compliance with guidelines on the use of antibiotics in UTI or in the use of spinal X-rays in low back pain. Conclusion: The total cost of care for low acuity patients is lower in walk-in clinics than in EDs. However, our results suggest that quality-of-care issues should be considered in determining the best alternate setting for treating ambulatory emergency patients.
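The proportion comparisons above rely on a Pearson chi-square test on a 2×2 table, which has a simple closed form. A minimal sketch in plain Python; the counts in the example are made-up for illustration, not the study's data.

```python
def chi_square_2x2(a, b, c, d):
    """Pearson chi-square statistic (df = 1) for the 2x2 table
    [[a, b], [c, d]]: n * (ad - bc)^2 / ((a+b)(c+d)(a+c)(b+d))."""
    n = a + b + c + d
    return n * (a * d - b * c) ** 2 / ((a + b) * (c + d) * (a + c) * (b + d))

# Illustrative counts only: prescribed / not prescribed, WIC row vs. ED row
stat = chi_square_2x2(45, 57, 6, 97)
significant = stat > 3.841  # chi-square critical value for df = 1, alpha = 0.05
```

Comparing the statistic against the df = 1 critical value of 3.841 reproduces the familiar p < 0.05 decision rule for a single 2×2 comparison.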
The hypothalamic–pituitary–adrenal axis (HPAA) plays a critical role in the functioning of all other biological systems. Thus, studying how the environment may influence its ontogeny is paramount to understanding developmental origins of health and disease. The early post-conceptional (EPC) period could be particularly important for the HPAA, as the effects of exposures on an organism’s first cells can be transmitted through all cell lineages. We evaluated putative relationships between EPC maternal cortisol levels, a marker of physiologic stress, and their children’s pre-pubertal HPAA activity (n=22 dyads). Maternal first-morning urinary (FMU) cortisol, collected every other day during the first 8 weeks post-conception, was associated with children’s FMU cortisol collected daily around the start of the school year, a non-experimental challenge, as well as with salivary cortisol responses to an experimental challenge (all Ps<0.05), with some sex-related differences. We investigated whether epigenetic mechanisms statistically mediated these links and, therefore, could provide clues as to the possible biological pathways involved. EPC cortisol was associated with >5% change in children’s buccal epithelial cells’ DNA methylation at 867 sites, while children’s HPAA activity was associated with five CpG sites. Yet no CpG sites were related to both EPC cortisol and children’s HPAA activity. Thus, these epigenetic modifications did not statistically mediate the observed physiological links. Larger, prospective peri-conceptional cohort studies including frequent bio-specimen collection from mothers and children will be required to replicate our analyses and, if our results are confirmed, identify biological mechanisms mediating the statistical links observed between maternal EPC cortisol and children’s HPAA activity.
The goal of most rice improvement programs is the enhancement of farmers’ yield using less land and limited water. This study evaluated 77 upland rice genotypes under optimal upland growing conditions in the field and ranked the genotypes using base indices. Subsequently, eighteen cultivars selected from the field trial were screened under drought in rainout-shelter conditions. The traits evaluated for index selection were yield, days to flowering, plant height, number of panicles and number of filled grains. Under field conditions, based on the sum of the economic weights assigned to the five traits used to compute the selection index, IR 68704-145-1-1-B and IR 63380-16 were the best genotypes. In the rainout-shelter experiment, Ofada 2 (508 g m−2) had the highest grain yield under non-stress conditions, while ITA 117 (152.38 g m−2) had the highest grain yield under drought stress. The base index was efficient for selecting superior genotypes with the best combination of all the traits considered. Susceptibility of the landraces to drought stress led to poor grain yield.
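A base selection index of the kind described above is a weighted sum of trait values, I = Σᵢ wᵢxᵢ, with economic weights wᵢ chosen by the breeder. A minimal sketch under that assumption; the weights and trait values below are hypothetical, not those used in the study.

```python
def base_index(trait_values, economic_weights):
    """Base selection index: I = sum_i(w_i * x_i), where x_i is the
    (ideally standardized) value of trait i and w_i its economic weight."""
    return sum(economic_weights[t] * x for t, x in trait_values.items())

# Hypothetical economic weights for the five traits used in the study
weights = {"yield": 1.0, "days_to_flowering": -0.2, "plant_height": -0.1,
           "panicles": 0.3, "filled_grains": 0.5}
# Hypothetical standardized trait values for one genotype
genotype = {"yield": 1.2, "days_to_flowering": -0.5, "plant_height": 0.4,
            "panicles": 0.8, "filled_grains": 1.1}
score = base_index(genotype, weights)
```

Genotypes are then ranked by their index scores, so the choice of weights directly encodes the breeder's economic priorities among the traits.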
Objectives: Studies suggest that impairments in some of the same domains of cognition occur in different neuropsychiatric conditions, including those known to share genetic liability. Yet, direct, multi-disorder cognitive comparisons are limited, and it remains unclear whether overlapping deficits are due to comorbidity. We aimed to extend the literature by examining cognition across different neuropsychiatric conditions and addressing comorbidity. Methods: Subjects were 486 youth consecutively referred for neuropsychiatric evaluation and enrolled in the Longitudinal Study of Genetic Influences on Cognition. First, we assessed general ability, reaction time variability (RTV), and aspects of executive functions (EFs) in youth with non-comorbid forms of attention-deficit/hyperactivity disorder (ADHD), mood disorders and autism spectrum disorder (ASD), as well as in youth with psychosis. Second, we determined the impact of comorbid ADHD on cognition in youth with ASD and mood disorders. Results: For EFs (working memory, inhibition, and shifting/flexibility), we observed weaknesses in all diagnostic groups when participants’ own ability was the referent. Decrements were subtle in relation to published normative data. For RTV, weaknesses emerged in youth with ADHD and mood disorders, but trend-level results could not rule out decrements in other conditions. Comorbidity with ADHD did not impact the pattern of weaknesses for youth with ASD or mood disorders but increased the magnitude of the decrement in those with mood disorders. Conclusions: Youth with ADHD, mood disorders, ASD, and psychosis show EF weaknesses that are not due to comorbidity. Whether such cognitive difficulties reflect genetic liability shared among these conditions requires further study. (JINS, 2018, 24, 91–103)
Introduction: Physicians’ poor knowledge of health care costs has been identified as an important barrier to improving efficiency and reducing overuse in care delivery. Moreover, costs of tests and treatments estimated with traditional costing methods have been shown to be imprecise and unreliable. We estimated the cost of frequent care activities in the emergency department (ED) using the time-driven activity-based costing (TDABC) method. Methods: We conducted a TDABC study in the ED of the CHUL, Québec City (77,000 visits/year). We estimated the cost of all potential care activities (e.g. triage) provided to adult patients with selected urgent (e.g. pulmonary sepsis) and non-urgent (e.g. urinary tract infection) conditions frequently encountered in the ED. Following Lean management principles, process maps were developed by a group of ED care providers for each care activity to identify the human resources, supplies and equipment involved, and to estimate the time required to complete each process. Resource unit costs (e.g. cost per minute of a nurse) and the overhead rate were calculated using financial information from fiscal year 2015-16. The estimated cost of each care activity (e.g. chest X-ray), including physicians’ charges, was calculated by summing the overhead allocation and the cost of each process (e.g. disinfection of the X-ray machine), obtained by multiplying the resource unit cost by the time for process completion. Results: Process maps were developed for 14 conditions and 68 ED care activities. We estimated the costs of activities (CAN$) related to nursing (e.g. urinalysis and culture triage ordering $14.70), clerk tasks (e.g. patient registration $3.40), physicians (e.g. FAST scan $20.90), laboratory testing (e.g. CBC $6.30), diagnostic imaging (e.g. abdominal CT scan $146.50), therapy (e.g. 5 mg of IV morphine $20.40), and resuscitation (rapid sequence intubation with ketamine and succinylcholine $146.40).
Overall, emergency physicians’ charges, personnel salaries and overheads accounted for 38%, 22% and 16% of all ED care costs, respectively. Conclusion: Our results represent an important step toward increasing emergency physicians’ awareness of the real cost of their interventions and empowering them to adopt more cost-effective practice patterns.
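The TDABC calculation described above reduces to a simple formula: the cost of a care activity is the sum over its component processes of (resource cost per minute × minutes), plus an overhead allocation. A minimal sketch with hypothetical rates, not the CHUL's actual figures.

```python
def activity_cost(processes, overhead):
    """Time-driven activity-based cost of one care activity.
    processes: list of (cost_per_minute, minutes) pairs, one per resource
    involved in the activity (nurse, physician, equipment, ...)."""
    return overhead + sum(rate * minutes for rate, minutes in processes)

# Hypothetical chest X-ray: technician 12 min at $0.75/min,
# machine 12 min at $1.10/min, radiologist read 3 min at $3.00/min,
# plus a $4.50 overhead allocation
cost = activity_cost([(0.75, 12), (1.10, 12), (3.00, 3)], overhead=4.50)
```

The appeal of the method is that every input is directly observable: process maps supply the minutes, payroll and equipment records supply the per-minute rates.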
Introduction: Redirecting low acuity patients from emergency departments to primary care walk-in clinics has been identified as a priority by many health authorities. Promoting family physicians for the management of ambulatory patients with urgent health concerns reflects the assumption that primary care facilities can offer high-quality and more affordable ambulatory emergency care. However, no performance assessment framework has been developed for ambulatory emergency care and, consequently, the quality of care provided in these alternate settings has never been formally compared. Primary objective: To identify structure, process and outcome indicators for ambulatory emergency care. Methods: We will identify and develop quality indicators (QIs) for ambulatory emergency care using the RAND/UCLA Appropriateness Method (RAM), comprising three steps. First, we will perform a scoping literature review to inventory 1) all previously recommended QIs assessing care provided to ambulatory emergency patients in ED or primary care settings; 2) all conditions evaluated with the retrieved QIs; and 3) all outcomes measured by the same QIs. Second, a steering committee composed of the research team and international experts in performance assessment in emergency and primary care will be presented with the lists of QI-related conditions and outcomes. They will be asked to identify potential outcome indicators for ambulatory emergency care by generating any relevant combinations of one condition and one outcome (e.g. acute asthma exacerbation/re-consultation). Committee members will be given the latitude to use and pair conditions or outcomes not included in the lists, as long as they consider the resulting indicators compatible with the study objectives. Using a structured nominal group approach, they will combine their suggestions and refine the list of potential QIs.
This list of potential outcome indicators, composed of “condition/outcome” pairs, will be merged with the list of already published QIs identified during the literature review. Third, as per RAM standards, we will assemble an international multidisciplinary panel (n=20) of patients, emergency and primary care providers, researchers and decision makers, based on recommendations from international emergency and primary care associations and from the Canadian Strategy for Patient-Oriented Research (SPOR) Support Units. Through iterative rounds of ratings using both web-based survey tools and videoconferencing, panelists will independently assess all candidate QIs. They will be asked to rate, on a nine-level scale, to what extent each QI is a relevant and useful measure of ambulatory emergency care quality. From one round to the next, QIs with a median panelist rating of one to three will be excluded. Those with a median of seven or more will be automatically included in the final list. QIs with a median of four to six will be retained for further deliberation among the panelists. Rounds of ratings will be conducted until all QIs are classified. Impact: The QIs identified will be used to develop a performance assessment framework for ambulatory emergency care. This will represent an essential step toward testing the assumption that EDs and primary care walk-in clinics provide equivalent care quality to low acuity patients.
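The rating rounds described above apply a fixed RAND/UCLA-style decision rule to the median panelist score, which can be sketched directly:

```python
def classify_qi(median_rating):
    """RAND/UCLA-style classification of a candidate quality indicator by
    the median panelist rating on a 1-9 scale: 1-3 excluded, 7-9 included,
    4-6 retained for further deliberation in the next round."""
    if median_rating <= 3:
        return "excluded"
    if median_rating >= 7:
        return "included"
    return "retained"
```

Applying this rule after each round and re-rating only the "retained" items guarantees that the process terminates once every QI has been pushed into one of the two definitive categories.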
Introduction: In its prospective cohorts of independent seniors with minor injuries, the CETI (Canadian Emergency Team Initiative) has shown that minor injuries trigger a spiral of mobility and functional decline in 18% of these seniors up to 6 months post-injury. Because of their effects on multiple physiological systems, multicomponent mobility interventions with physical exercises are among the best methods to limit frailty and improve mobility and function in seniors. Methods: Pilot clinical trial among 4 groups of seniors discharged home after an ED consultation for minor injuries. Interventions: 2 × 1-hour sessions/week for 12 weeks of muscle strengthening, functional and balance exercises under kinesiology supervision, either at home (Jintronix tele-rehabilitation platform) or in community-based programs (YWCA, PIED), vs. usual ED discharge (CONTROL). Measures: functional status in ADLs (Older Americans Resources Scale); global physical and social functioning (SF-12 questionnaire); and physical activity level (RAPA questionnaire) at the initial ED visit and at 3 months. Results: 135 seniors were included (Controls: n=50; PIED: n=28; Jintronix: n=27; YWCA: n=18). Mean age was 72.6±6.2 years, 45% were prefrail, and 86% and 8% had fall-related and motor vehicle-related injuries, respectively (e.g. fractures: 30%; contusions: 37%). The intervention could start as early as 7 days post-injury. Seniors in the intervention groups (Home, YWCA or PIED) maintained or improved their functional status (84% vs. 60%, p≤0.05) and their physical (73% vs. 59%, p=0.05) and social (45% vs. 23%, p≤0.05) functioning. While 21% of CONTROLs improved their physical activity level three months post-injury, 46% of seniors in the intervention groups did (p≤0.05). Conclusion: Exercise-based interventions can help improve seniors’ function and mobility after a minor injury.
The gastrointestinal alterations associated with the consumption of an obesogenic diet, such as inflammation, permeability impairment and oxidative stress, have been poorly explored in both diet-induced obesity (DIO) and genetic obesity. The aim of the present study was to examine the impact of an obesogenic diet on the gut health status of DIO rats in comparison with the Zucker (fa/fa) rat leptin receptor-deficient model of genetic obesity over time. For this purpose, female Wistar rats (n 48) were administered a standard or a cafeteria diet (CAF diet) for 12, 14·5 or 17 weeks and were compared with fa/fa Zucker rats fed a standard diet for 10 weeks. Morphometric variables, plasma biochemical parameters, myeloperoxidase (MPO) activity and reactive oxygen species (ROS) levels in the ileum were assessed, as well as the expressions of proinflammatory genes (TNF-α and inducible nitric oxide synthase (iNOS)) and intestinal permeability genes (zonula occludens-1, claudin-1 and occludin). Both the nutritional model and the genetic obesity model showed increased body weight and metabolic alterations at the final time point. An increase in intestinal ROS production and MPO activity was observed in the gastrointestinal tracts of rats fed a CAF diet but not in the genetic obesity model. TNF-α was overexpressed in the ileum of both CAF diet and fa/fa groups, and ileal inflammation was associated with the degree of obesity and metabolic alterations. Interestingly, the 17-week CAF group and the fa/fa rats exhibited alterations in the expressions of permeability genes. Relevantly, in the hyperlipidic refined sugar diet model of obesity, the responses to chronic energy overload led to time-dependent increases in gut inflammation and oxidative stress.
Strong winds from massive stars are of interest to a wide range of astrophysical fields. In high-mass X-ray binaries, the presence of an accreting compact object on the one hand makes it possible to infer wind parameters from studies of the varying properties of the emitted X-rays; on the other hand, the accretor’s gravity and ionizing radiation can strongly influence the wind flow. Based on a collaborative effort of astronomers from both the stellar wind and the X-ray communities, this presentation attempts to review our current state of knowledge and indicate avenues for future progress.
In the present study, we used separate measures of parental monitoring and parental knowledge and compared their associations with youths’ antisocial behavior during preadolescence, between the ages of 10 and 12. Parental monitoring and knowledge were reported by mothers, fathers, and youths taking part in the Environmental Risk (E-Risk) Longitudinal Twin Study that follows 1,116 families with twins. Information on youths’ antisocial behavior was obtained from mothers as well as teachers. We report two main findings. First, longitudinal cross-lagged models revealed that greater parental monitoring did not predict less antisocial behavior later, once family characteristics were taken into account. Second, greater youth antisocial behavior predicted less parental knowledge later. This effect of youths’ behavior on parents’ knowledge was consistent across mothers’, fathers’, youths’, and teachers’ reports, and robust to controls for family confounders. The association was partially genetically mediated according to a Cholesky decomposition twin model; youths’ genetically influenced antisocial behavior led to a decrease in parents’ knowledge of youths’ activities. These two findings question the assumption that greater parental monitoring can reduce preadolescents’ antisocial behavior. They also indicate that parents’ knowledge of their children's activities is influenced by youths’ behavior.
Pathogenic invasion by Escherichia coli and Salmonellae remains a constant threat to the integrity of the intestinal epithelium and can rapidly induce inflammatory responses. At birth, colostrum consumption exerts numerous beneficial effects on the properties of intestinal epithelial cells and protects the gastrointestinal tract of newborns from pathogenic invasion. The present study aimed to investigate the effect of colostrum on the early and late inflammatory responses induced by pathogens. The short-term (2 h) and long-term (24 h) effects of exposure to heat-killed (HK) E. coli and Salmonella enterica Typhimurium on gene expression in the porcine intestinal epithelial cell (IPEC-J2) model were first evaluated by microarray and quantitative PCR analyses. Luciferase assays were performed using a NF-κB-luc reporter construct to investigate the effect of colostrum whey treatment on the activation of NF-κB induced by HK bacteria. Luciferase assays were also performed using NF-κB-luc, IL-8-luc and IL-6-luc reporter constructs in human colon adenocarcinoma Caco-2/15 cells exposed to dose–response stimulations with HK bacteria and colostrum whey. Bovine colostrum whey treatment decreased the expression of early and late inflammatory genes induced by HK bacteria in IPEC-J2, as well as the transcriptional activation of NF-κB-luc induced by HK bacteria. Unlike that with colostrum whey, treatment with other milk fractions failed to decrease the activation of NF-κB-luc induced by HK bacteria. Lastly, the reduction of the HK bacteria-induced activation of NF-κB-luc, IL-8-luc and IL-6-luc by colostrum whey was dose dependent. The results of the present study indicate that bovine colostrum may protect and preserve the integrity of the intestinal mucosal barrier in the host by controlling the expression levels of early and late inflammatory genes following invasion by enteric pathogens.
Bovine colostrum is well known for its beneficial properties on health and development. It contains a wide variety of bioactive ingredients that are known to promote a number of cellular processes. Therefore, the use of colostrum whey as a feed additive to promote intestinal health has been proposed, yet little is known about the mechanisms implicated in its beneficial effects on intestinal epithelial cells. In the present paper, caseins were removed from bovine colostrum and the remaining liquid, rich in bioactive compounds, was evaluated for its capacity to modulate cellular processes in the porcine intestinal epithelial cell line IPEC-J2 and the human colon adenocarcinoma cell line Caco-2/15. First, we verified the effects of colostrum whey and cheese whey on processes involved in intestinal wound healing, including cell proliferation, attachment, morphology and migration. Our results showed that colostrum whey promoted proliferation and migration, and specifically decreased the attachment of Caco-2/15 cells to the culture dish. On the other hand, cheese whey induced proliferation and morphological changes in IPEC-J2 cells, but failed to induce migration. The gene expression profile of IPEC-J2 cells following colostrum whey treatment was evaluated by microarray analysis. The results revealed that the expression of a significant number of genes involved in cell migration, adhesion and proliferation was indeed affected in colostrum whey-treated cells. In conclusion, the specific bioactive content of colostrum could be beneficial for intestinal epithelial cell homoeostasis by controlling biological processes implicated in wound healing through a precise gene expression programme.
We examine prospectively the influence of two separate but potentially inter-related factors in the etiology of post-traumatic stress disorder (PTSD): childhood maltreatment as conferring a susceptibility to the PTSD response to adult trauma and juvenile disorders as precursors of adult PTSD.
The Dunedin Multidisciplinary Health and Development Study (DMHDS) is a birth cohort (n = 1037) from the general population of New Zealand's South Island, with multiple assessments up to age 38 years. DSM-IV PTSD was assessed among participants exposed to trauma at ages 26–38. Complete data were available on 928 participants.
Severe maltreatment in the first decade of life, experienced by 8.5% of the sample, was associated significantly with the risk of PTSD among those exposed to adult trauma [odds ratio (OR) 2.64, 95% confidence interval (CI) 1.16–6.01], compared to no maltreatment. Moderate maltreatment, experienced by 27.2%, was not associated significantly with that risk (OR 1.55, 95% CI 0.85–2.85). However, the two estimates did not differ significantly from one another. Juvenile disorders (ages 11–15), experienced by 35% of the sample, independent of childhood maltreatment, were associated significantly with the risk of PTSD response to adult trauma (OR 2.35, 95% CI 1.32–4.18).
Severe maltreatment is associated with risk of PTSD response to adult trauma, compared to no maltreatment, and juvenile disorders, independent of earlier maltreatment, are associated with that risk. The role of moderate maltreatment remains unresolved. Larger longitudinal studies are needed to assess the impact of moderate maltreatment, experienced by the majority of adult trauma victims with a history of maltreatment.
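The odds ratios reported above come from 2×2 tables: OR = ad/bc, with a Wald confidence interval computed on the log scale. A minimal sketch in plain Python; the example counts are illustrative only, not the cohort's data.

```python
import math

def odds_ratio_ci(a, b, c, d, z=1.96):
    """Odds ratio for the 2x2 table [[a, b], [c, d]] (exposed cases,
    exposed non-cases, unexposed cases, unexposed non-cases) with a Wald
    95% CI: the SE of log(OR) is sqrt(1/a + 1/b + 1/c + 1/d)."""
    or_ = (a * d) / (b * c)
    se = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)
    lo = math.exp(math.log(or_) - z * se)
    hi = math.exp(math.log(or_) + z * se)
    return or_, lo, hi

# Illustrative counts only: PTSD / no PTSD among severely maltreated vs. others
or_, lo, hi = odds_ratio_ci(12, 67, 55, 794)
```

A CI that excludes 1 (as for severe maltreatment, OR 2.64, 95% CI 1.16-6.01) indicates a significant association; one that straddles 1 (as for moderate maltreatment, OR 1.55, 95% CI 0.85-2.85) does not.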
Verb–particle constructions are a notoriously difficult aspect of English to acquire for second-language (L2) learners. The present study investigated whether L2 English speakers are sensitive to gradations in semantic transparency of verb–particle constructions (e.g., finish up vs. chew out). French–English bilingual participants (first language: French, second language: English) completed an off-line similarity ratings survey, as well as an on-line masked priming task. Results of the survey showed that bilinguals’ similarity ratings became more native-like as their English proficiency levels increased. Results from the masked priming task showed that response latencies from high, but not low-proficiency bilinguals were similar to those of monolinguals, with mid- and high-similarity verb–particle/verb pairs (e.g., finish up/finish) producing greater priming than low-similarity pairs (e.g., chew out/chew). Taken together, the results suggest that L2 English speakers develop both explicit and implicit understanding of the semantic properties of verb–particle constructions, which approximates the sensitivity of native speakers as English proficiency increases.
Persons developing schizophrenia (SCZ) manifest various pre-morbid neuropsychological deficits, studied most often by measures of IQ. Far less is known about pre-morbid neuropsychological functioning in individuals who later develop bipolar psychoses (BP). We evaluated the specificity and impact of family history (FH) of psychosis on pre-morbid neuropsychological functioning.
We conducted a nested case-control study investigating the associations of neuropsychological data collected systematically at age 7 years for 99 adults with psychotic diagnoses (including 45 SCZ and 35 BP) and 101 controls, drawn from the New England cohort of the Collaborative Perinatal Project (CPP). A mixed-model approach evaluated full-scale IQ, four neuropsychological factors derived from principal components analysis (PCA), and the profile of 10 intelligence and achievement tests, controlling for maternal education, race and intra-familial correlation. We used a deviant responder approach (<10th percentile) to calculate rates of impairment.
There was a significant linear trend, with the SCZ group performing worst. The profile of childhood deficits for persons with SCZ did not differ significantly from BP. Neuropsychological impairment was identified in 42.2% of SCZ, 22.9% of BP and 7% of controls. The presence of psychosis in first-degree relatives (FH+) significantly increased the severity of childhood impairment for SCZ but not for BP.
Pre-morbid neuropsychological deficits are found in a substantial proportion of children who later develop SCZ, especially in the SCZ FH+ subgroup, but less so in BP, suggesting especially impaired neurodevelopment underlying cognition in pre-SCZ children. Future work should assess genetic and environmental factors that explain this FH effect.
Random bead microarrays, in which each bead is functionalized with a different biomolecule, are of great interest. The process of identifying the location of each bead is referred to as decoding. Decoding has been a challenging step in microarray technologies because its complexity increases with array size. Here we report a novel, fast and reliable method of decoding randomly assembled arrays based on polymer beads with unique spectroscopic signatures. Beads were synthesized by dispersion polymerization of a family of styrene monomers and methacrylic acid to generate a spectroscopically encoded bead library. In addition to identifying the self-encoded beads through their unique spectra in a tetraplex experiment, Raman spectroscopy was used to monitor antibody–antigen binding events on the barcoded beads. The simplicity, versatility and rapid analysis enabled by this self-encoded bead array platform demonstrate its potential in high-throughput biomolecular multiplex screening.
Background. Suicide is a common cause of death in anorexia nervosa and suicide attempts occur often in both anorexia nervosa and bulimia nervosa. No studies have examined predictors of suicide attempts in a longitudinal study of eating disorders with frequent follow-up intervals. The objective of this study was to determine predictors of serious suicide attempts in women with eating disorders.
Method. In a prospective longitudinal study, women diagnosed with either DSM-IV anorexia nervosa (n=136) or bulimia nervosa (n=110) were interviewed and assessed for suicide attempts and suicidal intent every 6–12 months over 8·6 years.
Results. Fifteen percent of subjects reported at least one prospective suicide attempt over the course of the study. Significantly more anorexic (22·1%) than bulimic subjects (10·9%) made a suicide attempt. Multivariate analyses indicated that the unique predictors of suicide attempts for anorexia nervosa included the severity of both depressive symptoms and drug use over the course of the study. For bulimia nervosa, a history of drug use disorder at intake and the use of laxatives during the study significantly predicted suicide attempts.
Conclusions. Women with anorexia nervosa or bulimia nervosa are at considerable risk of attempting suicide. Clinicians should be aware of this risk, particularly in anorexic patients with substantial co-morbidity.
Some data suggest that the colonic microflora may adapt to produce more butyrate if given time and the proper substrate. To test this hypothesis, we investigated the effect of prolonged feeding of resistant potato starch on butyrate production. Rats were fed on either a low-fibre diet (basal) or the same diet supplemented with 90 g resistant potato starch/kg (PoS) for 0·5, 2 and 6 months. Short-chain fatty acid (SCFA) concentrations were determined in caecal and colonic contents at the end of each ingestion period. Total SCFA concentration increased over time throughout the caecocolonic tract with PoS, but was not modified with the basal diet. While propionate concentration was unchanged, butyrate concentration was markedly increased by PoS at each time period in both the caecum and colon. Moreover, the butyrogenic effect of PoS increased over time, and the amount of butyrate was increased 6-fold in the caecum and proximal colon and 3-fold in the distal colon after 6 months compared with 0·5 months. Accordingly, the butyrate:total SCFA ratio increased over time throughout the caecocolonic tract (12·6 (SE 2·8) v. 28 (SE 1·8) % in the caecum, 10·5 (SE 1·4) v. 26·8 (SE 0·9) % in the proximal colon, and 7·3 (SE 2·4) v. 23·9 (SE 2·7) % in the distal colon at 0·5 v. 6 months respectively), while the proportion of acetate decreased. Neither the proportion nor the concentration of butyrate was modified over time with the basal diet. Butyrate production was thus promoted by long-term ingestion of PoS, from the caecum towards the distal colon, which suggests that a slow adaptive process occurs within the digestive tract in response to a chronic load of indigestible carbohydrates.