OBJECTIVES/GOALS: Variants in voltage-gated sodium channels (VGSC) are a common cause of severe early onset epilepsy. Changes in CSF neurotransmitters (NT) were identified in 2 cases of VGSC-related epilepsy. Here we investigate NT changes in patients and a novel mouse model of VGSC-related epilepsy. METHODS/STUDY POPULATION: We conducted a single-site, IRB-approved retrospective chart review of patients with VGSC-related epilepsy who underwent CSF NT testing for diagnostic purposes. In parallel, we examined NT levels from the brains of wild-type (WT) and a novel VGSC-related epilepsy mouse model after obtaining IACUC approval. We rapidly isolated forebrain, cortex, striatum, and brainstem from 5–6 animals per sex and genotype. A combination of HPLC with electrochemical detection and mass spectrometry was used to quantify NT levels from brain samples. RESULTS/ANTICIPATED RESULTS: We identified 10 patients with VGSC-related epilepsy who received CSF NT testing. Two of these patients had abnormal NT results including changes to dopamine (DA) or serotonin (5-HT) metabolites. We analyzed NT levels from four brain regions from male and female WT and VGSC-related epilepsy mice. We anticipate that most of the NT levels will be similar to WT; however, subtle changes in the DA or 5-HT metabolites may be seen in VGSC-related epilepsy. DISCUSSION/SIGNIFICANCE OF IMPACT: Patients with VGSC-related epilepsy often have autism spectrum disorder, sleep, and movement disorders. Understanding the role of aberrant NT levels in VGSC-related epilepsy may provide additional therapeutic targets that address common neuropsychological comorbidities as well as seizures.
Recent survey in the Gulf of Carpentaria region of northern Australia has identified a unique assemblage of miniature and small-scale stencilled motifs depicting anthropomorphs, material culture, macropod tracks and linear designs. The unusual sizes and shapes of these motifs raise questions about the types of material used for the stencil templates. Drawing on ethnographic data and experimental archaeology, the authors argue that the motifs were created with a previously undocumented stencilling technique using miniature models sculpted from beeswax. The results suggest that beeswax and other malleable and adhesive resins may have played a more significant role in creating stencilled motifs than previously thought.
Facial expression is an independent and objective marker of affect. Basic emotions (fear, sadness, joy, anger, disgust and surprise) have been shown to be universal across human cultures. Techniques such as the Facial Action Coding System can capture emotion with good reliability. Such techniques visually process the changes in different assemblies of facial muscles that produce the facial expression of affect.
Recent groundbreaking advances in computing and facial expression analysis software now allow real-time and objective measurement of emotional states. In particular, a recently developed software package and equipment, the Imotion Attention Tool™, allows capturing information on discrete emotional states based on facial expressions while a subject is participating in a behavioural task.
Extending preliminary work through further experimentation and analysis, the present findings suggest a link between facial affect data and already established peripheral arousal measures such as event-related potentials (ERP), heart rate variability (HRV) and galvanic skin response (GSR), using disruptively innovative, non-invasive and clinically applicable technology in patients reporting suicidal ideation and intent compared with controls. Our results hold promise for the establishment of a computerized diagnostic battery that can be utilized by clinicians to improve the evaluation of suicide risk.
Disclosure of interest
The authors have not supplied their declaration of competing interest.
The home environment is acknowledged as an important setting that can shape dietary habits in early childhood. For instance, parents influence their children's dietary intakes through the foods they make available to their children, their own eating habits and their parenting practices. The aim of this cross-sectional study was to determine the associations between home environmental characteristics and children's fruit, vegetable and confectionary/sugary drink intakes. A total of 332 children aged 3–5 years old and their parents/guardians participated in the study. Home environmental characteristics, including mealtimes, child television viewing, parental control feeding practices, food availability and accessibility, were explored using questions from validated questionnaires. Parent and child food consumption was also measured. The data were analysed using bivariate and multivariate binary logistic regression. Independent variables (home environment and parental diet) were included in the multivariate analysis if they were significant in the bivariate analysis. An association between household income and children's fruit intake was observed, with children from lower income households being 54% less likely to eat fruit daily (95% CI 0.22–0.96, p < 0.040) compared with those from higher income households. Home food availability also influenced children's fruit intake. Greater variety of fruits available in the home increased the likelihood of fruit consumption in children (OR 1.35, 95% CI 1.09–1.68, p < 0.005). Watching television for ≥ 1 hour per day had a negative impact on children's diets, decreasing their probability of eating vegetables on a daily basis (OR 0.38, 95% CI 0.22–0.72, p < 0.003) and increasing by 2.7 times their likelihood of consuming confectionary/sugary drink more than once a week (95% CI 1.11–6.36, p < 0.027).
Children whose parents had lower vegetable consumption were 59% less likely to eat vegetables (OR 0.41, 95% CI 0.21–0.82, p < 0.012), and children whose parents had higher confectionary intake were 4.42 times more likely to consume confectionary/sugary drinks (OR 4.42, 95% CI 1.69–11.59, p < 0.002). Pressure to eat from parents was associated with lower fruit intake only (OR 0.67, 95% CI 0.47–0.96, p < 0.032). This study has demonstrated that modifiable home environmental characteristics were significantly associated with decreased fruit and vegetable intake and increased confectionary/sugary drink consumption among children aged 3–5 years. These findings may help in the development of intervention strategies to encourage a healthier diet for this age group.
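The "54% less likely" style of figure reported above is an odds ratio from binary logistic regression; for a single unadjusted association, the same quantity can be derived from a 2×2 table. A minimal sketch with invented counts (not the study's data), using the standard Wald interval on the log-odds scale:

```python
import math

def odds_ratio_ci(a, b, c, d, z=1.96):
    """a/b: exposed with/without outcome; c/d: unexposed with/without.
    Returns the odds ratio with a Wald 95% confidence interval."""
    or_ = (a * d) / (b * c)
    se = math.sqrt(1/a + 1/b + 1/c + 1/d)   # standard error of ln(OR)
    lo, hi = (math.exp(math.log(or_) + s * z * se) for s in (-1, 1))
    return or_, lo, hi

# Illustrative counts only: lower-income households, 40 children eating
# fruit daily vs 60 not; higher-income, 120 daily vs 80 not.
or_, lo, hi = odds_ratio_ci(40, 60, 120, 80)
# OR < 1 means lower-income children are less likely to eat fruit daily
```

An adjusted analysis, as in the study, would instead fit all significant covariates in one logistic model and exponentiate each coefficient.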
Frozen raw breaded chicken products (FRBCP) have been identified as a risk factor for Salmonella infection in Canada. In 2017, Canada implemented whole genome sequencing (WGS) for clinical and non-clinical Salmonella isolates, which increased understanding of the relatedness of Salmonella isolates, resulting in an increased number of Salmonella outbreak investigations. A total of 18 outbreaks and 584 laboratory-confirmed cases have been associated with FRBCP or chicken since 2017. The introduction of WGS provided the evidence needed to support a new requirement to control the risk of Salmonella in FRBCP produced for retail sale.
Smallholder livestock systems in Central America are typically based on pastures with traditional grasses and associated management practices, such as pasture burning and extensive grazing. With the rise of the global population and a corresponding increase in demand for meat and milk production, research efforts have focused on the development of improved grasses and the incorporation of legume species that can increase productivity and sustainability of Central American livestock systems. However, farmer adoption remains very limited, in part due to the lack of site-specific evaluation and recommendations by local institutions. Using a multi-site participatory approach, this study examined the potential of five improved grasses and five species of forage legumes as alternatives to the broadly disseminated grass Hyparrhenia rufa (cv. Jaragua) in pasture-based cattle systems in western Honduras and northern El Salvador. Improved grasses (four Brachiaria sp. and Megathyrsus maximus) produced significantly more biomass than H. rufa; also four of the five legume varieties evaluated (Canavalia ensiformis, Canavalia brasiliensis, Vigna unguiculata, and Vigna radiata) demonstrated high adaptability to diverse environmental conditions across sites. Farmer participatory evaluation offers a valuable means to assess performance of forages and will likely contribute to their improved utilization. Future research is needed on more refined management recommendations, pasture system design, costs and environmental benefits associated with the adoption of these forages in local livestock production systems.
Protected areas are central to global efforts to prevent species extinctions, with many countries investing heavily in their establishment. Yet the designation of protected areas alone can only abate certain threats to biodiversity. Targeted management within protected areas is often required to achieve fully effective conservation within their boundaries. It remains unclear what combination of protected area designation and management is needed to remove the suite of processes that imperil species. Here, using Australia as a case study, we use a dataset on the pressures facing threatened species to determine the role of protected areas and management in conserving imperilled species. We found that protected areas that are not resourced for threat management could remove one or more threats to 1,185 (76%) species and all threats to very few (n = 51, 3%) species. In contrast, a protected area network that is adequately resourced to manage threatening processes within their boundary could remove one or more threats to almost all species (n = 1,551; c. 100%) and all threats to almost half (n = 740, 48%). However, 815 (52%) species face one or more threats that require coordinated conservation actions that protected areas alone could not remove. This research shows that investing in the continued expansion of Australia's protected area network without providing adequate funding for threat management within and beyond the existing protected area network will benefit few threatened species. These findings highlight that as the international community expands the global protected area network in accordance with the 2020 Strategic Plan for Biodiversity, a greater emphasis on the effectiveness of threat management is needed.
In Ireland, National Clinical Programmes are being established to improve and standardise patient care throughout the Health Service Executive. In line with internationally recognised guidelines on the treatment of first episode psychosis the Early Intervention in Psychosis (EIP) programme is being drafted with a view to implementation by mental health services across the country. We undertook a review of patients presenting with a first episode of psychosis to the Dublin Southwest Mental Health Service before the implementation of the EIP. This baseline information will be used to measure the efficacy of our EIP programme.
Patients who presented with a first episode psychosis were retrospectively identified through case note reviews and consultation with treating teams. We gathered demographic and clinical information from patients as well as data on treatment provision over a 2-year period from the time of first presentation. Data included age at first presentation, duration of untreated psychosis, diagnosis, referral source, antipsychotic prescribing rates and dosing, rates of provision of psychological interventions and standards of physical healthcare monitoring. Outcome measures with regards to rates of admission over a 2-year period following initial presentation were also recorded.
In total, 66 cases were identified. The majority were male, single, unemployed and living with their family or spouse. The mean age at first presentation was 31 years with a mean duration of untreated psychosis of 17 months. Just under one-third were diagnosed with schizophrenia. Approximately half of the patients had no contact with a health service before presentation. The majority of patients presented through the emergency department. Two-thirds of all patients had a hospital admission within 2 years of presentation and almost one quarter of patients had an involuntary admission. The majority of patients were prescribed antipsychotic doses within recommended British National Formulary guidelines. Most patients received individual support through their keyworker and family intervention was provided in the majority of cases. Only a small number received formal Cognitive-Behavioural Therapy. Physical healthcare monitoring was insufficiently recorded in the majority of patients.
There is a shortage of information on the profile and treatment of patients presenting with a first episode of psychosis in Ireland. This baseline information is important in evaluating the efficacy of any new programme for this patient group. Many aspects of good practice were identified within the service in particular with regards to the appropriate prescribing of antipsychotic medication and the rates of family intervention. Deficiencies remain however in the monitoring of physical health and the provision of formal psychological interventions to patients. With the implementation of an EIP programme it is hoped that service provision would improve nationwide and to internationally recognised standards.
Angus and Hereford beef is marketed internationally for apparent superior meat quality attributes; DNA-based breed authenticity could be a useful instrument to ensure consumer confidence on premium meat products. The objective of this study was to develop an ultra-low-density genotype panel to accurately quantify the Angus and Hereford breed proportion in biological samples. Medium-density genotypes (13 306 single nucleotide polymorphisms (SNPs)) were available on 54 703 commercial and 4042 purebred animals. The breed proportion of the commercial animals was generated from the medium-density genotypes and this estimate was regarded as the gold-standard breed composition. Ten genotype panels (100 to 1000 SNPs) were developed from the medium-density genotypes; five methods were used to identify the most informative SNPs and these included the Delta statistic, the fixation (Fst) statistic and an index of both. Breed assignment analyses were undertaken for each breed, panel density and SNP selection method separately with a programme to infer population structure using the entire 13 306 SNP panel (representing the gold-standard measure). Breed assignment was undertaken for all commercial animals (n=54 703), animals deemed to contain some proportion of Angus based on pedigree (n=5740) and animals deemed to contain some proportion of Hereford based on pedigree (n=5187). The predicted breed proportion of all animals from the lower density panels was then compared with the gold-standard breed prediction. Panel density, SNP selection method and breed all had a significant effect on the correlation of predicted and actual breed proportion. Regardless of breed, the Index method of SNP selection numerically (but not significantly) outperformed all other selection methods in accuracy (i.e. correlation and root mean square of prediction) when panel density was ⩾300 SNPs. The correlation between actual and predicted breed proportion increased as panel density increased. 
Using 300 SNPs (selected using the global index method), the correlation between predicted and actual breed proportion was 0.993 and 0.995 in the Angus and Hereford validation populations, respectively. When SNP panels optimised for breed prediction in one population were used to predict the breed proportion of a separate population, the correlation between predicted and actual breed proportion was 0.034 and 0.044 weaker in the Hereford and Angus populations, respectively (using the 300 SNP panel). It is necessary to include at least 300 to 400 SNPs (per breed) on genotype panels to accurately predict breed proportion from biological samples.
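The Delta statistic mentioned above ranks SNPs by the absolute allele-frequency difference between the target breed and the remaining population; the most divergent SNPs carry the most breed information. A hedged sketch with invented allele frequencies (the study's actual selection also used Fst and an index of both):

```python
def delta_rank(freq_breed, freq_other):
    """Return SNP indices sorted by decreasing |p_breed - p_other|."""
    deltas = [abs(p - q) for p, q in zip(freq_breed, freq_other)]
    return sorted(range(len(deltas)), key=lambda i: -deltas[i])

# Allele frequencies at five illustrative SNPs
angus = [0.95, 0.50, 0.10, 0.80, 0.40]
other = [0.20, 0.45, 0.60, 0.70, 0.40]
panel = delta_rank(angus, other)[:3]   # keep the 3 most informative SNPs
```

In practice the top few hundred SNPs per breed would form the low-density panel, consistent with the 300–400 SNP recommendation above.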
Information on the genetic diversity and population structure of cattle breeds is useful when deciding on optimal breeding strategies, for example crossbreeding to improve phenotypic performance by exploiting heterosis. The present study investigated the genetic diversity and population structure of the most prominent dairy and beef breeds used in Ireland. Illumina high-density genotypes (777 962 single nucleotide polymorphisms; SNPs) were available on 4623 purebred bulls from nine breeds; Angus (n=430), Belgian Blue (n=298), Charolais (n=893), Hereford (n=327), Holstein-Friesian (n=1261), Jersey (n=75), Limousin (n=943), Montbéliarde (n=33) and Simmental (n=363). Principal component analysis revealed that Angus, Hereford, and Jersey formed non-overlapping clusters, representing distinct populations. In contrast, overlapping clusters suggested geographical proximity of origin and genetic similarity between Limousin, Simmental and Montbéliarde and to a lesser extent between Holstein-Friesian and Belgian Blue. The observed SNP heterozygosity averaged across all loci was 0.379. The Belgian Blue had the greatest mean observed heterozygosity (HO=0.389) among individuals within breed while the Holstein-Friesian and Jersey populations had the lowest mean heterozygosity (HO=0.370 and 0.376, respectively). The correlation between the genomic-based and pedigree-based inbreeding coefficients was weak (r=0.171; P<0.001). Mean genomic inbreeding estimates were greatest for Jersey (0.173) and least for Hereford (0.051). The pair-wise breed fixation index (Fst) ranged from 0.049 (Limousin and Charolais) to 0.165 (Hereford and Jersey). In conclusion, substantial genetic variation exists among breeds commercially used in Ireland. Thus custom-mating strategies would be successful in maximising the exploitation of heterosis in crossbreeding strategies.
Traditionally only a small proportion of the workforce was engaged in shift work. Changing economic pressures have resulted in increased engagement in shift work, with approximately 17% of the workforce in Europe engaged in this work pattern. The present narrative review aimed to summarise the data on the effects of shift work on the diet, lifestyle and health of employees, while addressing the barriers to, and opportunities for, improving health among shift workers. Shift work can result in low-quality diet and irregular eating patterns. Adverse health behaviours are also reported, particularly increased smoking and poor sleep patterns. These altered lifestyle habits, in conjunction with disruption to circadian rhythms, can create an unfavourable metabolic phenotype which facilitates the development and progression of chronic disease. Although the data are inconclusive due to issues such as poor study design and inadequate control for confounding factors, shift workers appear to be at increased mental and physical health risk, particularly with regard to non-communicable diseases. Information is lacking on the obstacles to leading a healthier lifestyle while working shifts, and where opportunities lie for intervention and health promotion among this group. In order to provide an informed evidence base to assist shift workers in overcoming associated occupational hazards, this gap must be addressed. This review highlights the unique nutritional issues faced by shift workers, and the subsequent effect on health. In societies already burdened with increased incidence of non-communicable chronic diseases, there is a clear need for education and behaviour change interventions among this group.
To examine women’s experience of professional support for breast-feeding and health-care professionals’ experience of providing support.
We conducted semi-structured qualitative interviews among women with experience of breast-feeding and health-care professionals with infant feeding roles. Interviews with women were designed to explore their experience of support for breast-feeding antenatally, in hospital and postnatally. Interviews with health-care professionals were designed to explore their views on their role and experience in providing breast-feeding support. Interview transcripts were analysed using content analysis and aspects of Grounded Theory. Overarching themes and categories within the two sets were identified.
Urban and suburban areas of North Dublin, Ireland.
Twenty-two women all of whom had experience of breast-feeding and fifty-eight health-care professionals.
Two overarching themes emerged and in each of these a number of categories were developed: theme 1, facilitators to breast-feeding support, within which being facilitated to breast-feed, having the right person at the right time, being discerning and breast-feeding support groups were discussed; and theme 2, barriers to breast-feeding support, within which time, conflicting information, medicalisation of breast-feeding and the role of health-care professionals in providing support for breast-feeding were discussed.
Breast-feeding is being placed within a medical model of care in Ireland which is dependent on health-care professionals. There is a need for training around breast-feeding for all health-care professionals; however, they are limited in their support due to external barriers such as lack of time. Alternative support such as peer support workers should be provided.
Blood culture contamination (BCC) has been associated with unnecessary antibiotic use, additional laboratory tests and increased length of hospital stay thus incurring significant extra hospital costs. We set out to assess the impact of a staff educational intervention programme on decreasing intensive care unit (ICU) BCC rates to <3% (American Society for Microbiology standard). BCC rates during the pre-intervention period (January 2006–May 2011) were compared with the intervention period (June 2011–December 2012) using run chart and regression analysis. Monthly ICU BCC rates during the intervention period were reduced to a mean of 3·7%, compared to 9·5% during the baseline period (P < 0·001) with an estimated potential annual cost savings of about £250 100. The approach used was simple in design, flexible in delivery and efficient in outcomes, and may encourage its translation into clinical practice in different healthcare settings.
Heart rate variability (HRV) is known to be reduced in depression; however, it is unclear whether this is a consequence of the disorder or due to antidepressant medication.
We analysed data on 4750 participants from the first wave of The Irish Longitudinal Study on Ageing (TILDA). Time-domain [standard deviation of normal-to-normal intervals (SDNN)] and frequency-domain [low-frequency (LF) and high-frequency (HF) power] measures of HRV were derived from 3-lead surface electrocardiogram records obtained during 10 min of supine rest. Depression was assessed using the Center for Epidemiologic Studies – Depression scale.
Participants on antidepressants [with (n = 80) or without depression (n = 185)] differed significantly from controls (not depressed and not taking antidepressants, n = 4107) on all measures of HRV. Depressed participants not taking antidepressants (n = 317) did not differ from controls on any measure of HRV. In linear regression analysis adjusted for relevant factors, all antidepressants were associated with lower measures of HRV. Participants on selective serotonin reuptake inhibitors (SSRIs) had higher measures of HRV relative to participants on tricyclic antidepressants or serotonin–norepinephrine reuptake inhibitors.
Our results suggest that reductions in HRV observed among depressed older adults are driven by the effects of antidepressant medications. SSRIs have less impact on HRV than other antidepressants but they are still associated with lower measures of HRV. Study limitations include the use of a self-report measure of depression and floor effects of age on HRV could have limited our ability to detect an association between HRV and depression.
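For readers unfamiliar with the time-domain measure used above, SDNN is simply the sample standard deviation of the normal-to-normal (NN) inter-beat intervals, reported in milliseconds. A minimal sketch with invented intervals:

```python
import statistics

def sdnn(nn_intervals_ms):
    """SDNN: sample standard deviation of NN intervals, in ms."""
    return statistics.stdev(nn_intervals_ms)

# Ten NN intervals (ms) from a hypothetical supine recording
nn = [812, 790, 805, 830, 795, 788, 820, 810, 802, 798]
sdnn_ms = sdnn(nn)
```

The frequency-domain LF and HF measures would instead require spectral estimation (e.g. Welch's method) of the same interval series.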
Our aim was to evaluate interrater reliability for the diagnosis of pediatric delirium by child psychiatrists.
Critically ill patients (N = 17), 0–21 years old, including 7 infants, 5 children with developmental delay, and 7 intubated children, were assessed for delirium using the Diagnostic and Statistical Manual–IV (DSM–IV) (comparable to DSM–V) criteria. Delirium assessments were completed by two psychiatrists, each blinded to the other's diagnosis, and interrater reliability was measured using Cohen's κ coefficient along with its 95% confidence interval.
Interrater reliability for the psychiatric assessment was high (Cohen's κ = 0.94, CI [0.83, 1.00]). Delirium diagnosis showed excellent interrater reliability regardless of age, developmental delay, or intubation status (Cohen's κ range 0.81–1.00).
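Cohen's κ corrects the observed agreement between two raters for the agreement expected by chance alone. A minimal sketch with hypothetical binary diagnoses (not the study's data):

```python
def cohens_kappa(rater1, rater2):
    """Cohen's kappa for two lists of categorical labels."""
    n = len(rater1)
    p_obs = sum(a == b for a, b in zip(rater1, rater2)) / n
    labels = set(rater1) | set(rater2)
    # chance agreement from each rater's marginal label frequencies
    p_exp = sum((rater1.count(l) / n) * (rater2.count(l) / n) for l in labels)
    return (p_obs - p_exp) / (1 - p_exp)

r1 = [1, 1, 0, 0, 1, 0, 0, 1, 0, 0]   # psychiatrist A: 1 = delirium
r2 = [1, 1, 0, 0, 1, 0, 0, 0, 0, 0]   # psychiatrist B
kappa = cohens_kappa(r1, r2)
```

Values of κ above 0.8, as reported in the study, are conventionally interpreted as almost perfect agreement.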
Significance of results:
In our study cohort, the psychiatric interview and exam, long considered the “gold standard” in the diagnosis of delirium, was highly reliable, even in extremely young, critically ill, and developmentally delayed children. A developmental approach to diagnosing delirium in this challenging population is recommended.
The objective of this study was to quantify the accuracy of imputing the genotype of parents using information on the genotype of their progeny and a family-based and population-based imputation algorithm. Two separate data sets were used, one containing both dairy and beef animals (n=3122) with high-density genotypes (735 151 single nucleotide polymorphisms (SNPs)) and the other containing just dairy animals (n=5489) with medium-density genotypes (51 602 SNPs). Imputation accuracy of three different genotype density panels were evaluated representing low (i.e. 6501 SNPs), medium and high density. The full genotypes of sires with genotyped half-sib progeny were masked and subsequently imputed. Genotyped half-sib progeny group sizes were altered from 4 up to 12 and the impact on imputation accuracy was quantified. Up to 157 and 258 sires were used to test the accuracy of imputation in the dairy plus beef data set and the dairy-only data set, respectively. The efficiency and accuracy of imputation was quantified as the proportion of genotypes that could not be imputed, and as both the genotype concordance rate and allele concordance rate. The median proportion of genotypes per animal that could not be imputed in the imputation process decreased as the number of genotyped half-sib progeny increased; values for the medium-density panel ranged from a median of 0.015 with a half-sib progeny group size of 4 to a median of 0.0014 to 0.0015 with a half-sib progeny group size of 8. The accuracy of imputation across different paternal half-sib progeny group sizes was similar in both data sets. 
Concordance rates increased considerably as the number of genotyped half-sib progeny increased from four (mean animal allele concordance rate of 0.94 in both data sets for the medium-density genotype panel) to five (mean animal allele concordance rate of 0.96 in both data sets for the medium-density genotype panel) after which it was relatively stable up to a half-sib progeny group size of eight. In the data set with dairy-only animals, sufficient sires with paternal half-sib progeny groups up to 12 were available and the within-animal mean genotype concordance rates continued to increase up to this group size. The accuracy of imputation was worst for the low-density genotypes, especially with smaller half-sib progeny group sizes but the difference in imputation accuracy between density panels diminished as progeny group size increased; the difference between high and medium-density genotype panels was relatively small across all half-sib progeny group sizes. Where biological material or genotypes are not available on individual animals, at least five progeny can be genotyped (on either a medium or high-density genotyping platform) and the parental alleles imputed with, on average, ⩾96% accuracy.
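The two concordance metrics above can be sketched for genotypes coded as 0/1/2 copies of an allele. The allele-concordance definition below, which gives half credit to a heterozygote/homozygote mismatch, is an assumption for illustration, since the exact formula is not given here:

```python
def concordance(true_gt, imputed_gt):
    """Genotype and allele concordance for 0/1/2-coded genotypes."""
    n = len(true_gt)
    # genotype concordance: exact matches only
    geno = sum(t == i for t, i in zip(true_gt, imputed_gt)) / n
    # allele concordance: each locus contributes 2 matching alleles minus
    # the allele-dosage difference (assumed definition)
    allele = sum(2 - abs(t - i) for t, i in zip(true_gt, imputed_gt)) / (2 * n)
    return geno, allele

true_gt    = [0, 1, 2, 2, 1, 0, 1, 2]
imputed_gt = [0, 1, 2, 1, 1, 0, 1, 2]   # one heterozygous miscall
geno, allele = concordance(true_gt, imputed_gt)
```

Allele concordance is always at least as high as genotype concordance, which is consistent with both rates being reported separately above.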
The objective of this study was to evaluate the impact of restricting high-risk antibiotics on methicillin-resistant Staphylococcus aureus (MRSA) incidence rates in a hospital setting. A secondary objective was to assess the impact of reducing fluoroquinolone use in the primary-care setting on MRSA incidence in the community. This was an interventional, retrospective, ecological investigation in both hospital and community (January 2006 to June 2010). Segmented regression analysis of interrupted time-series was employed to evaluate the intervention. The restriction of high-risk antibiotics was associated with a significant change in hospital MRSA incidence trend (coefficient = −0·00561, P = 0·0057). Analysis showed that the intervention relating to reducing fluoroquinolone use in the community was associated with a significant trend change in MRSA incidence in community (coefficient = −0·00004, P = 0·0299). The reduction in high-risk antibiotic use and fluoroquinolone use contributed to both a reduction in incidence rates of MRSA in hospital and community (primary-care) settings.
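Segmented regression of an interrupted time series, as used in the analysis above, fits a baseline level and trend plus a level change and trend change at the intervention point. A hedged sketch on simulated monthly rates (the intervention month and all values are invented, not the study's data):

```python
import numpy as np

months = np.arange(54)                       # monthly observations
post = (months >= 40).astype(float)          # 1 after the (assumed) intervention
time_after = post * (months - 40)            # months elapsed since intervention

# Simulated incidence: slight baseline rise, decline after intervention
rng = np.random.default_rng(1)
mrsa = 0.5 + 0.002 * months - 0.006 * time_after + rng.normal(0, 0.01, 54)

# Design matrix: intercept, baseline trend, level change, trend change
X = np.column_stack([np.ones_like(months, dtype=float),
                     months, post, time_after])
coef, *_ = np.linalg.lstsq(X, mrsa, rcond=None)
trend_change = coef[3]   # negative => incidence declines after intervention
```

The reported coefficients (e.g. −0·00561) correspond to this trend-change term; a full analysis would also adjust standard errors for autocorrelation.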
The present study aimed to investigate socio-economic disparities in food and nutrient intakes among young Irish women. A total of 221 disadvantaged and seventy-four non-disadvantaged women aged 18–35 years were recruited. Diet was assessed using a diet history protocol. Of the total population, 153 disadvantaged and sixty-three non-disadvantaged women were classified as plausible dietary reporters. Food group intakes, nutrient intakes and dietary vitamin and mineral concentrations per MJ of energy consumed were compared between the disadvantaged and non-disadvantaged populations, as was compliance with dietary fibre, macronutrient and micronutrient intake guidelines. The disadvantaged women had lower intakes than the non-disadvantaged women of fruit, vegetables, fish, breakfast cereals, low-fat milk and wholemeal bread (all P< 0·001), yogurt (P= 0·001), low-fat spread (P= 0·002) and fresh meat (P= 0·003). They also had higher intakes of butter, processed red meats, white bread, sugar-sweetened beverages, fried potatoes and potato-based snacks (all P< 0·001) and full-fat milk (P= 0·014). Nutritionally, the disadvantaged women had higher fat, saturated fat and refined sugar intakes; lower dietary fibre, vitamin and mineral intakes; and lower dietary vitamin and mineral densities per MJ than their more advantaged peers. Non-achievement of carbohydrate (P= 0·017), fat (P< 0·001), saturated fat (P< 0·001), refined sugar (P< 0·001), folate (P= 0·050), vitamin C (P< 0·001), vitamin D (P= 0·047) and Ca (P= 0·019) recommendations was more prevalent among the disadvantaged women. Both groups showed poor compliance with Fe and Na guidelines. We conclude that the nutritional deficits present among these socially disadvantaged women are significant, but may be potentially ameliorated by targeted food-based interventions.