We evaluated the performance of three serological tests – an immunoglobulin G indirect enzyme-linked immunosorbent assay (iELISA), a Rose Bengal test and a slow agglutination test (SAT) – for the diagnosis of bovine brucellosis in Bangladesh. Cattle sera (n = 1360) sourced from Mymensingh district (MD) and a government-owned dairy farm (GF) were tested in parallel. We used a Bayesian latent class model that adjusted for the conditional dependence among the three tests and assumed constant diagnostic accuracy of the three tests in both populations. The sensitivities and specificities of the three tests ranged from 84.6% to 93.7%. The true prevalences of bovine brucellosis in MD and the GF were 0.6% and 20.4%, respectively. Parallel interpretation of iELISA and SAT yielded the highest negative predictive values: 99.9% in MD and 99.6% in the GF, whereas serial interpretation of iELISA and SAT produced the highest positive predictive value (PPV): 99.9% in the GF, with a high PPV (98.9%) also in MD. We recommend using iELISA and SAT together, with serial interpretation for culling decisions and parallel interpretation for import decisions. Removal of brucellosis-positive cattle will contribute to the control of brucellosis as a public health risk in Bangladesh.
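The parallel and serial interpretations described above follow standard rules for combining two diagnostic tests. The sketch below is illustrative only: the abstract's Bayesian latent class model adjusts for conditional dependence between tests, whereas these textbook formulas assume conditional independence, and the sensitivity/specificity values used are placeholders, not the paper's estimates.

```python
def parallel(se1, sp1, se2, sp2):
    """Parallel interpretation: classify positive if EITHER test is positive.
    Sensitivity rises, specificity falls."""
    return 1 - (1 - se1) * (1 - se2), sp1 * sp2

def serial(se1, sp1, se2, sp2):
    """Serial interpretation: classify positive only if BOTH tests are positive.
    Sensitivity falls, specificity rises."""
    return se1 * se2, 1 - (1 - sp1) * (1 - sp2)

def predictive_values(se, sp, prev):
    """PPV and NPV at a given true prevalence, via Bayes' theorem."""
    ppv = se * prev / (se * prev + (1 - sp) * (1 - prev))
    npv = sp * (1 - prev) / (sp * (1 - prev) + (1 - se) * prev)
    return ppv, npv

# Placeholder accuracies (NOT the paper's estimates): both tests 90% Se and Sp.
se_p, sp_p = parallel(0.90, 0.90, 0.90, 0.90)
se_s, sp_s = serial(0.90, 0.90, 0.90, 0.90)

# At high prevalence (e.g. 20.4%, as in the GF) serial testing protects PPV;
# at low prevalence (e.g. 0.6%, as in MD) parallel testing protects NPV.
ppv_serial, _ = predictive_values(se_s, sp_s, 0.204)
_, npv_parallel = predictive_values(se_p, sp_p, 0.006)
```

This asymmetry is why serial interpretation (fewer false positives) suits culling decisions, while parallel interpretation (fewer false negatives) suits import screening.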
To investigate the sociodemographic and geographical variation in under- and overnutrition prevalence among children and mothers.
Data from the 2014 Bangladesh Demographic and Health Survey were analysed. Stunting and wasting for children and BMI < 18·5 kg/m² for mothers were considered as undernutrition; overweight was considered as overnutrition for both children and mothers. We estimated the prevalence and performed simple logistic regression analyses to assess the associations between outcome variables and predictors. Bayesian spatial models were applied to estimate region-level prevalence and to identify the regions (districts) prone to under- and overnutrition.
Children aged <5 years and their mothers aged 15–49 years in Bangladesh.
A significant difference (P < 0·001) was observed in both under- and overnutrition prevalence between poor and rich households. A notable regional variation was also observed in under- and overnutrition prevalence. Stunting prevalence ranged from 20·3 % in Jessore to 56·2 % in Sunamgonj, wasting from 10·6 % in Dhaka to 19·2 % in Bhola, and overweight from 0·8 % in Shariatpur to 2·6 % in Dhaka. Of the sixty-four districts, twelve had a stunting prevalence, and thirty-two a wasting prevalence, higher than the WHO critical threshold levels. Similarly, fifty-three districts had a prevalence of maternal underweight higher than the national level. In contrast, the prevalence of overweight was comparatively high in the industrially equipped metropolitan districts.
Observed sociodemographic and geographical inequalities imply slow progress in the overall improvement of both under- and overnutrition. Therefore, effective intervention programmes and policies need to be designed urgently targeting the grass-roots level of such regions.
The Eastern Gangetic Plain is among the world's most intensively farmed regions, where rainfed and irrigated agriculture coexist. While the region, and especially Bangladesh, is a major producer of rice (Oryza sativa L. ssp. indica), there is potential to further develop sustainable rice production systems. Specifically, there is scope to include a replacement crop for the short fallow between rice crops in the dominant cropping pattern of rainfed monsoon rice followed by irrigated spring rice. The aim of the current research was to identify a suitable cool-season legume crop – pea (Pisum sativum L.) or lentil (Lens culinaris Medik. ssp. culinaris) – that could be grown in the brief period between rice crops. The study comprised four crop sequence experiments comparing legume cultivars differing in maturity, grown between both long- and short-duration rice cultivars. These experiments were done at the Bangladesh Rice Research Institute regional station at Rajshahi over three cropping cycles. This was followed by an evaluation of pea vs. fallow between rice crops on three farmers’ fields in one cropping cycle. It is demonstrated here that green pod vegetable pea is one of the best options to intensify the rainfed monsoon rice–fallow–spring irrigated rice cropping system, notwithstanding other remunerative rabi cropping options that could displace boro rice. The inclusion of an extra crop, pea as green pod vegetable, increased farm productivity 1·4-fold over the dominant cropping sequence (rice–fallow–rice) and farm net income fourfold. The study highlighted the advantages in total system productivity and monetary return of crop intensification with the inclusion of a pea crop between successive rice crops instead of a fallow period.
The aim of this feasibility trial was to evaluate the feasibility and acceptability of the locally adapted Group Problem Management Plus (PM+) intervention for women in conflict-affected settings in Swat, Pakistan.
This mixed-methods study incorporated a quantitative component consisting of a two-arm cluster randomised controlled feasibility trial, and a qualitative evaluation of the acceptability of Group PM+ to a range of stakeholder groups. For the quantitative component, on average six women were screened and recruited from each of the 20 Lady Health Worker (LHW) catchment areas (20 clusters); women were eligible with a score of >2 on the General Health Questionnaire and a score of >16 on the WHO Disability Assessment Schedule. These LHW clusters were randomised on a 1 : 1 allocation ratio, using computer-based software and a simple randomisation method, to the Group PM+ intervention or Enhanced Usual Care. The Group PM+ intervention consisted of five weekly sessions of 2 h duration delivered by local non-specialist females under supervision. The primary outcome was individual psychological distress, measured by levels of anxiety and depression on the Hospital Anxiety and Depression Scale at week 7 after baseline. Secondary outcomes included symptoms of depression, post-traumatic stress disorder (PTSD), general psychological profile, levels of functioning and generalised psychological distress. Intervention acceptability was explored through in-depth interviews.
The results show that lay-helpers with no prior mental health experience can be trained, under supervision, to achieve the desired competency and successfully deliver the intervention in community settings. There was good intervention uptake, with Group PM+ considered useful by participants, their families and lay-helpers. Although the study was not large enough to establish statistical significance, the outcome evaluation indicated improvements in depression, anxiety, general psychological profile and functioning. The PTSD symptom and depressive disorder scores showed a trend in favour of the intervention.
This trial showed robust acceptance in the local settings, with delivery by non-specialists under the supervision of trained local women. The trial paves the way for further adaptation and exploration of the outcomes through larger-scale implementation and definitive randomised controlled trials in the local settings.
Introduction: Syncope can be caused by serious life-threatening conditions not obvious during the initial ED assessment, leading to wide variations in management. We aimed to identify the reasons for consultations and hospitalizations, outcomes, and the potential cost savings if an outpatient cardiac monitoring strategy were developed. Methods: We conducted a prospective cohort study of adult syncope patients at 5 academic EDs over 41 months. We collected baseline characteristics, reasons for consultation and hospitalization, hospital length of stay and average total inpatient cost. Adjudicated 30-day serious adverse events (SAEs) included death, myocardial infarction, arrhythmia, structural heart disease, pulmonary embolism, significant hemorrhage and procedural intervention. We used descriptive statistics with 95% CI. Results: Of the 4,064 patients enrolled (mean age 53.1 years, 55.9% female), 3,255 (80.1%) were discharged from the ED, 209 (5.2%) had a SAE identified in the ED, 600 (14.8%) with no SAE were referred for consultation in the ED and 299 (7.4%) were hospitalized: 55.5% of referrals and 55.2% of hospitalizations were for suspected cardiac syncope (46.5% admitted for cardiac monitoring, of whom 71.2% had no cause identified). SAE rates were 9.7% overall: 2.5% among those discharged by the ED physician, 3.4% among those discharged from the ED by a consultant, 21.7% among inpatients, and 4.8% following discharge from hospital. The mean hospital length of stay for cardiac syncope was 6.7 (95% CI 5.8, 7.7) days, with total estimated costs of $7,925 per patient (95% CI 7434, 8417). Conclusion: Suspected cardiac syncope, particularly arrhythmia, was the major reason for ED referral and hospitalization. The majority of patients hospitalized for cardiac monitoring had no identified cause. A substantial number of patients suffered SAEs, particularly arrhythmias, outside the hospital.
These findings highlight the need to develop a robust syncope prediction tool and a remote cardiac monitoring strategy to improve patient safety while saving substantial health care resources.
Sexual minorities experience excess psychological ill health globally, yet UK data exploring reasons for poor mental health among sexual minorities are lacking. This study compares the prevalence of a measure of well-being, symptoms of common mental disorder (CMD), lifetime suicidal ideation, and harmful alcohol and drug use between inner-city non-heterosexual and heterosexual individuals. It is the first UK study that aims to quantify how much major, everyday and anticipated discrimination; lifetime and childhood trauma; and coping strategies for dealing with unfair treatment predict excess mental ill health among non-heterosexuals. Further, inner-city and national outcomes are compared.
Self-report survey data came from the South East London Community Health study (N = 1052) and the Adult Psychiatric Morbidity Survey (N = 7403).
Adjustments for greater exposure to measured experiences of discrimination and lifetime and childhood trauma had a small to moderate impact on effect sizes for adverse health outcomes, though in fully adjusted models non-heterosexual orientation remained strongly associated with CMD, lifetime suicidal ideation, and harmful alcohol and drug use. There was limited support for the hypothesis that measured coping strategies might mediate some of these associations. The inner-city sample had poorer mental health overall compared with the national sample, and the discrepancy was larger for non-heterosexuals than heterosexuals.
Childhood and adult adversity substantially influence but do not account for sexual orientation-related mental health disparities. Longitudinal work taking a life course approach with more specific measures of discrimination and coping is required to further understand these associations. Sexual minorities should be considered as a priority in the design and delivery of health and social services.
There is limited information on the percentage of household income spent on childhood diarrhoea, especially in rural Bangladesh. A total of 4205 children aged <5 years with acute diarrhoea were studied. Percent expenditure was calculated as the total expenditure for the diarrhoeal episode divided by monthly family income, multiplied by 100. The overall median percent expenditure was 3·04 (range 0·01–94·35). For Vibrio cholerae it was 6·42 (range 0·52–82·85), for enterotoxigenic Escherichia coli 3·10 (range 0·22–91·87), for Shigella 3·17 (range 0·06–77·80), and for rotavirus 3·08 (range 0·06–48·00). In a multinomial logistic regression model, for the upper tertile of percent expenditure, significantly higher odds were found for male sex, travelling a longer distance to reach hospital (⩾median of 4 miles), seeking care elsewhere before attending hospital, vomiting, higher frequency of purging (⩾10 times/day), some or severe dehydration, and stunting. V. cholerae was associated with the highest, and rotavirus with the lowest, percent expenditure of household income due to childhood diarrhoea.
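The percent-expenditure measure defined above is simple arithmetic; a minimal sketch (function and variable names are mine, not the paper's, and the example figures are illustrative):

```python
def percent_expenditure(episode_cost, monthly_income):
    """Cost of one diarrhoeal episode as a percentage of monthly family income."""
    return episode_cost / monthly_income * 100

# Illustrative only: an episode costing 304 units against a monthly income of
# 10 000 units gives a percent expenditure of about 3.04, matching the scale
# of the overall median reported in the abstract.
example = percent_expenditure(304, 10_000)
```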
The cropping systems of the Eastern Gangetic Plains of Bangladesh, India and Nepal are based on rice. There is scope to intensify such systems through diversification with lentil, the most popular food legume. Two strategies were evaluated to fit lentil into the short fallow between successive monsoonal (T. aman) and pre-monsoonal (aus) or irrigated (boro) rice crops. These were early-flowering sole-cropped lentil and lentil relay-sown into rice. Firstly, 18 early-flowering lentil lines at three contrasting sowing dates were tested over two seasons on a research station at Ishurdi in Bangladesh. Secondly, relay sowing was evaluated at the same location with six early-flowering lines and two control cultivars in two seasons. It was also assessed on ten farms in western Bangladesh, comparing relay with sole cropping over 3 years. Flowering in the early-flowering lentil lines was consistently 9–17 days earlier than in the control cultivars, but they did not achieve an economic yield (<1·0 t/ha). Relay sowing with an existing cultivar produced an economic yield of lentil, comparable to or higher than sole-cropped lentil in all situations. The relay-sown lentil matured in sufficient time to allow the land to be prepared for the succeeding rain-fed rice crop. It was concluded that the substitution of relay-sown lentil for fallow in the monsoonal rice–fallow–rain-fed rice cropping pattern is a useful option to intensify and diversify cropping in the Eastern Gangetic Plain.
Selenium (Se) is an essential micronutrient for human and animal health. Globally, more than one billion people are Se deficient due to low dietary Se. Low dietary intake of Se can be improved by Se supplementation, food fortification and biofortification of crops. Lentil (Lens culinaris Medikus subsp. culinaris) is a popular cool-season food legume in many parts of the world; it is naturally rich in Se and therefore has potential for Se biofortification. An Se foliar application experiment at two locations and a multi-location trial of 12 genotypes at seven locations were conducted from April to December 2011 in South Australia and Victoria, Australia. Foliar application of a total of 40 g/ha of Se as potassium selenate (K2SeO4) – 10 g/ha during full bloom and 30 g/ha during the flat pod stage – increased seed Se concentration from 201 to 2772 μg/kg, but had no effect on seed size or seed yield. Consumption of 20 g of biofortified lentil can supply all of the recommended daily allowance of Se. After Se foliar application, cultivars PBA Herald XT (3327 μg/kg), PBA Bolt (3212 μg/kg) and PBA Ace (2957 μg/kg) had high seed Se concentrations. These cultivars may be used in lentil biofortification. In the genotypic evaluation trial, significant genotype and location variation was observed for seed Se concentration, but the interaction was not significant. In conclusion, foliar application of Se as K2SeO4 is an efficient agronomic approach to improve seed Se concentration for lentil consumers, and there is also scope for genetic biofortification in lentil.
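The claim that 20 g of biofortified lentil covers the daily Se requirement is a unit conversion. A quick check, assuming the commonly cited adult RDA of 55 μg/day (this RDA figure is my assumption; the abstract does not state which allowance it uses):

```python
# Assumed adult Se RDA in micrograms per day (not stated in the abstract).
RDA_UG_PER_DAY = 55

def se_intake_ug(portion_g, seed_se_ug_per_kg):
    """Se delivered by a lentil portion: grams -> kilograms, times seed Se."""
    return portion_g / 1000 * seed_se_ug_per_kg

# 20 g of lentil at the biofortified concentration of 2772 ug/kg delivers
# roughly 55.4 ug of Se, at or just above the assumed 55 ug/day RDA.
intake = se_intake_ug(20, 2772)
```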
Efficacy of depression treatments, including adjunctive antipsychotic treatment, has not been explored for patients with worsening symptoms after antidepressant therapy (ADT).
This post-hoc analysis utilized pooled data from 3 similarly designed, randomized, double-blind, placebo-controlled trials that assessed the efficacy, safety, and tolerability of adjunctive aripiprazole in patients with major depressive disorder with inadequate response to ADT. The studies had 2 phases: an 8-week prospective ADT phase and 6-week adjunctive (aripiprazole or placebo) treatment phase. This analysis focused on patients whose symptoms worsened during the prospective 8-week ADT phase (worsening defined as >0% increase in Montgomery–Åsberg Depressive Rating Scale [MADRS] Total score). During the 6-week, double-blind, adjunctive phase, response was defined as ≥50% reduction in MADRS Total score and remission as ≥50% reduction in MADRS Total score and MADRS score ≤10.
Of 1065 patients who failed to achieve a response during the prospective phase, 160 exhibited worsening of symptoms (ADT-Worseners), and 905 exhibited no change/reduction in MADRS scores (ADT-Non-worseners). Response rates for ADT-Worseners at endpoint were 36.6% (adjunctive aripiprazole) and 22.5% (placebo). Similarly, response rates at endpoint for ADT-Non-worseners were 37.5% (adjunctive aripiprazole) and 22.5% (placebo). Remission rates at endpoint for ADT-Worseners were 25.4% (adjunctive aripiprazole) and 12.4% (placebo). For ADT-Non-worseners, remission rates were 29.9% (adjunctive aripiprazole) and 17.4% (placebo).
These results suggest that adjunctive aripiprazole is an effective intervention for patients whose symptoms worsen during antidepressant monotherapy. The results challenge the view that benefits of adjunctive therapy with aripiprazole are limited to partial responders to ADT.
The study aimed to determine the geographical diversity in seasonality of major diarrhoeal pathogens among 21 138 patients enrolled between 2010 and 2012 in two urban and two rural sites in Bangladesh under the surveillance system of the International Centre for Diarrhoeal Disease Research, Bangladesh (icddr,b). Distinct seasonality was found for rotavirus diarrhoea, which peaked in winter (December and January) across the sites and dipped during the rainy season: in May in urban Dhaka, August in Mirpur and July in Matlab, as estimated by time-series analysis using a quasi-Poisson regression model. Significant seasonality for shigellosis was observed in Dhaka and rural Mirzapur. Cholera had robust seasonality in Dhaka and Matlab in the hot and rainy seasons. For enterotoxigenic Escherichia coli (ETEC) diarrhoea, clearly defined seasonality was observed in Dhaka (summer). Understanding the seasonality of such pathogens can improve case management with appropriate therapy, allowing policy-makers to identify periods of high disease burden.
Genome-wide association analysis on monozygotic twin-pairs offers a route to discovery of gene–environment interactions through testing for variability loci associated with sensitivity to individual environment/lifestyle. We present a genome-wide scan of loci associated with intra-pair differences in serum lipid and apolipoprotein levels. We report data for 1,720 monozygotic female twin-pairs from the GenomEUtwin project, with 2.5 million SNPs, imputed or genotyped, and measured serum lipid fractions for both twins. We found one locus associated with intra-pair differences in high-density lipoprotein cholesterol, rs2483058 in an intron of SRGAP2, where twins carrying the C allele are more sensitive to environmental factors (P = 3.98 × 10⁻⁸). We followed up the association in further genotyped monozygotic twins (N = 1,261), which showed a moderate association for the variant (P = 0.200, same direction of effect). In addition, we report a new association with the level of apolipoprotein A-II (P = 4.03 × 10⁻⁸).
Psychiatric in-patients are at high risk of suicide. Recent reductions in bed numbers in many countries may have affected this risk but few studies have specifically investigated temporal trends. We aimed to explore trends in psychiatric in-patient suicide over time.
A prospective study of all patients admitted to National Health Service (NHS) in-patient psychiatric care in England (1997–2008). Suicide rates were determined using National Confidential Inquiry and Hospital Episode Statistics (HES) data.
Over the study period there were 1942 psychiatric in-patient suicides. Between the first 2 years of the study (1997, 1998) and the last 2 years (2007, 2008) the rate of in-patient suicide fell by nearly one-third from 2.45 to 1.68 per 100 000 bed days. This fall in rate was observed for males and females, across ethnicities and diagnoses. It was most marked for patients aged 15–44 years. Rates also fell for the most common suicide methods, particularly suicide by hanging on the ward (a 59% reduction). Although the number of post-discharge suicides fell, the rate of post-discharge suicide may have increased by 19%. The number of suicide deaths in those under the care of crisis resolution/home treatment teams has increased in recent years to approximately 160 annually.
The rate of suicide among psychiatric in-patients in England has fallen considerably. Possible explanations include falling general population rates, changes in the at-risk population or improved in-patient safety. However, a transfer of risk to the period after discharge or other clinical settings such as crisis resolution teams cannot be ruled out.
Rabies is a major public health problem in Bangladesh, where most of the population live in rural areas. However, there is little epidemiological information on rabies in rural Bangladesh. This study was conducted in 30 upazilas (subdistricts) covering all six divisions of the country, to determine the levels of rabies and animal bites in Bangladesh. The total population of these upazilas was 6 992 302. A pretested questionnaire was used and data were collected by interviewing the adult members of families. We estimated that in Bangladesh, 166 590 [95% confidence interval (CI) 163 350–170 550] people per year are bitten by an animal. The annual incidence of rabies deaths in Bangladesh was estimated to be 1·40 (95% CI 1·05–1·78)/100 000 population. By extrapolating this, we estimated that 2100 (95% CI 1575–2670) people die annually from rabies in Bangladesh. More than three-quarters of rabies patients died at home. This community-based study provides new information on rabies epidemiology in Bangladesh.
The goal of the present study was to examine the influence of community environment on the nutritional status (weight-for-age and height-for-age) of children (aged 0–59 months) in Bangladesh. In addition, we tested the association between specific characteristics of community environments and child nutritional status.
The data are from the nationally representative 2004 Bangladesh Demographic and Health Survey.
Respondents were ever-married women (aged 15–49 years) and their children (n = 5731), residing in 361 communities. Child nutritional outcomes are physical measurements of weight-for-age and height-for-age in sd units. We considered the following attributes of community environments potentially related to child nutrition: (i) community water and sanitation infrastructure; (ii) availability of community health and education services; (iii) community employment and social participation; and (iv) education level of the community.
Multilevel regression analysis showed that the spatial distribution of maternal and child covariates did not entirely explain the between-community variation in child nutritional status. The education level of the community emerged as the strongest community-level predictor of child height-for-age (highest v. lowest tertile, β = 0·18 (se 0·07)) and weight-for-age (highest v. lowest tertile, β = 0·21 (se 0·06)). In the height-for-age model, community employment and social participation also emerged as being statistically significant (highest v. lowest tertile, β = 0·13 (se 0·06)).
The community environment influences child nutrition in Bangladesh, and maternal- and child-level covariates may fail to capture the entire influence of communities. Interventions to reduce child undernutrition in developing countries should take into consideration the wider community context.