Hidden hunger is widespread in India. Individual dietary diversity score (IDDS) is a measure of the nutrient adequacy of the diet. The FAO has set guidelines for the measurement of dietary diversity: the IDDS and the minimum dietary diversity score for women (MDD-W) to assess nutritional deficiency, but validation against nutritional biomarkers is required. Using available data among rural youth (17 years) from the Pune Maternal Nutrition Study, the validity of DDS was assessed to measure deficiencies of vitamin B12, folate and Hb. Of the 355 boys and 305 girls, 19 % were classified as underweight, 57 % as vitamin B12 deficient (<150 pmol/l) and 22 % as anaemic (<120/130 g/l). Cereals, legumes and ‘other-vegetables’ were the most frequently consumed foods. More boys than girls consumed milk, flesh, eggs and micronutrient-dense foods. Median IDDS of 4 (interquartile range (IQR) 3–4) and MDD-W of 6 (IQR 5–7) were low. Youth with vitamin B12 deficiency had a higher likelihood of an IDDS ≤ 4 (1·89; 95 % CI 1·24, 2·87) or an MDD-W ≤ 5 (1·40; 95 % CI 1·02, 1·94). Youth with anaemia were more likely to have an IDDS ≤ 4 (1·76; 95 % CI 1·01, 3·14) adjusted for socio-economic scores, BMI, energy intake and sex. Folate deficiency was low (3 %) and was not associated with either score. Youth with lowest plasma vitamin B12 and Hb infrequently or never consumed dairy products/non-vegetarian foods. These rural Indian youth were underweight, had low DDS and consumed foods low in good-quality proteins and micronutrients. Associations of DDS with circulating micronutrients indicate that DDS is a valid measure to predict vitamin B12 deficiency and anaemia.
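A dietary diversity score of the kind used above is simply the count of distinct food groups consumed in the recall period, compared against a cut-off (e.g. IDDS ≤ 4). A minimal sketch, using illustrative food groups rather than the exact FAO category lists:

```python
# Illustrative food groups; the official FAO IDDS (9 groups) and MDD-W
# (10 groups) category lists differ slightly from this example.
FOOD_GROUPS = {
    "cereals", "legumes", "dairy", "flesh_foods", "eggs",
    "dark_green_leafy_vegetables", "vitamin_a_rich_fruits",
    "other_vegetables", "other_fruits",
}

def dietary_diversity_score(foods_consumed: set) -> int:
    """Count distinct recognised food groups reported in a 24-h recall."""
    return len(foods_consumed & FOOD_GROUPS)

# A youth reporting only cereals, legumes and 'other vegetables' scores 3,
# below the IDDS <= 4 cut-off used in the study.
print(dietary_diversity_score({"cereals", "legumes", "other_vegetables"}))  # 3
```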
Case fatality rate (CFR) and doubling time are important characteristics of any epidemic. For coronavirus disease 2019 (COVID-19), wide variations in the CFR and doubling time have been noted among various countries. Early in an epidemic, CFR calculations that use all patients as the denominator do not account for hospitalised patients who are still ill and will die in the future. Hence, we calculated a cumulative CFR (cCFR) using only patients whose final clinical outcomes were known at a certain time point. We also estimated the daily average doubling time. Calculating CFR using this method leads to temporal stability in the fatality rates; the cCFR stabilises at different values for different countries. Possible reasons for this are an improved outcome rate by the end of the epidemic and a wider testing strategy. The United States, France, Turkey and China had a high cCFR at the start owing to a low outcome rate. By 22 April, Germany, China and South Korea had a low cCFR. China and South Korea controlled the epidemic and achieved high doubling times. The doubling time in Russia did not cross 10 days during the study period.
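The cCFR described above divides deaths by closed cases only (deaths plus recoveries), and a doubling time can be derived from the growth of cumulative cases assuming exponential growth between observations. A minimal sketch with illustrative numbers, not the study's data:

```python
import math

def cumulative_cfr(deaths: int, recoveries: int) -> float:
    """cCFR: deaths divided by all closed cases (deaths + recoveries)."""
    closed = deaths + recoveries
    if closed == 0:
        raise ValueError("no closed cases yet")
    return deaths / closed

def doubling_time(cases_prev: float, cases_now: float, days: float = 1.0) -> float:
    """Days for cumulative cases to double, assuming exponential growth
    between the two observations taken `days` apart."""
    growth = cases_now / cases_prev
    if growth <= 1.0:
        return math.inf  # no growth: cases never double
    return days * math.log(2) / math.log(growth)

# Illustrative numbers (not from the study):
print(round(cumulative_cfr(deaths=120, recoveries=880), 3))  # 0.12
print(round(doubling_time(10_000, 10_700), 1))               # 10.2 days
```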
This paper offers a framework for measuring global growth and inflation, built on standard index number theory, national accounts principles, and the concepts and methods of international macro-economic comparisons. Our approach provides a sound basis for purchasing power parity (PPP)- and exchange rate (XR)-based measures of global growth and inflation. The Sato–Vartia index number system advocated here yields results very similar to a Fisher system but has the added advantage of allowing a complete decomposition into PPP or XR effects. For illustrative purposes, we present estimates of global growth and inflation for 141 countries for the years 2005 and 2011. The contribution of movements in XRs and PPPs to global inflation is presented. The aggregation properties of the method are also discussed.
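For reference, the textbook Sato–Vartia price index (the general form, not an equation reproduced from this paper) aggregates price relatives using normalised logarithmic-mean expenditure-share weights:

```latex
P_{SV} = \prod_i \left( \frac{p_i^1}{p_i^0} \right)^{w_i},
\qquad
w_i = \frac{L(s_i^0, s_i^1)}{\sum_j L(s_j^0, s_j^1)},
\qquad
L(a,b) =
  \begin{cases}
    \dfrac{a - b}{\ln a - \ln b}, & a \neq b,\\[4pt]
    a, & a = b,
  \end{cases}
```

where $p_i^t$ is the price (or PPP/XR-converted value) and $s_i^t$ the expenditure share of item $i$ in period $t$.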
Alcoholism has a high prevalence and impacts morbidity, mortality, quality of life, and the economy. Heritability estimates of alcohol dependence are 50-61%. Putative psychological, cultural, and genetic susceptibilities to alcoholism have been identified, but understanding of the genetic components is still underdeveloped.
Identify genetic vulnerabilities predisposing individuals to alcoholism and co-morbid psychiatric disorders in the largest study of its kind.
12 centres including 10 trainees are currently collecting blood and clinical samples. Nearly 1700 of a planned 2000 cases of ICD-10/DSM-IV alcohol dependence have been collected, 500 with standardized assessments of alcohol use and comorbidity, along with 2000 ancestrally matched supernormal controls from UCL/collaborators. Genomic DNA will be isolated following standard procedures. Genotyping will be performed using the Affymetrix Gene Chip Human Mapping 1M Array to type up to 1 million single nucleotide polymorphism (SNP) and copy number variant (CNV) markers. Allelic association will be tested by chi-square analysis of the alcohol-dependent sample versus controls.
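The planned allelic association test is, for a biallelic SNP, a Pearson chi-square on a 2x2 table of allele counts (cases vs. controls by allele). A minimal sketch with hypothetical counts, not study data:

```python
def allelic_chi_square(case_a: int, case_b: int,
                       ctrl_a: int, ctrl_b: int) -> float:
    """Pearson chi-square statistic (1 df, no continuity correction) for a
    2x2 table of allele counts: rows cases/controls, columns alleles A/B."""
    n = case_a + case_b + ctrl_a + ctrl_b
    num = n * (case_a * ctrl_b - case_b * ctrl_a) ** 2
    den = ((case_a + case_b) * (ctrl_a + ctrl_b)
           * (case_a + ctrl_a) * (case_b + ctrl_b))
    return num / den

# Hypothetical allele counts for one SNP: 2000 case chromosomes
# (A = 900, B = 1100) vs. 4000 control chromosomes (A = 1560, B = 2440).
print(round(allelic_chi_square(900, 1100, 1560, 2440), 2))  # 19.84 (1 df)
```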
n = 65; 57% male; mean age 45 years; mean age at onset of harmful alcohol use 19 years; mean age at onset of withdrawals 32 years; mean alcohol intake 21 units; primary depression 27%; secondary depression 49%; antisocial personality disorder 14%. The candidate gene approach in this sample has shown that the GABA receptor B1 (GABRB1) and the tachykinin receptor 1 (TACR1) genes are involved in genetic susceptibility to alcoholism. The D2 dopamine receptor is next to be examined.
Preliminary data suggest high psychiatric comorbidity in a clinical alcohol dependence sample and implicate candidate genes. Next steps are genome-wide marker analysis, sequencing and investigation of alterations in biological pathways/systems.
Pervasive Refusal Syndrome (PRS) is a relatively new diagnostic concept that describes a rare and potentially life-threatening condition in which children refuse to walk, talk, eat, drink, engage in self-care, and take part in day-to-day activities (Lask et al., 1991).
PRS is not included in any of the psychiatric classification systems (ICD-10, DSM-IV), although consensus exists within the literature as to its existence. Lask comments in his paper on Pervasive Refusal Syndrome that he has consulted on only 50 cases worldwide (Lask, 2004).
The authors will share their clinical experience of treating seven new cases of PRS in a regional CAMHS inpatient hospital. Patients with PRS often require hospital admission for assessment and exclusion of other medical, neurological and psychiatric disorders. However, because of the rarity of the condition, many medical and psychiatric professionals have little experience of the treatment and rehabilitation required.
The specific MDT management approach necessary to meet the complex needs of patients with PRS will be discussed, as treatment is often counterintuitive, and some approaches can result in deterioration rather than improvement.
Less is known about long-term improvement and recovery from the disorder, as only a few studies have reported beyond immediate outcome. The authors have undertaken a long-term follow-up (in press) and will discuss issues relating to prognosis.
A specific MDT treatment approach for PRS will be discussed, alongside the clinical decisions and dilemmas involved in following this approach.
First Rank Symptoms (FRS) - a group of intriguing experiences characterized by a striking breach of ‘self versus non-self’ boundaries - have had a critical influence on the diagnosis of schizophrenia. The inferior parietal lobule is implicated in the pathogenesis of FRS in schizophrenia. However, the role of the planum parietale (PP) in the genesis of FRS is yet to be examined.
Aims & objectives
To the best of our knowledge, this is the first study to examine the effect of FRS status on the volume of the PP in antipsychotic-naïve schizophrenia patients.
In this study we examined the volume of the PP in antipsychotic-naïve schizophrenia patients (n = 32; M:F = 16:16) in comparison with healthy subjects matched (as a group) for age, sex and handedness (n = 34; M:F = 16:18), using a valid method with good inter-rater reliability.
Female schizophrenia patients showed significant volume reduction in the right PP in comparison with female healthy controls (F = 7.2; p = 0.01), whereas male patients did not. There was a significant effect of Schneiderian FRS in female patients: those who had FRS had a significantly smaller right PP volume than healthy controls (F = 3.8; p = 0.03), whereas female patients who were FRS-negative did not differ. Left PP volume did not differ between patients and controls.
The current study supports previous work implicating the parietal lobe in the pathogenesis of FRS. The specific role of the PP in FRS generation and the possible implication of sex differences need further systematic study.
Borderline Personality Disorder (BPD) describes pervasive and stable impairments in personal identity and interpersonal functioning, together with pathological personality traits. Most patients are prescribed multiple psychoactive medications despite none being indicated for BPD. Current evidence proposes long-term, BPD-appropriate psychotherapy as the most efficacious treatment.
A residential BPD patient cohort was evaluated to determine whether BPD-focused psychotherapy reduced prescribing and the severity of BPD and co-morbid symptoms.
The pattern of psychotropic drug utilisation at admission, discharge and one-year follow-up was measured. Changes in the utilisation of pharmacotherapy were examined in the context of improvements in BPD and/or co-morbid disorder symptom severity.
There were 74 female participants, most with more than one personality disorder diagnosis and co-morbid mood disorders. Residential treatment included individual and group psychotherapy for BPD. Self-reported use of psychotropic medications was ascertained at admission (T1), discharge from the program (3 to 6 months; T2) and one year post-discharge (T3). The SCID (Structured Clinical Interview for DSM-IV) was used to confirm the BPD diagnosis and associated co-morbid conditions. The Beck Depression Inventory was completed at each time point.
A significant reduction in the prescription of psychoactive medications was accompanied by significant decreases in the incidence and severity of self-rated depression as well as clinician-assessed personality disorder, including BPD. These changes were most pronounced 12 months after discharge.
Three to six months of BPD-specific psychotherapy provided lasting benefit for a range of mental health problems and reduced prescription medication use.
Depression is known to be associated with low serum Brain-Derived Neurotrophic Factor (BDNF) and elevated levels of cortisol. Yoga has been shown to be associated with significant antidepressant effect as well as increase in serum BDNF levels and reduction in serum cortisol levels in these patients.
Aims and Objectives
We examined the association between serum cortisol and BDNF levels in patients with depression who were on treatment with antidepressants, yoga therapy, and both in combination.
Fifty-one consenting drug-naive outpatients (29 males) aged 18-55 years and diagnosed with Major Depression received antidepressant medication alone (n=15) or yoga therapy with (n=18) or without (n=18) concurrent antidepressants. Subjects in the yoga groups practiced a specific yoga module for three months. Depression was assessed using the Hamilton Depression Rating Scale (HDRS). Serum BDNF and cortisol levels were obtained before and after three months using a sandwich ELISA method. Group differences were analyzed using one-way ANOVA. Correlations between serum BDNF and cortisol levels were analyzed using Pearson's correlation.
Significant negative correlations were observed between baseline BDNF and cortisol levels in the Yoga + Medication group (r=-0.569; P=0.01) and between the change in BDNF and cortisol levels in the Yoga-alone group (r=-0.582; P=0.01). No other significant correlations were found.
There is a significant association between serum cortisol and BDNF levels in patients with depression who underwent Yoga with or without antidepressants. This suggests that Yoga may have stress reduction and neuroplastic effects alone or in combination with medications in depressed patients.
There are a number of good standard practices available for prescribing long acting antipsychotics. Adherence to these guidelines will minimise any harm to the service users.
To compare depot antipsychotic prescribing practice with good standard practice guidelines of BNF, Trust and Maudsley guidelines.
To compare practice with standards in the areas of:
– licensed indication;
– dose/frequency range;
– avoiding poly-pharmacy;
– regular review of clinical and side effects.
Case notes of a randomly selected sample of 30 patients from the depot clinic at the City East Adult Community Mental Health Team, Leicester, UK were retrospectively investigated. The collected data were analysed, compliance with the best-practice guidelines was calculated, and recommendations were made based on the findings.
One hundred percent compliance was observed for licensed indications and for dose/frequency within the BNF range. However, 14% of patients received poly-pharmacy; 86% had a regular outpatient review, but only 46% had a review of side effects.
Recommendations included better-quality documentation by clinicians, improved technology to generate automatic review reminders, the introduction of a clinic checklist covering all clinically important information, wider dissemination of the findings of this investigation, and re-auditing practice to explore its impact.
Disclosure of interest
The authors have not supplied their declaration of competing interest.
We propose the use of a machine learning algorithm to improve possible COVID-19 case identification more quickly using a mobile phone–based web survey. This method could reduce the spread of the virus in susceptible populations under quarantine.
The Indian subcontinent is prone to tropical cyclones originating in the North Indian Ocean. Through this study, an inventory of disease outbreaks in the tropical cyclone-affected regions from 2010 to 2018 has been compiled. This inventory is used to assess the success of a recent sanitation intervention, the Swachh Bharat Mission, also known as the Clean India Mission.
Meteorological parameters from the Indian satellites were used to demarcate the cyclone-affected area. Disease outbreaks and epidemics during the tropical cyclones were compiled from the Integrated Disease Surveillance Program and other relevant sources. The inventory has been used to track the effect of recent sanitation interventions on disease outbreaks.
Districts on the eastern coast of India are frequently affected by tropical cyclones originating in the North Indian Ocean. Infectious diseases such as acute diarrheal diseases, vector-borne diseases, viral fevers, enteric fevers and food poisoning have recurrently occurred during cyclonic events and persisted for up to 2 weeks after the cyclonic episode. The effectiveness of the Clean India Mission is evident during the recent cyclones Ockhi, Titli and Gaja, during which a significantly lower number of infectious disease outbreaks were recorded.
The Clean India Mission has exhibited positive results on the public health consequences associated with tropical cyclones.
Choice of the most appropriate breeding method hinges on the mode of action of the genes controlling expression of target traits. Pungency (capsaicin) and colour (oleoresin) are the most important fruit quality traits in chilli. The genetics of fruit quality traits was unravelled using a combination of first- and second-degree statistics. An additive-dominance model was inadequate to explain the inheritance of fruit yield and quality traits. The magnitudes of additive genetic effects [a] and their variances [σ²A] were higher than those of dominance genetic effects [d] and dominance genetic variances [σ²D], suggesting a predominance of additive-effect genes in the inheritance of both oleoresin and capsaicin contents. These results are discussed in relation to the appropriate selection strategy to be followed for genetic improvement of chilli for oleoresin and capsaicin contents.
SGA devices have been used successfully in patients of all ages in various clinical scenarios, including primary airway management under general anesthesia in the operating room, and resuscitation and emergent airway management in the emergency department (ED) and prehospital settings. SGA devices have been used as alternatives to face-mask ventilation and tracheal intubation by healthcare providers with proficient airway management skills, but also by those with less experience, to successfully oxygenate and ventilate the lungs. The clinical efficacy of SGA devices in children has been proven in a large number of clinical studies. Pediatric SGA devices have undergone an evolution in design since their introduction 30 years ago. These newer design features have improved the use of SGA devices to provide positive-pressure ventilation and facilitate fiberoptic-guided tracheal intubation. The evolution, versatility, and utility of the SGA device will be discussed in detail in this chapter.
Thirty-one accessions of Oryza glaberrima were evaluated to study genetic variability using correlation, path, principal component analysis (PCA) and D² analysis. Box plots depicted high estimates of variability for days to 50% flowering and grain yield per plant in Kharif 2016, and for plant height, productive tillers, panicle length and 1000-seed weight in Kharif 2017. Correlation studies revealed that days to 50% flowering, plant height, panicle length, number of productive tillers and spikelets per panicle had a high direct positive association with grain yield, while path analysis identified the number of productive tillers as having the maximum direct positive effect on grain yield. Days to 50% flowering via spikelets per panicle, and productive tillers and plant height via spikelets per panicle, exhibited high positive indirect effects on grain yield per plant. PCA showed a cumulative variance of 54.752% from yield per plant, days to 50% flowering, spikelets per panicle and panicle length, contributing almost all the variation in traits, while D² analysis identified days to 50% flowering and grain yield per plant as contributing the most to genetic diversity. Therefore, selection of accessions with a greater number of productive tillers and early maturity would be most suitable for a yield improvement programme. The study reveals the utility of African rice germplasm and its potential for use in the genetic improvement of indica rice varieties.
The design of high-energy Li-ion batteries (LIBs) by coupling a high-voltage LiNi0.5Mn1.5O4 (LNMO) cathode with a Li4Ti5O12 (LTO) anode ensures effective and safe energy storage. LTO–LNMO full-cells (FCs) differing in electrode grain size and in the presence of excess Mn3+ in the cathode were studied using micron-sized commercial LTO, nanostructured LTO donuts (LTOd), P4332 LNMO nanopowders and nanostructured Fd3m LNMO caterpillars (LNMOcplr). Among the studied FCs, LTOd–LNMOcplr exhibited a stable capacity of 69 mA h/g (1C rate), 99% coulombic efficiency and 87% capacity retention over 200 cycles of continuous charge–discharge. The superior electrochemical performance of the LTOd–LNMOcplr FC was due to its low charge-transfer resistance, which is attributed to the effect of grain size and the longer retention of Mn3+ in the electrodes. An effective and simple FC design incorporating both nanostructuring and in situ conductivity in electrode materials would aid in developing future high-performance LIBs.
In this study, we estimate the burden of foodborne illness (FBI) caused by five major pathogens among nondeployed US Army service members. The US Army is a unique population that is globally distributed and has its own food procurement system and a food protection system dedicated to the prevention of both unintentional and intentional contamination of food. To our knowledge, the burden of FBI caused by specific pathogens in the US Army population has not been determined. We used data from a 2015 US Army population survey, a 2015 US Army laboratory survey and data from FoodNet to create inputs for two model structures. Model type 1 scaled up case counts of Campylobacter jejuni, Shigella spp., non-typhoidal Salmonella enterica and non-O157 STEC ascertained from the Disease Reporting System internet database from 2010 to 2015. Model type 2 scaled down cases of self-reported acute gastrointestinal illness (AGI) to estimate the annual burden of norovirus illness. We estimate that these five pathogens caused 45 600 (5%–95% range, 30 300–64 000) annual illnesses among nondeployed active duty US Army service members. Of these pathogens, norovirus, Campylobacter jejuni and non-typhoidal Salmonella enterica were responsible for the most illness. There is a tremendous burden of AGI and FBI caused by five major pathogens among US Army soldiers, which can have a major impact on the readiness of the force. The US Army has a robust food protection program in place, but without a specific active FBI surveillance system across the Department of Defense, we will never be able to measure the effectiveness of modern, targeted interventions aimed at the reduction of specific foodborne pathogens.
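The "scale up" in model type 1 is a multiplier chain in the style of Scallan et al.: reported, laboratory-confirmed cases are divided by the probability that a true illness ends up as a confirmed report. A minimal sketch with placeholder probabilities, not the study's actual inputs:

```python
def scale_up(reported_cases: int,
             p_seek_care: float,
             p_stool_submitted: float,
             p_lab_tests: float,
             p_test_sensitivity: float) -> float:
    """Estimated true illnesses = reported cases divided by the probability
    that a true case becomes laboratory-confirmed and reported."""
    p_confirmed = (p_seek_care * p_stool_submitted
                   * p_lab_tests * p_test_sensitivity)
    return reported_cases / p_confirmed

# Placeholder probabilities (illustrative only):
est = scale_up(reported_cases=250, p_seek_care=0.20,
               p_stool_submitted=0.25, p_lab_tests=0.9,
               p_test_sensitivity=0.7)
print(round(est))  # 7937
```

Each probability would in practice come from the population and laboratory surveys described above, with uncertainty propagated by simulation rather than point estimates.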
Throughout history, acute gastrointestinal illness (AGI) has been a significant cause of morbidity and mortality among US service members. We estimated the magnitude, distribution, risk factors and care-seeking behaviour of AGI among active duty US Army service members using a web-based survey. The survey asked about sociodemographic characteristics, dining and food procurement history and any experience of diarrhoea in the past 30 days. If respondents reported diarrhoea, additional questions were asked about concurrent symptoms, duration of illness, medical care seeking and stool sample submission. Univariable and multivariable logistic regression were used to identify factors associated with AGI and factors associated with seeking care and submitting a stool sample. The 30-day prevalence of AGI was 18.5% (95% CI 16.66–20.25), and the incidence rate was 2.24 AGI episodes per person-year (95% CI 2.04–2.49). Risk factors included region of residence, eating at the dining facility and eating at other on-post establishments. Individuals with AGI missed 2.7–3.7 days of work, which costs approximately $847 451 629 in paid wages. The results indicate there are more than 1 million cases of AGI per year among US Army soldiers, which can have a major impact on readiness. We found that care-seeking behaviours for AGI differ between US Army service members and the general population. Army service members with AGI report seeking care and having a stool sample submitted less often, especially for severe (bloody) diarrhoea. Factors associated with seeking care included rank, experiencing respiratory symptoms (sore throat, cough), experiencing vomiting and missing work because of the illness. Factors associated with submitting a stool sample included experiencing more than five loose stools in 24 h and not experiencing respiratory symptoms. US Army laboratory-based surveillance under-estimates service members with both bloody and non-bloody diarrhoea.
To our knowledge, this is the first study to estimate the magnitude, distribution, risk factors and care-seeking behaviour of AGI among Army members. We determined that Army service members' care-seeking behaviours, AGI risk factors and stool sample submission rates differ from those of the general population, so when estimating the burden of AGI caused by specific foodborne pathogens using methods like those of Scallan et al. (2011), unique multipliers must be used for this subset of the population. The study confirms the importance of AGI in the active duty Army population and highlights opportunities for public health leaders to engage in simple strategies to better capture the impact of AGI, so that more modern intervention strategies can be implemented to reduce burden and indirectly improve operational readiness across the enterprise.
Clozapine treatment increases the risk of agranulocytosis, but findings on the epidemiology of agranulocytosis have been inconsistent. This meta-analysis examined the prevalence of agranulocytosis and related death in clozapine-treated patients.
A literature search in the international (PubMed, PsycINFO, and EMBASE) and Chinese (WanFang, Chinese National Knowledge Infrastructure, and Sinomed) databases was conducted. Prevalence estimates of agranulocytosis and related death in clozapine-treated patients were synthesized with the Comprehensive Meta-Analysis program using the random-effects model.
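Random-effects pooling of prevalence estimates of the kind described above is commonly done with the DerSimonian–Laird estimator on logit-transformed prevalences. A minimal sketch with hypothetical study inputs, not the 36 included studies (the paper used the Comprehensive Meta-Analysis program, not this code):

```python
import math

def dersimonian_laird(effects, variances):
    """DerSimonian–Laird random-effects pooling of per-study effects
    (e.g. logit-transformed prevalences) with known sampling variances."""
    w = [1.0 / v for v in variances]
    fixed = sum(wi * yi for wi, yi in zip(w, effects)) / sum(w)
    q = sum(wi * (yi - fixed) ** 2 for wi, yi in zip(w, effects))
    k = len(effects)
    c = sum(w) - sum(wi ** 2 for wi in w) / sum(w)
    tau2 = max(0.0, (q - (k - 1)) / c)  # between-study variance
    w_star = [1.0 / (v + tau2) for v in variances]
    pooled = sum(wi * yi for wi, yi in zip(w_star, effects)) / sum(w_star)
    se = math.sqrt(1.0 / sum(w_star))
    return pooled, se

def logit(p):
    return math.log(p / (1 - p))

def inv_logit(x):
    return 1 / (1 + math.exp(-x))

# Hypothetical per-study prevalences and sample sizes (illustrative only):
studies = [(0.004, 5000), (0.006, 2000), (0.003, 12000)]
ys = [logit(p) for p, n in studies]
vs = [1 / (p * n) + 1 / ((1 - p) * n) for p, n in studies]  # logit-scale variance
pooled, se = dersimonian_laird(ys, vs)
print(f"pooled prevalence = {inv_logit(pooled):.4f}")
```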
Thirty-six studies with 260 948 clozapine-treated patients published between 1984 and 2018 were included in the meta-analysis. The overall prevalences of agranulocytosis and of death caused by agranulocytosis were 0.4% (95% CI 0.3–0.6%) and 0.05% (95% CI 0.03–0.09%), respectively. The prevalence of agranulocytosis was moderated by sample size, study quality, year of publication and year of data collection.
The prevalence of clozapine-associated agranulocytosis is low. Agranulocytosis-related death appears rare.