Recurrent acute otitis media is common in children. The preferred treatment measures for recurrent acute otitis media have a mixed evidence base. This study sought to assess baseline practice across ENT departments in England.
Methods
A national telephone survey of healthcare staff was conducted. Every ENT centre in England was contacted. A telephone script was used to ask about antibiotic and grommet use and duration in recurrent acute otitis media cases.
Results
Ninety-six centres (74 per cent) provided complete information. Treatment of recurrent acute otitis media by ENT departments across England varied. First- and second-line antibiotic prophylaxis varied, with trimethoprim used in 33 centres and 29 centres offering no antibiotics. The timing of and criteria for grommet insertion also varied, although 87 centres (91 per cent) offered grommet surgery at some stage.
Conclusion
The treatments received by children in England for recurrent acute otitis media vary by centre; collaborative research in this area is advised.
The coronavirus disease 2019 (COVID-19) pandemic has led to significant strain on front-line healthcare workers.
Aims
In this multicentre study, we compared the psychological outcomes during the COVID-19 pandemic in various countries in the Asia-Pacific region and identified factors associated with adverse psychological outcomes.
Method
From 29 April to 4 June 2020, the study recruited healthcare workers from major healthcare institutions in five countries in the Asia-Pacific region. A self-administered survey was used to collect information on prior medical conditions, presence of symptoms, and scores on the Depression Anxiety Stress Scales and the Impact of Event Scale-Revised. The prevalence of depression, anxiety, stress and post-traumatic stress disorder (PTSD) relating to COVID-19 was compared, and multivariable logistic regression was used to identify independent factors associated with adverse psychological outcomes within each country.
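As a sketch of the kind of multivariable logistic regression described above — with simulated data and illustrative predictor names, not the study's actual survey variables — a minimal Newton–Raphson (iteratively reweighted least squares) fit produces the adjusted odds ratios such an analysis reports:

```python
import numpy as np

rng = np.random.default_rng(42)
n = 5000
# Illustrative binary predictors (names are assumptions, not the study's variables)
non_medical = rng.integers(0, 2, n)      # 1 = non-medically trained
symptoms = rng.integers(0, 2, n)         # 1 = physical symptoms present
prior_condition = rng.integers(0, 2, n)  # 1 = prior medical condition
X = np.column_stack([np.ones(n), non_medical, symptoms, prior_condition])

# Simulate an adverse-outcome indicator with known (made-up) effect sizes
true_beta = np.array([-1.5, 0.8, 1.0, 0.6])
p = 1 / (1 + np.exp(-X @ true_beta))
y = (rng.random(n) < p).astype(float)

# Fit logistic regression by Newton-Raphson: beta += (X'WX)^-1 X'(y - mu)
beta = np.zeros(4)
for _ in range(25):
    mu = 1 / (1 + np.exp(-X @ beta))
    grad = X.T @ (y - mu)                  # score (gradient of log-likelihood)
    W = mu * (1 - mu)                      # IRLS weights
    hess = X.T @ (X * W[:, None])          # observed information
    beta += np.linalg.solve(hess, grad)

odds_ratios = np.exp(beta[1:])  # adjusted odds ratios for the three predictors
print(np.round(odds_ratios, 2))
```

With all three simulated effects positive, each adjusted odds ratio comes out above 1, mirroring the direction of the predictors reported in the Results.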
Results
A total of 1146 participants from India, Indonesia, Singapore, Malaysia and Vietnam were studied. Despite having the lowest volume of cases, Vietnam displayed the highest prevalence of PTSD. In contrast, Singapore reported the highest case volume but had a lower prevalence of depression and anxiety. In the multivariable analysis, non-medically trained personnel, the presence of physical symptoms and the presence of prior medical conditions were independent predictors of adverse psychological outcomes across the participating countries.
Conclusions
This study highlights that the varied prevalence of psychological adversity among healthcare workers is independent of the burden of COVID-19 cases within each country. Early psychological interventions may benefit vulnerable groups of healthcare workers: those with physical symptoms, those with prior medical conditions and those who are not medically trained.
Diet has a major influence on the composition and metabolic output of the gut microbiome. Higher-protein diets are often recommended for older consumers; however, the effect of high-protein diets on the gut microbiota and faecal volatile organic compounds (VOC) of elderly participants is unknown. The purpose of the study was to establish if the faecal microbiota composition and VOC in older men are different after a diet containing the recommended dietary intake (RDA) of protein compared with a diet containing twice the RDA (2RDA). Healthy males (74⋅2 (sd 3⋅6) years; n 28) were randomised to consume the RDA of protein (0⋅8 g protein/kg body weight per d) or 2RDA, for 10 weeks. Dietary protein was provided via whole foods rather than supplementation or fortification. The diets were matched for dietary fibre from fruit and vegetables. Faecal samples were collected pre- and post-intervention for microbiota profiling by 16S ribosomal RNA amplicon sequencing and VOC analysis by head space/solid-phase microextraction/GC-MS. After correcting for multiple comparisons, no significant differences in the abundance of faecal microbiota or VOC associated with protein fermentation were evident between the RDA and 2RDA diets. Therefore, in the present study, a twofold difference in dietary protein intake did not alter gut microbiota or VOC indicative of altered protein fermentation.
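The abstract's key negative finding rests on correcting for multiple comparisons across many taxa and VOC. The study's exact correction procedure is not stated here; as an illustration, the following is a minimal sketch of the Benjamini–Hochberg false discovery rate step-up procedure, a correction commonly used in microbiome profiling, applied to made-up p-values:

```python
import numpy as np

def benjamini_hochberg(pvals, alpha=0.05):
    """Mark p-values significant under the Benjamini-Hochberg FDR procedure.

    Sorts the p-values, compares the k-th smallest against alpha * k / m,
    and rejects all hypotheses up to the largest k passing that threshold.
    """
    p = np.asarray(pvals)
    m = len(p)
    order = np.argsort(p)
    ranked = p[order]
    thresholds = alpha * np.arange(1, m + 1) / m
    below = ranked <= thresholds
    significant = np.zeros(m, dtype=bool)
    if below.any():
        cutoff = np.max(np.where(below)[0])      # largest passing rank
        significant[order[: cutoff + 1]] = True  # reject everything up to it
    return significant

# Illustrative p-values only -- not data from the study
pvals = [0.001, 0.008, 0.039, 0.041, 0.042, 0.06, 0.074, 0.205, 0.212, 0.216]
print(benjamini_hochberg(pvals))
```

Note how the raw p-values of 0.039-0.042 would pass an uncorrected 0.05 threshold but fail after correction — the kind of attenuation that can turn apparent taxon-level differences into the null result reported above.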
Background: Biallelic variants in POLR1C are associated with POLR3-related leukodystrophy (POLR3-HLD), or 4H leukodystrophy (Hypomyelination, Hypodontia, Hypogonadotropic Hypogonadism), and Treacher Collins syndrome (TCS). The clinical spectrum of POLR3-HLD caused by variants in this gene has not been described. Methods: A cross-sectional observational study involving 25 centers worldwide was conducted between 2016 and 2018. The clinical, radiologic and molecular features of 23 unreported and previously reported cases of POLR3-HLD caused by POLR1C variants were reviewed. Results: Most participants presented between birth and age 6 years with motor difficulties. Neurological deterioration was seen during childhood, suggesting a more severe phenotype than previously described. The dental, ocular and endocrine features often seen in POLR3-HLD were not invariably present. Five patients (22%) had a combination of hypomyelinating leukodystrophy and abnormal craniofacial development, including one individual with clear TCS features. Several cases did not exhibit all the typical radiologic characteristics of POLR3-HLD. A total of 29 different pathogenic variants in POLR1C were identified, including 13 new disease-causing variants. Conclusions: Based on the largest cohort of patients to date, these results suggest novel characteristics of POLR1C-related disorder, with a spectrum of clinical involvement characterized by hypomyelinating leukodystrophy with or without abnormal craniofacial development reminiscent of TCS.
Background: Continuous electroencephalographic (cEEG) monitoring is essential to diagnosing non-convulsive seizures (NCS), reported to occur in 7-46% of at-risk critically ill patients. However, cEEG is labour-intensive and, given the scarcity of resources at most centres, feasible in only selected patients. We aimed to evaluate the clinical utility of cEEG at our centre in order to optimize further cEEG allocation among critically ill patients. Methods: Using a clinical database, we identified critically ill children who underwent cEEG monitoring in 2016, 2017 and 2018. We abstracted underlying diagnoses, indication for cEEG monitoring, cEEG findings, and associated changes in management. Results: Over this three-year period, 928 cEEGs were performed. Among the 100 studies analyzed to date, the primary indications for monitoring were characterization of events of unclear etiology (32%), diagnosis of NCS (30%), and monitoring of therapy for seizures (17%). Seizures were captured in 31% of patients (22% subclinical only, 5% electroclinical only, 4% both), which resulted in a treatment change in 90% of cases. Non-epileptic events were captured in 26% of patients. Conclusions: cEEG yielded clinically meaningful information in 57% of cases, frequently resulting in management changes. Subgroup analyses by cEEG indication and ICU location will be presented.
The Global Muon Detector Network (GMDN) is composed of four ground cosmic ray detectors distributed around the Earth: Nagoya (Japan), Hobart (Australia), Sao Martinho da Serra (Brazil) and Kuwait City (Kuwait). The network has operated since March 2006 and has been upgraded a few times, increasing its detection area. Each detector is sensitive to muons produced by the interactions of ~50 GeV Galactic Cosmic Rays (GCR) with the Earth's atmosphere. At these energies, GCR are known to be affected by interplanetary disturbances in the vicinity of the Earth. Of special interest are the interplanetary counterparts of coronal mass ejections (ICMEs) and the shocks they drive, because these are known to be the main origins of geomagnetic storms. It has been observed that ICMEs produce changes in the cosmic ray gradient, which can be measured by GMDN observations. In terms of applications for space weather, some attempts have been made to use the GMDN for forecasting ICME arrival at the Earth with lead times of the order of a few hours. Scientific space weather studies benefit the most from the GMDN. As an example, studies have been able to determine ICME orientation at the Earth using the cosmic ray gradient. Such determinations are of crucial importance for estimating the southward interplanetary magnetic field, as well as ICME rotation.
Many medications administered to patients with schizophrenia possess anticholinergic properties. When aggregated, pharmacological treatments may result in a considerable anticholinergic burden. The extent to which anticholinergic burden has a deleterious effect on cognition and impairs the ability to participate in and benefit from psychosocial treatments is unknown.
Method
Seventy patients were followed for approximately 3 years. The MATRICS consensus cognitive battery (MCCB) was administered at baseline. Anticholinergic burden was measured with the Anticholinergic Cognitive Burden (ACB) scale. Ability to benefit from psychosocial programmes was measured using the DUNDRUM-3 Programme Completion Scale (D-3) at baseline and follow-up. Psychiatric symptoms were measured using the PANSS. Total antipsychotic dose was measured using chlorpromazine equivalents. Functioning was measured using the Social and Occupational Functioning Assessment Scale (SOFAS).
Results
Mediation analysis found that the influence of anticholinergic burden on the ability to participate in and benefit from psychosocial programmes was completely mediated by the MCCB. For every 1-unit increase on the ACB scale, DUNDRUM-3 change scores decreased by 0.27 points. This relationship appears specific to anticholinergic burden and not to total antipsychotic dose. Moreover, the mediation appears to be specific to cognition and not to psychopathology. Baseline functioning also acted as a mediator, but only when the MCCB was not controlled for.
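A minimal sketch of how such a mediation analysis can be run with ordinary least squares (the product-of-coefficients approach) follows; all values are simulated, and the variable names only mirror the study's measures — a burden score (ACB), a cognition mediator (MCCB) and a programme-benefit outcome (DUNDRUM-3 change):

```python
import numpy as np

rng = np.random.default_rng(1)
n = 70  # matches the study's sample size; the data are otherwise synthetic
acb = rng.normal(3, 1.5, n)                   # anticholinergic burden score
mccb = -0.5 * acb + rng.normal(0, 1, n)       # cognition, lowered by burden
d3_change = 0.5 * mccb + rng.normal(0, 1, n)  # programme benefit, via cognition only

def ols_slope(x, y):
    """Slope from simple least-squares regression of y on x."""
    xc = x - x.mean()
    return (xc @ (y - y.mean())) / (xc @ xc)

c_total = ols_slope(acb, d3_change)  # total effect: burden -> benefit
a = ols_slope(acb, mccb)             # path a: burden -> cognition
# Path b: cognition -> benefit, controlling for burden (multiple regression)
X = np.column_stack([np.ones(n), acb, mccb])
b = np.linalg.lstsq(X, d3_change, rcond=None)[0][2]
indirect = a * b                     # mediated (indirect) effect
print(round(c_total, 2), round(indirect, 2))
```

Under complete mediation, as reported above, the indirect effect accounts for essentially all of the total effect, and the direct path from burden to benefit shrinks toward zero once the mediator is in the model.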
Conclusions
Anticholinergic burden has a significant impact on patients’ ability to participate in and benefit from psychosocial treatment programmes. Physicians need to be mindful of the cumulative effect that medications can have on patient cognition, functional capacity and ability to benefit from psychosocial treatments.
There are few data on excess direct and indirect costs of diabetes in India and limited data on rural costs of diabetes. We aimed to further explore these aspects of diabetes burdens using a clinic-based, comparative cost-of-illness study.
Methods
Persons with diabetes (n = 606) were recruited from government, private, and rural clinics and compared to persons without diabetes matched for age, sex, and socioeconomic status (n = 356). We used interviewer-administered questionnaires to estimate direct costs (outpatient, inpatient, medication, laboratory, and procedures) and indirect costs [absence from (absenteeism) or low productivity at (presenteeism) work]. Excess costs were calculated as the difference between costs reported by persons with and without diabetes and compared across settings. Regression analyses were used to separately identify factors associated with total direct and indirect costs.
Results
Annual excess direct costs were highest amongst private clinic attendees (INR 19 552, US$425) and lowest amongst government clinic attendees (INR 1204, US$26.17). Private clinic attendees had the lowest excess absenteeism (2.36 work days/year) and highest presenteeism (0.06 work days/year) due to diabetes. Government clinic attendees reported the highest absenteeism (7.48 work days/year) and lowest presenteeism (−0.31 work days/year). Ten additional years of diabetes duration was associated with 11% higher direct costs (p < 0.001). Older age (p = 0.02) and longer duration of diabetes (p < 0.001) were associated with higher total lost work days.
Conclusions
Excess health expenditures and lost productivity amongst individuals with diabetes are substantial and different across care settings. Innovative solutions are needed to cope with diabetes and its associated cost burdens in India.
This study aimed to measure changes in disease-specific quality of life in children following tonsillectomy or adenotonsillectomy.
Methods:
A multicentre prospective cohort study was performed involving seven ENT departments in England. A total of 276 children entered the study over a 2-month period: 107 underwent tonsillectomy and 128 adenotonsillectomy. Forty-one children referred with throat problems initially managed by watchful waiting were also recruited. The follow-up period was 12 months. Outcome measures were the T14, parental impressions of their child's quality of life and the number of days absent from school.
Results:
One-year follow-up data were obtained from 150 patients (52 per cent). The mean baseline T14 score in the non-surgical group was significantly lower (T14 = 23) than in the tonsillectomy group (T14 = 31) or the adenotonsillectomy group (T14 = 35; p < 0.001). There was a significant improvement in the T14 scores of responders in all groups at follow up. The effect size was 1.3 standard deviations (SD) for the non-surgical group, 2.1 SD for the tonsillectomy group and 1.9 SD for the adenotonsillectomy group. Between-group differences did not reach statistical significance. A third of children in the non-surgical group underwent surgery during the follow-up period.
Conclusion:
Children who underwent surgical intervention achieved a significant improvement in disease-specific quality of life. Less severely affected children were managed conservatively and also improved over 12 months, but 1 in 3 crossed over to surgical intervention.
This paper addresses the problem of designing low-order and linear robust feedback controllers that provide a priori guarantees with respect to stability and performance when applied to a fluid flow. This is challenging, since whilst many flows are governed by a set of nonlinear, partial differential–algebraic equations (the Navier–Stokes equations), the majority of established control system design assumes models of much greater simplicity, in that they are: firstly, linear; secondly, described by ordinary differential equations (ODEs); and thirdly, finite-dimensional. With this in mind, we present a set of techniques that enables the disparity between such models and the underlying flow system to be quantified in a fashion that informs the subsequent design of feedback flow controllers, specifically those based on the $\mathscr{H}_{\infty }$ loop-shaping approach. Highlights include the application of a model refinement technique as a means of obtaining low-order models with an associated bound that quantifies the closed-loop degradation incurred by using such finite-dimensional approximations of the underlying flow. In addition, we demonstrate how the influence of the nonlinearity of the flow can be attenuated by a linear feedback controller that employs high loop gain over a select frequency range, and offer an explanation for this in terms of Landahl’s theory of sheared turbulence. To illustrate the application of these techniques, an $\mathscr{H}_{\infty }$ loop-shaping controller is designed and applied to the problem of reducing perturbation wall shear stress in plane channel flow. Direct numerical simulation (DNS) results demonstrate robust attenuation of the perturbation shear stresses across a wide range of Reynolds numbers with a single linear controller.
Longitudinal, patient-level data on resource use and costs after an ischemic stroke are lacking in Canada. The objectives of this analysis were to calculate costs for the first year post-stroke and determine the impact of disability on costs.
Methodology:
The Economic Burden of Ischemic Stroke (BURST) Study was a one-year prospective study with a cohort of ischemic stroke patients recruited at 12 Canadian stroke centres. Clinical history, disability, health preference and resource utilization information was collected at discharge, three months, six months and one year. Resources included direct medical costs (2009 CAN$) such as emergency services, hospitalizations, rehabilitation, physician services, diagnostics, medications, allied health professional services, homecare, medical/assistive devices, changes to residence and paid caregivers, as well as indirect costs. Results were stratified by disability measured at discharge using the modified Rankin Score (mRS): non-disabling stroke (mRS 0-2) and disabling stroke (mRS 3-5).
Results:
We enrolled 232 ischemic stroke patients (age 69.4 ± 15.4 years; 51.3% male) and 113 (48.7%) were disabled at hospital discharge. The average annual cost was $74,353; $107,883 for disabling strokes and $48,339 for non-disabling strokes.
Conclusions:
An average annual cost for ischemic stroke was calculated, in which a disabling stroke was associated with a two-fold increase in costs compared to a non-disabling stroke. Costs during the hospitalization-to-three-months phase were the highest contributor to the annual cost. A “back of the envelope” calculation using 38,000 stroke admissions and the average annual cost yields $2.8 billion as the burden of ischemic stroke.
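The quoted “back of the envelope” figure follows directly from the numbers reported above:

```python
# Reproduce the burden estimate from the abstract's own figures
admissions = 38_000        # annual ischemic stroke admissions, as quoted
avg_annual_cost = 74_353   # average annual cost per patient, CAN$
burden = admissions * avg_annual_cost
print(f"${burden / 1e9:.1f} billion")  # prints "$2.8 billion"
```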
Plant genetic resources are raw materials, and their use in breeding is one of the most sustainable ways to conserve biodiversity. ICRISAT holds over 120,000 accessions of its five mandate crops and six small millets. The management and utilization of such large diversity are among the greatest challenges facing germplasm curators and crop breeders. New sources of variation have been discovered using the core and mini-core collections developed at ICRISAT. About 1.4 million seed samples have been distributed; some accessions with specific attributes have been requested more frequently. Advances in genomics have enabled researchers to dissect population structure and diversity and to mine allelic variations associated with agronomically beneficial traits. Genome-wide association mapping in sorghum has revealed significant marker–trait associations for many agronomically beneficial traits. Wild relatives harbour genes for resistance to diseases and insect pests. Resistance to pod borer in chickpea and pigeonpea and resistance to rust and late leaf spot in groundnut have been successfully introgressed into a cultivated genetic background. Synthetics in groundnut are available to broaden the cultigen's gene pool. ICRISAT has notified the release of 266 varieties/cultivars, germplasm lines and elite genetic stocks with unique traits, some of which have had a significant impact on breeding programs. Seventy-five germplasm lines have been directly released for cultivation in 39 countries.
Post-traumatic stress disorder (PTSD) in response to the World Trade Center (WTC) disaster of 11 September 2001 (9/11) is one of the most prevalent and persistent health conditions among both professional (e.g. police) and non-traditional (e.g. construction worker) WTC responders, even several years after 9/11. However, little is known about the dimensionality and natural course of WTC-related PTSD symptomatology in these populations.
Method
Data were analysed from 10 835 WTC responders, including 4035 police and 6800 non-traditional responders who were evaluated as part of the WTC Health Program, a clinic network in the New York area established by the National Institute for Occupational Safety and Health. Confirmatory factor analyses (CFAs) were used to evaluate structural models of PTSD symptom dimensionality; and autoregressive cross-lagged (ARCL) panel regressions were used to examine the prospective interrelationships among PTSD symptom clusters at 3, 6 and 8 years after 9/11.
Results
CFAs suggested that five stable symptom clusters best represent PTSD symptom dimensionality in both police and non-traditional WTC responders. This five-factor model was also invariant over time with respect to factor loadings and structural parameters, thereby demonstrating its longitudinal stability. ARCL panel regression analyses revealed that hyperarousal symptoms had a prominent role in predicting other symptom clusters of PTSD, with anxious arousal symptoms primarily driving re-experiencing symptoms, and dysphoric arousal symptoms primarily driving emotional numbing symptoms over time.
Conclusions
Results of this study suggest that disaster-related PTSD symptomatology in WTC responders is best represented by five symptom dimensions. Anxious arousal symptoms, which are characterized by hypervigilance and exaggerated startle, may primarily drive re-experiencing symptoms, while dysphoric arousal symptoms, which are characterized by sleep disturbance, irritability/anger and concentration difficulties, may primarily drive emotional numbing symptoms over time. These results underscore the importance of assessment, monitoring and early intervention of hyperarousal symptoms in WTC and other disaster responders.
Prevention of the heat-induced aggregation of β-lactoglobulin (β-Lg) would improve the heat stability of whey proteins. The effects of lipoic acid (LA, or thioctic acid), in both its oxidised and reduced form (dihydrolipoic acid, DHLA), on heat-induced unfolding and aggregation of β-Lg were investigated. LA/DHLA was added to native β-Lg and the mixture was heated at 70, 75, 80 or 85 °C for up to 30 min at pH 6·8. The samples were analysed by polyacrylamide gel electrophoresis (PAGE) and size-exclusion HPLC (SE-HPLC). LA was not as effective as DHLA in reducing the formation of aggregates of heated β-Lg. Heating β-Lg with DHLA resulted in formation of more β-Lg monomers (due to dissociation of native dimers) and significantly less β-Lg aggregates, compared with heating β-Lg alone. The aggregates formed in the presence of DHLA were both covalently linked, via disulphide bonds, and non-covalently (hydrophobically) linked, but the amount of covalently linked aggregates was much less than when β-Lg was heated alone. The results suggest that DHLA was able to partially trap the reactive β-Lg monomer containing a free sulphydryl (−SH) group, by forming a ‘modified monomer’, and to prevent some sulphydryl−sulphydryl and sulphydryl−disulphide interactions that lead to the formation of covalently linked protein aggregates. The effects of DHLA were similar to those of N-ethylmaleimide (NEM) and dithio(bis)-p-nitrobenzoate (DTNB). However, the advantage of using DHLA over NEM and DTNB to lessen aggregation of β-Lg is that it is a food-grade compound which occurs naturally in milk.
Background: Disaster preparations usually focus on preventing injury and infectious disease. However, people with chronic disease and related conditions (CDRCs), including obstetric/gynecological conditions, may be vulnerable to disruptions caused by disasters.
Methods: We used surveillance data collected after Hurricane Katrina to characterize the burden of visits for CDRCs at emergency treatment facilities (eg, hospitals, disaster medical assistance teams, military aid stations). In 6 parishes in and around New Orleans, health care providers at 29 emergency treatment facilities completed a standardized questionnaire for injury and illness surveillance from September 8 through October 22, 2005.
Results: Of 21,673 health care visits, 58.0% were for illness (24.3% CDRCs, 75.7% non-CDRCs), 29.1% for injury, 7.2% for medication refills, and 5.7% for routine or follow-up care. The proportion of visits for CDRCs increased with age. Among men presenting with CDRCs, the most common illnesses were cardiovascular disease (36.8%), chronic lower-respiratory disease (12.3%), and diabetes/glucose abnormalities (7.7%). Among women presenting with CDRCs, the most common were cardiovascular disease (29.2%), obstetric/gynecological conditions (18.2%), and chronic lower-respiratory disease (12.0%). Subsequent hospitalization occurred among 28.7% of people presenting with CDRCs versus 10.9% of those with non-CDRCs and 3.8% of those with injury.
Conclusions: Our data illustrate the importance of including CDRCs as a part of emergency response planning. (Disaster Med Public Health Preparedness. 2008;2:27–32)
Longitudinal symptoms of post-traumatic stress disorder (PTSD) are often characterized by heterogeneous trajectories, which may have unique pre-, peri- and post-trauma risk and protective factors. To date, however, no study has evaluated the nature and determinants of predominant trajectories of PTSD symptoms in World Trade Center (WTC) responders.
Method
A total of 10 835 WTC responders, including 4035 professional police responders and 6800 non-traditional responders (e.g. construction workers) who participated in the WTC Health Program (WTC-HP), were evaluated an average of 3, 6 and 8 years after the WTC attacks.
Results
Among police responders, longitudinal PTSD symptoms were best characterized by four classes, with the majority (77.8%) in a resistant/resilient trajectory and the remainder exhibiting chronic (5.3%), recovering (8.4%) or delayed-onset (8.5%) symptom trajectories. Among non-traditional responders, a six-class solution was optimal, with fewer responders in a resistant/resilient trajectory (58.0%) and the remainder exhibiting recovering (12.3%), severe chronic (9.5%), subsyndromal increasing (7.3%), delayed-onset (6.7%) and moderate chronic (6.2%) trajectories. Prior psychiatric history, Hispanic ethnicity, severity of WTC exposure and WTC-related medical conditions were most strongly associated with symptomatic trajectories of PTSD symptoms in both groups of responders, whereas greater education and family and work support while working at the WTC site were protective against several of these trajectories.
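The trajectory classes above come from latent trajectory (mixture) modelling; as a simplified stand-in, k-means clustering of simulated three-wave symptom scores recovers analogous resilient, chronic, recovering and delayed-onset groups. All numbers below are illustrative, not study data:

```python
import numpy as np
from sklearn.cluster import KMeans

rng = np.random.default_rng(7)
# Mean symptom severity at ~3, 6 and 8 years post-9/11 for four made-up classes
class_means = [
    (5, 5, 5),     # resistant/resilient: low and flat
    (40, 42, 41),  # chronic: high and flat
    (35, 20, 10),  # recovering: high, then declining
    (8, 20, 35),   # delayed-onset: low, then rising
]
class_sizes = [300, 40, 40, 40]  # resilient trajectory as the majority class

# Simulate one 3-wave score vector per responder, with noise around class means
data = np.vstack([
    rng.normal(means, 3, size=(size, 3))
    for means, size in zip(class_means, class_sizes)
])

# Cluster the trajectories; with well-separated classes, k-means recovers them
labels = KMeans(n_clusters=4, n_init=10, random_state=0).fit_predict(data)
counts = np.bincount(labels)
print(sorted(counts.tolist(), reverse=True))
```

The dominant recovered cluster corresponds to the resilient class, echoing the study's finding that the majority of responders follow a resistant/resilient trajectory; the study's actual mixture models additionally estimate class probabilities and allow for covariates, which this sketch does not.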
Conclusions
Trajectories of PTSD symptoms in WTC responders are heterogeneous and associated uniquely with pre-, peri- and post-trauma risk and protective factors. Police responders were more likely than non-traditional responders to exhibit a resistant/resilient trajectory. These results underscore the importance of prevention, screening and treatment efforts that target high-risk disaster responders, particularly those with prior psychiatric history, high levels of trauma exposure and work-related medical morbidities.