Healthcare personnel with severe acute respiratory syndrome coronavirus 2 (SARS-CoV-2) infection were interviewed to describe activities and practices in and outside the workplace. Among 2,625 healthcare personnel, workplace-related factors that may increase infection risk were more common among nursing-home personnel than hospital personnel, whereas selected factors outside the workplace were more common among hospital personnel.
The work of Ed Zigler spans decades of research, all dedicated to using science to improve the lives of children facing diverse challenges. The focus of this article is on one of Zigler's numerous lines of work: advocating for the practice of mental age (MA) matching in empirical research, wherein groups of individuals are matched on the basis of developmental level rather than chronological age. While MA matching represented a paradigm shift that provided the seeds from which the developmental approach to developmental disability sprouted, it is not without limits. Here, we examine and test the underlying assumption of linearity inherent in MA matching using three commonly used IQ measures. Results reveal practical constraints on the use of MA matching; we hope this refinement informs future clinical and empirical practice, furthering Zigler's legacy of continued commitment to compassionate, meaningful, and rigorous science in the service of children.
Vitamin D deficiency is associated with an increased risk of falls and fractures. Assuming this association is causal, we aimed to identify the number and proportion of hospitalisations for falls and hip fractures attributable to vitamin D deficiency (25-hydroxyvitamin D (25(OH)D) <50 nmol/l) in Australians aged ≥65 years. We used 25(OH)D data from the 2011/12 Australian Health Survey and relative risks from published meta-analyses to calculate population-attributable fractions for falls and hip fracture. We applied these to data published by the Australian Institute of Health and Welfare to calculate the number of events each year attributable to vitamin D deficiency. In men and women combined, 8·3 % of hospitalisations for falls (7991 events) and almost 8 % of hospitalisations for hip fractures (1315 events) were attributable to vitamin D deficiency. These findings suggest that, even in a sunny country such as Australia, vitamin D deficiency contributes to a considerable number of hospitalisations as a consequence of falls and for treatment of hip fracture in older Australians; in countries where the prevalence of vitamin D deficiency is higher, the impact will be even greater. It is important to mitigate vitamin D deficiency, but whether this should occur through supplementation or increased sun exposure needs consideration of the benefits, harms, practicalities and costs of both approaches.
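For readers unfamiliar with the calculation this abstract describes, the population-attributable fraction for a dichotomised exposure is commonly computed with Levin's formula; a minimal sketch, where p is the prevalence of deficiency and RR the relative risk from meta-analysis (placeholders here, not the study's exact inputs):

```latex
\mathrm{PAF} = \frac{p\,(\mathrm{RR}-1)}{1 + p\,(\mathrm{RR}-1)},
\qquad
\text{attributable events} = \mathrm{PAF} \times \text{total annual events}
```

Applied to national hospitalisation counts, a PAF of 0·083 for falls corresponds to the roughly 7991 attributable events reported above.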
Chronic psychotic disorders (CPDs) occur worldwide and cause significant burden. Poor medication adherence is pervasive, but has not been well studied in sub-Saharan Africa.
This cross-sectional survey of 100 poorly adherent Tanzanian patients with CPD characterised clinical features associated with poor adherence.
Descriptive statistics characterised demographic and clinical variables, including barriers to adherence, adherence behaviours and attitudes, and psychiatric symptoms. Measures included the Tablets Routine Questionnaire, the Drug Attitudes Inventory, the Brief Psychiatric Rating Scale, the Clinical Global Impressions scale, the Alcohol Use Disorders Identification Test and the Alcohol, Smoking and Substance Involvement Screening Test. The relationship between adherence and other clinical variables was evaluated.
Mean age was 35.7 years (s.d. 8.8), 61% were male and 80% had schizophrenia, with a mean age at onset of 22.4 (s.d. 7.6) years. Mean proportion of missed CPD medication was 64%. One in ten had alcohol dependence. Most individuals had multiple adherence barriers. Most clinical variables were not significantly associated with the Tablets Routine Questionnaire; however, in-patients with CPD were more likely to have worse adherence (P ≤ 0.01), as were individuals with worse medication attitudes (Drug Attitudes Inventory, P < 0.01), higher CPD symptom severity levels (Brief Psychiatric Rating Scale, P < 0.001) and higher-risk use of alcohol (Alcohol Use Disorders Identification Test, P < 0.001).
Poorly adherent patients had multiple barriers to adherence, including poor attitudes toward medication and treatment, high illness acuity and substance use comorbidity. Treatments need to address adherence barriers, and consider family supports and challenges from an intergenerational perspective.
This is an epidemiological study of carbapenem-resistant Enterobacteriaceae (CRE) in Veterans Affairs medical centers (VAMCs). In 2017, almost 75% of VAMCs had at least 1 CRE case. We observed substantial geographic variability, with more cases in urban, complex facilities. This supports the benefit of tailoring infection control strategies to facility characteristics.
Background: Automated testing instruments (ATIs) are commonly used by clinical microbiology laboratories to perform antimicrobial susceptibility testing (AST), whereas public health laboratories may use established reference methods such as broth microdilution (BMD). We investigated discrepancies in carbapenem minimum inhibitory concentrations (MICs) among Enterobacteriaceae tested by clinical laboratory ATIs and by reference BMD at the CDC. Methods: During 2016–2018, we conducted laboratory- and population-based surveillance for carbapenem-resistant Enterobacteriaceae (CRE) through the CDC Emerging Infections Program (EIP) sites (10 sites by 2018). We defined an incident case as the first isolation of Enterobacter spp. (E. cloacae complex or E. aerogenes), Escherichia coli, Klebsiella pneumoniae, K. oxytoca, or K. variicola resistant to doripenem, ertapenem, imipenem, or meropenem from normally sterile sites or urine, identified from a resident of the EIP catchment area in a 30-day period. Cases had isolates that were determined to be carbapenem-resistant by clinical laboratory ATI MICs (MicroScan, BD Phoenix, or VITEK 2) or by other methods, using current Clinical and Laboratory Standards Institute (CLSI) criteria. A convenience sample of these isolates was tested by reference BMD at the CDC according to CLSI guidelines. Results: Overall, 1,787 isolates from 112 clinical laboratories were tested by BMD at the CDC. Of these, clinical laboratory ATI MIC results were available for 1,638 (91.7%); 855 (52.2%) from 71 clinical laboratories did not confirm as CRE at the CDC. Nonconfirming isolates had been tested on a MicroScan (235 of 462; 50.9%), BD Phoenix (249 of 411; 60.6%), or VITEK 2 (371 of 765; 48.5%). Lack of confirmation was most common among E. coli (62.2% of E. coli isolates tested) and Enterobacter spp. (61.4% of Enterobacter isolates tested) (Fig. 1A), and among isolates testing resistant to ertapenem by the clinical laboratory ATI (52.1%; Fig. 1B). Of the 1,388 isolates resistant to ertapenem in the clinical laboratory, 1,006 (72.5%) were resistant only to ertapenem. Of the 855 nonconfirming isolates, 638 (74.6%) were resistant only to ertapenem based on clinical laboratory ATI MICs. Conclusions: Nonconfirming isolates were widespread across laboratories and ATIs. Lack of confirmation was most common among E. coli and Enterobacter spp. Among nonconfirming isolates, most were resistant only to ertapenem. These findings may suggest that ATIs overcall resistance to ertapenem or that isolate transport and storage conditions affect ertapenem resistance. Further investigation into this lack of confirmation is needed, and CRE case identification in public health surveillance may need to account for this phenomenon.
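To make the case definition concrete, here is a minimal sketch of MIC-based CRE classification and ATI-versus-BMD comparison. The breakpoints below are illustrative resistant breakpoints for carbapenems against Enterobacterales and should be confirmed against the current CLSI M100 edition before any real use:

```python
# Sketch: classify an isolate as carbapenem-resistant from its MICs (ug/mL)
# and flag isolates called CRE by the clinical ATI but not by reference BMD.
RESISTANT_BREAKPOINT = {  # illustrative values; verify against current CLSI M100
    "ertapenem": 2.0, "imipenem": 4.0, "meropenem": 4.0, "doripenem": 4.0,
}

def is_cre(mics: dict) -> bool:
    """True if any carbapenem MIC meets or exceeds its resistant breakpoint."""
    return any(mics.get(drug, 0.0) >= bp for drug, bp in RESISTANT_BREAKPOINT.items())

def nonconfirming(ati_mics: dict, bmd_mics: dict) -> bool:
    """Resistant by the clinical laboratory ATI, but not by reference BMD."""
    return is_cre(ati_mics) and not is_cre(bmd_mics)

# Example: ertapenem-only resistance by ATI that fails to confirm by BMD
ati = {"ertapenem": 2.0, "meropenem": 0.5}
bmd = {"ertapenem": 0.5, "meropenem": 0.25}
print(nonconfirming(ati, bmd))  # True
```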
Vanadium dioxide (VO2) has been widely studied due to its metal-insulator phase transition at 68 °C, below which it is a semiconducting monoclinic phase, P21/c, and above which it is a metallic tetragonal phase, P42/mnm. Substituting vanadium with transition metals allows the transition temperature to be tuned. An accelerated microwave-assisted synthesis of VO2 and 5d tungsten-substituted VO2, presented herein, decreased synthesis time by three orders of magnitude while maintaining phase purity, particle size, and transition character. The tungsten substitution level was determined using inductively coupled plasma-optical emission spectroscopy. Differential scanning calorimetry, superconducting quantum interference device measurements, and in situ heating and cooling experiments monitored through synchrotron X-ray diffraction (XRD) confirmed that the transition temperature decreased with increased tungsten substitution. Scanning electron microscopy images, analyzed using the line-intercept method, gave an average particle size of 3–5 μm. Average-structure and local-structure phase purity were determined through Rietveld analysis of synchrotron XRD data and least-squares refinement of pair distribution function data.
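The line-intercept measurement mentioned above reduces to simple arithmetic: the mean particle size is the total test-line length divided by the number of particle boundaries the line crosses. A minimal sketch with placeholder values (not the study's micrograph data):

```python
def mean_intercept_size(total_line_length_um: float, n_intercepts: int) -> float:
    """Line-intercept estimate: mean size = test-line length / boundary crossings."""
    return total_line_length_um / n_intercepts

# Placeholder: a 400 um test line crossing 100 particle boundaries
print(mean_intercept_size(400.0, 100))  # 4.0 um, consistent with the 3-5 um reported
```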
CHDs can be complicated by renal injury, which worsens morbidity and mortality. Urinary neutrophil gelatinase-associated lipocalin, a sensitive and specific biomarker of renal tubular injury, has not been studied in children with uncorrected CHDs. This study evaluated renal injury in children with uncorrected CHDs using this biomarker.
The patients were children with uncorrected CHDs with a significant shunt confirmed on echocardiogram and a normal renal ultrasound scan, attending the paediatric cardiology clinic of a tertiary hospital. The controls were age-matched healthy children recruited from general practice clinics. Information on bio-data and socio-demographics was collected, and urine was obtained for measurement of urinary neutrophil gelatinase-associated lipocalin levels.
A total of 65 children with uncorrected CHDs aged 2 to 204 months were recruited. Thirty-one (47.7%) were male, while 36 (55.4%) had acyanotic CHDs. The median urinary neutrophil gelatinase-associated lipocalin level in patients (26.10 ng/ml) was significantly higher than in controls (16.90 ng/ml) (U = 1624.50, p = 0.023). The median urinary neutrophil gelatinase-associated lipocalin levels in patients with cyanotic and acyanotic CHDs were 30.2 ng/ml and 22.60 ng/ml, respectively (Mann–Whitney U = 368.50, p = 0.116). The prevalence of renal injury, using the 95th percentile cut-off value of urinary neutrophil gelatinase-associated lipocalin, was 16.9%. The median age of patients with renal injury was 16 (4–44) months.
Children with uncorrected CHDs have renal injury detectable as early as infancy. The use of urinary neutrophil gelatinase-associated lipocalin for early detection of renal injury in these children may enable early intervention, helping to prevent morbidity and reduce mortality.
Dialysis patients may not have access to conventional renal replacement therapy (RRT) following disasters. We hypothesized that improvised renal replacement therapy (ImpRRT) would be comparable to continuous renal replacement therapy (CRRT) in a porcine acute kidney injury model.
Following bilateral nephrectomies and 2 hours of caudal aortic occlusion, 12 pigs were randomized to 4 hours of ImpRRT or CRRT. In the ImpRRT group, blood was circulated through a dialysis filter using a rapid infuser to collect the ultrafiltrate. Improvised replacement fluid, made with stock solutions, was infused pre-pump. In the CRRT group, commercial replacement fluid was used. During RRT, animals received isotonic crystalloids and norepinephrine.
There were no differences in serum creatinine, calcium, magnesium, or phosphorus concentrations. While there was a difference between groups in serum potassium concentration over time (P < 0.001), significance was lost in pairwise comparisons at specific time points. Replacement fluid and ultrafiltrate flows did not differ between groups. There were no differences in lactate concentration, isotonic crystalloid requirement, or norepinephrine doses. No difference was found in electrolyte concentrations between the commercial and improvised replacement solutions.
The ImpRRT system achieved similar performance to CRRT and may represent a potential option for temporary RRT following disasters.
This study aimed to evaluate risk factors associated with shedding of pathogenic Leptospira species in urine at animal and herd levels. In total, 200 dairy farms were randomly selected from the DairyNZ database. Urine samples were taken from 20 lactating, clinically normal cows in each herd between January and April 2016 and tested by real-time polymerase chain reaction (PCR) using gyrB as the target gene. Overall, 26.5% of 200 farms had at least one PCR-positive cow and 2.4% of 4000 cows were shedding Leptospira in the urine. Information about risk factors at cow and farm level was collected via face-to-face questionnaire interviews with farm owners and managers. Animals on all but one farm had been vaccinated against Hardjo and Pomona, and cows on 54 of 200 (27%) farms had also been vaccinated against Copenhageni in at least one age group (calves, heifers and cows). Associations found to be statistically significant in univariate analysis (at P < 0.2) were assessed by multivariable logistic regression. Factors associated with shedding included cattle age (odds ratio (OR) 0.82, 95% confidence interval (CI) 0.71–0.95), keeping sheep (OR 5.57, 95% CI 1.46–21.25) or dogs (OR 1.45, 95% CI 1.07–1.97) and managing milking cows in a single as opposed to multiple groups (OR 0.45, 95% CI 0.20–0.99). We conclude that younger cattle were more likely to be shedding Leptospira than older cattle and that the presence of sheep and dogs was associated with an increased risk of shedding in cows. Larger herds were at higher risk of having Leptospira shedders. However, none of the environmental risk factors that were assessed (e.g. access to standing water, drinking-water source), wildlife abundance on-farm, or pasture was associated with shedding, possibly due to low statistical power, given the low overall shedding rate.
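As an illustration of the two-stage analysis described above (a liberal univariate screen at P < 0.2 followed by multivariable logistic regression), a minimal sketch in Python; the file name and column names are hypothetical stand-ins for the questionnaire variables, assumed to be numeric or 0/1-coded:

```python
# Sketch: univariate screen at P < 0.2, then a multivariable logistic model
# on the retained risk factors. Requires pandas, numpy and statsmodels.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

df = pd.read_csv("herd_survey.csv")  # hypothetical file: one row per sampled cow

candidates = ["age_years", "sheep_on_farm", "dogs_on_farm", "single_milking_group"]
retained = []
for var in candidates:
    m = smf.logit(f"pcr_positive ~ {var}", data=df).fit(disp=False)
    if m.pvalues[var] < 0.2:  # liberal screen keeps borderline factors
        retained.append(var)

final = smf.logit("pcr_positive ~ " + " + ".join(retained), data=df).fit(disp=False)
print(np.exp(final.params))  # exponentiated coefficients = odds ratios
```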
There is a requirement in some beef markets to slaughter bulls at under 16 months of age. This requires high levels of concentrate feeding. Increasing the slaughter age of bulls to 19 months facilitates the inclusion of a grazing period, thereby decreasing the cost of production. Recent data indicate few quality differences in longissimus thoracis (LT) muscle between conventionally reared 16-month-old bulls and 19-month-old bulls that had a grazing period prior to finishing on concentrates. The aim of the present study was to extend this observation to additional commercially important muscles/cuts. The production systems selected were concentrates offered ad libitum with slaughter at under 16 months of age (16-C) or at 19 months of age (19-CC), to examine the effect of age per se, and the cheaper alternative for 19-month-old bulls described above (19-GC). The results indicate that muscles from 19-CC were redder and had more intramuscular fat and higher cook loss than those from 16-C. No differences in objective muscle texture or sensory texture and acceptability were found between treatments. The expected differences in composition and quality between the muscles were generally consistent across the production systems examined. Therefore, for the type of animal and range of ages investigated, the effect of the production system on LT quality was generally representative of the effect on the other muscles analysed. In addition, the data do not support the under-16-month age restriction, based on meat acceptability, in commercial suckler bull production.
Acute change in mental status (ACMS), defined by the Confusion Assessment Method, is used to identify infections in nursing home residents. A medical record review revealed that none of 15,276 residents had an ACMS documented. Using the revised McGeer criteria with a possible ACMS definition, we identified 296 residents meeting the definition and 21 additional infections. The use of a possible ACMS definition should be considered for retrospective nursing home infection surveillance.
Pregabalin is indicated for the treatment of generalised anxiety disorder (GAD) in adults in Europe. The efficacy and safety of pregabalin for the treatment of adults and elderly patients with GAD have been demonstrated in 6 of 7 short-term clinical trials of 4 to 8 weeks' duration.
To characterise the long-term efficacy and safety of pregabalin in subjects with GAD.
Subjects were randomised to double-blind treatment with high-dose pregabalin (450-600 mg/d), low-dose pregabalin (150-300 mg/d), or lorazepam (3-4 mg/d) for 3 months. Treatment was extended with drug or blinded placebo for a further 3 months.
At 3 months, mean change from baseline in Hamilton Anxiety Rating Scale (HAM-A) score for high- and low-dose pregabalin and for lorazepam ranged from -16.0 to -17.4. Mean change from baseline in Clinical Global Impression-Severity (CGI-S) score ranged from -2.1 to -2.3, and mean CGI-Improvement (CGI-I) scores were 1.9 for each active treatment group. At 6 months, improvement was retained for all 3 active drug groups, even when switched to placebo. HAM-A and CGI-S change from baseline scores ranged from -14.9 to -19.0 and -2.0 to -2.5, respectively. Mean CGI-I scores ranged from 1.5 to 2.3. The most frequently reported adverse events were insomnia, fatigue, dizziness, headache, and somnolence.
Efficacy was observed at 3 months, with maintained improvement in anxiety symptoms over 6 months of treatment. These results are consistent with previously reported efficacy and safety trials of shorter duration with pregabalin and lorazepam in subjects with GAD.
Pregabalin is indicated for the treatment of generalised anxiety disorder (GAD) in adults in Europe. When pregabalin is discontinued, a 1-week (minimum) taper is recommended to prevent potential discontinuation symptoms.
To evaluate whether a 1-week pregabalin taper, after 3 or 6 months of treatment, is associated with the development of discontinuation symptoms (including rebound anxiety) in subjects with GAD.
Subjects were randomised to double-blind treatment with low-dose (150-300 mg/d) or high-dose pregabalin (450-600 mg/d) or lorazepam (3-4 mg/d) for 3 months. After 3 months, ~25% of subjects in each group (per the original randomisation) underwent a double-blind, 1-week taper, with substitution of placebo. The remaining subjects continued on active treatment for another 3 months and underwent the 1-week taper at 6 months.
Discontinuation after 3 months was associated with low mean changes in Physician Withdrawal Checklist (PWC) scores (range: +1.4 to +2.3) and Hamilton Anxiety Rating Scale (HAM-A) scores (range: +0.9 to +2.3) for each pregabalin dose and lorazepam. Discontinuation after 6 months was associated with low mean changes in PWC scores (range: -1.0 to +3.0) and HAM-A scores (range: -0.8 to +3.0) for all active drugs and placebo. Incidence of rebound anxiety during pregabalin taper was low and did not appear related to treatment dose or duration.
A 1-week taper following 3 or 6 months of pregabalin treatment was not associated with clinically meaningful discontinuation symptoms as evaluated by changes in the PWC and HAM-A rating scales.
Vitamin D deficiency has been commonly reported in elite athletes, but the vitamin D status of UK university athletes in different training environments remains unknown. The present study aimed to determine any seasonal changes in vitamin D status among indoor and outdoor athletes, and whether there was any relationship between vitamin D status and indices of physical performance and bone health. A group of forty-seven university athletes (indoor n 22, outdoor n 25) were tested during autumn and spring for serum vitamin D status, bone health and physical performance parameters. Blood samples were analysed for serum 25-hydroxyvitamin D (s-25(OH)D) status. Peak isometric knee extensor torque was assessed using an isokinetic dynamometer, and jump height was assessed using an Optojump. Aerobic capacity was estimated using the Yo-Yo intermittent recovery test. Peripheral quantitative computed tomography scans measured radial bone mineral density. Statistical analyses were performed using appropriate parametric/non-parametric testing depending on the normality of the data. s-25(OH)D fell significantly between autumn (52·8 (sd 22·0) nmol/l) and spring (31·0 (sd 16·5) nmol/l; P < 0·001). In spring, 34 % of participants were considered to be vitamin D deficient (<25 nmol/l) according to the revised 2016 UK guidelines. These data suggest that UK university athletes are at risk of vitamin D deficiency. Thus, further research is warranted to investigate the concomitant effects of low vitamin D status on health and performance outcomes in university athletes residing at northern latitudes.
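To illustrate the "parametric/non-parametric testing depending on normality" decision described above, a minimal sketch for the paired autumn-versus-spring comparison; the data here are simulated stand-ins, not the study's measurements:

```python
# Sketch: choose a paired t test or Wilcoxon signed-rank test based on
# whether the paired differences pass a Shapiro-Wilk normality check.
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
autumn = rng.normal(52.8, 22.0, 47)           # s-25(OH)D, nmol/l, n = 47
spring = autumn - rng.normal(21.8, 10.0, 47)  # simulated seasonal decline

diffs = autumn - spring
if stats.shapiro(diffs).pvalue > 0.05:         # differences look normal
    stat, p = stats.ttest_rel(autumn, spring)  # parametric: paired t test
else:
    stat, p = stats.wilcoxon(autumn, spring)   # non-parametric fallback
print(f"p = {p:.4g}")
```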
Between 2010 and 2019 the international health care organization Partners In Health (PIH) and its sister organization Zanmi Lasante (ZL) mounted a long-term response to the 2010 Haiti earthquake, focused on mental health. Over that time, implementing a Theory of Change developed in 2012, the organization successfully built a comprehensive, sustained community mental health system in Haiti's Central Plateau and Artibonite departments, directly serving a catchment area of 1.5 million people through multiple diagnosis-specific care pathways. The resulting ZL mental health system delivered 28 184 patient visits and served 6305 discrete patients at ZL facilities between January 2016 and September 2019. The experience of developing a system of mental health services in Haiti that currently provides ongoing care to thousands of people serves as a case study in the major challenges involved in global mental health delivery. The essential components of the effort to develop and sustain this community mental health system are summarized.
Psychotropic prescription rates continue to increase in the United States (USA). Few studies have investigated whether social-structural factors may play a role in psychotropic medication use independent of mental illness. Food insecurity is prevalent among people living with HIV in the USA and has been associated with poor mental health. We investigated whether food insecurity was associated with psychotropic medication use independent of the symptoms of depression and anxiety among women living with HIV in the USA.
We used cross-sectional data from the Women's Interagency HIV Study (WIHS), a nationwide cohort study. Food security (FS) was the primary explanatory variable, measured using the Household Food Security Survey Module. First, we used multivariable linear regressions to test whether FS was associated with symptoms of depression (Center for Epidemiologic Studies Depression [CESD] score), generalised anxiety disorder (GAD-7 score) and mental health-related quality of life (MOS-HIV Mental Health Summary score; MHS). Next, we examined associations of FS with the use of any psychotropic medications, including antidepressants, sedatives and antipsychotics, using multivariable logistic regressions adjusting for age, race/ethnicity, income, education and alcohol and substance use. In separate models, we additionally adjusted for symptoms of depression (CESD score) and anxiety (GAD-7 score).
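A minimal sketch of the adjustment step described above: fit the psychotropic-use model with and without the symptom covariates and compare the food-security odds ratios. All file, column, and category names are hypothetical stand-ins for WIHS fields:

```python
# Sketch: logistic models for any psychotropic use, before and after
# adjusting for depression (CES-D) and anxiety (GAD-7) scores.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

df = pd.read_csv("wihs_visits.csv")  # hypothetical analysis file

base = ("any_psychotropic ~ C(food_security) + age + C(race_ethnicity) "
        "+ income + education + alcohol_use + substance_use")
m1 = smf.logit(base, data=df).fit(disp=False)
m2 = smf.logit(base + " + cesd_score + gad7_score", data=df).fit(disp=False)

for label, m in [("before symptom adjustment", m1), ("after", m2)]:
    ors = np.exp(m.params).filter(like="food_security")              # odds ratios
    cis = np.exp(m.conf_int()).filter(like="food_security", axis=0)  # 95% CIs
    print(label, ors, cis, sep="\n")
```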
Of the 905 women in the sample, two-thirds were African-American. Lower FS (i.e. worse food insecurity) was associated with greater symptoms of depression and anxiety in a dose–response relationship. For the psychotropic medication outcomes, marginal and low FS were associated with 2.06 (p < 0.001; 95% confidence interval [CI] = 1.36–3.13) and 1.99 (p < 0.01; 95% CI = 1.26–3.15) times higher odds of any psychotropic medication use, respectively, before adjusting for depression and anxiety. The association of very low FS with any psychotropic medication use was not statistically significant. A similar pattern was found for antidepressant and sedative use. After additionally adjusting for CESD and GAD-7 scores, marginal FS remained associated with 1.93 (p < 0.05; 95% CI = 1.16–3.19) times higher odds of any psychotropic medication use. Very low FS, conversely, was significantly associated with lower odds of antidepressant use (adjusted odds ratio = 0.42; p < 0.05; 95% CI = 0.19–0.96).
Marginal FS was associated with higher odds of using psychotropic medications independent of depression and anxiety, while very low FS was associated with lower odds. These complex findings may indicate that people experiencing very low FS face barriers to accessing mental health services, while those experiencing marginal FS who do access services are more likely to be prescribed psychotropic medications for distress arising from social and structural factors.
Meal timing may influence food choices, neurobiology and psychological states. Our exploratory study examined whether time-of-day eating patterns were associated with mood disorders among adults.
During 2004–2006 (age 26–36 years) and 2009–2011 (follow-up, age 31–41 years), N = 1304 participants reported 24-h food and beverage intake. Time-of-day eating patterns were derived by principal components analysis. At follow-up, the Composite International Diagnostic Interview measured lifetime mood disorder. Log binomial and adjacent categories log-link regression were used to examine bidirectional associations between eating patterns and mood disorder. Covariates included sex, age, marital status, social support, education, work schedule, body mass index and smoking.
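As a sketch of how time-of-day eating patterns can be derived by principal components analysis: rows are participants, columns are intake by time slot, and the component loadings define the patterns. The matrix below is simulated and the slot definitions are illustrative, not the study's coding:

```python
# Sketch: derive eating patterns from a participants x time-slot intake matrix.
import numpy as np
from sklearn.decomposition import PCA
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(0)
slots = ["06-09h", "09-12h", "12-15h", "15-18h", "18-21h", "21-24h"]
intake = rng.gamma(2.0, 300.0, size=(1304, len(slots)))  # kJ per slot

pca = PCA(n_components=3).fit(StandardScaler().fit_transform(intake))
# Loadings show which slots define each pattern: a "Late" pattern would load
# negatively on morning slots and positively on evening slots.
for i, comp in enumerate(pca.components_, 1):
    print(f"PC{i}:", dict(zip(slots, comp.round(2))))
```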
Three patterns were derived at each time-point: Grazing (intake spread across the day), Traditional (highest intakes reflected breakfast, lunch and dinner), and Late (skipped/delayed breakfast with higher evening intakes). During the 5-year follow-up, compared with those in the lowest third of the respective pattern at baseline and follow-up, those in the highest third of the Late pattern at both time-points had a higher prevalence of mood disorder [prevalence ratio (PR) = 2.04; 95% confidence interval (CI) 1.20–3.48], and those in the highest third of the Traditional pattern at both time-points had a lower prevalence of first-onset mood disorder (PR = 0.31; 95% CI 0.11–0.87). Participants who experienced a mood disorder during follow-up had a 1.07 times higher relative risk of being in a higher Late pattern score category at follow-up than those without mood disorder (95% CI 1.00–1.14).
Non-traditional eating patterns, particularly skipped or delayed breakfast, may be associated with mood disorders.
Formation of close double white dwarfs likely requires the initial binary system to evolve through two successive common envelope (CE) phases. A prominent method for describing CE outcomes involves defining an ejection efficiency, αeff, which quantifies the fraction of orbital energy available to unbind the envelope. Reproducing observed post-CE orbital parameters has proven difficult for numerical simulations, as the companion’s decaying orbit fails to eject the envelope. The ejection failure seen in numerical simulations may be resolved with a proper treatment of convection, whereby the binary orbit shrinks before energy can drive ejection. Where the orbital decay timescale exceeds the convective transport timescale, the energy released during inspiral is carried to the stellar surface and radiated away. By including convection, we produce sub-day post-CE orbital periods, a result consistent with observations. We comment on the effects of convection for the population of double white dwarfs that evolve through two CEs.
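For readers unfamiliar with the ejection-efficiency parameterization mentioned above, one common form is the energy (α) formalism, in which the envelope binding energy is equated to the fraction of the released orbital energy available to unbind it. A sketch in standard notation (donor mass M1 with core mass Mc and envelope mass Menv, companion M2, pre- and post-CE separations ai and af, donor radius R, structure parameter λ):

```latex
E_{\rm bind} = \alpha_{\rm eff}\,\Delta E_{\rm orb}:
\qquad
\frac{G M_{1} M_{\rm env}}{\lambda R}
  = \alpha_{\rm eff}\left(\frac{G M_{\rm c} M_{2}}{2 a_{f}} - \frac{G M_{1} M_{2}}{2 a_{i}}\right)
```

For a given envelope, a smaller effective efficiency implies deeper inspiral; the convective treatment described above changes when the released orbital energy can actually contribute to ejection rather than being radiated away at the surface.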
In a crossover trial, a gown designed to increase skin coverage at the hands and wrists significantly reduced contamination of personnel during personal protective equipment (PPE) removal, and education on donning and doffing technique further reduced contamination. Simple modifications of PPE and education can reduce contamination during PPE removal.