Obtaining objective dietary exposure information from individuals is challenging because of the complexity of food consumption patterns and the limitations of self-reporting tools (e.g., FFQs and diet diaries). This hinders research efforts to associate intakes of specific foods or eating patterns with population health outcomes.
Dietary exposure can be assessed by the measurement of food-derived chemicals in urine samples. We aimed to develop methodologies for urine collection that minimised impact on the day-to-day activities of participants but also yielded samples that were data-rich in terms of targeted biomarker measurements.
Urine collection methodologies were developed within home settings and trialled with different cohorts of free-living volunteers.
Home collection of urine samples using vacuum transfer technology was deemed highly acceptable by volunteers. Statistical analysis of both the metabolome and selected dietary exposure biomarkers in spot urine collected and stored using this method showed that the samples were compositionally similar to urine collected using a standard method with immediate sample freezing. Even without chemical preservatives, samples could be stored under different temperature regimes without any significant impact on the overall urine composition or on the concentrations of forty-six exemplar dietary exposure biomarkers. Importantly, the samples could be posted directly to analytical facilities, without the need for refrigerated transport or the involvement of clinical professionals.
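As a rough illustration of how "compositionally similar" might be assessed, the sketch below projects simulated biomarker profiles from two collection methods onto principal components. The data, group sizes and any separation shown are invented for illustration; this is not the study's data or pipeline.

```python
# Hypothetical sketch: comparing urine profiles across two collection
# methods with PCA. Rows are samples, columns are biomarker log-intensities.
import numpy as np
from sklearn.decomposition import PCA
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(0)
n_per_group = 30
n_biomarkers = 46  # e.g., the 46 exemplar dietary exposure biomarkers

# Simulated log-intensities for two collection methods (same underlying diet)
standard = rng.normal(loc=0.0, scale=1.0, size=(n_per_group, n_biomarkers))
vacuum = rng.normal(loc=0.05, scale=1.0, size=(n_per_group, n_biomarkers))

X = np.vstack([standard, vacuum])
labels = np.array(["standard"] * n_per_group + ["vacuum"] * n_per_group)

# Autoscale, then project onto the first two principal components
scores = PCA(n_components=2).fit_transform(StandardScaler().fit_transform(X))

# If the methods are compositionally similar, the group centroids should
# nearly coincide relative to the within-group spread.
for g in ("standard", "vacuum"):
    print(g, scores[labels == g].mean(axis=0))
```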
This urine sampling methodology appears to be suitable for routine use and may provide a scalable, cost-effective means to collect urine samples and to assess diet in epidemiological studies.
Dialysis patients may not have access to conventional renal replacement therapy (RRT) following disasters. We hypothesized that improvised renal replacement therapy (ImpRRT) would be comparable to continuous renal replacement therapy (CRRT) in a porcine acute kidney injury model.
Following bilateral nephrectomies and 2 hours of caudal aortic occlusion, 12 pigs were randomized to 4 hours of ImpRRT or CRRT. In the ImpRRT group, blood was circulated through a dialysis filter using a rapid infuser to collect the ultrafiltrate. Improvised replacement fluid, made with stock solutions, was infused pre-pump. In the CRRT group, commercial replacement fluid was used. During RRT, animals received isotonic crystalloids and norepinephrine.
There were no differences in serum creatinine, calcium, magnesium, or phosphorus concentrations. While there was a difference between groups in serum potassium concentration over time (P < 0.001), significance was lost in pairwise comparisons at specific time points. Replacement fluid and ultrafiltrate flows did not differ between groups. There were no differences in lactate concentration, isotonic crystalloid requirement, or norepinephrine dose. No difference was found in electrolyte concentrations between the commercial and improvised replacement solutions.
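The potassium result (an overall group-by-time difference that is lost in pairwise comparisons) is typical of repeated-measures analyses on small groups. Below is a minimal sketch of one such analysis, a mixed model followed by pairwise tests at each time point, run on simulated data; the model choice, effect sizes and variable names are assumptions, not the study's documented procedure.

```python
# Hypothetical sketch: group-by-time analysis of serum potassium that can
# show an overall interaction while per-time-point contrasts stay
# non-significant. All data are simulated.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf
from scipy import stats

rng = np.random.default_rng(1)
pigs, times = 12, [0, 1, 2, 3, 4]  # hours of RRT
rows = []
for pig in range(pigs):
    group = "ImpRRT" if pig < 6 else "CRRT"
    drift = 0.1 if group == "ImpRRT" else -0.1  # small divergence over time
    for t in times:
        rows.append({"pig": pig, "group": group, "time": t,
                     "potassium": 4.5 + drift * t + rng.normal(0, 0.4)})
df = pd.DataFrame(rows)

# Mixed model with a random intercept per animal
model = smf.mixedlm("potassium ~ group * time", df, groups="pig").fit()
print(model.summary())

# Pairwise group comparison at each time point (Bonferroni-corrected)
for t in times:
    a = df.query("time == @t and group == 'ImpRRT'")["potassium"]
    b = df.query("time == @t and group == 'CRRT'")["potassium"]
    print(t, stats.ttest_ind(a, b).pvalue * len(times))
```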
The ImpRRT system achieved similar performance to CRRT and may represent a potential option for temporary RRT following disasters.
For patients with methicillin-resistant Staphylococcus aureus (MRSA) colonization, a traditional fist-bump greeting did not significantly reduce MRSA transfer in comparison to a handshake. However, transfer was reduced with a modified fist bump that minimized the surface area of contact and when hand hygiene was performed before the handshake.
Marine-terminating glaciers, such as those along the coastline of Greenland, often release meltwater into the ocean in the form of subglacial discharge plumes. Though these plumes can dramatically alter the mass loss along the front of a glacier, the conditions surrounding their genesis remain poorly constrained. In particular, little is known about the geometry of subglacial outlets and the extent to which seawater may intrude into them. Here, the latter is addressed by exploring the dynamics of an arrested salt wedge: a steady-state, two-layer flow system in which salty water partially intrudes a channel carrying fresh water. Building on existing theory, we formulate a model that predicts the length of a non-entraining salt wedge as a function of the Froude number, the slope of the channel and coefficients for interfacial and wall drag. In conjunction, a series of laboratory experiments was conducted to observe a salt wedge within a rectangular channel. For experiments conducted with laminar flow (low Reynolds number), good agreement with theoretical predictions is obtained when the drag coefficients are modelled as being inversely proportional to the Reynolds number. However, for fully turbulent flows on geophysical scales, these drag coefficients are expected to asymptote toward finite values. Adopting reasonable drag coefficient estimates for this flow regime, our theoretical model suggests that typical subglacial channels may permit seawater intrusions of the order of several kilometres. While crude, these results indicate that the ocean has a strong tendency to penetrate subglacial channels and potentially undercut the faces of marine-terminating glaciers.
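For orientation, the notation below sets out the standard two-layer quantities such a model is typically built on. The definitions and the outlet criticality condition are textbook results, and the drag scalings restate the regimes described above; none of it is quoted from the paper, whose full model also includes channel slope and wall drag.

```latex
% Assumed standard two-layer notation (not quoted from the paper):
g' = g\,\frac{\rho_s - \rho_f}{\rho_f} \quad\text{(reduced gravity)},
\qquad
\mathrm{Fr} = \frac{q}{H\sqrt{g'H}}
  \quad\text{(discharge $q$ per unit width, channel depth $H$)}.

% Internal criticality at the outlet sets the fresh-layer depth h_1 there,
% so an arrested (steady) wedge requires Fr < 1:
F_1^2 = \frac{(q/h_1)^2}{g'h_1} = 1
\;\Longrightarrow\;
\frac{h_1}{H} = \mathrm{Fr}^{2/3}.

% Drag regimes referred to in the text (a is an O(1) constant):
C \sim \frac{a}{\mathrm{Re}} \ \text{(laminar)},
\qquad
C \to C_\infty \ \text{(fully turbulent)},
% with the intrusion length scaling as L ~ H/C_i, up to an O(1)
% function of Fr and the channel slope.
```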
There is a requirement in some beef markets to slaughter bulls at under 16 months of age, which requires high levels of concentrate feeding. Increasing the slaughter age of bulls to 19 months facilitates the inclusion of a grazing period, thereby decreasing the cost of production. Recent data indicate few quality differences in longissimus thoracis (LT) muscle between conventionally reared 16-month-old bulls and 19-month-old bulls that had a grazing period prior to finishing on concentrates. The aim of the present study was to extend this observation to additional commercially important muscles/cuts. The production systems selected were concentrates offered ad libitum with slaughter at under 16 months of age (16-C) or at 19 months of age (19-CC), to examine the effect of age per se, and the cheaper alternative for 19-month-old bulls described above (19-GC). The results indicate that muscles from 19-CC were redder and had more intramuscular fat and higher cook loss than those from 16-C. No differences in objective muscle texture or in sensory texture and acceptability were found between treatments. The expected differences in composition and quality between the muscles were generally consistent across the production systems examined. Therefore, for the type of animal and range of ages investigated, the effect of the production system on LT quality was generally representative of its effect on the other muscles analysed. In addition, the data do not support the under-16-month age restriction in commercial suckler bull production, based on meat acceptability.
Acute change in mental status (ACMS), defined by the Confusion Assessment Method, is used to identify infections in nursing home residents. A medical record review revealed that none of 15,276 residents had an ACMS documented. Using the revised McGeer criteria with a possible ACMS definition, we identified 296 residents with possible ACMS and 21 additional infections. The use of a possible ACMS definition should be considered for retrospective nursing home infection surveillance.
It is now well established that CBT for chronic insomnia is as efficacious as hypnotic medication and is also likely to be better at maintaining improved sleep. Most studies have looked at the use of individual CBT; there have been only a few studies looking at CBT for insomnia given in a group format.
For nearly ten years the Bristol Insomnia Group has offered cognitive behavioural management and support for people with chronic insomnia.
The seven group sessions are led by up to three members of a team consisting of a doctor (sleep specialist), an occupational therapist and a research sleep scientist. Components of the group intervention include education about sleep science, information on insomnia medication, sleep hygiene, relaxation, and cognitive therapy. To assess efficacy, participants complete sleep diaries, a quality-of-life scale (SF36) and the dysfunctional beliefs and attitudes scale (DBAS) pre- and post-group.
Sleep diaries (n=68) showed significant pre-to-post differences in Total Sleep Time (TST), Sleep Onset Latency (SOL) and Sleep Quality (SQ). Approximately half of the participants had a clinically significant improvement in TST (an increase of 30 minutes) and about a third had a clinically significant decrease (of 30 minutes) in SOL. SF36 scores improved significantly in all nine domains, and DBAS scores decreased significantly post-group.
These results demonstrate promising improvements in sleep parameters and quality of life after attendance at the group. CBT for insomnia is a clinically effective and cost-effective approach to the treatment of chronic insomnia.
Ecstasy (3,4-methylenedioxymethamphetamine, MDMA) is an amphetamine derivative that is used recreationally and is now being tested in clinical trials for the treatment of posttraumatic stress disorder. Ecstasy can damage serotonin neurones in the brains of experimental animals; however, the relevance of these findings to humans is debated.
To measure, using positron emission tomography (PET), levels of binding to the serotonin transporter (SERT), a marker of serotonin neurones, in the brains of chronic ecstasy users and matched controls.
An estimate of brain SERT levels was obtained, using the PET tracer 11C-DASB, in 50 chronic ecstasy users (use confirmed by hair drug testing; mean age, 26 years; mean duration of drug use, 3.9 years; median drug withdrawal time, 38 days) and 50 hair-test-negative control subjects (mean age, 26 years).
SERT binding levels in the ecstasy group were significantly decreased by 22 to 46% in frontal, temporal, cingulate, insular and occipital cortices, and by 23% in hippocampus. However, concentrations were distinctly normal in the SERT-rich caudate, putamen, ventral striatum and thalamus.
Our imaging data suggest that cerebral cortical SERT concentration is below normal in some ecstasy users for at least one month after last use of the drug. However, it remains to be established whether low SERT might have preceded drug use, reflects actual loss of brain serotonin neurones, or is causally related to any functional impairment in the ecstasy users. (Supported by US NIH NIDA DA017301).
Pregabalin is indicated for the treatment of GAD in adults in Europe. The efficacy and safety of pregabalin for the treatment of adults and elderly patients with GAD have been demonstrated in 6 of 7 short-term clinical trials of 4 to 8 weeks' duration.
To characterise the long-term efficacy and safety of pregabalin in subjects with GAD.
Subjects were randomised to double-blind treatment with high-dose pregabalin (450-600 mg/d), low-dose pregabalin (150-300 mg/d), or lorazepam (3-4 mg/d) for 3 months. Treatment was then extended with active drug or blinded placebo for a further 3 months.
At 3 months, mean change from baseline in Hamilton Anxiety Rating Scale (HAM-A) score for high- and low-dose pregabalin and for lorazepam ranged from -16.0 to -17.4. Mean change from baseline in Clinical Global Impression-Severity (CGI-S) score ranged from -2.1 to -2.3, and the mean CGI-Improvement (CGI-I) score was 1.9 in each active treatment group. At 6 months, improvement was retained in all 3 active drug groups, even among those switched to placebo. HAM-A and CGI-S change-from-baseline scores ranged from -14.9 to -19.0 and from -2.0 to -2.5, respectively. Mean CGI-I scores ranged from 1.5 to 2.3. The most frequently reported adverse events were insomnia, fatigue, dizziness, headache, and somnolence.
Efficacy was observed at 3 months, with maintained improvement in anxiety symptoms over 6 months of treatment. These results are consistent with previously reported efficacy and safety trials of shorter duration with pregabalin and lorazepam in subjects with GAD.
Pregabalin is indicated for the treatment of generalised anxiety disorder (GAD) in adults in Europe. When pregabalin is discontinued, a 1-week (minimum) taper is recommended to prevent potential discontinuation symptoms.
To evaluate whether a 1-week pregabalin taper, after 3 or 6 months of treatment, is associated with the development of discontinuation symptoms (including rebound anxiety) in subjects with GAD.
Subjects were randomised to double-blind treatment with low-dose (150-300 mg/d) or high-dose pregabalin (450-600 mg/d) or lorazepam (3-4 mg/d) for 3 months. After 3 months, ~25% of subjects in each group (per the original randomisation) underwent a double-blind, 1-week taper, with substitution of placebo. The remaining subjects continued on active treatment for another 3 months and underwent the 1-week taper at 6 months.
Discontinuation after 3 months was associated with low mean changes in Physician Withdrawal Checklist (PWC) scores (range: +1.4 to +2.3) and Hamilton Anxiety Rating Scale (HAM-A) scores (range: +0.9 to +2.3) for each pregabalin dose and lorazepam. Discontinuation after 6 months was associated with low mean changes in PWC scores (range: -1.0 to +3.0) and HAM-A scores (range: -0.8 to +3.0) for all active drugs and placebo. Incidence of rebound anxiety during pregabalin taper was low and did not appear related to treatment dose or duration.
A 1-week taper following 3 or 6 months of pregabalin treatment was not associated with clinically meaningful discontinuation symptoms as evaluated by changes in the PWC and HAM-A rating scales.
Despite advances in the treatment of pulmonary hypertension (PH) and improvements in obstetric care, PH remains a leading cause of cardiac maternal death in the developed world. The last three decades have seen the development of effective therapies for specific forms of PH, improving patients' symptoms and more than doubling survival in some forms of the disease. Consequently, there is an increasing number of women of childbearing potential with PH. Women may present with PH for the first time during pregnancy or in the early post-partum period, or patients with known PH may consider pregnancy despite counselling regarding the high risks.
Vitamin D deficiency has been commonly reported in elite athletes, but the vitamin D status of UK university athletes in different training environments remains unknown. The present study aimed to determine any seasonal changes in vitamin D status among indoor and outdoor athletes, and whether there was any relationship between vitamin D status and indices of physical performance and bone health. A group of forty-seven university athletes (indoor n 22, outdoor n 25) were tested during autumn and spring for serum vitamin D status, bone health and physical performance parameters. Blood samples were analysed for serum 25-hydroxyvitamin D (s-25(OH)D) status. Peak isometric knee extensor torque was assessed using an isokinetic dynamometer, and jump height was assessed using an Optojump. Aerobic capacity was estimated using the Yo-Yo intermittent recovery test. Peripheral quantitative computed tomography scans measured radial bone mineral density. Statistical analyses were performed using appropriate parametric/non-parametric testing depending on the normality of the data. s-25(OH)D fell significantly between autumn (52·8 (sd 22·0) nmol/l) and spring (31·0 (sd 16·5) nmol/l; P < 0·001). In spring, 34 % of participants were considered to be vitamin D deficient (<25 nmol/l) according to the revised 2016 UK guidelines. These data suggest that UK university athletes are at risk of vitamin D deficiency. Thus, further research is warranted to investigate the concomitant effects of low vitamin D status on health and performance outcomes in university athletes residing at northern latitudes.
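The "parametric/non-parametric depending on normality" decision described above can be made explicit in a few lines. The sketch below uses simulated values seeded with the reported autumn mean and SD; the specific test choices (a Shapiro-Wilk gate, then paired t-test versus Wilcoxon signed-rank) are a plausible reading of the stated approach, not the authors' documented code.

```python
# Hypothetical sketch of normality-gated testing for paired autumn vs
# spring s-25(OH)D measurements. Values are simulated, not study data.
import numpy as np
from scipy import stats

rng = np.random.default_rng(2)
n = 47
autumn = rng.normal(52.8, 22.0, n)           # nmol/l, autumn mean (sd) from text
spring = autumn - rng.normal(21.8, 10.0, n)  # simulated seasonal fall

diff = spring - autumn
if stats.shapiro(diff).pvalue > 0.05:        # differences plausibly normal
    result = stats.ttest_rel(spring, autumn) # parametric paired test
else:
    result = stats.wilcoxon(spring, autumn)  # non-parametric alternative
print(result)

deficient = np.mean(spring < 25) * 100       # % below the 25 nmol/l threshold
print(f"{deficient:.0f}% vitamin D deficient in spring")
```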
Clostridioides difficile infection (CDI) is the most frequently reported hospital-acquired infection in the United States. Bioaerosols generated during toilet flushing are a possible mechanism for the spread of this pathogen in clinical settings.
To measure the bioaerosol concentration from toilets of patients with CDI before and after flushing.
In this pilot study, bioaerosols were collected 0.15 m, 0.5 m, and 1.0 m from the rims of the toilets in the bathrooms of hospitalized patients with CDI. Inhibitory, selective media were used to detect C. difficile and other facultative anaerobes. Room air was collected continuously for 20 minutes with a bioaerosol sampler before and after toilet flushing. Wilcoxon rank-sum tests were used to assess the difference in bioaerosol production before and after flushing.
The study took place in rooms of patients with CDI at the University of Iowa Hospitals and Clinics.
Bacteria were positively cultured from 8 of 24 rooms (33%). In total, 72 preflush and 72 postflush samples were collected; 9 of the preflush samples (13%) and 19 of the postflush samples (26%) were culture positive for healthcare-associated bacteria. The predominant species cultured were Enterococcus faecalis, E. faecium, and C. difficile. Compared to the preflush samples, the postflush samples showed significant increases in the concentrations of the 2 large particle-size categories: 5.0 µm (P = .0095) and 10.0 µm (P = .0082).
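A minimal sketch of the rank-sum comparison described in the methods, applied to one particle-size category, is given below; the concentrations are simulated, and the lognormal shapes, units and effect size are assumptions rather than study data.

```python
# Hypothetical sketch: Wilcoxon rank-sum test on pre- vs post-flush
# bioaerosol concentrations for one particle-size category. Simulated data;
# the real study collected 72 preflush and 72 postflush samples.
import numpy as np
from scipy import stats

rng = np.random.default_rng(3)
n = 72
# Simulated particle concentrations (arbitrary units per litre of air)
pre_5um = rng.lognormal(mean=1.0, sigma=0.5, size=n)
post_5um = rng.lognormal(mean=1.3, sigma=0.5, size=n)  # elevated after flush

stat, p = stats.ranksums(pre_5um, post_5um)
print(f"5.0 um category: rank-sum statistic {stat:.2f}, P = {p:.4f}")
```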
Bioaerosols produced by toilet flushing potentially contribute to hospital environmental contamination. Prevention measures (eg, toilet lids) should be evaluated as interventions to prevent toilet-associated environmental contamination in clinical settings.
Iron-rich meteorites are significantly underrepresented in collection statistics from Antarctica. This has led to the hypothesis that a sparse layer of iron-rich meteorites lies hidden below the surface of the ice, which would explain the apparent shortfall. As standard Antarctic meteorite collecting techniques rely upon a visual surface search approach, the need has arisen to develop a system that can detect iron objects under a few tens of centimetres of ice, where the expected number density is of the order of one per square kilometre. To test this hypothesis, a large-scale pulse induction metal detector array has been constructed for deployment in Antarctica. The metal detector array is 6 m wide, able to travel at 15 km h⁻¹ and can scan 1 km² in ~11 hours. This paper details the construction of the metal detector system with respect to the design criteria, notably the ruggedization of the system for Antarctic deployment. Some preliminary results from UK and Antarctic testing are presented. We show that the system performs as specified and should reach the pre-agreed target of detecting a 100 g iron meteorite at a depth of 300 mm when deployed in Antarctica.
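The quoted coverage time follows directly from the swath width and survey speed, assuming continuous scanning with no overlap or turnaround time:

```latex
% Consistency check of the quoted scan rate:
6\,\mathrm{m}\times 15\,\mathrm{km\,h^{-1}}
  = 90\,000\,\mathrm{m^2\,h^{-1}}
  = 0.09\,\mathrm{km^2\,h^{-1}},
\qquad
\frac{1\,\mathrm{km^2}}{0.09\,\mathrm{km^2\,h^{-1}}}\approx 11.1\,\mathrm{h}.
```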
This study examined the long-term effects of a randomized controlled trial of the Family Check-Up (FCU) intervention initiated at age 2 on inhibitory control in middle childhood and adolescent internalizing and externalizing problems. We hypothesized that the FCU would promote higher inhibitory control in middle childhood relative to the control group, which in turn would be associated with lower internalizing and externalizing symptomology at age 14. Participants were 731 families, with half (n = 367) of the families assigned to the FCU intervention. Using an intent-to-treat design, results indicate that the FCU intervention was indirectly associated with both lower internalizing and externalizing symptoms at age 14 via its effect on increased inhibitory control in middle childhood (i.e., ages 8.5–10.5). Findings highlight the potential for interventions initiated in toddlerhood to have long-term impacts on self-regulation processes, which can further reduce the risk for behavioral and emotional difficulties in adolescence.
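The indirect pathway described above (intervention to inhibitory control to symptoms) is commonly estimated as a product of regression coefficients with a bootstrap confidence interval. The sketch below does this on simulated data; the coefficients, sample construction and bootstrap choice are illustrative assumptions, not the study's actual analysis.

```python
# Hypothetical sketch: product-of-coefficients indirect effect with a
# percentile bootstrap CI. All data and effect sizes are simulated.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(4)
n = 731
fcu = rng.integers(0, 2, n)                    # 1 = assigned to FCU
inhib = 0.3 * fcu + rng.normal(0, 1, n)        # mediator: inhibitory control
symptoms = -0.4 * inhib + rng.normal(0, 1, n)  # outcome: age-14 symptoms

def indirect(idx):
    # a-path: intervention -> mediator
    a = sm.OLS(inhib[idx], sm.add_constant(fcu[idx])).fit().params[1]
    # b-path: mediator -> outcome, controlling for intervention
    b = sm.OLS(symptoms[idx], sm.add_constant(
        np.column_stack([inhib[idx], fcu[idx]]))).fit().params[1]
    return a * b

boot = [indirect(rng.integers(0, n, n)) for _ in range(2000)]
lo, hi = np.percentile(boot, [2.5, 97.5])
print(f"indirect effect 95% CI: [{lo:.3f}, {hi:.3f}]")
```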
Between 2010 and 2019, the international health care organization Partners In Health (PIH) and its sister organization Zanmi Lasante (ZL) mounted a long-term response to the 2010 Haiti earthquake, focused on mental health. Over that time, implementing a Theory of Change formulated in 2012, the organization successfully developed a comprehensive, sustained community mental health system in Haiti's Central Plateau and Artibonite departments, directly serving a catchment area of 1.5 million people through multiple diagnosis-specific care pathways. The resulting ZL mental health system delivered 28 184 patient visits and served 6305 discrete patients at ZL facilities between January 2016 and September 2019. The experience of developing a system of mental health services in Haiti that currently provides ongoing care to thousands of people serves as a case study of the major challenges involved in global mental health delivery. The essential components of the effort to develop and sustain this community mental health system are summarized.
Psychotropic prescription rates continue to increase in the United States (USA). Few studies have investigated whether social-structural factors may play a role in psychotropic medication use independent of mental illness. Food insecurity is prevalent among people living with HIV in the USA and has been associated with poor mental health. We investigated whether food insecurity was associated with psychotropic medication use independent of the symptoms of depression and anxiety among women living with HIV in the USA.
We used cross-sectional data from the Women's Interagency HIV Study (WIHS), a nationwide cohort study. Food security (FS) was the primary explanatory variable, measured using the Household Food Security Survey Module. First, we used multivariable linear regressions to test whether FS was associated with symptoms of depression (Center for Epidemiologic Studies Depression [CESD] score), generalised anxiety disorder (GAD-7 score) and mental health-related quality of life (MOS-HIV Mental Health Summary score; MHS). Next, we examined associations of FS with the use of any psychotropic medications, including antidepressants, sedatives and antipsychotics, using multivariable logistic regressions adjusting for age, race/ethnicity, income, education and alcohol and substance use. In separate models, we additionally adjusted for symptoms of depression (CESD score) and anxiety (GAD-7 score).
Of the 905 women in the sample, two-thirds were African-American. Lower FS (i.e. worse food insecurity) was associated with greater symptoms of depression and anxiety in a dose–response relationship. For the psychotropic medication outcomes, marginal and low FS were associated with 2.06 (p < 0.001; 95% confidence interval [CI] = 1.36–3.13) and 1.99 (p < 0.01; 95% CI = 1.26–3.15) times higher odds of any psychotropic medication use, respectively, before adjusting for depression and anxiety. The association of very low FS with any psychotropic medication use was not statistically significant. A similar pattern was found for antidepressant and sedative use. After additionally adjusting for CESD and GAD-7 scores, marginal FS remained associated with 1.93 (p < 0.05; 95% CI = 1.16–3.19) times higher odds of any psychotropic medication use. Very low FS, conversely, was significantly associated with lower odds of antidepressant use (adjusted odds ratio = 0.42; p < 0.05; 95% CI = 0.19–0.96).
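As a sketch of the modelling strategy described above (logistic models for psychotropic use, fitted with and without symptom adjustment), the code below fits two such models on simulated data. The variable names, category coding and covariate set are illustrative, not the WIHS specification.

```python
# Hypothetical sketch: adjusted logistic models for psychotropic use by
# food-security (FS) level, before and after adding symptom scores.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(5)
n = 905
df = pd.DataFrame({
    "psychotropic": rng.integers(0, 2, n),
    "fs": rng.choice(["high", "marginal", "low", "very_low"], n),
    "age": rng.normal(45, 10, n),
    "cesd": rng.normal(15, 8, n),    # depressive symptoms
    "gad7": rng.normal(6, 4, n),     # anxiety symptoms
})

# Model 1: FS plus covariates; Model 2 additionally adjusts for symptoms
m1 = smf.logit("psychotropic ~ C(fs, Treatment('high')) + age", df).fit(disp=0)
m2 = smf.logit("psychotropic ~ C(fs, Treatment('high')) + age + cesd + gad7",
               df).fit(disp=0)

# Odds ratios (and CIs) for the FS levels relative to high food security
print(np.exp(m1.params))
print(np.exp(m2.conf_int()))
```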
Marginal FS was associated with higher odds of using psychotropic medications independent of depression and anxiety, while very low FS was associated with lower odds. These complex findings may indicate that people experiencing very low FS face barriers to accessing mental health services, while those experiencing marginal FS who do access services are more likely to be prescribed psychotropic medications for distress arising from social and structural factors.