Dialysis patients may not have access to conventional renal replacement therapy (RRT) following disasters. We hypothesized that improvised renal replacement therapy (ImpRRT) would be comparable to continuous renal replacement therapy (CRRT) in a porcine acute kidney injury model.
Following bilateral nephrectomies and 2 hours of caudal aortic occlusion, 12 pigs were randomized to 4 hours of ImpRRT or CRRT. In the ImpRRT group, blood was circulated through a dialysis filter using a rapid infuser to collect the ultrafiltrate. Improvised replacement fluid, made with stock solutions, was infused pre-pump. In the CRRT group, commercial replacement fluid was used. During RRT, animals received isotonic crystalloids and norepinephrine.
There were no differences in serum creatinine, calcium, magnesium, or phosphorus concentrations. While serum potassium concentration differed between groups over time (P < 0.001), significance was lost in pairwise comparisons at specific time points. Replacement fluid and ultrafiltrate flows did not differ between groups. There were no differences in lactate concentration, isotonic crystalloid requirement, or norepinephrine doses. No difference was found in electrolyte concentrations between the commercial and improvised replacement solutions.
The ImpRRT system achieved similar performance to CRRT and may represent a potential option for temporary RRT following disasters.
Clostridioides difficile infection (CDI) is the most frequently reported hospital-acquired infection in the United States. Bioaerosols generated during toilet flushing are a possible mechanism for the spread of this pathogen in clinical settings.
To measure the bioaerosol concentration from toilets of patients with CDI before and after flushing.
In this pilot study, bioaerosols were collected 0.15 m, 0.5 m, and 1.0 m from the rims of the toilets in the bathrooms of hospitalized patients with CDI. Inhibitory, selective media were used to detect C. difficile and other facultative anaerobes. Room air was collected continuously for 20 minutes with a bioaerosol sampler before and after toilet flushing. Wilcoxon rank-sum tests were used to assess the difference in bioaerosol production before and after flushing.
Rooms of patients with CDI at University of Iowa Hospitals and Clinics.
Bacteria were positively cultured from 8 of 24 rooms (33%). In total, 72 preflush and 72 postflush samples were collected; 9 of the preflush samples (13%) and 19 of the postflush samples (26%) were culture positive for healthcare-associated bacteria. The predominant species cultured were Enterococcus faecalis, E. faecium, and C. difficile. Compared to the preflush samples, the postflush samples showed significant increases in the concentrations of the 2 large particle-size categories: 5.0 µm (P = .0095) and 10.0 µm (P = .0082).
Bioaerosols produced by toilet flushing potentially contribute to hospital environmental contamination. Prevention measures (eg, toilet lids) should be evaluated as interventions to prevent toilet-associated environmental contamination in clinical settings.
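The pre- versus post-flush comparison described above used Wilcoxon rank-sum tests; a minimal sketch is below. The particle concentrations are invented for illustration and are not the study's data.

```python
# Hedged sketch: Wilcoxon rank-sum test comparing hypothetical pre- and
# post-flush bioaerosol concentrations (illustrative values only).
from scipy.stats import ranksums

preflush = [12.0, 8.5, 10.1, 9.7, 11.3, 7.9]      # hypothetical particles/L
postflush = [15.2, 14.8, 13.9, 16.4, 12.7, 15.0]  # hypothetical particles/L

# A negative statistic indicates lower ranks (concentrations) pre-flush
stat, p = ranksums(preflush, postflush)
print(f"rank-sum statistic = {stat:.2f}, P = {p:.4f}")
```

With complete separation between the two small samples, the test yields P < 0.01, mirroring the direction (though not the magnitude) of the reported large-particle results.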
Meal timing may influence food choices, neurobiology and psychological states. Our exploratory study examined whether time-of-day eating patterns were associated with mood disorders among adults.
During 2004–2006 (age 26–36 years) and 2009–2011 (follow-up, age 31–41 years), N = 1304 participants reported 24-h food and beverage intake. Time-of-day eating patterns were derived by principal components analysis. At follow-up, the Composite International Diagnostic Interview measured lifetime mood disorder. Log binomial and adjacent categories log-link regression were used to examine bidirectional associations between eating patterns and mood disorder. Covariates included sex, age, marital status, social support, education, work schedule, body mass index and smoking.
Three patterns were derived at each time-point: Grazing (intake spread across the day), Traditional (highest intakes reflected breakfast, lunch and dinner), and Late (skipped/delayed breakfast with higher evening intakes). Compared to those in the lowest third of the respective pattern at baseline and follow-up, during the 5-year follow-up, those in the highest third of the Late pattern at both time-points had a higher prevalence of mood disorder [prevalence ratio (PR) = 2.04; 95% confidence interval (CI) 1.20–3.48], and those in the highest third of the Traditional pattern at both time-points had a lower prevalence of first onset mood disorder (PR = 0.31; 95% CI 0.11–0.87). Participants who experienced a mood disorder during follow-up had a 1.07 higher relative risk of being in a higher Late pattern score category at follow-up than those without mood disorder (95% CI 1.00–1.14).
Non-traditional eating patterns, particularly skipped or delayed breakfast, may be associated with mood disorders.
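The pattern-derivation step above (principal components analysis of time-of-day intake) can be sketched as follows. The intake matrix here is randomly generated, not the cohort's 24-h recall data, and the six eating occasions are an assumed simplification.

```python
# Hedged sketch: deriving time-of-day eating patterns by principal
# components analysis. Data are invented; rows = participants, columns =
# share of daily energy eaten in 6 assumed eating occasions (early a.m.
# through late p.m.).
import numpy as np

rng = np.random.default_rng(0)
intake = rng.random((100, 6))
intake = intake / intake.sum(axis=1, keepdims=True)  # shares sum to 1

# PCA via SVD of the standardized intake matrix
z = (intake - intake.mean(axis=0)) / intake.std(axis=0)
u, s, vt = np.linalg.svd(z, full_matrices=False)
loadings = vt[:3]                      # first three components ~ "patterns"
scores = z @ vt[:3].T                  # each participant's pattern scores
explained = (s**2 / (s**2).sum())[:3]  # variance explained per component
print(loadings.shape, scores.shape, explained.round(2))
```

In the study, participants were then grouped into thirds of each pattern score, and the top third was compared with the bottom third.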
Schmidt-hammer exposure-age dating (SHD) of boulders on cryoplanation terrace treads and associated bedrock cliff faces revealed Holocene ages ranging from 0 ± 825 to 8890 ± 1185 yr. The cliffs were significantly younger than the inner treads, which tended to be younger than the outer treads. Radiocarbon dates from the regolith of 3854 to 4821 cal yr BP (2σ range) indicated maximum rates of cliff recession of ~0.1 mm/yr, which suggests the onset of terrace formation before the last glacial maximum. Age, angularity, and size of clasts, together with planation across bedrock structures and the seepage of groundwater from the cliff foot, all support a process-based conceptual model of cryoplanation terrace development in which frost weathering leads to parallel cliff recession and, hence, terrace extension. The availability of groundwater during autumn freezeback is viewed as critical for frost wedging and/or the growth of segregation ice during prolonged winter frost penetration. Permafrost promotes cryoplanation by providing an impermeable frost table beneath the active layer, focusing groundwater flow, and supplying water for sediment transport by solifluction across the tread. Snow beds are considered an effect rather than a cause of cryoplanation terraces, and cryoplanation is seen as distinct from nivation.
The majority of paediatric Clostridioides difficile infections (CDI) are community-associated (CA), but few data exist regarding associated risk factors. We conducted a case–control study to evaluate CA-CDI risk factors in young children. Participants were enrolled from eight US sites during October 2014–February 2016. Case-patients were defined as children aged 1–5 years with a positive C. difficile specimen collected as an outpatient or ⩽3 days of hospital admission, who had no healthcare facility admission in the prior 12 weeks and no history of CDI. Each case-patient was matched to one control. Caregivers were interviewed regarding relevant exposures. Multivariable conditional logistic regression was performed. Of 68 pairs, 44.1% were female. More case-patients than controls had a comorbidity (33.3% vs. 12.1%; P = 0.01); recent higher-risk outpatient exposures (34.9% vs. 17.7%; P = 0.03); recent antibiotic use (54.4% vs. 19.4%; P < 0.0001); or recent exposure to a household member with diarrhoea (41.3% vs. 21.5%; P = 0.04). In multivariable analysis, antibiotic exposure in the preceding 12 weeks was significantly associated with CA-CDI (adjusted matched odds ratio, 6.25; 95% CI 2.18–17.96). Improved antibiotic prescribing might reduce CA-CDI in this population. Further evaluation of the potential role of outpatient healthcare and household exposures in C. difficile transmission is needed.
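For a single binary exposure in a 1:1 matched case-control design, the matched odds ratio reduces to the ratio of discordant pairs; the study's adjusted estimate additionally controls for covariates via conditional logistic regression. A sketch with invented pair counts:

```python
# Hedged sketch: matched odds ratio from discordant pairs in a 1:1
# matched case-control study. Counts are invented, not study data.
import math

b = 30  # hypothetical pairs: case exposed, matched control unexposed
c = 6   # hypothetical pairs: control exposed, matched case unexposed

or_ = b / c  # matched (conditional) odds ratio
# Approximate 95% CI on the log scale: SE(ln OR) = sqrt(1/b + 1/c)
lo, hi = (math.exp(math.log(or_) + z * math.sqrt(1/b + 1/c))
          for z in (-1.96, 1.96))
print(f"matched OR = {or_:.2f} (95% CI {lo:.2f}, {hi:.2f})")
```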
OBJECTIVES/SPECIFIC AIMS: Rodent models can be used to study neonatal abstinence syndrome (NAS), but the applicability of findings from the models to NAS in humans is not well understood. The objective of this study was to develop a rat model of norbuprenorphine-induced NAS and validate its translational value by comparing blood concentrations in the norbuprenorphine-treated pregnant rat to those previously reported in pregnant women undergoing buprenorphine treatment. METHODS/STUDY POPULATION: Pregnant Long-Evans rats were implanted with 14-day osmotic minipumps containing vehicle, morphine (positive control), or norbuprenorphine (0.3–3 mg/kg/d) on gestation day 9. Within 12 hours of delivery, pups were tested for spontaneous or precipitated opioid withdrawal by injecting them with saline (10 mL/kg, i.p.) or naltrexone (1 or 10 mg/kg, i.p.), respectively, and observing them for well-validated neonatal withdrawal signs. Blood was sampled via indwelling jugular catheters from a subset of norbuprenorphine-treated dams on gestation days 8, 10, 13, 17, and 20. Norbuprenorphine concentrations in whole blood samples were quantified using LC/MS/MS. RESULTS/ANTICIPATED RESULTS: Blood concentrations of norbuprenorphine in rats exposed to 1–3 mg/kg/d of norbuprenorphine were similar to levels previously reported in pregnant women undergoing buprenorphine treatment. Pups born to dams treated with these doses exhibited robust withdrawal signs. Blood concentrations of norbuprenorphine decreased across gestation, which is similar to previous reports in humans. DISCUSSION/SIGNIFICANCE OF IMPACT: These results suggest that dosing dams with 1–3 mg/kg/day norbuprenorphine produces maternal blood concentrations and withdrawal severity similar to those previously reported in humans. This provides evidence that, at these doses, this model is useful for testing hypotheses about norbuprenorphine that are applicable to NAS in humans.
OBJECTIVES/SPECIFIC AIMS: Delirium is a well-described form of acute brain organ dysfunction characterized by decreased or increased movement, changes in attention and concentration, as well as perceptual disturbances (i.e., hallucinations) and delusions. Catatonia, a neuropsychiatric syndrome traditionally described in patients with severe psychiatric illness, can present as phenotypically similar to delirium and is characterized by increased, decreased, and/or abnormal movements, staring, rigidity, and mutism. Delirium and catatonia can co-occur in the setting of medical illness, but no studies have explored this relationship by age. Our objective was to assess whether advancing age and the presence of catatonia are associated with delirium. METHODS/STUDY POPULATION: We prospectively enrolled critically ill patients at a single institution who were on a ventilator or in shock and evaluated them daily for delirium using the Confusion Assessment Method for the ICU and for catatonia using the Bush-Francis Catatonia Rating Scale. Measures of association (odds ratios) were assessed with a simple logistic regression model with catatonia as the independent variable and delirium as the dependent variable. Effect measure modification by age was assessed using a likelihood ratio test. RESULTS/ANTICIPATED RESULTS: We enrolled 136 medical and surgical critically ill patients with 452 matched (concomitant) delirium and catatonia assessments. Median age was 59 years (IQR: 52–68). Of the 136 patients, 58 (43%) had delirium only, 4 (3%) had catatonia only, 42 (31%) had both delirium and catatonia, and 32 (24%) had neither. Age was significantly associated with prevalent delirium (i.e., increasing age was associated with decreased risk for delirium) (p=0.04) after adjusting for catatonia severity. Catatonia was significantly associated with prevalent delirium (p<0.0001) after adjusting for age.
Peak delirium risk occurred in patients aged 55 years with 3 or more catatonic signs, who had 53.4 times the odds of delirium (95% CI: 16.06, 176.75) compared with those with no catatonic signs. Patients 70 years and older with 3 or more catatonia features had half this risk. DISCUSSION/SIGNIFICANCE OF IMPACT: Catatonia is significantly associated with prevalent delirium even after controlling for age. These data support an inverted U-shaped risk of delirium after adjusting for catatonia. This relationship and its clinical ramifications need to be examined in a larger sample, including patients with dementia. Additionally, we need to assess which acute brain syndrome (delirium or catatonia) develops first.
Previous research regarding anxiety as a predictor of future cognitive decline in older adults is limited and inconsistent. We examined the independent relationship between anxiety symptoms and subsequent cognitive decline.
We included 2,818 community-dwelling older men (mean age = 76.1, SD ±5.3 years) who were followed for an average of 3.4 years. We assessed anxiety symptoms at baseline using the Goldberg Anxiety Scale (GAS; range = 0–9). We assessed cognitive function at baseline and at two subsequent visits using the Modified Mini-Mental State Examination (3MS; global cognition) and the Trails B test (executive function).
At baseline, there were 690 (24%) men with mild anxiety symptoms (GAS 1–4) and 226 (8%) men with moderate/severe symptoms (GAS 5–9). Men with anxiety symptoms were more likely to have depressed mood, poor sleep, more chronic medical conditions, and more impairment in activities of daily living compared to those with no anxiety symptoms. Compared to those with no anxiety symptoms at baseline, men with any anxiety symptoms were more likely to have substantial worsening in Trails B completion time (OR = 1.56, 95% CI 1.19, 2.05). The association was attenuated after adjusting for potential confounders, including depression and poor sleep, but remained significant (OR = 1.40, 95% CI 1.04, 1.88).
In cognitively healthy older men, mild anxiety symptoms may potentially predict future decline in executive functioning. Anxiety is likely a manifestation of an underlying neurodegenerative process rather than a cause.
A number of studies report reduced hippocampal volume in individuals who engage in problematic alcohol use. However, the magnitude of the difference in hippocampal volume between individuals with v. without problematic alcohol use has varied widely, and there have been null findings. Moreover, the studies comprise diverse alcohol use constructs and samples, including clinically significant alcohol use disorders and subclinical but problematic alcohol use (e.g. binge drinking), adults and adolescents, and males and females.
We conducted the first quantitative synthesis of the published empirical research on associations between problematic alcohol use and hippocampal volume. In total, 23 studies were identified and selected for inclusion in the meta-analysis; effects sizes were aggregated using a random-effects model.
Problematic alcohol use was associated with significantly smaller hippocampal volume (d = −0.53). Moderator analyses indicated that effects were stronger for clinically significant v. subclinical alcohol use and among adults relative to adolescents; effects did not differ between males and females.
Problematic alcohol use is associated with reduced hippocampal volume. The moderate overall effect size suggests the need for larger samples than are typically included in studies of alcohol use and hippocampal volume. Because the existing literature is almost entirely cross-sectional, future research using causally informative study designs is needed to determine whether this association reflects premorbid risk for the development of problematic alcohol use and/or whether alcohol has a neurotoxic effect on the hippocampus.
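Random-effects aggregation of effect sizes, as used in the meta-analysis above, can be sketched with one common estimator (DerSimonian-Laird); the abstract does not specify the exact estimator, and the five studies below are invented, not the 23 actually included.

```python
# Hedged sketch: DerSimonian-Laird random-effects pooling of standardized
# mean differences. Effect sizes and variances are invented.
import math

d = [-0.70, -0.45, -0.60, -0.30, -0.55]  # hypothetical per-study d's
v = [0.04, 0.05, 0.03, 0.06, 0.04]       # hypothetical sampling variances

w = [1 / vi for vi in v]                 # fixed-effect (inverse-variance) weights
d_fixed = sum(wi * di for wi, di in zip(w, d)) / sum(w)
q = sum(wi * (di - d_fixed) ** 2 for wi, di in zip(w, d))  # heterogeneity Q
c = sum(w) - sum(wi ** 2 for wi in w) / sum(w)
tau2 = max(0.0, (q - (len(d) - 1)) / c)  # between-study variance estimate

w_re = [1 / (vi + tau2) for vi in v]     # random-effects weights
d_re = sum(wi * di for wi, di in zip(w_re, d)) / sum(w_re)
se = math.sqrt(1 / sum(w_re))
print(f"pooled d = {d_re:.2f} (95% CI {d_re - 1.96*se:.2f}, {d_re + 1.96*se:.2f})")
```

With these invented inputs the heterogeneity Q falls below its degrees of freedom, so tau-squared is truncated at zero and the random-effects estimate coincides with the fixed-effect one.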
We review the development of a disaster health care response system in Mississippi aimed at improving disaster response efforts. Large-scale disasters generate many injured and ill patients, which causes a significant utilization of emergency health care services and often requires external support to meet clinical needs. Disaster health care services require a solid infrastructure of coordination and collaboration to be effective. Following Hurricane Katrina, the state of Mississippi implemented best practices from around the nation to establish a disaster health care response system. The State Medical Response System of Mississippi provides an all-hazards system designed to support local response efforts at the time, scope, and scale required to successfully manage the incident. Components of this disaster health care response system can be replicated or adapted to meet the dynamic landscape of health care delivery following disasters. (Disaster Med Public Health Preparedness. 2017;11:600–604)
Accurate models of X-ray absorption and re-emission in partly stripped ions are necessary to calculate the structure of stars, the performance of hohlraums for inertial confinement fusion, and many other systems in high-energy-density plasma physics. Despite theoretical progress, a persistent discrepancy exists with recent experiments at the Sandia Z facility studying iron in conditions characteristic of the solar radiative–convective transition region. The increased iron opacity measured at Z could help resolve a longstanding issue with the standard solar model, but requires a radical departure for opacity theory. To replicate the Z measurements, an opacity experiment has been designed for the National Ignition Facility (NIF). The design uses established techniques scaled to NIF. A laser-heated hohlraum will produce X-ray-heated uniform iron plasmas in local thermodynamic equilibrium (LTE) at solar-relevant temperatures and electron densities. The iron will be probed using continuum X-rays emitted in a compact source from a 2 mm diameter polystyrene (CH) capsule implosion. In this design, a subset of the NIF beams deliver 500 kJ to the hohlraum, and the remaining beams directly drive the CH capsule with 200 kJ. Calculations indicate this capsule backlighter should outshine the iron sample, delivering a point-projection transmission opacity measurement to a time-integrated X-ray spectrometer viewing down the hohlraum axis. Preliminary experiments to develop the backlighter and hohlraum are underway, informing simulated measurements to guide the final design.
We describe the performance of the Boolardy Engineering Test Array (BETA), the prototype for the Australian Square Kilometre Array Pathfinder (ASKAP) telescope. BETA is the first aperture synthesis radio telescope to use phased array feed technology, giving it the ability to electronically form up to nine dual-polarisation beams. We report the methods developed for forming and measuring the beams, and the adaptations that have been made to the traditional calibration and imaging procedures in order to allow BETA to function as a multi-beam aperture synthesis telescope. We describe the commissioning of the instrument and present details of BETA’s performance: sensitivity, beam characteristics, polarimetric properties, and image quality. We summarise the astronomical science it has produced and draw lessons from operating BETA that will be relevant to the commissioning and operation of the final ASKAP telescope.
Several extragalactic HI surveys using a λ21 cm 13-beam focal plane array will begin in early 1997 using the Parkes 64 m telescope. These surveys are designed to efficiently detect nearby galaxies that have failed to be identified optically because of low optical surface brightness or high optical extinction. We discuss scientific and technical aspects of the multibeam receiver, including astronomical objectives, feed, receiver and correlator design, and data acquisition. A comparison with other telescopes shows that the Parkes multibeam receiver has significant speed advantages for any large-area λ21 cm galaxy survey in the velocity range 0–14000 km s−1.
During 1990 we surveyed the southern sky using a multi-beam receiver at frequencies of 4850 and 843 MHz. The half-power beamwidths were 4 and 25 arcmin respectively. The finished surveys cover declinations between +10 and −90 degrees, essentially complete in right ascension, an area of 7.30 steradians. Preliminary analysis of the 4850 MHz data indicates that we will achieve a five-sigma flux density limit of about 30 mJy. We estimate that we will find between 80 000 and 90 000 new sources above this limit. This is a revised version of the paper presented at the Regional Meeting by the first four authors; the surveys have now been completed.
Experiments on the National Ignition Facility show that multi-dimensional effects currently dominate the implosion performance. Low-mode implosion symmetry and hydrodynamic instabilities seeded by capsule mounting features appear to be two key limiting factors for implosion performance. One reason these factors have a large impact on the performance of inertial confinement fusion implosions is the high convergence required to achieve high fusion gains. To tackle these problems, a predictable implosion platform is needed, meaning experiments must trade off high gain for performance. LANL has adopted three main approaches to develop a one-dimensional (1D) implosion platform, where 1D means measured yield over the 1D clean calculation. A high-adiabat, low-convergence platform is being developed using beryllium capsules, enabling larger case-to-capsule ratios to improve symmetry. The second approach uses liquid fuel layers in wetted foam targets. With liquid fuel layers, the implosion convergence can be controlled via the initial vapor pressure set by the target fielding temperature. The last method is double shell targets. For double shells, the smaller inner shell houses the DT fuel, and the convergence of this cavity is relatively small compared to hot spot ignition. However, double shell targets have a different set of trade-offs and advantages. Details for each of these approaches are described.
The aim of this study was to compare patterns of cognitive decline in older Latinos and non-Latinos. At annual intervals for a mean of 5.7 years, older Latino (n=104) and non-Latino (n=104) persons of equivalent age, education, and race completed a battery of 17 cognitive tests from which previously established composite measures of episodic memory, semantic memory, working memory, perceptual speed, and visuospatial ability were derived. In analyses adjusted for age, sex, and education, performance declined over time in each cognitive domain, but there were no ethnic group differences in initial level of function or annual rate of decline. There was evidence of retest learning following the baseline evaluation, but neither the magnitude nor duration of the effect was related to Latino ethnicity, and eliminating the first two evaluations, during which much of retest learning occurred, did not affect ethnic group comparisons. Compared to the non-Latino group, the Latino group had more diabetes (38.5% vs. 25.0%; χ2=4.4; p=.037), fewer histories of smoking (24.0% vs. 39.4%; χ2=5.7; p=.017), and lower childhood household socioeconomic level (−0.410 vs. −0.045; t[185.0]=3.1; p=.002), but controlling for these factors did not affect results. Trajectories of cognitive aging in different abilities are similar in Latino and non-Latino individuals of equivalent age, education, and race. (JINS, 2016, 22, 58–65)
We conducted a time-series analysis to evaluate the impact of an antimicrobial stewardship program (ASP) over a 6.25-year period (July 1, 2008–September 30, 2014) while controlling for trends during a 3-year preintervention period (July 1, 2005–June 30, 2008). The primary outcome measures were total antibacterial and antipseudomonal use in days of therapy (DOT) per 1,000 patient-days (PD). Secondary outcomes included antimicrobial costs and resistance, hospital-onset Clostridium difficile infection, and other patient-centered measures.
During the preintervention period, total antibacterial and antipseudomonal use were declining (−9.2 and −5.5 DOT/1,000 PD per quarter, respectively). During the stewardship period, both continued to decline, although at lower rates (−3.7 and −2.2 DOT/1,000 PD, respectively), resulting in a slope change of 5.5 DOT/1,000 PD per quarter for total antibacterial use (P=.10) and 3.3 DOT/1,000 PD per quarter for antipseudomonal use (P=.01). Antibiotic expenditures declined markedly during the stewardship period (−$295.42/1,000 PD per quarter, P=.002). There were variable changes in antimicrobial resistance and few apparent changes in C. difficile infection and other patient-centered outcomes.
In a hospital with low baseline antibiotic use, implementation of an ASP was associated with sustained reductions in total antibacterial and antipseudomonal use and declining antibiotic expenditures. Common ASP outcome measures have limitations.
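The slope-change estimates above come from a segmented (interrupted time-series) regression; a minimal sketch follows. The quarterly series is simulated to mimic the reported pre-intervention slope (−9.2 DOT/1,000 PD per quarter) and slope change (+5.5), and is not the hospital's data.

```python
# Hedged sketch: segmented regression for an interrupted time series of
# quarterly antibiotic use. The series is simulated, not study data.
import numpy as np

rng = np.random.default_rng(1)
q = np.arange(37)                      # quarters: 12 pre + 25 stewardship
post = (q >= 12).astype(float)         # stewardship-period indicator
q_post = np.where(q >= 12, q - 12, 0.0)  # quarters since intervention

# Simulate: pre-slope -9.2, slope change +5.5 (post-slope -3.7), noise
use = 900 - 9.2 * q + 5.5 * q_post + rng.normal(0, 5, q.size)

# Design matrix: intercept, time, level change, slope change
X = np.column_stack([np.ones_like(q, dtype=float), q, post, q_post])
beta, *_ = np.linalg.lstsq(X, use, rcond=None)
print(f"pre-slope = {beta[1]:.1f}, slope change = {beta[3]:.1f}")
```

The coefficient on the post-intervention time term (`q_post`) is the change in slope, which is the quantity tested in the study (P = .10 for total antibacterial use, P = .01 for antipseudomonal use).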
This paper brings together the work of the GI Solvency II Technical Provisions working party. The working party was formed in 2009 for the primary purpose of raising awareness of Solvency II and the impact it would have on the work that reserving actuaries do. Over the years, the working party’s focus has shifted to exploring and promoting discussion of the many practical issues raised by the requirements and to promoting best practice. To this end, we have developed, presented and discussed many of the ideas contained in this paper at events and forums. However, the size of the subject means that at no one event have we managed to cover all of the areas that the reserving actuary needs to be aware of. This paper brings together our thinking in one place for the first time. We hope experienced practitioners will find it thought-provoking and a useful reference tool. For new practitioners, we hope it helps to get you up to speed quickly. Good luck!