OBJECTIVES/SPECIFIC AIMS: To evaluate the ability of various techniques to track changes in body fluid volumes before and after a rapid infusion of saline. METHODS/STUDY POPULATION: Eight healthy participants (5M; 3F) completed baseline measurements of 1) total body water using ethanol dilution and bioelectrical impedance analysis (BIA) and 2) blood volume, plasma volume and red blood cell (RBC) volume using the carbon monoxide rebreathe technique and I-131 albumin dilution. Subsequently, 30 mL saline/kg body weight was administered intravenously over 20 minutes, after which BIA and ethanol dilution were repeated. RESULTS/ANTICIPATED RESULTS: On average, 2.29±0.35 L saline was infused with an average increase in net fluid input-output (I/O) of 1.56±0.29 L. BIA underestimated measured I/O by −3.4±7.9%, while ethanol dilution did not demonstrate a measurable change in total body water. Carbon monoxide rebreathe differed from I-131 albumin dilution measurements of blood, plasma and RBC volumes by +0.6±2.8%, −5.4±3.6%, and +11.0±4.7%, respectively. DISCUSSION/SIGNIFICANCE OF IMPACT: BIA is capable of tracking modest changes in total body water. Carbon monoxide rebreathe appears to be a viable alternative to the I-131 albumin dilution technique for determining blood volume. Together, these two techniques may be useful in monitoring fluid status in patients with impaired fluid regulation.
The main objective of this study was to investigate the relationships between motivation and readiness levels for physical activity and exercise behaviour among persons with chronic musculoskeletal pain. Participants were 211 U.S. adults with chronic musculoskeletal pain from online support groups as well as specialty and primary care clinics (females = 86.7%; mean age = 43.4 years, SD = 14.4 years). The participants completed an online survey on their engagement in physical activity and exercise behaviour. Multiple one-way analyses of variance with post-hoc comparisons using the Tukey HSD test revealed significant differences between the readiness stages of change groups of preintenders, intenders, and actors in their motivation for physical activity and exercise behaviour. Specifically, the actor group of behavioural change reported higher levels of motivation beliefs for physical activity and exercise behaviour compared with preintenders and intenders. These findings suggest that people with chronic musculoskeletal pain experiencing increased motivation for physical activity and exercise behaviour are more engaged in desired behaviours than persons with chronic pain reporting varying degrees of behavioural intentions.
We describe the use of implementation science at the unit level and organizational level to guide an intervention to reduce central-line–associated bloodstream infections (CLABSIs) in a high-volume, regional, burn intensive care unit (BICU).
A single-center, observational, quasi-experimental study.
A regional BICU in Maryland serving 300–400 burn patients annually.
In 2011, an organizational-level and unit-level intervention was implemented to reduce the rates of CLABSI in a high-risk patient population in the BICU. At the organization level, leaders declared a goal of zero infections, created an infrastructure to support improvement efforts by creating a coordinating team, and engaged bedside staff. Performance data were transparently shared. At the unit level, the Comprehensive Unit-based Safety Program (CUSP)/Translating Research Into Practice (TRIP) model was used. A series of interventions was implemented: development of new blood culture procurement criteria, implementation of chlorhexidine bathing and chlorhexidine dressings, use of alcohol-impregnated caps, routine performance of root-cause analysis with executive engagement, and routine central venous catheter changes.
The use of an implementation science framework to guide multiple interventions resulted in the reduction of CLABSI rates from 15.5 per 1,000 central-line days to zero with a sustained rate of zero CLABSIs over 3 years (rate difference, 15.5; 95% confidence interval, 8.54–22.48).
CLABSIs in high-risk units may be preventable with the use of a structured organizational and unit-level paradigm.
To examine variation in antibiotic coverage and detection of resistant pathogens in community-onset pneumonia.
A total of 128 hospitals in the Veterans Affairs health system.
Hospitalizations with a principal diagnosis of pneumonia from 2009 through 2010.
We examined proportions of hospitalizations with empiric antibiotic coverage for methicillin-resistant Staphylococcus aureus (MRSA) and Pseudomonas aeruginosa (PAER) and with initial detection in blood or respiratory cultures. We compared lowest- versus highest-decile hospitals, and we estimated adjusted probabilities (AP) for patient- and hospital-level factors predicting coverage and detection using hierarchical regression modeling.
Among 38,473 hospitalizations, empiric coverage varied widely across hospitals (MRSA lowest vs highest, 8.2% vs 42.0%; PAER lowest vs highest, 13.9% vs 44.4%). Detection rates also varied (MRSA lowest vs highest, 0.5% vs 3.6%; PAER lowest vs highest, 0.6% vs 3.7%). Whereas coverage was greatest among patients with recent hospitalizations (AP for anti-MRSA, 54%; AP for anti-PAER, 59%) and long-term care (AP for anti-MRSA, 60%; AP for anti-PAER, 66%), detection was greatest in patients with a previous history of a positive culture (AP for MRSA, 7.9%; AP for PAER, 11.9%) and in hospitals with a high prevalence of the organism in pneumonia (AP for MRSA, 3.9%; AP for PAER, 3.2%). Low hospital complexity and rural setting were strong negative predictors of coverage but not of detection.
Hospitals demonstrated widespread variation in both coverage and detection of MRSA and PAER, but probability of coverage correlated poorly with probability of detection. Factors associated with empiric coverage (eg, healthcare exposure) were different from those associated with detection (eg, microbiology history). Providing microbiology data during empiric antibiotic decision making could better align coverage to risk for resistant pathogens and could promote more judicious use of broad-spectrum antibiotics.
Finch trichomonosis, caused by Trichomonas gallinae, emerged in the Canadian Maritime provinces in 2007 and has since caused ongoing mortality in regional purple finch (Carpodacus purpureus) and American goldfinch (Carduelis tristis) populations. Trichomonas gallinae was isolated from (1) finches and rock pigeons (Columba livia) submitted for post-mortem or live-captured at bird feeding sites experiencing trichomonosis mortality; (2) bird seed at these same sites; and (3) rock pigeons live-captured at known roosts or humanely killed. Isolates were characterized using internal transcribed spacer (ITS) region and iron hydrogenase (Fe-hyd) gene sequences. Two distinct ITS types were found. Type A was identical to the UK finch epidemic strain and was isolated from finches and a rock pigeon with trichomonosis; apparently healthy rock pigeons and finches; and bird seed at an outbreak site. Type B was obtained from apparently healthy rock pigeons. Fe-hyd sequencing revealed six distinct subtypes. The predominant subtype in both finches and the rock pigeon with trichomonosis was identical to the UK finch epidemic strain A1. Single nucleotide polymorphisms in Fe-hyd sequences suggest there is fine-scale variation amongst isolates and that finch trichomonosis emergence in this region may not have been caused by a single spill-over event.
Ordered carbon nanotube (CNT) growth by deposition of nanoparticle catalysts using dip pen nanolithography (DPN) is presented. DPN is a direct-write, tip-based lithography technique capable of multi-component deposition of a wide range of materials with nanometer precision. A NanoInk NLP 2000 is used to pattern different catalytic nanoparticle solutions on various substrates. To generate a uniform pattern of nanoparticle clusters, various conditions need to be considered; these parameters include the humidity in the vessel, the temperature, and the tip-surface dwell time. By patterning different nanoparticle solutions next to each other, identical growth conditions can be compared for different catalysts in a streamlined analysis process. Fe, Ni, and Co nanoparticle solutions patterned on silicon, mica, and graphite substrates serve as nucleation sites for CNT growth. The CNTs are synthesized by a chemical vapor deposition (CVD) reaction: each nanoparticle-patterned substrate is placed in a tube furnace held at 725°C during growth, with toluene injected into the growth chamber as the carbon source at a rate of 5 mL/hr. Growth is observed for Fe and Ni nanoparticle patterns but is lacking for the Co patterns. The results of these reactions provide important information regarding efficient and highly reproducible mechanisms for CNT growth.
Sport-related concussion (SRC) is typically followed by clinical recovery within days, but reports of prolonged symptoms are common. We investigated the incidence of prolonged recovery in a large cohort (n = 18,531) of athlete seasons over a 10-year period. A total of 570 athletes with concussion (3.1%) and 166 controls who underwent pre-injury baseline assessments of symptoms, neurocognitive functioning and balance were re-assessed immediately, 3 hr, and 1, 2, 3, 5, 7, and 45 or 90 days after concussion. Concussed athletes were stratified into typical (within 7 days) or prolonged (> 7 days) recovery groups based on symptom recovery time. Ten percent of athletes (n = 57) had a prolonged symptom recovery, which was also associated with lengthier recovery on neurocognitive testing (p < .001). At 45–90 days post-injury, the prolonged recovery group reported elevated symptoms, without deficits on cognitive or balance testing. Prolonged recovery was associated with unconsciousness [odds ratio (OR), 4.15; 95% confidence interval (CI) 2.12–8.15], posttraumatic amnesia (OR, 1.81; 95% CI, 1.00–3.28), and more severe acute symptoms (p < .0001). These results suggest that a small percentage of athletes may experience symptoms and functional impairments beyond the typical window of recovery after SRC, and that prolonged recovery is associated with acute indicators of more severe injury. (JINS, 2012, 18, 1–12)
To estimate avoidable intravenous (IV) fluoroquinolone use in Veterans Affairs (VA) hospitals.
A retrospective analysis of bar code medication administration (BCMA) data.
Acute care wards of 128 VA hospitals throughout the United States.
Data were analyzed for all medications administered on acute care wards between January 1, 2006, and December 31, 2010. Patient-days receiving therapy were expressed as fluoroquinolone-days (FD) and divided into intravenous (IV; all doses administered intravenously) and oral (PO; at least one dose administered per os) FD. We assumed IV fluoroquinolone use to be potentially avoidable on a given IV FD when there was at least 1 other medication administered via the enteral route.
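The avoidability rule described above amounts to a simple per-day classification; a minimal sketch of that logic (the function name, record shapes, and route labels are illustrative, not from the study):

```python
# Classify a patient-day per the stated rule:
# - a day is an "IV FD" if every fluoroquinolone dose that day was intravenous,
#   and a "PO FD" if at least one dose was given per os;
# - an IV FD is "potentially avoidable" if at least one other medication was
#   administered via the enteral route on the same day.
def classify_fluoroquinolone_day(fq_routes, other_med_routes):
    """fq_routes: routes of fluoroquinolone doses that day ('IV' or 'PO');
    other_med_routes: routes of all non-fluoroquinolone medications that day."""
    if not fq_routes:
        return "no FD"
    if any(route == "PO" for route in fq_routes):
        return "PO FD"
    if any(route == "PO" for route in other_med_routes):
        return "avoidable IV FD"
    return "IV FD"

print(classify_fluoroquinolone_day(["IV", "IV"], ["PO"]))  # avoidable IV FD
print(classify_fluoroquinolone_day(["IV", "PO"], []))      # PO FD
```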
Over the entire study period, 884,740 IV and 830,572 PO FD were administered. Overall, avoidable IV fluoroquinolone use accounted for 46.8% of all FD and 90.9% of IV FD. Excluding the first 2 days of all IV fluoroquinolone courses and limiting the analysis to the non-ICU setting yielded more conservative estimates of avoidable IV use: 20.9% of all FD and 45.9% of IV FD. Avoidable IV use was more common for levofloxacin and more frequent in the ICU setting. There was a moderate correlation between avoidable IV FD and total systemic antibiotic use (r = 0.32).
Unnecessary IV fluoroquinolone use seems to be common in the VA system, but important variations exist between facilities. Antibiotic stewardship programs could focus on this patient safety issue as a “low-hanging fruit” to increase awareness of appropriate antibiotic use.
The present study used a systematic review approach to identify relevant randomised controlled trials (RCT) with vitamin D and then apply meta-regression to explore the most appropriate model of the vitamin D intake–serum 25-hydroxyvitamin D (25(OH)D) relationship to underpin setting reference intake values. Methods included an updated structured search on Ovid MEDLINE; rigorous inclusion/exclusion criteria; data extraction; and meta-regression (using different model constructs). In particular, priority was given to data from winter-based RCT performed at latitudes >49·5°N (n 12). A combined weighted linear model meta-regression analysis of natural log (Ln) total vitamin D intake (i.e. diet and supplemental vitamin D) v. achieved serum 25(OH)D in winter (that used by the North American Dietary Reference Intake Committee) produced a curvilinear relationship (mean (95 % lower CI) serum 25(OH)D (nmol/l) = 9·2 (8·5) Ln (total vitamin D)). Use of non-transformed total vitamin D intake data (maximum 1400 IU/d; 35 μg/d) provided for a more linear relationship (mean serum 25(OH)D (nmol/l) = 0·044 × (total vitamin D)+33·035). Although inputting an intake of 600 IU/d (i.e. the RDA) into the 95 % lower CI curvilinear and linear models predicted a serum 25(OH)D of 54·4 and 55·2 nmol/l, respectively, the total vitamin D intake that would achieve 50 (and 40) nmol/l serum 25(OH)D was 359 (111) and 480 (260) IU/d, respectively. Inclusion of the 95 % range in the model to account for inter-individual variability increased the predicted intake of vitamin D needed to maintain serum 25(OH)D ≥ 50 nmol/l to 930 IU/d. The model used to describe the vitamin D intake–status relationship needs to be considered carefully when setting new reference intake values in Europe.
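The 95 % lower-CI curvilinear model quoted above (serum 25(OH)D = 8·5 Ln(total vitamin D)) can be evaluated and inverted to reproduce the intake estimates reported in the abstract; a minimal sketch (function names are illustrative):

```python
import math

# 95% lower-CI curvilinear model from the abstract:
# serum 25(OH)D (nmol/l) = 8.5 * ln(total vitamin D intake, IU/d)
COEF_LOWER_CI = 8.5

def predicted_25ohd(intake_iu: float) -> float:
    """Predicted winter serum 25(OH)D (nmol/l) for a given total intake."""
    return COEF_LOWER_CI * math.log(intake_iu)

def intake_for_target(target_nmol_l: float) -> float:
    """Invert the model: total intake (IU/d) needed to reach a target 25(OH)D."""
    return math.exp(target_nmol_l / COEF_LOWER_CI)

print(round(predicted_25ohd(600), 1))  # 54.4 nmol/l at the 600 IU/d RDA
print(round(intake_for_target(50)))    # 359 IU/d to achieve 50 nmol/l
print(round(intake_for_target(40)))    # 111 IU/d to achieve 40 nmol/l
```

The three printed values match the abstract's reported 54·4 nmol/l, 359 IU/d, and 111 IU/d, confirming those figures derive from the lower-CI curvilinear fit rather than the central estimate.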
A successful wafer-scale device layering process for fabricating three-dimensional integrated circuits (3D ICs) using benzocyclobutene (BCB) is described. In the reported embodiment of the method, a sub-micron-thick “donor” device layer is transplanted onto a fully fabricated “host” wafer with BCB as the intervening medium. Experimental results, including an RIE study and the planarization of BCB processed through the 3D fabrication procedure, are reported. We conclude with an approach to alleviate BCB- and fabrication-induced wafer bowing, which leads to poor wafer-to-wafer alignment in 3D integration.
Twins can be used to investigate the biological basis for observed associations between birth weight and later disease risk, as they experience in utero growth restriction compared with singletons, which can differ in magnitude within twin pairs despite partial or total genetic identity. In the present study, sixty monozygotic and seventy-one dizygotic same-sex twin pairs aged 19–50 years and eighty-nine singleton controls matched for age, gestational age, sex, maternal age and parity were recruited from an obstetric database. Associations between fasting lipid levels and birth weight were assessed by linear regression with adjustment for possible confounding factors. Twins were significantly lighter at birth but were not significantly different in adult height, weight or lipid levels from the singleton controls. There was a significant inverse association between birth weight and both total and LDL-cholesterol levels among singleton controls (−0·53 mmol/l per kg (95% CI −0·97, −0·09), P=0·02 and −0·39 mmol/l per kg (95% CI −0·76, −0·02), P=0·04, respectively), but there was no significant association between birth weight and lipid levels in either unpaired or within-pair analysis of twins. The results suggest that the in utero growth restriction and early catch-up growth experienced by twins do not increase the risk of an atherogenic lipid profile in adult life.
Clinical decision making about an athlete's return to competition after concussion is hampered by a lack of systematic methods to measure recovery. We applied standard regression-based methods to statistically measure individual rates of impairment at several time points after concussion in college football players. Postconcussive symptoms, cognitive functioning, and balance were assessed in 94 players with concussion (based on American Academy of Neurology Criteria) and 56 noninjured controls during preseason baseline testing, and immediately, 3 hr, and 1, 2, 3, 5, and 7 days postinjury. Ninety-five percent of injured players exhibited acute concussion symptoms and impairment on cognitive or balance testing immediately after injury, which diminished to 4% who reported elevated symptoms on postinjury day 7. In addition, a small but clinically significant percentage of players who reported being symptom free by day 2 continued to be classified as impaired on the basis of objective balance and cognitive testing. These data suggest that neuropsychological testing may be of incremental utility to subjective symptom checklists in identifying the residual effects of sport-related concussion. The implementation of neuropsychological testing to detect subtle cognitive impairment is most useful once postconcussive symptoms have resolved. This management model is also supported by practical and other methodological considerations. (JINS, 2005, 11,