This is a copy of the slides presented at the meeting but not formally
written up for the volume.
As in vivo cellular imaging becomes the necessary norm for understanding
cancer and other diseases, new non-toxic nanoprobes are going to be
required to replace the high quality cadmium based nanoprobes in use
today. We are developing less toxic probes based on two types of
luminescent ceramic nanoparticles: naturally occurring fluorescent (NOF)
mimics and Ln-based ceramic oxide materials. The NOF minerals of interest,
which have demonstrated initial luminosity of sufficient brightness
for use in cellular studies, include sphalerite, scheelite, manganoan
and perovskite nanoparticles. For Ln-based materials we have shown that
Ln-doped zincite also luminesces brightly enough to allow for quantification
of cellular activity. Once formed, these probes are functionalized such
that they can be delivered to desired cellular targets. Probe
derivatization has focused on surface capping with functionalized
poly(ethyleneglycol) molecules/lipids to yield water soluble NCs and
polyarginine-based transporters for transmembrane delivery. The probes
are being evaluated for their luminescent properties, as well as their
non-toxicity and ability to report on cell-signaling events with various
cell lines using multi-spectral and confocal microscopy and other
techniques. Preliminary interdisciplinary studies have validated the
basic approaches for the synthesis of NOF nanoprobes and the bio-delivery
and imaging of nanoparticles. Work to optimize the design, delivery, and
imaging of these new nanoprobes is expected to achieve the NIH-directed
goal of increasing the sensitivity and specificity of molecular probes
for imaging. Details of the synthesis, functionalization and biological
imaging using these probes will be presented. This work was partially
supported by the United States Department of Energy under contract number
DE-AC04-94AL85000, and by the National Institutes of Health through the
NIH Roadmap for Medical Research, Grant #1 R21 EB005365-01. Sandia is a
multi-program laboratory operated by Sandia Corporation, a Lockheed-Martin
Company, for the United States Department of Energy. Information
on this RFA (Innovation in Molecular Imaging Probes) can be found at http://grants.nih.gov/grants/guide/rfa-files/RFA-RM-04-021.html.
Nurse sow strategies are used to manage large litters on commercial pig farms. However, new-born piglets transferred to nurse sows in late lactation might be compromised in terms of growth and survival. We investigated the effects of two nurse sow strategies on piglet growth, suckling behaviour and sow nursing behaviour. At 1-day post-farrowing, the four heaviest piglets from large litters were transferred to a nurse sow either 21 (1STEP21, n=9 litters) or 7 (2STEP7, n=10 litters) days into lactation. The remainder of the litter remained with their mother and was either kept intact (remain intact (RI), n=10 litters) or had some piglets cross-fostered to equalise birth weights (remain equalised (RE), n=9 litters). The 7-day-old piglets from 2STEP7 were transferred onto a sow 21 days into lactation (2STEP21, n=10 litters). The growth of new-born piglets on 1STEP21 and 2STEP7 nurse sows was initially lower than in RI litters (F(3,33.8)=4.61; P<0.01), but weaning weights did not significantly differ (F(4,32.7)=0.78; P>0.5). After the 1st week of lactation, the weights and growth rates did not differ between treatments. Fighting behaviour during nursing bouts decreased over time. The frequency of fights was higher in 1STEP21 and 2STEP21 litters compared with RI litters (t(122)=3.06 and t(123)=3.00, respectively, P<0.05). The 2STEP21 litters had shorter nursing bouts than RI and 1STEP21 litters (t(107)=−2.81 and t(81.7)=2.8, respectively, P<0.05), which were more frequently terminated by 2STEP21 than RI sows (t(595)=2.93; P<0.05). Transferring the heaviest piglets from RI and RE litters to nurse sows reduced the percentage of teat changes during nursing bouts (RI: F(1,275)=16.61; RE: F(1,308)=43.59; P<0.001). In conclusion, nurse sow strategies do not appear to compromise piglet growth.
However, new-born piglets transferred onto sows in late lactation experienced more competition at the udder, suggesting that the sow’s stage of lactation is important to the feasibility of nurse sow strategies. Thus, the two-step nurse sow strategy is likely the best option (in relation to growth and suckling behaviour), as it minimises the difference between piglet age and sow stage of lactation.
Management strategies are needed to optimise the number of piglets weaned from hyper-prolific sows. Nurse sow strategies involve transferring supernumerary new-born piglets onto a sow whose own piglets are either weaned or fostered onto another sow. Such ‘nurse sows’ have extended lactations spent in farrowing crates, which could have negative implications for their welfare. This study used 47 sows, 20 of which farrowed large litters and had their biggest piglets fostered onto nurse sows which were either 1 week (2STEP7, n=9) or 3 weeks into lactation (1STEP21, n=10). Sows from which piglets were removed (R) were either left with the remainder of the litter intact (I) (remain intact (RI) sows, n=10), or had their litters equalised (E) for birth weight using piglets of the same age from non-experimental sows (remain equalised (RE) sows, n=9). Piglets from 2STEP7 were fostered onto another nurse sow which was 3 weeks into lactation (2STEP21, n=9). Back-fat thickness was measured at entry to the farrowing house, at fostering (nurse sows only) and at weaning. Sows were scored for ease of locomotion and for skin and claw lesions at entry to the farrowing house and at weaning. Salivary cortisol samples were collected and tear staining was scored at 0900 h weekly from entry until weaning. Saliva samples were also taken at fostering. Data were analysed using GLMs with appropriate random and repeated factors, or with non-parametric tests where appropriate. Back-fat thickness decreased between entry and weaning for all sows (F(1,42)=26.59, P<0.001) and tended to differ between treatments (F(4,16)=2.91; P=0.06). At weaning RI sows had lower limb lesion scores than 2STEP7 and RE sows (χ2(4)=10.8, P<0.05). No treatment effects were detected on salivary cortisol concentrations (P>0.05), and all nurse sows had a higher salivary cortisol concentration at fostering, compared with the other days (F(10,426)=3.47; P<0.05).
Acute effects of fostering differed between nurse sow treatments (F(2,113)=3.45, P<0.05): 2STEP7 sows had a higher salivary cortisol concentration than 1STEP21 and 2STEP21 sows on the day of fostering. Tear staining scores were not influenced by treatment (P>0.05). In conclusion, no difference was detected between nurse sows and non-nurse sows in body condition or severity of lesions. Although some nurse sows experienced stress at fostering, no long-term effect of the nurse sow strategies was detected on stress levels compared with sows that raised their own litter.
It is well established that the equid olfactory system is highly sensitive. It has been suggested that there is an intimate link between the sense of smell in the horse (Williams, 1995) and the fight-or-flight response, which is mediated by the autonomic nervous system (Marlin and Nankervis, 2002) and thus affects the heart. In humans, it has been demonstrated that lavender can have a pronounced impact on heart rate variability (Saeki, 2000), but it is not yet known whether the autonomic nervous system of the horse can be influenced by odour. The aim of this study was to determine the effect of odour on the autonomic status of the horse.
Eight geldings were used in the study, and the electrical activity of their hearts was assessed using the “Polar S810” telemetric system before and during the presentation of either pig faeces or lavender to the horses’ nostrils. The experiment was conducted over a two-day period so that the response to both odours could be determined.
Comfort is of particular importance during the peri-partum period, when cows are especially susceptible to claw lesions. Housing on straw bedding reduces claw lesions, an effect thought to be partly attributable to the physical surface (Bergsten and Frank, 1996) but also to the fact that cows in straw yards spend more time lying down (Singh et al., 1993). The objective of this study was to determine the influence of relief areas in cubicle housing, and of out-wintering on a woodchip pad, on the behaviour and foot lesion scores of pregnant dairy heifers.
Floor type is one of the main features influencing the welfare of sows and piglets in farrowing crates. Yet it is difficult to reconcile the needs of the sow and her piglets through the use of one floor (Furniss et al., 1986). Hence the aim of this study was to identify a floor combination that optimises the welfare of the piglets in the farrowing crate.
Introduction: Most ambulance communication officers receive minimal education on agonal breathing, often leading to unrecognized out-of-hospital cardiac arrest (OHCA). We sought to evaluate the impact of an educational program on cardiac arrest recognition, and on bystander CPR and survival rates. Methods: Ambulance communication officers in Ottawa, Canada received additional training on agonal breathing, while the control site (Windsor, Canada) did not. Sites were compared to their pre-study performance (before-after design), and to each other (concurrent control). Trained investigators used a piloted, standardized data collection tool when reviewing the recordings for all potential OHCA cases submitted. OHCA was confirmed using our local OHCA registry, and we requested 9-1-1 recordings for OHCA cases not initially suspected. Two independent investigators reviewed medical records for non-OHCA cases receiving telephone-assisted CPR in Ottawa. We present descriptive and chi-square statistics. Results: There were 988 confirmed and suspected OHCA in the “before” group (540 Ottawa; 448 Windsor), and 1,076 in the “after” group (689 Ottawa; 387 Windsor). Characteristics of “after” group OHCA patients were: mean age (68.1 Ottawa, 68.2 Windsor); male (68.5% Ottawa, 64.8% Windsor); witnessed (45.0% Ottawa, 41.9% Windsor); and initial rhythm VF/VT (28.9% Ottawa, 22.5% Windsor). Before-after comparisons were: for cardiac arrest recognition (from 65.4% to 71.9% in Ottawa, p=0.03; from 70.9% to 74.1% in Windsor, p=0.37); for bystander CPR rates (from 23.0% to 35.9% in Ottawa, p=0.0001; from 28.2% to 39.4% in Windsor, p=0.001); and for survival to hospital discharge (from 4.1% to 12.5% in Ottawa, p=0.001; from 3.9% to 6.9% in Windsor, p=0.03). “After” group comparisons between Ottawa and Windsor (control) were not statistically different, except survival (p=0.02).
Agonal breathing was common (25.6% Ottawa, 22.4% Windsor) and present in 18.5% of missed cases (15.8% Ottawa, 22.2% Windsor, p=0.27). In Ottawa, 31 patients not in OHCA received chest compressions as a result of telephone-assisted CPR instructions. None suffered injury or adverse effects. Conclusion: While all OHCA outcomes improved over time, the educational intervention significantly improved OHCA recognition in Ottawa, and appeared to mitigate the impact of agonal breathing.
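The before-after recognition comparison above is essentially a two-proportion test. The sketch below is a minimal stdlib illustration, not the authors' analysis: the counts are reconstructed from the reported percentages of all suspected cases, so the p-value is only approximate and will not exactly match the published one.

```python
import math

def two_proportion_z_test(x1, n1, x2, n2):
    """Two-sided z-test for the difference between two independent proportions."""
    p1, p2 = x1 / n1, x2 / n2
    pooled = (x1 + x2) / (n1 + n2)
    se = math.sqrt(pooled * (1 - pooled) * (1 / n1 + 1 / n2))
    z = (p2 - p1) / se
    # Two-sided p-value from the standard normal CDF via math.erf
    p_value = 2 * (1 - 0.5 * (1 + math.erf(abs(z) / math.sqrt(2))))
    return z, p_value

# Illustrative counts: Ottawa recognition 65.4% of 540 cases before
# vs. 71.9% of 689 cases after (denominators are assumptions here).
z, p = two_proportion_z_test(round(0.654 * 540), 540, round(0.719 * 689), 689)
print(f"z = {z:.2f}, p = {p:.4f}")
```

With these reconstructed counts the improvement is statistically significant, consistent with the reported direction of the Ottawa result.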
While the trajectory of self-esteem from adolescence to adulthood varies from person to person, little research has examined how differences in early developmental processes might affect these pathways. This study examined how early motor skill development interacted with preterm birth status to predict self-esteem from adolescence through the early 30s. We addressed this using the oldest known, prospectively followed cohort of extremely low birth weight (<1000 g) survivors (N = 179) and normal birth weight controls (N = 145) in the world, born between 1977 and 1982. Motor skills were measured using a performance-based assessment at age 8 and a retrospective self-report, and self-esteem was reported during three follow-up periods (age 12–16, age 22–26, and age 29–36). We found that birth weight status moderated the association between early motor skills and self-esteem. Stable over three decades, the self-esteem of normal birth weight participants was sensitive to early motor skills such that those with poorer motor functioning manifested lower self-esteem, while those with better motor skills manifested higher self-esteem. Conversely, differences in motor skill development did not affect the self-esteem from adolescence to adulthood in individuals born at extremely low birth weight. Early motor skill development may exert differential effects on self-esteem, depending on whether one is born at term or prematurely.
The shaping of planetary nebulae (PNe) by an interaction with a planet is a hypothesis that has been suggested for nearly two decades. However, exploring the idea observationally is challenging, owing to the lack of capabilities needed to detect any evidence of such a scenario. Nonetheless, we propose that the hypothesis can be indirectly tested via a combination of exoplanet formation and evolution theories, the star and planet formation histories of the galaxy, and the tidal evolution of star-planet systems. We present a calculation of the fraction of planetary nebulae in the galaxy today which have undergone an interaction with a planet, concluding that a significant number of visible planetary nebulae may have been shaped by a planet.
Approximately half of the variation in wellbeing measures overlaps with variation in personality traits. Studies of non-human primate pedigrees and human twins suggest that this is due to common genetic influences. We tested whether personality polygenic scores for the NEO Five-Factor Inventory (NEO-FFI) domains and for item response theory (IRT) derived extraversion and neuroticism scores predict variance in wellbeing measures. Polygenic scores were based on published genome-wide association (GWA) results in over 17,000 individuals for the NEO-FFI and in over 63,000 for the IRT extraversion and neuroticism traits. The NEO-FFI polygenic scores were used to predict life satisfaction in 7 cohorts, positive affect in 12 cohorts, and general wellbeing in 1 cohort (maximal N = 46,508). Meta-analysis of these results showed no significant association between NEO-FFI personality polygenic scores and the wellbeing measures. IRT extraversion and neuroticism polygenic scores were used to predict life satisfaction and positive affect in almost 37,000 individuals from UK Biobank. Significant positive associations (effect sizes <0.05%) were observed between the extraversion polygenic score and the wellbeing measures, and a negative association was observed between the neuroticism polygenic score and life satisfaction. Furthermore, using GWA data, genetic correlations of -0.49 and -0.55 were estimated between neuroticism and life satisfaction, and between neuroticism and positive affect, respectively. The moderate genetic correlations between neuroticism and wellbeing are in line with twin research showing that genetic influences on wellbeing are also shared with other independent personality domains.
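A polygenic score of the kind used above is, at its core, a weighted sum of effect-allele counts, with weights taken from the discovery GWAS. The sketch below is a minimal illustration of that idea; the variant IDs and effect sizes are made up, not taken from the actual NEO-FFI or UK Biobank analyses.

```python
# Hypothetical per-variant effect sizes from a discovery GWAS.
gwas_effect_sizes = {"rs0001": 0.12, "rs0002": -0.08, "rs0003": 0.05}

def polygenic_score(genotypes, effects):
    """Sum of effect sizes weighted by effect-allele counts.

    genotypes: dict mapping variant id -> effect-allele count (0, 1 or 2).
    Variants missing from the genotype data contribute zero.
    """
    return sum(effects[snp] * genotypes.get(snp, 0) for snp in effects)

# One hypothetical individual's genotypes at the three variants.
individual = {"rs0001": 2, "rs0002": 1, "rs0003": 0}
score = polygenic_score(individual, gwas_effect_sizes)
print(score)  # 0.12*2 - 0.08*1 + 0.05*0 = 0.16
```

In practice such scores are computed over thousands to millions of variants and then used as predictors in regression models of the target trait, as in the cohort analyses described above.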
In general population samples, better childhood cognitive functioning is associated with decreased risk of depression in adulthood. However, this link has not been examined in extremely low birth weight survivors (ELBW, <1000 g), a group known to have poorer cognition and greater depression risk. This study assessed associations between cognition at age 8 and lifetime risk of major depressive disorder in 84 ELBW survivors and 90 normal birth weight (NBW, ⩾2500 g) individuals up to 29–36 years of age. The Wechsler Intelligence Scale for Children, Revised (WISC-R), Raven’s Coloured Progressive Matrices and the Token Test assessed general, fluid, and verbal intelligence, respectively, at 8 years of age. Lifetime major depressive disorder was assessed using the Mini International Neuropsychiatric Interview at age 29–36 years. Associations were examined using logistic regression adjusted for childhood socioeconomic status, educational attainment, age, sex, and marital status. Neither overall intelligence quotient (IQ) [WISC-R Full-Scale IQ, odds ratios (OR)=0.87, 95% confidence interval (CI)=0.43–1.77], fluid intelligence (WISC-R Performance IQ, OR=0.98, 95% CI=0.48–2.00), nor verbal intelligence (WISC-R Verbal IQ, OR=0.81, 95% CI=0.40–1.63) predicted lifetime major depression in ELBW survivors. However, every standard deviation increase in WISC-R Full-Scale IQ (OR=0.43, 95% CI=0.20–0.92) and Performance IQ (OR=0.46, 95% CI=0.21–0.97), and each one point increase on the Token Test (OR=0.80, 95% CI=0.67–0.94) at age 8 was associated with a reduced risk of lifetime depression in NBW participants. Higher childhood IQ, better fluid intelligence, and greater verbal comprehension in childhood predicted reduced depression risk in NBW adults. Our findings suggest that ELBW survivors may be less protected by superior cognition than NBW individuals.
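Reported odds ratios and their confidence intervals can be sanity-checked on the log-odds scale, where Wald intervals are symmetric. The stdlib sketch below (not the authors' code) recovers the logistic-regression coefficient, its standard error, and an approximate p-value from the published Full-Scale IQ figures for NBW participants.

```python
import math

def logodds_from_or(or_point, ci_low, ci_high, z_crit=1.96):
    """Recover the log-odds coefficient, its standard error, and an
    approximate two-sided p-value from a reported odds ratio and
    95% Wald confidence interval."""
    beta = math.log(or_point)
    se = (math.log(ci_high) - math.log(ci_low)) / (2 * z_crit)
    z = beta / se
    p = 2 * (1 - 0.5 * (1 + math.erf(abs(z) / math.sqrt(2))))
    return beta, se, p

# Reported for NBW participants: each SD increase in WISC-R Full-Scale IQ
# gave OR=0.43 (95% CI 0.20-0.92) for lifetime major depression.
beta, se, p = logodds_from_or(0.43, 0.20, 0.92)
print(f"beta = {beta:.2f}, SE = {se:.2f}, p = {p:.3f}")
```

The recovered p-value falls below 0.05, consistent with the interval excluding an odds ratio of 1; the midpoint of the CI on the log scale also lands close to log(0.43), as a symmetric Wald interval should.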
Tail lesions are important pig welfare indicators that could be recorded during meat inspection as they are more visible on the carcass than on the live animal. Tail biting is associated with reduced performance in the bitten pig, but it is not clear whether problems with tail biting are reflected in general farm performance figures. Farm advisory services aim to improve farm productivity which could be associated with improvements in pig welfare. Record keeping forms an integral part of such advisory services. The aim of this study was to examine the influence of record keeping in the Teagasc eProfit Monitor (ePM herds) on the prevalence of tail lesion severity scores in Irish slaughter pigs. In addition, we investigated associations between the prevalence of tail lesion scores and production parameters at farm level in ePM herds. Pigs were observed after scalding/dehairing and tail lesion score (0 to 4), sex and farm identification were recorded. Tail lesion scores were collapsed into none/mild lesions (score ⩽1), moderate lesions (score 2) and severe lesions (score ⩾3). The effect of record keeping (ePM herd) on the different tail lesion outcomes was analysed at batch level using the events/trials structure in generalized linear mixed models (PROC GLIMMIX). Spearman’s rank correlations were calculated between average tail lesion score of a batch and production parameters. A total of 13 133 pigs were assessed from 73 batches coming from 61 farms. In all, 23 farms were identified as ePM herds. The average prevalence of moderate tail lesions was 26.8% and of severe tail lesions was 3.4% in a batch. Batches coming from ePM herds had a lower prevalence of moderate tail lesions than non-ePM herds (P<0.001). Average tail lesion score was negatively associated with age (P<0.05) and weight (P<0.05) at sale/transfer of weaners, and tended to be positively associated with the number of finishing days (P=0.06). 
In addition, the prevalence of severe tail lesions was negatively associated with average daily gain in weaners (P<0.05) and tended to be negatively associated with average daily gain in finishers (P=0.08). This study provides the first indication that record keeping through an advisory service may help to lower the risk of tail biting, which in turn is associated with improved farm performance.
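Spearman's rank correlation, used above to relate batch-average tail lesion score to production parameters, is Pearson's correlation computed on ranks. A minimal pure-Python sketch follows; the batch data are invented for illustration only.

```python
def ranks(xs):
    """Average ranks (1-based), with tied values sharing their mean rank."""
    order = sorted(range(len(xs)), key=lambda i: xs[i])
    r = [0.0] * len(xs)
    i = 0
    while i < len(order):
        j = i
        # Extend j over any run of tied values.
        while j + 1 < len(order) and xs[order[j + 1]] == xs[order[i]]:
            j += 1
        avg = (i + j) / 2 + 1  # mean of the 1-based positions i..j
        for k in range(i, j + 1):
            r[order[k]] = avg
        i = j + 1
    return r

def spearman(xs, ys):
    """Spearman's rho: Pearson correlation of the rank vectors."""
    rx, ry = ranks(xs), ranks(ys)
    mx, my = sum(rx) / len(rx), sum(ry) / len(ry)
    num = sum((a - mx) * (b - my) for a, b in zip(rx, ry))
    den = (sum((a - mx) ** 2 for a in rx) * sum((b - my) ** 2 for b in ry)) ** 0.5
    return num / den

# Hypothetical batches: average lesion score vs. weaner daily gain (g/day).
lesion = [0.2, 0.5, 0.8, 1.1, 1.4]
adg = [480, 455, 440, 430, 410]
print(spearman(lesion, adg))  # -1.0: perfectly monotonically decreasing
```

A negative rho like this is the pattern the study reports: batches with higher lesion scores tend to show lower daily gain, though the real correlations are far from perfect.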
The study of large-scale structure through QSO clustering provides a potentially powerful route to determining the fundamental cosmological parameters of the Universe (see Croom & Shanks 1996). Unfortunately, previous QSO clustering studies have been limited by the relatively small sizes of homogeneous QSO catalogues that have been available. Although approximately 10,000 QSOs are now known (Veron-Cetty & Veron 1997), the largest catalogues suitable for clustering studies contain only 500–1000 QSOs (Boyle et al. 1990, Crampton et al. 1990, Hewett et al. 1994). Even combining all such suitable catalogues, the total number of QSOs which can be used for clustering studies is still only about 2000.
Observations that radio-quiet QSOs exist in average galaxy cluster environments (Smith et al. 1995, and references therein) demonstrate that QSOs can be used to derive important information on the structure of the Universe at the largest scales. Previous studies of QSO clustering have been frustrated by the lack of large QSO redshift surveys. Although QSO clustering is detected in the largest existing QSO catalogues (see Shanks & Boyle 1994), it is difficult to place strong limits on the cosmological evolution of QSO clustering, or on the level of clustering at large scales (>10 h^-1 Mpc), with current QSO catalogues.
Introduction: The goal of this study was to determine if emergency department (ED) surge and end-of-shift assessment of patients affect the extent of diagnostic tests and therapeutic interventions performed, and the accuracy of diagnosis, prior to referral of patients to Internal Medicine, as well as the impact on patient outcomes. Methods: This study was a health records review of consecutive patients referred to the internal medicine service with an ED diagnosis of heart failure, COPD or sepsis, at two tertiary care EDs. We developed a scoring system in consultation with senior emergency and internal medicine physicians to uniformly assess the treatments and investigations performed for patients diagnosed in the ED with heart failure, COPD or sepsis. These scores were then correlated with surge levels and time of day at patient assessment and disposition. Rates of admission and diagnosis disagreements were also assessed. Results: We included 308 patients (101 with heart failure, 101 with COPD, 106 with sepsis). Comparing middle of shift to end of shift, the overall weighted mean scores were 92.2% vs. 91.7% for investigations and 73.5% vs. 70.0% for treatments. Comparing low to high surge times, the overall weighted mean scores were 89.9% vs. 92.6% for investigations and 68.6% vs. 71.7% for treatments. Evaluating each condition separately for investigations and treatments according to time of shift or surge conditions, there were no consistent differences in scores. We found overall high admission rates (93.1% for heart failure, 91.1% for COPD, 96.2% for sepsis patients) and low rates of diagnosis disagreement (4.0% heart failure, 10.9% COPD, 8.5% sepsis). Conclusion: We found that surge levels and end of shift did not impact the extent of investigations and treatments provided to patients diagnosed in the emergency department with heart failure, COPD or sepsis and referred to internal medicine.
Admission rates for the patients referred were above 90% and there were very few diagnosis disagreements or diversion to alternate service by internal medicine. We believe this supports the emergency physician's ability to adapt to time and surge constraints, particularly in the context of commonly encountered conditions.
There is increasing interest in developing abattoir-based measures to assist in determining the welfare status of pigs. The primary aim of this study was to determine the most appropriate place on the slaughter line to conduct assessments of welfare-related lesions, namely apparent aggression-related skin lesions (hereafter referred to as ‘skin lesions’), loin bruising and apparent tail biting damage. The study also lent itself to an assessment of the prevalence of these lesions, and the extent to which they were linked with production variables. Finishing pigs processed at two abattoirs on the Island of Ireland (n=1950 in abattoir A, and n=1939 in abattoir B) were used. Data were collected over 6 days in each abattoir in July 2014. Lesion scoring took place at two points on the slaughter line: (1) at exsanguination (slaughter stage 1 (SS1)), and (2) following scalding and dehairing of carcasses (slaughter stage 2 (SS2)). At both points, each carcass was assigned a skin and tail lesion score ranging from 0 (lesion absent) to 3 or 4 (severe lesions), respectively. Loin bruising was recorded as present or absent. Differences in the percentage of pigs with observable lesions of each type were compared between SS1 and SS2 using McNemar/McNemar-Bowker tests. The associations between each lesion type, and both cold carcass weight and condemnations, were examined at batch level using Pearson’s correlations. Batch was defined as the group of animals with a particular farm identification code on a given day. The overall percentage of pigs with a visible skin lesion (i.e. score>0) decreased between SS1 and SS2 (P<0.001). However, the percentage of pigs with a severe skin lesion increased numerically from SS1 to SS2. The percentage of pigs with a visible tail lesion and with loin bruising also increased between SS1 and SS2 (P<0.001). 
There was a positive correlation between the percentage of carcasses that were partially condemned, and the percentage of pigs with skin lesions, tail lesions and loin bruising (P<0.05). In addition, as the batch-level frequency of each lesion type increased, average cold carcass weight decreased (P<0.001). These findings suggest that severe skin lesions, tail lesions and loin bruising are more visible on pig carcasses after they have been scalded and dehaired, and that this is when abattoir-based lesion scoring should take place. The high prevalence of all three lesion types, and the links with economically important production parameters, suggests that more research into identifying key risk factors is warranted.
Immunoglobulin A (IgA) is a predominant immunoglobulin present in human breast milk and is known to play an important role in the maturation of infant gut immunity. Breast milk composition varies between populations, but the environmental and maternal factors responsible for these variations are still unclear. The objective of this study was to examine whether the exposures analysed influence levels of IgA in colostrum. The present study used 294 colostrum samples from the MecMilk International cohort, collected from women residing in London, Moscow and Verona. Samples were analysed using an automated Abbott Architect analyser. We found an inverse correlation between time postpartum and colostrum total IgA level (r=−0.49, P<0.001). Adjusting for maternal parity, smoking, fresh fruit and fish consumption, and allergen sensitization, a multiple regression model showed that IgA levels were influenced by colostrum collection time (P<0.0001) and country of collection (P<0.01). The influence of mode of delivery did not appear significant in univariate comparisons, but once adjusted for the above maternal characteristics it showed a significant influence on total IgA (P=0.01). We conclude that the concentration of IgA in colostrum drops rapidly after birth, and future studies should always consider this factor in analysis. IgA concentration varied significantly between countries, with the highest level detected in Moscow and the lowest in Verona. The effect of mode of delivery should be confirmed in larger cohorts. Further work is needed to determine ways to correct for IgA decline over time in colostrum, and to find the cause of variations in IgA levels between the countries.
Objective: To assess an intervention to limit community-associated methicillin-resistant Staphylococcus aureus (MRSA) dissemination.
Design: Randomized, controlled trial.
Setting: County Jail, Dallas, Texas.
Participants: A total of 4,196 detainees in 68 detention tanks.
Intervention: Tanks were randomly assigned to 1 of 3 groups: in group 1, detainees received cloths that contained chlorhexidine gluconate (CHG) to clean their entire skin surface 3 times per week for 6 months; group 2 received identical cloths containing only water; and group 3 received no skin treatment. During the study, all newly arrived detainees were invited to enroll. Nares and hand cultures were obtained at baseline and from all current enrollees at 2 and 6 months.
Results: At baseline, S. aureus was isolated from 41.2% and MRSA from 8.0% (nares and/or hand) of 947 enrollees. The average participation rate was 47%. At 6 months, MRSA carriage was 10.0% in group 3 and 8.7% in group 1 tanks (estimated absolute risk reduction [95% confidence interval (CI)], 1.4% [−4.8% to 7.1%]; P = .655). At 6 months, carriage of any S. aureus was 51.1% in group 3, 40.7% in group 1 (absolute risk reduction [95% CI], 10.4% [0.01%–20.1%]; P = .047), and 42.8% (absolute risk reduction [95% CI], 8.3% [−1.4% to 18.0%]; P = .099) in group 2.
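An absolute risk reduction of this kind is the difference between two carriage proportions, with a confidence interval built from their standard errors. The stdlib sketch below uses a simple Wald interval; the per-group sample sizes are assumed for illustration, and the study's own interval accounts for clustering by detention tank, so the numbers will not match the published CI exactly.

```python
import math

def absolute_risk_reduction(x_ctrl, n_ctrl, x_trt, n_trt, z_crit=1.96):
    """Absolute risk reduction (control minus treatment) with a Wald 95% CI.

    Ignores any clustering of observations, so the interval is narrower
    than one from a cluster-adjusted analysis.
    """
    p_ctrl, p_trt = x_ctrl / n_ctrl, x_trt / n_trt
    arr = p_ctrl - p_trt
    se = math.sqrt(p_ctrl * (1 - p_ctrl) / n_ctrl + p_trt * (1 - p_trt) / n_trt)
    return arr, arr - z_crit * se, arr + z_crit * se

# Illustrative: S. aureus carriage of 51.1% (group 3) vs. 40.7% (group 1),
# assuming 150 enrollees per group (the true group sizes are not given here).
arr, lo, hi = absolute_risk_reduction(round(0.511 * 150), 150, round(0.407 * 150), 150)
print(f"ARR = {arr:.1%} (95% CI {lo:.1%} to {hi:.1%})")
```

The point estimate reproduces the reported ~10% reduction in S. aureus carriage; whether the interval excludes zero depends on the true group sizes and the cluster adjustment.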
Conclusions: Skin cleaning with CHG for 6 months in detainees, compared with no intervention, significantly decreased carriage of S. aureus, and use of water cloths produced a nonsignificant but similar decrease. A nonsignificant decrease in MRSA carriage was found with CHG cloth use.