Objective:
To characterise the parenting priorities of mothers and fathers of infants hospitalised with congenital heart disease (CHD) and to generate recommendations to support parenting during infant hospitalisation.
Study design:
Through online crowdsourcing, an innovative research methodology in which an online community serves as a research sample, 79 parents of young children with CHD responded to questions about parenting during hospitalisation via a private social networking site. Responses were analysed using qualitative research methods.
Results:
Three broad themes were identified: (1) establishing a bond with my baby, (2) asserting the parental role, and (3) coping with fear and uncertainty. Parents value provider support in restoring normalcy to the parenting experience during infant hospitalisation.
Conclusions:
Care teams can support parenting during infant hospitalisation by promoting parents’ roles as primary caretakers and decision-makers and attending to the emotional impact of infant hospitalisation on the family.
Praziquantel (PZQ) is the drug of choice for schistosomiasis. The potential for drug resistance necessitates the search for adjunct or alternative therapies to PZQ. Previous functional genomics has shown that RNAi inhibition of the Ca2+/calmodulin-dependent protein kinase II (CaMKII) gene in Schistosoma adult worms significantly improved the effectiveness of PZQ. Here we tested the in vitro efficacy of 15 selective and non-selective CaMK inhibitors against Schistosoma mansoni and showed that PZQ efficacy against refractory juvenile parasites was improved when combined with these CaMK inhibitors. By measuring CaMK activity and the mobility of adult S. mansoni, we identified two non-selective CaMK inhibitors, staurosporine (STSP) and 1-naphthyl PP1 (1NAPP1), as promising candidates for further study. The impact of STSP and 1NAPP1, in the presence or absence of a sub-lethal dose of PZQ, was investigated against 2- and 7-day-old schistosomula and adults in mice infected with S. mansoni. Treatment with STSP/PZQ induced a significant (47–68%) reduction in liver egg burden compared with mice treated with PZQ alone. The findings indicate that the combination of STSP and PZQ significantly improved anti-schistosomal activity compared with PZQ alone, demonstrating the potential of selective and non-selective CaMK/kinase inhibitors as a combination therapy with PZQ in treating schistosomiasis.
Toxoplasma gondii rhoptry protein TgROP18 is a polymorphic virulence effector that targets immunity-related GTPases (IRGs) in rodents. Given that IRGs are uniquely diversified in rodents and not in other T. gondii intermediate hosts, the role of TgROP18 in manipulating non-rodent cells is unclear. Here we show that in human cells TgROP18I interacts with the interferon-gamma-inducible protein N-myc and STAT interactor (NMI), and that this property is unique to the type I TgROP18 allele. Specifically, when expressed ectopically in mammalian cells, only TgROP18I co-immunoprecipitates with NMI in IFN-γ-treated cells, while TgROP18II does not. In parasites expressing TgROP18I or TgROP18II, NMI only co-immunoprecipitates with TgROP18I, and this is associated with allele-specific immunolocalization of NMI on the parasitophorous vacuolar membrane (PVM). We also found that TgROP18I reduces NMI association with IFN-γ-activated sequences (GAS) in the IRF1 gene promoter. Finally, we determined that polymorphisms in the C-terminal kinase domain of TgROP18I are required for the allele-specific effects on NMI. Together, these data further define a new host pathway targeted by TgROP18I and provide the first example of a function driven by allelic differences in the highly polymorphic ROP18 locus.
We investigated the impact of eszopiclone 3 mg on next-day driving ability (on-the-road brake reaction time, BRT) and on cognitive and psychomotor performance in patients with primary insomnia.
Methods:
Patients with DSM-IV primary insomnia completed this study. Treatment was administered 30 min before bedtime, and next-day driving ability was assessed by on-the-road BRT approximately 9.5 hours postdose. A cognitive test battery measured residual effects on information processing, divided attention, psychomotor tasks, and working memory. Overnight polysomnography was conducted to assess sleep architecture; subjective ratings of morning sedation and sleep quality were also obtained.
Results:
There were no significant differences in BRT following night-time administration of eszopiclone 3 mg compared with placebo (p=0.39), and there were no significant differences in objective cognitive tests of information processing, divided attention, psychomotor tasks and working memory (p values>0.15). No significant effect on subjective next-day ratings of morning sedation, coordination or mood was observed (p values>0.22). There was improvement compared with placebo (p<0.0001) in subjective ease of getting to sleep and quality of sleep the morning following dosing, and no perceived impairment of behavior following awakening or early morning awakenings. Polysomnography demonstrated significant improvements in sleep onset and maintenance.
Conclusion:
In this study, the first to assess next-day on-the-road driving in primary insomniacs following hypnotic use, eszopiclone 3 mg improved both objective and subjective measures of sleep onset and maintenance without residual impairment of next-day driving ability or cognitive and psychomotor performance.
Support for this study was provided by Sepracor Inc., Marlborough, MA.
Nurse sow strategies are used to manage large litters on commercial pig farms. However, new-born piglets transferred to nurse sows in late lactation might be compromised in terms of growth and survival. We investigated the effects of two nurse sow strategies on piglet growth, suckling behaviour and sow nursing behaviour. At 1 day post-farrowing, the four heaviest piglets from large litters were transferred to a nurse sow either 21 (1STEP21, n=9 litters) or 7 (2STEP7, n=10 litters) days into lactation. The remainder of the litter remained with their mother and was either kept intact (remain intact (RI), n=10 litters) or had some piglets cross-fostered to equalise birth weights (remain equalised (RE), n=9 litters). The 7-day-old piglets from 2STEP7 were transferred onto a sow 21 days into lactation (2STEP21, n=10 litters). The growth of new-born piglets on 1STEP21 and 2STEP7 nurse sows was initially lower than in RI litters (F3,33.8=4.61; P<0.01), but weaning weights did not significantly differ (F4,32.7=0.78; P>0.5). After the first week of lactation, the weights and growth rates did not differ between treatments. Fighting behaviour during nursing bouts decreased over time. The frequency of fights was higher in 1STEP21 and 2STEP21 litters compared with RI litters (t122=3.06 and t123=3.00, respectively, P<0.05). The 2STEP21 litters had shorter nursing bouts than RI and 1STEP21 litters (t107=−2.81 and t81.7=2.8, respectively, P<0.05), and these bouts were more frequently terminated by 2STEP21 sows than by RI sows (t595=2.93; P<0.05). Transferring the heaviest piglets from RI and RE litters to nurse sows reduced the percentage of teat changes during nursing bouts (RI: F1,275=16.61; RE: F1,308=43.59; P<0.001). In conclusion, nurse sow strategies do not appear to compromise piglet growth. However, new-born piglets transferred onto sows in late lactation experienced more competition at the udder, suggesting that the sow's stage of lactation is important to how well nurse sow strategies work. Thus, the two-step nurse sow strategy is likely the best option (in relation to growth and suckling behaviour), as it minimises the difference between piglet age and sow stage of lactation.
Management strategies are needed to optimise the number of piglets weaned from hyper-prolific sows. Nurse sow strategies involve transferring supernumerary new-born piglets onto a sow whose own piglets are either weaned or fostered onto another sow. Such ‘nurse sows’ have extended lactations spent in farrowing crates, which could have negative implications for their welfare. This study used 47 sows, 20 of which farrowed large litters and had their biggest piglets fostered onto nurse sows which were either 1 week (2STEP7, n=9) or 3 weeks into lactation (1STEP21, n=10). Sows from which piglets were removed (R) were either left with the remainder of the litter intact (I) (remain intact (RI) sows, n=10), or had their litters equalised (E) for birth weight using piglets of the same age from non-experimental sows (remain equalised (RE) sows, n=9). Piglets from 2STEP7 were fostered onto another nurse sow which was 3 weeks into lactation (2STEP21, n=9). Back-fat thickness was measured at entry to the farrowing house, at fostering (nurse sows only) and weaning. Sows were scored for ease of locomotion and skin and claw lesions at entry to the farrowing house and weaning. Salivary cortisol samples were collected and tear staining was scored at 0900 h weekly from entry until weaning. Saliva samples were also taken at fostering. Data were analysed using GLMs with appropriate random and repeated factors, or non-parametric tests were applied where appropriate. Back-fat thickness decreased between entry and weaning for all sows (F1,42=26.59, P<0.001) and tended to differ between treatments (F4,16=2.91; P=0.06). At weaning, RI sows had lower limb lesion scores than 2STEP7 and RE sows (χ2(4)=10.8, P<0.05). No treatment effects were detected on salivary cortisol concentrations (P>0.05), and all nurse sows had a higher salivary cortisol concentration at fostering, compared with the other days (F10,426=3.47; P<0.05). Acute effects of fostering differed between nurse sow treatments (F2,113=3.45, P<0.05); 2STEP7 sows had a higher salivary cortisol concentration than 1STEP21 and 2STEP21 sows on the day of fostering. Tear staining scores were not influenced by treatment (P>0.05). In conclusion, no difference was detected between nurse sows and non-nurse sows in body condition or severity of lesions. Although some nurse sows experienced stress at fostering, no long-term effect of the nurse sow strategies was detected on stress levels compared with sows that raised their own litter.
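As a hedged sketch of the kind of repeated-measures mixed model described above (weekly salivary cortisol with sow as a random effect), the Python snippet below uses simulated data; the treatment labels and column names are assumptions, not the study's dataset or its GLM specification.

```python
# Illustrative repeated-measures mixed model: cortisol ~ treatment + sampling day,
# with sow fitted as a random (grouping) effect. Simulated data only.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(6)
n_sows, n_days = 47, 5
df = pd.DataFrame({
    "sow": np.repeat(np.arange(n_sows), n_days),
    "day": np.tile(np.arange(n_days), n_sows),
    "treatment": np.repeat(
        rng.choice(["RI", "RE", "1STEP21", "2STEP7", "2STEP21"], n_sows), n_days),
})
df["cortisol"] = 3 + 0.1 * df["day"] + rng.normal(0, 1, len(df))  # simulated response

fit = smf.mixedlm("cortisol ~ treatment + C(day)", df, groups=df["sow"]).fit()
print(fit.summary())
```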
Introduction: Most ambulance communication officers receive minimal education on agonal breathing, often leading to unrecognized out-of-hospital cardiac arrest (OHCA). We sought to evaluate the impact of an educational program on cardiac arrest recognition, and on bystander CPR and survival rates. Methods: Ambulance communication officers in Ottawa, Canada, received additional training on agonal breathing, while the control site (Windsor, Canada) did not. Sites were compared to their pre-study performance (before-after design), and to each other (concurrent control). Trained investigators used a piloted, standardized data collection tool when reviewing the recordings for all potential OHCA cases submitted. OHCA was confirmed using our local OHCA registry, and we requested 9-1-1 recordings for OHCA cases not initially suspected. Two independent investigators reviewed medical records for non-OHCA cases receiving telephone-assisted CPR in Ottawa. We present descriptive and chi-square statistics. Results: There were 988 confirmed and suspected OHCA cases in the “before” group (540 Ottawa; 448 Windsor), and 1,076 in the “after” group (689 Ottawa; 387 Windsor). Characteristics of “after” group OHCA patients were: mean age (68.1 Ottawa, 68.2 Windsor); male sex (68.5% Ottawa, 64.8% Windsor); witnessed (45.0% Ottawa, 41.9% Windsor); and initial rhythm VF/VT (28.9% Ottawa, 22.5% Windsor). Before-after comparisons were: for cardiac arrest recognition (from 65.4% to 71.9% in Ottawa, p=0.03; from 70.9% to 74.1% in Windsor, p=0.37); for bystander CPR rates (from 23.0% to 35.9% in Ottawa, p=0.0001; from 28.2% to 39.4% in Windsor, p=0.001); and for survival to hospital discharge (from 4.1% to 12.5% in Ottawa, p=0.001; from 3.9% to 6.9% in Windsor, p=0.03). “After” group comparisons between Ottawa and Windsor (control) were not statistically different, except survival (p=0.02). Agonal breathing was common (25.6% Ottawa, 22.4% Windsor) and present in 18.5% of missed cases (15.8% Ottawa, 22.2% Windsor; p=0.27). In Ottawa, 31 patients not in OHCA received chest compressions resulting from telephone-assisted CPR instructions. None suffered injury or adverse effects. Conclusion: While all OHCA outcomes improved over time, the educational intervention significantly improved OHCA recognition in Ottawa, and appeared to mitigate the impact of agonal breathing.
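As a hedged illustration of the before-after chi-square comparison described above (not the study's analysis code), the sketch below uses hypothetical counts chosen to be consistent with the reported Ottawa recognition rates.

```python
# Chi-square test comparing cardiac arrest recognition before vs. after training.
# Counts are hypothetical reconstructions (~65.4% of 540 and ~71.9% of 689).
import numpy as np
from scipy.stats import chi2_contingency

# rows: before / after; columns: recognised / not recognised
table = np.array([
    [353, 187],
    [495, 194],
])

chi2, p, dof, expected = chi2_contingency(table, correction=False)
print(f"chi2 = {chi2:.2f}, dof = {dof}, p = {p:.3f}")
```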
While the trajectory of self-esteem from adolescence to adulthood varies from person to person, little research has examined how differences in early developmental processes might affect these pathways. This study examined how early motor skill development interacted with preterm birth status to predict self-esteem from adolescence through the early 30s. We addressed this using the oldest known, prospectively followed cohort of extremely low birth weight (<1000 g) survivors (N = 179) and normal birth weight controls (N = 145) in the world, born between 1977 and 1982. Motor skills were measured using a performance-based assessment at age 8 and a retrospective self-report, and self-esteem was reported during three follow-up periods (age 12–16, age 22–26, and age 29–36). We found that birth weight status moderated the association between early motor skills and self-esteem. In a pattern that was stable over three decades, the self-esteem of normal birth weight participants was sensitive to early motor skills: those with poorer motor functioning manifested lower self-esteem, while those with better motor skills manifested higher self-esteem. Conversely, differences in motor skill development did not affect self-esteem from adolescence to adulthood in individuals born at extremely low birth weight. Early motor skill development may exert differential effects on self-esteem, depending on whether one is born at term or prematurely.
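A moderation effect of this kind is typically tested with an interaction term in a regression model. The sketch below is purely illustrative (simulated data, assumed variable names), not the study's analysis.

```python
# Moderation sketch: self-esteem regressed on early motor skills, birth weight
# group, and their interaction; the interaction term carries the moderation.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
n = 324
df = pd.DataFrame({
    "motor_skills": rng.normal(0, 1, n),   # standardized age-8 motor score (simulated)
    "elbw": rng.integers(0, 2, n),         # 1 = extremely low birth weight (simulated)
})
# simulate a world where motor skills matter only for the normal birth weight group
df["self_esteem"] = 3 + 0.4 * df["motor_skills"] * (1 - df["elbw"]) + rng.normal(0, 1, n)

model = smf.ols("self_esteem ~ motor_skills * elbw", data=df).fit()
print(model.summary().tables[1])   # the motor_skills:elbw row captures the moderation
```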
The shaping of planetary nebulae (PNe) as a result of an interaction with a planet is a hypothesis that has been suggested for nearly two decades. However, exploring the idea observationally is challenging because the capabilities needed to detect any evidence of such a scenario are lacking. Nonetheless, we propose that the hypothesis can be indirectly tested via a combination of exoplanet formation and evolution theories, the star and planet formation histories of the Galaxy, and the tidal evolution of star-planet systems. We present a calculation of the fraction of PNe in the Galaxy today that have undergone an interaction with a planet, concluding that a significant number of visible PNe may have been shaped by a planet.
Approximately half of the variation in wellbeing measures overlaps with variation in personality traits. Studies of non-human primate pedigrees and human twins suggest that this is due to common genetic influences. We tested whether personality polygenic scores for the NEO Five-Factor Inventory (NEO-FFI) domains and for item response theory (IRT)-derived extraversion and neuroticism scores predict variance in wellbeing measures. Polygenic scores were based on published genome-wide association (GWA) results in over 17,000 individuals for the NEO-FFI and in over 63,000 for the IRT extraversion and neuroticism traits. The NEO-FFI polygenic scores were used to predict life satisfaction in 7 cohorts, positive affect in 12 cohorts, and general wellbeing in 1 cohort (maximal N = 46,508). Meta-analysis of these results showed no significant association between NEO-FFI personality polygenic scores and the wellbeing measures. IRT extraversion and neuroticism polygenic scores were used to predict life satisfaction and positive affect in almost 37,000 individuals from UK Biobank. Significant positive associations (effect sizes <0.05%) were observed between the extraversion polygenic score and wellbeing measures, and a negative association was observed between the neuroticism polygenic score and life satisfaction. Furthermore, using GWA data, genetic correlations of −0.49 and −0.55 were estimated between neuroticism and life satisfaction and between neuroticism and positive affect, respectively. The moderate genetic correlation between neuroticism and wellbeing is in line with twin research showing that genetic influences on wellbeing are also shared with other independent personality domains.
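The basic association test described above amounts to regressing a wellbeing measure on a polygenic score and inspecting the (very small) variance explained. The sketch below is a hedged illustration only; the data and column names are assumptions, not UK Biobank data or the study's pipeline.

```python
# Polygenic score association sketch: wellbeing ~ polygenic score, with simulated
# data mirroring the tiny (<0.05%) variance explained reported above.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(1)
n = 37_000
df = pd.DataFrame({
    "extraversion_pgs": rng.normal(0, 1, n),
    "life_satisfaction": rng.normal(0, 1, n),
})
df["life_satisfaction"] += 0.02 * df["extraversion_pgs"]   # small simulated true effect

fit = smf.ols("life_satisfaction ~ extraversion_pgs", data=df).fit()
print(f"beta = {fit.params['extraversion_pgs']:.4f}, "
      f"p = {fit.pvalues['extraversion_pgs']:.2e}, R^2 = {fit.rsquared:.5f}")
```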
In general population samples, better childhood cognitive functioning is associated with decreased risk of depression in adulthood. However, this link has not been examined in extremely low birth weight survivors (ELBW, <1000 g), a group known to have poorer cognition and greater depression risk. This study assessed associations between cognition at age 8 and lifetime risk of major depressive disorder in 84 ELBW survivors and 90 normal birth weight (NBW, ⩾2500 g) individuals up to 29–36 years of age. The Wechsler Intelligence Scale for Children, Revised (WISC-R), Raven’s Coloured Progressive Matrices and the Token Test assessed general, fluid, and verbal intelligence, respectively, at 8 years of age. Lifetime major depressive disorder was assessed using the Mini International Neuropsychiatric Interview at age 29–36 years. Associations were examined using logistic regression adjusted for childhood socioeconomic status, educational attainment, age, sex, and marital status. Neither overall intelligence quotient (IQ) [WISC-R Full-Scale IQ, odds ratio (OR)=0.87, 95% confidence interval (CI)=0.43–1.77], fluid intelligence (WISC-R Performance IQ, OR=0.98, 95% CI=0.48–2.00), nor verbal intelligence (WISC-R Verbal IQ, OR=0.81, 95% CI=0.40–1.63) predicted lifetime major depression in ELBW survivors. However, every standard deviation increase in WISC-R Full-Scale IQ (OR=0.43, 95% CI=0.20–0.92) and Performance IQ (OR=0.46, 95% CI=0.21–0.97), and each one-point increase on the Token Test (OR=0.80, 95% CI=0.67–0.94) at age 8 was associated with a reduced risk of lifetime depression in NBW participants. Higher childhood IQ, better fluid intelligence, and greater verbal comprehension in childhood predicted reduced depression risk in NBW adults. Our findings suggest that ELBW survivors may be less protected by superior cognition than NBW individuals.
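The adjusted logistic regressions above report odds ratios per standard deviation of IQ with 95% confidence intervals. The sketch below illustrates that kind of model on simulated data; the variable names and covariate coding are assumptions, not the study's dataset.

```python
# Adjusted logistic regression sketch: lifetime depression ~ standardized IQ + covariates,
# reporting the odds ratio per SD of IQ and its 95% CI. Simulated data only.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(2)
n = 174
df = pd.DataFrame({
    "iq_z": rng.normal(0, 1, n),        # Full-Scale IQ, standardized (simulated)
    "ses": rng.integers(1, 6, n),       # childhood socioeconomic status (simulated)
    "sex": rng.integers(0, 2, n),
    "age": rng.normal(32, 2, n),
})
logit_p = -1.0 - 0.6 * df["iq_z"]       # higher IQ -> lower simulated depression risk
df["depression"] = rng.binomial(1, 1 / (1 + np.exp(-logit_p)))

fit = smf.logit("depression ~ iq_z + ses + sex + age", data=df).fit(disp=False)
or_per_sd = np.exp(fit.params["iq_z"])
ci_low, ci_high = np.exp(fit.conf_int().loc["iq_z"])
print(f"OR per SD of IQ = {or_per_sd:.2f} (95% CI {ci_low:.2f}-{ci_high:.2f})")
```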
Tail lesions are important pig welfare indicators that could be recorded during meat inspection as they are more visible on the carcass than on the live animal. Tail biting is associated with reduced performance in the bitten pig, but it is not clear whether problems with tail biting are reflected in general farm performance figures. Farm advisory services aim to improve farm productivity, which could be associated with improvements in pig welfare. Record keeping forms an integral part of such advisory services. The aim of this study was to examine the influence of record keeping in the Teagasc eProfit Monitor (ePM herds) on the prevalence of tail lesion severity scores in Irish slaughter pigs. In addition, we investigated associations between the prevalence of tail lesion scores and production parameters at farm level in ePM herds. Pigs were observed after scalding/dehairing, and tail lesion score (0 to 4), sex and farm identification were recorded. Tail lesion scores were collapsed into none/mild lesions (score ⩽1), moderate lesions (score 2) and severe lesions (score ⩾3). The effect of record keeping (ePM herd) on the different tail lesion outcomes was analysed at batch level using the events/trials structure in generalized linear mixed models (PROC GLIMMIX). Spearman’s rank correlations were calculated between average tail lesion score of a batch and production parameters. A total of 13 133 pigs were assessed from 73 batches coming from 61 farms. In all, 23 farms were identified as ePM herds. The average prevalence of moderate tail lesions was 26.8% and of severe tail lesions was 3.4% in a batch. Batches coming from ePM herds had a lower prevalence of moderate tail lesions than non-ePM herds (P<0.001). Average tail lesion score was negatively associated with age (P<0.05) and weight (P<0.05) at sale/transfer of weaners, and tended to be positively associated with the number of finishing days (P=0.06). In addition, the prevalence of severe tail lesions was negatively associated with average daily gain in weaners (P<0.05) and tended to be negatively associated with average daily gain in finishers (P=0.08). This study provides the first indication that record keeping through an advisory service may help to lower the risk of tail biting, and that a lower risk of tail biting is associated with improved farm performance.
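The batch-level analysis above uses an events/trials binomial structure. As a rough, hedged illustration of that structure in Python (not the study's SAS code, and omitting the random farm effects fitted by PROC GLIMMIX), a sketch with hypothetical batch data might look like this:

```python
# Events/trials binomial GLM sketch: moderate tail lesions per batch vs. ePM status.
# All data below are hypothetical; no random effects are included.
import numpy as np
import pandas as pd
import statsmodels.api as sm

rng = np.random.default_rng(3)
n_batches = 73
epm_herd = rng.integers(0, 2, n_batches)            # 1 = keeps eProfit Monitor records
pigs = rng.integers(120, 220, n_batches)            # trials per batch
p = np.where(epm_herd == 1, 0.22, 0.30)             # simulated lower risk in ePM herds
moderate = rng.binomial(pigs, p)                    # events per batch

endog = np.column_stack([moderate, pigs - moderate])   # (events, non-events)
exog = sm.add_constant(pd.DataFrame({"epm_herd": epm_herd}))
fit = sm.GLM(endog, exog, family=sm.families.Binomial()).fit()
print(fit.summary())
```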
Introduction: The goal of this study was to determine if emergency department (ED) surge and end-of-shift assessment of patients affect the extent of diagnostic tests and therapeutic interventions performed and the accuracy of diagnosis prior to referral of patients to Internal Medicine, as well as the impact on patient outcomes. Methods: This study was a health records review of consecutive patients referred to the internal medicine service with an ED diagnosis of heart failure, COPD or sepsis, at two tertiary care EDs. We developed a scoring system in consultation with senior emergency and internal medicine physicians to uniformly assess the treatments and investigations performed for patients diagnosed in the ED with heart failure, COPD or sepsis. These scores were then correlated with surge levels and time of day at patient assessment and disposition. Rates of admission and diagnosis disagreements were also assessed. Results: We included 308 patients (101 with heart failure, 101 with COPD, 106 with sepsis). Comparing middle of shift to end of shift, the overall weighted mean scores were 92.2% vs. 91.7% for investigations and 73.5% vs. 70.0% for treatments. Comparing low to high surge times, the overall weighted mean scores were 89.9% vs. 92.6% for investigations and 68.6% vs. 71.7% for treatments. Evaluating each condition separately for investigations and treatments according to time of shift or surge conditions, there were no consistent differences in scores. We found overall high admission rates (93.1% for heart failure, 91.1% for COPD, 96.2% for sepsis patients) and low rates of diagnosis disagreement (4.0% heart failure, 10.9% COPD, 8.5% sepsis). Conclusion: We found that surge levels and end of shift did not impact the extent of investigations and treatments provided to patients diagnosed in the emergency department with heart failure, COPD or sepsis and referred to internal medicine. Admission rates for the patients referred were above 90%, and there were very few diagnosis disagreements or diversions to an alternate service by internal medicine. We believe this supports the emergency physician's ability to adapt to time and surge constraints, particularly in the context of commonly encountered conditions.
There is increasing interest in developing abattoir-based measures to assist in determining the welfare status of pigs. The primary aim of this study was to determine the most appropriate place on the slaughter line to conduct assessments of welfare-related lesions, namely apparent aggression-related skin lesions (hereafter referred to as ‘skin lesions’), loin bruising and apparent tail biting damage. The study also lent itself to an assessment of the prevalence of these lesions, and the extent to which they were linked with production variables. Finishing pigs processed at two abattoirs on the Island of Ireland (n=1950 in abattoir A, and n=1939 in abattoir B) were used. Data were collected over 6 days in each abattoir in July 2014. Lesion scoring took place at two points on the slaughter line: (1) at exsanguination (slaughter stage 1 (SS1)), and (2) following scalding and dehairing of carcasses (slaughter stage 2 (SS2)). At both points, each carcass was assigned a skin lesion score and a tail lesion score ranging from 0 (lesion absent) to 3 or 4 (severe lesions), respectively. Loin bruising was recorded as present or absent. Differences in the percentage of pigs with observable lesions of each type were compared between SS1 and SS2 using McNemar/McNemar-Bowker tests. The associations between each lesion type, and both cold carcass weight and condemnations, were examined at batch level using Pearson’s correlations. Batch was defined as the group of animals with a particular farm identification code on a given day. The overall percentage of pigs with a visible skin lesion (i.e. score>0) decreased between SS1 and SS2 (P<0.001). However, the percentage of pigs with a severe skin lesion increased numerically from SS1 to SS2. The percentage of pigs with a visible tail lesion and with loin bruising also increased between SS1 and SS2 (P<0.001). There was a positive correlation between the percentage of carcasses that were partially condemned and the percentage of pigs with skin lesions, tail lesions and loin bruising (P<0.05). In addition, as the batch-level frequency of each lesion type increased, average cold carcass weight decreased (P<0.001). These findings suggest that severe skin lesions, tail lesions and loin bruising are more visible on pig carcasses after they have been scalded and dehaired, and that this is when abattoir-based lesion scoring should take place. The high prevalence of all three lesion types, and the links with economically important production parameters, suggest that more research into identifying key risk factors is warranted.
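As a hedged illustration of the paired McNemar comparison and the batch-level Pearson correlations described above (not the study's analysis code; all counts and values below are hypothetical):

```python
# McNemar test on paired lesion visibility at SS1 vs. SS2, plus a batch-level
# Pearson correlation between lesion frequency and cold carcass weight.
import numpy as np
from statsmodels.stats.contingency_tables import mcnemar
from scipy.stats import pearsonr

# rows: lesion visible at SS1 (yes/no); columns: lesion visible at SS2 (yes/no)
paired_counts = np.array([
    [310,  40],    # visible at both / visible at SS1 only
    [220, 1380],   # visible at SS2 only / visible at neither
])
result = mcnemar(paired_counts, exact=False, correction=True)
print(f"McNemar chi2 = {result.statistic:.2f}, p = {result.pvalue:.4f}")

lesion_pct = np.array([10.0, 15.0, 22.0, 30.0, 35.0])   # hypothetical batches
carcass_kg = np.array([82.0, 81.0, 79.5, 78.0, 77.0])
r, p = pearsonr(lesion_pct, carcass_kg)
print(f"Pearson r = {r:.2f}, p = {p:.3f}")
```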
Immunoglobulin A (IgA) is a predominant immunoglobulin present in human breast milk and is known to play an important role in the maturation of infant gut immunity. Breast milk composition varies between populations, but the environmental and maternal factors responsible for these variations are still unclear. The objective of this study was to examine whether a range of maternal and environmental exposures influence levels of IgA in colostrum. The present study used 294 colostrum samples from the MecMilk International cohort, collected from women residing in London, Moscow and Verona. Samples were analysed on an automated Abbott Architect analyser. We found an inverse correlation between time postpartum and colostrum total IgA level (r=−0.49, P<0.001). Adjusting for maternal parity, smoking, fresh fruit and fish consumption and allergen sensitization, a multiple regression model showed that IgA levels were influenced by colostrum collection time (P<0.0001) and country of collection (P<0.01). The influence of mode of delivery did not appear significant in univariate comparisons, but once adjusted for the above maternal characteristics it showed a significant influence on total IgA (P=0.01). We conclude that the concentration of IgA in colostrum drops rapidly after birth, and future studies should always consider this factor in analysis. IgA concentration varied significantly between countries, with the highest level detected in Moscow and the lowest in Verona. The mode of delivery effect should be confirmed in larger cohorts. Further work is needed to determine ways to correct for IgA decline over time in colostrum, and to find the cause of variations in IgA levels between the countries.
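The adjusted analysis above is a multiple regression of IgA on collection time and country with maternal covariates. The sketch below is a hedged illustration on simulated data; the variable names and units are assumptions, not the MecMilk dataset.

```python
# Multiple regression sketch: colostrum IgA ~ collection time + country + covariates.
# All data are simulated for illustration only.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(4)
n = 294
df = pd.DataFrame({
    "hours_postpartum": rng.uniform(1, 72, n),
    "country": rng.choice(["London", "Moscow", "Verona"], n),
    "parity": rng.integers(1, 4, n),
    "smoking": rng.integers(0, 2, n),
    "caesarean": rng.integers(0, 2, n),
})
df["iga_g_l"] = 12 - 0.08 * df["hours_postpartum"] + rng.normal(0, 2, n)  # IgA falls with time

fit = smf.ols("iga_g_l ~ hours_postpartum + C(country) + parity + smoking + caesarean",
              data=df).fit()
print(fit.summary().tables[1])
```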
Background
Autoantibodies have been implicated in the etiologic pathway of depressive disorders. Here, we determine the association between the presence of a panel of autoantibodies at baseline and change in depression symptom score over a 5-year follow-up in a cohort of healthy elderly Australians.
Methods
Serum samples from 2049 randomly selected subjects aged 55–85 years enrolled in the Hunter Community Study (HCS) were assayed at baseline for a range of autoimmune markers (anti-nuclear autoantibodies (ANA), extractable nuclear antigen autoantibodies, anti-neutrophil cytoplasmic autoantibodies, thyroid peroxidase autoantibodies, tissue transglutaminase autoantibodies, anti-cardiolipin autoantibodies, rheumatoid factor and cyclic citrullinated peptide autoantibodies). Depression symptom score was assessed using the Centre for Epidemiologic Studies Depression (CES-D) scale at baseline and 5 years later.
Results
Autoantibody prevalence varied across our sample, with ANA being the most prevalent (positive in 16% and borderline in 36% of the study population). No evidence of a relationship was found between change in CES-D score over time and any autoimmune marker. Statins and high cholesterol were significantly associated with change in CES-D score over time in univariate analysis; however, these associations were probably confounded, since they failed to remain significant following multivariable analysis.
Conclusions
Autoantibodies were not associated with change in CES-D score over time. These findings point to an absence of autoimmune mechanisms in the general population or in moderate cases of depression.
Objective.
To assess an intervention to limit community-associated methicillin-resistant Staphylococcus aureus (MRSA) dissemination.
Design.
Randomized, controlled trial.
Setting.
County Jail, Dallas, Texas.
Participants.
A total of 4,196 detainees in 68 detention tanks.
Methods.
Tanks were randomly assigned to 1 of 3 groups: in group 1, detainees received cloths that contained chlorhexidine gluconate (CHG) to clean their entire skin surface 3 times per week for 6 months; group 2 received identical cloths containing only water; and group 3 received no skin treatment. During the study, all newly arrived detainees were invited to enroll. Nares and hand cultures were obtained at baseline and from all current enrollees at 2 and 6 months.
Results.
At baseline, S. aureus was isolated from 41.2% and MRSA from 8.0% (nares and/or hand) of 947 enrollees. The average participation rate was 47%. At 6 months, MRSA carriage was 10.0% in group 3 and 8.7% in group 1 tanks (estimated absolute risk reduction [95% confidence interval (CI)], 1.4% [−4.8% to 7.1%]; P = .655). At 6 months, carriage of any S. aureus was 51.1% in group 3, 40.7% in group 1 (absolute risk reduction [95% CI], 10.4% [0.01%–20.1%]; P = .047), and 42.8% (absolute risk reduction [95% CI], 8.3% [−1.4% to 18.0%]; P = .099) in group 2.
Conclusions.
Skin cleaning with CHG for 6 months in detainees, compared with no intervention, significantly decreased carriage of S. aureus, and use of water cloths produced a nonsignificant but similar decrease. A nonsignificant decrease in MRSA carriage was found with CHG cloth use.
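As a worked illustration of how an absolute risk reduction and its 95% confidence interval can be computed from carriage proportions such as those reported above (the group sizes used here are hypothetical, so the interval will not reproduce the published one exactly):

```python
# Absolute risk reduction with a Wald 95% CI for a difference in proportions.
# Proportions follow the reported S. aureus carriage (51.1% vs. 40.7%);
# the group sizes are hypothetical placeholders.
import math

n_control, n_chg = 180, 182          # hypothetical enrollees cultured at 6 months
p_control, p_chg = 0.511, 0.407      # carriage proportions reported above

arr = p_control - p_chg
se = math.sqrt(p_control * (1 - p_control) / n_control +
               p_chg * (1 - p_chg) / n_chg)
lo, hi = arr - 1.96 * se, arr + 1.96 * se
print(f"ARR = {arr:.1%} (95% CI {lo:.1%} to {hi:.1%})")
```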
This paper describes the system architecture of a newly constructed radio telescope – the Boolardy Engineering Test Array (BETA), which is a prototype of the Australian Square Kilometre Array Pathfinder (ASKAP) telescope. Phased array feed technology is used to form multiple simultaneous beams per antenna, providing astronomers with unprecedented survey speed. The test array described here is a six-antenna interferometer, fitted with prototype signal processing hardware capable of forming at least nine dual-polarisation beams simultaneously, allowing several square degrees to be imaged in a single pointed observation. The main purpose of the test array is to develop beamforming and wide-field calibration methods for use with the full telescope, but it will also be capable of limited early science demonstrations.
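To illustrate the beamforming idea described above, here is a minimal conceptual sketch: each output beam is a complex-weighted sum of the phased-array-feed element voltages, so several beams can be formed simultaneously from the same element data. The element count, beam count and weights below are illustrative assumptions, not BETA's actual configuration or weighting scheme.

```python
# Conceptual phased-array-feed beamforming: y_b(t) = sum_k conj(w[b, k]) * x_k(t).
# Random complex voltages and weights stand in for real element data and
# calibrated beam weights.
import numpy as np

rng = np.random.default_rng(5)
n_elements, n_beams, n_samples = 96, 9, 4096   # illustrative dimensions

voltages = (rng.normal(size=(n_elements, n_samples))
            + 1j * rng.normal(size=(n_elements, n_samples)))   # element voltages
weights = (rng.normal(size=(n_beams, n_elements))
           + 1j * rng.normal(size=(n_beams, n_elements)))      # one weight vector per beam

beams = np.conj(weights) @ voltages   # shape (n_beams, n_samples): 9 simultaneous beams
print(beams.shape)
```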