We examined the association between bilingualism, executive function (EF), and brain volume in older monolinguals and bilinguals who spoke English, Spanish, or both, and were cognitively normal (CN) or diagnosed with Mild Cognitive Impairment (MCI) or dementia. Gray matter volume (GMV) was higher in language and EF brain regions among bilinguals, but no differences were found in memory regions. Neuropsychological performance did not vary across language groups over time; however, bilinguals exhibited reduced Stroop interference and lower scores on Digit Span Backwards and category fluency. Higher scores on Digit Span Backwards were associated with a younger age of English acquisition, and a greater degree of balanced bilingualism was associated with lower scores on category fluency. The initial age of cognitive decline did not differ between language groups. The influence of bilingualism appears to be reflected in increased GMV in language and EF regions and, to a lesser degree, in EF performance.
Survey results suggest that prolonged administration of prophylactic antibiotics is common after mastectomy with reconstruction. We determined utilization, predictors, and outcomes of postdischarge prophylactic antibiotics after mastectomy with or without immediate breast reconstruction.
DESIGN
Retrospective cohort.
PATIENTS
Commercially insured women aged 18–64 years coded for mastectomy from January 2004 to December 2011 were included in the study. Women with a preexisting wound complication or septicemia were excluded.
METHODS
Predictors of prophylactic antibiotics within 5 days after discharge were identified in women with 1 year of prior insurance enrollment; relative risks (RR) were calculated using generalized estimating equations.
RESULTS
Overall, 12,501 mastectomy procedures were identified; immediate reconstruction was performed in 7,912 of these procedures (63.3%). Postdischarge prophylactic antibiotics were used in 4,439 procedures (56.1%) with immediate reconstruction and 1,053 procedures (22.9%) without immediate reconstruction (P<.001). The antibiotics most commonly prescribed were cephalosporins (75.1%) and fluoroquinolones (11.1%). Independent predictors of postdischarge antibiotics were implant reconstruction (RR, 2.41; 95% confidence interval [CI], 2.23–2.60), autologous reconstruction (RR, 2.17; 95% CI, 1.93–2.45), autologous reconstruction plus implant (RR, 2.11; 95% CI, 1.92–2.31), hypertension (RR, 1.05; 95% CI, 1.00–1.10), tobacco use (RR, 1.07; 95% CI, 1.01–1.14), surgery at an academic hospital (RR, 1.14; 95% CI, 1.07–1.21), and receipt of home health care (RR, 1.11; 95% CI, 1.04–1.18). Postdischarge prophylactic antibiotics were not associated with surgical site infection (SSI) after mastectomy with or without immediate reconstruction (both P>.05).
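As a hedged illustration of the METHODS above (not the study's actual code), the sketch below shows one common way to obtain relative risks for a binary outcome with generalized estimating equations: a log-link Poisson model with clustered robust errors. All column names and data are hypothetical placeholders.

```python
# Illustrative sketch: relative risks for postdischarge antibiotic use
# via GEE, as described in the METHODS. A Poisson model with a log link
# yields exp(coefficients) interpretable as RRs for a binary outcome.
# Column names and data are invented, not from the study.
import numpy as np
import pandas as pd
import statsmodels.api as sm
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
df = pd.DataFrame({
    "postdischarge_abx": rng.binomial(1, 0.4, 500),  # 0/1 outcome
    "implant_recon":     rng.binomial(1, 0.5, 500),
    "tobacco_use":       rng.binomial(1, 0.1, 500),
    "patient_id":        np.arange(500),             # cluster id (repeat procedures)
})

model = smf.gee(
    "postdischarge_abx ~ implant_recon + tobacco_use",
    groups="patient_id",                  # GEE clusters
    data=df,
    family=sm.families.Poisson(),         # log link -> exp(coef) is an RR
)
result = model.fit()
print(np.exp(result.params))              # relative risks
print(np.exp(result.conf_int()))          # 95% CIs
```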
CONCLUSIONS
Prophylactic postdischarge antibiotics are commonly prescribed after mastectomy, and immediate reconstruction is the strongest predictor of their use. Stewardship efforts to curtail continuation of prophylactic antibiotics after discharge in this population are needed to limit antimicrobial resistance.
Prediction of the developmental stages of wheat and wild oats would be useful in order to: 1) correctly time the application of herbicides, and 2) accurately schedule research and cultural operations. The Haun developmental scale, which numbers leaf development and describes floral development on the main stem of grasses, was found to be suitable for describing the development of semidwarf wheat and wild oats in California. Haun developmental rates of wheat and wild oats were similar. Interference by wheat or wild oats in mixed cultures did not change the developmental rate of either species when grown with added nutrients and water. Degree days gave better correlations with development than calendar days when different planting dates, years, and locations were compared. A degree-day model with a 5°C base temperature and a second-order polynomial expression gave accurate predictions of developmental stage, which correlated well with field data.
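As a rough sketch of the degree-day approach described above (an assumed implementation, with invented placeholder temperatures and stages), cumulative degree days above the 5°C base can be computed and fed into a second-order polynomial fit:

```python
# Illustrative sketch (not the authors' code): accumulate degree days
# above a 5 °C base and fit a quadratic to predict Haun developmental
# stage. All numbers below are made-up placeholder data.
import numpy as np

BASE_TEMP_C = 5.0

def degree_days(daily_mean_temps_c):
    """Cumulative degree days above the base temperature."""
    t = np.asarray(daily_mean_temps_c, dtype=float)
    return np.cumsum(np.maximum(t - BASE_TEMP_C, 0.0))

# Placeholder observations: cumulative degree days vs. observed Haun stage.
dd = degree_days([12, 14, 9, 16, 18, 15, 11, 17, 19, 20])
haun_stage = np.array([0.2, 0.5, 0.6, 1.0, 1.5, 1.9, 2.1, 2.7, 3.3, 4.0])

# Second-order polynomial (quadratic) expression, as in the abstract's model.
coeffs = np.polyfit(dd, haun_stage, deg=2)
predict = np.poly1d(coeffs)
print(predict(dd[-1]))  # predicted stage at the last accumulated degree day
```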
To determine the impact of total household decolonization with intranasal mupirocin and chlorhexidine gluconate body wash on recurrent methicillin-resistant Staphylococcus aureus (MRSA) infection among subjects with MRSA skin and soft-tissue infection.
DESIGN
Three-arm nonmasked randomized controlled trial.
SETTING
Five academic medical centers in Southeastern Pennsylvania.
PARTICIPANTS
Adults and children presenting to ambulatory care settings with community-onset MRSA skin and soft-tissue infection (ie, index cases) and their household members.
INTERVENTION
Enrolled households were randomized to 1 of 3 intervention groups: (1) education on routine hygiene measures, (2) education plus decolonization without reminders (intranasal mupirocin ointment twice daily for 7 days and chlorhexidine gluconate on the first and last day), or (3) education plus decolonization with reminders, where subjects received daily telephone call or text message reminders.
MAIN OUTCOME MEASURES
Owing to small numbers of recurrent infections, this analysis focused on time to clearance of colonization in the index case.
RESULTS
Of 223 households, 73 were randomized to education only, 76 to decolonization without reminders, and 74 to decolonization with reminders. There was no significant difference in time to clearance of colonization between the education-only and decolonization groups (log-rank P=.768). In secondary analyses, compliance with decolonization was associated with decreased time to clearance (P=.018).
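A minimal sketch of the comparison reported above, assuming a standard log-rank test as implemented in the lifelines package; the duration and event arrays are invented placeholders, not study data.

```python
# Hypothetical log-rank comparison of time to clearance between the
# education-only arm and the pooled decolonization arms. Placeholder data.
import numpy as np
from lifelines.statistics import logrank_test

rng = np.random.default_rng(0)
# Days to clearance; censored subjects carry their last follow-up time.
t_edu   = rng.exponential(60, size=73)
t_decol = rng.exponential(55, size=150)
e_edu   = rng.binomial(1, 0.8, size=73)    # 1 = clearance observed
e_decol = rng.binomial(1, 0.8, size=150)   # 0 = censored

res = logrank_test(t_edu, t_decol,
                   event_observed_A=e_edu, event_observed_B=e_decol)
print(res.p_value)  # analogous to the reported log-rank P=.768
```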
CONCLUSIONS
Total household decolonization did not result in decreased time to clearance of MRSA colonization among adults and children with MRSA skin and soft-tissue infection. However, subjects who were compliant with the protocol had more rapid clearance.
We aimed to determine the frequency of qacA/B chlorhexidine tolerance genes and high-level mupirocin resistance among MRSA isolates before and after the introduction of a chlorhexidine (CHG) daily bathing intervention in a surgical intensive care unit (SICU).
DESIGN
Retrospective cohort study (2005–2012).
SETTING
A large tertiary-care center.
PATIENTS
Patients admitted to the SICU who had MRSA surveillance cultures of the anterior nares.
METHODS
A random sample of banked MRSA anterior nares isolates recovered during (2005) and after (2006–2012) implementation of a daily CHG bathing protocol was examined for qacA/B genes and high-level mupirocin resistance. Staphylococcal cassette chromosome mec (SCCmec) typing was also performed.
RESULTS
Of the 504 randomly selected isolates (63 per year), 36 (7.1%) were qacA/B positive (+) and 35 (6.9%) were mupirocin resistant; 184 (36.5%) of the 504 isolates were SCCmec type IV. There was a significant trend for increasing qacA/B (P=.02; highest prevalence, 16.9% in 2009 and 2010) and SCCmec type IV (P<.001; highest prevalence, 52.4% in 2012) during the study period. qacA/B(+) MRSA isolates were more likely to be mupirocin resistant (9 of 36 [25%] qacA/B(+) vs 26 of 468 [5.6%] qacA/B(−); P=.003).
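The abstract reports a significant trend in qacA/B prevalence across study years but does not name the test used; a Cochran-Armitage test for trend in proportions is one standard choice, sketched here from first principles with invented yearly counts (63 isolates per year, as above).

```python
# Cochran-Armitage test for trend, implemented directly. The yearly
# positive counts are hypothetical placeholders, not study data.
import numpy as np
from scipy.stats import norm

def cochran_armitage(cases, totals, scores=None):
    """Two-sided Cochran-Armitage test for trend in proportions."""
    r = np.asarray(cases, float)        # qacA/B(+) isolates per year
    n = np.asarray(totals, float)       # isolates tested per year
    t = np.arange(len(r)) if scores is None else np.asarray(scores, float)
    p_bar = r.sum() / n.sum()
    stat = np.sum(t * (r - n * p_bar))
    var = p_bar * (1 - p_bar) * (np.sum(n * t**2) - np.sum(n * t)**2 / n.sum())
    z = stat / np.sqrt(var)
    return z, 2 * norm.sf(abs(z))       # z statistic, two-sided P

cases  = [1, 2, 3, 5, 8, 9, 4, 4]      # hypothetical positives, 2005-2012
totals = [63] * 8
z, p = cochran_armitage(cases, totals)
print(f"z = {z:.2f}, two-sided P = {p:.3f}")
```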
CONCLUSIONS
A long-term, daily CHG bathing protocol was associated with a change in the frequency of qacA/B genes in MRSA isolates recovered from the anterior nares over an 8-year period, most likely because patients in later years had been exposed to CHG during prior admissions. Future studies should further evaluate the implications of universal daily CHG bathing on MRSA qacA/B genes among hospitalized patients.
To identify risk factors for recurrent methicillin-resistant Staphylococcus aureus (MRSA) colonization.
DESIGN
Prospective cohort study conducted from January 1, 2010, through December 31, 2012.
SETTING
Five adult and pediatric academic medical centers.
PARTICIPANTS
Subjects (ie, index cases) who presented with acute community-onset MRSA skin and soft-tissue infection.
METHODS
Index cases and all household members performed self-sampling for MRSA colonization every 2 weeks for 6 months. Clearance of colonization was defined as 2 consecutive sampling periods with negative surveillance cultures. Recurrent colonization was defined as any positive MRSA surveillance culture after clearance. Index cases with recurrent MRSA colonization were compared with those without recurrence on the basis of antibiotic exposure, household demographic characteristics, and presence of MRSA colonization in household members.
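The clearance and recurrence definitions above are precise enough to express directly in code. Below is a minimal sketch (an assumed implementation, not the authors' code) over a subject's biweekly culture results.

```python
# Clearance = 2 consecutive negative biweekly surveillance cultures;
# recurrence = any positive culture after clearance has been met.
from typing import List, Optional, Tuple

def classify(cultures: List[bool]) -> Tuple[Optional[int], bool]:
    """cultures[i] is True if the i-th biweekly culture grew MRSA.

    Returns (index at which clearance was met or None, recurrence flag).
    """
    cleared_at = None
    for i in range(1, len(cultures)):
        if cleared_at is None and not cultures[i - 1] and not cultures[i]:
            cleared_at = i           # second of two consecutive negatives
        elif cleared_at is not None and cultures[i]:
            return cleared_at, True  # positive after clearance = recurrence
    return cleared_at, False

# Example: positive, two negatives (clearance at index 2), positive again.
print(classify([True, False, False, True]))  # -> (2, True)
```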
RESULTS
The study cohort comprised 195 index cases; recurrent MRSA colonization occurred in 85 (43.6%). Median time to recurrence was 53 days (interquartile range, 36–84 days). Treatment with clindamycin was associated with a lower risk of recurrence (odds ratio, 0.52; 95% CI, 0.29–0.93). A higher percentage of household members younger than 18 years was associated with an increased risk of recurrence (odds ratio, 1.01; 95% CI, 1.00–1.02). The association between MRSA colonization in household members and recurrent colonization in index cases did not reach statistical significance in primary analyses.
CONCLUSION
A large proportion of patients initially presenting with MRSA skin and soft-tissue infection will have recurrent colonization after clearance. The reduced rate of recurrent colonization associated with clindamycin may indicate a unique role for this antibiotic in the treatment of such infection.
To investigate whether operative factors are associated with risk of surgical site infection (SSI) after hernia repair.
Design
Retrospective cohort study.
Patients
Commercially insured enrollees aged 6 months–64 years with International Classification of Diseases, Ninth Revision, Clinical Modification procedure or Current Procedural Terminology, fourth edition, codes for inguinal/femoral, umbilical, and incisional/ventral hernia repair procedures from January 1, 2004, through December 31, 2010.
Methods
SSIs within 90 days after hernia repair were identified by diagnosis codes. The χ2 and Fisher exact tests were used to compare SSI incidence by operative factors.
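As a concrete illustration of the tests named above, the sketch below applies scipy's chi-square and Fisher exact tests to one comparison actually reported in the Results (open vs laparoscopic inguinal/femoral repair); the code itself is ours, not the study's.

```python
# Chi-square and Fisher exact tests on SSI incidence by operative factor,
# using counts from the RESULTS section (open vs laparoscopic
# inguinal/femoral hernia repair).
from scipy.stats import chi2_contingency, fisher_exact

#                  SSI      no SSI
open_repair  = [295, 61142 - 295]
lap_repair   = [57,  16524 - 57]
table = [open_repair, lap_repair]

chi2, p_chi2, dof, _ = chi2_contingency(table)
odds_ratio, p_fisher = fisher_exact(table)
print(f"chi-square P = {p_chi2:.3f}, Fisher exact P = {p_fisher:.3f}")
```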
Results
A total of 119,973 hernia repair procedures were analyzed. The incidence of SSI differed significantly by anatomic site, with rates of 0.45% (352/77,666) for inguinal/femoral, 1.16% (288/24,917) for umbilical, and 4.11% (715/17,390) for incisional/ventral hernia repair. Within anatomic sites, the incidence of SSI was significantly higher for open versus laparoscopic inguinal/femoral (0.48% [295/61,142] vs 0.34% [57/16,524], P=.020) and incisional/ventral (4.20% [701/16,699] vs 2.03% [14/691], P=.005) hernia repairs. The rate of SSI was higher following procedures with bowel obstruction/necrosis than procedures without obstruction/necrosis for open inguinal/femoral (0.89% [48/5,422] vs 0.44% [247/55,720], P<.001) and umbilical (1.57% [131/8,355] vs 0.95% [157/16,562], P<.001), but not incisional/ventral hernia repair (4.01% [224/5,585] vs 4.16% [491/11,805], P=.645).
Conclusions
The incidence of SSI was highest after open procedures, incisional/ventral repairs, and hernia repairs with bowel obstruction/necrosis. Stratification of hernia repair SSI rates by these operative factors may facilitate accurate comparison of SSI rates between facilities.
International Classification of Diseases, Ninth Revision, Clinical Modification (ICD-9-CM) diagnosis codes are increasingly used to identify healthcare-associated infections, often with insufficient evidence demonstrating the validity of the codes used. In the absence of medical record verification, we sought to validate a claims-based algorithm for identifying surgical site infections (SSIs) by examining whether clinically expected SSI treatment was present.
Methods.
We performed a retrospective cohort study using private insurer claims data from persons less than 65 years old with ICD-9-CM procedure or Current Procedural Terminology (CPT-4) codes for anterior cruciate ligament (ACL) reconstruction from January 2004 through December 2010. SSIs occurring within 90 days after ACL reconstruction were identified by ICD-9-CM diagnosis codes. Antibiotic utilization, surgical treatment, and microbiology culture claims within 14 days of SSI codes were used as evidence to support the SSI diagnosis.
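The 90-day and 14-day claim windows described above can be sketched with straightforward date arithmetic. The pandas snippet below is an illustrative reconstruction of that logic, not the study's code; all column names and records are hypothetical.

```python
# Hypothetical claims-window logic: flag SSI diagnosis codes within
# 90 days of ACL reconstruction, then collect supporting treatment
# claims within 14 days of each SSI code.
import pandas as pd

procedures = pd.DataFrame({"patient_id": [1],
                           "acl_date": pd.to_datetime(["2010-03-01"])})
claims = pd.DataFrame({
    "patient_id": [1, 1],
    "claim_date": pd.to_datetime(["2010-03-20", "2010-03-22"]),
    "claim_type": ["ssi_dx", "antibiotic"],
})

merged = claims.merge(procedures, on="patient_id")
ssi = merged[(merged.claim_type == "ssi_dx")
             & (merged.claim_date >= merged.acl_date)
             & (merged.claim_date - merged.acl_date <= pd.Timedelta(days=90))]

# Supporting evidence within 14 days of each SSI diagnosis code.
for _, row in ssi.iterrows():
    window = merged[(merged.claim_type != "ssi_dx")
                    & ((merged.claim_date - row.claim_date).abs()
                       <= pd.Timedelta(days=14))]
    print(row.patient_id, "supporting claims:", list(window.claim_type))
```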
Results.
Of 40,702 procedures, 401 (1.0%) were complicated by SSI, 172 (0.4%) of which were specifically identified as septic arthritis. Most SSIs were associated with an inpatient admission (232/401 [58%]) and/or surgical procedure(s) for treatment (250/401 [62%]). Temporally associated antibiotics, surgical treatment procedures, and cultures were present for 84% (338/401), 61% (246/401), and 59% (238/401) of SSIs, respectively. Only 5.7% (23/401) of SSI-coded procedures had no antibiotics, surgical treatments, or cultures within 14 days of the SSI claims.
Conclusions.
More than 94% of patients identified by our claims algorithm as having an SSI received clinically expected treatment for infection, including antibiotics, surgical treatment, and culture, suggesting that this algorithm has very good positive predictive value. This method may facilitate retrospective SSI surveillance and comparison of SSI rates across facilities and providers.
The IntCal09 and Marine09 radiocarbon calibration curves have been revised using newly available and updated data sets from 14C measurements on tree rings, plant macrofossils, speleothems, corals, and foraminifera. The calibration curves were derived from the data using the same random walk model (RWM) employed to generate IntCal09 and Marine09, revised to account for additional uncertainties and error structures. The new curves were ratified at the 21st International Radiocarbon Conference in July 2012 and are available as Supplemental Material at www.radiocarbon.org. The database can be accessed at http://intcal.qub.ac.uk/intcal13/.
High-quality data from appropriate archives are needed for the continuing improvement of radiocarbon calibration curves. We discuss here the basic assumptions behind 14C dating that necessitate calibration and the relative strengths and weaknesses of archives from which calibration data are obtained. We also highlight the procedures, problems, and uncertainties involved in determining atmospheric and surface ocean 14C/12C in these archives, including a discussion of the various methods used to derive an independent absolute timescale and uncertainty. The types of data required for the current IntCal database and calibration curve model are tabulated with examples.
We report on the electrical and structural properties of boron-doped diamond tips commonly used for in situ electromechanical testing during nanoindentation. The boron dopant environment, as evidenced by cathodoluminescence (CL) microscopy, revealed significantly different boron states within each tip. Characteristic emission bands of both electrically activated and nonelectrically activated boron centers were identified in all boron-doped tips. Surface CL mapping also revealed vastly different surface properties, confirming a high amount of nonelectrically activated boron clusters at the tip surface. Raman microspectroscopy showed that the atomic-scale structural characteristics of boron-doped tips also differ significantly from those of an undoped diamond tip. Furthermore, the active boron concentration, as inferred via the Raman analysis, varied greatly from tip to tip. Tips (or tip areas) with low overall boron concentration had a higher fraction of electrically inactive boron centers and thus made non-Ohmic contacts with metallic substrates, whereas tips with higher boron concentrations and more electrically active boron centers displayed Ohmic-like contacts. Our results demonstrate the necessity of understanding and fully characterizing the boron environments, boron concentrations, and atomic structure of the tips prior to performing in situ electromechanical experiments, particularly if quantitative electrical data are required.
Canadian Cardiovascular Society consensus guidelines recommend that tetralogy of Fallot patients be seen by a congenital cardiologist every 2 years. In Atlantic Canada, tetralogy of Fallot patients are followed up at either tertiary or satellite clinics, which are held in the community and attended by paediatric cardiologists. The effectiveness of satellite clinics in congenital cardiac disease follow-up is unproven. Our objective was to compare patient-reported quality of life measures to determine whether these were impacted by the site of follow-up.
Methods
We included patients with tetralogy of Fallot undergoing surgical repair at the Izaak Walton Killam Health Centre from 1 November 1972 to 31 May 2002. Quality of life surveys, SF-10 or SF-36v2, were administered to consenting patients. We analysed the subjective health status by patient age and site of follow-up.
Results
Of the 184 eligible patients, 72 were lost to follow-up. Of the locatable patients, 61% completed the questionnaires. In all, 90% (101 out of 112) were followed up at recommended intervals. Of the 112 patients, 76 (68%) were followed up at a tertiary clinic. These patients were older, with a mean age of 18.4 years versus 14.7 years, and scored higher on the SF-36 physical component summary (52.6 versus 45.7, p = 0.02) compared with satellite clinic patients. The SF-36 mental component summary scores were similar for patients regardless of the site of follow-up. SF-10 physical and psychosocial scores were similar regardless of the site of follow-up.
Conclusion
Tetralogy of Fallot patients followed at either satellite or tertiary clinics have similar subjective health status.
To describe dietary changes in men participating in an obesity intervention as part of the Self-Help, Exercise and Diet using Information Technology (SHED-IT) study.
Design
An assessor-blinded randomized controlled trial comparing Internet (n 34) v. information-only groups (n 31) with 6-month follow-up. Dietary intake was assessed by FFQ, reporting usual consumption of seventy-four foods and six alcoholic beverages using a 10-point frequency scale. A single portion size factor (PSF) was calculated based on photographs to indicate usual serving sizes.
Setting
The campus community of the University of Newcastle, New South Wales, Australia.
Subjects
Sixty-five overweight/obese men (43 % students, 42 % non-academic general staff, 15 % academic staff; mean age 35·9 (sd 11·1) years, mean BMI 30·6 (sd 2·8) kg/m2).
Results
The average PSF decreased significantly over time (χ2 = 20·9, df = 5, P < 0·001) with no differences between groups. While both groups reduced mean daily energy intake (GLM χ2 = 34·5, df = 3, P < 0·001), there was a trend towards a greater reduction in the Internet group (GLM χ2 = 3·3, P = 0·07). Both groups reduced percentage of energy from fat (P < 0·05), saturated fat (P < 0·001) and energy-dense/nutrient-poor items (P < 0·05), with no change in dietary fibre or alcohol (P > 0·05).
Conclusions
Although men reported some positive dietary changes during weight loss, they did not increase vegetable intake or decrease alcohol consumption, and saturated fat, fibre, and Na intakes still exceeded national targets. Future interventions for men should promote specific food-based guidelines to target improvements in their diet-related risk factor profile for chronic diseases.
To investigate the impact of school garden-enhanced nutrition education (NE) on children’s fruit and vegetable consumption, vegetable preferences, fruit and vegetable knowledge and quality of school life.
Design
Quasi-experimental 10-week intervention with nutrition education and garden (NE&G), NE-only, and control groups. Fruit and vegetable knowledge, vegetable preferences (willingness to taste and taste ratings), fruit and vegetable consumption (24 h recall × 2), and quality of school life (QoSL) were measured at baseline and 4-month follow-up.
Setting
Two primary schools in the Hunter Region, New South Wales, Australia.
Subjects
A total of 127 students in Grades 5 and 6 (11–12 years old; 54 % boys).
Results
Relative to controls, significant between-group differences were found for NE&G and NE students for overall willingness to taste vegetables (P < 0·001) and overall taste ratings of vegetables (P < 0·001). A treatment effect was found for the NE&G group for: ability to identify vegetables (P < 0·001); willingness to taste capsicum (P = 0·04), broccoli (P = 0·01), tomato (P < 0·001) and pea (P = 0·02); and student preference to eat broccoli (P < 0·001) and pea (P < 0·001) as a snack. No group-by-time differences were found for vegetable intake (P = 0·22), fruit intake (P = 0·23) or QoSL (P = 0·98).
Conclusions
School gardens can impact positively on primary-school students’ willingness to taste vegetables and their vegetable taste ratings, but given the complexity of dietary behaviour change, more comprehensive strategies are required to increase vegetable intake.