The majority of paediatric Clostridioides difficile infections (CDI) are community-associated (CA), but few data exist regarding associated risk factors. We conducted a case–control study to evaluate CA-CDI risk factors in young children. Participants were enrolled from eight US sites during October 2014–February 2016. Case-patients were defined as children aged 1–5 years with a positive C. difficile specimen collected as an outpatient or within 3 days of hospital admission, who had no healthcare facility admission in the prior 12 weeks and no history of CDI. Each case-patient was matched to one control. Caregivers were interviewed regarding relevant exposures. Multivariable conditional logistic regression was performed. Of 68 pairs, 44.1% were female. More case-patients than controls had a comorbidity (33.3% vs. 12.1%; P = 0.01); recent higher-risk outpatient exposures (34.9% vs. 17.7%; P = 0.03); recent antibiotic use (54.4% vs. 19.4%; P < 0.0001); or recent exposure to a household member with diarrhoea (41.3% vs. 21.5%; P = 0.04). In multivariable analysis, antibiotic exposure in the preceding 12 weeks was significantly associated with CA-CDI (adjusted matched odds ratio, 6.25; 95% CI 2.18–17.96). Improved antibiotic prescribing might reduce CA-CDI in this population. Further evaluation of the potential role of outpatient healthcare and household exposures in C. difficile transmission is needed.
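For 1:1 matched case-control designs like this one, the crude matched odds ratio is the ratio of the two types of exposure-discordant pairs. A minimal sketch, using made-up pair counts (not the study's actual data; the study itself used multivariable conditional logistic regression to obtain the adjusted estimate):

```python
# Hypothetical 1:1 matched pairs: each tuple is
# (case_exposed, control_exposed). Counts are illustrative only.
pairs = ([(True, False)] * 30 + [(False, True)] * 5 +
         [(True, True)] * 7 + [(False, False)] * 26)   # 68 pairs

# Concordant pairs carry no information; the matched OR is the
# ratio of discordant pairs b/c.
b = sum(1 for case, ctrl in pairs if case and not ctrl)   # case exposed only
c = sum(1 for case, ctrl in pairs if not case and ctrl)   # control exposed only
matched_or = b / c
print(matched_or)  # 6.0
```

With these invented counts the crude matched OR is 6.0, in the same neighbourhood as the reported adjusted estimate of 6.25.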
Hospitalized patients placed in isolation due to a carrier state or infection with resistant or highly communicable organisms report higher rates of anxiety and loneliness and have fewer physician encounters, room entries, and vital sign records. We hypothesized that isolation status might adversely impact patient experience as reported through Hospital Consumer Assessment of Healthcare Providers and Systems (HCAHPS) surveys, particularly regarding communication.
Retrospective analysis of HCAHPS survey results over 5 years.
A 1,165-bed, tertiary-care, academic medical center.
Patients on any type of isolation for at least 50% of their stay were the exposure group. Those never in isolation served as controls.
Multivariable logistic regression, adjusting for age, race, gender, payer, severity of illness, length of stay, and clinical service, was used to examine associations between isolation status and “top-box” experience scores. Dose response to increasing percentage of days in isolation was also analyzed.
Patients in isolation reported worse experience, primarily with staff responsiveness (help toileting 63% vs 51%; adjusted odds ratio [aOR], 0.77; P = .0009) and overall care (rate hospital 80% vs 73%; aOR, 0.78; P < .0001), but they reported similar experience in other domains. No dose-response effect was observed.
Isolated patients do not report adverse experience for most aspects of provider communication regarded to be among the most important elements for safety and quality of care. However, patients in isolation had worse experiences with staff responsiveness for time-sensitive needs. The absence of a dose-response effect suggests that isolation status may be a marker for other factors, such as illness severity. Regardless, hospitals should emphasize timely staff response for this population.
Patients with candidemia are at risk for other invasive infections, such as methicillin-resistant Staphylococcus aureus (MRSA) bloodstream infection (BSI).
To identify the risk factors for, and outcomes of, concurrent or near-concurrent bloodstream infection with Candida spp. and MRSA in adults.
Population-based cohort study.
Metropolitan Atlanta, March 1, 2008, through November 30, 2012.
All residents with Candida spp. or MRSA isolated from blood.
The Georgia Emerging Infections Program conducts active, population-based surveillance for candidemia and invasive MRSA. Medical records for patients with incident candidemia were reviewed to identify cases of MRSA coinfection, defined as incident MRSA BSI within 30 days before or after candidemia. Multivariate logistic regression was performed to identify factors associated with coinfection in patients with candidemia.
Among 2,070 adult candidemia cases, 110 (5.3%) had coinfection within 30 days. Among these 110 coinfections, MRSA BSI usually preceded candidemia (60.9%; n=67) or occurred on the same day (20.0%; n=22). The incidence of coinfection per 100,000 population decreased from 1.12 to 0.53 between 2009 and 2012, paralleling the decreased incidence of all MRSA BSIs and candidemia. Thirty-day mortality was similarly high between coinfection cases and candidemia alone (45.2% vs 36.0%, P=.10). Only nursing home residence (odds ratio, 1.72 [95% CI, 1.03–2.86]) predicted coinfection.
A small but important proportion of patients with candidemia have MRSA coinfection, suggesting that heightened awareness is warranted after 1 major BSI pathogen is identified. Nursing home residents should be targeted in BSI prevention efforts.
Infect. Control Hosp. Epidemiol. 2015;36(11):1298–1304
Using the Veterans’ Health Administration MRSA Directive as a platform to collect methicillin-resistant Staphylococcus aureus (MRSA) colonization isolates, together with an active MRSA infection surveillance program, we evaluated the genetic relatedness of colonization and infection isolates. Infection and colonization strain concordance was found in 85.7% of patients. The USA500 MRSA strain was identified in 31.8% of patients.
Lack of coordination between screening studies for common mental disorders in primary care and community epidemiological samples impedes progress in clinical epidemiology. Short screening scales based on the World Health Organization (WHO) Composite International Diagnostic Interview (CIDI), the diagnostic interview used in community epidemiological surveys throughout the world, were developed to address this problem.
Expert reviews and cognitive interviews generated CIDI screening scale (CIDI-SC) item pools for 30-day DSM-IV-TR major depressive episode (MDE), generalized anxiety disorder (GAD), panic disorder (PD) and bipolar disorder (BPD). These items were administered to 3058 unselected patients in 29 US primary care offices. Blinded SCID clinical reinterviews were administered to 206 of these patients, oversampling screened positives.
Stepwise regression selected optimal screening items to predict clinical diagnoses. Excellent concordance [area under the receiver operating characteristic curve (AUC)] was found between continuous CIDI-SC and DSM-IV/SCID diagnoses of 30-day MDE (0.93), GAD (0.88), PD (0.90) and BPD (0.97), with only 9–38 questions needed to administer all scales. CIDI-SC versus SCID prevalence differences are non-significant at the optimal CIDI-SC diagnostic thresholds (χ²(1) = 0.0–2.9, p = 0.09–0.94). Individual-level diagnostic concordance at these thresholds is substantial (AUC 0.81–0.86, sensitivity 68.0–80.2%, specificity 90.1–98.8%). Likelihood ratio positive (LR+) exceeds 10 and LR− is 0.1 or less at informative thresholds for all diagnoses.
CIDI-SC operating characteristics are equivalent (MDE, GAD) or superior (PD, BPD) to those of the best alternative screening scales. CIDI-SC results can be compared directly to general population CIDI survey results or used to target and streamline second-stage CIDIs.
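The likelihood ratios above follow directly from sensitivity and specificity: LR+ = sensitivity / (1 − specificity) and LR− = (1 − sensitivity) / specificity. A minimal sketch using values chosen from within the reported ranges (illustrative, not the exact study figures):

```python
# Illustrative values within the reported ranges, not exact study figures.
sensitivity = 0.80   # reported range: 68.0-80.2%
specificity = 0.95   # reported range: 90.1-98.8%

# Standard likelihood-ratio definitions for a binary screening test.
lr_positive = sensitivity / (1 - specificity)
lr_negative = (1 - sensitivity) / specificity

print(round(lr_positive, 1))  # 16.0
print(round(lr_negative, 2))  # 0.21
```

At these particular values LR+ already exceeds 10; LR− reaches 0.1 or less only at more sensitive (informative) thresholds, as the abstract notes.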
The effect of oxygen dose variation on the SIMOX SOI structure has been investigated extensively in this work. Keeping other implantation parameters constant, the dose has been varied from 0.2×10¹⁸ cm⁻² to 1.4×10¹⁸ cm⁻². Our studies show that a critical dose exists for a particular implant energy, below which a continuous buried oxide layer does not form. Further, the SiO₂ precipitates pin defects in the as-implanted state, and these defects subsequently grow into threading dislocations. A dose window exists in which the dislocation density is at a minimum.
The oxygen depth distribution in the as-implanted and annealed state has been studied using SIMS. HRXRD measurements have been performed to reveal the strain state as a function of dose variation. The defect microstructure has been investigated using PTEM, XTEM and Secco etching. AFM studies show the quality of the Si surface and the Si/BOX (buried oxide) interface roughness. This work gives a fundamental understanding of the evolution of the defect microstructure and strain from the as-implanted to the annealed state using independent implantation parameter control.
Novel bile acid sequestrants based on a polyammonium backbone were synthesized using a molecular imprinting technique. These imprinted polymer networks were prepared by crosslinking different polymeric amines with crosslinking agents in the presence of sodium cholate as the template. The template molecules were completely removed from the polymer matrices by repeated washings. The bile acid sequestration properties of these polymeric resins were evaluated under both in vitro and in vivo conditions. Adsorption isotherms performed in physiologically relevant media revealed that molecular imprinting improved bile acid sequestration, with an approximately twofold increase in the association constant (Ka). More importantly, hamsters fed imprinted polymers in their diet excreted more bile acids than those fed the non-imprinted control polymer. These results suggest that molecular imprinting may be a promising approach to preparing novel polymer therapeutics.
Previous studies on the relationship of dietary intake to the neighbourhood food environment have focused on access to supermarkets, quantified by geographic distance or store concentration measures. However, in-store food availability may also be an important determinant, particularly for urban neighbourhoods with a greater concentration of small food stores. This study synthesises both types of information – store access and in-store availability – to determine their potential relationship to fruit and vegetable consumption.
Residents in four census tracts were surveyed in 2001 about their fruit and vegetable intake. Household distances to food stores in these and surrounding tracts were obtained using geographical information system mapping techniques. In-store fruit and vegetable availability was measured by linear shelf space. Multivariate linear regression models were used to measure the association of these neighbourhood availability measures with consumption.
Four contiguous census tracts in central-city New Orleans.
A random sample of 102 households.
Greater fresh vegetable availability within 100 m of a residence was a positive predictor of vegetable intake; each additional metre of shelf space was associated with 0.35 servings per day of increased intake. Fresh fruit availability was not associated with intake, although having a small food store within this same distance was a marginal predictor of fruit consumption.
The findings suggest the possible importance of small neighbourhood food stores and their fresh produce availability in affecting fruit and vegetable intake.
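The reported association can be read as a simple linear prediction: each additional metre of fresh-vegetable shelf space within 100 m corresponds to +0.35 servings per day. A minimal sketch; the baseline intercept below is a made-up assumption, not a figure from the study:

```python
# Coefficient from the study; intercept is a hypothetical assumption.
COEF_SERVINGS_PER_METRE = 0.35   # reported association
BASELINE_SERVINGS = 1.2          # illustrative baseline, not from the study

def predicted_vegetable_intake(shelf_space_m: float) -> float:
    """Predicted daily vegetable servings given nearby shelf space (metres)."""
    return BASELINE_SERVINGS + COEF_SERVINGS_PER_METRE * shelf_space_m

print(round(predicted_vegetable_intake(2.0), 2))  # 1.9
```

Under these assumptions, two extra metres of nearby shelf space would correspond to about 0.7 more servings per day.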
From the latter 1940s until 1977, the General Electric Corporation (GE) discharged an estimated 200,000 to 1.3 million pounds (U.S. Environmental Protection Agency, 2000a) of polychlorinated biphenyls (PCBs) into the Hudson River from two electrical capacitor manufacturing plants at Hudson Falls and Fort Edward, New York (Fig. 24.1). In 1977, under a settlement agreement with the New York State Department of Environmental Conservation, GE stopped direct discharges of PCBs to the river, although leakage of PCBs from the factory sites to the river continues to this day. PCBs used at the GE plants were oily liquids containing dozens of distinct PCB compounds. Most of these components are persistent in the environment, attach strongly to soils and river sediments, and readily accumulate in fish, wildlife, and humans (National Research Council, 2001a). These properties, combined with the large discharges of PCBs from the GE plants over 50+ years, have led to elevated levels of PCBs in the water, sediments, and biota of the Upper Hudson River (defined here as the stretch upstream of the Troy lock and dam). Levels of PCBs in the Hudson River ecosystem are among the highest in the United States.
PCB contamination in the Hudson River is a management problem for the public because it has likely increased human health risks (primarily from consumption of fish), increased ecological risks to fish and fish-eating birds and mammals, and caused losses of river use and the resulting economic impacts (catch and release only fishery; advisories on fish consumption; restrictions on navigational dredging limiting access to the Champlain Canal; restrictions on and the increased costs of dredging; and commercial fishery closure).
Two-dimensional (2D) FLASH simulations were run with Spitzer–Härm conductivity on and off in an attempt to simulate a laser-produced blast wave. Dissociation, ionization, recombination, and radiative cooling were not included. An initial Gaussian temperature profile with T₀ = 120 eV and spot radius r₀ = 25 μm was used, assuming that 1 μm of thickness of the CH disk is ablated into the background nitrogen gas. Evolution of the blast wave differs slightly between the Spitzer–Härm on and off cases, and neither case matches well with experiment. Given the high temperatures involved, a thermal wave should be expected, so the Spitzer–Härm conductivity-on case is more likely. A simulation run with an initial temperature of ∼4 keV might match better with experiment.
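The stated initial condition can be sketched as a Gaussian temperature profile. The functional form T(r) = T₀·exp(−(r/r₀)²) is an assumption consistent with the quoted T₀ and spot radius, not a detail confirmed by the abstract:

```python
import math

# Assumed initial Gaussian temperature profile for the blast-wave seed:
# T(r) = T0 * exp(-(r / r0)**2). Parameters from the abstract.
T0_EV = 120.0    # peak temperature, eV
R0_UM = 25.0     # spot radius, micrometres

def initial_temperature(r_um: float) -> float:
    """Temperature in eV at radius r (micrometres) from the spot centre."""
    return T0_EV * math.exp(-((r_um / R0_UM) ** 2))

print(initial_temperature(0.0))            # 120.0 at the centre
print(round(initial_temperature(25.0), 2)) # T0/e at one spot radius
```

One spot radius out, the profile has fallen to T₀/e ≈ 44 eV under this assumed form.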
Salmonellosis is the leading cause of death caused by foodborne bacterial pathogens in the United States. Approximately 90% of salmonella infections are sporadic, but most of what is known about salmonellosis has come from outbreak investigations. We studied the risk for sporadic salmonellosis among 115 persons aged ≥15 years reported to the Louisiana Office of Public Health during May 1998–April 1999, compared with 115 age-matched controls. Significantly more case-patients than controls had chronic underlying medical conditions [adjusted odds ratio (aOR) = 4.3; 95% confidence interval (CI) = 2.2–8.7]. Although reported consumption of specific food items likely to contain salmonella was not associated with illness, inconsistent handwashing between preparation of meat and non-meat items was associated with illness (aOR = 8.3; CI = 1.1–61.8). Prevention of sporadic salmonellosis will depend on enhanced measures to provide a consistently safe food supply and to promote safer food preparation in households.
An outbreak of salmonellosis occurred among 63 wedding participants. The outbreak was investigated through cohort, laboratory, and environmental studies. Consumption of rice dressing made from a commercially cooked, meat-based rice-dressing mix was strongly associated with illness. Nineteen patient isolates, six company/grocery store isolates cultured from the rice-dressing mix, and one environmental isolate from a pump in the production line were of an identical outbreak strain of Salmonella Infantis characterized by pulsed-field gel electrophoresis. In the production line, cooked rice-dressing mix tested negative for S. Infantis before and positive after contact with the contaminated pump. The dressing mix had an estimated 200 colony-forming units of salmonella per gram of product, and >180,000 pounds were distributed in 9 states for ≥2 months before contamination was recognized. Food manufacturers should be required to use systematic, hazard analysis critical control point risk management practices for all processed meat products, validated by periodic microbiologic monitoring of the end product.
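The scale of the contamination follows from simple arithmetic on the two reported figures. A back-of-envelope sketch, treating 180,000 lb as the distributed amount (the abstract says more than that) and using the standard pound-to-gram conversion:

```python
# Order-of-magnitude estimate of total Salmonella load in the product.
# 200 CFU/g and 180,000 lb are from the abstract; the total is illustrative.
CFU_PER_GRAM = 200
POUNDS_DISTRIBUTED = 180_000          # abstract reports >180,000 lb
GRAMS_PER_POUND = 453.592             # standard conversion

total_grams = POUNDS_DISTRIBUTED * GRAMS_PER_POUND
total_cfu = CFU_PER_GRAM * total_grams
print(f"{total_cfu:.2e}")  # ~1.63e+10 CFU
```

Even at the modest level of 200 CFU/g, the distributed product carried on the order of 10¹⁰ organisms in total.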