Ecological inference (EI) is the process of learning about individual behavior from aggregate data. We relax assumptions by allowing for “linear contextual effects,” which previous works have regarded as plausible but avoided due to nonidentification, a problem we sidestep by deriving bounds instead of point estimates. In this way, we offer a conceptual framework to improve on the Duncan–Davis bound, derived more than 65 years ago. To study the effectiveness of our approach, we collect and analyze 8,430
EI datasets with known ground truth from several sources—thus bringing considerably more data to bear on the problem than the existing dozen or so datasets available in the literature for evaluating EI estimators. For the 88% of real datasets in our collection that fit a proposed rule, our approach reduces the width of the Duncan–Davis bound, on average, by about 44%, while still capturing the true district-level parameter about 99% of the time. The remaining 12% revert to the Duncan–Davis bound.
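The Duncan–Davis method-of-bounds underlying this abstract is straightforward to compute from a precinct's margins. A minimal sketch (variable names and example values are illustrative, not taken from the paper):

```python
def duncan_davis_bounds(x, t):
    """Duncan-Davis deterministic bounds on beta_1, the unknown fraction
    of group-1 members exhibiting the outcome, given only aggregate data:
    x = group 1's share of the precinct, t = overall outcome rate.
    The accounting identity t = x*beta_1 + (1-x)*beta_2, with both betas
    in [0, 1], pins beta_1 inside the interval returned here."""
    if not (0 < x <= 1 and 0 <= t <= 1):
        raise ValueError("x must be in (0, 1], t in [0, 1]")
    lower = max(0.0, (t - (1.0 - x)) / x)
    upper = min(1.0, t / x)
    return lower, upper

# Hypothetical precinct: group 1 is 60% of residents, overall rate 50%.
lo, hi = duncan_davis_bounds(0.6, 0.5)
```

When the group makes up the whole precinct (x = 1), the bound collapses to a point, which is why bounds are most informative in homogeneous units.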
Collaborative programs have helped reduce catheter-associated urinary tract infection (CAUTI) rates in community-based nursing homes. We assessed whether collaborative participation produced similar benefits among Veterans Health Administration (VHA) nursing homes, which are part of an integrated system.
This study included 63 VHA nursing homes enrolled in the “AHRQ Safety Program for Long-Term Care,” which focused on practices to reduce CAUTI.
Changes in CAUTI rates, catheter utilization, and urine culture orders were assessed from June 2015 through May 2016. Multilevel mixed-effects negative binomial regression was used to derive incidence rate ratios (IRRs) representing changes over the 12-month program period.
There was no significant change in CAUTI among VHA sites, with a CAUTI rate of 2.26 per 1,000 catheter days at month 1 and a rate of 3.19 at month 12 (incidence rate ratio [IRR], 0.99; 95% confidence interval [CI], 0.67–1.44). Results were similar for catheter utilization rates, which were 11.02% at month 1 and 11.30% at month 12 (IRR, 1.02; 95% CI, 0.95–1.09). The numbers of urine cultures per 1,000 residents were 5.27 in month 1 and 5.31 in month 12 (IRR, 0.93; 95% CI, 0.82–1.05).
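The study derives IRRs from multilevel mixed-effects negative binomial regression, which accounts for site-level clustering; as a simpler illustration of the quantities involved, the sketch below computes crude device-day rates and a crude rate ratio (the counts are hypothetical, not figures from the study):

```python
def rate_per_1000(cases, device_days):
    """Crude infection rate per 1,000 device-days (e.g. catheter-days)."""
    return 1000.0 * cases / device_days

def crude_rate_ratio(cases_a, days_a, cases_b, days_b):
    """Crude incidence rate ratio comparing period B with period A.
    A regression-based IRR additionally adjusts for clustering and
    covariates, so it can differ from this unadjusted ratio."""
    return (cases_b / days_b) / (cases_a / days_a)

# Hypothetical counts: 9 CAUTIs over 4,000 catheter-days in month 1,
# 12 CAUTIs over 3,760 catheter-days in month 12.
month1_rate = rate_per_1000(9, 4000)
ratio = crude_rate_ratio(9, 4000, 12, 3760)
```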
No changes in CAUTI rates, catheter use, or urine culture orders were found during the program period. One potential reason was the relatively low baseline CAUTI rate, as compared with a cohort of community-based nursing homes. This low baseline rate is likely related to the VHA’s prior CAUTI prevention efforts. While broad-scale collaborative approaches may be effective in some settings, targeting higher-prevalence safety issues may be warranted at sites already engaged in extensive infection prevention efforts.
We present the first data release of the SkyMapper Southern Survey, a hemispheric survey carried out with the SkyMapper Telescope at Siding Spring Observatory in Australia. Here, we present the survey strategy, data processing, catalogue construction, and database schema. The first data release dataset includes over 66 000 images from the Shallow Survey component, covering an area of 17 200 deg² in all six SkyMapper passbands uvgriz, while the full area covered by any passband exceeds 20 000 deg². The catalogues contain over 285 million unique astrophysical objects, complete to roughly 18 mag in all bands. We compare our griz point-source photometry with Pan-STARRS1 first data release and note an RMS scatter of 2%. The internal reproducibility of SkyMapper photometry is on the order of 1%. Astrometric precision is better than 0.2 arcsec based on comparison with Gaia first data release. We describe the end-user database, through which data are presented to the world community, and provide some illustrative science queries.
We evaluated rates of clinically confirmed long-term-care facility-onset Clostridium difficile infections from April 2014 through December 2016 in 132 Veterans Affairs facilities after the implementation of a prevention initiative. The quarterly pooled rate decreased 36.1% from the baseline (P<.0009 for trend) by the end of the analysis period.
We have detected four far-infrared emission lines of water vapor toward the evolved star W Hydrae, using the Short Wavelength Spectrometer (SWS) of the Infrared Space Observatory (ISO). This is the first detection of thermal water vapor emission from a circumstellar outflow.
Complications within 30 days of a clinically confirmed hospital-onset Clostridium difficile infection diagnosis from July 1, 2012, through June 30, 2015, in 127 acute care Veterans Health Administration facilities were evaluated. Pooled rates for attributable intensive care unit admissions, colectomies, and deaths were 2.7%, 0.5%, and 0.4%, respectively.
Rates of clinically confirmed hospital-onset healthcare facility-associated Clostridium difficile infections from July 1, 2012, through March 31, 2015, in 127 acute care Veterans Affairs facilities were evaluated. Quarterly pooled national standardized infection ratios decreased 15% from baseline by the final quarter of the analysis period (P=.01, linear regression).
A nationwide initiative was implemented in February 2014 to decrease Clostridium difficile infections (CDI) in Veterans Affairs (VA) long-term care facilities. We report a baseline of national CDI data collected during the 2 years before the Initiative.
Personnel at each of 122 reporting sites entered monthly retrospective CDI case data from February 2012 through January 2014 into a national database using case definitions similar to those used in the National Healthcare Safety Network Multidrug-Resistant Organism/CDI module. The data were evaluated using Poisson regression models to examine infection occurrences over time while accounting for admission prevalence and type of diagnostic test.
During the 24-month analysis period, there were 100,800 admissions, 6,976,121 resident days, and 1,558 CDI cases. The pooled CDI admission prevalence rate (including recurrent cases) was 0.38 per 100 admissions, and the pooled nonduplicate/nonrecurrent community-onset rate was 0.17 per 100 admissions. The pooled long-term care facility–onset rate and the clinically confirmed (ie, diarrhea or evidence of pseudomembranous colitis) long-term care facility–onset rate were 1.98 and 1.78 per 10,000 resident days, respectively. Accounting for diagnostic test type, the long-term care facility–onset rate declined significantly (P=.05), but the clinically confirmed long-term care facility–onset rate did not.
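The pooled rates quoted above are simple ratios of total cases to total exposure, scaled to a conventional denominator. A minimal sketch (the case counts below are illustrative back-calculations consistent with the reported rates, not figures taken from the study):

```python
def pooled_rate(cases, denominator, per=10_000):
    """Pooled rate: total cases over total exposure, scaled.
    Use per=100 for rates per 100 admissions, per=10_000 for rates
    per 10,000 resident-days."""
    return per * cases / denominator

# Hypothetical: ~383 qualifying cases among the 100,800 admissions
# reproduce an admission prevalence of 0.38 per 100 admissions.
admission_prevalence = pooled_rate(383, 100_800, per=100)

# Hypothetical: ~1,381 facility-onset cases over 6,976,121 resident-days
# reproduce a rate of 1.98 per 10,000 resident-days.
onset_rate = pooled_rate(1_381, 6_976_121)
```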
VA long-term care facility CDI rates were comparable to those in recent reports from other long-term care facilities. The significant decline in the long-term care facility–onset rate but not in the clinically confirmed long-term care facility–onset rate may have been due to less testing of asymptomatic patients. Efforts to decrease CDI rates in long-term care facilities are necessary as part of a coordinated approach to decrease healthcare-associated infections.
Infect. Control Hosp. Epidemiol. 2016;37(3):295–300
A trend toward greater body size in dizygotic (DZ) than in monozygotic (MZ) twins has been suggested by some but not all studies, and this difference may also vary by age. We analyzed zygosity differences in mean values and variances of height and body mass index (BMI) among male and female twins from infancy to old age. Data were derived from an international database of 54 twin cohorts participating in the COllaborative project of Development of Anthropometrical measures in Twins (CODATwins), and included 842,951 height and BMI measurements from twins aged 1 to 102 years. The results showed that DZ twins were consistently taller than MZ twins, with differences of up to 2.0 cm in childhood and adolescence and up to 0.9 cm in adulthood. Similarly, a greater mean BMI of up to 0.3 kg/m² in childhood and adolescence and up to 0.2 kg/m² in adulthood was observed in DZ twins, although the pattern was less consistent. DZ twins presented up to 1.7% greater height and 1.9% greater BMI than MZ twins; these percentage differences were largest in middle and late childhood and decreased with age in both sexes. The variance of height was similar in MZ and DZ twins at most ages. In contrast, the variance of BMI was significantly higher in DZ than in MZ twins, particularly in childhood. In conclusion, DZ twins were generally taller and had greater BMI than MZ twins, but the differences decreased with age in both sexes.
For over 100 years, the genetics of human anthropometric traits has attracted scientific interest. In particular, height and body mass index (BMI, calculated as kg/m²) have been under intensive genetic research. However, it is still largely unknown whether and how heritability estimates vary between human populations. Opportunities to address this question have increased recently because of the establishment of many new twin cohorts and the increasing accumulation of data in established twin cohorts. We started a new research project to systematically analyze (1) how heritability estimates of height, BMI and their trajectories over the life course vary between birth cohorts, ethnicities and countries, and (2) the effects of birth-related factors, education and smoking on these anthropometric traits and whether these effects vary between twin cohorts. We identified 67 twin projects, including both monozygotic (MZ) and dizygotic (DZ) twins, using various sources. We asked for individual level data on height and weight including repeated measurements, birth related traits, background variables, education and smoking. By the end of 2014, 48 projects participated. Together, we have 893,458 height and weight measures (52% females) from 434,723 twin individuals, including 201,192 complete twin pairs (40% monozygotic, 40% same-sex dizygotic and 20% opposite-sex dizygotic) representing 22 countries. This project demonstrates that large-scale international twin studies are feasible and can promote the use of existing data for novel research purposes.
Past dietary patterns may be more important than recent dietary patterns in the aetiology of chronic diseases because of the long latency in their development. We developed an instrument to recall vegetarian dietary patterns during the lifetime and examined the reliability of recall over 5·3 and 32·6 years on average. The short-term/5-year recall ability study (5-RAS) was done using 24 690 participants from the cohort of the Adventist Health Study-2 (mean age 62·2 years). The long-term/33-year recall ability study (33-RAS) included an overlap population of 1721 individuals who joined the Adventist Health Study-1 and Adventist Health Study-2 (mean age 72·5 years). Spearman correlation coefficients for recall of vegetarian status were 0·78 and 0·72 for the 5-RAS and 33-RAS, respectively, when compared with ‘reference’ data. For both time periods, sensitivity and positive predictive values were highest for the lacto-ovo-vegetarian and non-vegetarian patterns (of the five patterns considered: vegans, lacto-ovo-vegetarians, pesco-vegetarians, semi-vegetarians and non-vegetarians). In the 5-RAS analyses, male, non-black, younger, and more educated participants, lifetime Adventists, and those with more stability of consumption of animal products generally showed higher recall ability. Somewhat similar tendencies were shown for the 33-RAS analyses. Our findings show that the instrument has higher reliability for recalled lacto-ovo-vegetarian and non-vegetarian than for vegan, semi- and pesco-vegetarian dietary patterns in both short- and long-term recalls. This is in part because these last dietary patterns were greatly contaminated by recalls that would correctly have belonged in the adjoining category that consumed more animal products.
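The Spearman coefficient used above is simply the Pearson correlation of the ranked data, with ties assigned their average rank. A self-contained sketch (in practice one would use a statistics library; data here are illustrative):

```python
def _average_ranks(xs):
    """1-based ranks, with tied values sharing the mean of their ranks."""
    order = sorted(range(len(xs)), key=lambda i: xs[i])
    ranks = [0.0] * len(xs)
    i = 0
    while i < len(xs):
        j = i
        # Extend j over the run of values tied with xs[order[i]].
        while j + 1 < len(xs) and xs[order[j + 1]] == xs[order[i]]:
            j += 1
        avg = (i + j) / 2.0 + 1.0
        for k in range(i, j + 1):
            ranks[order[k]] = avg
        i = j + 1
    return ranks

def spearman(x, y):
    """Spearman rank correlation: Pearson correlation of the ranks."""
    rx, ry = _average_ranks(x), _average_ranks(y)
    n = len(x)
    mx, my = sum(rx) / n, sum(ry) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(rx, ry))
    vx = sum((a - mx) ** 2 for a in rx)
    vy = sum((b - my) ** 2 for b in ry)
    return cov / (vx * vy) ** 0.5
```

Because it operates on ranks, the coefficient is well suited to ordinal recall categories such as the vegetarian-pattern spectrum.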
Sampling approaches following the dairy chain, including microbiological hygiene status of critical processing steps and physicochemical parameters, contribute to our understanding of how Staphylococcus aureus contamination risks can be minimised. Such a sampling approach was adopted in this study, together with rapid culture-independent quantification of Staph. aureus to supplement standard microbiological methods. A regional cheese production chain, involving 18 farms, was sampled on two separate occasions. Overall, 51·4% of bulk milk samples were found to be Staph. aureus positive, most of them (34·3%) at the limit of culture-based detection. Staph. aureus positive samples >100 cfu/ml were recorded in 17·1% of bulk milk samples collected mainly during the sampling in November. A higher number of Staph. aureus positive bulk milk samples (94·3%) were detected after applying the culture-independent approach. A concentration effect of Staph. aureus was observed during curd processing. Staph. aureus were not consistently detectable with cultural methods during the late ripening phase, but >100 Staph. aureus cell equivalents (CE)/ml or g were quantifiable by the culture-independent approach until the end of ripening. Enterotoxin gene PCR and pulsed-field gel electrophoresis (PFGE) typing provided evidence that livestock-adapted strains of Staph. aureus mostly dominate at the post-processing level and substantiate the belief that animal hygiene plays a pivotal role in minimising the risk of Staph. aureus associated contamination in cheese making. Therefore, the present data strongly support the need for additional sampling activities and recording of physicochemical parameters during semi-hard cheese-making and cheese ripening, to estimate the risk of Staph. aureus contamination before consumption.
An initiative was implemented in July 2012 to decrease Clostridium difficile infections (CDIs) in Veterans Affairs (VA) acute care medical centers nationwide. This is a report of national baseline CDI data collected from the 21 months before implementation of the initiative.
Personnel at each of 132 data-reporting sites entered monthly retrospective CDI case data from October 2010 through June 2012 into a central database using case definitions similar to those of the National Healthcare Safety Network multidrug-resistant organism/CDI module.
There were 958,387 hospital admissions, 5,286,841 patient-days, and 9,642 CDI cases reported during the 21-month analysis period. The pooled CDI admission prevalence rate (including recurrent cases) was 0.66 cases per 100 admissions. The nonduplicate/nonrecurrent community-onset not-healthcare-facility-associated (CO-notHCFA) case rate was 0.35 cases per 100 admissions, and the community-onset healthcare facility–associated (CO-HCFA) case rate was 0.14 cases per 100 admissions. Hospital-onset healthcare facility–associated (HO-HCFA), clinically confirmed HO-HCFA (CC-HO-HCFA), and CO-HCFA rates were 9.32, 8.40, and 2.56 cases per 10,000 patient-days, respectively. There were significant decreases in admission prevalence (P = .0006, Poisson regression), HO-HCFA (P = .003), and CC-HO-HCFA (P = .004) rates after adjusting for type of diagnostic test. CO-HCFA and CO-notHCFA rates per 100 admissions also trended downward (P = .07 and .10, respectively).
VA acute care medical facility CDI rates were higher than those reported in other healthcare systems, but unlike rates in other venues, they were decreasing or trending downward. Despite these downward trends, there is still a substantial burden of CDI in the system, supporting the need for efforts to decrease rates further.
Currently there are no methods available for staining rat and human myocardial microvasculature on thick sections that would allow for specific staining and differentiation of arterioles, venules, and capillaries. A non-injection technique is described that allows for labeling of the microvascular bed (MVB) in formalin-fixed pieces of the myocardium from humans and the white rat Rattus norvegicus, as well as human full-mount pericardium. Vessel staining is based on the activity of phosphatases (ATPases) and the precipitation of the released phosphate with calcium ions at high pH (pH 10.5–11.5). The resulting precipitate is subsequently converted to black or brown lead sulfide. The specificity of this reaction to vessels of the MVB allows arterioles, venules, capillaries, and pre- and postcapillaries to be clearly visualized in thick (60–100 µm) and ultra-thick (300–500 µm) sections against an unstained background of muscle and connective tissue. In addition, smooth muscle cells of arterioles are also stained, allowing for differentiation between arteriolar and venular beds. These observations have not been reported in rat or human myocardium using other methods. This procedure should benefit studies of coronary microcirculation in experimental and pathological conditions, as well as in pharmacological investigations.