Although depressive psychopathology has been known for many decades to be common in first-episode schizophrenia spectrum disorders (FES), knowledge remains limited regarding the extent and nature of such psychopathology (degree of comorbidity, caseness, severity) and its demographic, clinical, functional and treatment correlates. This study aimed to determine the pooled prevalence of depressive disorder and caseness, and the pooled mean severity of depressive symptoms, as well as the demographic, illness, functional and treatment correlates of depressive psychopathology in FES.
This systematic review, meta-analysis and meta-regression was prospectively registered (CRD42018084856) and conducted in accordance with PRISMA and MOOSE guidelines.
Forty studies comprising 4041 participants were included. The pooled prevalence of depressive disorder and caseness was 26.0% (seven samples, N = 855, 95% CI 22.1–30.3) and 43.9% (11 samples, N = 1312, 95% CI 30.3–58.4), respectively. The pooled mean percentage of maximum depressive symptom severity was 25.1 (38 samples, N = 3180, 95% CI 21.49–28.68). Correlates of depressive psychopathology were also found.
At least one-quarter of individuals with FES will experience, and therefore require treatment for, a full-threshold depressive disorder. Nearly half will experience levels of depressive symptoms that are severe enough to warrant diagnostic investigation and therefore clinical intervention – regardless of whether they actually fulfil diagnostic criteria for a depressive disorder. Depressive psychopathology is prominent in FES, manifesting not only as superimposed comorbidity, but also as an inextricable symptom domain.
Catheter-associated urinary tract infections in 592 hospitals immediately declined after federal value-based incentive program implementation, but this was fully attributable to a concurrent surveillance case definition revision. Post revision, more hospitals had favorable standardized infection ratios, likely leading to artificial inflation of their performance scores unrelated to changes in patient safety.
Ultrasound applications are widespread, and their uses in resource-limited environments are numerous. In disasters, the use of ultrasound can help reallocate resources by guiding decisions on management and transportation priorities. These interventions can occur on-scene, at triage collection points, during transport, and at the receiving medical facility. Literature related to this specific topic is limited. However, literature regarding prehospital use of ultrasound, ultrasound in combat situations, and some articles specific to disaster medicine allude to the potential growth of ultrasound utilization in disaster response.
To evaluate the utility of point-of-care ultrasound in a disaster response based on studies involving ultrasonography in resource-limited environments.
A narrative review of MEDLINE, MEDLINE InProcess, EPub, and Embase found 20 articles for inclusion.
Experiences from past disasters, prehospital care, and combat experiences have demonstrated the value of ultrasound both as a diagnostic and interventional modality.
Current literature supports the use of ultrasound in disaster response as a real-time, portable, safe, reliable, repeatable, easy-to-use, and accurate tool. While both false positives and false negatives were reported in prehospital studies, these rates are comparable to the accepted false-positive and false-negative rates of standard in-hospital point-of-care ultrasound exams. Studies involving austere environments demonstrate the ability to apply ultrasound in extreme conditions and to obtain high-quality images with only modest training and real-time remote guidance. Point-of-care ultrasound thus holds clear potential for the triage and management of mass casualty incidents. However, as these studies are heterogeneous and observational in nature, further research is needed as to how to integrate ultrasound into the response and recovery phases.
An X-ray fluorescence analysis unit has been automated with a multi-position sample changer, a stepping motor to position the spectrometer, and computer addressable switches to control the selection of crystal, detector, collimator, and beam filter. The unit can be controlled off-line through a Teletype or on-line with a computer. This computer utilizes a multi-user program for the simultaneous operation of the fluorescence analysis unit and two diffractometers. Programming the system for any desired analytical or research procedure is accomplished using an expanded version of BASIC.
Culture-based studies, which focus on individual organisms, have implicated stethoscopes as potential vectors of nosocomial bacterial transmission. However, the full bacterial communities that contaminate in-use stethoscopes have not been investigated.
We used bacterial 16S rRNA gene deep-sequencing, analysis, and quantification to profile entire bacterial populations on stethoscopes in use in an intensive care unit (ICU), including practitioner stethoscopes, individual-use patient-room stethoscopes, and clean unused individual-use stethoscopes. Two additional sets of practitioner stethoscopes were sampled before and after cleaning using standardized or practitioner-preferred methods.
Bacterial contamination levels were highest on practitioner stethoscopes, followed by patient-room stethoscopes, whereas clean stethoscopes were indistinguishable from background controls. Bacterial communities on stethoscopes were complex, and community analysis by weighted UniFrac showed that physician and patient-room stethoscopes were indistinguishable and significantly different from clean stethoscopes and background controls. Genera relevant to healthcare-associated infections (HAIs) were common on practitioner stethoscopes, among which Staphylococcus was ubiquitous and had the highest relative abundance (6.8%–14% of contaminating bacterial sequences). Other HAI-related genera were also widespread although lower in abundance. Cleaning of practitioner stethoscopes resulted in a significant reduction in bacterial contamination levels, but these levels reached those of clean stethoscopes in only a few cases with either standardized or practitioner-preferred methods, and bacterial community composition did not significantly change.
Stethoscopes used in an ICU carry bacterial DNA reflecting complex microbial communities that include nosocomially important taxa. Commonly used cleaning practices reduce contamination but are only partially successful at modifying or eliminating these communities.
To assess variability in antimicrobial use and associations with infection testing in pediatric ventilator-associated events (VAEs).
Descriptive retrospective cohort with nested case-control study.
Pediatric intensive care units (PICUs), cardiac intensive care units (CICUs), and neonatal intensive care units (NICUs) in 6 US hospitals.
Children ≤18 years ventilated for ≥1 calendar day.
We identified patients with pediatric ventilator-associated conditions (VACs), pediatric VACs with antimicrobial use for ≥4 days (AVACs), and possible ventilator-associated pneumonia (PVAP, defined as pediatric AVAC with a positive respiratory diagnostic test) according to previously proposed criteria.
Among 9,025 ventilated children, we identified 192 VAC cases, 43 in CICUs, 70 in PICUs, and 79 in NICUs. AVAC criteria were met in 79 VAC cases (41%) (58% CICU; 51% PICU; and 23% NICU), and varied by hospital (CICU, 20–67%; PICU, 0–70%; and NICU, 0–43%). Type and duration of AVAC antimicrobials varied by ICU type. AVAC cases in CICUs and PICUs received broad-spectrum antimicrobials more often than those in NICUs. Among AVAC cases, 39% had respiratory infection diagnostic testing performed; PVAP was identified in 15 VAC cases. Also, among AVAC cases, 73% had no associated positive respiratory or nonrespiratory diagnostic test.
Antimicrobial use is common in pediatric VAC, with variability in spectrum and duration of antimicrobials within hospitals and across ICU types, while PVAP is uncommon. Prolonged antimicrobial use despite low rates of PVAP or positive laboratory testing for infection suggests that AVAC may provide a lever for antimicrobial stewardship programs to improve utilization.
Life span bias potentially alters species abundance in death assemblages through the overrepresentation of short-lived organisms compared with their long-lived counterparts. Although previous work found that life span bias did not contribute significantly to live–dead discordance in bivalve assemblages, life span bias better explained discordance in two groups: longer-lived bivalve species and species with known life spans. More studies using local, rather than global, species-wide life spans and mortality rates would help to determine the prevalence of life span bias, especially for long-lived species with known life spans. Here, we conducted a field study at two sites in North Carolina to assess potential life span bias between Mercenaria mercenaria and Chione elevata, two long-lived bivalve species that can be aged directly. We compared the ability of directly measured local life spans with that of regional and global life spans to predict live–dead discordance between these two species. The shorter-lived species (C. elevata) was overrepresented in the death assemblage compared with its live abundance, and local life span data largely predicted the amount of live–dead discordance; local life spans predicted 43% to 88% of discordance. Furthermore, the global maximum life span for M. mercenaria resulted in substantial overpredictions of discordance (1.4 to 1.6 times the observed live–dead discordance). The results of this study suggest that life span bias should be considered as a factor affecting proportional abundances of species in death assemblages and that using life span estimates appropriate to the study locality improves predictions of discordance based on life span compared with using global life span estimates.
This study examined the effectiveness of a formal postdoctoral education program designed to teach skills in clinical and translational science, using scholar publication rates as a measure of research productivity.
Participants included 70 clinical fellows who were admitted to a master’s or certificate training program in clinical and translational science from 1999 to 2015 and 70 matched control peers. The primary outcomes were the number of publications 5 years post-fellowship matriculation and time to publishing 15 peer-reviewed manuscripts post-matriculation.
Clinical and translational science program graduates published significantly more peer-reviewed manuscripts at 5 years post-matriculation (median 8 vs 5, p=0.041) and had a faster time to publication of 15 peer-reviewed manuscripts (matched hazard ratio = 2.91, p=0.002). Additionally, program graduates’ publications yielded a significantly higher average H-index (11 vs. 7, p=0.013).
These findings support the effectiveness of formal training programs in clinical and translational science by increasing academic productivity.
In 2012, the Centers for Medicare and Medicaid Services expanded a 2008 program that eliminated additional Medicare payment for mediastinitis following coronary artery bypass graft (CABG) to include Medicaid. We aimed to evaluate the impact of this Medicaid program on mediastinitis rates reported by the National Healthcare Safety Network (NHSN) compared with the rates of a condition not targeted by the program, deep-space surgical site infection (SSI) after knee replacement.
Interrupted time series with comparison group.
We included surveillance data from nonfederal acute-care hospitals participating in the NHSN and reporting CABG or knee replacement outcomes from January 2009 through June 2017. We examined the Medicaid program’s impact on NHSN-reported infection rates, adjusting for secular trends. The data analysis used generalized estimating equations with robust sandwich variance estimators.
During the study period, 196 study hospitals reported 273,984 CABGs to the NHSN, resulting in 970 mediastinitis cases (0.35%), and 294 hospitals reported 555,395 knee replacements, with 1,751 resultant deep-space SSIs (0.32%). There was no significant change in incidence of either condition during the study. Mediastinitis models showed no effect of the 2012 Medicaid program on either secular trend during the postprogram versus preprogram periods (P=.70) or an immediate program effect (P=.83). Results were similar in sensitivity analyses when adjusting for hospital characteristics, restricting to hospitals with consistent NHSN reporting or incorporating a program implementation roll-in period. Knee replacement models also showed no program effect.
The 2012 Medicaid program to eliminate additional payments for mediastinitis following CABG had no impact on reported mediastinitis rates.
This paper concentrates on livestock production systems by introducing sustainable housing characteristics, and the type of information required to make an informed choice on environmentally sound materials and systems. It then compares energy use in two contrasting beef cattle systems, one a conventional straw-bedded court and roofed silo, with feed delivered by a side-delivery wagon, and the other a roofless woodchip corral and earth-bank silo, with feed delivered by fore-end loader. The woodchip corral system requires 70% less energy than the conventional bedded court, when the total energy inputs are analysed for preparation of the building materials, construction of the livestock accommodation with associated feed and waste storage, and manufacture and operation of machinery. However, when energy used in feed production is included this dominates the energy budget, accounting for 60% of all energy used in the conventional bedded court, and 85% of energy used in the woodchip corral system.
On August 25, 2017, Hurricane Harvey made landfall near Corpus Christi, Texas. The ensuing unprecedented flooding throughout the Texas coastal region affected millions of individuals.1 The statewide response in Texas included the sheltering of thousands of individuals at considerable distances from their homes. The Dallas area established large-scale general population sheltering as the number of evacuees to the area began to amass. Historically, the Dallas area is one familiar with “mega-sheltering,” beginning with the response to Hurricane Katrina in 2005.2 Through continued efforts and development, the Dallas area had been readying a plan for the largest general population shelter in Texas. (Disaster Med Public Health Preparedness. 2019;13:33–37)
Middle East respiratory syndrome (MERS) is caused by a novel coronavirus (MERS-CoV) discovered in 2012. Since then, 1806 cases, including 564 deaths, have been reported by the Kingdom of Saudi Arabia (KSA) and affected countries as of 1 June 2016. Previous literature attributed increases in MERS-CoV transmission to camel breeding season, as camels are likely the reservoir for the virus. However, this literature review and subsequent analysis indicate a lack of seasonality. A retrospective, epidemiological cluster analysis was conducted to investigate increases in MERS-CoV transmission and reports of household and nosocomial clusters. Cases were verified and associations between cases were substantiated through an extensive literature review and the Armed Forces Health Surveillance Branch's Tiered Source Classification System. A total of 51 clusters were identified, primarily nosocomial (80.4%), and most occurred in KSA (45.1%). Clusters corresponded temporally with the majority of periods of greatest incidence, suggesting a strong correlation between nosocomial transmission and notable increases in cases.
Whether monozygotic (MZ) and dizygotic (DZ) twins differ from each other in a variety of phenotypes is important for genetic twin modeling and for inferences made from twin studies in general. We analyzed whether there were differences in individual, maternal and paternal education between MZ and DZ twins in a large pooled dataset. Information was gathered on individual education for 218,362 adult twins from 27 twin cohorts (53% females; 39% MZ twins), and on maternal and paternal education for 147,315 and 143,056 twins respectively, from 28 twin cohorts (52% females; 38% MZ twins). Together, we had information on individual or parental education from 42 twin cohorts representing 19 countries. The original education classifications were transformed to education years and analyzed using linear regression models. Overall, MZ males had 0.26 (95% CI [0.21, 0.31]) years and MZ females 0.17 (95% CI [0.12, 0.21]) years longer education than DZ twins. The zygosity difference became smaller in more recent birth cohorts for both males and females. Parental education was somewhat longer for fathers of DZ twins in cohorts born in 1990–1999 (0.16 years, 95% CI [0.08, 0.25]) and 2000 or later (0.11 years, 95% CI [0.00, 0.22]), compared with fathers of MZ twins. The results show that the years of both individual and parental education are largely similar in MZ and DZ twins. We suggest that the socio-economic differences between MZ and DZ twins are so small that inferences based upon genetic modeling of twin data are not affected.
This study aimed to examine the effects of replacing rolled barley (high in starch) with citrus pulp (high in digestible fibre) in a supplement on intake and performance of young growing cattle offered grass silage ad libitum for 101 days. Weaned, early- and late-maturing breed, male suckled beef calves (n=120) were blocked by sire breed, gender and weight and from within block randomly assigned to one of two concentrate supplements based mainly on rolled barley (BAR) or citrus pulp (CIT) and formulated to have similar concentrations of true protein digestible in the small intestine. On day 87, blood samples were taken before and 2 h after feeding, and rumen fluid samples were collected 2 h post-feeding. Supplement type did not affect (P>0.05) grass silage intake, live weight gain, final live weight, ultrasonically assessed body composition or measurements of skeletal size. Rumen pH (6.64 v. 6.79), ammonia (51 v. 81 mg/l) and acetate-to-propionate ratio (2.7 v. 3.2) were lower (P<0.001) for CIT than BAR. In conclusion, citrus pulp can replace barley in concentrate supplements for growing cattle without negatively affecting performance.
The performance of early-maturing breed sired suckler bulls finished at pasture, with or without concentrate supplementation, at 15 or 19 months of age was evaluated. In total, 60 Aberdeen Angus-sired bulls were assigned to a two (slaughter age (SA): 15 (S15) or 19 (S19) months) × two (finishing strategy (FS): grass only or grass + barley-based concentrate) factorial arrangement. There were no (P>0.05) SA×FS interactions. Increasing SA increased carcass weight (265 v. 355 kg), kill-out proportion (542 v. 561 g/kg), conformation (6.7 v. 8.3, 1 to 15) (P<0.001) and fat (5.8 v. 6.8) scores (P<0.01), and resulted in yellower subcutaneous fat (‘b’ value, 6.6 v. 8.3) and darker muscle (‘L’ value, 30.0 v. 28.3) (P<0.01). Supplementation reduced estimated herbage intake by 0.60 and 0.47 kg dry matter (DM)/kg DM of concentrates for S15 and S19, respectively. Supplementation increased carcass weight (+6.7%, P<0.001) and kill-out proportion (+1.8%, P=0.06) but had no effect on carcass fat and conformation scores or fat and muscle colour. In conclusion, carcasses were adequately finished, with or without concentrates for S19, but not for S15. Supplementation had no effect, and age had relatively minor effects, on fat and muscle colour.
During 2000–07, five giant icebergs (B15A, B15J, B15K, C16 and C25) adrift in the southwestern Ross Sea, Antarctica, were instrumented with global positioning system (GPS) receivers and other instruments to monitor their behavior in the near-coastal environment. The measurements show that collision processes can strongly influence iceberg behavior and delay their progress in drifting to the open ocean. Collisions appear to have been a dominant control on the movement of B15A, the largest of the icebergs, during the 4-year period it gyrated within the limited confines of Ross Island, the fixed Ross Ice Shelf and grounded C16. Iceberg interactions in the near-coastal regime are largely driven by ocean tidal effects, which determine the magnitude of forces generated during collision and break-up events. Estimates of forces derived from the observed drift trajectories during the iceberg-collision-induced calving of iceberg C19 from the Ross Ice Shelf, during the iceberg-induced break-off of the tip of the Drygalski Ice Tongue and during the break-up of B15A provide a crude estimate of the stress scale involved in iceberg calving. Considering the total area of the vertical face of new rifts created in the calving or break-up process, and not accounting for local stress amplification near rift tips, this estimated stress scale is 10^4 Pa.
To establish if the relatively low rate of involuntary psychiatric admission in a suburban area between 2007 and 2011 was maintained in 2014/2015, and explore key correlates of involuntary status.
We used existing hospital records and data sources to extract rates and selected potential correlates of voluntary and involuntary admission in south west Dublin (catchment area: 273 419 people) over 18 months in 2014/2015 and compared these with published national data from the census and Health Research Board.
The rate of involuntary admission in the suburban area studied between 2007 and 2011 was 33.8 involuntary admissions per 100 000 population annually, which was lower than the national rate (48.6). By 2014/2015, the rate of involuntary admission in this area had risen to 46.8 involuntary admissions per 100 000 population annually, similar to the national rate (44.9). Nevertheless, the overall (voluntary and involuntary) admission rate in the suburban area (346.7 admissions per 100 000 population annually) was still lower than the national rate (387.9), owing to a lower rate of voluntary admission in the suburban area (299.9) compared to Ireland as a whole (342.9). Multi-variable testing demonstrated that diagnosis was the strongest driver of involuntary admission in the suburban area: this area had 28.5 involuntary admissions for schizophrenia or related disorders per 100 000 population annually, compared to 18.9 nationally. Schizophrenia and related disorders accounted for 60.9% of involuntary admissions in the suburban area compared to 42.1% nationally.
Schizophrenia is the strongest driver of involuntary admission in the suburban area in this study.
Objectives: Sleep quality affects memory and executive function in older adults, but little is known about its effects in midlife. If it affects cognition in midlife, it may be a modifiable factor for later-life functioning. Methods: We examined the association between sleep quality and cognition in 1220 middle-aged male twins (age 51–60 years) from the Vietnam Era Twin Study of Aging. We interviewed participants with the Pittsburgh Sleep Quality Index and tested them for episodic memory as well as executive functions of inhibitory and interference control, updating in working memory, and set shifting. Interference control was assessed during episodic memory; inhibitory control during working memory and non-memory conditions; and set shifting during working memory and non-memory conditions. Results: After adjusting for covariates and correcting for multiple comparisons, sleep quality was positively associated with updating in working memory, set shifting in the context of working memory, and better visual-spatial (but not verbal) episodic memory, and at trend level, with interference control in the context of episodic memory. Conclusions: Sleep quality was associated with visual-spatial recall and possible resistance to proactive/retroactive interference. It was also associated with updating in working memory and with set shifting, but only when working memory demands were relatively high. Thus, effects of sleep quality on midlife cognition appear to be at the intersection of executive function and memory processes. Subtle deficits in these age-susceptible cognitive functions may indicate increased risk for decline in cognitive abilities later in life that might be reduced by improved midlife sleep quality. (JINS, 2018, 24, 67–76)