Introduction: An important challenge physicians face when treating acute heart failure (AHF) patients in the emergency department (ED) is deciding whether to admit or discharge, with or without early follow-up. The overall goal of our project was to improve care for AHF patients seen in the ED while avoiding unnecessary hospital admissions. The specific goal was to introduce hospital rapid referral clinics to ensure AHF patients were seen within 7 days of ED discharge. Methods: This prospective before-after study was conducted at two campuses of a large tertiary care hospital, including the EDs and specialty outpatient clinics. We enrolled AHF patients ≥50 years who presented to the ED with shortness of breath (<7 days). The 12-month before (control) period was separated from the 12-month after (intervention) period by a 3-month implementation period. Implementation included creation of rapid access AHF clinics staffed by cardiology and internal medicine, and development of referral procedures. There was extensive in-servicing of all ED staff. The primary outcome measure was hospital admission at the index visit or within 30 days. Secondary outcomes included mortality and actual access to rapid follow-up. We used segmented autoregression analysis of the monthly proportions to determine whether there was a change in admissions coinciding with the introduction of the intervention and estimated a sample size of 700 patients. Results: The patients in the before period (N = 355) and the after period (N = 374) were similar for age (77.8 vs. 78.1 years), arrival by ambulance (48.7% vs. 51.1%), comorbidities, current medications, and need for non-invasive ventilation (10.4% vs. 6.7%). Comparing the before and after periods, we observed a decrease in hospital admissions at the index visit (from 57.7% to 42.0%; P < 0.01), as well as in all admissions within 30 days (from 65.1% to 53.5%; P < 0.01). The autoregression analysis, however, demonstrated a pre-existing trend towards fewer admissions and could not attribute the decrease to the intervention (P = 0.91). Attendance at a specialty clinic amongst those discharged increased from 17.8% to 42.1% (P < 0.01), and the median days to clinic decreased from 13 to 6 (P < 0.01). Thirty-day mortality did not change (4.5% vs. 4.0%; P = 0.76). Conclusion: Implementation of rapid-access dedicated AHF clinics led to considerably increased access to specialist care, much reduced follow-up times, and a possible reduction in hospital admissions. Widespread use of this approach can improve AHF care in Canada.
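As a rough illustration of the segmented regression idea described above, the sketch below fits a level-change and slope-change model to simulated monthly admission proportions. All variable names and data are hypothetical, and HAC standard errors stand in for the paper's segmented autoregression; this is not the study's analysis code.

```python
# Hedged sketch: segmented regression (interrupted time series) on monthly
# admission proportions. Column names and data are hypothetical, not the
# study's actual dataset; the paper used a segmented autoregression.
import numpy as np
import pandas as pd
import statsmodels.api as sm

# 24 monthly proportions: 12 "before" months, 12 "after" months
rng = np.random.default_rng(0)
p_admit = np.r_[0.60 - 0.005 * np.arange(12), 0.50 - 0.005 * np.arange(12)]
p_admit += rng.normal(0, 0.02, 24)

df = pd.DataFrame({
    "month": np.arange(24),                      # time since study start
    "post": np.r_[np.zeros(12), np.ones(12)],    # intervention indicator
    "p_admit": p_admit,
})
df["months_post"] = np.where(df["post"] == 1, df["month"] - 11, 0)

# Level-change (post) and slope-change (months_post) terms on top of the
# pre-existing trend (month); HAC errors approximate autocorrelated residuals.
X = sm.add_constant(df[["month", "post", "months_post"]])
model = sm.OLS(df["p_admit"], X).fit(cov_type="HAC", cov_kwds={"maxlags": 1})
print(model.summary())
```

A non-significant coefficient on the level-change term, in the presence of a significant pre-existing trend, corresponds to the pattern the authors report.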
We investigated whether neurobehavioral markers of risk for emotion dysregulation were evident among newborns, as well as whether the identified markers were associated with prenatal exposure to maternal emotion dysregulation. Pregnant women (N = 162) reported on their emotion dysregulation prior to a laboratory assessment. The women were then invited to the laboratory to assess baseline respiratory sinus arrhythmia (RSA) and RSA in response to an infant cry. Newborns were assessed after birth via the NICU Network Neurobehavioral Scale. We identified two newborn neurobehavioral factors—arousal and attention—via exploratory factor analysis. Low arousal was characterized by less irritability, excitability, and motor agitation, while low attention was related to a lower threshold for auditory and visual stimulation, less sustained attention, and poorer visual tracking abilities. Pregnant women who reported higher levels of emotion dysregulation had newborns with low arousal levels and less attention. Larger decreases in maternal RSA in response to cry were also related to lower newborn arousal. We provide the first evidence that a woman's emotion dysregulation while pregnant is associated with risks for dysregulation in her newborn. Implications for intergenerational transmission of emotion dysregulation are discussed.
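For readers unfamiliar with the factor-extraction step, the following sketch shows a two-factor solution on simulated newborn neurobehavioral scores. The subscale names, data, and the use of scikit-learn's FactorAnalysis with varimax rotation (available in scikit-learn ≥ 0.24) are illustrative assumptions, not the authors' NNNS-based analysis.

```python
# Hedged sketch: extracting two factors ("arousal", "attention") from
# hypothetical neurobehavioral subscale scores. Names and data are
# illustrative only; this is not the authors' analysis.
import numpy as np
from sklearn.decomposition import FactorAnalysis

rng = np.random.default_rng(1)
# rows = newborns, cols = hypothetical subscale scores
nnns = rng.normal(size=(162, 6))
scales = ["irritability", "excitability", "motor_agitation",
          "stim_threshold", "sustained_attention", "visual_tracking"]

fa = FactorAnalysis(n_components=2, rotation="varimax").fit(nnns)
for scale, loadings in zip(scales, fa.components_.T):
    print(f"{scale:>20}: factor1={loadings[0]:+.2f} factor2={loadings[1]:+.2f}")
```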
Research on the drivers of vaccine acceptance has expanded, but most interventions fall short of coverage targets. We explored whether vaccine uptake is driven directly or indirectly by disgust, with attitudes towards vaccines acting as a possible mediator. An online cross-sectional study of 1007 US adults recruited via Amazon's Mechanical Turk was conducted in January 2017. The questionnaire consisted of four sections: (1) items assessing attitudes towards vaccines and vaccine uptake, (2) the revised Disgust Scale (DS-R) to measure Disgust Sensitivity, (3) the Perceived Vulnerability to Disease scale (PVD) to measure Germ Aversion and Perceived Susceptibility, and (4) socio-demographic information. Using mediation analysis, we assessed the direct, indirect (through Vaccine Attitudes) and total effects of Disgust Sensitivity, Germ Aversion and Perceived Susceptibility on 2016 self-reported flu vaccine uptake. Mediation analysis showed the effect of Disgust Sensitivity and Germ Aversion on vaccine uptake to be twofold: a direct positive effect on vaccine uptake and an indirect negative effect through Vaccine Attitudes. In contrast, Perceived Susceptibility was found to have only a direct positive effect on vaccine uptake. Nonetheless, these effects were attenuated and small compared with economic, logistic and psychological determinants of vaccine uptake.
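A minimal sketch of the mediation decomposition, using statsmodels' Mediation class on simulated data, is shown below. The variable names, simulated effect sizes, and model forms are assumptions for illustration and do not reproduce the study's models or data.

```python
# Hedged sketch: mediation of disgust sensitivity -> vaccine attitudes ->
# uptake. Names and simulated effects are illustrative; the signs are chosen
# to mimic the qualitative pattern reported (positive direct, negative
# indirect effect), not the study's estimates.
import numpy as np
import pandas as pd
import statsmodels.api as sm
from statsmodels.stats.mediation import Mediation

rng = np.random.default_rng(2)
n = 1007
disgust = rng.normal(size=n)                          # DS-R score (standardised)
attitudes = 0.4 * disgust + rng.normal(size=n)        # vaccine attitudes (mediator)
p = 1 / (1 + np.exp(-(0.2 * disgust - 0.5 * attitudes)))
uptake = rng.binomial(1, p)                           # self-reported flu vaccine uptake
df = pd.DataFrame({"disgust": disgust, "attitudes": attitudes, "uptake": uptake})

outcome_model = sm.GLM.from_formula("uptake ~ disgust + attitudes", df,
                                    family=sm.families.Binomial())
mediator_model = sm.OLS.from_formula("attitudes ~ disgust", df)
med = Mediation(outcome_model, mediator_model, "disgust", "attitudes")
print(med.fit(n_rep=200).summary())   # direct, indirect and total effects
```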
Soldiers' operational performance is determined by their fitness, nutritional status, quality of rest/recovery, and remaining free of injury and illness. Understanding large fluctuations in nutritional status during operations is critical to safeguarding health and well-being. There are limited data worldwide describing the effect of extreme changes in climate on nutrient profiles. This study investigated the effect of hot-dry deployments on the vitamin D status (assessed from 25-hydroxyvitamin D (25(OH)D) concentration) of young, male, military volunteers. Two data sets are presented (pilot study, n 37; main study, n 98), examining serum 25(OH)D concentrations before and during 6-month summer operational deployments to Afghanistan (March to October/November). Body mass, percentage of body fat, dietary intake and serum 25(OH)D concentrations were measured. In addition, parathyroid hormone (PTH), adjusted Ca and albumin concentrations were measured in the main study to better understand 25(OH)D fluctuations. Body mass and fat mass (FM) losses were greater for early (pre- to mid-) deployment than for late (mid- to post-) deployment (P<0·05). Dietary intake was well maintained despite high rates of energy expenditure. A pronounced increase in 25(OH)D was observed between pre- (March) and mid-deployment (June) (pilot study: 51 (sd 20) v. 212 (sd 85) nmol/l, P<0·05; main study: 55 (sd 22) v. 167 (sd 71) nmol/l, P<0·05), and concentrations remained elevated post-deployment (October/November). In contrast, PTH was highest pre-deployment, decreasing thereafter (main study: 4·45 (sd 2·20) v. 3·79 (sd 1·50) pmol/l, P<0·05). The typical seasonal cycling of vitamin D appeared exaggerated in this active male population undertaking an arduous summer deployment. Further research is warranted, as such large seasonal vitamin D fluctuations may be detrimental to bone health in the longer term.
The visual system is recognized as an important site of pathology and dysfunction in schizophrenia. In this study, we evaluated different visual perceptual functions in patients with psychotic disorders using a potentially clinically applicable task battery and assessed their relationship with symptom severity in patients, and with schizotypal features in healthy participants.
Five different areas of visual functioning were evaluated in patients with schizophrenia and schizoaffective disorder (n = 28) and healthy control subjects (n = 31) using a battery that included visuospatial working memory (VSWM), velocity discrimination (VD), contour integration, visual context processing, and backward masking tasks.
The patient group demonstrated significantly lower performance on the VD, contour integration, and VSWM tasks. Performance did not differ between the two groups on the visual context processing task and did not differ across interstimulus intervals in the backward masking task. Performance on the VSWM, VD, and contour integration tasks was correlated with negative symptom severity but not with other symptom dimensions in the patient group. VSWM and VD performance was also correlated with negative schizotypal features in healthy controls.
Taken together, these results demonstrate significant abnormalities in multiple visual processing tasks in patients with psychotic disorders, adding to the literature implicating visual abnormalities in these conditions. Furthermore, our results show that visual processing impairments are associated with the negative symptom dimension in patients as well as healthy individuals.
This study determined the prevalence of inadequate micronutrient intakes among long-term care (LTC) residents. This cross-sectional study was completed in thirty-two LTC homes in four Canadian provinces. Weighed and estimated food and beverage intake were collected over 3 non-consecutive days from 632 randomly selected residents. Nutrient intakes were adjusted for intra-individual variation and compared with the Dietary Reference Intakes. Proportions of participants, stratified by sex and use of modified texture foods (MTF) or regular texture foods, with intakes below the Estimated Average Requirement (EAR) or Adequate Intake (AI) were identified. The number of participants who met these adequacy values with the use of micronutrient supplements was also determined. Mean age of males (n 197) was 85·2 (sd 7·6) years and of females (n 435) was 87·4 (sd 7·8) years. In all, 33 % consumed MTF; 78·2 % (males) and 76·1 % (females) took at least one micronutrient pill. Participants on an MTF had lower intakes of some nutrients (males = 4; females = 8), but also consumed a few nutrients in larger amounts than regular texture consumers (males = 4; females = 1). More than 50 % of participants of both sexes and texture groups consumed inadequate amounts of folate, vitamin B6, Ca, Mg and Zn (males only), with >90 % consuming amounts below the EAR/AI for vitamins D, E and K, Mg (males only) and potassium. Vitamin D supplements resolved inadequate intakes for 50–70 % of participants. High proportions of LTC residents had intakes below the EAR or AI for nine of the twenty nutrients examined. Strategies to improve intake of these specific nutrients are needed.
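To illustrate the adjustment for intra-individual variation and the EAR cut-point comparison, the sketch below shrinks 3-day mean intakes toward the group mean and counts the proportion falling below the EAR. The simulated intakes, the EAR value used, and the simple shrinkage factor are illustrative assumptions; the study's actual adjustment procedure may differ.

```python
# Hedged sketch of the EAR cut-point idea: shrink each resident's 3-day mean
# intake toward the group mean to remove day-to-day (within-person) variation,
# then count the proportion below the EAR. Data and shrinkage are illustrative.
import numpy as np

rng = np.random.default_rng(3)
ear_folate = 320.0                                  # EAR for folate, µg DFE/day (adults)
days = rng.normal(280, 90, size=(632, 3))           # 3 observed days per resident
person_mean = days.mean(axis=1)
group_mean = person_mean.mean()

# between-person share of total variance -> shrinkage toward the group mean
between_var = person_mean.var(ddof=1) - days.var(ddof=1, axis=1).mean() / 3
shrink = max(between_var, 0) / person_mean.var(ddof=1)
usual_intake = group_mean + shrink * (person_mean - group_mean)

print(f"% below EAR (unadjusted means): {(person_mean < ear_folate).mean():.1%}")
print(f"% below EAR (adjusted):         {(usual_intake < ear_folate).mean():.1%}")
```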
Avian influenza virus (AIV) subtypes H5 and H7 can infect poultry, causing low pathogenicity (LP) AI, but these LPAIVs may mutate to highly pathogenic AIV in chickens or turkeys, causing high mortality; the H5/H7 subtypes therefore demand statutory intervention. Serological surveillance in the European Union provides evidence of H5/H7 AIV exposure in apparently healthy poultry. To identify the most sensitive screening method as the first step in an algorithm to provide evidence of H5/H7 AIV infection, the standard approach of H5/H7 antibody testing by haemagglutination inhibition (HI) was compared with an ELISA, which detects antibodies to all subtypes. Sera (n = 1055) from 74 commercial chicken flocks were tested by both methods. A Bayesian approach served to estimate diagnostic test sensitivities and specificities without assuming any ‘gold standard’. Sensitivity and specificity were 97% and 99.8% for the ELISA and 43% and 99.8% for H5/H7 HI, respectively, although H5/H7 HI sensitivity varied considerably between infected flocks. The ELISA therefore provides superior sensitivity for the screening of chicken flocks as part of an algorithm that subsequently utilises H5/H7 HI to identify infection by these two subtypes. With the calculated sensitivity and specificity, testing nine sera per flock is sufficient to detect a flock seroprevalence of 30% with 95% probability.
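The flock-level sampling claim can be checked with a back-of-the-envelope calculation: the probability that at least one of n sampled sera from an infected flock returns a true-positive ELISA result. The sketch below assumes independent sampling and ignores the negligible false-positive rate; it is not the paper's Bayesian model.

```python
# Hedged sketch: probability that at least one of n sampled sera tests
# positive, given within-flock seroprevalence and ELISA sensitivity.
prevalence, sensitivity = 0.30, 0.97

def p_detect(n: int) -> float:
    """P(>=1 true-positive result among n independently sampled sera)."""
    return 1 - (1 - prevalence * sensitivity) ** n

for n in (5, 9, 15):
    print(f"n = {n:2d}: P(detect) = {p_detect(n):.3f}")
# n = 9 gives ~0.955, consistent with the 95% flock-level detection target.
```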
The discovery of the first electromagnetic counterpart to a gravitational wave signal has generated follow-up observations by over 50 facilities world-wide, ushering in the new era of multi-messenger astronomy. In this paper, we present follow-up observations of the gravitational wave event GW170817 and its electromagnetic counterpart SSS17a/DLT17ck (IAU label AT2017gfo) by 14 Australian telescopes and partner observatories as part of Australian-based and Australian-led research programs. We report early- to late-time multi-wavelength observations, including optical imaging and spectroscopy, mid-infrared imaging, radio imaging, and searches for fast radio bursts. Our optical spectra reveal that the transient source emission cooled from approximately 6 400 K to 2 100 K over a 7-d period and produced no significant optical emission lines. The spectral profiles, cooling rate, and photometric light curves are consistent with the expected outburst and subsequent processes of a binary neutron star merger. Star formation in the host galaxy probably ceased at least a Gyr ago, although there is evidence for a galaxy merger. Binary pulsars with short (100 Myr) decay times are therefore unlikely progenitors, but pulsars like PSR B1534+12 with its 2.7 Gyr coalescence time could produce such a merger. The displacement (~2.2 kpc) of the binary star system from the centre of the main galaxy is not unusual for stars in the host galaxy or stars originating in the merging galaxy, and therefore any constraints on the kick velocity imparted to the progenitor are poor.
We describe the investigation of two temporally coincident illness clusters involving Salmonella and Staphylococcus aureus in two states. Cases were defined as gastrointestinal illness following two meal events. Investigators interviewed ill persons. Stool, food and environmental samples underwent pathogen testing. Alabama: Eighty cases were identified. Median time from meal to illness was 5·8 h. Salmonella Heidelberg was identified in 27 of 28 stool specimens tested, and coagulase-positive S. aureus was isolated from three of 16 ill persons. The environmental investigation indicated that food handling deficiencies had occurred. Colorado: Seven cases were identified. Median time from meal to illness was 4·5 h. Five persons were hospitalised, four of whom were admitted to the intensive care unit. Salmonella Heidelberg was identified in six of seven stool specimens and coagulase-positive S. aureus in three of six tested. No single food item was implicated in either outbreak. These two outbreaks were linked to infection with Salmonella Heidelberg, but additional factors, such as a dual aetiology that included S. aureus or the dose of Salmonella ingested, may have contributed to the short incubation periods and high illness severity. The outbreaks underscore the importance of measures to prevent foodborne illness through appropriate washing, handling, preparation and storage of food.
Taylor's law (TL) originated as an empirical pattern in ecology. In many sets of samples of population density, the variance of each sample was approximately proportional to a power of the mean of that sample. In a family of nonnegative random variables, TL asserts that the population variance is proportional to a power of the population mean. TL, sometimes called fluctuation scaling, holds widely in physics, ecology, finance, demography, epidemiology, and other sciences, and characterizes many classical probability distributions and stochastic processes such as branching processes and birth-and-death processes. We demonstrate analytically for the first time that a version of TL holds for a class of distributions with infinite mean. These distributions, a subset of stable laws, and the associated TL differ qualitatively from those of light-tailed distributions. Our results employ and contribute to the methodology of Albrecher and Teugels (2006) and Albrecher et al. (2010). This work opens a new domain of investigation for generalizations of TL.
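As a concrete illustration of the classical (finite-mean) form of TL, the sketch below simulates populations with different mean densities and recovers the exponent b in variance ≈ a·mean^b by log-log regression. The simulated negative-binomial-like data are purely illustrative and do not reflect the infinite-mean stable laws analysed in the paper.

```python
# Hedged sketch: estimating the Taylor's law exponent b in
# variance ≈ a * mean^b by regressing log(sample variance) on
# log(sample mean). Data are simulated, purely illustrative.
import numpy as np

rng = np.random.default_rng(4)
means, variances = [], []
for mu in np.geomspace(1, 100, 20):          # 20 populations with different means
    sample = rng.poisson(rng.gamma(shape=2.0, scale=mu / 2.0, size=500))
    means.append(sample.mean())
    variances.append(sample.var(ddof=1))

b, log_a = np.polyfit(np.log(means), np.log(variances), 1)
print(f"estimated exponent b = {b:.2f}, prefactor a = {np.exp(log_a):.2f}")
```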
In 2013, New York State mandated that, during influenza season, unvaccinated healthcare personnel (HCP) wear a surgical mask in areas where patients are typically present. We found that this mandate was associated with increased HCP vaccination and decreased HCP visits to the hospital Workforce Health and Safety Department with respiratory illnesses and laboratory-confirmed influenza.
Information on the factors that cause or amplify foodborne illness outbreaks (contributing factors), such as ill workers or cross-contamination of food by workers, is critical to outbreak prevention. However, only about half of foodborne illness outbreaks reported to the United States’ Centers for Disease Control and Prevention (CDC) have an identified contributing factor, and data on outbreak characteristics that promote contributing factor identification are limited. To address these gaps, we analyzed data from 297 single-setting outbreaks reported to CDC's new outbreak surveillance system, which collects data from the environmental health component of outbreak investigations (often called environmental assessments), to identify outbreak characteristics associated with contributing factor identification. These analyses showed that outbreak contributing factors were more often identified when an outbreak etiologic agent had been identified, when the outbreak establishment prepared all meals on location and served more than 150 meals a day, when investigators contacted the establishment to schedule the environmental assessment within a day of the establishment being linked with an outbreak, and when multiple establishment visits were made to complete the environmental assessment. These findings suggest that contributing factor identification is influenced by multiple outbreak characteristics, and that timely and comprehensive environmental assessments are important to contributing factor identification. They also highlight the need for strong environmental health and food safety programs that have the capacity to complete such environmental assessments during outbreak investigations.
To examine variation in antibiotic coverage and detection of resistant pathogens in community-onset pneumonia.
A total of 128 hospitals in the Veterans Affairs health system.
Hospitalizations with a principal diagnosis of pneumonia from 2009 through 2010.
We examined proportions of hospitalizations with empiric antibiotic coverage for methicillin-resistant Staphylococcus aureus (MRSA) and Pseudomonas aeruginosa (PAER) and with initial detection in blood or respiratory cultures. We compared lowest- versus highest-decile hospitals, and we estimated adjusted probabilities (AP) for patient- and hospital-level factors predicting coverage and detection using hierarchical regression modeling.
Among 38,473 hospitalizations, empiric coverage varied widely across hospitals (MRSA lowest vs highest, 8.2% vs 42.0%; PAER lowest vs highest, 13.9% vs 44.4%). Detection rates also varied (MRSA lowest vs highest, 0.5% vs 3.6%; PAER lowest vs highest, 0.6% vs 3.7%). Whereas coverage was greatest among patients with recent hospitalizations (AP for anti-MRSA, 54%; AP for anti-PAER, 59%) and long-term care residence (AP for anti-MRSA, 60%; AP for anti-PAER, 66%), detection was greatest in patients with a prior positive culture (AP for MRSA, 7.9%; AP for PAER, 11.9%) and in hospitals with a high prevalence of the organism in pneumonia (AP for MRSA, 3.9%; AP for PAER, 3.2%). Low hospital complexity and rural setting were strong negative predictors of coverage but not of detection.
Hospitals demonstrated widespread variation in both coverage and detection of MRSA and PAER, but probability of coverage correlated poorly with probability of detection. Factors associated with empiric coverage (eg, healthcare exposure) were different from those associated with detection (eg, microbiology history). Providing microbiology data during empiric antibiotic decision making could better align coverage to risk for resistant pathogens and could promote more judicious use of broad-spectrum antibiotics.
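As an illustration of how "adjusted probabilities" of coverage can be obtained by marginal standardisation, the sketch below fits a single-level logistic model to simulated data and averages predicted risks with the exposure set for all patients. Variable names, coefficients, and data are hypothetical, and the single-level model is a simplification of the study's hierarchical regression.

```python
# Hedged sketch: "adjusted probability" of anti-MRSA coverage via marginal
# standardisation. Simulated data and a single-level logistic model only;
# the study used hierarchical (multilevel) models, not reproduced here.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(5)
n = 2000
df = pd.DataFrame({
    "recent_hosp": rng.binomial(1, 0.3, n),        # recent hospitalisation
    "long_term_care": rng.binomial(1, 0.15, n),
    "rural": rng.binomial(1, 0.25, n),
})
logit = -1.0 + 1.2 * df.recent_hosp + 1.5 * df.long_term_care - 0.8 * df.rural
df["anti_mrsa"] = rng.binomial(1, 1 / (1 + np.exp(-logit)))

fit = smf.logit("anti_mrsa ~ recent_hosp + long_term_care + rural", df).fit(disp=0)

# Adjusted probability for "recent hospitalisation": set the exposure for
# everyone, keep the other covariates, and average the predicted risks.
counterfactual = df.assign(recent_hosp=1)
print(f"AP(anti-MRSA | recent hospitalisation) = {fit.predict(counterfactual).mean():.2f}")
```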
Introduction: The Institute of Medicine (IOM) has recommended that high-quality, evidence-based guidelines be developed for emergency medical services (EMS). The National Association of EMS Physicians (NAEMSP) has outlined a strategy that will see this task fulfilled, consisting of multiple working groups focused on all aspects of guideline development and implementation. A first step, and our objective, was a cataloguing and appraisal of the current guidelines targeting EMS providers. Methods: A systematic search of the literature was conducted in MEDLINE (1175), EMBASE (519), PubMed (14), Trip (416), and guidelines.gov (64) through May 1, 2016. Two independent reviewers screened titles for relevance to prehospital care, and then abstracts for essential guideline features, including a systematic review, a grading system, and an association between level of evidence and strength of recommendation. All disagreements were moderated by a third party. Citations meeting inclusion criteria were appraised with the AGREE II tool, which looks at six different domains of guideline quality, containing a total of 23 items rated from 1 to 7. Each guideline was appraised by three separate reviewers, and composite scores were calculated by averaging the scaled domain totals. Results: After primary (kappa 97%) and secondary (kappa 93%) screening, 49 guidelines were retained for full review. Only three guidelines obtained a score of >90%, the topics of which included aeromedical transport, analgesia in trauma, and resuscitation of avalanche victims. Only two guidelines scored between 80% and 90%, the topics of which included stroke and pediatric seizure management. One guideline, splinting in an austere environment, scored between 70% and 80%. Nine guidelines scored between 60% and 70%, the topics of which included ischemic stroke, cardiovascular life support, hemorrhage control, intubation, triage, hypothermia, and fibrinolytic use. Of the remaining guidelines, 14 scored between 50% and 60%, and 20 obtained a score of <50%. Conclusion: There are few high-quality, evidence-based guidelines in EMS. Of those that are published, the majority fail to meet established quality measures. Although a lack of randomized controlled trials (RCTs) conducted in the prehospital field continues to limit guideline development, suboptimal methodology is also commonplace within the existing literature.
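For reference, AGREE II scaled domain scores are computed as (obtained − minimum possible) / (maximum possible − minimum possible), with each item rated 1–7 by each appraiser. The sketch below applies this formula to made-up ratings from three appraisers on a hypothetical three-item domain; it does not reproduce data from the review.

```python
# Hedged sketch of the AGREE II scaled domain score. Ratings are invented;
# the review used three appraisers per guideline, as assumed here.
def scaled_domain_score(ratings_per_appraiser):
    """ratings_per_appraiser: one list of per-item 1-7 ratings per appraiser."""
    n_appraisers = len(ratings_per_appraiser)
    n_items = len(ratings_per_appraiser[0])
    obtained = sum(sum(r) for r in ratings_per_appraiser)
    minimum = 1 * n_items * n_appraisers
    maximum = 7 * n_items * n_appraisers
    return (obtained - minimum) / (maximum - minimum)

# e.g. a hypothetical 3-item domain rated by three appraisers -> ~81%
print(f"{scaled_domain_score([[6, 7, 5], [5, 6, 6], [7, 6, 5]]):.0%}")
```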
It has long been contentious as to whether the presence of bilateral infundibulums, or conuses, is a prerequisite for the diagnosis of double-outlet right ventricle. As the use of such a criterion would abrogate the so-called “morphological method”, which correctly states that one variable entity should not be defined on the basis of another entity that is itself variable, it is now accepted that double outlet can exist in the setting of fibrous continuity between the leaflets of the atrioventricular and arterial valves. Although this debate has now been resolved, there are other contentious areas still requiring clarification in the setting of hearts unified because of the presence of this particular ventriculo-arterial connection – for example, it is questionable whether the channel between the ventricles should be described as a “ventricular septal defect”, whereas it is equally arguable that the mere presence of fibrous continuity between the leaflets of the arterial valves does not necessarily place the channel in a doubly committed location. In this review, we describe a series of autopsied hearts in which the anatomical features serve to illuminate these various topics. We then discuss recent findings regarding cardiac development that point to the individuality of the building blocks of the ventricular outflow tracts, specifically the outlet septum, the inner heart curvature, or ventriculo-infundibular fold, and the septomarginal trabeculation, or septal band.
Although specific phobia is highly prevalent, associated with impairment, and an important risk factor for the development of other mental disorders, cross-national epidemiological data are scarce, especially from low- and middle-income countries. This paper presents epidemiological data from 22 low-, lower-middle-, upper-middle- and high-income countries.
Data came from 25 representative population-based surveys conducted in 22 countries (2001–2011) as part of the World Health Organization World Mental Health Surveys initiative (n = 124 902). The presence of specific phobia as defined by the Diagnostic and Statistical Manual of Mental Disorders, fourth edition was evaluated using the World Health Organization Composite International Diagnostic Interview.
The cross-national lifetime and 12-month prevalence rates of specific phobia were, respectively, 7.4% and 5.5%, being higher in females (9.8% and 7.7%) than in males (4.9% and 3.3%) and higher in high- and upper-middle-income countries than in low-/lower-middle-income countries. The median age of onset was young (8 years). Of the 12-month cases, 18.7% reported severe role impairment (13.3–21.9% across income groups) and 23.1% reported any treatment (9.6–30.1% across income groups). Lifetime co-morbidity was observed in 60.5% of those with lifetime specific phobia, with the onset of specific phobia preceding the other disorder in most cases (72.6%). Interestingly, rates of impairment, treatment use and co-morbidity increased with the number of fear subtypes.
Specific phobia is common and associated with impairment in a considerable percentage of cases. Importantly, specific phobia often precedes the onset of other mental disorders, making it a possible early-life indicator of psychopathology vulnerability.
Although most non-typhoidal Salmonella illnesses are self-limiting, antimicrobial treatment is critical for invasive infections. To describe resistance in Salmonella that caused foodborne outbreaks in the United States, we linked outbreaks submitted to the Foodborne Disease Outbreak Surveillance System to isolate susceptibility data in the National Antimicrobial Resistance Monitoring System. Resistant outbreaks were defined as those linked to one or more isolates with resistance to at least one antimicrobial drug. Multidrug resistant (MDR) outbreaks had at least one isolate resistant to three or more antimicrobial classes. Twenty-one per cent (37/176) of linked outbreaks were resistant. In outbreaks attributed to a single food group, 73% (16/22) of resistant outbreaks and 46% (31/68) of non-resistant outbreaks were attributed to foods from land animals (P < 0·05). MDR Salmonella with clinically important resistance caused 29% (14/48) of outbreaks from land animals and 8% (3/40) of outbreaks from plant products (P < 0·01). In our study, resistant Salmonella infections were more common in outbreaks attributed to foods from land animals than outbreaks from foods from plants or aquatic animals. Antimicrobial susceptibility data on isolates from foodborne Salmonella outbreaks can help determine which foods are associated with resistant infections.