Given the challenges in accurately identifying unexposed controls in case–control studies of diarrhoea, we examined diarrhoea incidence, subclinical enteric infections and growth stunting within a reference population in the Global Enteric Multicenter Study, Kenya site. Within ‘control’ children (0–59 months old without diarrhoea in the 7 days before enrolment, n = 2384), we examined surveys at enrolment and 60-day follow-up, stool at enrolment and a 14-day post-enrolment memory aid for diarrhoea incidence. At enrolment, 19% of controls had ⩾1 enteric pathogen associated with moderate-to-severe diarrhoea (‘MSD pathogens’) in stool; following enrolment, many reported diarrhoea (27% in 7 days, 39% in 14 days). Controls with and without reported diarrhoea had similar carriage of MSD pathogens at enrolment; however, controls reporting diarrhoea were more likely to report visiting a health facility for diarrhoea (27% vs. 7%) or fever (23% vs. 16%) at follow-up than controls without diarrhoea. Odds of stunting differed by both MSD and ‘any’ (including non-MSD pathogens) enteric pathogen carriage, but not diarrhoea, suggesting control classification may warrant modification when assessing long-term outcomes. High diarrhoea incidence following enrolment and prevalent carriage of enteric pathogens have implications for sequelae associated with subclinical enteric infections and for design and interpretation of case–control studies examining diarrhoea.
Toca 511 (vocimagene amiretrorepvec) is an investigational, conditionally lytic, retroviral replicating vector (RRV). RRVs selectively infect cancer cells, owing to innate and adaptive immune response defects in cancers that permit virus replication and to the requirement for cell division for virus integration into the genome. Toca 511 spreads through tumors, stably delivering an optimized yeast cytosine deaminase gene that converts the prodrug Toca FC (an investigational, extended-release formulation of 5-FC) into 5-FU within the tumor microenvironment. 5-FU kills infected dividing cancer cells as well as surrounding tumor cells, myeloid-derived suppressor cells, and tumor-associated macrophages, resulting in long-term tumor immunity in preclinical models. Data from a Phase 1 resection trial showed six durable CRs and extended mOS compared with historical controls. The FDA granted Breakthrough Therapy Designation to Toca 511 & Toca FC for the treatment of patients with rHGG. Toca 5 is an international, randomized, open-label Phase 3 trial (NCT02414165) of Toca 511 & Toca FC versus SOC in patients undergoing resection for a first or second recurrence of rHGG. Patients will be stratified by IDH1 status, KPS, and geographic region. The primary endpoint is OS; secondary endpoints are durable response rate, durable clinical benefit rate, duration of durable response, and 12-month survival rate. Key inclusion criteria are histologically proven GBM or AA, tumor size ≥1 cm and ≤5 cm, and KPS ≥70. Immune monitoring and molecular profiling will be performed. Approximately 380 patients will be randomized. An IDMC has been commissioned to review the safety and efficacy data, including 2 interim analyses. Enrollment is ongoing.
Coinfection with human immunodeficiency virus (HIV) and viral hepatitis is associated with high morbidity and mortality in the absence of clinical management, making identification of these cases crucial. We examined characteristics of HIV and viral hepatitis coinfections by using surveillance data from 15 US states and two cities. Each jurisdiction used an automated deterministic matching method to link surveillance data for persons with reported acute and chronic hepatitis B virus (HBV) or hepatitis C virus (HCV) infections, to persons reported with HIV infection. Of the 504 398 persons living with diagnosed HIV infection at the end of 2014, 2.0% were coinfected with HBV and 6.7% were coinfected with HCV. Of the 269 884 persons ever reported with HBV, 5.2% were reported with HIV. Of the 1 093 050 persons ever reported with HCV, 4.3% were reported with HIV. A greater proportion of persons coinfected with HIV and HBV were males and blacks/African Americans, compared with those with HIV monoinfection. Persons who inject drugs represented a greater proportion of those coinfected with HIV and HCV, compared with those with HIV monoinfection. Matching HIV and viral hepatitis surveillance data highlights epidemiological characteristics of persons coinfected and can be used to routinely monitor health status and guide state and national public health interventions.
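The automated deterministic matching described above can be illustrated with a minimal sketch. The field names and records below are hypothetical; production record linkage typically standardizes identifiers first and matches on several key combinations, but the core idea is an exact-key join between two surveillance registries:

```python
def link_registries(hiv_records, hep_records, keys=("last_name", "dob")):
    """Return hepatitis records that exactly match an HIV record on `keys`.

    A deterministic (exact-match) linkage: build an index of key tuples
    from one registry, then keep records from the other whose keys appear.
    """
    index = {tuple(r[k] for k in keys) for r in hiv_records}
    return [r for r in hep_records if tuple(r[k] for k in keys) in index]


# Hypothetical toy registries (identifiers invented for illustration)
hiv = [{"last_name": "DOE", "dob": "1980-01-01"},
       {"last_name": "ROE", "dob": "1975-06-15"}]
hcv = [{"last_name": "DOE", "dob": "1980-01-01"},
       {"last_name": "SMITH", "dob": "1990-03-02"}]

coinfected = link_registries(hiv, hcv)
print(len(coinfected))  # 1
```

Real surveillance matching algorithms layer several such passes with progressively looser key sets; this sketch shows only the strictest, fully deterministic pass.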
In this brief report, computed tomography perfusion (CTP) thresholds predicting follow-up infarction in patients presenting <3 hours from stroke onset and achieving ultra-early reperfusion (<45 minutes from CTP) are reported. CTP thresholds that predict follow-up infarction vary with time to reperfusion: Tmax >20 to 23 seconds and cerebral blood flow <5 to 7 ml min−1 (100 g)−1 or relative cerebral blood flow <0.14 to 0.20 optimally predicted the final infarct. These thresholds are stricter than previously published thresholds.
It has not been well established whether dietary folate intake reduces the risk of developing diabetes. We aimed to clarify the prospective association between dietary folate intake and type 2 diabetes (T2D) risk among 7333 Korean adults aged 40 years or older who were included in the Multi-Rural Communities Cohort. Dietary folate intake was estimated from all 106 food items listed on a FFQ, not including folate intake from supplements. Two different measurements of dietary folate intake were used: the baseline consumption and the average consumption from baseline until just before the end of follow-up. The association between folate intake and T2D risk was determined through a modified Poisson regression model with a robust error estimator, controlling for potential confounders. Over 29 745 person-years, 319 cases of diabetes were ascertained. In multivariable analyses, dietary folate intake was inversely associated with risk of T2D for women but not for men. For women, the incidence rate ratio of diabetes in the third tertile compared with the first tertile was 0·57 (95 % CI 0·38–0·87, P for trend = 0·0085) in the baseline consumption model and 0·64 (95 % CI 0·43–0·95, P for trend = 0·0244) in the average consumption model. These inverse associations were found in both the normal fasting blood glucose and impaired fasting glucose groups among women. Among non-users of multinutrient and vitamin supplements, the significant inverse association remained. Thus, higher dietary intake of folate is prospectively associated with lower risk of diabetes for women.
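As a simple illustration of the rate-ratio concept underlying the analysis above: the study fit a modified Poisson model adjusting for confounders, but the crude (unadjusted) incidence rate ratio between tertiles reduces to arithmetic on case counts and person-years. The counts below are hypothetical, not the study's data:

```python
def incidence_rate_ratio(cases_exposed, py_exposed, cases_ref, py_ref):
    """Crude incidence rate ratio: rate in the exposed group divided by
    the rate in the reference group, each rate being cases / person-years."""
    return (cases_exposed / py_exposed) / (cases_ref / py_ref)


# Hypothetical counts for the third vs. first folate tertile
irr = incidence_rate_ratio(60, 10_000, 120, 10_000)
print(round(irr, 2))  # 0.5
```

The adjusted incidence rate ratios reported in the abstract come from a regression model, so they will generally differ from this crude ratio whenever the tertiles differ on confounders.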
Whether monozygotic (MZ) and dizygotic (DZ) twins differ from each other in a variety of phenotypes is important for genetic twin modeling and for inferences made from twin studies in general. We analyzed whether there were differences in individual, maternal and paternal education between MZ and DZ twins in a large pooled dataset. Information was gathered on individual education for 218,362 adult twins from 27 twin cohorts (53% females; 39% MZ twins), and on maternal and paternal education for 147,315 and 143,056 twins respectively, from 28 twin cohorts (52% females; 38% MZ twins). Together, we had information on individual or parental education from 42 twin cohorts representing 19 countries. The original education classifications were transformed to education years and analyzed using linear regression models. Overall, MZ males had 0.26 (95% CI [0.21, 0.31]) years and MZ females 0.17 (95% CI [0.12, 0.21]) years longer education than DZ twins. The zygosity difference became smaller in more recent birth cohorts for both males and females. Parental education was somewhat longer for fathers of DZ twins in cohorts born in 1990–1999 (0.16 years, 95% CI [0.08, 0.25]) and 2000 or later (0.11 years, 95% CI [0.00, 0.22]), compared with fathers of MZ twins. The results show that the years of both individual and parental education are largely similar in MZ and DZ twins. We suggest that the socio-economic differences between MZ and DZ twins are so small that inferences based upon genetic modeling of twin data are not affected.
We compared the impact of a commercial chlorination product (brand name Air RahMat) in stored drinking water to traditional boiling practices in Indonesia. We conducted a baseline survey of all households with children <5 years in four communities, made 11 subsequent weekly home visits to assess acceptability and use of water treatment methods, measured Escherichia coli concentration in stored water, and determined diarrhoea prevalence among children <5 years. Of 281 households surveyed, boiling (83%) and Air RahMat (7%) were the principal water treatment methods. Multivariable log-binomial regression analyses showed lower risk of E. coli in stored water treated with Air RahMat than boiling (risk ratio (RR) 0·75, 95% confidence interval (CI) 0·56–1·00). The risk of diarrhoea in children <5 years was lower among households using Air RahMat (RR 0·43, 95% CI 0·19–0·97) than boiling, and higher in households with E. coli concentrations of 1–1000 MPN/100 ml (RR 1·54, 95% CI 1·04–2·28) or >1000 MPN/100 ml (RR 1·86, 95% CI 1·09–3·19) in stored water than in households without detectable E. coli. Although results suggested that Air RahMat water treatment was associated with lower E. coli contamination and diarrhoeal rates among children <5 years than water treatment by boiling, Air RahMat use remained low.
Maltreated children in foster care are at high risk for dysregulated hypothalamus–pituitary–adrenal (HPA) axis functioning and educational difficulties. The present study examined the effects of a short-term school readiness intervention on HPA axis functioning in response to the start of kindergarten, a critical transition marking entry to formal schooling, and whether altered HPA axis functioning influenced children's school adjustment. Compared to a foster care comparison group, children in the intervention group showed a steeper diurnal cortisol slope on the first day of school, a pattern previously observed among nonmaltreated children. A steeper first day of school diurnal cortisol slope predicted teacher ratings of better school adjustment (i.e., academic performance, appropriate classroom behaviors, and engagement in learning) in the fall of kindergarten. Furthermore, the children's HPA axis response to the start of school mediated the effect of the intervention on school adjustment. These findings support the potential for ameliorative effects of interventions targeting critical transitional periods, such as the transition of formal schooling. This school readiness intervention appears to influence stress neurobiology, which in turn facilitates positive engagement with the school environment and better school adjustment in children who have experienced significant early adversity.
During 2000–07, five giant icebergs (B15A, B15J, B15K, C16 and C25) adrift in the southwestern Ross Sea, Antarctica, were instrumented with global positioning system (GPS) receivers and other instruments to monitor their behavior in the near-coastal environment. The measurements show that collision processes can strongly influence iceberg behavior and delay their progress in drifting to the open ocean. Collisions appear to have been a dominant control on the movement of B15A, the largest of the icebergs, during the 4-year period it gyrated within the limited confines of Ross Island, the fixed Ross Ice Shelf and grounded C16. Iceberg interactions in the near-coastal regime are largely driven by ocean tidal effects, which determine the magnitude of forces generated during collision and break-up events. Estimates of forces derived from the observed drift trajectories during the iceberg-collision-induced calving of iceberg C19 from the Ross Ice Shelf, during the iceberg-induced break-off of the tip of the Drygalski Ice Tongue and during the break-up of B15A provide a crude estimate of the stress scale involved in iceberg calving. Considering the total area of the vertical face of new rifts created in the calving or break-up process, and not accounting for local stress amplification near rift tips, this estimated stress scale is 10⁴ Pa.
We report on the astrometric registration of VLBI images of the SiO and H2O masers in OH 231.8+4.2, the iconic proto-planetary nebula also known as the Calabash Nebula, using the KVN and Source/Frequency Phase Referencing. This robustly confirms, for the first time, the alignment of the SiO masers close to the AGB star that drives the bi-lobed structure with the water masers in the outflow.
We sought to comprehensively assess the prevalence and outcomes of complications associated with Staphylococcus aureus bacteremia (SAB) in children. Secondarily, the prevalence of methicillin resistance and outcomes of complications from methicillin-resistant S. aureus (MRSA) vs. methicillin-susceptible S. aureus SAB were assessed. This is a single-center cross-sectional study of 376 patients ⩽18 years old with SAB in 1990–2014. Overall, 197 (52%) patients experienced complications, the most common being osteomyelitis (33%), skin and soft tissue infection (31%), and pneumonia (25%). Patients with complications were older (median 3 vs. 0·7 years, P = 0·05) and more had community-associated SAB (66% vs. 34%, P = 0·001). Fewer patients with complications had a SAB-related emergency department or hospital readmission (10% vs. 19%, P = 0·014). The prevalence of methicillin resistance increased from 1990–1999 to 2000–2009, but decreased in 2010–2014. Complicated MRSA bacteremia resulted in more intensive care unit admissions (66% vs. 47%, P = 0·03) and a greater likelihood of having ⩾2 foci (58% vs. 26%, P < 0·001). In multivariate analysis, community-associated SAB increased the risk of complications (odds ratio (OR) 1·82 (1·1–3·02), P = 0·021), while concurrent infections decreased it (OR 0·58 (0·34–0·97), P = 0·038). In conclusion, children with SAB should be carefully evaluated for complications. Methicillin resistance remains associated with poor outcomes but has decreased in overall prevalence.
Introduction: Syncope is a common emergency department (ED) presentation and constitutes 1% of all ED visits, approximately 160,000 visits annually across Canada. Lack of standardized syncope care has economic and cost implications. Currently, emergency medical services (EMS) are overutilized, variations in ED management exist and a substantial proportion of patients (46.5%) are hospitalized for cardiac monitoring. Our previous studies have proposed ways to reduce health care utilization through development of an EMS clinical decision tool, ED risk scores and remote cardiac monitoring. We sought to: 1) Estimate costs associated with syncope care in the pre-hospital, ED and inpatient settings; and 2) Determine potential cost savings if the proposed alternate strategies were adopted. Methods: A prospective cohort study was conducted in five Canadian EDs from 2010-2014. We enrolled adult (≥16 years) syncope patients and excluded those with prolonged loss of consciousness, mental status changes, seizure, significant trauma, or alcohol/illicit drug abuse. Demographics, medical history, mode of arrival, EMS time points, reasons for hospitalization, ED and inpatient length of stay, final ED diagnosis and any serious adverse event within 30 days of the index visit were collected. Descriptive and inferential statistics were used. Results: Of 4,064 patients enrolled, 67.3% were transported to the ED by EMS and the average cost per event was $262.78 (range at study sites: $156.43-$553.03). The average cost per ED visit was $267.98 (range: $174.66-$374.95). 12.9% of the patients were admitted and the average cost per admission was $9,886.15 (range: $9,715.23-$10,277.98). Syncope is associated with an estimated total annual cost of $257 million. In Canada, we estimate that diverting low-risk patients will save $5 million in the pre-hospital setting and $15 million in the ED annually, and that implementing a remote cardiac monitoring strategy will save $50 million annually.
Conclusion: It is estimated that the proposed strategies will save $70 million annually. This is likely an underestimate, as cost savings from reduced investigations due to diversion of ED patients, reduced ED length of stay and reduced hospitalization are not accounted for. Adoption of similar strategies will likely lead to significantly higher cost savings in countries with higher resource utilization for syncope management.
Introduction: The majority of syncope patients transported to the emergency department (ED) by emergency medical services (EMS) are low-risk with very few suffering serious adverse events (SAE) within 30-days and over 50% are diagnosed with vasovagal syncope. These patients can potentially be diverted by EMS to alternate pathways of care (primary care or syncope clinic) if appropriately identified. We sought to identify high-risk factors associated with SAE within 30-days of ED disposition as a step towards developing an EMS clinical decision tool. Methods: We prospectively enrolled adult syncope patients who were transported to 5 academic EDs by EMS. We collected standardized variables at EMS presentation from history, clinical examination and investigations including ECG and ED disposition. We also collected concerning symptoms identified and EMS interventions. Adjudicated SAE included death, myocardial infarction, arrhythmia, structural heart disease, pulmonary embolism, hemorrhage and procedural interventions. Multivariable logistic regression was used for analysis. Results: 990 adult syncope patients (mean age 58.9 years, 54.9% females and 16.8% hospitalized) were enrolled with 137 (14.6%) patients suffering SAE within 30-days of ED disposition. 
Of 42 candidate predictors, we identified 5 predictors that were significantly associated with SAE on multivariable analysis: ECG abnormalities [OR=1.77; 95%CI 1.36-2.48] (non-sinus rhythm, high-degree atrioventricular block, left bundle branch block, ST-T wave changes or Q waves), cardiac history [OR=2.87; 95%CI 1.86-4.41] (valvular or coronary heart disease, cardiomyopathy, congestive heart failure, arrhythmias or device insertions), EMS interventions or concerning symptoms [OR=4.88; 95%CI 3.13-7.62], age >50 years [OR=3.18; 95%CI 1.68-6.02], any abnormal vital signs [OR=1.58; 95%CI 1.03-2.42] (any EMS systolic blood pressure >180 or <100 mmHg, heart rate <50 or >100/minute, respiratory rate >25/minute, oxygen saturation <91%). [C-statistic: 0.81; Hosmer-Lemeshow p=0.30]. Conclusion: We identified high-risk factors that are associated with 30-day SAE among syncope patients transported to the ED by EMS. This will aid in the development of a clinical decision tool to identify low-risk patients for diversion to alternate pathways of care.
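The discrimination reported for the model above (C-statistic 0.81) is the probability that a randomly chosen patient who suffered an SAE receives a higher model risk score than a randomly chosen patient who did not. A minimal sketch of that computation, on hypothetical scores and outcomes:

```python
def c_statistic(scores, outcomes):
    """Concordance probability over all event/non-event pairs:
    a pair counts 1 if the event's score is higher, 0.5 on a tie."""
    events = [s for s, y in zip(scores, outcomes) if y == 1]
    nonevents = [s for s, y in zip(scores, outcomes) if y == 0]
    pairs = len(events) * len(nonevents)
    concordant = 0.0
    for e in events:
        for n in nonevents:
            if e > n:
                concordant += 1
            elif e == n:
                concordant += 0.5
    return concordant / pairs


# Hypothetical predicted risks and 30-day SAE outcomes (1 = SAE)
scores = [0.9, 0.8, 0.7, 0.4, 0.3, 0.2]
outcomes = [1, 1, 0, 1, 0, 0]
print(round(c_statistic(scores, outcomes), 2))  # 0.89
```

This O(n²) pairwise form is the definition; for large cohorts the same quantity is usually computed via a rank-based (Mann-Whitney) formula.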
Introduction: Relatively little is known about outcomes after disposition among syncope patients assigned various diagnostic categories during emergency department (ED) evaluation. We sought to measure serious outcomes among 4 diagnostic groups (vasovagal, orthostatic hypotension, cardiac, other/unknown) within 30 days of the index ED visit. Methods: We prospectively enrolled adult syncope patients at six EDs and excluded patients with pre-syncope, persistent mental status changes, intoxication, seizure, and major trauma. Patient characteristics, ED management, diagnostic impression (vasovagal, orthostatic, cardiac, or other/unknown) at the end of the ED visit and physicians’ confidence in assigning the etiology were collected. Serious outcomes at 30 days included: death, arrhythmia, myocardial infarction, structural heart disease, pulmonary embolism, and hemorrhage. Results: 5,010 patients (mean age 53.4 years; 54.8% females) were enrolled; 3.5% suffered serious outcomes: deaths (0.3%), arrhythmias (1.8%), non-arrhythmic cardiac (0.5%) and non-cardiac (0.9%). The cause of syncope was determined to be vasovagal in 53.3% and cardiac in 5.4% of patients. The proportions of patients with ED investigations (p<0.001) and short-term serious outcomes (p<0.01) increased across the diagnostic categories in the following order: vasovagal, orthostatic hypotension, other/unknown cause and cardiac. No deaths occurred in patients with vasovagal syncope. A higher proportion of all serious outcomes occurred among patients suspected of cardiac syncope in the ED (p<0.01). Physicians’ confidence was highest for a vasovagal syncope diagnosis and lowest when the cause was other/unknown. Conclusion: Short-term serious outcomes strongly correlated with the etiology assigned at the ED visit. The physician’s clinical judgment should be incorporated into risk stratification for prognostication and safe management of ED syncope patients.
Introduction: It is well established that a negative D-dimer will reliably rule out thromboembolism in selected low-risk patients. Multiple modified D-dimer cutoffs have been suggested for older patients to improve diagnostic specificity. However, these approaches are better established for pulmonary embolism than for deep venous thrombosis (DVT). This study evaluated the diagnostic performance of previously suggested D-dimer cutoffs for low-risk DVT patients in the ED, and assessed a novel cutoff with improved performance. Methods: This health records review included patients >50 years with suspected DVT who were low risk and had a D-dimer performed. Our analysis evaluated the diagnostic accuracy of the conventional 500 µg/L cutoff and the age-adjusted (age x 10) rule for patients >50 years, and the 750 and 1,000 µg/L cutoffs for patients >60 years. The 30-day outcome was a diagnosis of DVT. We also assessed the diagnostic accuracy of a novel cutoff (age x 12.5). Results: 1,000 patients (mean age 68 years; 59% female) were included. Of these, 110 patients (11%) were diagnosed with DVT. The conventional cutoff of <500 µg/L demonstrated a sensitivity of 99.1% (95% CI 95.0-99.9) and a specificity of 36.4% (95% CI 33.2-39.7). For patients >60 years, the absolute cutoffs of 750 and 1,000 µg/L showed a sensitivity of 98.7% (95% CI 92.9-99.9), and the specificity increased to 48.6% (95% CI 44.5-52.8%) and 62.1% (95% CI 58.1-66.1%), respectively. For all study patients, the age-adjusted D-dimer demonstrated a sensitivity of 99.1% (95% CI 95.0-99.9) and a specificity of 51.2% (95% CI 47.9-54.6). A novel age-adjusted cutoff (age x 12.5) for patients >50 years demonstrated a sensitivity of 97.3% (95% CI 92.2-99.4) and a specificity of 61.2% (95% CI 58.0-64.5). When compared with the conventional cutoff, the age-adjusted cutoffs (age x 10 and age x 12.5) would have resulted in an absolute decrease in further investigations of 13.1% and 22.2%, respectively, with false-negative rates of 0.1% and 0.3%.
Conclusion: Among older patients with suspected DVT and low clinical probability, the age adjusted D-dimer increases the proportion of patients among whom DVT can be ruled out. A novel cutoff (age x 12.5) demonstrated improved specificity. Future large scale prospective studies are needed to confirm this finding and to explore the cost savings of these approaches.
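A small sketch of the cutoff rules compared above, assuming the usual convention that the age-adjusted value only raises the threshold above the conventional 500 µg/L floor (it never lowers it):

```python
def ddimer_cutoff(age, factor=10, floor=500):
    """Age-adjusted D-dimer cutoff in µg/L.

    factor=10 is the published age-adjusted rule; factor=12.5 is the novel
    cutoff explored in this study. The `floor` keeps the threshold at the
    conventional 500 µg/L for younger patients (an assumed convention here).
    """
    return max(age * factor, floor)


print(ddimer_cutoff(70))        # 700
print(ddimer_cutoff(70, 12.5))  # 875.0
print(ddimer_cutoff(40))        # 500
```

A D-dimer result below the returned threshold would, under these rules, rule out DVT in a low-risk patient; the trade-off reported above is a small loss of sensitivity for a sizable gain in specificity as `factor` grows.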
Introduction: Emergency department (ED) access block is the #1 safety concern in Canadian EDs. Its main cause is hospital access block, manifested by prolonged boarding of inpatients in EDs. Hospital administrators often believe this problem is too big to be solved and would require large increases in hospital capacity. Our objective was to quantify ED access gap by estimating the cumulative hours that CTAS 1-3 patients are blocked in waiting areas. This value, expressed as a proportion of inpatient care capacity, is an estimate of the bed hours a hospital would have to find in order to resolve ED access. Methods: A convenience sample of urban Canadian ED directors were asked to provide data summarizing their CTAS 1-3 inflow, the proportion triaged to nursed stretchers vs. RAZ or Intake areas, and time to care space. Total ED access gap was calculated by multiplying the number of CTAS 1-3 patients by their average delay to care space. Time to stretcher was captured electronically at participating sites, but time to RAZ or intake spaces was often not. In such cases, respondents provided time from triage to first RN or MD assessment in these areas. The primary outcome was total annual ED access block hours for emergent-urgent patients, expressed as a proportion of funded inpatient bed hours. Results: Directors of 40 EDs were queried. Six sites did not gather the data elements required. Of 34 remaining, 29 (85.3%) provided data, including 15 tertiary (T), 10 community (C) and 2 pediatric (P) sites in 12 cities. Mean census for the 3 ED types was 72,308 (T), 58,849 (C) and 61,050 (P) visits per year. CTAS 1-3 patients accounted for 73.4% (T), 67.7% (C) and 66.2% (P) of visits in the 3 groups, and 34% (T), 46% (C) and 44% (P) of these patients were treated in RAZ or intake areas rather than staffed ED stretchers. Mean time to stretcher/RAZ care was 50/71 min (T), 46/62 min (C), and 37/59 min (P).
Average ED access gap was 47,564 hrs (T), 37,222 hrs (C) and 35,407 hrs (P), while average inpatient bed capacity was 599 beds (5,243,486 hrs), 291 beds (2,545,875 hrs) and 150 beds (1,314,000 hrs) respectively. ED access gap as a proportion of inpatient care capacity was 0.93% for tertiary, 1.46% for community and 2.69% for pediatric centres. Conclusion: ED access gap is very large in Canadian EDs, but small compared to hospital operating capacity. Hospital capacity or efficiency improvements in the range of 1-3% could profoundly mitigate ED access block.
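The access-gap calculation described above reduces to straightforward arithmetic: annual CTAS 1-3 volume times mean delay to a care space, expressed against funded inpatient bed-hours. The inputs below are hypothetical, chosen only to be of the same order as the reported tertiary-site figures:

```python
def access_gap_fraction(annual_visits, ctas13_share, mean_delay_hr,
                        inpatient_beds):
    """ED access-gap hours as a fraction of annual funded inpatient bed-hours.

    gap_hours  = CTAS 1-3 visits x mean delay to care space (hours)
    bed_hours  = funded beds x 8,760 hours per year
    """
    gap_hours = annual_visits * ctas13_share * mean_delay_hr
    bed_hours = inpatient_beds * 365 * 24
    return gap_hours / bed_hours


# Hypothetical tertiary-like site: 72,000 visits/year, 73% CTAS 1-3,
# 0.9 h mean delay to care space, 600 funded inpatient beds.
frac = access_gap_fraction(72_000, 0.73, 0.9, 600)
print(f"{frac:.2%}")
```

With inputs in this range the fraction lands around 1%, which is the abstract's central point: the gap is enormous in ED terms but small relative to hospital operating capacity.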
Introduction: 2.6% of emergency department (ED) syncope patients will have underlying serious cardiac conditions (e.g. arrhythmia, serious structural heart disease) identified within 30 days of disposition. If those at risk are discharged home, outpatient cardiac testing can detect underlying arrhythmias and structural heart disease, and thereby improve patient safety. We describe the frequency of outpatient referrals for cardiac testing and the proportion of cardiac serious adverse events (SAE) among high-risk and non-high-risk (low and medium risk) ED syncope patients, as defined by the Canadian Syncope Risk Score (CSRS). Methods: We conducted a multicenter prospective cohort study enrolling adult syncope patients across five large tertiary care EDs. We collected demographics, medical history, disposition, CSRS value, outpatient referrals and testing results (Holter, echocardiography), and cardiac SAE. Adjudicated 30-day SAE included death due to unknown cause, myocardial infarction, arrhythmia, and structural heart disease. We used descriptive analysis. Results: Of 4,064 enrolled patients, a total of 955 patients (23%) received an outpatient referral (mean age 57.7 years, 52.1% female). Of the 299 patients (7%) hospitalized, 154 received outpatient cardiac testing after discharge. Among the 3,765 patients discharged home from the ED, 40% of the non-high-risk patients (305/756) and 56% of the high-risk patients (25/45) received outpatient cardiac testing. Of all patients who received outpatient cardiac testing, 4 patients (0.8%) had serious cardiac conditions identified, all of which were arrhythmias. Among those with no cardiac testing, 5 patients (0.9%) suffered cardiac SAE (80% arrhythmias) outside the hospital. Of the 20 (44%) high-risk patients who did not receive outpatient cardiac testing, 2 (10%) suffered arrhythmias outside the hospital. Among the 451 non-high-risk patients, only 0.8% suffered arrhythmias outside the hospital.
Conclusion: Outpatient cardiac testing among ED syncope patients is largely underutilized, especially among high risk patients. Better guidelines for outpatient cardiac testing are needed, as current practice is highly variable and mismatched with patient risk.
Extended-spectrum β-lactamase (ESBL) production has been very rare in serotype K1 Klebsiella pneumoniae ST23 strains, which are well-known invasive community strains. Among 92 ESBL-producing strains identified in 218 isolates from nine Asian countries, serotype K1 K. pneumoniae strains were screened. Two ESBL-producing K. pneumoniae isolates from Singapore and Indonesia were determined to be serotype K1 and ST23. Their plasmids, which carry CTX-M-15 genes, are transferable, enabling efficient transfer of ESBL resistance plasmids to other organisms.
Universal screening for postpartum depression is recommended in many countries. Knowledge of whether the disclosure of depressive symptoms in the postpartum period differs across cultures could improve detection and provide new insights into the pathogenesis. Moreover, it is a necessary step to evaluate the universal use of screening instruments in research and clinical practice. In the current study we sought to assess whether the Edinburgh Postnatal Depression Scale (EPDS), the most widely used screening tool for postpartum depression, measures the same underlying construct across cultural groups in a large international dataset.
Ordinal regression and measurement invariance were used to explore the association between culture, operationalized as education, ethnicity/race and continent, and endorsement of depressive symptoms using the EPDS on 8209 new mothers from Europe and the USA.
Education, but not ethnicity/race, influenced the reporting of postpartum depression [difference between robust comparative fit indexes (∆*CFI) < 0.01]. The structure of EPDS responses significantly differed between Europe and the USA (∆*CFI > 0.01), but not between European countries (∆*CFI < 0.01).
Investigators and clinicians should be aware of the potential differences in the expression of the postpartum depression phenotype that women of different educational backgrounds may manifest. The increasing cultural heterogeneity of societies, together with the tendency towards globalization, requires a culturally sensitive approach to patients, research and policies that takes into account, beyond rhetoric, the context of a person's experiences and the context in which the research is conducted.