The oceans have an enormous capacity to store, release, and transport heat, water, and various chemical species on timescales from seasons to centuries. Their transports affect global energy, water, and biogeochemical cycles and are crucial elements of Earth’s climate system. Ocean variability, as represented, for example, by sea surface temperature (SST) variations, can result in anomalous diabatic heating or cooling of the overlying atmosphere, which can in turn alter atmospheric circulation in such a way as to feed back on ocean thermal and current structures and modify the original SST variations. Ocean–atmosphere interactions in one ocean basin can also influence remote regions via interbasin teleconnections, triggering responses with both local and far-field impacts. This chapter highlights the defining aspects of the climate in individual ocean basins, including mean states, seasonal cycles, interannual-to-interdecadal variability, and interactions with other basins. Key components of the global and tropical ocean observing system are also described.
In this paper, the generation of relativistic electron mirrors (REMs) and the reflection of an ultra-short laser off these mirrors are discussed using two-dimensional particle-in-cell simulations. REMs with ultra-high acceleration and expansion velocity can be produced from a solid nanofoil illuminated normally by an ultra-intense femtosecond laser pulse with a sharp rising edge. A chirped attosecond pulse can be produced through the reflection of a counter-propagating probe laser off the accelerating REM. In the electron moving frame, the plasma frequency of the REM keeps decreasing owing to its rapid expansion. The laser frequency, on the contrary, keeps increasing owing to the acceleration of the REM and the relativistic Doppler shift from the laboratory frame to the electron moving frame. Within an ultra-short time interval, the two frequencies become equal in the electron moving frame, which leads to resonance between the laser and the REM. The reflected radiation near this interval, and the corresponding spectra, are amplified by the resonance. By adjusting the arrival time of the probe laser, a chosen part of the reflected field can be selectively amplified or suppressed, allowing selective adjustment of the corresponding spectra.
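The resonance condition described above can be sketched schematically; the symbols below are a generic reading of the mechanism, not the paper's exact notation. In the mirror's instantaneous rest frame the counter-propagating probe is Doppler upshifted while the mirror's plasma frequency falls as its electron density drops through expansion:

\[
\omega_L' = \gamma\,(1+\beta)\,\omega_L,
\qquad
\omega_p' = \sqrt{\frac{n_e'\,e^2}{\varepsilon_0\,m_e}},
\]

where \(\omega_L\) is the laboratory-frame probe frequency, \(\beta = v/c\) and \(\gamma\) describe the mirror's motion, and \(n_e'\) is the (decreasing) electron density in the moving frame. Since \(\omega_L'\) grows with the mirror's acceleration while \(\omega_p'\) shrinks, there is a single instant at which \(\omega_L'(t) = \omega_p'(t)\); the portion of the reflected field emitted near that instant is the part amplified by the resonance.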
Introduction: Patients with poorly controlled diabetes often visit the emergency department (ED) for treatment of hyperglycemia. While previous qualitative studies have examined the patient experience of diabetes as a chronic illness, there are no studies describing patients’ perceptions of ED care for hyperglycemia. The objective of this study was to explore the patient experience of ED hyperglycemia visits and to characterize perceived barriers to adequate glycemic control post-discharge. Methods: This study was conducted at a tertiary care academic centre in London, Ontario. A qualitative constructivist grounded theory methodology was used to understand the experience of adult patient partners who had had an ED hyperglycemia visit. Patient partners, purposively sampled to capture a breadth of age, sex, disease and presentation frequency, were invited to participate in semi-structured individual interviews probing their experiences. Sampling continued until a theoretical framework representing key experiences and expectations reached sufficiency. Data were collected and analyzed iteratively using a constant comparative approach. Results: 22 patients with type 1 or 2 diabetes were interviewed. Participants sought care in the ED over other options because of concern about a potentially life-threatening condition, advice from a healthcare provider or family member, or a perceived lack of convenient alternatives to the ED in terms of time and location. Participants’ care expectations centred on symptom relief, glycemic control, reassurance and education, and referral to specialist diabetes care post-discharge. Finally, perceived system barriers to participants’ glycemic control included the affordability of medical supplies and medications, access to follow-up and, in some cases, the transition from pediatric to adult diabetes care.
Conclusion: Patients with diabetes utilize the ED for a variety of urgent and emergent hyperglycemic concerns. In addition to providing excellent medical treatment, ED healthcare providers should consider patients’ expectations when caring for those presenting with hyperglycemia. Future studies will focus on developing strategies to help patients navigate the barriers that exist within our current, limited healthcare system, enhance follow-up care, and improve short- and long-term health outcomes.
Introduction: Extreme heat events due to climate change are becoming increasingly frequent and severe, and may have an impact on human health. Administrative database studies using International Classification of Diseases, 10th revision (ICD-10) codes are powerful tools to measure the burden of acute heat illness (AHI) in Canada. We aimed to assess the validity of the coding algorithm for emergency department (ED) encounters for AHI in our region. Methods: Two independent reviewers retrospectively abstracted data from 507 medical records of patients presenting to two EDs in Ontario during May–September of 2015–2018. The gold standard definition of an AHI was chart-documented heat exposure with a heat-related complaint, such as syncope while working outdoors on a hot day. To determine the positive predictive value (PPV) of the ICD coding algorithm, records previously coded as ICD-10 heat illnesses were compared with the gold standard for AHI. To determine sensitivity (Sn), specificity (Sp) and negative predictive value (NPV), the gold standard was compared with randomly selected records. A total of 326,702 ED visits occurred in the study period, 208 of which had an ICD-10 code related to heat illness. A sample size calculation indicated a need to manually review 62 previously coded heat illnesses and 931 random cases, of which 50 and 474 had been reviewed, respectively. In both abstractions, 20% of cases underwent a blinded duplicate review. Results: In our review of 474 random records, 2 cases were identified as AHI without an appropriate ICD-10 code, 445 were not AHIs, and no cases had an AHI ICD-10 code inappropriately applied. In our review of 50 previously coded heat illnesses, 34 were found to be appropriately coded and 16 inappropriately coded as AHI under ICD-10. Patients with heat illness vs non-heat illness presentations had average ages of 32 and 48 years, and 49% and 64% were male, respectively.
The leading complaint in AHI was heat stroke/exhaustion (39%), followed by headache (15%), dizziness (9%), shortness of breath (9%) and syncope/presyncope (6%). Of all heat illness presentations, 76% followed a period of physical exertion. Conclusion: Final calculation of Sn, Sp, PPV and NPV for the algorithm will occur upon completion of the review. Preliminary results suggest that ICD-10 coding for AHI may be applied correctly in the ED. This study will help to determine whether administrative data can accurately be used to measure the burden of heat illness in Canada.
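The validation metrics named above all follow from a standard 2×2 table of coded status against the gold standard. A minimal sketch in Python; the counts below are only loosely based on the interim review (the abstract's final denominators were still pending, so these numbers are illustrative, not the study's result):

```python
def diagnostic_metrics(tp, fp, fn, tn):
    """Sensitivity, specificity, PPV and NPV from a 2x2 validation table.
    tp/fp: ICD-coded cases that are / are not true AHI per the gold standard
    fn/tn: non-coded cases that are / are not true AHI per the gold standard"""
    return {
        "Sn": tp / (tp + fn),
        "Sp": tn / (tn + fp),
        "PPV": tp / (tp + fp),
        "NPV": tn / (tn + fn),
    }

# Illustrative counts only, drawn from the partial review described above:
metrics = diagnostic_metrics(tp=34, fp=16, fn=2, tn=445)
print({k: round(v, 3) for k, v in metrics.items()})
```

Note that in a two-stage sample like this one (coded records and random records reviewed separately), Sn and Sp would normally be estimated per stratum with appropriate weighting; the single-table version above is a simplification.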
Introduction: Cannabinoid Hyperemesis Syndrome (CHS) in pediatric patients is poorly characterized. Literature is scarce, making identification and treatment challenging. This study's objective was to describe demographics and visit data of pediatric patients presenting to the emergency department (ED) with suspected CHS, in order to improve understanding of the disorder. Methods: A retrospective chart review was conducted of pediatric patients (12-17 years) with suspected CHS presenting to one of two tertiary-care EDs, one pediatric and one pediatric/adult (combined annual pediatric census 40,550), between April 2014 and March 2019. Charts were selected based on a discharge diagnosis of abdominal pain or nausea/vomiting with a positive cannabis urine screen, or a discharge diagnosis of cannabis use, using ICD-10 codes. Patients with a confirmed or likely diagnosis of CHS were identified, and data including demographics, clinical history, and ED investigations/treatments were recorded by a trained research assistant. Results: 242 patients met criteria for review. 39 were identified as having a confirmed or likely diagnosis of CHS (mean age 16.2 years, SD 0.85; 64% female). 87% were triaged as either CTAS-2 or CTAS-3. 80% of patients had cannabis use frequency/duration documented. Of these, 89% reported at least daily use, the mean consumption was 1.30 g/day (SD 1.13 g/day), and all reported ≥6 months of heavy use. 69% of patients had at least one psychiatric comorbidity. When presenting to the ED, all had vomiting, 81% had nausea, 81% had abdominal pain, and 30% reported weight loss. Investigations included venous blood gas (30%), pregnancy test in females (84%), liver enzymes (57%), pelvic or abdominal ultrasound (19%), abdominal X-ray (19%), and CT head (5%). 89% of patients received treatment in the ED, with 81% receiving anti-emetics, 68% receiving intravenous (IV) fluids, and 22% receiving analgesics.
Normal saline was the most commonly used IV fluid (80%) and ondansetron the most commonly used anti-emetic (90%). Cannabis was suspected to account for symptoms in 74%, with 31% of these given a formal diagnosis of CHS. 62% of patients had another ED visit within 30 days (before or after the sentinel visit), 59% of these for similar symptoms. Conclusion: This study of pediatric CHS reveals unique findings, including a preponderance of female patients, a majority who consume cannabis daily, and weight loss reported in nearly one-third. Many received extensive workups and most had multiple clustered visits to the ED.
The efficacy of venlafaxine extended-release (XR) at doses between 75 mg/d and 300 mg/d has been demonstrated in patients with recurrent major depressive disorder (MDD) over 2.5 years. This analysis evaluated the long-term efficacy of venlafaxine XR ≤225 mg/d, the approved dosage in many countries.
In the primary multicenter, double-blind trial, outpatients with recurrent MDD (N=1096) were randomized to receive 10-week acute-phase treatment with venlafaxine XR (75 mg/d to 300 mg/d) or fluoxetine (20 mg/d to 60 mg/d), followed by a 6-month continuation phase. Subsequently, at the start of 2 consecutive, double-blind, 12-month maintenance phases, venlafaxine XR responders were randomized to receive venlafaxine XR or placebo. Data from the 24 months of maintenance treatment were analyzed for the combined end point of maintenance of response (ie, no recurrence of depression and no dose increase above 225 mg/d), and each component individually. Time to each outcome was evaluated with Kaplan-Meier methods using log-rank tests for venlafaxine XR-placebo comparisons.
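Time-to-event estimates of this kind come from the Kaplan-Meier product-limit formula, S(t) = Π(1 − d_i/n_i) over event times t_i ≤ t. A minimal pure-Python sketch; the follow-up times below are invented toy data, since the trial's individual-level data are not given here:

```python
def kaplan_meier(times, events):
    """Product-limit (Kaplan-Meier) survival estimate.

    times:  follow-up time for each subject
    events: 1 if the event (e.g. recurrence) occurred, 0 if censored
    Returns [(event_time, survival_probability), ...].
    """
    data = sorted(zip(times, events))
    n_at_risk = len(data)
    survival = 1.0
    curve = []
    i = 0
    while i < len(data):
        t = data[i][0]
        # subjects tied at time t (data is sorted, so ties are contiguous)
        tied = [e for tt, e in data[i:] if tt == t]
        deaths = sum(tied)
        if deaths:
            survival *= 1 - deaths / n_at_risk
            curve.append((t, survival))
        n_at_risk -= len(tied)
        i += len(tied)
    return curve

# Toy data: follow-up months and recurrence indicators (1 = event, 0 = censored)
curve = kaplan_meier([3, 6, 6, 12, 18, 24], [1, 0, 1, 1, 0, 0])
print(curve)
```

The log-rank comparison between arms used in the trial is a separate test on the two groups' event tables; the estimator above only produces each arm's survival curve.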
The analysis population included 114 patients who had received venlafaxine XR doses ≤225 mg/d prior to maintenance-phase baseline (venlafaxine XR: n=55; placebo: n=59). Probability estimates for maintaining response were 70% for venlafaxine XR and 38% for placebo (P=0.007); for no dose increase, 76% and 58%, respectively (P=0.019); and for no recurrence, 87% and 65%, respectively (P=0.099).
These data confirm that venlafaxine XR is effective in maintaining response at doses ≤225 mg/d for up to 2.5 years in patients with MDD.
Individuals with attention-deficit/hyperactivity disorder (ADHD) may require long-term medication.
To measure growth and sexual maturation in children and adolescents with ADHD receiving lisdexamfetamine dimesylate (LDX) in a 2-year trial (SPD489-404), and to investigate the impact of long-term LDX treatment on growth and maturation.
Participants (aged 6–17 years) received dose-optimized, open-label LDX (30–70 mg/day) for 104 weeks. Weight, height and BMI z-scores were derived using the Centers for Disease Control and Prevention norms. Sexual maturation was assessed using the Tanner scale (participant-rated as closest to their stage of development based on standardized drawings).
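Growth-reference z-scores of the kind used here are computed with the LMS method, where L (Box-Cox power), M (median) and S (coefficient of variation) are looked up by age and sex in the reference tables. A sketch of the formula; the parameter values in the example are invented placeholders, not actual CDC table entries:

```python
import math

def lms_zscore(x, L, M, S):
    """Growth-chart z-score via the LMS method.
    x: measured value (e.g. BMI); L, M, S: age- and sex-specific
    Box-Cox power, median and coefficient of variation from the
    reference tables. Uses the log form when L == 0."""
    if L == 0:
        return math.log(x / M) / S
    return ((x / M) ** L - 1.0) / (L * S)

# Illustrative parameters only -- not real CDC reference values:
print(round(lms_zscore(x=21.0, L=-1.6, M=18.2, S=0.11), 2))
```

A measurement equal to the reference median M yields a z-score of exactly 0, which is why cohort mean z-scores near 0 (as at LOTA for weight) indicate convergence toward the reference population.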
Of 314 enrolled participants, 191 (60.8%) completed the study. Mean z-scores at baseline and last on-treatment assessment (LOTA) were 0.53 (standard deviation, 0.963) and 0.02 (1.032) for weight, 0.61 (1.124) and 0.37 (1.131) for height, and 0.32 (0.935) and −0.27 (1.052) for BMI. In general, z-scores shifted lower over the first 36 weeks and then stabilized. At LOTA, most participants remained at their baseline Tanner stage or shifted higher, based on the development of hair (males, 95.5%; females, 92.1%) or genitalia/breasts (males, 94.7%; females, 98.4%).
Consistent with previous studies of stimulants used to treat ADHD, z-scores for weight, height and BMI decreased, mostly in the first year, and then stabilized. No clinically concerning effects of LDX treatment on sexual maturation or the onset of puberty were observed.
Disclosure of interest
Study funded by Shire Development LLC.
Dr Isabel Hernández Otero (Alicia Koplowitz Foundation, Eli Lilly, Forest, Janssen-Cilag, Junta de Andalucia, Roche, Shire, Shire Pharmaceuticals Iberica S.L., and Sunovion).
The long-term safety and efficacy of lisdexamfetamine dimesylate (LDX) in children and adolescents with attention deficit/hyperactivity disorder (ADHD) was evaluated in a European 2-year, open-label study (SPD489-404).
To evaluate the time-course of treatment-emergent adverse events (TEAEs) in SPD489-404.
Participants aged 6–17 years received open-label LDX (30, 50 or 70 mg/day) for 104 weeks (4 weeks dose-optimization; 100 weeks dose-maintenance).
All enrolled participants (n = 314) were included in the safety population and 191 (60.8%) completed the study. TEAEs occurred in 282 (89.8%) participants; most were mild or moderate. TEAEs considered by the investigators as related to LDX were reported by 232 (73.9%) participants with the following reported for ≥ 10% of participants: decreased appetite (49.4%), weight decreased (18.2%), insomnia (13.1%). TEAEs leading to discontinuation and serious TEAEs occurred in 39 (12.4%) and 28 (8.9%) participants, respectively. The median (range) time to first onset and duration, respectively, of TEAEs identified by the sponsor as being of special interest were: insomnia (insomnia, initial insomnia, middle insomnia, terminal insomnia), 17.0 (1–729) and 42.8 (1–739) days; weight decreased, 29.0 (1–677) and 225.0 (26–724) days; decreased appetite, 13.5 (1–653) and 169.0 (1–749) days; headache, 22.0 (1–718) and 2.0 (1–729) days. Reports of insomnia, weight decreased, decreased appetite and headache were highest in the first 4–12 weeks.
TEAEs associated with long-term LDX treatment were characteristic of stimulant medications, with the greatest incidence observed during the first 4–12 weeks.
Disclosure of interest
The authors have not supplied their declaration of competing interest.
The development of digestive organs and the establishment of the gut microbiota in pullets play an important role throughout life. This study was conducted to investigate the effects of Bacillus subtilis (BS) on growth performance, intestinal function and gut microbiota in pullets from 0 to 6 weeks of age. Hy-Line Brown laying hens (1-day-old, n = 504) were randomly allocated to one of four diets in a 2 × 2 factorial design: (1) basal diet group (control); (2) antibiotics group (AGP), the basal diet supplemented with 20 mg/kg bacitracin zinc and 4 mg/kg colistin sulphate; (3) BS group, the basal diet supplemented with 500 mg/kg BS; and (4) mixed group, the basal diet supplemented with both AGP and BS. When BS was considered as the main effect, BS addition (1) reduced the feed conversion ratio at 4 to 6 weeks (P < 0.05); (2) decreased duodenal and jejunal crypt depth at 3 weeks; (3) increased the villus height : crypt depth (V : C) ratio in the duodenum at 3 weeks and jejunal villus height at 6 weeks; and (4) increased sucrase mRNA expression in the duodenum at 3 weeks and the jejunum at 6 weeks, and jejunal maltase and aminopeptidase expression at 3 weeks. When AGP was considered as the main effect, AGP supplementation (1) increased the V : C ratio in the ileum at 3 weeks of age and (2) increased sucrase mRNA expression in the duodenum at 3 weeks and the ileum at 6 weeks, and increased maltase expression in the ileum. A BS × AGP interaction was observed for average daily feed intake at 4 to 6 weeks, and for duodenal sucrase and jejunal maltase expression at 3 weeks. Furthermore, dietary BS or AGP addition improved caecal microbial diversity at 3 weeks, and a BS × AGP interaction was observed (P < 0.05) for the Shannon and Simpson indexes. At the genus level, the relative abundance of Lactobacillus was higher in the mixed group at 3 weeks and in the BS group at 6 weeks.
Anaerostipes, Dehalobacterium and Oscillospira were also found to be dominant genera in pullets given dietary BS. In conclusion, BS can improve intestinal morphology, alter the relative expression of digestive enzymes and modify the caecal microbiota, thereby increasing the efficiency of nutrient utilization. Our findings suggest that BS may have more beneficial effects than AGP, providing theoretical evidence and new insight into the application of BS in layer pullets.
Patients with severe mental disorders in low-resource settings have limited access to services, resulting in overwhelming caregiving burden for families. In extreme cases, this has led to the long-term restraining of patients in their homes. China underwent a nationwide initiative to unlock patients and provide continued treatment. This study aims to quantify household economic burden in families after unlocking and treatment, and to identify factors associated with increased burden due to schizophrenia.
A total of 264 subjects were enrolled from three geographically diverse provinces in 2012. Subjects were patients with schizophrenia who were previously put under restraints and had participated in the ‘unlocking and treatment’ intervention. The primary outcome was the current household economic burden, obtained from past year financial information collected through on-site interview. Patient disease characteristics, treatment, outcomes and family caregiving burden were collected as well. Univariate and multivariate linear regression were used to construct risk factor models for indirect economic burden.
After participating in the intervention, 85% of patients continued to receive mental health services, 70% used medication as prescribed and 80% were never relocked. Family members reported significantly decreased caregiving burden after receiving the intervention. Mean direct and indirect household economic burdens were CNY963 (US$31.7) and CNY11 724 (US$1670) per year, respectively, while family total income was on average CNY12 108 (US$1913) per year. Greater disease severity and poorer patient psychosocial function at time of study were found to be independent factors related to increased indirect burden.
The ‘unlocking and treatment’ intervention has improved the lives of patients and families. Indirect burden due to disease is still a major economic issue that needs to be addressed, potentially through improving treatment and patient functioning. Our findings contribute to the unravelling and eventual elimination of chronic restraining of mentally ill patients in low-resource settings.
There is evidence indicating that using the current UK energy feeding system to ration present sheep flocks may underestimate their nutrient requirements. The objective of the present study was to address this issue by developing updated maintenance energy requirements for current sheep flocks and evaluating whether these requirements were influenced by a range of dietary and animal factors. Data (n = 131) were collated from five experiments with sheep (5 to 18 months old and 29.0 to 69.8 kg BW) undertaken at the Agri-Food and Biosciences Institute of the UK from 2013 to 2017. The trials were designed to evaluate the effects of dietary type, genotype, physiological stage and sex on nutrient utilization and energetic efficiencies. Energy intake and output data were measured in individual calorimeter chambers. Energy balance (Eg) was calculated as the difference between gross energy intake and the sum of fecal energy, urine energy, methane energy and heat production. Data were analysed using restricted maximum likelihood analysis to develop the linear relationship between Eg or heat production and metabolizable energy (ME) intake, with the effects of a range of dietary and animal factors removed. The net energy (NEm) and ME (MEm) requirements for maintenance derived from the linear relationship between Eg and ME intake were 0.358 and 0.486 MJ/kg BW^0.75, respectively, which are 40% to 53% higher than those recommended in the energy feeding systems currently used to ration sheep in the USA and the UK. Further analysis of the current dataset revealed that concentrate supplementation, sire type and physiological stage had no significant effect on the derived NEm values. However, female lambs had a significantly higher NEm (0.352 v. 0.306 or 0.288 MJ/kg BW^0.75) and MEm (0.507 v. 0.441 or 0.415 MJ/kg BW^0.75) than male or castrated lambs.
The present results indicate that using the present UK energy feeding systems, developed over 40 years ago, to ration current sheep flocks could underestimate their maintenance energy requirements. There is an urgent need to update these systems to reflect the higher metabolic rates of current sheep flocks.
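The derivation of MEm as the ME intake at which energy balance is zero can be sketched with an ordinary least-squares fit of Eg on ME intake. The data points below are invented for illustration (chosen so the intercept lands near the reported 0.486 MJ/kg BW^0.75); the actual analysis used REML with dietary and animal effects removed:

```python
def fit_line(x, y):
    """Ordinary least-squares fit of y = a + b*x (a simplified stand-in
    for the REML regression described in the abstract)."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    b = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y)) / \
        sum((xi - mx) ** 2 for xi in x)
    a = my - b * mx
    return a, b

# Invented data: ME intake (MEI) and energy balance (Eg), MJ/kg BW^0.75
mei = [0.5, 0.7, 0.9, 1.1, 1.3]
eg = [0.01, 0.15, 0.29, 0.43, 0.57]

a, b = fit_line(mei, eg)
me_m = -a / b  # MEI at Eg = 0, i.e. the maintenance requirement MEm
print(round(me_m, 3))
```

The slope b is the efficiency of ME use for gain, and NEm follows from MEm via the efficiency of ME use for maintenance, which is why the paper reports both values.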
Introduction: Acute aortic syndrome (AAS) is a time-sensitive aortic catastrophe that is often misdiagnosed. There are currently no Canadian guidelines to aid in diagnosis. Our goal was to adapt the existing American Heart Association (AHA) and European Society of Cardiology (ESC) diagnostic algorithms for AAS into a Canadian evidence-based best-practices algorithm targeted at emergency medicine physicians. Methods: We chose to adapt the existing high-quality clinical practice guidelines (CPGs) previously developed by the AHA/ESC using the GRADE ADOLOPMENT approach. We created a National Advisory Committee of 21 members from across Canada, including academic, community and remote/rural emergency physicians/nurses, cardiothoracic and cardiovascular surgeons, cardiac anesthesiologists, critical care physicians, cardiologists, radiologists and patient representatives. The Advisory Committee communicated through multiple teleconference meetings, emails and a one-day in-person meeting. The panel prioritized questions and outcomes, using the Grading of Recommendations Assessment, Development and Evaluation (GRADE) approach to assess evidence and make recommendations. The algorithm was prepared and revised through feedback and discussion in an iterative process until consensus was achieved. Results: The diagnostic algorithm comprises an updated pretest probability assessment tool with further testing recommendations based on risk level. The updated tool incorporates the likelihood of an alternative diagnosis and point-of-care ultrasound. The final best-practice diagnostic algorithm defined risk levels as low (≤0.5%: no further testing), moderate (0.6-5%: further testing required) and high (>5%: computed tomography, magnetic resonance imaging or transesophageal echocardiography). During the consensus and feedback processes, we addressed a number of issues and concerns.
D-dimer can be used to reduce the probability of AAS in the intermediate-risk group, but should not be used in the low- or high-risk groups. Ultrasound was incorporated as a bedside clinical examination option in pretest probability assessment for aortic insufficiency and abdominal/thoracic aortic aneurysms. Conclusion: We have created the first Canadian best-practice diagnostic algorithm for AAS. We hope this diagnostic algorithm will standardize and improve the diagnosis of AAS in emergency departments across Canada.
Introduction: Endotracheal intubation (ETI) is a lifesaving procedure commonly performed by emergency department (ED) physicians that may lead to patient discomfort or adverse events (e.g., unintended extubation) if sedation is inadequate. No ED-based sedation guidelines currently exist, so individual practice varies widely. This study's objective was to describe the self-reported post-ETI sedation practice of Canadian adult ED physicians. Methods: An anonymous, cross-sectional, web-based survey featuring 7 common ED scenarios requiring ETI was distributed to adult ED physician members of the Canadian Association of Emergency Physicians (CAEP). Scenarios included post-cardiac arrest, hypercapnic and hypoxic respiratory failure, status epilepticus, polytrauma, traumatic brain injury, and toxicology. Participants indicated their first and second choices of sedative medication following ETI, as well as bolus vs. infusion administration, in each scenario. Data were summarized using descriptive statistics. Results: 207 ED physicians responded to the survey (response rate 16.8%). Emergency medicine training of respondents included CCFP-EM (47.0%), FRCPC (35.8%), and CCFP (13.9%). 51.0% of respondents work primarily in academic/teaching hospitals and 40.4% in community teaching hospitals. On average, responding physicians report providing care for 4.9 ± 6.8 (mean ± SD) intubated adult patients per month for varying durations (39.2% for 1–2 hours, 27.8% for 2–4 hours, and 22.7% for ≤1 hour). Across all clinical scenarios, propofol was the most frequently used medication for post-ETI sedation (38.0% of all responses) and was the most frequently chosen agent in every scenario except post-cardiac arrest, polytrauma, and hypercapnic respiratory failure. Ketamine was used second most frequently (28.2%), with midazolam third (14.5%).
Post-ETI sedation was provided by > 98% of physicians in all situations except the post-cardiac arrest (26.1% indicating no sedation) and toxicology (15.5% indicating no sedation) scenarios. Sedation was provided by infusion in 74.6% of cases and bolus in 25.4%. Conclusion: Significant practice variability with respect to post-ETI sedation exists amongst Canadian emergency physicians. Future quality improvement studies should examine sedation provided in real clinical scenarios with a goal of establishing best sedation practices to improve patient safety and quality of care.
Lithium-ion capacitors (LICs) and hybrid LICs (H-LICs) were assembled as three-layered pouch cells in an asymmetric configuration, employing Faradaic pre-lithiated hard carbon anodes with non-Faradaic ion adsorption-desorption activated carbon (AC) cathodes for the LICs and lithium iron phosphate (LiFePO4; LFP)/AC composite cathodes for the H-LICs. Room-temperature rate performance was evaluated after initial LIC and H-LIC cell formation as a function of the electrolyte additives. Capacity retention was measured after charging under high-temperature conditions; the design factor explored was the electrolyte additive formulation, with a focus on its stability. The high-temperature potential holds simulate the operation of electrochemical energy materials in extreme environments and accelerate the failure mechanisms associated with cell degradation, helping to identify robust electrolyte/additive combinations.
To investigate a Middle East respiratory syndrome coronavirus (MERS-CoV) outbreak event involving multiple healthcare facilities in Riyadh, Saudi Arabia; to characterize transmission; and to explore infection control implications.
Cases presented in 4 healthcare facilities in Riyadh, Saudi Arabia: a tertiary-care hospital, a specialty pulmonary hospital, an outpatient clinic, and an outpatient dialysis unit.
Contact tracing and testing were performed following reports of cases at 2 hospitals. Laboratory results were confirmed by real-time reverse transcription polymerase chain reaction (rRT-PCR) and/or genome sequencing. We assessed exposures and determined seropositivity among available healthcare personnel (HCP) cases and HCP contacts of cases.
In total, 48 cases were identified, involving patients, HCP, and family members across 2 hospitals, an outpatient clinic, and a dialysis clinic. At each hospital, transmission was linked to a unique index case. Moreover, 4 cases were associated with superspreading events (any interaction where a case patient transmitted to ≥5 subsequent case patients). All 4 of these patients were severely ill, were initially not recognized as MERS-CoV cases, and subsequently died. Genomic sequences clustered separately, suggesting 2 distinct outbreaks. Overall, 4 (24%) of 17 HCP cases and 3 (3%) of 114 HCP contacts of cases were seropositive.
We describe 2 distinct healthcare-associated outbreaks, each initiated by a unique index case and characterized by multiple superspreading events. Delays in recognition and in subsequent implementation of control measures contributed to secondary transmission. Prompt contact tracing, repeated testing, HCP furloughing, and implementation of recommended transmission-based precautions for suspected cases ultimately halted transmission.
We are performing systematic observational studies of Galactic interstellar isotopic ratios, including 18O/17O, 12C/13C, 14N/15N and 32S/34S. Our strategy combines multi-transition observational data for large samples of sources at different Galactocentric distances. Our preliminary results show positive Galactic radial gradients in 18O/17O and 12C/13C: in both cases the ratio increases with Galactocentric distance, in agreement with the inside-out formation scenario of our Galaxy. Observations of other isotopic ratios, such as 14N/15N and 32S/34S, are ongoing.
Curcumin has been credited with antioxidant, anti-inflammatory and antibacterial activities, and has shown highly protective effects against enteropathogenic bacteria and mycotoxins. Ochratoxin A (OTA) is one of the major intestinal pathogenic mycotoxins. The possible role of curcumin in alleviating the enterotoxicity induced by OTA is unknown. The effects of dietary curcumin supplementation on OTA-induced oxidative stress, intestinal barrier and mitochondrial dysfunction were examined in young ducks. A total of 540 mixed-sex 1-day-old White Pekin ducklings with an initial BW of 43.4±0.1 g were randomly assigned to a control group (fed only the basal diet), a group fed an OTA-contaminated diet (2 mg/kg feed), or a group fed the same OTA-contaminated feed plus 400 mg/kg of curcumin. Each treatment consisted of six replicates of 30 ducklings, and treatment lasted for 21 days. OTA caused a significant decrease in average daily gain (ADG) and an increase in feed : gain (P<0.05); curcumin co-treatment prevented the decreases in BW and ADG relative to the OTA group (P<0.05). Histopathological and ultrastructural examination showed clear signs of enterotoxicity caused by OTA, but these changes were largely prevented by curcumin supplementation. Curcumin decreased the concentrations of interleukin-1β, tumor necrosis factor-α and malondialdehyde, and increased the activity of glutathione peroxidase, in the jejunal mucosa of OTA-treated ducks (P<0.05). Additionally, curcumin increased jejunal mucosa occludin and tight junction protein 1 mRNA and protein levels, and decreased those of Rho-associated protein kinase 1 (P<0.05). Notably, curcumin inhibited the increased expression of apoptosis-related genes, and downregulated mitochondrial transcription factors A, B1 and B2 induced by OTA, without any effect on RNA polymerase mitochondrial (P<0.05).
These results indicate that curcumin can protect ducks from OTA-induced impairment of intestinal barrier function and mitochondrial integrity.
Whether there are distinct subtypes of schizophrenia is an important issue to advance understanding and treatment of schizophrenia.
To understand and treat individuals with schizophrenia, the aim was to advance understanding of differences between individuals, whether there are discrete subtypes, and how first-episode patients (FEP) may differ from multiple-episode patients (MEP).
These issues were analysed in 687 FEP and 1880 MEP with schizophrenia using the Positive and Negative Syndrome Scale (PANSS) for schizophrenia, before and after 6 weeks of antipsychotic medication.
The seven negative symptoms were correlated with each other and with P2 (conceptual disorganisation), G13 (disturbance of volition) and G7 (motor retardation). The main difference between individuals lay in the cluster of seven negative symptoms, which had a continuous unimodal distribution. Medication decreased the PANSS scores for all symptoms, and these effects were similar in the FEP and MEP groups.
The negative symptoms are a major source of individual differences, and there are potential implications for treatment.