Limited data exist for the management of hyperuricemia in non-oncologic patients, particularly in paediatric cardiac patients. Hyperuricemia is a risk factor for acute kidney injury and may prompt treatment in critically ill patients. The primary objective was to determine whether rasburicase use was associated with a greater probability of normalisation of serum uric acid compared with allopurinol. Secondary outcomes included percent reduction in uric acid, changes in serum creatinine, and cost of therapy.
Design:
A single-centre retrospective chart review.
Setting:
A 20-bed quaternary cardiovascular ICU in a university-based paediatric hospital in California.
Patients:
Patients admitted to cardiovascular ICU who received rasburicase or intravenous allopurinol between 2015 and 2016.
Interventions:
None.
Measurements and main results:
Data from a cohort of 14 patients receiving rasburicase were compared to 7 patients receiving IV allopurinol. Patients who were administered rasburicase for hyperuricemia were more likely to have a post-treatment uric acid level less than 8 mg/dl than those who received IV allopurinol (100% versus 43%; p = 0.0058). Patients who received rasburicase had a greater absolute reduction in post-treatment day 1 uric acid (−9 mg/dl versus −1.9 mg/dl; p = 0.002). There were no differences in post-treatment day 3 or day 7 serum creatinine or in time to normalisation of serum creatinine. The cost of therapy, normalised to a 20 kg patient, was greater in the allopurinol group ($18,720 versus $1928; p = 0.001).
Conclusion:
In a limited paediatric cardiac cohort, the use of rasburicase was associated with a greater reduction in uric acid levels and a lower cost compared with IV allopurinol.
Recognizing the importance not only of the clinician’s opinion but also of the patient’s experience and perspective, Sequenced Treatment Alternatives to Relieve Depression (STAR*D) utilized both clinician-reported and patient-reported outcomes in a large-scale multi-step study on antidepressant effectiveness in real-world settings. Both approaches indicate that <17% of Major Depressive Disorder (MDD) patients respond to novel oral treatments after two prior antidepressant failures. To address this low response rate and continue to investigate the use of patient-rated outcomes in clinical trials, an antidepressant with a new mechanism of action is being investigated for efficacy and safety utilizing both clinician-rated and patient-reported scales.
Methods
This is a post-hoc analysis of a Janssen R&D Phase 2a clinical trial (ESKETINTRD2003). Subjects aged 20-64 with MDD without psychotic features (DSM-IV) and a history of inadequate response to ≥2 antidepressants were randomized 3:1:1:1 to 1 week of twice-weekly treatment with intranasal placebo (n=33), esketamine 28 mg (n=11), 56 mg (n=11), or 84 mg (n=12). Participants taking oral antidepressants at study entry continued treatment during the study. Changes in depression severity were measured using the Clinical Global Impression Severity (CGI-S) and the Patient Global Impression Severity (PGI-S) scales.
Results
At all esketamine doses (28 mg, 56 mg, 84 mg), subjects reported a one-point mean change in PGI-S from baseline to week one compared to no change on placebo (p-values 0.005, 0.001, and 0.032, respectively). Similarly, mean CGI-S scores improved for subjects receiving esketamine at all doses (p-values 0.028, 0.004, and 0.049, respectively) compared to no change in placebo subjects. These data are consistent with previously reported data based on the Montgomery Åsberg Depression Rating Scale (MADRS) and support a positive correlation between patient-reported and clinician-reported outcomes.
Discussion
Initial results from this Phase 2a study suggest clinically relevant improvement in depression symptoms, as reported by both clinicians and patients, in as little as one week of twice-weekly intranasal esketamine treatment. This work will help guide future investigations of esketamine in larger populations to provide better therapeutic options for treatment-resistant MDD patients.
Previous analyses of adolescent suicides in England and Wales have focused on short time periods.
Aims
To investigate trends in suicide and accidental deaths in adolescents between 1972 and 2011.
Method
Time trend analysis of rates of suicides and deaths from accidental poisoning and hanging in 10- to 19-year-olds by age, gender and deprivation. Rate ratios were estimated for 1982–1991, 1992–2001 and 2002–2011 with 1972–1981 as comparator.
Results
Suicide rates have remained stable in 10- to 14-year-olds, with strong evidence for a reduction in accidental deaths. In males aged 15–19, suicide rates peaked in 2001 before declining. Hanging is the most common method of suicide. Rates were higher in males and in 15- to 19-year-olds living in more deprived areas.
Conclusions
Suicide rates in adolescents are at their lowest since the early 1970s with no clear evidence that changes in coroners' practices underlie this trend.
The modification of microbiota composition to a ‘beneficial’ one is a promising approach for improving intestinal as well as overall health. Natural fibres and phytochemicals that reach the proximal colon, such as those present in various nuts, provide substrates for the maintenance of healthy and diverse microbiota. The effects of increased consumption of specific nuts, which are rich in fibre as well as various phytonutrients, on human gut microbiota composition have not been investigated to date. The objective of the present study was to determine the effects of almond and pistachio consumption on human gut microbiota composition. We characterised microbiota in faecal samples collected from volunteers in two separate randomised, controlled, cross-over feeding studies (n 18 for the almond feeding study and n 16 for the pistachio feeding study) with 0, 1·5 or 3 servings/d of the respective nuts for 18 d. Gut microbiota composition was analysed using a 16S rRNA-based approach for bacteria and an internal transcribed spacer region sequencing approach for fungi. The 16S rRNA sequence analysis of 528 028 sequence reads, retained after removing low-quality and short-length reads, revealed various operational taxonomic units that appeared to be affected by nut consumption. The effect of pistachio consumption on gut microbiota composition was much stronger than that of almond consumption and included an increase in the number of potentially beneficial butyrate-producing bacteria. Although the numbers of bifidobacteria were not affected by the consumption of either nut, pistachio consumption appeared to decrease the number of lactic acid bacteria (P< 0·05). Increasing the consumption of almonds or pistachios appears to be an effective means of modifying gut microbiota composition.
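The abstract above notes that low-quality and short-length reads were removed before the 16S rRNA operational taxonomic unit analysis. As a hedged illustration of that preprocessing step (not the study's actual pipeline, whose thresholds are not reported), a minimal length-and-quality read filter might look like this; the cutoff values below are assumed for the example only.

```python
# Illustrative sketch of 16S read quality filtering; thresholds are assumptions,
# not values taken from the study.

def mean_quality(phred_scores):
    """Mean Phred quality score of a read."""
    return sum(phred_scores) / len(phred_scores)

def filter_reads(reads, min_length=200, min_mean_q=25):
    """Keep reads meeting minimum length and mean-quality thresholds."""
    kept = []
    for seq, quals in reads:
        if len(seq) >= min_length and mean_quality(quals) >= min_mean_q:
            kept.append((seq, quals))
    return kept

# Toy example: one passing read, one too short, one too low-quality.
reads = [
    ("A" * 250, [30] * 250),
    ("A" * 100, [30] * 100),
    ("A" * 250, [10] * 250),
]
print(len(filter_reads(reads)))  # 1
```

In practice this step is done by dedicated tools (e.g. a QIIME- or mothur-style pipeline); the sketch only shows the shape of the filtering criterion.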
A recurring theme of the Late Bronze Age is the apparent association between deliberate deposition of material and wet places. Recently, a human skull has been discovered within the basal sediments of a relict mire at Poulton-le-Fylde, Lancashire, dating to the later Bronze Age (c. 1250–840 cal BC). The find, which belonged to a c. 25–35 year old male, was located within a layer of silty wood peat representing the ancient root system of a hazel copse and containing many hazelnuts and some charcoal. Palaeopathological investigation established the likelihood that the skull had decomposed before deposition, and there are strong parallels between the find and its context and other prehistoric skulls recorded from British wetlands. The connection of the human remains with considerable amounts of hazel wood may also be of significance when viewed within the wider context of similar associations recorded from European bog-bodies. During the course of excavation and survey of the site, worked wood fragments were recovered indicating both human and animal (beaver) activity, dating to the later Bronze Age and Early Iron Age respectively. The stratigraphic sequence indicated that organic sedimentation resulted from the rapid flooding of a formerly relatively dry landscape, perhaps as a result of the effects of beaver damming – a possibility which may hold wider implications for the archaeological interpretation of prehistoric pollen data.
A 54-year-old female presented with a two year history of progressive headaches and upper neck pain. The headaches were worse with coughing and bending. Neurological examination was unremarkable including a normal cranial nerve examination. There was no papilloedema. A computed tomogram (CT) demonstrated a midline, posterior fossa, partly fatty, partly solid mass (Figure 1). Magnetic resonance imaging (MRI) demonstrated a mixed fatty, solid mass arising from the fourth ventricle and extending downward below the foramen magnum to the C1 level (Figure 2). The solid portions demonstrated enhancement. In addition, in the lateral right cerebellar hemisphere, there was a second, separate, solid, enhancing mass without any connection to the larger central lesion. A subtotal resection of the tumor was achieved through a suboccipital craniectomy.
Extended learning courses in Number and Algebra for in-service middle school mathematics teachers have been taught at West Virginia University since 2002. The format of the courses has remained constant: a two-credit-hour mathematics course and a one-credit-hour corequisite Curriculum and Instruction (C&I) course, with part I of both the mathematics and C&I courses taught in the fall semester and part II in the spring semester. Splitting the three credit hours into a mathematics portion and a C&I portion makes the applicability of the mathematics clear. The courses were conceived as part of an initiative for in-service middle school mathematics teachers in the NSF-funded statewide professional development initiative called MERIT (Mathematics Education Reform Initiative for Teachers). More recently the courses have been offered as part of the Southern Regional Education Board (SREB) Making Middle Grades Work program, which is independent of the MERIT initiative but has a similar philosophy and aims to meet similar goals of increasing capacity and teacher depth of knowledge at the middle school level. The courses are appropriate for both in-service and pre-service middle school mathematics teachers, but so far have proved to be of greatest use for in-service teachers.
The objectives of the courses are to increase the knowledge and competence of middle school mathematics teachers in content and pedagogy related to the teaching and learning of number and algebra. For the two-credit-hour mathematics courses, this means helping teachers to:
improve understanding of basic concepts and skills in the area of number and algebra;
view number and algebra from an advanced perspective.
We present new imaging data and archival multiwavelength observations of the little-studied emission nebula K 1-6 and its central star. Narrow-band images in Hα (+[N II]) and [O III] taken with the Faulkes Telescope North reveal a stratified, asymmetric, elliptical nebula surrounding a central star which has the colours of a late G or early K-type subgiant or giant. GALEX ultraviolet images reveal a very hot subdwarf or white dwarf coincident in position with this star. The cooler, optically dominant star is strongly variable with a period of 21.312 ± 0.008 days, and is possibly a high-amplitude member of the RS CVn class, although an FK Com classification is also possible. Archival ROSAT data provide good evidence that the cool star has an active corona. We conclude that K 1-6 is most likely an old bona fide planetary nebula at a distance of ∼1.0 kpc, interacting with the interstellar medium, and containing a binary or ternary central star. The observations and data analyses reported in this paper were conducted in conjunction with Year 11 high school students as part of an Australian Research Council Linkage Grant science education project, denoted Space To Grow, conducted jointly by professional astronomers, educational researchers, teachers, and high-school students.
The objective of this study was to determine the resource utilization of a tertiary care Japanese emergency department (ED) that was not immediately adjacent to the area of the 2011 Great East Japan earthquake and tsunami.
Methods
A retrospective chart review was performed at a tertiary care university-based urban ED located approximately 290 km from the primary site of destruction caused by a magnitude 9.0 earthquake and the resulting tsunami. All patients who presented during the twelve days before and twelve days after the disaster were included. Data were collected using preformed data collection sheets and stored in an Excel file. Abstracted data included gender, time in the ED, intravenous fluid administration, blood transfusion, oxygen, laboratory tests, electrocardiograms (ECGs), radiographs, ultrasound, diagnoses, surgical and medical referrals, and prescriptions written. Ten percent of the charts were reviewed for accuracy, and an error rate reported. Data were analyzed using 2-tailed t-tests, Fisher's exact tests, or rank sum tests. Bonferroni correction was used to adjust P values for multiple comparisons.
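The Bonferroni adjustment mentioned in the Methods is simple enough to sketch: each raw p-value is multiplied by the number of comparisons and capped at 1.0. The p-values below are made-up illustrations, not values from this study.

```python
# Hedged sketch of the Bonferroni correction for multiple comparisons;
# the raw p-values are invented for the example.

def bonferroni(p_values):
    """Bonferroni-adjust a list of raw p-values."""
    m = len(p_values)
    return [min(p * m, 1.0) for p in p_values]

raw = [0.004, 0.03, 0.20]
adjusted = bonferroni(raw)  # each raw p multiplied by 3, capped at 1.0
print(adjusted)
```

The adjustment is deliberately conservative: a comparison stays significant at the 0.05 level only if its raw p-value is below 0.05 divided by the number of tests.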
Results
Charts for 1193 patients were evaluated. The error rate for the abstracted data was 3.2% (95% CI, 2.4%-4.1%). Six hundred fifty-seven patients (53% male) were evaluated in the ED after the earthquake, representing a 23% increase in patient volume. Mean patient time spent in the ED decreased from 61 minutes to 52 minutes (median decrease from 35 minutes to 32 minutes; P = .005). Laboratory utilization decreased from 51% to 43% (P = .006). The percentage of patients receiving prescriptions increased from 48% to 54% (P = .002). There was no change in the number of patients evaluated for surgical complaints, but there was an increase in the number treated for medical or psychiatric complaints.
Conclusion
There was a significant increase in the number of people utilizing the ED in Tokyo after the Great East Japan earthquake and tsunami. Time spent in the ED was decreased along with laboratory utilization, possibly reflecting decreased patient acuity. This information may help in the allocation of national resources when planning for disasters.
Shimada M, Tanabe A, Gunshin M, Riffenburgh RH, Tanen DA. Resource Utilization in the Emergency Department of a Tertiary Care University-Based Hospital in Tokyo Before and After the 2011 Great East Japan Earthquake and Tsunami. Prehosp Disaster Med. 2012;27(6):1-4.
High-quality evidence on morale in the mental health workforce is lacking.
Aims
To describe staff well-being and satisfaction in a multicentre UK National Health Service (NHS) sample and explore associated factors.
Method
A questionnaire-based survey (n = 2258) was conducted in 100 wards and 36 community teams in England. Measures included a set of frequently used indicators of staff morale, and measures of perceived job characteristics based on Karasek's demand–control–support model.
Results
Staff well-being and job satisfaction were fairly good on most indicators, but emotional exhaustion was high among acute general ward and community mental health team (CMHT) staff and among social workers. Most morale indicators were moderately but significantly intercorrelated. Principal components analysis yielded two components, one appearing to reflect emotional strain, the other positive engagement with work. In multilevel regression analyses, factors associated with greater emotional strain included working in a CMHT or psychiatric intensive care unit (PICU), high job demands, low autonomy, limited support from managers and colleagues, age under 45 years, and junior grade. Greater positive engagement was associated with high job demands, autonomy and support from managers and colleagues, Black or Asian ethnic group, being a psychiatrist or service manager, and shorter length of service.
Conclusions
Potential foci for interventions to increase morale include CMHTs, PICUs and general acute wards. The explanatory value of the demand–control–support model was confirmed, but job characteristics did not fully explain differences in morale indicators across service types and professions.
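The principal components step reported in the Results (several intercorrelated morale indicators reducing to two components, one reflecting strain and one engagement) can be sketched as follows. The data here are synthetic, generated to mimic that two-factor structure; nothing below uses the survey's actual measurements.

```python
# Illustrative PCA on synthetic "morale indicator" data with a built-in
# two-component structure; not the study's data or exact procedure.
import numpy as np

rng = np.random.default_rng(0)
n = 500
strain = rng.normal(size=n)   # latent "emotional strain"
engage = rng.normal(size=n)   # latent "positive engagement"

# Six indicators: three loading on each latent factor, plus noise.
X = np.column_stack([
    strain + 0.3 * rng.normal(size=n),
    strain + 0.3 * rng.normal(size=n),
    strain + 0.3 * rng.normal(size=n),
    engage + 0.3 * rng.normal(size=n),
    engage + 0.3 * rng.normal(size=n),
    engage + 0.3 * rng.normal(size=n),
])

# PCA via eigendecomposition of the correlation matrix.
R = np.corrcoef(X, rowvar=False)
eigvals, eigvecs = np.linalg.eigh(R)
order = np.argsort(eigvals)[::-1]        # sort components by variance explained
eigvals = eigvals[order]
explained = eigvals / eigvals.sum()
print(explained[:2])  # the first two components capture most of the variance
```

With this construction the first two eigenvalues dominate, mirroring the paper's finding that a strain component and an engagement component summarise the correlated indicators.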
Paleoindian archaeology on the Great Plains is often characterized by the investigation of large mammal kill/butchery bonebeds with relatively high archaeological visibility. Extensively documented aspects of Paleoindian behavioral variability include the form and composition of weaponry systems, hunting strategies, carcass exploitation, and hunter mobility. Non-hunting oriented aspects of settlement and subsistence behavior are less documented. Information from Component 2 at the O.V. Clary site, in Ash Hollow, western Nebraska, lessens this imbalance of knowledge. It provides a fine-grained, spatially extensive record of Late Paleoindian (Allen Complex) activities at a winter base camp occupied for 5-7 months. This paper highlights elements of site structure and activity organization, emphasizing domestic behaviors including hearth use, site maintenance, and hide working. ArcGIS 9.3.1 (ESRI) and GeoDa 0.9.5-1 (Anselin 2003; Anselin et al. 2006) are employed in conjunction with middle-range observations and expectations to document and interpret spatial patterning in the distribution of over 57,000 artifacts, ecofacts, and red ochre nodules. More broadly, results are related to two models of Paleoindian residential mobility: the place-oriented model and the high-tech forager model. Rather than mutually exclusive scenarios, Component 2 indicates that these models reflect complementary structural poses within the overall behavioral system.
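The spatial patterning analysis described above was done with ArcGIS and GeoDa; as a hedged illustration of the kind of statistic such tools compute for artifact distributions, here is a minimal global Moran's I (spatial autocorrelation) calculation. The value vector and weights matrix are toy data, not counts from the O.V. Clary site.

```python
# Sketch of global Moran's I for spatial autocorrelation; toy data only,
# not measurements from the site discussed in the text.
import numpy as np

def morans_i(values, weights):
    """Global Moran's I for a value vector and a spatial weights matrix."""
    z = values - values.mean()
    n = len(values)
    w_sum = weights.sum()
    num = n * (weights * np.outer(z, z)).sum()
    den = w_sum * (z ** 2).sum()
    return num / den

# Four cells along a transect; neighbours are adjacent cells (chain weights).
vals = np.array([10.0, 9.0, 2.0, 1.0])  # high values cluster at one end
W = np.array([
    [0, 1, 0, 0],
    [1, 0, 1, 0],
    [0, 1, 0, 1],
    [0, 0, 1, 0],
], dtype=float)
print(morans_i(vals, W))  # positive: similar values sit next to each other
```

A positive Moran's I indicates clustering of similar values (e.g. artifact densities concentrating around hearths), values near zero indicate spatial randomness, and negative values indicate dispersion.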
The Angus Mammoth site in south-central Nebraska has been controversial since its discovery in 1931 when a fluted artifact was reported to be associated with the mammoth. For nearly 80 years it has not been known if Angus was a paleontological site predating the human occupation of North America as has been asserted by some geologists and paleontologists, or an archaeological site dating to the late Pleistocene as has been advocated by some archaeologists. Geomorphic study and luminescence dating have finally solved the problem after nearly eight decades. Although microwear and technological analyses have determined that the Angus biface is an authentic artifact, TL and IRSL dates have shown that the matrix above the mammoth is much too old for a mammoth/fluted point association to be valid.
While it is clear that self-reported racial/ethnic discrimination is related to illness, there are challenges in measuring self-reported discrimination or unfair treatment. In the present study, we evaluate the psychometric properties of a self-reported instrument across racial/ethnic groups in a population-based sample, and we test and interpret findings from applying two different widely-used approaches to asking about discrimination and unfair treatment. Even though we found that the subset of items we tested tap into a single underlying concept, we also found that different groups are more likely to report on different aspects of discrimination. Whether race is mentioned in the survey question affects both frequency and mean scores of reports of racial/ethnic discrimination. Our findings suggest caution to researchers when comparing studies that have used different approaches to measure racial/ethnic discrimination and allow us to suggest practical empirical guidelines for measuring and analyzing racial/ethnic discrimination. No less important, we have developed a self-reported measure of recent racial/ethnic discrimination that functions well in a range of different racial/ethnic groups and makes it possible to compare how racial/ethnic discrimination is associated with health disparities among multiple racial/ethnic groups.