Two introduced carnivores, the European red fox Vulpes vulpes and domestic cat Felis catus, have had extensive impacts on Australian biodiversity. In this study, we collate information on consumption of Australian birds by the fox, paralleling a recent study reporting on birds consumed by cats. We found records of consumption by foxes of 128 native bird species (18% of the non-vagrant bird fauna and 25% of those species within the fox’s range), a smaller tally than for cats (343 species, including 297 within the fox’s Australian range, a subset of that of the cat). Most (81%) bird species eaten by foxes are also eaten by cats, suggesting that predation impacts are compounded. As with consumption by cats, birds that nest or forage on the ground are most likely to be consumed by foxes. However, there is also some partitioning, with records of consumption by foxes but not cats for 25 bird species, indicating that impacts of the two predators may also be complementary. Bird species ≥3.4 kg were more likely to be eaten by foxes, and those <3.4 kg by cats. Our compilation provides an inventory and describes characteristics of Australian bird species known to be consumed by foxes, but we acknowledge that records of predation do not imply population-level impacts. Nonetheless, there is sufficient information from other studies to demonstrate that fox predation has significant impacts on the population viability of some Australian birds, especially larger birds, and those that nest or forage on the ground.
Plasmodium coatneyi has been proposed as an animal model for human Plasmodium falciparum malaria as it appears to replicate many aspects of pathogenesis and clinical symptomatology. As part of the ongoing evaluation of the rhesus macaque model of severe malaria, a detailed ultrastructural analysis of the interaction between the parasite and both the host erythrocytes and the microvasculature was undertaken. Tissue (brain, heart and kidney) from splenectomized rhesus macaques and blood from spleen-intact animals infected with P. coatneyi were examined by electron microscopy. In all three tissues, similar interactions (sequestration) between infected red blood cells (iRBC) and blood vessels were observed with evidence of rosette and auto-agglutinate formation. The iRBCs possessed caveolae similar to P. vivax and knob-like structures similar to P. falciparum. However, the knobs often appeared incompletely formed in the splenectomized animals, in contrast to the intact knobs exhibited by spleen-intact animals. Plasmodium coatneyi infection in the monkey replicates many of the ultrastructural features particularly associated with P. falciparum in humans and as such supports its use as a suitable animal model. However, the possible effect on host–parasite interactions and the pathogenesis of disease due to the use of splenectomized animals needs to be taken into consideration.
We present a broadband radio study of the transient jets ejected from the black hole candidate X-ray binary MAXI J1535–571, which underwent a prolonged outburst beginning on 2017 September 2. We monitored MAXI J1535–571 with the Murchison Widefield Array (MWA) at frequencies from 119 to 186 MHz over six epochs from 2017 September 20 to 2017 October 14. The source was quasi-simultaneously observed over the frequency range 0.84–19 GHz by UTMOST (the Upgraded Molonglo Observatory Synthesis Telescope), the Australian Square Kilometre Array Pathfinder (ASKAP), the Australia Telescope Compact Array (ATCA), and the Australian Long Baseline Array (LBA). Using the LBA observations from 2017 September 23, we measured the source size to be $34\pm1$ mas. During the brightest radio flare on 2017 September 21, the source was detected down to 119 MHz by the MWA, and the radio spectrum indicates a turnover between 250 and 500 MHz, which is most likely due to synchrotron self-absorption (SSA). By fitting the radio spectrum with an SSA model and using the LBA size measurement, we determined various physical parameters of the jet knot (identified in ATCA data), including the jet opening angle ($\phi_{\rm op} = 4.5\pm1.2^{\circ}$) and the magnetic field strength ($B_{\rm s} = 104^{+80}_{-78}$ mG). Our fitted magnetic field strength agrees reasonably well with that inferred from the standard equipartition approach, suggesting the jet knot to be close to equipartition. Our study highlights the capabilities of the Australian suite of radio telescopes to jointly probe radio jets in black hole X-ray binaries via simultaneous observations over a broad frequency range and with differing angular resolutions, allowing us to determine the physical properties of X-ray binary jets. Finally, our study emphasises the potential contributions that can be made by the low-frequency part of the Square Kilometre Array (SKA-Low) in the study of black hole X-ray binaries.
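The turnover-fitting step described above can be illustrated with a minimal sketch: fitting a homogeneous synchrotron self-absorption spectrum to synthetic flux densities spanning roughly the MWA-to-ATCA frequency range. Every number here (normalisation, turnover frequency, electron index, noise level) is an illustrative assumption, not one of the paper's measurements.

```python
import numpy as np
from scipy.optimize import curve_fit

def ssa_spectrum(nu, s0, nu1, p):
    """Homogeneous synchrotron self-absorption spectrum.

    nu  : frequency (GHz)
    s0  : flux normalisation (Jy)
    nu1 : frequency at which the optical depth tau = 1 (GHz)
    p   : electron energy power-law index
    """
    tau = (nu / nu1) ** (-(p + 4) / 2)
    # Optically thick (nu^2.5) below the turnover, thin power law above it.
    return s0 * (nu / nu1) ** 2.5 * (1 - np.exp(-tau))

# Synthetic observations over roughly the MWA-to-ATCA range, with an
# assumed turnover near 0.35 GHz and 3% multiplicative noise.
rng = np.random.default_rng(1)
freqs = np.geomspace(0.119, 19.0, 12)   # GHz
flux = ssa_spectrum(freqs, 1.2, 0.35, 2.5) * rng.normal(1.0, 0.03, freqs.size)

popt, _ = curve_fit(ssa_spectrum, freqs, flux,
                    p0=(1.0, 0.3, 2.0),
                    bounds=([0.0, 0.05, 1.0], [10.0, 5.0, 4.0]))
turnover_ghz = popt[1]   # fitted tau = 1 frequency; should sit near 0.35
```

In the actual analysis, the fitted turnover is combined with the LBA angular size to constrain the magnetic field; the sketch above only shows the spectral-fitting step.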
The aim of the current study was to explore the effect of gender, age at onset, and duration on the long-term course of schizophrenia.
Methods
Twenty-nine centers from 25 countries representing all continents participated in the study, which included 2358 patients aged 37.21 ± 11.87 years with a DSM-IV or DSM-5 diagnosis of schizophrenia; the Positive and Negative Syndrome Scale as well as relevant clinicodemographic data were gathered. Analysis of variance and analysis of covariance were used, with the methodology correcting for the presence of potentially confounding effects.
Results
Age at onset was 3 years later in females (P < .001), who also showed lower rates of negative symptoms (P < .01) and higher depression/anxiety measures (P < .05) at some stages. The age at onset showed a single-peak distribution in both genders, with a tendency for patients with younger onset to advance more slowly through illness stages (P = .001). No significant effects were found concerning duration of illness.
Discussion
Our results confirmed a later onset and a possibly more benign course and outcome in females. Age at onset showed a single peak in both genders and, surprisingly, earlier onset was related to slower progression of the illness. No effect of duration was detected. These results are partially in accord with the literature, but they also differ as a consequence of the different starting point of our methodology (a novel staging model), which in our opinion minimised the impact of confounding effects. Future research should focus on the therapeutic and policy implications of these results in more representative samples.
A novel paediatric disease, multi-system inflammatory syndrome in children, has emerged during the coronavirus disease 2019 pandemic.
Objectives:
To describe the short-term evolution of cardiac complications and associated risk factors in patients with multi-system inflammatory syndrome in children.
Methods:
Retrospective single-centre study of confirmed multi-system inflammatory syndrome in children treated from 29 March, 2020 to 1 September, 2020. Cardiac complications during the acute phase were defined as decreased systolic function, coronary artery abnormalities, pericardial effusion, or mitral and/or tricuspid valve regurgitation. Patients with and without cardiac complications were compared using chi-square, Fisher’s exact, and Wilcoxon rank-sum tests.
Results:
Thirty-nine children with median (interquartile range) age 7.8 (3.6–12.7) years were included. Nineteen (49%) patients developed cardiac complications including systolic dysfunction (33%), valvular regurgitation (31%), coronary artery abnormalities (18%), and pericardial effusion (5%). At the time of the most recent follow-up, at a median (interquartile range) of 49 (26–61) days, cardiac complications resolved in 16/19 (84%) patients. Two patients had persistent mild systolic dysfunction and one patient had persistent coronary artery abnormality. Children with cardiac complications were more likely to have higher N-terminal B-type natriuretic peptide (p = 0.01), higher white blood cell count (p = 0.01), higher neutrophil count (p = 0.02), severe lymphopenia (p = 0.05), use of milrinone (p = 0.03), and intensive care requirement (p = 0.04).
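The group comparisons named in the Methods (Fisher's exact for categorical factors such as milrinone use, Wilcoxon rank-sum for continuous markers such as white blood cell count) can be sketched as follows. All counts and laboratory values below are hypothetical illustrations, not the study's data.

```python
from scipy.stats import fisher_exact, mannwhitneyu

# Hypothetical 2x2 table (NOT the study's data): milrinone use in patients
# with vs. without cardiac complications.
table = [[8, 2],     # complications:    milrinone yes / no
         [3, 16]]    # no complications: milrinone yes / no
odds_ratio, p_milrinone = fisher_exact(table)

# Wilcoxon rank-sum (Mann-Whitney U) test for a continuous marker, e.g.
# white blood cell count (x10^9/L); values invented for illustration.
wbc_with = [14.2, 11.8, 15.6, 13.0, 12.4]
wbc_without = [8.1, 9.5, 7.7, 10.2, 8.8]
u_stat, p_wbc = mannwhitneyu(wbc_with, wbc_without, alternative="two-sided")
```

Fisher's exact is preferred over chi-square here because several cell counts in a 39-patient cohort would be small.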
Conclusion:
Patients with multi-system inflammatory syndrome in children had a high rate of cardiac complications in the acute phase, with associated inflammatory markers. Although cardiac complications resolved in 84% of patients, further long-term studies are needed to assess if the cardiac abnormalities (transient or persistent) are associated with major cardiac events.
When subordinates have suffered an unfairness, managers sometimes try to compensate them by allocating something extra that belongs to the organization. These reactions, which we label as managerial Robin Hood behaviors, are undertaken without the consent of senior leadership. In four studies, we present and test a theory of managerial Robin Hoodism. In study 1, we found that managers themselves reported engaging in Robin Hoodism for various reasons, including a moral concern with restoring justice. Study 2 results suggested that managerial Robin Hoodism is more likely to occur when the justice violations involve distributive and interpersonal justice rather than procedural justice violations. In studies 3 and 4, when moral identity (trait or primed) was low, both distributive and interpersonal justice violations showed similar relationships to managerial Robin Hoodism. However, when moral identity was high, interpersonal justice violations showed a strong relationship to managerial Robin Hoodism regardless of the level of distributive justice.
The National Neuropsychology Network (NNN) is a multicenter clinical research initiative funded by the National Institute of Mental Health (NIMH; R01 MH118514) to facilitate neuropsychology’s transition to contemporary psychometric assessment methods with resultant improvement in test validation and assessment efficiency.
Method:
The NNN includes four clinical research sites (Emory University; Medical College of Wisconsin; University of California, Los Angeles (UCLA); University of Florida) and Pearson Clinical Assessment. Pearson Q-interactive (Q-i) is used for data capture for Pearson published tests; web-based data capture tools programmed by UCLA, which serves as the Coordinating Center, are employed for the remaining measures.
Results:
NNN is acquiring item-level data from 500–10,000 patients across 47 widely used Neuropsychology (NP) tests and sharing these data via the NIMH Data Archive. Modern psychometric methods (e.g., item response theory) will specify the constructs measured by different tests and determine their positive/negative predictive power regarding diagnostic outcomes and relationships to other clinical, historical, and demographic factors. The Structured History Protocol for NP (SHiP-NP) helps standardize acquisition of relevant history and self-report data.
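The item response theory methods mentioned above can be illustrated with the two-parameter logistic (2PL) model, the kind of model that item-level data such as the NNN's can support. The parameter values below are illustrative assumptions, not NNN estimates.

```python
import math

def p_correct(theta, a, b):
    """2PL item response function: probability that an examinee with
    ability theta passes an item with discrimination a and difficulty b."""
    return 1.0 / (1.0 + math.exp(-a * (theta - b)))

# An average-ability examinee (theta = 0) facing an item of average
# difficulty (b = 0) has a 50% chance of success regardless of a;
# higher discrimination a makes the curve steeper around b.
p_avg = p_correct(0.0, a=1.5, b=0.0)    # → 0.5
p_high = p_correct(2.0, a=1.5, b=0.0)   # higher ability, higher probability
```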
Conclusions:
NNN is a proof-of-principle collaboration: by addressing logistical challenges, NNN aims to engage other clinics to create a national and ultimately an international network. The mature NNN will provide mechanisms for data aggregation enabling shared analysis and collaborative research. NNN promises ultimately to enable robust diagnostic inferences about neuropsychological test patterns and to promote the validation of novel adaptive assessment strategies that will be more efficient, more precise, and more sensitive to clinical contexts and individual/cultural differences.
Coronavirus disease 2019 (COVID-19) has migrated to regions that were initially spared, and it is likely that different populations are currently at risk for illness. Herein, we present our observations of the change in characteristics and resource use of COVID-19 patients over time in a national system of community hospitals to help inform those managing surge planning, operational management, and future policy decisions.
Objective:
To determine risk factors for mortality among COVID-19 patients admitted to a system of community hospitals in the United States.
Design:
Retrospective analysis of patient data collected from the routine care of COVID-19 patients.
Setting:
System of >180 acute-care facilities in the United States.
Participants:
All admitted patients with positive identification of COVID-19 and a documented discharge as of May 12, 2020.
Methods:
Demographic characteristics, vital signs at admission, patient comorbidities, and recorded discharge disposition were determined in this population to construct a logistic regression estimating the odds of mortality, particularly for those patients characterized as not critically ill at admission.
Results:
In total, 6,180 COVID-19+ patients were identified as of May 12, 2020. Most COVID-19+ patients (4,808, 77.8%) were admitted directly to a medical-surgical unit with no documented critical care or mechanical ventilation within 8 hours of admission. After adjusting for demographic characteristics, comorbidities, and vital signs at admission in this subgroup, the largest driver of the odds of mortality was patient age (OR, 1.07; 95% CI, 1.06–1.08; P < .001). Decreased oxygen saturation at admission was associated with increased odds of mortality (OR, 1.09; 95% CI, 1.06–1.12; P < .001) as was diabetes (OR, 1.57; 95% CI, 1.21–2.03; P < .001).
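As a sketch of how adjusted odds ratios like those above are obtained, the following fits a logistic regression by maximum likelihood to a synthetic admission cohort. The predictors, coefficients, and cohort are assumptions for illustration only, not the study's records.

```python
import numpy as np
from scipy.optimize import minimize

# Synthetic cohort: every number below is an illustrative assumption.
rng = np.random.default_rng(0)
n = 5000
age = rng.uniform(30, 90, n)      # years
spo2 = rng.uniform(85, 100, n)    # oxygen saturation at admission, %

# Assumed true model: OR 1.07 per year of age, OR 1.09 per 1% drop in SpO2.
logit = 2.0 + np.log(1.07) * age - np.log(1.09) * spo2
died = rng.random(n) < 1 / (1 + np.exp(-logit))

X = np.column_stack([np.ones(n), age, spo2])

def neg_log_lik(beta):
    z = X @ beta
    # Bernoulli log-likelihood under the logistic model (overflow-safe).
    return -np.sum(died * z - np.logaddexp(0.0, z))

beta_hat = minimize(neg_log_lik, np.zeros(3), method="BFGS").x
or_age = np.exp(beta_hat[1])      # adjusted odds ratio per year of age
```

Exponentiating a fitted coefficient gives the adjusted odds ratio per unit of that predictor; an OR of 1.07 per year compounds to roughly 1.07**10 ≈ 1.97 per decade of age.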
Conclusions:
The identification of factors observable at admission that are associated with mortality in COVID-19 patients who are initially admitted to non-critical care units may help care providers, hospital epidemiologists, and hospital safety experts better plan for the care of these patients.
Even though sub-Saharan African women spend millions of person-hours per day fetching water and pounding grain, few studies to date have rigorously assessed the energy costs of such domestic activities. As a result, most analyses of head-hauling water or hand-pounding grain with a mortar and pestle (pilão use) employ energy expenditure values derived from limited research. The current paper compares estimated energy expenditure values from heart rate monitors v. indirect calorimetry in order to understand some of the limitations of using such monitors to measure domestic activities.
Design:
This confirmation study estimates the metabolic equivalent of task (MET) value for head-hauling water and hand-pounding grain using both indirect calorimetry and heart rate monitors under laboratory conditions.
Setting:
The study was conducted in Nampula, Mozambique.
Participants:
Forty university students in Nampula city who recurrently engaged in water-fetching activities.
Results:
Including all participants, the mean MET value for head-hauling 20 litres (20·5 kg, including container) of water (2·7 km/h, 0 % slope) was 4·3 (sd 0·9), and 3·7 (sd 1·2) for pilão use. Estimated energy expenditure predictions from a mixed model were found to correlate with observed energy expenditure (r² = 0·68, r = 0·82). Re-estimating the model with the pilão use data excluded improved the fit substantially (r² = 0·83, r = 0·91).
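As a worked example of how a MET value translates into energy expenditure, the standard approximation 1 MET ≈ 1 kcal per kg of body mass per hour can be applied to the head-hauling figure. The body mass and trip duration below are assumptions for illustration, not study measurements.

```python
# Converting a MET value into energy expenditure.
# Approximation: 1 MET ~ 1 kcal per kg of body mass per hour.
met_hauling = 4.3       # mean MET for head-hauling 20 litres (from the study)
body_mass_kg = 60.0     # assumed carrier body mass
trip_hours = 0.5        # assumed round-trip duration

kcal = met_hauling * body_mass_kg * trip_hours   # → 129.0 kcal
```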
Conclusions:
The current study finds that heart rate monitors are suitable instruments for providing accurate quantification of energy expenditure for some domestic activities, such as head-hauling water, but are not appropriate for quantifying expenditures of other activities, such as hand-pounding grain.
All sperm accrue varying amounts of DNA damage during maturation and storage, a process that appears to be mediated through oxidative stress. The clinical significance of genetic damage in the male germ line depends upon severity and how that damage is distributed among the sperm population. In human reproduction, the embryo is capable of significant DNA repair, which occurs prior to the first cleavage event. However, when the magnitude of genomic damage reaches pathologic levels, reproductive outcomes begin to be affected. Evidence now exists linking excessive sperm DNA fragmentation with time to pregnancy for natural conception, pregnancy outcomes of intrauterine insemination and in vitro fertilization, and miscarriage rates when intracytoplasmic sperm injection is employed. This review will discuss the pathophysiology of sperm DNA damage, the studies linking it to impaired reproductive outcomes, and how clinicians may render treatment to optimize the chance of paternity for their patients.
Obstructive azoospermia (OA) is a common presenting condition of male infertility, resulting from either congenital or acquired blockage of the reproductive tract. Men facing a diagnosis of OA now have an array of treatment options, including definitive reconstruction and various forms of sperm retrieval. The optimum treatment decision for OA will depend on the goals, values, and expectations of the patient and his partner. In this review we will discuss the therapeutic approach to OA, stressing the requirement of a clear and thoughtful plan for staged intervention. Any proposed treatment strategy should optimize the chances of paternity while minimizing damage to the male genitourinary system. Special attention will be paid to the role of microdissection testicular sperm extraction (microTESE), as it is a useful and often underutilized rescue procedure for OA. Specifically, the advantages and disadvantages of microTESE will be evaluated, with particular focus on success rates and safety.
The study was triggered by the first author's own experience on an undergraduate elective at the National Mental Wellness Centre in St Lucia. This was an eye-opening experience of psychiatry in a less economically developed environment, and it highlighted disparities between practice in the developed and the developing world. Notably, significant differences were apparent in facilities, the epidemiology of presenting complaints, the interaction of cultural beliefs, and the methods of assessment and management.
Aims/objectives
To review the literature on the educational impact of electives in psychiatry.
Methods
A literature search of Ovid MEDLINE was conducted using the keywords ‘medical student’ AND ‘elective’ AND ‘psychiatry’. A total of 229 results were returned, which were then analysed for relevance.
Results
Only one paper was found emphasising the importance of electives in psychiatry, and it reported on one individual's personal experience. There were also reports highlighting the importance of undergraduate elective experience and the need to increase exposure to psychiatry to improve the uptake of postgraduate training programmes. There were no papers objectively assessing the educational quality or impact of a psychiatric elective experience.
Conclusions
An overseas elective experience was subjectively beneficial for the author but there is a lack of objective research to show the educational benefit of psychiatry electives on a wider scale. Further research regarding the educational benefits of electives in psychiatry is needed.
Schizophrenia often presents in adolescence (13–18 years), where it is more likely to carry a poor prognosis, and young people are also more prone to adverse effects. Clearer guidance is needed in order to plan treatment for early-onset cases more effectively.
Objectives:
We aimed to evaluate the effects of atypical antipsychotic medications for psychosis in adolescents.
Search methods:
We searched the Cochrane Schizophrenia Group's Register. References of all identified studies were inspected for further trials.
Methodology:
All relevant RCTs that compared atypical antipsychotic medication with pharmacological or non-pharmacological interventions in adolescents with psychosis were included. We reliably selected, quality-assessed and extracted data from the trials.
Results:
There were 13 RCTs with a total of 1112 participants. Adolescents improved more on a standard dose of risperidone (1.5–6.0 mg) than on a low dose (0.15–0.6 mg) (1 RCT, n = 255, RR 0.54, CI 0.38 to 0.75). Participants on clozapine were three times more likely to experience drowsiness than those on haloperidol (1 RCT, n = 21, RR 3.30, CI 1.23 to 8.85, NNH 2, CI 2 to 17). Fewer adolescents on atypical antipsychotics than on typical antipsychotics left the study due to adverse effects (3 RCTs, n = 187, RR 0.65, CI 0.36 to 1.15).
Authors' conclusions:
There is no convincing evidence that atypical antipsychotic medications are superior to typical antipsychotic medications. There is some evidence that adolescents respond better to standard doses than to lower doses of medication. Larger, more robust trials are required.
The aim of the current study was to explore the changing interrelationships among clinical variables through the stages of schizophrenia in order to assemble a comprehensive and meaningful disease model.
Methods
Twenty-nine centers from 25 countries participated and included 2358 patients aged 37.21 ± 11.87 years with schizophrenia. Multiple linear regression analysis and visual inspection of plots were performed.
Results
The results suggest that, as the illness progresses through its stages, the correlations among Positive and Negative Syndrome Scale factors change, with each factor correlating with all the others in the stage in which that factor is dominant. This internal structure further supports the validity of an already proposed four-stage model, with positive symptoms dominating the first stage, excitement/hostility the second, depression the third, and neurocognitive decline the last.
Conclusions
The current study investigated mental organization and functioning in patients with schizophrenia in relation to different stages of illness progression. It revealed two distinct “cores” of schizophrenia, the “Positive” and the “Negative,” while neurocognitive decline escalates during the later stages. Future research should focus on the therapeutic implications of such a model. Halting the progress of the illness could require halting the succession of stages. This could be achieved not only by stopping the triggering effect of positive and negative symptoms, but also by stopping the sensitization effect on the neural pathways responsible for the development of hostility, excitement, anxiety, and depression, as well as the deleterious effect on neural networks responsible for neurocognition.
Giant miscanthus has the potential to move beyond cultivated fields and invade noncrop areas, but this risk can be overshadowed by its aesthetic appeal and monetary value as a biofuel crop. Most research on giant miscanthus has focused on herbicide tolerance for establishment and production rather than on terminating an existing stand. This study was conducted to evaluate herbicide options for controlling or terminating a stand of giant miscanthus. In 2013 and 2014, field experiments were conducted on established stands of the giant miscanthus cultivars ‘Nagara’ and ‘Freedom.’ Herbicides evaluated in both years included glyphosate, hexazinone, imazapic, imazapyr, clethodim, fluazifop, and glyphosate plus fluazifop. All treatments were applied in summer (June or July) and in September. In both years, biomass reduction ranged from 85% to 100% when glyphosate was applied in June or July at 4.5 or 7.3 kg ae ha−1. No other treatment applied at this timing provided more than 50% giant miscanthus biomass reduction 1 yr after application. September applications of glyphosate were not consistent: treatments in 2013 reduced biomass by 40% or less, whereas in 2014 all rates provided at least 78% biomass reduction. Glyphosate applied in June or July was the only treatment that provided effective and consistent control of giant miscanthus 1 yr after treatment.
Diversified farms are operations that raise a variety of crops and/or multiple species of livestock, with the goal of utilising the products of one for the growth of the other, thus fostering a sustainable cycle. This type of farming reflects consumers' increasing demand for sustainably produced, naturally raised or pasture-raised animal products that are commonly produced on diversified farms. The specific objectives of this study were to characterise diversified small-scale farms (DSSF) in California, estimate the prevalence of Salmonella enterica and Campylobacter spp. in livestock and poultry, and evaluate the association between farm- and sample-level risk factors and the prevalence of Campylobacter spp. on DSSF in California using a multilevel logistic model. Most participating farms were organic and raised more than one animal species. Overall Salmonella prevalence was 1.19% (95% confidence interval (CI95) = 0.6–2), and overall Campylobacter spp. prevalence was 10.8% (CI95 = 9–12.9). Significant risk factors associated with Campylobacter spp. were farm size (odds ratio (OR) for 10–50 acres vs. less than 10 acres = 6, CI95 = 2.11–29.8), ownership of swine (OR = 9.3, CI95 = 3.4–38.8) and season (OR for spring vs. coastal summer = 3.5, CI95 = 1.1–10.9; OR for winter vs. coastal summer = 3.23, CI95 = 1.4–7.4). As the number of DSSF continues to grow, evaluating risk factors and management practices that are unique to these operations will help identify risk mitigation strategies and develop outreach materials to improve the food safety of animal and vegetable products produced on DSSF.
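Confidence intervals for prevalences like those reported above are commonly computed with the Wilson score interval for a binomial proportion. The counts below are hypothetical, chosen only to land near the reported 1.19% Salmonella prevalence; the paper reports prevalences and CIs, not these raw numbers.

```python
import math

def wilson_ci(x, n, z=1.96):
    """95% Wilson score interval for a binomial proportion x / n."""
    p = x / n
    denom = 1 + z**2 / n
    centre = (p + z**2 / (2 * n)) / denom
    half = z * math.sqrt(p * (1 - p) / n + z**2 / (4 * n**2)) / denom
    return centre - half, centre + half

# Hypothetical counts: e.g. 10 Salmonella-positive samples out of 840
# would give a prevalence of ~1.19% with a CI of roughly 0.6%-2.2%.
lo, hi = wilson_ci(10, 840)
```

Unlike the simpler Wald interval, the Wilson interval behaves well for proportions near zero, which matters for rare outcomes such as Salmonella detection.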