Gravitational waves from coalescing neutron stars encode information about nuclear matter at extreme densities that is inaccessible to laboratory experiments. The late inspiral is influenced by tides, which depend on the neutron star equation of state. Neutron star mergers are expected to often produce rapidly rotating remnant neutron stars that emit gravitational waves, providing clues to the extremely hot post-merger environment. This signature of nuclear matter in gravitational waves carries most of its information in the 2–4 kHz frequency band, outside the most sensitive band of current detectors. We present the design concept and science case for a Neutron Star Extreme Matter Observatory (NEMO): a gravitational-wave interferometer optimised to study nuclear physics with merging neutron stars. The concept uses high circulating laser power, quantum squeezing, and a detector topology specifically designed to achieve the high-frequency sensitivity necessary to probe nuclear matter using gravitational waves. Above 1 kHz, the proposed strain sensitivity is comparable to that of full third-generation detectors at a fraction of the cost. Such sensitivity raises the expected detection rate for post-merger remnants from approximately one per few decades with two A+ detectors to a few per year, and could enable the first gravitational-wave observations of supernovae, isolated neutron stars, and other exotica.
We conducted a matched case-control (MCC), test-negative case-control (TNCC) and case-cohort study in 2016 in Lusaka, Zambia, following a mass vaccination campaign. Confirmed cholera cases served as cases in all three study designs. In the TNCC, controls were suspected cases with negative cholera culture and polymerase chain reaction results. In the MCC study, controls matched by age and sex were selected among neighbours of the confirmed cases. For the case-cohort study, we recruited a cohort of randomly selected individuals living in areas considered at risk of cholera. We recruited 211 suspected cases (66 confirmed cholera cases and 145 non-cholera diarrhoea cases), 1055 matched controls and a cohort of 921 individuals. Adjusted vaccine effectiveness of one dose of oral cholera vaccine (OCV) was 88.9% (95% confidence interval (CI) 42.7–97.8) in the MCC study, 80.2% (95% CI 16.9–95.3) in the TNCC design and 89.4% (95% CI 64.6–96.9) in the case-cohort study. All three study designs confirmed the short-term effectiveness of a single dose of OCV. Major healthcare-seeking behaviour bias did not appear to affect our estimates. Most of the protection among vaccinated individuals could be attributed to the direct effect of the vaccine.
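As a minimal sketch of the arithmetic behind such estimates: in a case-control design, vaccine effectiveness is conventionally computed as VE = (1 − OR) × 100, where OR is the odds ratio of vaccination among cases versus controls. The counts below are hypothetical, not the study's data, and the sketch omits the matching and covariate adjustment behind the reported figures.

```python
# Illustrative only: crude (unadjusted, unmatched) vaccine effectiveness
# from a 2x2 case-control table. All counts are hypothetical.

def odds_ratio(vacc_cases, unvacc_cases, vacc_controls, unvacc_controls):
    """Odds ratio of vaccination among cases relative to controls."""
    return (vacc_cases / unvacc_cases) / (vacc_controls / unvacc_controls)

def vaccine_effectiveness(or_estimate):
    """Crude vaccine effectiveness (%) from a case-control odds ratio."""
    return (1.0 - or_estimate) * 100.0

# Hypothetical table: 6 of 66 cases vaccinated, 360 of 1055 controls vaccinated.
or_hat = odds_ratio(6, 60, 360, 695)
ve = vaccine_effectiveness(or_hat)
print(f"OR = {or_hat:.3f}, VE = {ve:.1f}%")  # OR ≈ 0.193, VE ≈ 80.7%
```

In practice the matched and adjusted estimates reported in the abstract come from conditional logistic regression or survival-type models, not this crude calculation.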
Dietary Zn has significant impacts on the growth and development of breeding rams. The objectives of this study were to evaluate the effects of dietary Zn source and concentration on serum Zn concentration, growth performance, wool traits and reproductive performance in rams. Forty-four Targhee rams (14 months; 68 ± 18 kg BW) were used in an 84-day completely randomized design and were fed one of three pelleted dietary treatments: (1) a control without fortified Zn (CON; n = 15; ~1 × NRC); (2) a diet fortified with a Zn amino acid complex (ZnAA; n = 14; ~2 × NRC) and (3) a diet fortified with ZnSO4 (ZnSO4; n = 15; ~2 × NRC). Growth and wool characteristics measured throughout the course of the study were BW, average daily gain (ADG), dry matter intake (DMI), feed efficiency (G : F), longissimus dorsi muscle depth (LMD), back fat (BF), wool staple length (SL) and average fibre diameter (AFD). Blood was collected from each ram at four time points to quantify serum Zn and testosterone concentrations. Semen was collected 1 to 2 days after the trial was completed. There were no differences in BW (P = 0.45), DMI (P = 0.18), LMD (P = 0.48), BF (P = 0.47) or AFD (P = 0.90) among treatment groups. Rams fed ZnSO4 had greater (P ≤ 0.03) serum Zn concentrations than the ZnAA and CON treatments. Rams consuming ZnAA had greater (P ≤ 0.03) ADG than ZnSO4 and CON. There tended to be differences among groups in G : F (P = 0.06), with ZnAA numerically greater than ZnSO4 and CON. Wool staple length regrowth was greater (P < 0.001) in the ZnSO4 group and tended to be longer (P = 0.06) in the ZnAA group compared with CON. No differences were observed among treatments in scrotal circumference, testosterone, spermatozoa concentration within ram semen, % motility, % live sperm or % sperm abnormalities (P ≥ 0.23). Results indicated beneficial effects of feeding increased Zn concentrations to developing Targhee rams, although Zn source elicited differential responses in the performance characteristics measured.
An unexpected increase in gastroenteritis cases was reported by healthcare workers on the KwaZulu-Natal Coast, South Africa, in January 2017, with >600 cases seen over a 3-week period. A case–control study was conducted to identify the source of and risk factors associated with the outbreak, so as to recommend control and prevention measures. Record review identified cases and controls, and structured telephone interviews were conducted to obtain exposure history. Stool specimens were collected from 20 cases along with environmental samples, and both were screened for enteric pathogens. A total of 126 cases and 62 controls were included in the analysis. The odds of developing gastroenteritis were 6.0 times greater among holidaymakers than residents (95% confidence interval (CI) 2.0–17.7). Swimming in the lagoon increased the odds of developing gastroenteritis 3.3 times (95% CI 1.06–10.38). Lagoon water samples tested positive for norovirus (NoV) GI.6, GII.3 and GII.6, astrovirus and rotavirus. Eleven (55%) stool specimens were positive for NoV, with eight genotyped as GI.1 (n = 2), GI.5 (n = 3), GI.6 (n = 2) and GI.7 (n = 1). A reported sewage contamination event affecting the lagoon was the likely source, with person-to-person spread perpetuating the outbreak. Restrictions on swimming in the lagoon were apparently ineffective at containing the outbreak, possibly due to inadequate enforcement, communication and signage strategies.
Introduction: Community Paramedics (CPs) require access to timely blood analysis in the field to guide treatment and transport decisions. Point of care testing (POCT), as opposed to traditional laboratory analysis, may offer a solution, but limited research exists on CP POCT. The objective of this study was to compare the validity of two POCT devices (Abbott i-STAT® and Alere epoc®) and their use by CPs in the community. Methods: In a CP programme responding to 6,000 annual patient care events, a split-sample validation of POCT against traditional laboratory analysis for seven analytes (sodium, potassium, chloride, creatinine, hemoglobin, hematocrit and glucose) was conducted on a consecutive sample of patients. The difference in proportions of discrepant results between POCT and the laboratory was compared using a two-sample proportion test. Usability was analysed by a survey of CP experience, an expert heuristic evaluation of the devices, a review of device-logged errors, coded observations of POCT use during quality control testing, and a linear mixed-effects model of System Usability Scale (SUS) scores adjusted for CP clinical and POCT experience. Results: Of 1,649 CP calls for service screened for enrollment, 174 had a blood draw, with 108 patient care encounters (62.1%) enrolled from 73 participants. Participants had a mean age of 58.7 years (SD 16.3); 49% were female. In 4 of 646 (0.6%) individual comparisons, POCT reported a critical value that the laboratory did not, with no statistically significant difference in the number of discrepant critical values reported with epoc® compared to i-STAT®. There were no instances of the laboratory reporting a critical value when POCT did not. In 88 of 1,046 (8.4%) individual comparisons, the a priori defined acceptable difference between POCT and the laboratory was exceeded; this occurred more often with epoc® (10.7%; 95% CI 8.1–13.3%) than with i-STAT® (6.1%; 95% CI 4.1–8.2%) (p = 0.007).
Eighteen of 19 CP surveys were returned, with 11/18 (61.1%) preferring i-STAT® over epoc®. The i-STAT® had a higher mean SUS score (higher usability) than the epoc® (84.0/100 v. 59.6/100; p = 0.011). Fewer field blood analysis device-logged errors occurred with i-STAT® (7.8%; 95% CI 2.9–12.7%) than with epoc® (15.5%; 95% CI 9.3–21.7%), although the difference was not statistically significant (p = 0.063). Conclusion: CP programs can expect valid results from POCT. Usability assessment suggests a preference for i-STAT®.
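The two-sample proportion test used for the discrepancy comparison can be sketched as below. The per-device denominators (523 comparisons per device, a hypothetical even split of the 1,046) and the discrepancy counts (56 v. 32) are assumptions chosen for illustration; with them, the test lands in the neighbourhood of the reported p = 0.007.

```python
from math import sqrt, erf

def two_sample_prop_test(x1, n1, x2, n2):
    """Two-sided two-sample z-test for equality of proportions (pooled SE)."""
    p1, p2 = x1 / n1, x2 / n2
    p_pool = (x1 + x2) / (n1 + n2)           # pooled proportion under H0
    se = sqrt(p_pool * (1 - p_pool) * (1 / n1 + 1 / n2))
    z = (p1 - p2) / se
    # two-sided p-value from the standard normal CDF: Phi(z) = 0.5*(1+erf(z/sqrt(2)))
    p_value = 2 * (1 - 0.5 * (1 + erf(abs(z) / sqrt(2))))
    return z, p_value

# Hypothetical counts: 56/523 discrepant for epoc, 32/523 for i-STAT.
z, p = two_sample_prop_test(56, 523, 32, 523)
print(f"z = {z:.2f}, p = {p:.4f}")  # z ≈ 2.67, p ≈ 0.008
```

Libraries such as statsmodels (`proportions_ztest`) implement the same test; the hand-rolled version above just makes the pooled-variance arithmetic explicit.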
Studies have consistently shown that subthreshold depression is associated with an increased risk of developing major depression. However, no study has yet calculated a pooled estimate that quantifies the magnitude of this risk across multiple studies.
We conducted a systematic review to identify longitudinal cohort studies containing data on the association between subthreshold depression and future major depression. A baseline meta-analysis was conducted using the inverse variance heterogeneity method to calculate the incidence rate ratio (IRR) of major depression among people with subthreshold depression relative to non-depressed controls. Subgroup analyses were conducted to investigate whether IRR estimates differed between studies categorised by age group or sample type. Sensitivity analyses were also conducted to test the robustness of baseline results to several sources of study heterogeneity, such as the case definition for subthreshold depression.
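The pooling step can be sketched with a standard inverse-variance fixed-effect combination of log incidence rate ratios. This is a simplification: the review's inverse variance heterogeneity (IVhet) method additionally redistributes study weights using a heterogeneity estimate. The three studies' IRRs and standard errors below are hypothetical inputs, not data from the review.

```python
from math import log, exp, sqrt

def pool_log_irr(irrs, ses):
    """Inverse-variance pooled IRR from per-study IRRs and SEs of log(IRR).

    Fixed-effect sketch; the IVhet method used in the review further adjusts
    these weights for between-study heterogeneity.
    """
    weights = [1 / se**2 for se in ses]       # inverse-variance weights
    pooled_log = sum(w * log(irr) for w, irr in zip(weights, irrs)) / sum(weights)
    pooled_se = sqrt(1 / sum(weights))
    lo = exp(pooled_log - 1.96 * pooled_se)   # 95% CI on the ratio scale
    hi = exp(pooled_log + 1.96 * pooled_se)
    return exp(pooled_log), (lo, hi)

# Hypothetical three-study input (not the 16 studies from the review):
irr, ci = pool_log_irr([1.6, 2.1, 2.4], [0.20, 0.25, 0.30])
print(f"pooled IRR = {irr:.2f} (95% CI {ci[0]:.2f}-{ci[1]:.2f})")  # ≈ 1.90 (1.45-2.49)
```

Pooling on the log scale is standard for ratio measures, since log(IRR) is approximately normally distributed while the IRR itself is skewed.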
Data from 16 studies (n = 67 318) revealed that people with subthreshold depression had an increased risk of developing major depression (IRR = 1.95, 95% confidence interval 1.28–2.97). Subgroup analyses estimated similar IRRs for different age groups (youth, adults and the elderly) and sample types (community-based and primary care). Sensitivity analyses demonstrated that baseline results were robust to different sources of study heterogeneity.
The results of this study support the scaling up of effective indicated prevention interventions for people with subthreshold depression, regardless of age group or setting.
Routine symptom monitoring and feedback improves out-patient outcomes, but the feasibility of its use to inform decisions about discharge from in-patient care has not been explored.
To examine the potential value to clinical decision-making of monitoring symptoms during psychiatric in-patient hospitalisation.
A total of 1102 in-patients in a private psychiatric hospital, primarily with affective and neurotic disorders, rated daily distress levels throughout their hospital stay. The trajectories of patients who had, and had not, met a criterion of clinically significant improvement were examined.
Two-thirds of patients (n = 604) met the clinically significant improvement criterion at discharge, and three-quarters (n = 867) had met the criterion at some point earlier during their hospital stay. After meeting the criterion, the majority (73.2%) showed stable symptoms across the remainder of their hospital stay, and both trajectory classes showed substantially lower symptom levels than at admission.
Monitoring of progress towards this criterion provides additional information regarding significant treatment response that could inform clinical decisions around discharge readiness.
The current study aimed to examine the impact of sociodemographic and health-service factors on breast-feeding in sub-Saharan African (SSA) countries with high diarrhoea mortality.
The study used the most recent and pooled Demographic and Health Survey data sets collected in nine SSA countries with high diarrhoea mortality. Multivariate logistic regression models that adjusted for cluster and sampling weights were used to investigate the association between sociodemographic and health-service factors and breast-feeding in SSA countries.
The setting was sub-Saharan African countries with high diarrhoea mortality. Participants were children (n 50 975) under 24 months old: Burkina Faso (2010, N 5710); Democratic Republic of the Congo (2013, N 6797); Ethiopia (2013, N 4193); Kenya (2014, N 7024); Mali (2013, N 3802); Niger (2013, N 4930); Nigeria (2013, N 11 712); Tanzania (2015, N 3894); and Uganda (2010, N 2913).
Overall prevalence of exclusive breast-feeding (EBF) and early initiation of breast-feeding (EIBF) was 35% and 44%, respectively. Uganda, Ethiopia and Tanzania had higher EBF prevalence compared with Nigeria and Niger. Prevalence of EIBF was highest in Mali and lowest in Kenya. Higher educational attainment and frequent health-service visits of mothers (i.e. antenatal care, postnatal care and delivery at a health facility) were associated with EBF and EIBF.
Breast-feeding practices in SSA countries with high diarrhoea mortality varied across geographical regions. To improve breast-feeding behaviours among mothers in SSA countries with high diarrhoea mortality, breast-feeding initiatives and policies should be context-specific, measurable and culturally appropriate, and should focus on all women, particularly mothers from low socio-economic groups with limited health-service access.
Public health interest in norovirus (NoV) has increased in recent years following improved diagnostics, global burden estimates and the development of NoV vaccine candidates. This study aimed to describe the detection rate, clinical characteristics and environmental features associated with NoV detection in hospitalized children <5 years with diarrhoea in South Africa (SA). Between 2009 and 2013, prospective diarrhoeal surveillance was conducted at four sites in SA. Stool specimens were collected and screened for NoVs and other enteric pathogens using molecular and serological assays. Epidemiological and clinical data were compared in patients with or without detection of NoV. The study detected NoV in 15% (452/3103) of hospitalized children <5 years with diarrhoea with the majority of disease in children <2 years (92%; 417/452). NoV-positive children were more likely to present with diarrhoea and vomiting (odds ratio (OR) 1·3; 95% confidence interval (CI) 1·1–1·7; P = 0·011) with none-to-mild dehydration (adjusted OR 0·5; 95% CI 0·3–0·7) compared with NoV-negative children. Amongst children testing NoV positive, HIV-infected children were more likely to have prolonged hospitalization and increased mortality compared with HIV-uninfected children. Continued surveillance will be important to consider the epidemic trends and estimate the burden and risk of NoV infection in SA.
We present an overview of the survey for radio emission from active stars that has been in progress for the last six years using the observatories at Fleurs, Molonglo, Parkes and Tidbinbilla. The role of complementary optical observations at the Anglo-Australian Observatory, Mount Burnett, Mount Stromlo and Siding Spring Observatories and Mount Tamborine is also outlined. We describe the different types of star that have been included in our survey and discuss some of the problems in making the radio observations.
We have designed and manufactured a multi-purpose electronic, computer-operated blink-comparator and measuring engine. It has been specifically designed to facilitate the examination of stellar images on Uppsala Schmidt photographic plates, identify and establish coordinates of new suspect variable stars appearing on the plates being examined, and also to derive photometric values for those stars manifesting variability.
This study shows that correlated trading by gambling-motivated investors generates excess return comovement among stocks with lottery features. Lottery-like stocks comove strongly with one another, and this return comovement is strongest among lottery stocks located in regions where investors exhibit stronger gambling propensity. Looking directly at investor trades, we find that investors with a greater propensity to gamble trade lottery-like stocks more actively and that those trades are more strongly correlated. Finally, we demonstrate that time variation in general gambling enthusiasm and income shocks from fluctuating economic conditions induce a systematic component in investors’ demand for lottery-like stocks.
We report herein the investigation of a leptospirosis outbreak among triathlon competitors on Réunion Island, Indian Ocean. All participants were contacted by phone or email and answered a questionnaire. Detection and molecular characterization of pathogenic Leptospira were conducted in inpatients and in rodents trapped in the vicinity of the event. Of the 160 athletes competing, 101 (63·1%) agreed to participate in the study. Leptospirosis was biologically confirmed in 9/10 suspected cases, either by real-time PCR or by serological tests (MAT or ELISA). The total attack rate, children's attack rate, swimmers' attack rate and the attack rate in adult swimmers were estimated at 8·1% [95% confidence interval (CI) 4·3–14·7], 0%, 12·7% (95% CI 6·8–22·4) and 23·1% (95% CI 12·6–33·8), respectively. Leptospirosis cases reported significantly more wounds [risk ratio (RR) 4·5, 95% CI 1·6–13], wore complete neoprene suits less often (RR 4·3, 95% CI 1·3–14·5) and were more frequently unlicensed (RR 6·6, 95% CI 2·9–14·8). The epidemiological investigation suggested that measures such as wearing complete neoprene suits were effective in protecting swimmers against infection. PCR detection in rats revealed high Leptospira infection rates. Partial sequencing of the 16S gene and serology on both human and animal samples strongly suggest that rats were the main contaminators and were likely at the origin of the infection in humans.
Agitation and aggression are significant problems in acute psychiatric units. There is little consensus on which drug is most effective and safest for sedation of these patients.
To compare the effectiveness and safety of haloperidol v. droperidol for patients with agitation and aggression.
In a masked, randomised controlled trial (ACTRN12611000565943), intramuscular droperidol (10 mg) was compared with intramuscular haloperidol (10 mg) for adult patients with acute behavioural disturbance in a psychiatric intensive care unit. The primary outcome was time to sedation within 120 min. Secondary outcomes were use of additional sedation, adverse events and staff injuries.
From 584 patients, 110 were randomised to haloperidol and 118 to droperidol. Effective sedation occurred in 210 (92%) patients within 120 min. There was no significant difference in median time to sedation: 20 min (interquartile range (IQR) 15–30, range 10–75) for haloperidol v. 25 min (IQR 15–30, range 10–115) for droperidol (P = 0.89). Additional sedation was used more often with haloperidol (13% v. 5%, P = 0.06), but adverse effects were less common with haloperidol (1% v. 5%, P = 0.12). There were 8 staff injuries.
Both haloperidol and droperidol were effective for sedation of patients with acute behavioural disturbance.
Key pathophysiology of sickle cell anaemia includes compensatory erythropoiesis, vascular injury and chronic inflammation, which divert amino acids from tissue deposition for growth/weight gain and muscle formation. We hypothesised that sickle mice maintained on an isoenergetic diet with a high percentage of energy derived from protein (35 %), as opposed to a standard diet with 20 % of energy derived from protein, would improve body composition, bone mass and grip strength. Male Berkeley transgenic sickle mice (S; n 8–12) were fed either 20 % (S20) or 35 % (S35) diets for 3 months. Grip strength (BIOSEB meter) and body composition (dual-energy X-ray absorptiometry scan) were measured. After 3 months, control mice had the highest bone mineral density (BMD) and bone mineral content (BMC) (P < 0·005). S35 mice had the largest increase in grip strength. A two-way ANOVA of change in grip strength (P = 0·043) attributed this difference to genotype (P = 0·025) and a trend in type of diet (P = 0·067). L-Arginine (L-Arg) supplementation of the 20 % diet was explored as a possible mechanism for the improvement obtained with the 35 % diet. Townes transgenic sickle mice (TS; n 6–9) received 0·8, 1·6, 3·2 or 6·4 % L-Arg based on the same protocol and outcome measures used for the S mice. TS mice fed 1·6 % L-Arg for 3 months (TS1·6) had the highest weight gain, BMD, BMC and lean body mass compared with other groups. TS3·2 mice showed significantly more improvement in grip strength than TS0·8 and TS1·6 mice (P < 0·05). In conclusion, the high-protein diet improved body composition and grip strength. Outcomes observed with TS1·6 and TS3·2 mice, respectively, confirm the hypothesis and reveal L-Arg as part of the mechanism.