This is a copy of the slides presented at the meeting but not formally written up for the volume.
In order to pursue device applications of magnetoelectric Cr2O3, we have fabricated epitaxial Cr2O3 thin films on (001), (110) and (101) oriented Nb-doped TiO2 by pulsed laser deposition. The Cr2O3 films with different thicknesses (0.3–1 μm) showed extremely smooth surfaces with rms roughness ≈ 0.3 nm (for 10 × 10 μm²) as measured by AFM for all 3 different orientations. The films display robust insulating properties at room temperature with a leakage current density of 8.9 × 10⁻⁶ A/cm² at 10 kV/cm for 300 nm thick films. In order to investigate exchange bias, we fabricated bilayer films of Cr2O3/Co with all 3 orientations. The magnetic properties of the films were measured using SQUID magnetometry and the magneto-optical Kerr effect (MOKE). From the Cr2O3/Co film grown on a (110) oriented TiO2 substrate, we clearly observe an exchange bias of ≈ 13 Oe with a coercive field of 115 Oe upon cooling from 320 K to 30 K in a 1 T magnetic field. The microstructural properties of the bilayers and the effect of electric field on the exchange bias behavior were investigated using TEM, VSM and MOKE. Comparison of exchange bias with BiFeO3 and TbMnO3 multiferroic thin films will also be discussed. This work is supported by the W. M. Keck Foundation, ONR grant Nos. N00014-01-1-0761 and N00014-04-1-0085, and the NSF under grants DMR-00-94265 (CAREER), DMR-00-0231291, 0095166, and MRSEC Award No. DMR-00-0520471. We acknowledge use of the Nanoscale Imaging, Spectroscopy, and Properties (NISP) Laboratory for TEM characterization.
Introduction: Oxygen is commonly administered to prehospital patients presenting with acute myocardial infarction (AMI). We conducted a systematic review to determine if oxygen administration in AMI impacts patient outcomes. Methods: We conducted a systematic search using MeSH terms and keywords in Medline, Embase, Cochrane Database of Systematic Reviews, Cochrane Central, clinicaltrials.gov and ISRCTN for relevant randomized controlled trials and observational studies comparing oxygen administration and no oxygen administration. The outcomes of interest were: mortality (≤30 days, in-hospital, and intermediate 2-11 months), infarct size, and major adverse cardiac events (MACE). Risk of Bias assessments were performed and GRADE methodology was employed to assess quality and overall confidence in the effect estimate. A meta-analysis was performed using RevMan 5 software. Results: Our search yielded 1192 citations, of which 48 studies were reviewed as full texts and a total of 8 studies were included in the analysis. All evidence was considered low or very low quality. Five studies reported on mortality, finding low quality evidence of no benefit or harm from supplemental oxygen administration. Similarly, no benefit or harm was found in MACE or infarct size (very low quality). Normoxia was defined as oxygen saturation measured via pulse oximetry at ≥90% in one recent study and ≥94% in another. Conclusion: We found low and very low quality evidence that the administration of supplemental oxygen to normoxic patients experiencing AMI provides no clear harm or benefit for mortality or MACE. The evidence on infarct size was inconsistent and warrants further prospective examination.
Introduction: Opioids are routinely administered for analgesia to prehospital patients experiencing chest discomfort from acute myocardial infarction (AMI). We conducted a systematic review to determine if opioid administration impacts patient outcomes. Methods: We conducted a systematic search using MeSH terms and keywords in Medline, Embase, Cochrane Database of Systematic Reviews, Cochrane Central and Clinicaltrials.gov for relevant randomized controlled trials and observational studies comparing opioid administration in AMI patients from 1990 to 2017. The outcomes of interest were: all-cause short-term mortality (≤30 days), major adverse cardiac events (MACE), platelet activity and aggregation, immediate adverse events, infarct size, and analgesia. Included studies were hand searched for additional citations. Risk of Bias assessments were performed and GRADE methodology was employed to assess quality and overall confidence in the effect estimate. Results: Our search yielded 3001 citations, of which 19 studies were reviewed as full texts and a total of 9 studies were included in the analysis. The studies predominantly reported on morphine as the opioid. Five studies reported on mortality (≤30 days), seven on MACE, four on platelet activity and aggregation, two on immediate adverse events, two on infarct size and none on analgesic effect. We found low quality evidence suggesting no benefit or harm in terms of mortality or MACE. However, low quality evidence indicates that opioids increase infarct size. Low quality evidence also shows reduced serum levels of the active metabolites of P2Y12 inhibitors (e.g. clopidogrel and ticagrelor) and increased platelet reactivity in the first several hours after administration, accompanied by an increase in vomiting.
Conclusion: We found low and very low quality evidence that the administration of opioids in STEMI may be adversely related to vomiting and some surrogate outcomes, including increased infarct size, reduced serum P2Y12-inhibitor metabolite levels, and increased platelet activity. We found no clear benefit or harm on patient-oriented clinical outcomes, including mortality.
Influenza and respiratory syncytial virus (RSV) are common causes of respiratory tract infections and place a burden on health services each winter. Systems to describe the timing and intensity of such activity will improve the public health response and deployment of interventions to these pressures. Here we develop early warning and activity intensity thresholds for monitoring influenza and RSV using two novel data sources: general practitioner out-of-hours consultations (GP OOH) and telehealth calls (NHS 111). Moving Epidemic Method (MEM) thresholds were developed for winter 2017–2018. The NHS 111 cold/flu threshold was breached several weeks in advance of other systems. The NHS 111 RSV epidemic threshold was breached in week 41, in advance of RSV laboratory reporting. Combining the use of MEM thresholds with daily monitoring of NHS 111 and GP OOH syndromic surveillance systems provides the potential to alert to threshold breaches in real-time. An advantage of using thresholds across different health systems is the ability to capture a range of healthcare-seeking behaviour, which may reflect differences in disease severity. This study also provides a quantifiable measure of seasonal RSV activity, which contributes to our understanding of RSV activity in advance of the potential introduction of new RSV vaccines.
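As a rough illustration of how a pre-epidemic threshold of the kind used here can be derived, the sketch below takes the highest pre-epidemic weekly values from a set of historical seasons and returns the upper limit of a one-sided confidence interval around their mean. This is a simplified sketch, not the full Moving Epidemic Method (which also models epidemic timing and can use geometric means); the sample data and the 95% z-value are illustrative assumptions, not values from the study.

```python
from math import sqrt
from statistics import mean, stdev

def mem_style_threshold(pre_epidemic_peaks, z=1.96):
    """Simplified MEM-style pre-epidemic threshold: the upper limit of a
    one-sided confidence interval around the mean of the highest
    pre-epidemic values taken from historical seasons."""
    m = mean(pre_epidemic_peaks)
    s = stdev(pre_epidemic_peaks)              # sample standard deviation
    return m + z * s / sqrt(len(pre_epidemic_peaks))

# Illustrative weekly consultation rates (made-up numbers, one or more
# peak pre-epidemic values per historical season):
peaks = [12, 15, 11, 14, 13, 16, 12, 15, 14, 13]
threshold = mem_style_threshold(peaks)         # ~14.48 for these data
```

In operation, a season's weekly surveillance value breaching this threshold would trigger the "epidemic started" alert described above.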
For artificial agents trading off exploration (food seeking) against (short-term) exploitation (consumption), our experiments suggest that uncertainty (interpreted information-theoretically) magnifies food seeking. In more uncertain environments, with food distributed uniformly at random, exploration appears to be beneficial. In contrast, in biased (less uncertain) environments, with food concentrated in only one part, exploitation appears to be more advantageous. Agents also appear to do better in biased environments.
Breakthrough Listen is a 10-yr initiative to search for signatures of technologies created by extraterrestrial civilisations at radio and optical wavelengths. Here, we detail the digital data recording system deployed for Breakthrough Listen observations at the 64-m aperture CSIRO Parkes Telescope in New South Wales, Australia. The recording system currently implements two modes: a dual-polarisation, 1.125-GHz bandwidth mode for single-beam observations, and a 26-input, 308-MHz bandwidth mode for the 21-cm multibeam receiver. The system is also designed to support a 3-GHz single-beam mode for the forthcoming Parkes ultra-wideband feed. In this paper, we present details of the system architecture, provide an overview of hardware and software, and present initial performance results.
We study the chromatic number of the curve graph of a surface. We show how the chromatic number of the graph of separating curves grows with the Euler characteristic of the surface [growth-rate formula lost in extraction]. We also show that the graph of curves representing a fixed nonzero homology class is uniquely colorable, with the number of colors equal to its clique number. Together, these results lead to the best known bounds on the chromatic number of the curve graph. We also study variations for arc graphs and obtain exact results for surfaces of low complexity. Our investigation leads to connections with Kneser graphs, the Johnson homomorphism, and hyperbolic geometry.
During the 2009 influenza pandemic, a rapid assessment of disease severity was a challenge as a significant proportion of cases did not seek medical care; care-seeking behaviour changed and the proportion asymptomatic was unknown. A random-digit-dialling telephone survey was undertaken during the 2011/12 winter season in England and Wales to address the feasibility of answering these questions. A proportional quota sampling strategy was employed based on gender, age group, geographical location, employment status and level of education. Households were recruited pre-season and re-contacted immediately following peak seasonal influenza activity. The pre-peak survey was undertaken in October 2011 with 1061 individuals recruited and the post-peak telephone survey in March 2012. Eight hundred and thirty-four of the 1061 (78.6%) participants were successfully re-contacted. Their demographic characteristics compared well to national census data. In total, 8.4% of participants self-reported an influenza-like illness (ILI) in the previous 2 weeks, with 3.2% conforming to the World Health Organization (WHO) ILI case definition. In total, 29.6% of the cases reported consulting their general practitioner. Of the 1061 participants, 54.1% agreed to be re-contacted about providing biological samples. A population-based cohort was successfully recruited and followed up. Longitudinal survey methodology provides a practical tool to assess disease severity during future pandemics.
In 2016, imported Zika virus (ZIKV) infections and the presence of a potentially competent mosquito vector (Aedes albopictus) implied that ZIKV transmission in New York City (NYC) was possible. The NYC Department of Health and Mental Hygiene developed contingency plans for a urosurvey to rule out ongoing local transmission as quickly as possible if a locally acquired case of confirmed ZIKV infection was suspected. We identified tools to (1) rapidly estimate the population living in any given 150-m radius (i.e. within the typical flight distance of an Aedes mosquito) and (2) calculate the sample size needed to test and rule out further local transmission. As we expected near-zero ZIKV prevalence, methods relying on the normal approximation to the binomial distribution were inappropriate. Instead, we assumed a hypergeometric distribution, 10 missed cases at maximum, a urine assay sensitivity of 92.6% and 100% specificity. Three suspected example risk areas were evaluated with estimated population sizes of 479–4453, corresponding to a minimum of 133–1244 urine samples. This planning exercise improved our capacity for ruling out local transmission of an emerging infection in a dense, urban environment where all residents in a suspected risk area cannot be feasibly sampled.
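The sample-size logic described here can be sketched in a few lines: under a hypergeometric model with at most 10 undetected cases, 92.6% assay sensitivity and 100% specificity, find the smallest sample that detects at least one case with high probability. The abstract does not state the target detection probability, so the 95% used below is an assumption, chosen because it reproduces the reported minimum of 133 samples for the smallest area (population 479).

```python
from math import comb

def p_miss_all(N, K, n, sens):
    """Probability that testing a random sample of n people from a
    population of N containing K infected yields zero positive results,
    given assay sensitivity `sens` (specificity taken as 100%)."""
    total = comb(N, n)
    return sum(
        comb(K, k) * comb(N - K, n - k) / total * (1.0 - sens) ** k
        for k in range(min(K, n) + 1)
    )

def min_sample_size(N, K=10, sens=0.926, power=0.95):
    """Smallest sample size n that detects at least one of the K assumed
    cases with probability >= power."""
    for n in range(1, N + 1):
        if p_miss_all(N, K, n, sens) <= 1.0 - power:
            return n
    return N

print(min_sample_size(479))  # 133, matching the smallest example area
```

Note the exact hypergeometric treatment matters here: at a design prevalence of 10 cases in a population of a few hundred, sampling without replacement is far from the binomial regime the normal approximation assumes.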
To integrate electronic clinical decision support tools into clinical practice and to evaluate the impact on indwelling urinary catheter (IUC) use and catheter-associated urinary tract infections (CAUTIs).
Design, Setting, and Participants
This 4-phase observational study included all inpatients at a multicampus, academic medical center between 2011 and 2015.
Phase 1 comprised best practices training and standardization of electronic documentation. Phase 2 comprised real-time electronic tracking of IUC duration. In phase 3, a triggered alert reminded clinicians of IUC duration. In phase 4, a new IUC order (1) introduced automated order expiration and (2) required consideration of alternatives and selection of an appropriate indication.
Overall, 2,121 CAUTIs, 179,070 new catheters, 643,055 catheter days, and 2,186 reinsertions occurred in 3.85 million hospitalized patient days during the study period. The CAUTI rate per 10,000 patient days decreased incrementally in each phase from 9.06 in phase 1 to 1.65 in phase 4 (relative risk [RR], 0.182; 95% confidence interval [CI], 0.153–0.216; P<.001). New catheters per 1,000 patient days declined from 53.4 in phase 1 to 39.5 in phase 4 (RR, 0.740; 95% CI, 0.730; P<.001), and catheter days per 1,000 patient days decreased from 194.5 in phase 1 to 140.7 in phase 4 (RR, 0.723; 95% CI, 0.719–0.728; P<.001). The reinsertion rate declined from 3.66% in phase 1 to 3.25% in phase 4 (RR, 0.894; 95% CI, 0.834–0.959; P=.0017).
The phased introduction of decision support tools was associated with progressive declines in new catheters, total catheter days, and CAUTIs. Clinical decision support tools offer a viable and scalable intervention to target hospital-wide IUC use and hold promise for other quality improvement initiatives.
Significant increases in excess all-cause mortality, particularly in the elderly, were observed during the winter of 2014/15 in England. With influenza A(H3N2) the dominant circulating influenza A subtype, this paper determines the contribution of influenza to this excess, controlling for weather. A standardised multivariable Poisson regression model was employed with weekly all-cause deaths as the dependent variable for the period 2008–2015. Adjusting for extreme temperature, a total of 26 542 (95% CI 25 301–27 804) deaths in those aged 65+ years and 1942 (95% CI 1834–2052) in 15–64-year-olds were associated with influenza from week 40, 2014 to week 20, 2015. This is compatible with the circulation of influenza A(H3N2). It is the largest estimated number of influenza-related deaths in England since prior to 2008/09. The findings highlight the potential health impact of influenza and the important role of the annual influenza vaccination programme in protecting the population, including the elderly, who are vulnerable to severe outcomes.
Lichens are one of the common dominant biota in biological soil crusts (biocrusts), a community that is one of the largest in extent in the world. Here we present a summary of the main features of the lifestyle of soil crust lichens, emphasizing their habitat, ecophysiology and versatility. The soil crust is exposed to full light, often to high temperatures and has an additional water source, the soil beneath the lichens. However, despite the open nature of the habitat the lichens are active under shady and cooler conditions and avoid climate extremes of high temperature and light. In temperate and alpine habitats they can also be active for long periods, several months in some cases. They show a mixture of physiological constancy (e.g. similar activity periods and net photosynthetic rates) but also adaptations to the habitat (e.g. the response of net photosynthesis to thallus water content can differ for the same lichen species in Europe and the USA and some species show extensive rhizomorph development). Despite recent increased research, aspects of soil crust ecology, for example under snow, remain little understood.
The discovery of the first electromagnetic counterpart to a gravitational wave signal has generated follow-up observations by over 50 facilities world-wide, ushering in the new era of multi-messenger astronomy. In this paper, we present follow-up observations of the gravitational wave event GW170817 and its electromagnetic counterpart SSS17a/DLT17ck (IAU label AT2017gfo) by 14 Australian telescopes and partner observatories as part of Australian-based and Australian-led research programs. We report early- to late-time multi-wavelength observations, including optical imaging and spectroscopy, mid-infrared imaging, radio imaging, and searches for fast radio bursts. Our optical spectra reveal that the transient source emission cooled from approximately 6 400 K to 2 100 K over a 7-d period and produced no significant optical emission lines. The spectral profiles, cooling rate, and photometric light curves are consistent with the expected outburst and subsequent processes of a binary neutron star merger. Star formation in the host galaxy probably ceased at least a Gyr ago, although there is evidence for a galaxy merger. Binary pulsars with short (100 Myr) decay times are therefore unlikely progenitors, but pulsars like PSR B1534+12 with its 2.7 Gyr coalescence time could produce such a merger. The displacement (~2.2 kpc) of the binary star system from the centre of the main galaxy is not unusual for stars in the host galaxy or stars originating in the merging galaxy, and therefore any constraints on the kick velocity imparted to the progenitor are poor.
Urinary catheters, many of which are placed in the emergency department (ED) setting, are often inappropriate, and they are associated with infectious and noninfectious complications. Although several studies evaluating the effect of interventions have focused on reducing catheter use in the ED setting, the organizational contexts within which these interventions were implemented have not been compared.
A total of 18 hospitals in the Ascension health system (ie, system-based hospitals) and 16 hospitals in the state of Michigan (ie, state-based hospitals led by the Michigan Health and Hospital Association) implemented ED interventions focused on reducing urinary catheter use. Data on urinary catheter placement in the ED, indications for catheter use, and presence of physician order for catheter placement were collected for interventions in both hospital types. Multilevel negative binomial regression was used to compare the system-based versus state-based interventions.
A total of 13,215 patients (889 with catheters) from the system-based intervention were compared to 12,104 patients (718 with catheters) from the state-based intervention. Statistically significant and sustainable reductions in urinary catheter placement (incidence rate ratio, 0.79; P=.02) and improvements in appropriate use of urinary catheters (odds ratio [OR], 1.86; P=.004) in the ED were observed in the system-based intervention, compared to the state-based intervention. Differences by collaborative structure in changes in presence of physician order for urinary catheter placement (OR, 1.14; P=.60) were not observed.
An ED intervention consisting of establishing institutional guidelines for appropriate catheter placement and identifying clinical champions to promote adherence was associated with reducing unnecessary urinary catheter use under a system-based collaborative structure.
The increased use of the MATRICS Consensus Cognitive Battery (MCCB) to investigate cognitive dysfunctions in schizophrenia fostered interest in its sensitivity in the context of family studies. As various measures of the same cognitive domains may have different power to distinguish between unaffected relatives of patients and controls, the relative sensitivity of MCCB tests for relative–control differences has to be established. We compared MCCB scores of 852 outpatients with schizophrenia (SCZ) with those of 342 unaffected relatives (REL) and a normative Italian sample of 774 healthy subjects (HCS). We examined familial aggregation of cognitive impairment by investigating within-family prediction of MCCB scores based on probands’ scores.
Multivariate analysis of variance was used to analyze group differences in adjusted MCCB scores. Weighted least-squares analysis was used to investigate whether probands’ MCCB scores predicted REL neurocognitive performance.
SCZ were significantly impaired on all MCCB domains. REL had intermediate scores between SCZ and HCS, showing a similar pattern of impairment, except for social cognition. Probands' scores significantly predicted REL MCCB scores on all domains except for visual learning.
In a large sample of stable patients with schizophrenia, living in the community, and in their unaffected relatives, MCCB demonstrated sensitivity to cognitive deficits in both groups. Our findings of significant within-family prediction of MCCB scores might reflect disease-related genetic or environmental factors.
Healthy adults (n 30) participated in a placebo-controlled, randomised, double-blinded, cross-over study consisting of two 28 d treatments (β2-1 fructan or maltodextrin; 3×5 g/d) separated by a 14-d washout. Subjects provided 1 d faecal collections at days 0 and 28 of each treatment. The ability of faecal bacteria to metabolise β2-1 fructan was common; eighty-seven species (thirty genera, and four phyla) were isolated using anaerobic medium containing β2-1 fructan as the sole carbohydrate source. β2-1 fructan altered the faecal community as determined through analysis of terminal restriction fragment length polymorphisms and 16S rRNA genes. Supplementation with β2-1 fructan reduced faecal community richness, and two patterns of community change were observed. In most subjects, β2-1 fructan reduced the content of phylotypes aligning within the Bacteroides, whereas increasing those aligning within bifidobacteria, Faecalibacterium and the family Lachnospiraceae. In the remaining subjects, supplementation increased the abundance of Bacteroidetes and to a lesser extent bifidobacteria, accompanied by decreases within the Faecalibacterium and family Lachnospiraceae. β2-1 Fructan had no impact on the metagenome or glycoside hydrolase profiles in faeces from four subjects. Few relationships were found between the faecal bacterial community and various host parameters; Bacteroidetes content correlated with faecal propionate, subjects whose faecal community contained higher Bacteroidetes produced more caproic acid independent of treatment, and subjects having lower faecal Bacteroidetes exhibited increased concentrations of serum lipopolysaccharide and lipopolysaccharide binding protein independent of treatment. We found no evidence to support a defined health benefit for the use of β2-1 fructans in healthy subjects.
Introduction: Many barriers exist to integrating smoking cessation into delivery of lung cancer screening including limited provider time and patient misconceptions.
Aims: To demonstrate that proactive outreach from a telephone counsellor outside of the patient's usual care team is feasible and acceptable to patients.
Methods: Smokers undergoing lung cancer screening were approached for a telephone counselling study. Patients agreeing to participate in the intervention (n = 27) received two telephone counselling sessions. A 30-day follow-up evaluation was conducted, which also included screening participants receiving usual care (n = 56).
Results/Findings: Most (89%) intervention participants reported being satisfied with the proactive calls, and 81% reported the sessions were helpful. Use of behavioural cessation support programs in the intervention group was four times higher (44%) than in the usual care group (11%; relative risk (RR) = 4.1; 95% CI: 1.7 to 9.9), and seven-day abstinence in the intervention group was double (19%) that in the usual care group (7%; RR = 2.6; 95% CI: 0.8 to 8.9).
Conclusions: This practical telephone-based approach, which included risk messages clarifying the continued risks of smoking in the context of screening results, suggests that such messaging can boost utilisation of evidence-based tobacco treatment and self-efficacy, and potentially increase the likelihood of successful quitting.
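The risk ratios and confidence intervals reported in the results can be reproduced with a standard Katz log-method calculation. The raw counts below (12/27, 6/56, 5/27, 4/56) are hypothetical values back-calculated from the reported percentages and group sizes, not figures from the study itself.

```python
from math import exp, log, sqrt

def risk_ratio_ci(a, n1, b, n2, z=1.96):
    """Risk ratio for event counts a/n1 (intervention) vs b/n2 (control),
    with a Katz log-method 95% confidence interval."""
    rr = (a / n1) / (b / n2)
    se = sqrt(1/a - 1/n1 + 1/b - 1/n2)   # standard error of log(RR)
    return rr, exp(log(rr) - z * se), exp(log(rr) + z * se)

# Hypothetical counts back-calculated from the reported percentages:
# behavioural support: 12/27 intervention vs 6/56 usual care
rr, lo, hi = risk_ratio_ci(12, 27, 6, 56)     # ~4.1 (1.7 to 9.9)
# seven-day abstinence: 5/27 intervention vs 4/56 usual care
rr2, lo2, hi2 = risk_ratio_ci(5, 27, 4, 56)   # ~2.6 (0.8 to 8.9)
```

With these assumed counts the sketch recovers both reported intervals, including the abstinence interval crossing 1, consistent with the hedged wording of the conclusion.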