Registry-based trials have emerged as a potentially cost-saving study methodology. Early estimates of cost savings, however, conflated the benefits of registry utilisation with those of other aspects of pragmatic trial designs, which might not all be as broadly applicable. In this study, we sought to build a practical tool that investigators across disciplines could use to estimate the range of potential cost differences associated with implementing registry-based trials versus standard clinical trials.
We built simulation Markov models to compare unique costs associated with data acquisition, cleaning, and linkage under a registry-based trial design versus a standard clinical trial. We conducted one-way, two-way, and probabilistic sensitivity analyses, varying study characteristics over broad ranges, to determine thresholds at which investigators might optimally select each trial design.
Registry-based trials were more cost-effective than standard clinical trials 98.6% of the time. Data-related cost savings ranged from $4300 to $600,000, varying with study characteristics. Cost differences were most sensitive to the number of patients in a study, the number of data elements per patient available in a registry, and the speed with which research coordinators could manually abstract data. Registry incorporation resulted in cost savings when as few as 3768 independent data elements were available and when manual data abstraction took as little as 3.4 seconds per data field.
Registries offer important resources for investigators. When available, their broad incorporation may help the scientific community reduce the costs of clinical investigation. We offer here a practical tool for investigators to assess potential costs savings.
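The cost comparison described above can be sketched as a toy Monte Carlo model. Everything below — the wage, registry access fee, coverage fraction, and parameter ranges — is a hypothetical placeholder for illustration, not a value or result from the study:

```python
import random

def trial_data_cost(n_patients, fields_per_patient, seconds_per_field,
                    wage_per_hour=30.0, registry=False,
                    registry_fee=50_000.0, registry_coverage=0.8):
    """Rough per-study data-handling cost (all parameters hypothetical).

    A registry covers some fraction of data elements, so coordinators
    only abstract the remainder by hand; a flat access fee applies.
    """
    manual_fraction = (1 - registry_coverage) if registry else 1.0
    abstraction_hours = (n_patients * fields_per_patient
                         * manual_fraction * seconds_per_field / 3600)
    cost = abstraction_hours * wage_per_hour
    if registry:
        cost += registry_fee
    return cost

def simulate(runs=10_000, seed=1):
    """Fraction of simulated studies in which the registry design is cheaper."""
    rng = random.Random(seed)
    cheaper = 0
    for _ in range(runs):
        n = rng.randint(100, 5_000)      # patients enrolled
        fields = rng.randint(50, 500)    # data elements per patient
        secs = rng.uniform(3, 60)        # manual abstraction time per field
        cost_registry = trial_data_cost(n, fields, secs, registry=True)
        cost_standard = trial_data_cost(n, fields, secs, registry=False)
        if cost_registry < cost_standard:
            cheaper += 1
    return cheaper / runs
```

Varying the parameter ranges in `simulate` mimics the sensitivity analyses: the break-even point moves with patient count, fields per patient, and abstraction speed, which is exactly why those three inputs dominate the cost difference.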
First episode psychosis (FEP) patients who use cannabis experience more frequent psychotic and euphoric intoxication experiences compared to controls. It is not clear whether this is consequent to patients being more vulnerable to the effects of cannabis use or to their heavier pattern of use. We aimed to determine whether extent of use predicted psychotic-like and euphoric intoxication experiences in patients and controls and whether this differs between groups.
We analysed data on patients who had ever used cannabis (n = 655) and controls who had ever used cannabis (n = 654) across 15 sites from six countries in the EU-GEI study (2010–2015). We used multiple regression to model predictors of cannabis-induced experiences and to determine if there was an interaction between caseness and extent of use.
Caseness, frequency of cannabis use and money spent on cannabis predicted psychotic-like and euphoric experiences (p ⩽ 0.001). For psychotic-like experiences (PEs) there was a significant interaction for caseness × frequency of use (p < 0.001) and caseness × money spent on cannabis (p = 0.001) such that FEP patients had increased experiences at increased levels of use compared to controls. There was no significant interaction for euphoric experiences (p > 0.5).
FEP patients are particularly sensitive to increased psychotic-like, but not euphoric, experiences at higher levels of cannabis use compared to controls. This suggests a specific psychotomimetic response in FEP patients related to heavy cannabis use. Clinicians should enquire about cannabis-related PEs and advise that lower levels of cannabis use are associated with less frequent PEs.
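The caseness × extent-of-use interaction test described above has the following general shape. This is a minimal ordinary-least-squares sketch on synthetic data; the variable names, effect sizes, and sample are invented for illustration and are not the study's estimates:

```python
import numpy as np

def fit_interaction_model(case, freq, outcome):
    """OLS fit of: outcome ~ intercept + case + freq + case:freq.

    Returns the coefficient on the interaction term; a positive value
    means the slope of outcome on use frequency is steeper for cases.
    """
    X = np.column_stack([np.ones_like(freq), case, freq, case * freq])
    beta, *_ = np.linalg.lstsq(X, outcome, rcond=None)
    return beta[3]

# Synthetic data: cases gain an extra 0.5 units of 'experience score'
# per unit of use frequency (a made-up interaction effect).
rng = np.random.default_rng(0)
n = 2000
case = rng.integers(0, 2, n).astype(float)
freq = rng.uniform(0, 10, n)
outcome = (1.0 + 0.3 * case + 0.2 * freq
           + 0.5 * case * freq + rng.normal(0, 1, n))
print(round(fit_interaction_model(case, freq, outcome), 2))  # ≈ 0.5
```

A non-zero interaction coefficient is what licenses the conclusion that patients' experiences increase faster with level of use than controls' do, rather than the two groups simply differing in overall level.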
This study presents two years of characterization of a warm temperate rhodolith bed in order to analyse how environmental changes influence its community ecology. The biomass of rhodoliths and associated species was analysed during this period, and in situ experiments were conducted to evaluate the primary production, calcification and respiration of the dominant species of rhodoliths and epiphytes. The highest total biomass of rhodoliths occurred during austral winter. Lithothamnion crispatum was the most abundant rhodolith species in austral summer. Epiphytic macroalgae occurred only in January 2015, with Padina gymnospora being the most abundant. Among the associated fauna, the biomass of Mollusca increased from February 2015 to February 2016. Population densities of key reef fish species inside and around the rhodolith beds showed significant variations over time: the densities of groupers (carnivores/piscivores) increased, especially from 2015 to 2016, whereas grunts (macroinvertebrate feeders) showed a modest decrease from 2014 to 2016. Primary production and calcification of L. crispatum were higher under enhanced irradiance, yet decreased in the presence of P. gymnospora. Community structure and physiological responses can be explained by the interaction of abiotic and biotic factors, which are driven by environmental changes over time. The biomass changes indicate that herbivores may play a role in limiting the growth of epiphytes, which benefits the rhodoliths by decreasing competition with fleshy algae for environmental resources.
The ‘jumping to conclusions’ (JTC) bias is associated with both psychosis and general cognition, but the relationship between them is unclear. In this study, we set out to clarify the relationships between the JTC bias, IQ, psychosis, and polygenic liability to schizophrenia and IQ.
A total of 817 first episode psychosis patients and 1294 population-based controls completed assessments of general intelligence (IQ), and JTC, and provided blood or saliva samples from which we extracted DNA and computed polygenic risk scores for IQ and schizophrenia.
The estimated proportion of the total effect of case/control differences on JTC mediated by IQ was 79%. Schizophrenia polygenic risk score was non-significantly associated with a higher number of beads drawn (B = 0.47, 95% CI −0.21 to 1.16, p = 0.17), whereas IQ PRS significantly predicted the number of beads drawn (B = 0.51, 95% CI 0.25–0.76, p < 0.001) and was thus associated with reduced JTC bias. The JTC bias was more strongly associated with higher levels of psychotic-like experiences (PLEs) in controls, including after controlling for IQ (B = −1.7, 95% CI −2.8 to −0.5, p = 0.006), but was not related to delusions in patients.
Our findings suggest that the JTC reasoning bias in psychosis might not be a specific cognitive deficit but rather a manifestation, or consequence, of general cognitive impairment, whereas in the general population the JTC bias is related to PLEs independently of IQ. This work has the potential to inform interventions targeting cognitive biases in early psychosis.
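A proportion-mediated figure like the 79% reported above is typically computed with the difference method, comparing the total exposure effect to the direct effect that remains once the mediator enters the model. A minimal sketch with made-up coefficients (not the study's estimates):

```python
def proportion_mediated(total_effect, direct_effect):
    """Share of the total exposure effect carried through the mediator,
    using the difference-method estimate: (total - direct) / total."""
    return (total_effect - direct_effect) / total_effect

# Hypothetical example: a case/control difference in beads drawn of -1.0
# overall that shrinks to -0.21 once IQ is in the model -> 79% mediated.
print(round(proportion_mediated(-1.0, -0.21), 2))  # 0.79
```

The same arithmetic applies regardless of sign: what matters is how much of the total effect disappears when the mediator is adjusted for.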
Daily use of high-potency cannabis has been reported to carry a high risk for developing a psychotic disorder. However, the evidence is mixed on whether any pattern of cannabis use is associated with a particular symptomatology in first-episode psychosis (FEP) patients.
We analysed data from 901 FEP patients and 1235 controls recruited across six countries, as part of the European Network of National Schizophrenia Networks Studying Gene-Environment Interactions (EU-GEI) study. We used item response modelling to estimate two bifactor models, which included general and specific dimensions of psychotic symptoms in patients and psychotic experiences in controls. The associations between these dimensions and cannabis use were evaluated using linear mixed-effects models analyses.
In patients, there was a linear relationship between the positive symptom dimension and the extent of lifetime exposure to cannabis, with daily users of high-potency cannabis having the highest score (B = 0.35; 95% CI 0.14–0.56). Moreover, negative symptoms were more common among patients who never used cannabis compared with those with any pattern of use (B = −0.22; 95% CI −0.37 to −0.07). In controls, psychotic experiences were associated with current use of cannabis but not with the extent of lifetime use. Neither patients nor controls presented differences in depressive dimension related to cannabis use.
Our findings provide the first large-scale evidence that FEP patients with a history of daily use of high-potency cannabis present with more positive and fewer negative symptoms compared with those who never used cannabis or used low-potency types.
Ethnic minority groups in Western countries face an increased risk of psychotic disorders. Causes of this long-standing public health inequality remain poorly understood. We investigated whether social disadvantage, linguistic distance and discrimination contributed to these patterns.
We used case–control data from the EUropean network of national schizophrenia networks studying Gene-Environment Interactions (EU-GEI) study, carried out in 16 centres in six countries. We recruited 1130 cases and 1497 population-based controls. Our main outcome measure was first-episode ICD-10 psychotic disorder (F20–F33), and exposures were ethnicity (white majority, black, mixed, Asian, North-African, white minority and other), generational status, social disadvantage, linguistic distance and discrimination. Age, sex, paternal age, cannabis use, childhood trauma and parental history of psychosis were included as a priori confounders. Exposures and confounders were added sequentially to multivariable logistic models, following multiple imputation for missing data.
Participants from any ethnic minority background had crude excess odds of psychosis [odds ratio (OR) 2.03, 95% confidence interval (CI) 1.69–2.43], which remained after adjustment for confounders (OR 1.61, 95% CI 1.31–1.98). This was progressively attenuated following further adjustment for social disadvantage (OR 1.52, 95% CI 1.22–1.89) and linguistic distance (OR 1.22, 95% CI 0.95–1.57), a pattern mirrored in several specific ethnic groups. Linguistic distance and social disadvantage had stronger effects for first- and later-generation groups, respectively.
Social disadvantage and linguistic distance, two potential markers of sociocultural exclusion, were associated with increased odds of psychotic disorder, and adjusting for these led to equivocal risk between several ethnic minority groups and the white majority.
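A crude odds ratio like the one reported above is the standard 2 × 2 cross-product ratio; a minimal sketch with invented counts chosen to give a value of similar magnitude to the crude excess odds in the results (these are not the study's data):

```python
def odds_ratio(exposed_cases, exposed_controls,
               unexposed_cases, unexposed_controls):
    """Crude odds ratio from a 2x2 table: (a/b) / (c/d)."""
    return ((exposed_cases / exposed_controls)
            / (unexposed_cases / unexposed_controls))

# Hypothetical counts: minority-background cases/controls vs majority
# cases/controls, giving an OR close to 2.
print(round(odds_ratio(400, 300, 730, 1100), 2))  # 2.01
```

The sequential-adjustment pattern in the abstract — the OR shrinking as social disadvantage and linguistic distance enter the model — is the multivariable analogue of this crude calculation, with each added covariate absorbing part of the ethnicity–psychosis association.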
The Murchison Widefield Array (MWA) is an open-access telescope dedicated to studying the low-frequency (80–300 MHz) southern sky. Since beginning operations in mid-2013, the MWA has opened a new observational window in the southern hemisphere, enabling many science areas. The driving science objectives of the original design were to observe 21 cm radiation from the Epoch of Reionisation (EoR), explore the radio time domain, perform Galactic and extragalactic surveys, and monitor solar, heliospheric, and ionospheric phenomena. Altogether, these programs have recorded 20 000 h of observations, producing 146 papers to date. In 2016, the telescope underwent a major upgrade resulting in alternating compact and extended configurations. Other upgrades, including digital back-ends and a rapid-response triggering system, have been developed since the original array was commissioned. In this paper, we review the major results from the prior operation of the MWA and then discuss the new science paths enabled by the improved capabilities. We group these science opportunities by the four original science themes but also include ideas for directions outside these categories.
Late Holocene sediment deposits in Pine Island Bay, West Antarctica, are hypothesized to be linked to intensive meltwater drainage during the retreat of the paleo-Pine Island Ice Stream after the Last Glacial Maximum. The uppermost sediment units show an abrupt transition from ice-proximal debris to a draped silt during the late Holocene, which is interpreted to coincide with rapid deglaciation. The small scale and fine sorting of the upper unit could be attributed to origins in subglacial meltwater; however, the thickness and deposition rate of this unit imply punctuated, rather than continuous, deposition. This, combined with the deposit's location seaward of large bedrock basins, has led to the interpretation of this unit as the result of subglacial lake outbursts from these basins. However, the fine-scale sorting of the silt unit is problematic for this energetic interpretation, which should mobilize and deposit a wider range of sediment sizes. To resolve this discrepancy, we present an alternative mechanism in which the silt was sorted by a distributed subglacial water system, stored in bedrock basins far inland of the grounding line, and subsequently eroded at higher flow speeds during retreat. We demonstrate that this mechanism is physically plausible given the subglacial conditions during the late Holocene. We hypothesize that similar silt units observed elsewhere in Antarctica downstream of bedrock basins could be the result of the same mechanism.
Laser-Induced Breakdown Spectroscopy (LIBS) is the remote elemental analysis technique used by the ChemCam instrument on the Curiosity rover. LIBS involves remotely ablating material from rocks and soils with a focused high-energy laser, which generates an optically excited plasma from which the elements in the rock or soil sample are quantitatively determined. The LIBS technique offers many advantages for remote chemical analysis. LIBS provides very rapid analyses without the need for any sample preparation. LIBS is capable of detecting all elements present above the detection limits independent of the atomic mass. LIBS quantitative analysis continues to evolve and produce accurate compositions with decreasing uncertainties. Furthermore, the matrix effects that tend to complicate most elemental analysis techniques like LIBS are increasingly exploited to extract more sample details. The focus of this chapter is to describe the current state of LIBS chemical analysis for remote planetary science.
The Murchison Widefield Array (MWA) is an electronically steered low-frequency (<300 MHz) radio interferometer with a ‘slew’ time of less than 8 s. Low-frequency (∼100 MHz) radio telescopes are ideally suited for rapid-response follow-up of transients due to their large field of view, the inverted spectrum of coherent emission, and the fact that the dispersion delay between a 1 GHz and a 100 MHz pulse is of the order of 1–10 min for dispersion measures of 100–2000 pc/cm3. The MWA has previously been used to provide fast follow-up for transient events, including gamma-ray bursts (GRBs), fast radio bursts (FRBs), and gravitational waves, using systems that respond to Gamma-ray Coordinates Network (GCN) packet-based notifications. We describe a system for automatically triggering MWA observations of such events, based on Virtual Observatory Event standard triggers, which is more flexible, capable, and accurate than previous systems. It can respond to external multi-messenger triggers, which makes it well suited to searching for prompt coherent radio emission from GRBs, studying FRBs and gravitational waves, single-pulse studies of pulsars, and rapid follow-up of high-energy superflares from flare stars. The new triggering system can trigger observations in both the regular correlator mode (limited to ≥0.5 s integrations) and the Voltage Capture System (VCS; 0.1 ms integration) of the MWA, and it represents a new mode of operation for the MWA. The upgraded standard-correlator triggering capability has been in use since MWA observing semester 2018B (July–December 2018), and the VCS and buffered-mode triggers will become available for observing in a future semester.
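The quoted 1–10 min delay follows from the standard cold-plasma dispersion relation, Δt ≈ 4.149 ms × DM × [(ν_lo/GHz)⁻² − (ν_hi/GHz)⁻²]. A quick numerical check for the frequency pair and DM range given above:

```python
def dispersion_delay_s(dm, f_lo_ghz, f_hi_ghz):
    """Arrival-time delay in seconds of the low-frequency pulse relative
    to the high-frequency one, for dispersion measure dm in pc/cm^3.
    Uses the standard dispersion constant, 4.149 ms GHz^2 cm^3 / pc."""
    return 4.149e-3 * dm * (f_lo_ghz**-2 - f_hi_ghz**-2)

# Delay between a 1 GHz and a 100 MHz pulse for the DM range quoted above:
print(dispersion_delay_s(100, 0.1, 1.0) / 60)   # ~0.7 min at DM = 100
print(dispersion_delay_s(2000, 0.1, 1.0) / 60)  # ~13.7 min at DM = 2000
```

The delays span roughly 0.7–14 min across DM = 100–2000 pc/cm3, consistent with the order-of-magnitude figure in the text: at these low frequencies, dispersion buys the telescope minutes of warning in which to repoint.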
Stellarator configurations with reactor-relevant energetic particle losses are constructed by simultaneously optimizing for quasisymmetry and an analytically derived metric, Γc, which attempts to align contours of the second adiabatic invariant, J∥, with magnetic surfaces. Results show that with this optimization scheme it is possible to generate quasihelically symmetric equilibria on the scale of ARIES-CS which completely eliminate all collisionless alpha-particle losses within a given normalized radius. We show that the best performance is obtained by reducing losses at the trapped–passing boundary. Energetic particle transport can be improved even when neoclassical transport, as calculated using the metric εeff, is degraded. Several quasihelically symmetric equilibria with different aspect ratios are presented, all with excellent energetic particle confinement.
Ion-temperature-gradient-driven (ITG) turbulence is compared for two quasi-symmetric (QS) stellarator configurations to determine the relationship between linear growth rates and nonlinear heat fluxes. We focus on the quasi-helically symmetric (QHS) stellarator HSX and the quasi-axisymmetric (QAS) stellarator NCSX. In normalized units, HSX exhibits higher growth rates than NCSX, while heat fluxes in gyro-Bohm units are lower in HSX. These results hold for simulations made with both adiabatic and kinetic electrons. The results show that HSX has a larger number of subdominant modes than NCSX and that eigenmodes are more spatially extended in HSX. We conclude that the consideration of nonlinear physics is necessary to accurately assess the heat flux due to ITG turbulence when comparing QS stellarator equilibria.
Recent years have seen an exponential increase in the variety of healthcare data captured across numerous sources. However, mechanisms to leverage these data sources to support scientific investigation have remained limited. In 2013 the Pediatric Heart Network (PHN), funded by the National Heart, Lung, and Blood Institute, developed the Integrated CARdiac Data and Outcomes (iCARD) Collaborative with the goals of leveraging available data sources to aid in efficiently planning and conducting PHN studies; supporting integration of PHN data with other sources to foster novel research otherwise not possible; and mentoring young investigators in these areas. This review describes lessons learned through the development of iCARD, initial efforts and scientific output, challenges, and future directions. This information can aid in the use and optimisation of data integration methodologies across other research networks and organisations.
We have observed the G23 field of the Galaxy And Mass Assembly (GAMA) survey using the Australian Square Kilometre Array Pathfinder (ASKAP) in its commissioning phase to validate the performance of the telescope and to characterise the detected galaxy populations. This observation covers ~48 deg2 with a synthesised beam of 32.7 arcsec by 17.8 arcsec at 936 MHz, and ~39 deg2 with a synthesised beam of 15.8 arcsec by 12.0 arcsec at 1320 MHz. At both frequencies, the root-mean-square (r.m.s.) noise is ~0.1 mJy/beam. We combine these radio observations with the GAMA galaxy data, which include spectroscopy of galaxies that are i-band selected with a magnitude limit of 19.2. Wide-field Infrared Survey Explorer (WISE) infrared (IR) photometry is used to determine which galaxies host an active galactic nucleus (AGN). In properties including source counts, mass distributions, and the IR versus radio luminosity relation, the ASKAP-detected radio sources behave as expected. Radio galaxies have higher stellar mass and luminosity in IR, optical, and UV than other galaxies. We apply optical and IR AGN diagnostics and find that they disagree for ~30% of the galaxies in our sample. We suggest possible causes for the disagreement. Some cases can be explained by optical extinction of the AGN, but for more than half of the cases we do not find a clear explanation. Radio sources are more likely (~6%) to have an AGN than radio-quiet galaxies (~1%), but the majority of AGN are not detected in radio at this sensitivity.
Starting in 2016, we initiated a pilot tele-antibiotic stewardship program at 2 rural Veterans Affairs medical centers (VAMCs). Antibiotic days of therapy decreased significantly (P < .05) in the acute and long-term care units at both intervention sites, suggesting that tele-stewardship can effectively support antibiotic stewardship practices in rural VAMCs.
To ascertain opinions regarding etiology and preventability of hospital-onset bacteremia and fungemia (HOB) and perspectives on HOB as a potential outcome measure reflecting quality of infection prevention and hospital care.
Hospital epidemiologists and infection preventionist members of the Society for Healthcare Epidemiology of America (SHEA) Research Network.
A web-based, multiple-choice survey was administered via the SHEA Research Network to 133 hospitals.
A total of 89 surveys were completed (67% response rate). Overall, 60% of respondents defined HOB as a positive blood culture on or after hospital day 3. Central-line–associated bloodstream infections (CLABSIs) and intra-abdominal infections were perceived as the most frequent etiologies. Moreover, 61% thought that most HOB events are preventable, and 54% viewed HOB as a measure reflecting a hospital’s quality of care. Also, 29% of respondents’ hospitals already collect HOB data for internal purposes. Given a choice to publicly report CLABSIs and/or HOB, 57% favored reporting HOB either alone (22%) or in addition to CLABSI (35%), and 34% favored CLABSI alone.
Among the majority of SHEA Research Network respondents, HOB is perceived as preventable, reflective of quality of care, and potentially acceptable as a publicly reported quality metric. Further studies on HOB are needed, including validation as a quality measure, assessment of risk adjustment, and formation of evidence-based bundles and toolkits to facilitate measurement and improvement of HOB rates.
We evaluated whether a diagnostic stewardship initiative consisting of antimicrobial stewardship program (ASP) preauthorization paired with education could reduce false-positive hospital-onset (HO) Clostridioides difficile infection (CDI) diagnoses.
Single center, quasi-experimental study.
Tertiary academic medical center in Chicago, Illinois.
Adult inpatients were included in the intervention if they were admitted between October 1, 2016, and April 30, 2018, and were eligible for C. difficile preauthorization review. Patients admitted to the stem cell transplant (SCT) unit were not included in the intervention and were therefore considered a contemporaneous noninterventional control group.
The intervention consisted of requiring prescriber attestation that diarrhea met CDI clinical criteria, ASP preauthorization, and verbal clinician feedback. Data were compared for 33 months before and 19 months after implementation. Facility-wide HO-CDI incidence rates (IR) per 10,000 patient days (PD) and standardized infection ratios (SIR) were extracted from hospital infection prevention reports.
During the entire 52-month period, the mean facility-wide HO-CDI-IR was 7.8 per 10,000 PD and the overall SIR was 0.9. The mean ± SD HO-CDI-IR (8.5 ± 2.0 vs 6.5 ± 2.3; P < .001) and SIR (0.97 ± 0.23 vs 0.78 ± 0.26; P = .015) decreased from baseline during the intervention. Segmented regression models identified significant decreases in HO-CDI-IR (Pstep = .06; Ptrend = .008) and SIR (Pstep = .1; Ptrend = .017) trends, concurrent with decreases in oral vancomycin use (Pstep < .001; Ptrend < .001). The HO-CDI-IR within the noninterventional control unit did not change (Pstep = .125; Ptrend = .115).
A multidisciplinary, multifaceted intervention leveraging clinician education and feedback reduced the HO-CDI-IR and the SIR in select populations. Institutions may consider interventions like ours to reduce false-positive C. difficile NAAT tests.
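The incidence-rate unit used in the results above (events per 10,000 patient days) is computed directly from event counts and exposure time; a minimal sketch with made-up counts, not the study's data:

```python
def incidence_rate_per_10k_pd(events, patient_days):
    """Hospital-onset infection incidence rate per 10,000 patient days."""
    return events / patient_days * 10_000

# Hypothetical month: 12 HO-CDI events over 15,000 patient days.
print(round(incidence_rate_per_10k_pd(12, 15_000), 1))  # 8.0
```

Normalising by patient days rather than by admissions is what allows rates to be compared across units and time periods with different censuses, which is why the segmented regression above is run on the IR rather than on raw counts.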