Despite the availability of remote consultation and the evidence for its effectiveness, its adoption has been relatively limited (Hashiguchi, 2020). In light of COVID-19 social distancing measures, there was an immediate requirement to adopt this technology into routine practice.
Objectives
The objective of this evaluation was to examine clinicians’ experiences of the urgent adoption of digital technology in an NHS provider of mental health and community physical health services.
Methods
From a staff survey (n=234) of experiences of working during a period when there were significant levels of COVID-related restrictions, data were extracted and subjected to thematic analysis by a research team made up of clinicians, academics, and quality improvement specialists.
Results
Five key themes relevant to the urgent adoption of digital technology were identified (figure 1): (1) Availability of staff for patient contact was generally felt to be improved; (2) Quality of contact was reported to be variable (e.g. some respondents reported better rapport with patients, whereas others found remote contact interfered with rapport building); (3) Safeguarding concerns were reported to be more difficult to identify through remote consultation; (4) Contingency plans were recommended for vulnerable patients for whom remote consultation was problematic; (5) Multi-agency working was reported to be strengthened.
Conclusions
The findings from this evaluation allow for an informed approach to future adoption of remote consultation in routine practice.
Gravitational waves from coalescing neutron stars encode information about nuclear matter at extreme densities, inaccessible to laboratory experiments. The late inspiral is influenced by the presence of tides, which depend on the neutron star equation of state. Neutron star mergers are expected to often produce rapidly rotating remnant neutron stars that emit gravitational waves. These will provide clues to the extremely hot post-merger environment. This signature of nuclear matter in gravitational waves contains most information in the 2–4 kHz frequency band, which is outside the most sensitive band of current detectors. We present the design concept and science case for a Neutron Star Extreme Matter Observatory (NEMO): a gravitational-wave interferometer optimised to study nuclear physics with merging neutron stars. The concept uses high circulating laser power, quantum squeezing, and a detector topology specifically designed to achieve the high-frequency sensitivity necessary to probe nuclear matter using gravitational waves. Above 1 kHz, the proposed strain sensitivity is comparable to full third-generation detectors at a fraction of the cost. Such sensitivity increases the expected detection rate of post-merger remnants from approximately one per few decades with two A+ detectors to a few per year, and potentially allows for the first gravitational-wave observations of supernovae, isolated neutron stars, and other exotica.
Emergency Medical Services (EMS) systems have developed protocols for prehospital activation of the cardiac catheterization laboratory for patients with suspected ST-elevation myocardial infarction (STEMI) to decrease first-medical-contact-to-balloon time (FMC2B). The rate of “false positive” prehospital activations is high. In order to decrease this rate and expedite care for patients with true STEMI, the American Heart Association (AHA; Dallas, Texas USA) developed the Mission Lifeline PreAct STEMI algorithm, which was implemented in Los Angeles County (LAC; California USA) in 2015. The hypothesis of this study was that implementation of the PreAct algorithm would increase the positive predictive value (PPV) of prehospital activation.
Methods:
This is an observational pre-/post-study of the effect of the implementation of the PreAct algorithm for patients with suspected STEMI transported to one of five STEMI Receiving Centers (SRCs) within the LAC Regional System. The primary outcome was the PPV of cardiac catheterization laboratory activation for percutaneous coronary intervention (PCI) or coronary artery bypass graft (CABG). The secondary outcome was FMC2B.
Results:
A total of 1,877 patients were analyzed for the primary outcome in the pre-intervention period and 405 patients in the post-intervention period. There was an overall decrease in cardiac catheterization laboratory activations, from 67% in the pre-intervention period to 49% in the post-intervention period (95% CI for the difference, -22% to -14%). The overall rate of cardiac catheterization declined in the post-intervention period compared with the pre-intervention period, from 34% to 30% (95% CI for the difference, -7.6% to 0.4%), but increased for subjects who had activation (48% versus 58%; 95% CI, 4.6%-15.0%). Implementation of the PreAct algorithm was associated with an increase in the PPV of activation for PCI or CABG from 37.9% to 48.6%. The overall odds ratio (OR) associated with the intervention was 1.4 (95% CI, 1.1-1.8). The effect of the intervention was to decrease variability between medical centers. There was no associated change in average FMC2B.
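To make the outcome definitions concrete, the sketch below shows how a positive predictive value and an unadjusted odds ratio are computed from activation counts. The counts are hypothetical placeholders chosen only to reproduce the reported PPVs; they are not the study's data, and the published OR (which may be adjusted) need not match the toy value.

```python
# Hypothetical activation counts (illustration only; not the study's data).
# "True positive" = activation followed by PCI or CABG.
pre_tp, pre_fp = 379, 621     # pre-intervention: PPV = 379/1000 = 37.9%
post_tp, post_fp = 486, 514   # post-intervention: PPV = 486/1000 = 48.6%

def ppv(tp, fp):
    """Positive predictive value: true positives / all activations."""
    return tp / (tp + fp)

# Unadjusted odds ratio for a true-positive activation, post- vs pre-intervention
odds_ratio = (post_tp / post_fp) / (pre_tp / pre_fp)

print(f"PPV pre = {ppv(pre_tp, pre_fp):.1%}, PPV post = {ppv(post_tp, post_fp):.1%}, "
      f"unadjusted OR = {odds_ratio:.2f}")
```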
Conclusions:
The implementation of the PreAct algorithm in the LAC EMS system was associated with an overall increase in the PPV of cardiac catheterization laboratory activation.
Registry-based trials have emerged as a potentially cost-saving study methodology. Early estimates of cost savings, however, conflated the benefits associated with registry utilisation and those associated with other aspects of pragmatic trial designs, which might not all be as broadly applicable. In this study, we sought to build a practical tool that investigators could use across disciplines to estimate the ranges of potential cost differences associated with implementing registry-based trials versus standard clinical trials.
Methods:
We built simulation Markov models to compare unique costs associated with data acquisition, cleaning, and linkage under a registry-based trial design versus a standard clinical trial. We conducted one-way, two-way, and probabilistic sensitivity analyses, varying study characteristics over broad ranges, to determine thresholds at which investigators might optimally select each trial design.
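The study's actual Markov models are not reproduced here, but the sketch below illustrates the general idea of a probabilistic sensitivity analysis for data-related costs under the two designs. All parameter names, ranges, and cost components are assumptions for illustration, not the study's inputs.

```python
import numpy as np

rng = np.random.default_rng(0)
n_sims = 10_000

# Hypothetical parameter ranges (assumptions, not the study's inputs)
n_patients   = rng.integers(200, 5_000, n_sims)     # patients enrolled per trial
fields_total = rng.integers(50, 500, n_sims)        # data elements per patient
frac_in_reg  = rng.uniform(0.2, 0.9, n_sims)        # fraction of elements already in the registry
sec_per_fld  = rng.uniform(3, 60, n_sims)           # manual abstraction time per field (s)
wage_per_hr  = rng.uniform(25, 60, n_sims)          # coordinator cost ($/h)
linkage_cost = rng.uniform(5_000, 50_000, n_sims)   # fixed registry linkage/cleaning cost ($)

def abstraction_cost(fields_per_patient):
    """Cost of manually abstracting the given number of fields for every patient."""
    hours = n_patients * fields_per_patient * sec_per_fld / 3600
    return hours * wage_per_hr

cost_standard = abstraction_cost(fields_total)                            # abstract everything
cost_registry = abstraction_cost(fields_total * (1 - frac_in_reg)) + linkage_cost

savings = cost_standard - cost_registry
print(f"Registry design cheaper in {np.mean(savings > 0):.1%} of simulations; "
      f"median data-cost saving ${np.median(savings):,.0f}")
```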
Results:
Registry-based trials were more cost-effective than standard clinical trials 98.6% of the time. Data-related cost savings ranged from $4,300 to $600,000 with variation in study characteristics. Cost differences were most sensitive to the number of patients in a study, the number of data elements per patient available in a registry, and the speed with which research coordinators could manually abstract data. Registry incorporation resulted in cost savings when as few as 3,768 independent data elements were available and when manual data abstraction took as little as 3.4 seconds per data field.
Conclusions:
Registries offer important resources for investigators. When available, their broad incorporation may help the scientific community reduce the costs of clinical investigation. We offer here a practical tool for investigators to assess potential cost savings.
We investigate experimentally the turbulent flow through a two-dimensional contraction. Using a water tunnel with an active grid we generate turbulence at Taylor microscale Reynolds number $Re_\lambda \sim 250$, which is advected through a 2.5:1 contraction. Volumetric and time-resolved tomographic particle image velocimetry and shake-the-box velocity measurements are used to characterize the evolution of coherent vortical structures at three streamwise locations upstream of and within the contraction. We confirm the conceptual picture of coherent large-scale vortices being stretched and aligned with the mean rate of strain. This alignment of the vortices with the tunnel centreline is stronger compared to the alignment of vorticity with the large-scale strain observed in numerical simulations of homogeneous turbulence. We judge this by the peak probability magnitudes of these alignments. This result is robust and independent of the grid-rotation protocols. On the other hand, while the pointwise vorticity vector also, to a lesser extent, aligns with the mean strain, it principally remains aligned with the intermediate eigenvector of the local instantaneous strain-rate tensor, as is known in other turbulent flows. These results persist when the distance from the grid to the entrance of the contraction is doubled, showing that modest transverse inhomogeneities do not significantly affect these vortical-orientation results.
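For readers unfamiliar with the alignment statistics discussed above, the sketch below shows how, at a single point, the strain-rate tensor, its eigenvectors, and the vorticity-alignment cosines are typically computed from a velocity-gradient tensor. The gradient values are hypothetical; this is not the experimental PIV or shake-the-box data.

```python
import numpy as np

# Hypothetical velocity-gradient tensor A_ij = du_i/dx_j at one point (trace ~ 0 for incompressibility)
A = np.array([[ 0.8, -0.3,  0.1],
              [ 0.2, -0.5,  0.4],
              [-0.1,  0.6, -0.3]])

S = 0.5 * (A + A.T)                               # strain-rate tensor (symmetric part)
omega = np.array([A[2, 1] - A[1, 2],              # vorticity vector, omega = curl(u)
                  A[0, 2] - A[2, 0],
                  A[1, 0] - A[0, 1]])

eigvals, eigvecs = np.linalg.eigh(S)              # ascending eigenvalues; columns are eigenvectors
# eigvecs[:, 0] = compressive, eigvecs[:, 1] = intermediate, eigvecs[:, 2] = extensional direction

cos_align = np.abs(eigvecs.T @ omega) / np.linalg.norm(omega)
print("|cos(angle)| of vorticity with (compressive, intermediate, extensional) eigenvectors:")
print(cos_align)
```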
Early behaviors that differentiate later biomarkers for psychopathology can guide preventive efforts while also facilitating pathophysiological research. We tested whether error-related negativity (ERN) moderates the link between early behavior and later psychopathology in two early childhood phenotypes: behavioral inhibition and irritability. From ages 2 to 7 years, children (n = 291) were assessed longitudinally for behavioral inhibition (BI) and irritability. Behavioral inhibition was assessed via maternal report and behavioral responses to novelty. Childhood irritability was assessed using the Child Behavior Checklist. At age 12, an electroencephalogram (EEG) was recorded while children performed a flanker task to measure ERN, a neural indicator of error monitoring. Clinical assessments of anxiety and irritability were conducted using questionnaires (i.e., Screen for Child Anxiety Related Disorders and Affective Reactivity Index) and clinical interviews. Error monitoring interacted with early BI and early irritability to predict later psychopathology. Among children with high BI, an enhanced ERN predicted greater social anxiety at age 12. In contrast, among children with high childhood irritability, a blunted ERN predicted greater irritability at age 12. These findings converge with previous work and provide novel insight into the specificity of pathways associated with psychopathology.
Using existing data from clinical registries to support clinical trials and other prospective studies has the potential to improve research efficiency. However, little has been reported about staff experiences and lessons learned from implementation of this method in pediatric cardiology.
Objectives:
We describe the process of using existing registry data in the Pediatric Heart Network Residual Lesion Score Study, report stakeholders’ perspectives, and provide recommendations to guide future studies using this methodology.
Methods:
The Residual Lesion Score Study, a 17-site prospective, observational study, piloted the use of existing local surgical registry data (collected for submission to the Society of Thoracic Surgeons-Congenital Heart Surgery Database) to supplement manual data collection. A survey regarding processes and perceptions was administered to study site and data coordinating center staff.
Results:
Survey response rate was 98% (54/55). Overall, 57% perceived that using registry data saved research staff time in the current study, and 74% perceived that it would save time in future studies; 55% noted significant upfront time in developing a methodology for extracting registry data. Survey recommendations included simplifying data extraction processes and tailoring to the needs of the study, understanding registry characteristics to maximise data quality and security, and involving all stakeholders in design and implementation processes.
Conclusions:
Use of existing registry data was perceived to save time and promote efficiency. Consideration must be given to the upfront investment of time and resources needed. Ongoing efforts focussed on automating and centralising data management may aid in further optimising this methodology for future studies.
Despite established clinical associations among major depression (MD), alcohol dependence (AD), and alcohol consumption (AC), the nature of the causal relationship between them is not completely understood. We leveraged genome-wide data from the Psychiatric Genomics Consortium (PGC) and UK Biobank to test for the presence of shared genetic mechanisms and causal relationships among MD, AD, and AC.
Methods
Linkage disequilibrium score regression and Mendelian randomization (MR) were performed using genome-wide data from the PGC (MD: 135 458 cases and 344 901 controls; AD: 10 206 cases and 28 480 controls) and UK Biobank (AC-frequency: 438 308 individuals; AC-quantity: 307 098 individuals).
Results
Positive genetic correlation was observed between MD and AD (rg(MD-AD) = +0.47, P = 6.6 × 10⁻¹⁰). AC-quantity showed positive genetic correlation with both AD (rg(AD-AC quantity) = +0.75, P = 1.8 × 10⁻¹⁴) and MD (rg(MD-AC quantity) = +0.14, P = 2.9 × 10⁻⁷), while there was negative correlation of AC-frequency with MD (rg(MD-AC frequency) = -0.17, P = 1.5 × 10⁻¹⁰) and a non-significant result with AD. MR analyses confirmed the presence of pleiotropy among these four traits. However, the MD-AD results reflect a mediated-pleiotropy mechanism (i.e. causal relationship) with an effect of MD on AD (beta = 0.28, P = 1.29 × 10⁻⁶). There was no evidence for reverse causation.
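As a minimal illustration of the MR step named above (not the study's actual pipeline, instruments, or data), the sketch below computes a fixed-effect inverse-variance-weighted causal estimate from hypothetical per-SNP summary statistics for an exposure (e.g. MD) and an outcome (e.g. AD).

```python
import numpy as np

# Hypothetical summary statistics for a few instrument SNPs (assumptions, not the study's data)
beta_exposure = np.array([0.031, 0.024, 0.045, 0.019, 0.038])  # SNP effects on the exposure
beta_outcome  = np.array([0.010, 0.006, 0.014, 0.004, 0.011])  # SNP effects on the outcome
se_outcome    = np.array([0.004, 0.003, 0.005, 0.003, 0.004])  # SEs of the outcome effects

# Inverse-variance-weighted estimator: weighted average of per-SNP ratio estimates
weights = beta_exposure**2 / se_outcome**2
ratios = beta_outcome / beta_exposure
beta_ivw = np.sum(weights * ratios) / np.sum(weights)
se_ivw = np.sqrt(1.0 / np.sum(weights))

print(f"IVW causal estimate = {beta_ivw:.3f} (SE = {se_ivw:.3f})")
```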
Conclusion
This study supports a causal role for genetic liability of MD on AD based on genetic datasets including thousands of individuals. Understanding mechanisms underlying MD-AD comorbidity addresses important public health concerns and has the potential to facilitate prevention and intervention efforts.
The Numeniini is a tribe of 13 wader species (Scolopacidae, Charadriiformes) of which seven are Near Threatened or globally threatened, including two Critically Endangered. To help inform conservation management and policy responses, we present the results of an expert assessment of the threats that members of this taxonomic group face across migratory flyways. Most threats are increasing in intensity, particularly in non-breeding areas, where habitat loss resulting from residential and commercial development, aquaculture, mining, transport, disturbance, problematic invasive species, pollution and climate change were regarded as having the greatest detrimental impact. Fewer threats (mining, disturbance, problematic native species and climate change) were identified as widely affecting breeding areas. Numeniini populations face the greatest number of non-breeding threats in the East Asian-Australasian Flyway, especially those associated with coastal reclamation; related threats were also identified across the Central and Atlantic Americas, and East Atlantic flyways. Threats on the breeding grounds were greatest in Central and Atlantic Americas, East Atlantic and West Asian flyways. Three priority actions were associated with monitoring and research: to monitor breeding population trends (which for species breeding in remote areas may best be achieved through surveys at key non-breeding sites), to deploy tracking technologies to identify migratory connectivity, and to monitor land-cover change across breeding and non-breeding areas. Two priority actions were focused on conservation and policy responses: to identify and effectively protect key non-breeding sites across all flyways (particularly in the East Asian-Australasian Flyway), and to implement successful conservation interventions at a sufficient scale across human-dominated landscapes for species’ recovery to be achieved. If implemented urgently, these measures in combination have the potential to alter the current population declines of many Numeniini species and provide a template for the conservation of other groups of threatened species.
Volunteer horseradish plants that emerge from root segments remaining after harvest can reduce yields of rotational crops as well as provide a host for pathogens and insects, thus reducing the benefits of crop rotation. POST applications of halosulfuron in corn can be an effective component to improve management of volunteer horseradish, but the replant interval from application to safe planting of commercial horseradish has not been determined. Fall herbicide applications are another possible volunteer horseradish management strategy that can be implemented once crops are harvested. Therefore, field experiments were conducted to evaluate the safe replant interval of horseradish following halosulfuron applications and to determine the efficacy of fall herbicide applications for volunteer horseradish control. Visual estimates of horseradish injury were greatest (85%) in plantings made zero months after halosulfuron applied at two times the approved rate; moreover, for all rates, injury decreased as the time after halosulfuron application increased. No herbicide injury or root biomass reduction occurred on horseradish at any halosulfuron rate when replanting occurred beyond 4 mo after halosulfuron application. Control of volunteer horseradish was 91% or greater for all fall herbicide applications that included 2,4-D. Furthermore, volunteer horseradish shoot density was lowest following combinations of 2,4-D tank-mixed with halosulfuron or rimsulfuron:thifensulfuron (0.2 and 0.4 shoots m⁻², respectively) compared with the nontreated control (5.1 shoots m⁻²). This research demonstrates the effectiveness of both halosulfuron and 2,4-D as components of an integrated management strategy for volunteer horseradish control and indicates that halosulfuron applications do not persist in soil beyond 4 mo at levels that affect subsequent commercial horseradish production.
Few records in the alpine landscape of western North America document the geomorphic and glaciologic response to climate change during the Pleistocene–Holocene transition. While moraines can provide snapshots of glacier extent, high-resolution records of environmental response to the end of the Last Glacial Maximum, Younger Dryas cooling, and subsequent warming into the stable Holocene are rare. We describe the transition from the late Pleistocene to the Holocene using a ~ 17,000-yr sediment record from Swiftcurrent Lake in eastern Glacier National Park, MT, with a focus on the period from ~ 17 to 11 ka. Total organic and inorganic carbon, grain size, and carbon/nitrogen data provide evidence for glacial retreat from the late Pleistocene into the Holocene, with the exception of a well-constrained advance during the Younger Dryas from 12.75 to 11.5 ka. Increased detrital carbonate concentration in Swiftcurrent Lake sediment reflects enhanced glacial erosion and sediment transport, likely a result of a more proximal ice terminus position and a reduction in the number of alpine lakes acting as sediment sinks in the valley.
Molecular imprinting is the process by which molecules are imprinted into the matrix of a material through non-covalent bonding, including hydrogen bonding and van der Waals interactions. In this study hydrogels were imprinted with glaucoma medication with the purpose of creating a reusable ocular drug delivery device with reversible binding sites. The material was synthesized and tested with UV-Vis spectroscopy to determine the concentration of the released drug after twelve hours in distilled water. Modifications were made to the polymer to explore methods required for the proper delivery of the drug over an adequate period of time.
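The abstract does not give calibration details, but converting a UV-Vis absorbance reading into a released-drug concentration is typically done with the Beer-Lambert law, A = εlc. The sketch below is a hypothetical worked example; the molar absorptivity and absorbance values are assumptions, not measurements from the study.

```python
# Beer-Lambert law: A = epsilon * l * c  =>  c = A / (epsilon * l)
# All numbers are hypothetical placeholders, not values from the study.

epsilon = 3.3e3      # molar absorptivity at the drug's absorbance peak (L mol^-1 cm^-1), assumed
path_length = 1.0    # cuvette path length (cm)
absorbance = 0.42    # absorbance of the release medium after twelve hours, assumed

concentration = absorbance / (epsilon * path_length)   # mol/L
print(f"Released drug concentration ≈ {concentration * 1e6:.1f} µmol/L")
```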
Invasive vertebrate species have had a dramatic impact on the unique native ecosystems of both Australia and New Zealand. Some of these species were accidentally introduced, though many were introduced deliberately for a number of reasons: as a food resource, for hunting and trade, as a mode of transportation, as a control tool for other pests, and by acclimatisation societies to remind colonists of home. Regardless of the method of introduction, these invasive species have had major negative impacts on the native flora and fauna, including herbivory, predation, competition, disease, hybridisation and habitat change, and have also affected human health and industry. In both countries the aim is now to prevent establishment of new invasive species and preserve key areas of high biodiversity value through the control or eradication of invasive species.
Introduction
Invasions of vertebrate species into habitats outside their natural range have had major impacts across the globe and particularly in Australia and New Zealand (Simberloff & Rejmánek, 2011). Preventing the arrival of these species is the best protection for native ecosystems, but once introduction and spread have taken place, effective and efficient management of entrenched species is the goal. Sound management decisions rely on detailed information on the invasive population, the type and degree of impacts, and the strategic, science-based application of control.
Behavioral inhibition, a temperament identifiable in infancy, is associated with heightened withdrawal from social encounters. Prior studies raise particular interest in the striatum, which responds uniquely to monetary gains in behaviorally inhibited children followed into adolescence. Although behavioral manifestations of inhibition are expressed primarily in the social domain, it remains unclear whether observed striatal alterations to monetary incentives also extend to social contexts. In the current study, imaging data were acquired from 39 participants (17 males, 22 females; ages 16–18 years) characterized since infancy on measures of behavioral inhibition. A social evaluation task was used to assess neural response to anticipation and receipt of positive and negative feedback from novel peers, classified by participants as being of high or low interest. As with monetary rewards, striatal response patterns differed during both anticipation and receipt of social reward between behaviorally inhibited and noninhibited adolescents. The current results, when combined with prior findings, suggest that early-life temperament predicts altered striatal response in both social and nonsocial contexts and provide support for continuity between temperament measured in early childhood and neural response to social signals measured in late adolescence and early adulthood.