Mass gatherings and high-density activities, such as sporting events, conventions, and theme parks, are consistently ranked among the highest-risk activities given the increased potential for widespread coronavirus disease 2019 (COVID-19) transmission. A more balanced approach to risk management is required, because absolute suppression of risk is unrealistic in all facets of life. Contact tracing remains a limiting factor in achieving such a balance. The use of Bluetooth-enabled pairing devices is proposed to address this challenge. This simple approach, applied in a manner that satisfies privacy and trust concerns, would allow high-risk encounters to be quickly identified, namely those in which participants have spent 15 minutes or more within 6 ft of each other per current guidelines. If an attendee later tests positive for COVID-19 and tracing is required, the event organizer can provide a limited list of potential close contacts rather than an exhaustive list of all attendees. Contact tracers can therefore limit their efforts to this concise group rather than contacting thousands of people or conducting mass media communications. Such a system, if institutionalized, supports risk assurance and safety measures for businesses by demonstrating a commitment to staff and customer protection and by ensuring that high-risk encounters are logged, reinforcing longer-term societal pandemic resilience.
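The close-contact rule stated in the abstract (15 minutes or more within 6 ft) amounts to a simple filter over logged encounters. A minimal illustrative sketch follows; the field names and data shapes are our own assumptions, not part of any described system.

```python
# Illustrative sketch (not the paper's implementation): reduce a log of
# Bluetooth proximity encounters to the "close contacts" an organizer
# would hand to contact tracers. Field names are hypothetical.

def close_contacts(encounters, min_minutes=15, max_distance_ft=6.0):
    """Return sorted attendee IDs whose encounters meet the close-contact rule."""
    return sorted({
        e["attendee_id"]
        for e in encounters
        if e["duration_min"] >= min_minutes and e["distance_ft"] <= max_distance_ft
    })

log = [
    {"attendee_id": "A17", "duration_min": 22, "distance_ft": 4.0},   # close contact
    {"attendee_id": "B42", "duration_min": 9,  "distance_ft": 3.0},   # too brief
    {"attendee_id": "C08", "duration_min": 45, "distance_ft": 12.0},  # too far apart
]
print(close_contacts(log))  # ['A17']
```

The point of the filter is the one emphasized in the abstract: tracers receive only the concise list of qualifying contacts, not the full attendee roster.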
We describe 14 yr of public data from the Parkes Pulsar Timing Array (PPTA), an ongoing project that is producing precise measurements of pulse times of arrival from 26 millisecond pulsars using the 64-m Parkes radio telescope with a cadence of approximately 3 weeks in three observing bands. A comprehensive description of the pulsar observing systems employed at the telescope since 2004 is provided, including the calibration methodology and an analysis of the stability of system components. We attempt to provide a full accounting of the reduction from the raw measured Stokes parameters to pulse times of arrival to aid third parties in reproducing our results. This conversion is encapsulated in a processing pipeline designed to track provenance. Our data products include pulse times of arrival for each of the pulsars along with an initial set of pulsar parameters and noise models. The calibrated pulse profiles and timing template profiles are also available. These products represent almost 21 000 h of recorded data spanning over 14 yr. After accounting for processes that induce time-correlated noise, 22 of the pulsars have weighted root-mean-square timing residuals of
in at least one radio band. The data should allow end users to quickly undertake their own gravitational wave analyses, for example, without having to understand the intricacies of pulsar polarisation calibration or attain a mastery of radio frequency interference mitigation as is required when analysing raw data files.
Vitamin D deficiency has been commonly reported in elite athletes, but the vitamin D status of UK university athletes in different training environments remains unknown. The present study aimed to determine any seasonal changes in vitamin D status among indoor and outdoor athletes, and whether there was any relationship between vitamin D status and indices of physical performance and bone health. A group of forty-seven university athletes (indoor n 22, outdoor n 25) were tested during autumn and spring for serum vitamin D status, bone health and physical performance parameters. Blood samples were analysed for serum 25-hydroxyvitamin D (s-25(OH)D) status. Peak isometric knee extensor torque was assessed using an isokinetic dynamometer, and jump height was assessed using an Optojump. Aerobic capacity was estimated using the Yo-Yo intermittent recovery test. Peripheral quantitative computed tomography scans measured radial bone mineral density. Statistical analyses were performed using appropriate parametric/non-parametric testing depending on the normality of the data. s-25(OH)D fell significantly between autumn (52·8 (sd 22·0) nmol/l) and spring (31·0 (sd 16·5) nmol/l; P < 0·001). In spring, 34 % of participants were considered to be vitamin D deficient (<25 nmol/l) according to the revised 2016 UK guidelines. These data suggest that UK university athletes are at risk of vitamin D deficiency. Thus, further research is warranted to investigate the concomitant effects of low vitamin D status on health and performance outcomes in university athletes residing at northern latitudes.
Chinese privet (Ligustrum sinense Lour.) is a deciduous to evergreen shrub with an expansive nonnative global range. Control costs are often high, so land managers must carefully consider whether the plant’s potential negative effects warrant active management. To help facilitate this decision-making process, we reviewed and synthesized the literature on the potential ecological effects of L. sinense invasion. We also identified research gaps in need of further study. We found ample evidence of negative relationships between L. sinense invasion and native plant communities. While observational studies are not able to confirm whether L. sinense is driving these relationships, experimental evidence suggests that there is a cause–effect relationship. Of particular concern is the possibility that L. sinense could suppress forest regeneration and cause areas to transition from forest to L. sinense–dominated shrublands. Although this outcome would obviously impact a wide variety of wildlife species, empirical evidence of negative effects of L. sinense on wildlife is limited, and some species may actually benefit from the additional cover and foraging opportunities that L. sinense can provide. Further research on the potential effects of L. sinense invasion on large-scale forest structure and wildlife populations is needed. In areas where L. sinense invasion is a concern, evidence suggests early detection and management can mitigate control costs.
The Murchison Widefield Array (MWA) is an open access telescope dedicated to studying the low-frequency (80–300 MHz) southern sky. Since beginning operations in mid-2013, the MWA has opened a new observational window in the southern hemisphere, enabling many science areas. The driving science objectives of the original design were to observe 21 cm radiation from the Epoch of Reionisation (EoR), explore the radio time domain, perform Galactic and extragalactic surveys, and monitor solar, heliospheric, and ionospheric phenomena. Altogether, these observing programs have recorded 20 000 h of data, producing 146 papers to date. In 2016, the telescope underwent a major upgrade resulting in alternating compact and extended configurations. Other upgrades, including digital back-ends and a rapid-response triggering system, have been developed since the original array was commissioned. In this paper, we review the major results from the prior operation of the MWA and then discuss the new science paths enabled by the improved capabilities. We group these science opportunities by the four original science themes but also include ideas for directions outside these categories.
The Minnesota Center for Twin and Family Research (MCTFR) comprises multiple longitudinal, community-representative investigations of twin and adoptive families that focus on psychological adjustment, personality, cognitive ability and brain function, with a special emphasis on substance use and related psychopathology. The MCTFR includes the Minnesota Twin Registry (MTR), a cohort of twins who have completed assessments in middle and older adulthood; the Minnesota Twin Family Study (MTFS) of twins assessed from childhood and adolescence into middle adulthood; the Enrichment Study (ES) of twins oversampled for high risk for substance-use disorders assessed from childhood into young adulthood; the Adolescent Brain (AdBrain) study, a neuroimaging study of adolescent twins; and the Siblings Interaction and Behavior Study (SIBS), a study of adoptive and nonadoptive families assessed from adolescence into young adulthood. Here we provide a brief overview of key features of these established studies and describe new MCTFR investigations that follow up and expand upon existing studies or recruit and assess new samples, including the MTR Study of Relationships, Personality, and Health (MTR-RPH); the Colorado-Minnesota (COMN) Marijuana Study; the Adolescent Brain Cognitive Development (ABCD) study; the Colorado Online Twins (CoTwins) study and the Children of Twins (CoT) study.
Starburst galaxies are often found to be the result of galaxy mergers. As a result, galaxy mergers are often believed to lie above the galaxy main sequence: the tight correlation between stellar mass and star formation rate. Here, we aim to test this claim.
Deep learning techniques are applied to images from the Sloan Digital Sky Survey to provide visual-like classifications for over 340 000 objects between redshifts of 0.005 and 0.1. The aim of this classification is to split the galaxy population into merger and non-merger systems, and we are currently achieving an accuracy of 92.5%. Stellar masses and star formation rates are also estimated using panchromatic data for the entire galaxy population. With these preliminary data, the mergers are placed onto the full galaxy main sequence, where we find that merging systems lie across the entire star formation rate–stellar mass plane.
In benefit-cost analysis, fatality risk reductions are usually valued based on estimates of adults’ willingness to pay for changes in their own risks, regardless of whether the risk reduction accrues to adults or children. This approach reflects the relatively large number of valuation studies that address adults; however, the literature on children is growing. We review these studies, focusing on those that estimate values for both adults and children using a consistent approach to limit the effects of between-study variability. We rely on explicit selection criteria to identify studies that measure reasonably comparable outcomes and are candidates for application to analyses of U.S. policies. The ratio of values for children to values for adults ranges from 0.6 to 2.9; however, most estimates are greater than 1.5. Although some studies suggest that the divergence between child and adult values decreases as the child ages, this finding is not universal. We conclude that analysts should test the sensitivity of their results to the use of higher values for children than adults. Additional empirical research is needed to support more precise estimates of the variation in values by age that can be featured in the primary analysis.
Identifying routes of transmission among hospitalized patients during a healthcare-associated outbreak can be tedious, particularly among patients with complex hospital stays and multiple exposures. Data mining of the electronic health record (EHR) has the potential to rapidly identify common exposures among patients suspected of being part of an outbreak.
We retrospectively analyzed 9 hospital outbreaks that occurred during 2011–2016 and that had previously been characterized both according to transmission route and by molecular characterization of the bacterial isolates. We determined (1) the ability of data mining of the EHR to identify the correct route of transmission, (2) how early the correct route was identified during the timeline of the outbreak, and (3) how many cases in the outbreaks could have been prevented had the system been running in real time.
Correct routes were identified for all outbreaks at the second patient, except for one outbreak involving >1 transmission route, which was detected at the eighth patient. Up to 40 infections (78% of possible preventable infections) or 34 infections (66%) could have been prevented had data mining been implemented in real time, assuming initiation of an effective intervention within 7 or 14 days of identification of the transmission route, respectively.
Data mining of the EHR was accurate for identifying routes of transmission among patients who were part of the outbreak. Prospective validation of this approach using routine whole-genome sequencing and data mining of the EHR for both outbreak detection and route attribution is ongoing.
To identify potential participants for clinical trials, electronic health records (EHRs) are searched at potential sites. As an alternative, we investigated using medical devices used for real-time diagnostic decisions for trial enrollment.
To project cohorts for a trial in acute coronary syndromes (ACS), we used electrocardiograph-based algorithms that identify ACS or ST-elevation myocardial infarction (STEMI) and prompt clinicians to offer patients trial enrollment. We searched six hospitals’ electrocardiograph systems for electrocardiograms (ECGs) meeting the planned trial’s enrollment criterion: ECGs with STEMI or > 75% probability of ACS by the acute cardiac ischemia time-insensitive predictive instrument (ACI-TIPI). We revised the ACI-TIPI regression to require only data available directly from the electrocardiograph, creating the e-ACI-TIPI, which was developed and tested on the same data used for the original ACI-TIPI (development set n = 3,453; test set n = 2,315). We also tested both instruments on data from emergency department electrocardiographs from across the US (n = 8,556). We then used the ACI-TIPI and e-ACI-TIPI to identify potential cohorts for the ACS trial and compared their performance to cohorts identified from EHR data at the hospitals.
Receiver-operating characteristic (ROC) curve areas on the test set were excellent: 0.89 for the ACI-TIPI and 0.84 for the e-ACI-TIPI, as was calibration. On the national electrocardiographic database, ROC areas were 0.78 and 0.69, respectively, with very good calibration. When tested for detection of patients with > 75% ACS probability, both electrocardiograph-based methods identified eligible patients well, and better than did EHRs.
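For readers less familiar with the metric, the ROC curve area reported above equals the probability that a randomly chosen positive case scores higher than a randomly chosen negative case. A minimal pairwise (Mann–Whitney) implementation on toy scores, not study data:

```python
def roc_auc(pos_scores, neg_scores):
    """ROC area via the pairwise formulation: the fraction of
    positive/negative score pairs ranked correctly, counting ties as half."""
    wins = sum(
        1.0 if p > n else 0.5 if p == n else 0.0
        for p in pos_scores
        for n in neg_scores
    )
    return wins / (len(pos_scores) * len(neg_scores))

# Toy example: 3 of the 4 positive/negative pairs are ranked correctly.
print(roc_auc([0.8, 0.4], [0.5, 0.3]))  # 0.75
```

An area of 0.89, as reported for the ACI-TIPI test set, thus means the instrument ranks a true ACS case above a non-case about 89% of the time.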
Using data from medical devices such as electrocardiographs may provide accurate projections of available cohorts for clinical trials.
Glaciers retreating in response to climate warming are progressively exposing primary mineral substrates to surface conditions. As primary production is constrained by nitrogen (N) availability in these emerging ecosystems, improving our understanding of how N accumulates with soil formation is of critical concern. In this study, we quantified how the distribution and speciation of N, as well as rates of free-living biological N fixation (BNF), change along a 2000-year chronosequence of soil development in a High Arctic glacier forefield. Our results show the soil N pool increases with time since exposure and that the rate at which it accumulates is influenced by soil texture. Further, all N increases were organically bound in soils which had been ice-free for 0–50 years. This is indicative of N limitation and should promote BNF. Using the acetylene reduction assay technique, we demonstrated that microbially mediated inputs of N only occurred in soils which had been ice-free for 0 and 3 years, and that potential rates of BNF declined with increased N availability. Thus, BNF only supports N accumulation in young soils. When considering that glacier forefields are projected to become more expansive, this study has implications for understanding how ice-free ecosystems will become productive over time.
A robust biomedical informatics infrastructure is essential for academic health centers engaged in translational research. There are no templates for what such an infrastructure encompasses or how it is funded. An informatics workgroup within the Clinical and Translational Science Awards network conducted an analysis to identify the scope, governance, and funding of this infrastructure. After we identified the essential components of an informatics infrastructure, we surveyed informatics leaders at network institutions about the governance and sustainability of the different components. Results from 42 survey respondents showed significant variations in governance and sustainability; however, some trends also emerged. Core informatics components such as electronic data capture systems, electronic health records data repositories, and related tools had mixed models of funding including fee-for-service, extramural grants, and institutional support. Several key components such as regulatory systems (e.g., electronic Institutional Review Board [IRB] systems, grants, and contracts), security systems, data warehouses, and clinical trials management systems were overwhelmingly supported as institutional infrastructure. The findings highlighted in this report are worth noting for academic health centers and funding agencies involved in planning current and future informatics infrastructure, which provides the foundation for a robust, data-driven clinical and translational research program.
Field identification of ST-elevation myocardial infarction (STEMI) and advanced hospital notification decreases first-medical-contact-to-balloon (FMC2B) time. A recent study in this system found that electrocardiogram (ECG) transmission following a STEMI alert was frequently unsuccessful.
Instituting weekly test ECG transmissions from paramedic units to the hospital would increase successful transmission of ECGs and decrease FMC2B and door-to-balloon (D2B) times.
This was a natural experiment of consecutive patients with field-identified STEMI transported to a single percutaneous coronary intervention (PCI)-capable hospital in a regional STEMI system before and after implementation of scheduled test ECG transmissions. In November 2014, paramedic units began weekly test transmissions. The mobile intensive care nurse (MICN) confirmed the transmission, or if not received, contacted the paramedic unit and the department’s nurse educator to identify and resolve the problem. Per system-wide protocol, paramedics transmit all ECGs with interpretation of STEMI. Receiving hospitals submit patient data to a single registry as part of ongoing system quality improvement. The frequency of successful ECG transmission and time to intervention (FMC2B and D2B times) in the 18 months following implementation was compared to the 10 months prior. Post-implementation, the time the ECG transmission was received was also collected to determine the transmission gap time (time from ECG acquisition to ECG transmission received) and the advanced notification time (time from ECG transmission received to patient arrival).
There were 388 patients with field ECG interpretations of STEMI, 131 pre-intervention and 257 post-intervention. The frequency of successful transmission post-intervention was 73% compared to 64% prior; risk difference (RD)=9%; 95% CI, 1-18%. In the post-intervention period, the median FMC2B time was 79 minutes (inter-quartile range [IQR]=68-102) versus 86 minutes (IQR=71-108) pre-intervention (P=.3) and the median D2B time was 59 minutes (IQR=44-74) versus 60 minutes (IQR=53-88) pre-intervention (P=.2). The median transmission gap was three minutes (IQR=1-8) and median advanced notification time was 16 minutes (IQR=10-25).
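The reported risk difference is simply the gap between the two transmission success proportions. An illustrative recomputation follows; the exact success counts are not given in the abstract, so 188/257 and 84/131 are back-calculated from the reported 73% and 64% and are approximate.

```python
# Illustrative recomputation of the reported risk difference (RD).
# Success counts are back-calculated from the reported percentages,
# not taken directly from the study.
post_success, post_total = 188, 257   # post-intervention successful transmissions
pre_success, pre_total = 84, 131      # pre-intervention successful transmissions

p_post = post_success / post_total    # ≈ 0.73
p_pre = pre_success / pre_total       # ≈ 0.64
rd = p_post - p_pre                   # ≈ 0.09, matching the reported RD = 9%
print(f"RD = {rd:.1%}")
```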
Implementation of weekly test ECG transmissions was associated with improvement in successful real-time transmissions from field to hospital, which provided a median advanced notification time of 16 minutes, but no decrease in FMC2B or D2B times.
The History, Electrocardiogram (ECG), Age, Risk Factors, and Troponin (HEART) score is a decision aid designed to risk stratify emergency department (ED) patients with acute chest pain. It has been validated for ED use, but it has yet to be evaluated in a prehospital setting.
A prehospital modified HEART score can predict major adverse cardiac events (MACE) among undifferentiated chest pain patients transported to the ED.
A retrospective cohort study of patients with chest pain transported by two county-based Emergency Medical Service (EMS) agencies to a tertiary care center was conducted. Adults without ST-elevation myocardial infarction (STEMI) were included. Inter-facility transfers and those without a prehospital 12-lead ECG or an ED troponin measurement were excluded. Modified HEART scores were calculated by study investigators using a standardized data collection tool for each patient. All MACE (death, myocardial infarction [MI], or coronary revascularization) were determined by record review at 30 days. The sensitivity and negative predictive values (NPVs) for MACE at 30 days were calculated.
Over the study period, 794 patients met inclusion criteria. A MACE at 30 days was present in 10.7% (85/794) of patients with 12 deaths (1.5%), 66 MIs (8.3%), and 12 coronary revascularizations without MI (1.5%). The modified HEART score identified 33.2% (264/794) of patients as low risk. Among low-risk patients, 1.9% (5/264) had MACE (two MIs and three revascularizations without MI). The sensitivity and NPV for 30-day MACE was 94.1% (95% CI, 86.8-98.1) and 98.1% (95% CI, 95.6-99.4), respectively.
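The reported sensitivity and negative predictive value follow directly from the counts in the abstract. A quick recomputation, treating "not classified low risk" as a positive test:

```python
# Recompute sensitivity and NPV from the abstract's counts.
total_mace = 85          # patients with MACE at 30 days (85/794)
low_risk = 264           # patients classified low risk by the modified HEART score
mace_in_low_risk = 5     # MACE among low-risk patients (false negatives)

tp = total_mace - mace_in_low_risk   # 80 MACE patients correctly flagged as not low risk
fn = mace_in_low_risk                # 5 MACE patients missed
tn = low_risk - mace_in_low_risk     # 259 low-risk patients without MACE

sensitivity = tp / (tp + fn)  # 80/85  ≈ 0.941
npv = tn / (tn + fn)          # 259/264 ≈ 0.981
print(f"sensitivity = {sensitivity:.1%}, NPV = {npv:.1%}")
```

Both values reproduce the reported point estimates of 94.1% and 98.1%.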
Prehospital modified HEART scores have a high NPV for MACE at 30 days. A study in which prehospital providers prospectively apply this decision aid is warranted.
The discovery of the first electromagnetic counterpart to a gravitational wave signal has generated follow-up observations by over 50 facilities world-wide, ushering in the new era of multi-messenger astronomy. In this paper, we present follow-up observations of the gravitational wave event GW170817 and its electromagnetic counterpart SSS17a/DLT17ck (IAU label AT2017gfo) by 14 Australian telescopes and partner observatories as part of Australian-based and Australian-led research programs. We report early- to late-time multi-wavelength observations, including optical imaging and spectroscopy, mid-infrared imaging, radio imaging, and searches for fast radio bursts. Our optical spectra reveal that the transient source emission cooled from approximately 6 400 K to 2 100 K over a 7-d period and produced no significant optical emission lines. The spectral profiles, cooling rate, and photometric light curves are consistent with the expected outburst and subsequent processes of a binary neutron star merger. Star formation in the host galaxy probably ceased at least a Gyr ago, although there is evidence for a galaxy merger. Binary pulsars with short (100 Myr) decay times are therefore unlikely progenitors, but pulsars like PSR B1534+12 with its 2.7 Gyr coalescence time could produce such a merger. The displacement (~2.2 kpc) of the binary star system from the centre of the main galaxy is not unusual for stars in the host galaxy or stars originating in the merging galaxy, and therefore any constraints on the kick velocity imparted to the progenitor are poor.
An internationally approved and globally used classification scheme for the diagnosis of CHD has long been sought. The International Paediatric and Congenital Cardiac Code (IPCCC), which was produced and has been maintained by the International Society for Nomenclature of Paediatric and Congenital Heart Disease (the International Nomenclature Society), is used widely, but has spawned many “short list” versions that differ in content depending on the user. Thus, efforts to have a uniform identification of patients with CHD using a single up-to-date and coordinated nomenclature system continue to be thwarted, even if a common nomenclature has been used as a basis for composing various “short lists”. In an attempt to solve this problem, the International Nomenclature Society has linked its efforts with those of the World Health Organization to obtain a globally accepted nomenclature tree for CHD within the 11th iteration of the International Classification of Diseases (ICD-11). The International Nomenclature Society has submitted a hierarchical nomenclature tree for CHD to the World Health Organization that is expected to serve increasingly as the “short list” for all communities interested in coding for congenital cardiology. This article reviews the history of the International Classification of Diseases and of the IPCCC, and outlines the process used in developing the ICD-11 congenital cardiac disease diagnostic list and the definitions for each term on the list. An overview of the content of the congenital heart anomaly section of the Foundation Component of ICD-11, published herein in its entirety, is also included. Future plans for the International Nomenclature Society include linking again with the World Health Organization to tackle procedural nomenclature as it relates to cardiac malformations. 
By doing so, the Society will continue its role in standardising nomenclature for CHD across the globe, thereby promoting research and better outcomes for fetuses, children, and adults with congenital heart anomalies.
Fine resolution topographic data derived from methods such as Structure from Motion (SfM) and Multi-View Stereo (MVS) have the potential to provide detailed observations of geomorphological change, but have thus far been limited by the logistical constraints of conducting repeat surveys in the field. Here, we present the results from an automated time-lapse camera array, deployed around an ice-marginal lake on the western margin of the Greenland ice sheet. Fifteen cameras acquired imagery three times per day over a 426-day period, yielding a dataset of ~19 000 images. From these data we derived 18 point clouds of the ice-margin across a range of seasons and successfully identified calving events (ranging from 234 to 1475 m² in area and 815–8725 m³ in volume) induced by ice cliff undercutting at the waterline and the collapse of spalling flakes. Low ambient light levels, locally reflective surfaces and the large survey range hindered analysis of smaller scale ice-margin dynamics. Nevertheless, this study demonstrates that an integrated SfM-MVS and time-lapse approach can be employed to generate long-term 3-D topographic datasets and thus quantify ice-margin dynamics at a fine spatio-temporal scale. This approach provides a template for future studies of geomorphological change.
The Rockefeller Clinical Scholars (KL2) program began in 1976 and transitioned into a 3-year Master’s degree program in 2006 when Rockefeller joined the National Institutes of Health Clinical and Translational Science Award program. The program consists of ∼15 trainees supported by the Clinical and Translational Science Award KL2 award and University funds. It is designed to provide an optimal environment for junior translational investigators to develop team science and leadership skills by designing and performing a human subjects protocol under the supervision of a distinguished senior investigator mentor and a team of content expert educators. This is complemented by a tutorial focused on important translational skills.
Since 2006, 40 Clinical Scholars have graduated from the program and gone on to careers in academia (72%), government service (5%), industry (15%), and private medical practice (3%); 2 (5%) remain in training programs; 39/40 remain in translational research careers, with 23 National Institutes of Health awards totaling $23 million, foundation and philanthropic support of $20.3 million, and foreign government and foundation support of $6 million. They have made wide-ranging scientific discoveries and have endeavored to translate those discoveries into improved human health.
The Rockefeller Clinical Scholars (KL2) program provides one model for translational science training.
Anomalous aortic origin of the coronary arteries is associated with exercise-induced ischaemia, leading some physicians to restrict exercise in patients with this condition. We sought to determine whether exercise restriction was associated with increasing body mass index over time. From 1998 to 2015, 440 patients ⩽30 years old were enrolled into an inception cohort. Exercise-restriction status was documented in 143 patients. Using linear mixed model repeated-measures regression, factors associated with increasing body mass index z-score over time, including exercise restriction and surgical intervention as time-varying covariates, were investigated. The 143 patients attended 558 clinic visits for which exercise-restriction status was recorded. The mean number of clinic visits per patient was 4, and the median duration of follow-up was 1.7 years (interquartile range (IQR) 0.5–4.4). The median age at first clinic visit was 10.3 years (IQR 7.1–13.9), and 71% (101/143) were males. All patients were alive at their most recent follow-up. At the first clinic visit, 54% (78/143) were exercise restricted, and restriction status changed in 34% (48/143) during follow-up. The median baseline body mass index z-score was 0.2 (IQR 0.3–0.9). In repeated-measures analysis, neither time-related exercise restriction nor its interaction with time was associated with increasing body mass index z-score. Surgical intervention and its interaction with time were associated with decreasing body mass index z-score. Although exercise restriction was not associated with increasing body mass index over time, surgical intervention was associated with decreasing body mass index z-score over time in patients with anomalous aortic origin of the coronary arteries.