Balloon valvuloplasty and surgical aortic valvotomy have been the treatment mainstays for congenital aortic stenosis in children. Choice of intervention often depends on centre bias, with limited relevant comparative literature.
This study aims to provide an unbiased, contemporary matched comparison of these balloon and surgical approaches.
Retrospective analysis of patients with congenital aortic valve stenosis who underwent balloon valvuloplasty (Queensland Children’s Hospital, Brisbane) or surgical valvotomy (Royal Children’s Hospital, Melbourne) between 2005 and 2016. Patients were excluded if pre-intervention assessment indicated ineligibility for either treatment. Propensity score matching was performed based on age, weight, and valve morphology.
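The propensity-score-matching step can be sketched in a few lines. The cohorts, covariate values, and model below are invented for illustration; they are not the study's data, and a real analysis would add caliper checks and covariate-balance diagnostics:

```python
import math
import random

random.seed(0)

# Hypothetical cohorts with (age_days, weight_kg, valve_morphology) covariates;
# all values are illustrative, not the study's data.
balloon = [[random.gauss(120, 60), random.gauss(6, 2), random.randint(0, 1)]
           for _ in range(20)]
surgical = [[random.gauss(160, 70), random.gauss(7, 2), random.randint(0, 1)]
            for _ in range(25)]

# Standardise each covariate so plain gradient steps behave well.
rows = balloon + surgical
for j in range(3):
    col = [r[j] for r in rows]
    mu = sum(col) / len(col)
    sd = (sum((v - mu) ** 2 for v in col) / len(col)) ** 0.5 or 1.0
    for r in rows:
        r[j] = (r[j] - mu) / sd

def propensity(x, w):
    """Logistic model for P(balloon group | covariates)."""
    z = w[0] + sum(wi * xi for wi, xi in zip(w[1:], x))
    z = max(-30.0, min(30.0, z))  # guard against exp overflow
    return 1.0 / (1.0 + math.exp(-z))

# Fit the logistic model by gradient ascent on the log-likelihood.
labelled = [(x, 1) for x in balloon] + [(x, 0) for x in surgical]
w = [0.0] * 4
for _ in range(500):
    grad = [0.0] * 4
    for x, y in labelled:
        err = y - propensity(x, w)
        grad[0] += err
        for j in range(3):
            grad[j + 1] += err * x[j]
    w = [wi + 0.01 * g for wi, g in zip(w, grad)]

# Greedy 1:1 nearest-neighbour matching on the propensity score.
used, pairs = set(), []
for x in balloon:
    p = propensity(x, w)
    j = min((i for i in range(len(surgical)) if i not in used),
            key=lambda i: abs(propensity(surgical[i], w) - p))
    used.add(j)
    pairs.append((x, surgical[j]))

print(len(pairs))  # → 20: one surgical match per balloon patient
```

In practice dedicated tooling (e.g. R's MatchIt or Python's statsmodels) would replace this hand-rolled fit; the greedy nearest-neighbour match is just the core idea.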
Sixty-five balloon patients and seventy-seven surgical patients were included. Overall, the groups were well matched with 18 neonates/25 infants in the balloon group and 17 neonates/28 infants in the surgical group. Median age at balloon was 92 days (range 2 days – 18.8 years) compared to 167 days (range 0 days – 18.1 years) for surgery (rank-sum p = 0.08). Mean follow-up was 5.3 years. There was one late balloon death and two early surgical deaths due to left ventricular failure. There was no significant difference in freedom from reintervention at latest follow-up (69% in the balloon group and 70% in the surgical group, p = 1.0).
Contemporary analysis of balloon aortic valvuloplasty and surgical aortic valvotomy shows no difference in overall reintervention rates in the medium term. Balloon valvuloplasty performs well across all age groups, achieving delay or avoidance of surgical intervention.
The systems ecology paradigm (SEP) emerged in the late 1960s at a time when societies throughout the world were beginning to recognize that our environment and natural resources were being threatened by their activities. Management practices in rangelands, forests, agricultural lands, wetlands, and waterways were inadequate to meet the challenges of deteriorating environments, many of which were caused by the practices themselves. Scientists recognized an immediate need to develop a knowledge base about how ecosystems function. That effort took nearly two decades (through the 1980s) and concluded with the acceptance that humans were components of ecosystems, not just controllers and manipulators of lands and waters. While ecosystem science was being developed, management options based on ecosystem science were shifting dramatically toward practices supporting sustainability, resilience, ecosystem services, biodiversity, and local to global interconnections of ecosystems. Emerging from the new knowledge about how ecosystems function and the application of the systems ecology approach was the collaboration of scientists, managers, decision-makers, and stakeholders locally and globally. Today’s concepts of ecosystem management and related ideas, such as sustainable agriculture, ecosystem health and restoration, consequences of and adaptation to climate change, and many other important local to global challenges, are a direct result of the SEP.
This Element describes for the first time the database of peer review reports at PLOS ONE, the largest scientific journal in the world, to which the authors had unique access. Specifically, this Element presents the background contexts and histories of peer review, the data-handling sensitivities of this type of research, the typical properties of reports in the journal to which the authors had access, a taxonomy of the reports, and their sentiment arcs. This unique work thereby yields a compelling and unprecedented set of insights into the evolving state of peer review in the twenty-first century, at a crucial political moment for the transformation of science. It also, though, presents a study in radicalism and the ways in which PLOS's vision for science can be said to have effected change in the ultra-conservative contemporary university. This title is also available as Open Access on Cambridge Core.
Gravitational waves from coalescing neutron stars encode information about nuclear matter at extreme densities, inaccessible by laboratory experiments. The late inspiral is influenced by the presence of tides, which depend on the neutron star equation of state. Neutron star mergers are expected to often produce rapidly rotating remnant neutron stars that emit gravitational waves. These will provide clues to the extremely hot post-merger environment. This signature of nuclear matter in gravitational waves contains most information in the 2–4 kHz frequency band, which is outside of the most sensitive band of current detectors. We present the design concept and science case for a Neutron Star Extreme Matter Observatory (NEMO): a gravitational-wave interferometer optimised to study nuclear physics with merging neutron stars. The concept uses high-circulating laser power, quantum squeezing, and a detector topology specifically designed to achieve the high-frequency sensitivity necessary to probe nuclear matter using gravitational waves. Above 1 kHz, the proposed strain sensitivity is comparable to full third-generation detectors at a fraction of the cost. Such sensitivity changes expected event rates for detection of post-merger remnants from approximately one per few decades with two A+ detectors to a few per year and potentially allow for the first gravitational-wave observations of supernovae, isolated neutron stars, and other exotica.
To characterise dietary habits, their temporal and spatial patterns and associations with BMI in the 23andMe study population.
We present a large-scale cross-sectional analysis of self-reported dietary intake data derived from the web-based National Health and Nutrition Examination Survey 2009–2010 dietary screener. Survey-weighted estimates for each food item were characterised by age, sex, race/ethnicity, education and BMI. Temporal patterns were plotted over a 2-year time period, and average consumption for select food items was mapped by state. Finally, dietary intake variables were tested for association with BMI.
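For a single categorical item, a survey-weighted estimate of this kind reduces to a weighted proportion. The respondents and weights below are invented for illustration:

```python
# Each row: (meets fruit recommendation: 1/0, survey weight). Invented data.
responses = [(1, 1.2), (0, 0.8), (0, 1.5), (1, 0.9), (0, 1.1), (0, 1.0)]

def weighted_proportion(rows):
    """Weighted mean of a 0/1 indicator: weighted successes over total weight."""
    return sum(y * w for y, w in rows) / sum(w for _, w in rows)

print(round(100 * weighted_proportion(responses), 1))  # → 32.3 (percent)
```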
US-based adults 20–85 years of age participating in the 23andMe research programme.
Participants were 23andMe customers who consented to participate in research (n 526 774) and completed web-based surveys on demographic and dietary habits.
Survey-weighted estimates show very few participants met federal recommendations for fruit (2·6 %), vegetables (5·9 %) and dairy intake (2·8 %). Between 2017 and 2019, fruit, vegetables and milk intake frequency declined, while total dairy remained stable and added sugars increased. Seasonal patterns in reporting were most pronounced for ice cream, chocolate, fruits and vegetables. Dietary habits varied across the USA, with higher intake of sugar and energy-dense foods characterising areas with higher average BMI. In multivariate-adjusted models, BMI was directly associated with intake of processed meat, red meat and dairy, and inversely associated with consumption of fruit, vegetables and whole grains.
23andMe research participants have created an opportunity for rapid, large-scale, real-time nutritional data collection, informing demographic, seasonal and spatial patterns with broad geographical coverage across the USA.
We describe an ultra-wide-bandwidth, low-frequency receiver recently installed on the Parkes radio telescope. The receiver system provides continuous frequency coverage from 704 to 4032 MHz. For much of the band, the system temperature is approximately 22 K and the receiver system remains in a linear regime even in the presence of strong mobile phone transmissions. We discuss the scientific and technical aspects of the new receiver, including its astronomical objectives, as well as the feed, receiver, digitiser, and signal processor design. We describe the pipeline routines that form the archive-ready data products and how those data files can be accessed from the archives. The system performance is quantified, including the system noise and linearity, beam shape, antenna efficiency, polarisation calibration, and timing stability.
Understanding the properties of time averaging (age mixing) in a stratigraphic layer is essential for properly interpreting the paleofauna preserved in the geologic record. This work assesses the age and quantifies the scale and structure of time averaging of land snail-rich colluvial sediments from the Madeira Archipelago (Portugal) by dating individual shells using amino acid racemization calibrated with graphite-target and carbonate-target accelerator mass spectrometry radiocarbon methods. Gastropod shells of Actinella nitidiuscula were collected from seven sites on the volcanic islands of Bugio and Deserta Grande (Desertas Islands), where snail shells are abundant and well preserved in Quaternary colluvial deposits. Results show that the shells ranged in age from modern to ~48 cal ka BP (calibrated radiocarbon age), covering the last glacial and present interglacial periods. Snail shells retrieved from two of the colluvial sites exhibit multimillennial age mixing (>6 ka), which significantly exceeds the analytical error from dating methods and calibration. The observed multimillennial mixing of these assemblages should be taken into consideration in upcoming paleoenvironmental and paleoecological studies in the region. The extent of age mixing may also inform about the time span of colluvial deposition, which can be useful in future geomorphological studies. In addition, this study presents the first carbonate-target radiocarbon results for land snail shells and suggests that this novel, rapid, and more affordable dating method offers reliable age estimates for small land snail shells younger than ~20 cal ka BP.
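One way to sketch how the scale of time averaging is quantified: compare the dispersion of individual shell ages within a site against the analytical dating error. The ages and error below are invented for illustration, not the study's measurements:

```python
import statistics

ages = [4.1, 5.3, 6.8, 7.2, 9.9, 10.4, 11.0]  # hypothetical shell ages, cal ka BP
dating_error = 0.5                             # assumed 1-sigma dating error, ka

spread = max(ages) - min(ages)                 # total age range of the assemblage
sd = statistics.stdev(ages)                    # observed dispersion
# Dispersion in excess of dating error (subtracted in quadrature):
excess = (sd ** 2 - dating_error ** 2) ** 0.5

print(round(spread, 1), round(excess, 2))      # → 6.9 2.62
```

A spread of several ka that survives the quadrature correction, as here, is what the abstract calls multimillennial age mixing beyond analytical error.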
The Iberia–Newfoundland continental margin is one of the most-studied conjugate margins in the world. However, many unknowns remain regarding the nature of rifting preceding its break-up. We analyse a large dataset of tectonic subsidence curves, created from publicly available well data, to show spatial and temporal trends of rifting in the proximal domains of the margin. We develop a novel methodology of bulk averaging tectonic subsidence curves that can be applied on any conjugate margin with a similar spread of well data. The method does not rely on the existence of conjugate, deep seismic profiles and, specifically, attempts to forego the risk of quantitative bias derived from localized anomalies and uncertain stratigraphic dating and correlation. Results for the Iberia–Newfoundland margin show that active rift-driven tectonic subsidence occurred in the Central segment of the conjugate margin from c. 227 Ma (early Norian) to c. 152.1 Ma (early Tithonian), in the southern segment from c. 208.5 Ma (early Rhaetian) to c. 152.1 Ma (early Tithonian) and in the northern segment from c. 201.3 Ma (early Hettangian) to c. 132.9 Ma (early Hauterivian). This indicates that rifting in the stretching phase of the proximal domain of the Iberia–Newfoundland margin does not mirror hyperextended domain rifting trends (south to north) that ultimately led to break-up. The insights into broad-scale three-dimensional spatial and temporal trends, produced using the novel methodology presented in this paper, provide added value for interpretation of the development of passive margins, and new constraints for modelling of the formation of conjugate margins.
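The bulk-averaging idea can be sketched as follows: sample each well's tectonic subsidence curve at shared ages, then average across wells so that localized anomalies in any single well are damped. The well data below are invented for illustration:

```python
# Hypothetical tectonic subsidence curves, one dict per well: {age_Ma: subsidence_m}.
wells = [
    {220: 0.0, 200: 150.0, 180: 320.0, 160: 480.0},
    {220: 0.0, 200: 120.0, 180: 290.0, 160: 450.0},
    {220: 0.0, 200: 180.0, 180: 360.0, 160: 520.0},
]

ages = sorted(wells[0], reverse=True)  # oldest first
bulk = {a: sum(w[a] for w in wells) / len(wells) for a in ages}

# Average subsidence rate per interval (m/Myr) highlights the active rift phase.
rates = {(a0, a1): (bulk[a1] - bulk[a0]) / (a0 - a1)
         for a0, a1 in zip(ages, ages[1:])}

print(round(bulk[160], 1))  # → 483.3 m of averaged subsidence by 160 Ma
```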
“Medicalization” has been a contentious notion since its introduction centuries ago. While some scholars lamented a medical overreach into social domains, others hailed its promise for social justice advocacy. Against the backdrop of a growing commitment to health equity across the nation, this article reviews historical interpretations of medicalization, offers an application of the term to non-biologic risk factors for disease, and presents the case of housing the demonstrate the great potential of medicalizing poverty.
Fuscopannaria leucosticta is a rare and understudied cyanolichen with an interesting and unusual distribution in tertiary relict hotspots worldwide. There is a relatively large population in eastern North America, where it occurs mostly throughout the Appalachian Mountains and reaches its northernmost extent in New Brunswick and Nova Scotia, Canada. The ability to detect this species, and thus determine its habitat requirements, is critical for understanding how it might be affected by human-induced environmental degradation. Maximum entropy modelling with MaxEnt was used to predict the distribution of suitable habitat for this species in Nova Scotia using 62 presence locations, 1405 pseudo-absence locations and four environmental covariates: depth to water table (a proxy for relative soil moisture), distance to the coast and mean annual temperature and precipitation. Our predictive maps identify important habitat features and areas of high suitability in Nova Scotia with an area under the curve value of 0·85. The predicted distribution of this lichen was most affected by temperature. This study elucidates locations as well as species-habitat relationships for F. leucosticta, providing land managers with baseline data that can aid in the discovery of additional populations and provide a better understanding of its ecological requirements which will support the development of sound conservation strategies for this rare lichen.
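MaxEnt itself fits an exponential model over background cells, but the evaluation step (the reported area under the curve of 0·85) can be illustrated by scoring a simple suitability index against presence and pseudo-absence points. All data and the index below are invented; only the class sizes echo the text:

```python
import random

random.seed(1)

# Synthetic sites: (depth_to_water_m, mean_annual_temp_C, presence). Presences
# are drawn from moister, slightly warmer conditions purely for illustration.
def site(presence):
    dtw = random.gauss(1.0 if presence else 3.0, 1.0)
    temp = random.gauss(6.5 if presence else 5.5, 0.7)
    return (dtw, temp, presence)

pts = [site(True) for _ in range(62)] + [site(False) for _ in range(200)]

def suitability(dtw, temp):
    """Stand-in for a fitted habitat model: moist, warm sites score higher."""
    return -dtw + 0.8 * temp

# AUC via the rank (Mann-Whitney) formulation: the probability that a random
# presence outscores a random pseudo-absence.
pres = [suitability(d, t) for d, t, y in pts if y]
abse = [suitability(d, t) for d, t, y in pts if not y]
wins = sum((p > a) + 0.5 * (p == a) for p in pres for a in abse)
auc = wins / (len(pres) * len(abse))

print(round(auc, 2))  # high AUC, since the classes were built to separate
```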
Vanadium oxide has applications in infrared bolometers owing to its high temperature coefficient of resistivity (TCR), and it has attracted interest for switchable plasmonic devices owing to its metal-to-insulator transition near room temperature. We report here the properties of vanadium oxide deposited by an aqueous spray process. The films have a ropy surface morphology with ∼70 nm surface roughness, and their polycrystalline phase depends on annealing conditions. The films have a TCR of ∼2%/deg, which compares well with sputtered films. Only weak evidence is found for an insulator-metal phase transition in these films.
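TCR can be estimated from two resistance measurements at nearby temperatures. The resistance values below are illustrative assumptions, chosen only to reproduce the ∼2%/deg magnitude reported:

```python
# TCR = (1/R) * dR/dT, approximated from two measurements (illustrative values).
R1, T1 = 100.0e3, 20.0  # ohms at 20 C
R2, T2 = 96.0e3, 22.0   # resistance falls with T in a semiconducting film

tcr_pct = (R2 - R1) / (R1 * (T2 - T1)) * 100.0
print(round(tcr_pct, 1))  # → -2.0 %/deg (negative for semiconducting VOx)
```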
Bella Robinson, Commonwealth Scientific and Industrial Research Organisation,
Robert Power, Commonwealth Scientific and Industrial Research Organisation,
Mark Cameron, Commonwealth Scientific and Industrial Research Organisation
Twitter is a new data channel for emergency managers to source public information for situational awareness and as a means of engaging with the community during disaster response and recovery activities. Twitter has been used successfully to identify emergency events, obtain crowd sourced information as the event unfolds, provide up-to-date information to the affected community from authoritative agencies, and conduct resource planning.
Natural disasters have increased in severity and frequency in recent years. According to Guha-Sapir et al. (2011), in 2010, 385 natural disasters killed over 297,000 people worldwide, impacted 217 million human lives, and cost the global economy an estimated US$123.9 billion. There are numerous examples from around the world: the 2004 Indian Ocean earthquake and tsunami; the more recent 2011 Tōhoku earthquake and tsunami, which damaged the Fukushima nuclear power station; hurricanes Katrina and Sandy in 2005 and 2012 respectively; the 2010 China floods, which caused widespread devastation; and Victoria's 2009 “Black Saturday” bushfires in Australia, which killed 173 people and caused an estimated A$2.9 billion in total losses (Stephenson, Handmer, & Haywood, 2012).
With urban development occurring on coastlines and spreading into rural areas, houses and supporting infrastructure are expanding into high-risk regions. The growing world population is moving into areas progressively more prone to natural disasters and unpredictable weather events. These events have been increasing in frequency and severity in recent years (Hawkins et al., 2012).
It has been recognized that information published by the general public on social media is relevant to emergency managers and that social media is a useful means of providing information to communities that may be impacted by emergency events (Lindsay, 2011; Anderson, 2012). To prepare and respond to such emergency situations effectively, it is critical that emergency managers have relevant and reliable information. For example, bushfire management is typically a regional government responsibility, and each jurisdiction has its own agency that takes the lead in coordinating community preparedness and responding to bushfires when they occur.
Fundamental mode classical Cepheids have light curves which repeat accurately enough that we can watch them evolve (change period). The new level of accuracy and quantity of data from the Kepler and MOST satellites probes this further. An intriguing result was found in the long time series of Kepler data for V1154 Cyg, the only classical Cepheid (fundamental mode, P = 4.9 d) in the field, which shows short-term changes in period (≃20 minutes), correlated over ≃10 cycles (period jitter). To follow this up, we obtained a month-long series of observations of the fundamental mode Cepheid RT Aur and the first overtone pulsator SZ Tau. RT Aur shows the traditional strict repetition of the light curve, with the Fourier amplitude ratio R1/R2 remaining nearly constant. The light curve of SZ Tau, on the other hand, fluctuates in amplitude ratio at the level of approximately 50%. Furthermore, prewhitening the RT Aur data with 10 frequencies reduces the Fourier spectrum to noise, whereas for SZ Tau considerable power is left after this prewhitening in a complicated variety of frequencies.
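The Fourier amplitude ratio of a Cepheid light curve (the ratio of second- to first-harmonic amplitude, conventionally written R21) can be illustrated on a synthetic light curve. The period matches the quoted fundamental-mode value, but the amplitudes and phase below are invented:

```python
import math

P = 4.9                                   # days, fundamental period (from the text)
f0 = 1.0 / P
N = 1000
ts = [i * 2.0 * P / N for i in range(N)]  # even sampling over exactly 2 periods

def mag(t):
    """Toy light curve: fundamental plus one harmonic (invented amplitudes)."""
    return (0.40 * math.sin(2 * math.pi * f0 * t)
            + 0.14 * math.sin(2 * math.pi * 2 * f0 * t + 0.8))

ys = [mag(t) for t in ts]

def harmonic_amp(k):
    """Least-squares amplitude of the k-th harmonic on this even grid."""
    c = sum(y * math.cos(2 * math.pi * k * f0 * t) for t, y in zip(ts, ys)) * 2 / N
    s = sum(y * math.sin(2 * math.pi * k * f0 * t) for t, y in zip(ts, ys)) * 2 / N
    return math.hypot(c, s)

A1, A2 = harmonic_amp(1), harmonic_amp(2)
print(round(A2 / A1, 2))  # → 0.35, the harmonic amplitude ratio
```

Prewhitening then amounts to subtracting the fitted harmonics and inspecting the residual spectrum; for a strictly repeating light curve like RT Aur's, only noise remains.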
The Australian Square Kilometre Array Pathfinder (ASKAP) will give us an unprecedented opportunity to investigate the transient sky at radio wavelengths. In this paper we present VAST, an ASKAP survey for Variables and Slow Transients. VAST will exploit the wide-field survey capabilities of ASKAP to enable the discovery and investigation of variable and transient phenomena from the local to the cosmological, including flare stars, intermittent pulsars, X-ray binaries, magnetars, extreme scattering events, interstellar scintillation, radio supernovae, and orphan afterglows of gamma-ray bursts. In addition, it will allow us to probe unexplored regions of parameter space where new classes of transient sources may be detected. In this paper we review the known radio transient and variable populations and the current results from blind radio surveys. We outline a comprehensive program based on a multi-tiered survey strategy to characterise the radio transient sky through detection and monitoring of transient and variable sources on the ASKAP imaging timescales of 5 s and greater. We also present an analysis of the expected source populations that we will be able to detect with VAST.
Compound-specific radiocarbon measurements can be made instantaneously using a gas chromatograph (GC) combustion system coupled to a 14C AMS system fitted with a gas ion source. Samples below 10 μg C can be analyzed, but the precision is reduced to 5–10% because of lower source efficiency. We modified our GC for CH4 and CO2 analysis and injected samples multiple times, summing the data to increase precision. We attained a maximum precision of 0.6% for modern CO2 from 25 injections of 27 μg C and a background of ≃0.5% (40 kyr) for ancient methane. The 14C content of dissolved CO2 and CH4 in water samples collected at a deep-sea hydrothermal vent and a serpentine mud volcano was measured, and the results for the vent sample are consistent with previously published data. Further experiments are required to determine a calibration and correction procedure to maximize accuracy.
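The gain from summing repeat injections follows counting statistics: relative precision scales as 1/sqrt(total counts). The counts-per-injection figure below is an assumption for illustration, not a measured value:

```python
import math

counts_per_injection = 50_000  # assumed 14C ion counts per GC injection

def precision_pct(n_injections):
    """Idealized Poisson counting precision (%) after summing n injections."""
    return 100.0 / math.sqrt(n_injections * counts_per_injection)

print(round(precision_pct(1), 2), round(precision_pct(25), 2))  # → 0.45 0.09
```

Real measurements add background and source-efficiency terms, so attained precision (here 0.6% from 25 injections) sits above this idealized counting floor for any assumed count rate.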