The Binary Population and Spectral Synthesis suite of binary stellar evolution models and synthetic stellar populations provides a framework for the physically motivated analysis of both the integrated light from distant stellar populations and the detailed properties of those nearby. We present a new version 2.1 data release of these models, detailing the methodology by which Binary Population and Spectral Synthesis incorporates binary mass transfer and its effect on stellar evolution pathways, as well as the construction of simple stellar populations. We present key tests of the latest Binary Population and Spectral Synthesis model suite, demonstrating its ability to reproduce the colours and derived properties of resolved stellar populations, including well-constrained eclipsing binaries. We consider observational constraints on the ratio of massive star types and the distribution of stellar remnant masses. We describe the identification of supernova progenitors in our models, and find good agreement with the properties of observed progenitors. We also test our models against photometric and spectroscopic observations of unresolved stellar populations, in both the local and distant Universe, finding that binary models provide a self-consistent explanation for observed galaxy properties across a broad redshift range. Finally, we carefully describe the limitations of our models, and areas where we expect to see significant improvement in future versions.
The ability to quickly detect transient sources in optical images and trigger multi-wavelength follow-up is key for the discovery of fast transients. These include rare and difficult-to-detect events such as kilonovae, supernova shock breakouts, and ‘orphan’ Gamma-ray Burst afterglows. We present the Mary pipeline, a (mostly) automated tool to discover transients during high-cadence observations with the Dark Energy Camera at Cerro Tololo Inter-American Observatory (CTIO). The observations are part of the ‘Deeper Wider Faster’ programme, a multi-facility, multi-wavelength programme designed to discover fast transients, including counterparts to Fast Radio Bursts and gravitational waves. Our tests of the Mary pipeline on Dark Energy Camera images return a false positive rate of ~2.2% and a missed fraction of ~3.4%, obtained in less than 2 min, demonstrating that the pipeline is suitable for rapid and high-quality transient searches. The pipeline can be adapted to search for transients in data obtained with imagers other than the Dark Energy Camera.
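As a sketch of how the quoted figures relate to raw candidate counts, the missed fraction and false positive rate follow from three numbers; the function name and counts below are illustrative, not part of the Mary pipeline's actual interface.

```python
def detection_metrics(n_real, n_recovered, n_candidates):
    """Missed fraction and false positive rate from simple counts.

    n_real:       transients actually present (e.g. injected fakes)
    n_recovered:  of those, how many the pipeline flagged
    n_candidates: total candidates the pipeline produced
    """
    missed_fraction = (n_real - n_recovered) / n_real
    false_positive_rate = (n_candidates - n_recovered) / n_candidates
    return missed_fraction, false_positive_rate

# Hypothetical run: 1000 injected transients, 966 recovered,
# 988 candidates in total -> missed ~3.4%, false positives ~2.2%
mf, fpr = detection_metrics(1000, 966, 988)
print(f"missed {mf:.1%}, false positives {fpr:.1%}")
```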
Most major discoveries in astronomy are unplanned, and result from surveying the Universe in a new way, rather than by testing a hypothesis or conducting an investigation with planned outcomes. For example, of the ten greatest discoveries made by the Hubble Space Telescope, only one was listed in its key science goals. So a telescope that merely achieves its stated science goals is not achieving its potential scientific productivity.
Several next-generation astronomical survey telescopes are currently being designed and constructed that will significantly expand the volume of observational parameter space, and should in principle discover unexpected new phenomena and new types of object. However, the complexity of the telescopes and the large data volumes mean that these discoveries are unlikely to be found by chance. Therefore, it is necessary to plan explicitly for unexpected discoveries in the design and construction. Two types of discovery are recognised: unexpected objects and unexpected phenomena.
This paper argues that next-generation astronomical surveys require an explicit process for detecting the unexpected, and proposes an implementation of this process. This implementation addresses both types of discovery, and relies heavily on machine-learning techniques, and also on theory-based simulations that encapsulate our current understanding of the Universe.
We present new software to cross-match low-frequency radio catalogues: the Positional Update and Matching Algorithm. The Positional Update and Matching Algorithm combines a Bayesian positional probabilistic approach with spectral matching criteria, allowing for confused sources in the matching process. We go on to create a radio sky model using the Positional Update and Matching Algorithm based on the Murchison Widefield Array Commissioning Survey, and are able to automatically cross-match ~98.5% of sources. Using the characteristics of this sky model, we create simple simulated mock catalogues on which to test the Positional Update and Matching Algorithm, and find that it can reliably recover the correct spectral indices of sources, along with the applied ionospheric offsets. Finally, we use this sky model to calibrate and remove foreground sources from simulated interferometric data, generated using OSKAR (the Oxford University visibility generator). We demonstrate a substantial improvement in foreground source removal when using higher-frequency, higher-resolution source positions, even when correcting positions by an average of only 0.3 arcmin given a synthesised beam-width of ~2.3 arcmin.
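The positional ingredient of such a Bayesian cross-match is commonly expressed as a likelihood ratio, comparing a genuine association (Gaussian positional errors) against a chance alignment with the local background source density. The sketch below is a generic textbook form under those assumptions, not the Positional Update and Matching Algorithm's actual implementation.

```python
import math

def positional_likelihood_ratio(offset, sigma, bkg_density):
    """Likelihood ratio for a candidate positional match.

    offset:      angular separation between the two sources (arcsec)
    sigma:       combined 1-sigma positional uncertainty (arcsec)
    bkg_density: surface density of possible confusing sources
                 (sources per arcsec^2)

    Numerator: probability density of the observed offset if the two
    sources are the same object (2-D Gaussian).  Denominator: density
    of chance alignments.  Ratios >> 1 favour a genuine match.
    """
    gaussian = math.exp(-offset**2 / (2.0 * sigma**2)) / (2.0 * math.pi * sigma**2)
    return gaussian / bkg_density
```

A full matcher would fold this ratio together with spectral criteria and a prior on match probability; here it simply falls off as the offset grows relative to the positional uncertainty.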
I introduce Profiler, a user-friendly program designed to analyse the radial surface brightness profiles of galaxies. With an intuitive graphical user interface, Profiler can accurately model galaxies of a broad range of morphological types, with various parametric functions routinely employed in the field (Sérsic, core-Sérsic, exponential, Gaussian, Moffat, and Ferrers). In addition to these, Profiler can employ the broken exponential model for disc truncations or anti-truncations, and two special cases of the edge-on disc model: along the disc's major or minor axis. The convolution of (circular or elliptical) models with the point spread function is performed in 2D, and offers a choice between Gaussian, Moffat, or a user-provided profile for the point spread function. Profiler is optimised to work with galaxy light profiles obtained from isophotal measurements, which allow for radial gradients in the geometric parameters of the isophotes, and are thus often better at capturing the total light than 2D image-fitting programs. Additionally, the 1D approach is generally less computationally expensive and more stable. I demonstrate Profiler's features by decomposing three case-study galaxies: the cored elliptical galaxy NGC 3348, the nucleated dwarf Seyfert 1 galaxy Pox 52, and NGC 2549, a double-barred galaxy with an edge-on, truncated disc.
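Of the parametric functions listed, the Sérsic profile is the workhorse. As a minimal illustration (using the common Capaccioli approximation for b_n; this is not Profiler's own code):

```python
import math

def sersic(R, I_e, R_e, n):
    """Sersic surface-brightness profile:

        I(R) = I_e * exp(-b_n * ((R / R_e)**(1/n) - 1))

    I_e is the intensity at the effective (half-light) radius R_e and
    n is the Sersic index (n = 1 exponential disc, n = 4 de Vaucouleurs
    spheroid).  b_n uses the approximation b_n ~ 1.9992 n - 0.3271,
    accurate for roughly 0.5 < n < 10.
    """
    b_n = 1.9992 * n - 0.3271
    return I_e * math.exp(-b_n * ((R / R_e) ** (1.0 / n) - 1.0))

# By construction the profile passes through I_e at R = R_e:
print(sersic(2.0, 100.0, 2.0, 4.0))  # -> 100.0
```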
In this paper, we introduce Nicil: Non-Ideal magnetohydrodynamics Coefficients and Ionisation Library. Nicil is a stand-alone Fortran90 module that calculates the ionisation values and the coefficients of the non-ideal magnetohydrodynamics terms of Ohmic resistivity, the Hall effect, and ambipolar diffusion. The module is fully parameterised such that the user can decide which processes to include and decide upon the values of the free parameters, making this a versatile and customisable code. The module includes both cosmic ray and thermal ionisation; the former includes two ion species and three species of dust grains (positively charged, negatively charged, and neutral), and the latter includes five elements which can be doubly ionised. We demonstrate tests of the module, and then describe how to implement it into an existing numerical code.
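For orientation, the simplest cosmic-ray ionisation balance, far cruder than Nicil's multi-species network, equates the ionisation rate ζ n_H with two-body recombination β n_e². The rate coefficients below are illustrative order-of-magnitude values, not Nicil defaults.

```python
import math

def cr_electron_density(n_H, zeta=1.0e-17, beta=2.5e-6):
    """Equilibrium electron density from cosmic-ray ionisation alone.

    Balances ionisation (zeta * n_H) against recombination
    (beta * n_e**2), giving n_e = sqrt(zeta * n_H / beta).

    n_H:  number density of hydrogen nuclei (cm^-3)
    zeta: cosmic-ray ionisation rate per H (s^-1), illustrative value
    beta: recombination rate coefficient (cm^3 s^-1), illustrative value
    """
    return math.sqrt(zeta * n_H / beta)
```

The square-root scaling means the ionisation fraction n_e/n_H falls as density rises, which is why the non-ideal terms Nicil computes become important in dense, weakly ionised gas.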
The Evolutionary Map of the Universe (EMU) is a proposed radio continuum survey of the Southern Hemisphere up to declination +30°, with the Australian Square Kilometre Array Pathfinder (ASKAP). EMU will use an automated source identification and measurement approach that is demonstrably optimal, to maximise the reliability and robustness of the resulting radio source catalogues. As a step toward this goal, we conducted a “Data Challenge” to test a variety of source finders on simulated images. The aim is to quantify the accuracy and limitations of existing automated source finding and measurement approaches. The Challenge initiators also tested the current ASKAPsoft source-finding tool to establish how it could benefit from incorporating successful features of the other tools. As expected, most finders show completeness of around 100% at ≈10σ, dropping to about 10% by ≈5σ. Reliability is typically close to 100% at ≈10σ, with performance at lower sensitivities varying between finders. All finders show the expected trade-off, where high completeness at low signal-to-noise gives a corresponding reduction in reliability, and vice versa. We conclude with a series of recommendations for improving the performance of the ASKAPsoft source-finding tool.
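The completeness and reliability statistics quoted for the Data Challenge can be computed per signal-to-noise bin from three counts; the sketch below is generic bookkeeping under that definition, not the Challenge's actual scoring code, and the bin counts are hypothetical.

```python
def per_bin_metrics(matched, detected, true_counts):
    """Completeness and reliability per signal-to-noise bin.

    matched[b]:     detections in bin b matched to a true source
    detected[b]:    all detections in bin b (matched + spurious)
    true_counts[b]: true sources present in bin b

    completeness = matched / true_counts   (fraction of real sources found)
    reliability  = matched / detected      (fraction of detections that are real)
    """
    return [(m / t, m / d) for m, d, t in zip(matched, detected, true_counts)]

# Hypothetical two bins (~10 sigma, ~5 sigma): pushing completeness at
# low signal-to-noise drags reliability down, the trade-off noted above.
print(per_bin_metrics([98, 45], [99, 90], [100, 100]))
```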