This paper presents an exploratory case study in which video-based pose estimation is used to analyse human motion in support of data-driven design. It provides two example use cases related to design. Results are compared with ground-truth measurements and show high correlation for the estimated pose, with an RMSE of 65.5 mm. The paper exemplifies how design projects can benefit from a simple, flexible, and cost-effective approach to capturing human-object interactions. This also opens the possibility of implementing interaction and body capture in the earliest stages of design with minimal effort.
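The pose-accuracy comparison above rests on a standard root-mean-square error over paired joint positions. As a minimal sketch (a generic implementation of the metric, not the authors' code; the sample coordinates are made up):

```python
import math

# RMSE between estimated and ground-truth 3D joint positions (mm).
# Generic implementation of the metric named in the abstract, not the
# authors' pipeline; the sample points below are hypothetical.
def rmse_mm(estimated, ground_truth):
    squared = [
        sum((e - g) ** 2 for e, g in zip(est_pt, gt_pt))
        for est_pt, gt_pt in zip(estimated, ground_truth)
    ]
    return math.sqrt(sum(squared) / len(squared))

est = [(0.0, 0.0, 0.0), (100.0, 50.0, 20.0)]   # estimated joints (mm)
ref = [(10.0, 0.0, 0.0), (100.0, 60.0, 20.0)]  # ground truth (mm)
print(f"RMSE = {rmse_mm(est, ref):.1f} mm")    # 10.0 mm for this toy pair
```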
Many short gamma-ray bursts (GRBs) originate from binary neutron star mergers, and several theories predict the production of coherent, prompt radio signals either prior to, during, or shortly following the merger, as well as persistent pulsar-like emission from the spin-down of a magnetar remnant. Here we present a low-frequency (170–200 MHz) search for coherent radio emission associated with nine short GRBs detected by the Swift and/or Fermi satellites using the Murchison Widefield Array (MWA) rapid-response observing mode. The MWA began observing these events within 30–60 s of their high-energy detection, enabling us to capture any dispersion-delayed signals emitted by short GRBs over a typical range of redshifts. We conducted transient searches at the GRB positions on timescales of 5 s, 30 s, and 2 min, resulting in the most constraining flux density limits on any associated transient of 0.42, 0.29, and 0.084 Jy, respectively. We also searched for dispersed signals at a temporal and spectral resolution of 0.5 s and 1.28 MHz, but none were detected. However, the fluence limit of 80–100 Jy ms derived for GRB 190627A is the most stringent to date for a short GRB. Assuming the formation of a stable magnetar for this GRB, we compared the fluence and persistent emission limits to short GRB coherent emission models, placing constraints on key parameters including the radio emission efficiency of the nearly merged neutron stars (
$\epsilon_r\lesssim10^{-4}$
), the fraction of magnetic energy in the GRB jet (
$\epsilon_B\lesssim2\times10^{-4}$
), and the radio emission efficiency of the magnetar remnant (
$\epsilon_r\lesssim10^{-3}$
). Comparing the limits derived for our full GRB sample (along with those in the literature) to the same emission models, we demonstrate that our fluence limits only place weak constraints on the prompt emission predicted from the interaction between the relativistic GRB jet and the interstellar medium for a subset of magnetar parameters. However, the 30-min flux density limits were sensitive enough to theoretically detect the persistent radio emission from magnetar remnants up to a redshift of
$z\sim0.6$
. Our non-detection of this emission could imply that some GRBs in the sample were not genuinely short or did not result from a binary neutron star merger, that the GRBs were at high redshifts, that these mergers formed atypical magnetars, that the radiation beams of the magnetar remnants were pointing away from Earth, or that the majority did not form magnetars but rather collapsed directly into black holes.
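The 30–60 s response time matters because lower-frequency signals arrive later due to cold-plasma dispersion. As an illustrative back-of-envelope check (the dispersion measure below is an assumed example, not a value from the paper):

```python
# Cold-plasma dispersion delay relative to infinite frequency:
# delay [s] ~= 4.149e3 * DM [pc cm^-3] / (freq [MHz])^2
def dispersion_delay_s(dm_pc_cm3, freq_mhz):
    return 4.149e3 * dm_pc_cm3 / freq_mhz ** 2

# Assumed total DM of 500 pc cm^-3 (host + intergalactic + Milky Way
# contributions; a hypothetical value) at the MWA mid-band frequency:
delay = dispersion_delay_s(500.0, 185.0)
print(f"delay ~ {delay:.0f} s")  # roughly a minute at 185 MHz
```

A delay of about a minute at these frequencies is comparable to the 30–60 s repointing time, which is why a prompt merger signal could still be caught.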
We performed an epidemiological investigation and genome sequencing of severe acute respiratory syndrome coronavirus 2 (SARS-CoV-2) to define the source and scope of an outbreak in a cluster of hospitalized patients. Lack of appropriate respiratory hygiene led to SARS-CoV-2 transmission to patients and healthcare workers during a single hemodialysis session, highlighting the importance of infection prevention precautions.
One of the principal systematic constraints on the Epoch of Reionisation (EoR) experiment is the accuracy of the foreground calibration model. Recent results have shown that highly accurate models of extended foreground sources, as well as models for sources in both the primary beam and its sidelobes, are necessary for reducing foreground power. To improve the accuracy of the source models for the EoR fields observed by the Murchison Widefield Array (MWA), we conducted the MWA Long Baseline Epoch of Reionisation Survey (LoBES). This survey consists of multi-frequency observations of the main MWA EoR fields and their eight neighbouring fields using the MWA Phase II extended array. We present the results of the first half of this survey, centred on the MWA EoR0 observing field (centred at RA (J2000)
$0^\mathrm{h}$
, Dec (J2000)
$-27^{\circ}$
). This half of the survey covers an area of 3 069 degrees
$^2$
, with an average rms of 2.1 mJy beam$^{-1}$. The resulting catalogue contains a total of 80 824 sources, with 16 separate spectral measurements between 100 and 230 MHz, and spectral modelling for 78
$\%$
of these sources. Over this region we estimate that the catalogue is 90
$\%$
complete at 32 mJy, and 70
$\%$
complete at 10.5 mJy. The overall normalised source counts are found to be in good agreement with previous low-frequency surveys at similar sensitivities. Testing the performance of the new source models, we measure lower residual rms values for peeled sources, particularly for extended sources, in a set of MWA Phase I data. The 2-dimensional power spectrum of these data residuals also shows improvement on small angular scales, consistent with the better angular resolution of the LoBES catalogue. It is clear that the LoBES sky models improve upon the current sky model used by the Australian MWA EoR group for the EoR0 field.
In this era of spatially resolved observations of planet-forming disks with Atacama Large Millimeter Array (ALMA) and large ground-based telescopes such as the Very Large Telescope (VLT), Keck, and Subaru, we still lack statistically relevant information on the quantity and composition of the material that is building the planets, such as the total disk gas mass, the ice content of dust, and the state of water in planetesimals. SPace Infrared telescope for Cosmology and Astrophysics (SPICA) is an infrared space mission concept developed jointly by Japan Aerospace Exploration Agency (JAXA) and European Space Agency (ESA) to address these questions. The key unique capabilities of SPICA that enable this research are (1) the wide spectral coverage
$10{-}220\,\mu\mathrm{m}$
, (2) the high line detection sensitivity of
$(1{-}2) \times 10^{-19}\,\mathrm{W\,m}^{-2}$
with
$R \sim 2\,000{-}5\,000$
in the far-IR (SAFARI), and
$10^{-20}\,\mathrm{W\,m}^{-2}$
with
$R \sim 29\,000$
in the mid-IR (SPICA Mid-infrared Instrument (SMI), spectrally resolving line profiles), (3) the high far-IR continuum sensitivity of 0.45 mJy (SAFARI), and (4) the observing efficiency for point source surveys. This paper details how mid- to far-IR spectra will be unique in measuring the gas masses and water/ice content of disks and how these quantities evolve during the planet-forming period. These observations will clarify the crucial transition when disks exhaust their primordial gas and further planet formation requires secondary gas produced from planetesimals. High-spectral-resolution mid-IR spectroscopy is also unique for determining the location of the snowline dividing the rocky and icy mass reservoirs within the disk and how the divide evolves during the build-up of planetary systems. Infrared spectroscopy (mid- to far-IR) of key solid-state bands is crucial for assessing whether extensive radial mixing, which is part of our Solar System history, is a general process occurring in most planetary systems and whether extrasolar planetesimals are similar to our Solar System comets/asteroids. We demonstrate that the SPICA mission concept would allow us to achieve the above ambitious science goals through large surveys of several hundred disks within
$\sim\!2.5$
months of observing time.
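The quoted resolving powers translate directly into velocity resolution via $\Delta v = c/R$, which determines whether gas line profiles can be spectrally resolved. A quick sketch (instrument values taken from the text above):

```python
# Velocity resolution implied by a spectral resolving power
# R = lambda / delta-lambda:  dv = c / R
C_KM_S = 299_792.458  # speed of light, km/s

def velocity_resolution_km_s(resolving_power):
    return C_KM_S / resolving_power

for name, r in [("SAFARI, far-IR", 3_000), ("SMI, mid-IR", 29_000)]:
    print(f"{name}: R = {r} -> dv ~ {velocity_resolution_km_s(r):.1f} km/s")
```

At $R \sim 29\,000$ the mid-IR resolution is about 10 km/s, fine enough to resolve line profiles, while the far-IR modes trade velocity resolution for line-flux sensitivity.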
Galaxy clusters have been found to host a range of diffuse, non-thermal emission components, generally with steep, power law spectra. In this work we report on the detection and follow-up of radio halos, relics, remnant radio galaxies, and other fossil radio plasmas in Southern Sky galaxy clusters using the Murchison Widefield Array and the Australian Square Kilometre Array Pathfinder. We make use of the frequency coverage between the two radio interferometers—from 88 to
$\sim\!900$
MHz—to characterise the integrated spectra of these sources within this frequency range. Highlights from the sample include the detection of a double relic system in Abell 3186, a mini-halo in RXC J0137.2–0912, a candidate halo and relic in Abell 3399, and a complex multi-episodic head-tail radio galaxy in Abell 3164. We compare this selection of sources and candidates to the literature sample, finding sources consistent with established radio power–cluster mass scaling relations. Finally, we use the low-frequency integrated spectral index,
$\alpha$
(
$S_\nu \propto \nu^{\alpha}$
), of the detected sample of cluster remnants and fossil sources to compare with samples of known halos, relics, remnants and fossils to investigate a possible link between their electron populations. We find the distributions of
$\alpha$
to be consistent with relic and halo emission generated by seed electrons that originated in fossil or remnant sources. However, the present sample sizes are insufficient to rule out other scenarios.
We present a broadband radio study of the transient jets ejected from the black hole candidate X-ray binary MAXI J1535–571, which underwent a prolonged outburst beginning on 2017 September 2. We monitored MAXI J1535–571 with the Murchison Widefield Array (MWA) at frequencies from 119 to 186 MHz over six epochs from 2017 September 20 to 2017 October 14. The source was quasi-simultaneously observed over the frequency range 0.84–19 GHz by UTMOST (the Upgraded Molonglo Observatory Synthesis Telescope), the Australian Square Kilometre Array Pathfinder (ASKAP), the Australia Telescope Compact Array (ATCA), and the Australian Long Baseline Array (LBA). Using the LBA observations from 2017 September 23, we measured the source size to be
$34\pm1$
mas. During the brightest radio flare on 2017 September 21, the source was detected down to 119 MHz by the MWA, and the radio spectrum indicates a turnover between 250 and 500 MHz, which is most likely due to synchrotron self-absorption (SSA). By fitting the radio spectrum with an SSA model and using the LBA size measurement, we determined various physical parameters of the jet knot (identified in ATCA data), including the jet opening angle (
$\phi_{\rm op} = 4.5\pm1.2^{\circ}$
) and the magnetic field strength (
$B_{\rm s} = 104^{+80}_{-78}$
mG). Our fitted magnetic field strength agrees reasonably well with that inferred from the standard equipartition approach, suggesting the jet knot to be close to equipartition. Our study highlights the capabilities of the Australian suite of radio telescopes to jointly probe radio jets in black hole X-ray binaries via simultaneous observations over a broad frequency range, and with differing angular resolutions. This suite allows us to determine the physical properties of X-ray binary jets. Finally, our study emphasises the potential contributions that can be made by the low-frequency part of the Square Kilometre Array (SKA-Low) in the study of black hole X-ray binaries.
The GaLactic and Extragalactic All-sky Murchison Widefield Array (GLEAM) is a radio continuum survey at 76–227 MHz of the entire southern sky (Declination
$<\!{+}30^{\circ}$
) with an angular resolution of
${\approx}2$
arcmin. In this paper, we combine GLEAM data with optical spectroscopy from the 6dF Galaxy Survey to construct a sample of 1 590 local (median
$z \approx 0.064$
) radio sources with
$S_{200\,\mathrm{MHz}} > 55$
mJy across an area of
${\approx}16\,700\,\mathrm{deg}^{2}$
. From the optical spectra, we identify the dominant physical process responsible for the radio emission from each galaxy: 73% are fuelled by an active galactic nucleus (AGN) and 27% by star formation. We present the local radio luminosity function for AGN and star-forming (SF) galaxies at 200 MHz and characterise the typical radio spectra of these two populations between 76 MHz and
${\sim}1$
GHz. For the AGN, the median spectral index between 200 MHz and
${\sim}1$
GHz,
$\alpha_{\mathrm{high}}$
, is
$-0.600 \pm 0.010$
(where
$S \propto \nu^{\alpha}$
) and the median spectral index within the GLEAM band,
$\alpha_{\mathrm{low}}$
, is
$-0.704 \pm 0.011$
. For the SF galaxies, the median value of
$\alpha_{\mathrm{high}}$
is
$-0.650 \pm 0.010$
and the median value of
$\alpha_{\mathrm{low}}$
is
$-0.596 \pm 0.015$
. Among the AGN population, flat-spectrum sources are more common at lower radio luminosity, suggesting the existence of a significant population of weak radio AGN that remain core-dominated even at low frequencies. However, around 4% of local radio AGN have ultra-steep radio spectra at low frequencies (
$\alpha_{\mathrm{low}} < -1.2$
). These ultra-steep-spectrum sources span a wide range in radio luminosity, and further work is needed to clarify their nature.
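The spectral indices quoted above follow from the power-law convention $S \propto \nu^{\alpha}$; given flux densities at two frequencies, $\alpha$ is just a ratio of logarithms. A minimal sketch (the example source below is hypothetical, not drawn from the catalogue):

```python
import math

# Two-point spectral index for the convention S propto nu^alpha:
# alpha = ln(S1/S2) / ln(nu1/nu2)
def spectral_index(s1_jy, nu1_hz, s2_jy, nu2_hz):
    return math.log(s1_jy / s2_jy) / math.log(nu1_hz / nu2_hz)

# Hypothetical source: 120 mJy at 76 MHz falling to 55 mJy at 227 MHz.
alpha = spectral_index(0.120, 76e6, 0.055, 227e6)
print(f"alpha = {alpha:.2f}")
```

A value near $-0.7$ for this toy source is typical of the steep synchrotron spectra that dominate low-frequency samples; flat-spectrum ($\alpha \approx 0$) and ultra-steep ($\alpha_{\mathrm{low}} < -1.2$) sources stand out from this bulk.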
As described in Chapter 2, after Congress failed to pass climate change legislation in 2009, the Obama administration EPA moved very quickly to find that GHG emissions from autos and light trucks were reasonably likely to endanger human health or welfare. This – the Endangerment Finding – then provided the legal basis for a number of subsequent regulations promulgated under the Clean Air Act. Most directly, the EPA first imposed GHG emission limits for cars (using the same section of the CAA under which the Endangerment Finding had been made). In so doing, the EPA arrogated to itself the job of setting mileage standards for cars and light trucks, a job that Congress had explicitly given to a very different agency, the National Highway Traffic Safety Administration, with a very different historical mission. Moreover, the automobile mileage standards set under the Obama administration’s so-called tailpipe rule were essentially the standards that California and some northeastern states had imposed on themselves.
Long before the Obama administration EPA’s Endangerment Finding, early IPCC assessment reports were a call to action for American tort lawyers. After all, the IPCC ARs attributed all sorts of human and environmental harms to climate change, and confidently linked climate change to anthropogenic CO2 emissions. These reports thus established a causal connection between CO2 emissions and harm. Such a causal connection held out the prospect of holding CO2 emitters legally liable under American tort law for harm from climate change.
The EPA’s Endangerment Finding relied entirely upon IPCC Assessment Reports (and, to a lesser extent, USGCRP Assessments) as supplying more than enough evidence that rising atmospheric CO2 concentrations have caused changes in various measures of climate and that without steps to reduce anthropogenic CO2, climate changes will get even worse in the future. Having already explained how both the IPCC and USGCRP have developed into science advocacy organizations, rather than assessment institutions, in this chapter I begin my brief critical analysis of the output produced by the IPCC. I focus on the IPCC because its reports are the primary basis not only for the Endangerment Finding but for precautionary US climate policy more generally.
As known not only by lawyers, judges, and regulators but also businesspeople, scientists and pretty much anybody who has ever had to make a decision on the basis of technical assessments of the relevant world, in any tough decision of this sort, there is conflicting evidence and opinion. Some things are known with certainty. We know for certain, for example, that the sun will rise in the east. But most important decisions are taken under conditions of uncertainty. A farmer deciding when to plant his fields with corn should consider evidence about the likelihood of a late, crop-killing frost; in deciding how to design a new product for market, a prudent business will get evidence about likely consumer demand for alternative designs at varying prices; a court deciding whether or not the defendant committed a crime will hear not only the prosecution’s evidence but also the defendant’s. Even after hearing evidence on both sides, a decision maker will not be certain about the state of the world relevant to her decision.
In this part of the book, I take a critical look at the beliefs about the state of scientific knowledge about climate change and its economic impacts that are commonly taken to support the precautionary policies explicated in Part I. Such an inquiry is necessary because a balanced and rational climate change policy must be based on both what is actually known and what is unknown about climate change and its impacts.
A very typical story leading to domestic environmental legislation in countries such as the United States is one in which people in a number of localities within the country begin to experience harm from a particular kind of pollution and then legislators from those places take up the cause and ask scientists to identify which pollutant or pollutants are causing that harm so that legislation can be passed to curb the harmful pollution. As discussed earlier, precisely such a process led to the passage of the Clean Air Act of 1970. One might suppose that a similar pattern occurred on the international level with global warming: some countries might have been experiencing harmful warming, leading them to ask for international scientific cooperation to identify the cause of the warming. Such a pattern – a problem is identified, and scientists are asked to figure out what is causing it and what might be done to eliminate it – is what might be called the standard model of how science relates to policy.
As long as our species has been on planet Earth, humans have had to adapt to their external physical environment. During the cold upper Paleolithic era – which runs from about 40,000 to 10,000 years ago – Neanderthals died out, and most remaining Homo sapiens were hunter-gatherers. They lived in widely dispersed bands that followed herds of animals such as reindeer. Such hunter-gatherer Paleolithic life was precarious, subject to cycles in the population of prey animals. Still, even during these cold and grim Paleolithic times, recent research has found evidence that in favorable locations, some human groups adopted agriculture and expanded rapidly after the Last Glacial Maximum (about 25,000 years ago) or perhaps even earlier, 60,000–80,000 years ago. Moreover, even during this, the Stone Age, humans colonized new territories, such as Australia and the Americas, and by the end of the Paleolithic had domesticated dogs.
As we have seen, the CPP represented a decision by the EPA that the EPA should assume the job of transforming the energy basis of the US electric power industry away from fossil fuels and toward renewables. Congress did not give the EPA this task. To the contrary, as we have seen, for half a century, since the creation of the EPA in 1970, the EPA’s role in regulating the electric power generation industry has been limited to imposing pollution reduction requirements. On several occasions, however, Congress has passed laws directly mandating the use of particular fuels, both by power plants and by automobiles. The history of these laws displays two patterns: the EPA has never been given a primary role, and typically was given no role, in implementing them; more importantly, such laws have virtually always had perverse effects, causing environmental harm rather than averting it.
In the Summary for Policymakers to its AR5, the IPCC makes a number of unqualified statements attributing observed climate change to human GHG emissions.
In early 2008, then-presidential candidate Barack Obama described his climate change regulatory agenda by saying that “if somebody wants to build a coal-fired [electricity generating] plant they can. It’s just that it will bankrupt them.” After his election, President Obama made good on his promise.