This state-of-the-art account unifies material developed in journal articles over the last 35 years, with two central thrusts: It describes a broad class of system models that the authors call 'stochastic processing networks' (SPNs), which include queueing networks and bandwidth sharing networks as prominent special cases; and in that context it explains and illustrates a method for stability analysis based on fluid models. The central mathematical result is a theorem that can be paraphrased as follows: If the fluid model derived from an SPN is stable, then the SPN itself is stable. Two topics discussed in detail are (a) the derivation of fluid models by means of fluid limit analysis, and (b) stability analysis for fluid models using Lyapunov functions. With regard to applications, there are chapters devoted to max-weight and back-pressure control, proportionally fair resource allocation, data center operations, and flow management in packet networks. Geared toward researchers and graduate students in engineering and applied mathematics, especially in electrical engineering and computer science, this compact text gives readers full command of the methods.
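The fluid-model stability criterion described above can be illustrated on the simplest possible case. The sketch below is a hypothetical illustration, not taken from the book: a single-server fluid queue with arrival rate lam and service rate mu, where the linear Lyapunov function V(q) = q has drift lam − mu < 0 whenever lam < mu, so the fluid level drains to zero by time q(0)/(mu − lam) and stays there.

```python
# Hypothetical illustration (not the book's model): fluid model of a
# single-server queue, dq/dt = lam - mu while q > 0, and never negative.
# If lam < mu, the Lyapunov function V(q) = q has drift lam - mu < 0,
# so the fluid level empties by time q0 / (mu - lam) -- a stable fluid model.

def fluid_trajectory(q0, lam, mu, t_end, dt=1e-3):
    """Euler integration of the fluid model; returns the final fluid level."""
    q, t = q0, 0.0
    while t < t_end:
        rate = lam - mu if q > 0 else max(lam - mu, 0.0)
        q = max(q + rate * dt, 0.0)   # fluid level cannot go negative
        t += dt
    return q

q0, lam, mu = 5.0, 0.7, 1.0           # lam < mu: stable case
drain_time = q0 / (mu - lam)          # Lyapunov bound: 5 / 0.3 ~= 16.7
print(fluid_trajectory(q0, lam, mu, t_end=drain_time + 1.0))  # → 0.0
```

Running the same trajectory with lam > mu shows the fluid level growing without bound, the unstable case the theorem rules out when the fluid model is stable.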
Our knowledge and understanding of the structure and function of complex host-associated communities has grown exponentially in the last decade through improvements in sequencing technologies. Despite this, there are still many outstanding research questions, which will undoubtedly lead to many more. Concerted effort is required to elucidate the composition and function of taxonomic groups other than bacteria that constitute host microbiomes, and to extend our current cataloguing efforts to non-model and field-based host organisms. Further to this, we need to continue to move beyond the 'who?' question answered by relatively cheap amplicon sequencing to gain a better understanding of 'what?' the microbiome is doing, using metatranscriptomics approaches. Critically, we need to understand how members of the microbiome interact to confer function. Given the current unprecedented environmental change, microbiome plasticity may prove vital to host resilience and fitness. Furthermore, there is considerable potential for microbial biotechnology to benefit humanity in numerous ways, although care must be taken to ensure environmental and social justice prevail.
Two major outstanding questions in microbiome research ask what microbes are present in a community and how they interact with each other and their hosts. Recent, rapid improvements in nucleic acid (DNA and RNA) sequencing allow us to study the composition and function of microbiomes in unprecedented detail, leading to a step change in our understanding of host–microbe interactions. This chapter gives a broad overview of the basic toolkit available to modern microbiologists and microbial ecologists, exploring their application to key questions about microbiome structure and function. We cover tools based on nucleic acid sequencing (e.g. amplicon sequencing, metagenomics, metatranscriptomics) as well as approaches targeting larger molecules such as metabolomics and proteomics. We discuss the use of microbial culture as a means of measuring functional capacity of individual microbes, or building artificial communities to understand emergent properties of consortia. We emphasise the advantages of combining multiple techniques alongside robust experimental design to garner powerful quantitative estimates of microbiome structure, and how this relates to host–microbe interactions.
A classic example of microbiome function is its role in nutrient assimilation in both plants and animals, but other less obvious roles are becoming more apparent, particularly in driving infectious and non-infectious disease outcomes and influencing host behaviour. However, numerous biotic and abiotic factors influence the composition of these communities, and host microbiomes can be susceptible to environmental change. How microbial communities will be altered by, and mitigate, the rapid environmental change we can expect in the next few decades remains to be seen. That said, given the enormous range of functional diversity conferred by microbes, there is currently something of a revolution in microbial bioengineering and biotechnology aimed at addressing real-world problems, including human and wildlife disease and crop and biofuel production. All of these concepts are explored in further detail throughout the book.
Through a long history of co-evolution, multicellular organisms form a complex of host cells plus many associated microorganism species. Consisting of algae, bacteria, archaea, fungi, protists and viruses, and collectively referred to as the microbiome, these microorganisms contribute to a range of important functions in their hosts, from nutrition, to behaviour and disease susceptibility. In this book, a diverse and international group of active researchers outline how multicellular organisms have become reliant on their microbiomes to function, and explore this vital interdependence across the breadth of soil, plant, animal and human hosts. They draw parallels and contrasts across hosts in different environments, and discuss how this invisible microbial ecosystem influences everything from the food we eat, to our health, to the correct functioning of ecosystems we depend on. This insightful read also pertinently encourages students and researchers in microbial ecology, ecology, and microbiology to consider how this interdependence may be key to mitigating environmental changes and developing microbial biotechnology to improve life on Earth.
We present a detailed overview of the cosmological surveys that we aim to carry out with Phase 1 of the Square Kilometre Array (SKA1) and the science that they will enable. We highlight three main surveys: a medium-deep continuum weak lensing and low-redshift spectroscopic HI galaxy survey over 5 000 deg²; a wide and deep continuum galaxy and HI intensity mapping (IM) survey over 20 000 deg² from $z = 0.35$ to 3; and a deep, high-redshift HI IM survey over 100 deg² from $z = 3$ to 6. Taken together, these surveys will achieve an array of important scientific goals: measuring the equation of state of dark energy out to $z \sim 3$ with percent-level precision measurements of the cosmic expansion rate; constraining possible deviations from General Relativity on cosmological scales by measuring the growth rate of structure through multiple independent methods; mapping the structure of the Universe on the largest accessible scales, thus constraining fundamental properties such as isotropy, homogeneity, and non-Gaussianity; and measuring the HI density and bias out to $z = 6$. These surveys will also provide highly complementary clustering and weak lensing measurements whose systematic uncertainties are independent of those of optical and near-infrared (NIR) surveys such as Euclid, LSST, and WFIRST, leading to a multitude of synergies that can improve constraints significantly beyond what optical or radio surveys can achieve on their own. This document, the 2018 Red Book, provides reference technical specifications, cosmological parameter forecasts, and an overview of relevant systematic effects for the three key surveys, and will be regularly updated by the Cosmology Science Working Group in the run-up to the start of operations and the Key Science Programme of SKA1.
The Square Kilometre Array (SKA) is a planned large radio interferometer designed to operate over a wide range of frequencies, and with an order of magnitude greater sensitivity and survey speed than any current radio telescope. The SKA will address many important topics in astronomy, ranging from planet formation to distant galaxies. However, in this work, we consider the perspective of the SKA as a facility for studying physics. We review four areas in which the SKA is expected to make major contributions to our understanding of fundamental physics: cosmic dawn and reionisation; gravity and gravitational radiation; cosmology and dark energy; and dark matter and astroparticle physics. These discussions demonstrate that the SKA will be a spectacular physics machine, which will provide many new breakthroughs and novel insights on matter, energy, and spacetime.
This chapter discusses and reviews research on the relationship between two closely aligned concepts: intelligence and reasoning. We begin by defining reasoning in a general sense. Next, we review prominent theories and models of intelligence and reasoning in both the psychometric and cognitive psychological traditions, highlighting how the two constructs are both intertwined yet nonetheless conceptually discriminable. We follow by discussing issues involved in validly measuring reasoning, touching on considerations, concerns, and evidence informed by the cognitive and psychometric perspectives. Then, we review the relationship between reasoning and allied constructs and domains, including expertise, practical outcomes (e.g., educational and workplace achievement), working memory, and critical thinking. We conclude by sketching multiple avenues for future research.
The dissolution of the United Kingdom's vitrified high-level-waste simulant, CaZn MW28, was investigated following the Product Consistency Test-B protocol for 112 d at 90 °C in ultra-high-quality water. Residual-rate dissolution (stage II) and rate resumption (stage III), after 28 d, were observed. Thermodynamic modelling suggested that solutions were saturated with respect to Mg- and Zn-bearing phases, and the presence of Mg- and Zn-smectite clays was tentatively observed. The formation of these phases was concurrent with a significant increase in the dissolution rate, similar to stage III behaviour seen in other nuclear waste simulant glasses, indicating that the addition of Mg and Zn to high-level-waste glass (7.3 wt.% combined) significantly influences the dissolution rate.
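The saturation test underpinning the thermodynamic modelling mentioned above is conventionally expressed as a saturation index, SI = log10(IAP/Ksp), where IAP is the ion activity product and Ksp the solubility product of a candidate phase; SI ≥ 0 indicates saturation. The activities and Ksp below are placeholder values for illustration, not the study's speciation output.

```python
import math

# Hedged sketch of the standard saturation check: a phase is at or above
# saturation when SI = log10(IAP / Ksp) >= 0 and may then precipitate.
# The numbers below are illustrative placeholders, not the study's data.

def saturation_index(ion_activity_product, ksp):
    """SI > 0: supersaturated (phase may form); SI < 0: undersaturated."""
    return math.log10(ion_activity_product / ksp)

si = saturation_index(ion_activity_product=3.2e-17, ksp=1.0e-17)
print(round(si, 2))  # → 0.51, i.e. supersaturated with respect to this phase
```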
Technological advances have led to better patient outcomes and the expansion of clinical services in paediatric cardiology. This expansion creates an ever-growing workload for clinicians, leading to workflow and staffing issues that need to be addressed. The objective of this study was to develop a novel tool, the patient encounter index, to measure the clinical workload of a paediatric cardiology service in Cape Town, South Africa. The index is defined as the ratio of the measured duration of clinical work to the total time available for such work. It was implemented as part of a prospective cross-sectional study design, with clinical workload data collected over a 10-day period using time-and-motion sampling. Clinicians were contractually expected to spend 50% of their daily working time on patient care. The median patient encounter index for the Western Cape Paediatric Cardiac Service was 0.81 (range 0.19–1.09), reflecting that 81% of total contractual working time was spent on clinical activities. This study describes the development and implementation of a novel tool for quantifying clinical workload and its application to a busy paediatric cardiology service in Cape Town, South Africa. The tool prospectively quantifies clinical workload, which may directly influence patient outcomes. Its implementation in the described setting clearly demonstrated the excessive workload of the clinical service and supported an effective case for improved allocation of resources.
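The patient encounter index as defined above is a simple ratio, which can be sketched directly. The sample durations below are illustrative, not the study's data; only the definition (measured clinical-work time over total available time) comes from the abstract.

```python
# Minimal sketch of the patient encounter index: the ratio of measured
# clinical-work time to total available working time. Values above 1.0
# indicate clinical work spilling beyond the contracted time.
# The example durations are illustrative, not the study's observations.

def patient_encounter_index(clinical_minutes, available_minutes):
    """Return the ratio of observed clinical work to available working time."""
    if available_minutes <= 0:
        raise ValueError("available time must be positive")
    return clinical_minutes / available_minutes

# e.g. 389 minutes of observed clinical work in an 8-hour (480 min) day
index = patient_encounter_index(389, 480)
print(round(index, 2))  # → 0.81
```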
Stationary cross-flow vortex N-factors were calculated over the surface of a yawed circular cone using computationally predicted and experimentally observed wavenumber distributions. Surface heat-flux data were obtained on a
half-angle circular cone to investigate the behaviour of the stationary waves at different angles of attack and Reynolds numbers at Mach 6 under quiet-flow conditions in the Boeing/AFOSR Mach-6 Quiet Tunnel at Purdue University. A wavelet analysis was conducted on the experimental surface heat-flux data to construct a spatial mapping of the local largest amplitude wavenumbers of the stationary cross-flow waves, which were between 40 and 80 per circumference. Significant axial and azimuthal variation was observed. The results from the wavelet analysis were used to inform the stability analysis. The computed integration marching directions demonstrated very good agreement with the experimentally observed paths. N-factors were first calculated by integrating the local amplification rate corresponding to the most amplified experimental wavenumbers. The calculations were repeated based on non-dimensional computationally varying wavenumber ratios, which were dimensionalized by the experimental data. The computed N-factors showed good agreement between the two techniques. N-factors were also computed using the computationally predicted most unstable wavenumbers. The results showed decreased agreement with the other two cases, suggesting that this assumption does not properly model the cross-flow transition process.
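The N-factor calculations described above follow the standard eᴺ approach: the local spatial amplification rate is integrated along the marching path, N(s) = ∫ −αᵢ ds. A minimal sketch of that integration is shown below; the growth-rate profile is synthetic, since the paper's computed amplification rates are not reproduced here.

```python
import numpy as np

# Hedged sketch of the e^N integration: cumulative N-factor as the
# trapezoidal integral of the local amplification rate along arc length.
# The Gaussian growth-rate profile below is synthetic, for illustration only.

def n_factor(s, growth_rate):
    """Cumulative N-factor along arc length s (trapezoidal integration)."""
    s = np.asarray(s, dtype=float)
    g = np.asarray(growth_rate, dtype=float)
    dN = 0.5 * (g[1:] + g[:-1]) * np.diff(s)
    return np.concatenate(([0.0], np.cumsum(dN)))

s = np.linspace(0.0, 0.4, 401)                  # arc length along the path, m
sigma = 20.0 * np.exp(-((s - 0.2) / 0.1) ** 2)  # synthetic growth rate, 1/m
N = n_factor(s, sigma)
print(N[-1])  # total amplification at the end of the path (~3.5 here)
```

In practice the integration path would follow the computed marching directions over the cone surface, and the growth rate would come from the stability analysis at the wavenumbers selected by the wavelet analysis.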
How landscapes respond to, and evolve from, large jökulhlaups (glacial outburst floods) is poorly constrained due to limited observations and detailed monitoring. We investigate how melt of glacier ice transported and deposited by multiple jökulhlaups during the 2010 eruption of Eyjafjallajökull, Iceland, modified the volume and surface elevation of jökulhlaup deposits. Jökulhlaups generated by the eruption deposited large volumes of sediment and ice, causing significant geomorphic change in the Gígjökull proglacial basin over a 4-week period. Observation of these events enabled robust constraints on the physical properties of the floods, informing our understanding of the deposits. Using ground-based LiDAR, GPS observations and the satellite-image-derived ArcticDEMs, we quantify the post-depositional response of the 60 m-thick Gígjökull sediment package to the meltout of buried ice and other geomorphic processes. Between 2010 and 2016, the total deposit volume decreased at a rate of 0.95 × 10⁶ m³ a⁻¹, with significant surface lowering of up to 1.88 m a⁻¹. Surface lowering and volumetric loss of the deposits is attributed to three factors: (i) meltout of ice deposited by the jökulhlaups; (ii) rapid melting of the buried Gígjökull glacier snout; and (iii) incision of the proglacial meltwater system into the jökulhlaup deposits.
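The quoted annual rate is a mean volumetric change between repeat surveys. A minimal sketch of that calculation is below; the two deposit volumes are placeholder values chosen only to reproduce a rate of −0.95 × 10⁶ m³ a⁻¹, not the study's DEM-derived volumes.

```python
# Hedged sketch: mean annual volume change between two DEM-derived deposit
# volumes. The volumes here are illustrative placeholders, not survey data.

def annual_volume_change(v_start, v_end, years):
    """Mean volumetric change rate (m^3 per year) between two surveys."""
    if years <= 0:
        raise ValueError("survey interval must be positive")
    return (v_end - v_start) / years

rate = annual_volume_change(v_start=60.0e6, v_end=54.3e6, years=6)
print(f"{rate:.2e} m^3/yr")  # → -9.50e+05 m^3/yr
```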