The Astrophysics Telescope for Large Area Spectroscopy (ATLAS) Probe is a concept for a National Aeronautics and Space Administration (NASA) probe-class space mission that will achieve ground-breaking science in the fields of galaxy evolution, cosmology, the Milky Way, and the Solar System. It is the follow-up space mission to the Wide Field Infrared Survey Telescope (WFIRST), boosting its scientific return by obtaining deep 1–4 μm slit spectroscopy for ∼70% of all galaxies imaged by the ∼2 000 deg2 WFIRST High Latitude Survey at z > 0.5. ATLAS Probe will measure accurate and precise redshifts for ∼200 M galaxies out to z < 7, and deliver spectra that enable a wide range of diagnostic studies of the physical properties of galaxies over most of cosmic history. ATLAS Probe and WFIRST together will produce a 3D map of the Universe over 2 000 deg2, the definitive data sets for studying galaxy evolution, probing dark matter, dark energy, and modifications of General Relativity, and quantifying the 3D structure and stellar content of the Milky Way. ATLAS Probe science spans four broad categories: (1) revolutionising galaxy evolution studies by tracing the relation between galaxies and dark matter from galaxy groups to cosmic voids and filaments, from the epoch of reionisation through the peak era of galaxy assembly; (2) opening a new window into the dark Universe by weighing the dark matter filaments using 3D weak lensing with spectroscopic redshifts, and obtaining definitive measurements of dark energy and modifications of General Relativity using galaxy clustering; (3) probing the Milky Way’s dust-enshrouded regions, reaching the far side of our Galaxy; and (4) exploring the formation history of the outer Solar System by characterising Kuiper Belt Objects.
ATLAS Probe is a 1.5 m telescope with a field of view of 0.4 deg2 that uses digital micro-mirror devices (DMDs) as slit selectors, with a spectroscopic resolution of R = 1 000 and a wavelength range of 1–4 μm. The lack of wide-field slit spectroscopy from space is the obvious gap in current and planned space missions; ATLAS Probe fills this gap with an unprecedented spectroscopic capability based on DMDs (with an estimated spectroscopic multiplex factor greater than 5 000). ATLAS Probe is designed to fit within the NASA probe-class space mission cost envelope: it has a single instrument, a telescope aperture that allows for a lighter launch vehicle, and mature technology (we have identified a path for DMDs to reach Technology Readiness Level 6 within 2 yr). ATLAS Probe will lead to transformative science over the entire range of astrophysics: from galaxy evolution to the dark Universe, from Solar System objects to the dusty regions of the Milky Way.
The present communication demonstrates that even if individuals are answering a pre/post survey at random, the percentage of individuals showing improvement from the pre- to the post-survey can be surprisingly high. Some simple formulas and tables are presented that will allow analysts to quickly determine the expected percentage of individuals showing improvement if participants just answered the survey at random. This benchmark percentage, in turn, defines the appropriate null hypothesis for testing if the actual percentage observed is greater than the expected random answering percentage.
The analysis is demonstrated by testing if actual improvement in a component of the US Department of Agriculture’s (USDA) Expanded Food and Nutrition Education Program is significantly different from random answering improvement.
From 2011 to 2014, 364320 adults completed a standardized pre- and post-survey administered by the USDA.
For each year, the hypothesis that the actual number of improvements is less than the number expected under purely random answering cannot be rejected. This does not mean that the pre-/post-test survey instrument is flawed, only that the data are being inappropriately evaluated.
Knowing the percentage of individuals showing improvement on a pre/post survey instrument when questions are randomly answered is an important benchmark number to determine in order to draw valid inferences about nutrition interventions. The results presented here should help analysts in determining this benchmark number for some common survey structures and avoid drawing faulty inferences about the effectiveness of an intervention.
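The benchmark the communication describes can be illustrated with a short exact calculation. As a hedged sketch (the survey structure here, ten independent binary items scored 0/1, is hypothetical and not taken from the paper), the probability that a purely random respondent scores higher on the post-survey than on the pre-survey is:

```python
from math import comb

def p_improve(n_items: int, p: float = 0.5) -> float:
    """Probability that the post-survey score exceeds the pre-survey
    score when every one of n_items binary questions is answered at
    random (probability p of a 'favourable' answer) on both surveys."""
    # Score distribution under random answering: Binomial(n_items, p)
    pmf = [comb(n_items, k) * p**k * (1 - p)**(n_items - k)
           for k in range(n_items + 1)]
    # Pre and post scores are independent; sum over post > pre
    return sum(pmf[i] * pmf[j]
               for i in range(n_items + 1)
               for j in range(i + 1, n_items + 1))

print(f"{p_improve(10):.3f}")  # ≈ 0.412
```

For ten random binary items this probability is roughly 0.41, i.e. about 41% of purely random respondents would appear to "improve", which is exactly the kind of surprisingly high benchmark percentage that defines the appropriate null hypothesis.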
The concept of information has penetrated almost all areas of human inquiry, from physics, chemistry, and engineering through biology to the social sciences. And yet its status as a physical entity remains obscure. Traditionally, information has been treated as a derived or secondary concept. In physics especially, the fundamental bedrock of reality is normally vested in the material building blocks of the universe, be they particles, strings, or fields. Because bits of information are always instantiated in material degrees of freedom, the properties of information could, it seems, always be reduced to those of the material substrate. Nevertheless, over several decades there have been attempts to invert this interdependence and root reality in information rather than matter. This contrarian perspective is most famously associated with the name of John Archibald Wheeler, who encapsulated his proposal in the pithy dictum ‘it from bit?’ (Wheeler, 1999).
In a practical, everyday sense, information is often treated as a primary entity, as a ‘thing in its own right’ with a measure of autonomy; indeed, it is bought and sold as a commodity alongside gas and steel. In the life sciences, informational narratives are indispensable: biologists talk about the genetic code, about translation and transcription, about chemical signals and sensory data processing, all of which treat information as the currency of activity, the ‘oil’ that makes the ‘biological wheels go round’. The burgeoning fields of genomic and metagenomic sequencing and bioinformatics are based on the notion that informational bits are literally vital. But beneath this familiar practicality lies a stark paradox. If information makes a difference in the physical world, which it surely does, then should we not attribute to it causal powers? However, in physics causation is invariably understood at the level of particle and field interactions, not in the realm of abstract bits (or qubits, their quantum counterparts). Can we have both? Can two causal chains coexist compatibly? Are the twin narratives of material causation and informational causation comfortable bedfellows? If so, what are the laws and principles governing informational dynamics to place alongside the laws of material dynamics?
Recent advances suggest that the concept of information might hold the key to unravelling the mystery of life's nature and origin. Fresh insights from a broad and authoritative range of articulate and respected experts focus on the transition from matter to life, and hence reconcile the deep conceptual schism between the way we describe physical and biological systems. A unique cross-disciplinary perspective, drawing on expertise from philosophy, biology, chemistry, physics, and cognitive and social sciences, provides a new way to look at the deepest questions of our existence. This book addresses the role of information in life, and how it can make a difference to what we know about the world. Students, researchers, and all those interested in what life is and how it began will gain insights into the nature of life and its origins that touch on nearly every domain of science.
An important debate in the literature is whether or not higher energy-dense foods are cheaper than less energy-dense foods. The present communication develops and applies an easy statistical test to determine if the relationship between food price and energy density is an artifact of how the data units are constructed (i.e. is it ‘spurious’ or ‘real’?).
After matching data on 4430 different foods from the National Health and Nutrition Examination Survey with corresponding prices from the Center for Nutrition Policy and Promotion’s Food Prices Database, we use a simple regression model to test if the relationship between food price and energy density is ‘real’ or ‘spurious’.
Total sample size is 4430 observations of consumed foods from 4578 participants from the non-institutionalized US adult population (aged 19 years and over).
Over all 4430 foods, the null hypothesis of a spurious inverse relationship between food price per energy density and energy density is not rejected. When the analysis is broken down by twenty-five food groups, there are only two cases where the inverse relationship is not spurious. In fact, the majority of non-spurious relationships between food price and energy density are positive, not negative.
One of the main arguments put forth regarding the poor diet quality of low-income households is that high energy-dense food is cheaper than lower energy-dense food. We find almost no statistical support for higher energy-dense food being cheaper than low energy-dense food. While economics certainly plays a role in explaining low nutritional quality, more sophisticated economic arguments are required and discussed.
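The mechanical origin of such a spurious relationship can be illustrated with a small simulation (hypothetical data, not the NHANES/CNPP sample used in the paper): when price and energy density are generated independently, regressing price per unit of energy on energy density still yields a negative slope, purely because energy density appears in the denominator of the dependent variable.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 4000

# Hypothetical food data: price and energy density drawn
# independently, so there is no true price-energy relationship.
energy_density = rng.uniform(0.5, 9.0, n)           # kcal per gram
price = rng.lognormal(mean=0.0, sigma=0.5, size=n)  # $ per 100 g

# Regress price-per-calorie on energy density. The shared
# energy-density term in the dependent variable alone is enough
# to induce a negative, i.e. spurious, slope.
y = price / energy_density
slope, intercept = np.polyfit(energy_density, y, 1)
print(f"slope: {slope:.3f}")  # negative despite built-in independence
```

A test of the kind the communication develops asks whether an observed inverse relationship exceeds what this ratio construction alone would produce.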
A number of copy number variants (CNVs) have been suggested as susceptibility factors for schizophrenia. For some of these the data remain equivocal, and the frequency in individuals with schizophrenia is uncertain.
To determine the contribution of CNVs at 15 schizophrenia-associated loci (a) using a large new data-set of patients with schizophrenia (n = 6882) and controls (n = 6316), and (b) combining our results with those from previous studies.
We used Illumina microarrays to analyse our data. Analyses were restricted to 520 766 probes common to all arrays used in the different data-sets.
We found higher rates in participants with schizophrenia than in controls for 13 of the 15 previously implicated CNVs. Six were nominally significantly associated (P < 0.05) in this new data-set: deletions at 1q21.1, NRXN1, 15q11.2 and 22q11.2, and duplications at 16p11.2 and the Angelman/Prader–Willi Syndrome (AS/PWS) region. All eight AS/PWS duplications in patients were of maternal origin. When combined with published data, 11 of the 15 loci showed highly significant evidence for association with schizophrenia.
We strengthen the support for the majority of the previously implicated CNVs in schizophrenia. About 2.5% of patients with schizophrenia and 0.9% of controls carry a large, detectable CNV at one of these loci. Routine CNV screening may be clinically appropriate given the high rate of known deleterious mutations in the disorder and the comorbidity associated with these heritable mutations.
The effectiveness of the Expanded Food and Nutrition Education Program in achieving its goals at the national, regional, and state levels is unknown. Using US Department of Agriculture (USDA) data from all states and territories for the years 2000–2006, the impacts of program and participant characteristics and of returns to scale on the three outcome indicators used by the USDA are estimated. Program and participant characteristics do not seem to be as important as the amount of money spent on the program. Generally speaking, there are constant and increasing returns to scale for two of the three federal outcome indices for most states, but not all.