Correlated light and electron microscopy (CLEM) has become a popular technique for combining the protein-specific labeling of fluorescence with electron microscopy, both at room and cryogenic temperatures. Fluorescence applications at cryo-temperatures have typically been limited to localization of tagged protein oligomers due to known issues of extended triplet state duration, spectral shifts, and reduced photon capture through cryo-CLEM objectives. Here, we consider fluorophore characteristics and behaviors that could enable more extended applications. We describe how the dialkylcarbocyanine DiD, and its autoquenching by resonant energy transfer (RET), can be used to distinguish the fusion state of a lipid bilayer at cryo-temperatures. By adapting an established fusion assay to work under cryo-CLEM conditions, we identified areas of fusion between influenza virus-like particles and fluorescently labeled lipid vesicles on a cryo-EM grid. This result demonstrates that cryo-CLEM can be used to localize functions in addition to tagged proteins, and that fluorescence autoquenching by RET can be incorporated successfully into cryo-CLEM approaches. In the case of membrane fusion applications, this method provides both an orthogonal confirmation of functional state independent of the morphological description from cryo-EM and a way to bridge room-temperature kinetic assays and the cryo-EM images.
Objectives: Obstructive sleep apnea (OSA) is associated with cognitive impairment, but the relationships between specific biomarkers and neurocognitive domains remain unclear. The present study examined the influence of common health comorbidities on these relationships. Adults with suspected OSA (N=60; 53% male; M age=52 years; SD=14) underwent neuropsychological evaluation before baseline polysomnography (PSG). Apneic syndrome severity, hypoxic strain, and sleep architecture disturbance were assessed through PSG. Methods: Depression (Center for Epidemiological Studies Depression Scale, CESD), pain, and medical comorbidity (Charlson Comorbidity Index) were measured via questionnaires. Processing speed, attention, vigilance, memory, executive functioning, and motor dexterity were evaluated with cognitive testing. A winnowing approach identified 9 potential moderation models, each comprising a correlated PSG variable, a comorbid health factor, and a cognitive performance measure. Results: Regression analyses identified one significant moderation model: average blood oxygen saturation (AVO2) and depression predicting recall memory, accounting for 31% of the performance variance, p<.001. Depression was a significant predictor of recall memory, p<.001, but AVO2 was not. The interaction between depression and AVO2 was significant, accounting for an additional 10% of the variance, p<.001. The relationship between low AVO2 and low recall memory performance emerged when depression severity ratings approached a previously established clinical cutoff score (CESD=16). Conclusions: This study examined relationships between sleep biomarkers and specific neurocognitive functions among individuals with suspected OSA. Findings revealed that depression burden uniquely influences this pathophysiological relationship, which may aid clinical management. (JINS, 2018, 28, 864–875)
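The moderation model described above (an AVO2 × depression interaction predicting recall memory) can be sketched with an ordinary-least-squares regression on simulated data; the variable names, effect sizes, and noise level below are illustrative assumptions, not the study's data:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 60  # sample size matching the study's N

# Hypothetical standardized predictors (simulated, not the study's data):
# avo2 = average blood oxygen saturation, cesd = depression score
avo2 = rng.normal(size=n)
cesd = rng.normal(size=n)

# Simulate recall memory with a negative depression x AVO2 interaction
recall = 0.5 * cesd - 0.1 * avo2 - 0.4 * (avo2 * cesd) + rng.normal(scale=0.5, size=n)

# Moderation model: recall ~ 1 + avo2 + cesd + avo2:cesd
X = np.column_stack([np.ones(n), avo2, cesd, avo2 * cesd])
beta, *_ = np.linalg.lstsq(X, recall, rcond=None)

# R^2 of the full model; comparing against a model without the
# interaction column would quantify the variance it adds
pred = X @ beta
r2_full = 1 - np.sum((recall - pred) ** 2) / np.sum((recall - recall.mean()) ** 2)
```

Probing the interaction at different depression levels (e.g. at the CESD=16 cutoff) is how the conditional AVO2 effect reported in the abstract would be examined.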
To investigate the effectiveness and usability of automated procedural guidance during virtual temporal bone surgery.
Two randomised controlled trials were performed to evaluate the effectiveness, for medical students, of two presentation modalities of automated real-time procedural guidance in virtual reality simulation: full and step-by-step visual presentation of drillable areas. Presentation modality effectiveness was determined through a comparison of participants’ dissection quality, evaluated by a blinded otologist, using a validated assessment scale.
While the provision of automated procedural guidance improved performance (full presentation, p = 0.03; step-by-step presentation, p < 0.001), usage of the two different presentation modalities was vastly different (full presentation, 3.73 per cent; step-by-step presentation, 60.40 per cent).
Automated procedural guidance in virtual temporal bone surgery is effective in improving trainee performance. Step-by-step presentation of procedural guidance was engaging, and therefore more likely to be used by the participants.
Considerable numbers of exceptionally preserved conodont apparatuses with hyaline elements are present in the middle-upper Darriwilian (Middle Ordovician, Whiterockian) Winneshiek Konservat-Lagerstätte in northeastern Iowa. These fossils, which are associated with a restricted biota including other conodonts, occur in fine-grained clastic sediments deposited in a meteorite impact crater. Among these conodont apparatuses, the common ones are identified as Archeognathus primus Cullison, 1938 and Iowagnathus grandis new genus new species. The 6-element apparatus of A. primus comprises two pairs of archeognathiform (P) and one pair of coleodiform (S) elements. The 15-element apparatus of I. grandis n. gen. n. sp. is somewhat reminiscent of the prioniodinid type and contains ramiform elements of alate (one element) and digyrate, bipennate, or tertiopedate types (7 pairs). Both conodont taxa are characterized by giant elements and the preservation of both crowns and basal bodies, the latter not previously reported in Ordovician conodont apparatuses. Comparison of the apparatus size in the Winneshiek specimens with that of the Scottish Carboniferous soft-part-preserved conodont animals suggests that the Iowa animals were significantly larger than the latter. The apparatus of A. primus differs conspicuously from the apparatuses of the prioniodontid Promissum from the Upper Ordovician Soom Shale of South Africa although the apparatus architecture of I. grandis n. gen. n. sp. shows some similarity to it. Based on the Winneshiek collections, a new family Iowagnathidae in Conodonta is proposed.
Machine learning is a lively academic discipline and a key player in the continuous pursuit of new technological developments. The editorial in the first issue of the journal Machine Learning, published in March 1986, described the discipline as that field of inquiry concerned with the processes by which intelligent systems improve their performance over time (Langley, 1986). A glossary of terms published in the same journal in 1998 refined this to: Machine Learning is the field of scientific study that concentrates on induction algorithms and on other algorithms that can be said to ‘learn’ (Kohavi and Provost, 1998).
Thomas J. Watson, the brilliant salesman who from 1914 to 1956 oversaw the remarkable growth and success of IBM, serving as both CEO and chairman, was famously quoted as saying in 1943, ‘I think there is a world market for maybe five computers.’ With more than one billion computers now in use worldwide (Virki, 2008), this quote is often referenced to illustrate how vastly their usefulness had been underestimated. No area of computer science is making progress more rapidly than machine learning, with computers being capable of tasks that were a few decades ago only mentioned in science-fiction stories. Watson brought to IBM from his previous employment his trademark motto ‘Think’. It would at the time have been reasonable for Watson to suppose that only humans could really ‘think’. While computers could surpass humans in adding, subtracting, multiplying, and dividing, they were hardly thought of as being good at human tasks, such as playing chess, which required thinking. This raises the question, ‘What is thinking?’ In February 1996 World Chess Champion Garry Kasparov took on the IBM computer Deep Blue in Philadelphia. Even with the IBM engineers allowed to reprogram the computer between games, the world champion won, but only just, losing one game, drawing two, and winning three. His victory was short-lived. The following year he played a rematch. With the score even after the first five of six games, Kasparov allowed Deep Blue to commit a knight sacrifice, which wrecked his Caro–Kann defence and forced him to resign in fewer than twenty moves.
A number of laser facilities coming online all over the world promise the capability of high-power laser experiments with shot repetition rates between 1 and 10 Hz. Target availability and technical issues related to the interaction environment could become a bottleneck for the exploitation of such facilities. In this paper, we report on target needs for three different classes of experiments: dynamic compression physics, electron transport and isochoric heating, and laser-driven particle and radiation sources. We also review some of the most challenging issues in target fabrication and high repetition rate operation. Finally, we discuss current target supply strategies and future perspectives to establish a sustainable target provision infrastructure for advanced laser facilities.
Carnivores are valued by conservationists globally but protecting them can impose direct costs on rural, livestock-dependent communities. Financial incentives are increasingly used with the goal of increasing people's tolerance of predators, but the definition of tolerance has been vague and inconsistent. Empirical correlations between attitudinal and behavioural measures of tolerance imply that attitudes may be a valid proxy for behaviours. However, theoretical differences between the concepts suggest that attitudinal tolerance and behavioural intention to kill cats would have different underlying determinants. We surveyed 112 residents within a forest–farm mosaic in northern Belize inhabited by jaguars Panthera onca and four other species of wild cats. A conservation payment programme pays local landowners when camera traps record cat presence on their land. Results indicated that tolerance was associated with gender and participation in the camera-trapping programme, whereas intention to kill cats was associated with cultural group (Mennonites vs Mestizos), presence of children in the home and, to a lesser extent, tolerance. Neither dependent variable was significantly related to depredation losses or economic factors. Results suggest that monetary payments alone are unlikely to affect attitudes and behaviours towards carnivores. Payment programmes may be enhanced by accentuating non-monetary incentives, leveraging social norms and targeting specific groups with information about risks and benefits associated with carnivores. By empirically separating two concepts commonly conflated as ‘tolerance’ we clarify understanding of how social forces interact with financial incentives to shape people's relationships with predators.
A single specimen of a new species of the chasmataspidid Diploaspis Størmer, 1972 is described from the upper Silurian (Pridoli) Phelps Member of the Fiddlers Green Formation (Bertie Group) in Herkimer County, New York State, USA. Diploaspis praecursor sp. nov. is distinguished by the shape of the posterolateral margins of the buckler, which are drawn out into angular epimera, and by the lack of elongate tubercles on the postabdomen. This discovery increases the taxonomic diversity of the Bertie Group and extends the geographic range of Diploaspididae into North America. D. praecursor pre-dates previously known species of Diploaspis by more than 10 million years.
We have compiled a catalogue of H ii regions detected with the Murchison Widefield Array between 72 and 231 MHz. The multiple frequency bands provided by the Murchison Widefield Array allow us to identify the characteristic spectrum generated by the thermal Bremsstrahlung process in H ii regions. We detect 306 H ii regions in the Galactic longitude range 260° < l < 340° and report on the positions, sizes, peak and integrated flux densities, and spectral indices of these H ii regions. By identifying the point at which H ii regions transition from the optically thin to the optically thick regime, we derive physical properties, including the electron density, ionised gas mass, and ionising photon flux, for 61 H ii regions. This catalogue of H ii regions represents the most extensive and uniform low-frequency survey of H ii regions in the Galaxy to date.
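The optically thin/thick transition exploited above can be illustrated with a toy thermal bremsstrahlung spectrum: below the turnover frequency the spectral index approaches +2 (optically thick), while above it the spectrum flattens towards the thin-regime value of about -0.1. The turnover frequency, flux scale, and band centres below are made-up illustrative values, not catalogue numbers:

```python
import numpy as np

def spectral_index(s1, s2, nu1, nu2):
    """Spectral index alpha, with S proportional to nu**alpha, from two flux points."""
    return np.log(s1 / s2) / np.log(nu1 / nu2)

def thermal_flux(nu, s0, nu_t):
    """Toy free-free spectrum: blackbody-like rise modulated by optical depth.

    tau scales as nu**-2.1 for thermal bremsstrahlung, so emission is
    optically thick (tau >> 1) below the turnover nu_t and thin above it.
    """
    tau = (nu / nu_t) ** -2.1
    return s0 * (nu / nu_t) ** 2 * (1 - np.exp(-tau))

nus = np.array([76.0, 150.0, 227.0])   # MHz, representative low-frequency bands
flux = thermal_flux(nus, 10.0, 120.0)  # assumed flux scale and 120 MHz turnover

# The measured index steepens below the turnover and flattens above it
alpha_low = spectral_index(flux[0], flux[1], nus[0], nus[1])
alpha_high = spectral_index(flux[1], flux[2], nus[1], nus[2])
```

Fitting the turnover frequency across many bands is what lets physical properties such as electron density be recovered, since the turnover encodes the emission measure along the line of sight.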
Spectropolarimetric observations of the 10 arc second region surrounding Eta Carinae have been made in the 8-13 μm wavelength band. The observed polarization, of a few percent, is due to emission of radiation from aligned grains. The radial position angle of the polarization suggests that gas streaming is responsible for the grain alignment.
We compare first-order (refractive) ionospheric effects seen by the MWA with the ionosphere as inferred from GPS data. The first-order ionosphere manifests itself as a bulk position shift of the observed sources across an MWA field of view. These effects can be computed from global ionosphere maps provided by GPS analysis centres such as CODE (the Centre for Orbit Determination in Europe). However, for precision radio astronomy applications, data from local GPS networks need to be incorporated into ionospheric modelling. For GPS observations, the ionospheric parameters are biased by, among other effects, GPS receiver instrument delays, known as receiver differential code biases (DCBs). The receiver DCBs need to be estimated for any non-CODE GPS station used for ionosphere modelling. In this work, single GPS station-based ionospheric modelling is performed at a time resolution of 10 min, and the receiver DCBs are estimated for selected Geoscience Australia GPS receivers located at the Murchison Radio-astronomy Observatory, Yarragadee, Mount Magnet and Wiluna. The ionospheric gradients estimated from GPS are compared with those inferred from the MWA. The ionospheric gradients at all the GPS stations show a correlation with the gradients observed with the MWA. The ionosphere estimates obtained using GPS measurements show promise as calibration information for the MWA.
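The first-order (refractive) position shift follows from the standard dispersive excess-path relation L = 40.3·TEC/f² (metres, with TEC in electrons/m² and f in Hz): the apparent shift is the transverse gradient of L, so it scales as f⁻². The gradient magnitude used below is an illustrative assumption, not a value from the paper:

```python
import math

TEC_UNIT = 1e16  # electrons per m^2 in one TEC unit (TECU)

def refractive_shift_arcsec(freq_hz, tec_gradient_tecu_per_km):
    """First-order ionospheric position shift for a transverse TEC gradient.

    Excess path L = 40.3 * TEC / f^2 (metres); the bulk apparent
    position shift is the transverse gradient of L, in radians.
    """
    grad_si = tec_gradient_tecu_per_km * TEC_UNIT / 1e3  # electrons / m^2 / m
    shift_rad = 40.3 * grad_si / freq_hz**2
    return math.degrees(shift_rad) * 3600

# An assumed 0.01 TECU/km gradient at 150 MHz shifts sources by tens of
# arcseconds; the f^-2 scaling makes the effect 4x weaker at 300 MHz.
shift_150 = refractive_shift_arcsec(150e6, 0.01)
shift_300 = refractive_shift_arcsec(300e6, 0.01)
```

This f⁻² dependence is why low-frequency instruments such as the MWA are far more sensitive to ionospheric gradients than higher-frequency telescopes, and why GPS-derived gradient estimates are attractive as calibration input.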
GLEAM, the GaLactic and Extragalactic All-sky MWA survey, is a survey of the entire radio sky south of declination +25° at frequencies between 72 and 231 MHz, made with the MWA using a drift scan method that makes efficient use of the MWA’s very large field-of-view. We present the observation details, imaging strategies, and theoretical sensitivity for GLEAM. The survey ran for two years, the first year using 40-kHz frequency resolution and 0.5-s time resolution; the second year using 10-kHz frequency resolution and 2-s time resolution. The resulting image resolution and sensitivity depend on observing frequency, sky pointing, and image weighting scheme. At 154 MHz, the image resolution is approximately 2.5 × 2.2/cos(δ + 26.7°) arcmin with sensitivity to structures up to ~10° in angular size. We provide tables to calculate the expected thermal noise for GLEAM mosaics depending on pointing and frequency and discuss limitations to achieving theoretical noise in Stokes I images. We discuss challenges, and their solutions, that arise for GLEAM including ionospheric effects on source positions and linearly polarised emission, and the instrumental polarisation effects inherent to the MWA’s primary beam.
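The declination dependence of the 154 MHz resolution quoted above can be made concrete with a small helper (the function name is our own); the minor axis stretches as the pointing moves away from the MWA's zenith declination of -26.7°:

```python
import math

def gleam_beam_arcmin(dec_deg):
    """Approximate GLEAM synthesized beam (major, minor axes in arcmin) at 154 MHz.

    Implements the abstract's 2.5 x 2.2/cos(dec + 26.7 deg) arcmin scaling;
    the beam is smallest at the MWA zenith declination of -26.7 deg and
    elongates towards the survey's +25 deg declination limit.
    """
    return 2.5, 2.2 / math.cos(math.radians(dec_deg + 26.7))

beam_zenith = gleam_beam_arcmin(-26.7)  # (2.5, 2.2) arcmin
beam_north = gleam_beam_arcmin(25.0)    # minor axis stretched to ~3.5 arcmin
```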
The Murchison Widefield Array (MWA) is a Square Kilometre Array Precursor. The telescope is located at the Murchison Radio-astronomy Observatory (MRO) in Western Australia. The MWA consists of 4096 dipoles arranged into 128 dual-polarisation aperture arrays forming a connected element interferometer that cross-correlates signals from all 256 inputs. A hybrid approach to the correlation task is employed, with some processing stages performed by bespoke hardware based on Field Programmable Gate Arrays, and others by Graphics Processing Units housed in general-purpose rack-mounted servers. The correlation capability required is approximately 8 tera floating point operations per second. The MWA has commenced operations, and the correlator is generating 8.3 TB per day of correlation products, which are subsequently transferred 700 km from the MRO to Perth, Western Australia, in real time for storage and offline processing. In this paper, we outline the correlator design, signal path, and processing elements, and present the data format for the internal and external interfaces.
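The quoted ~8 tera-operations-per-second requirement is consistent with a back-of-envelope count for the X (cross-multiply-accumulate) stage of an FX correlator, assuming the MWA's 30.72 MHz processed bandwidth (a figure not stated in this abstract):

```python
# Back-of-envelope operation count for the X stage of an FX correlator.
n_inputs = 256          # 128 dual-polarisation tiles -> 256 signal inputs
bandwidth_hz = 30.72e6  # assumed processed bandwidth (complex samples/s per input)
ops_per_cmac = 8        # complex multiply-accumulate: 4 real multiplies + 4 adds

# Every cross- and auto-correlation product must be updated each sample
n_pairs = n_inputs * (n_inputs + 1) // 2  # 32896 pairs

flops = n_pairs * bandwidth_hz * ops_per_cmac  # ~8.1e12 ops/s
```

The N(N+1)/2 scaling in the pair count is why correlation cost, not the per-input F (channelisation) stage, dominates as arrays grow, and why a hybrid FPGA/GPU split of the two stages is attractive.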
The Murchison Widefield Array is a new low-frequency interferometric radio telescope built in Western Australia at one of the locations of the future Square Kilometre Array. We describe the automated radio-frequency interference detection strategy implemented for the Murchison Widefield Array, which is based on the aoflagger platform, and present 72–231 MHz radio-frequency interference statistics from 10 observing nights. Radio-frequency interference detection removes 1.1% of the data. Radio-frequency interference from digital TV is observed 3% of the time due to occasional ionospheric or atmospheric propagation. After radio-frequency interference detection and excision, almost all data can be calibrated and imaged without further radio-frequency interference mitigation efforts, including observations within the FM and digital TV bands. The results are compared to a previously published Low-Frequency Array radio-frequency interference survey. The remote location of the Murchison Widefield Array results in a substantially cleaner radio-frequency interference environment compared to Low-Frequency Array’s radio environment, but adequate detection of radio-frequency interference is still required before data can be analysed. We include specific recommendations designed to make the Square Kilometre Array more robust to radio-frequency interference, including: the availability of sufficient computing power for radio-frequency interference detection; accounting for radio-frequency interference in the receiver design; a smooth band-pass response; and the capability of radio-frequency interference detection at high time and frequency resolution (second and kHz-scale respectively).
The science cases for incorporating high time resolution capabilities into modern radio telescopes are as numerous as they are compelling. Science targets range from exotic sources such as pulsars, to our Sun, to recently detected possible extragalactic bursts of radio emission, the so-called fast radio bursts (FRBs). Originally conceived purely as an imaging telescope, the initial design of the Murchison Widefield Array (MWA) did not include the ability to access high time and frequency resolution voltage data. However, the flexibility of the MWA’s software correlator allowed an off-the-shelf solution for adding this capability. This paper describes the system that records the 100 μs and 10 kHz resolution voltage data from the MWA. Example science applications, where this capability is critical, are presented, together with commissioning results that verify the mode.
Since Dolf Seilacher coined the term Konservat-Lagerstätten in 1970, these deposits have migrated from the margins to the mainstream of paleontological research. With greater understanding of the controls on their occurrence, new examples of exceptional preservation continue to be discovered. They provide critical data for phylogenies and stratigraphic ranges. Together with molecular data, they calibrate the history of many infrequently preserved taxa. Ostracods, tiny crustaceans with a biomineralized carapace, illustrate the importance of recent discoveries in Konservat-Lagerstätten. The rare examples with fossilized appendages are preserved in a diversity of ways, organically or through authigenic mineralization. They confirm that ostracods were present at least by the late Ordovician, provide evidence of relationships obscured by the morphology of the routinely preserved valves, and extend the stratigraphic range of particular groups. They reveal extraordinary features of the soft-tissue anatomy of ostracods, including reproductive morphology and strategy. While other taxa would provide equally compelling examples of research progress, it is clear that the concept of exceptional preservation is expanding. Future discoveries and new analytical methods promise novelties as unexpected as, for example, the reconstruction of coloration in feathered dinosaurs.
The comments of Stanford and Bradley (above) do not address our criticisms and obfuscate the topic at hand with irrelevant data (e.g. the south-to-north movement of fluted points through the Ice Free Corridor), nonexistent data (e.g. ‘under the water’ or ‘destroyed sites’), and questionable data (e.g. Meadowcroft and Cactus Hill are by no means widely accepted, nor are Stanford and Bradley's ‘eight LGM sites’ in the mid-Atlantic region). Before touching on some of these points, we direct the reader to several recent articles (e.g. Morrow 2014; Raff & Bolnick 2014) that provide new evidence or arguments inconsistent with a trans-Atlantic migration, including the fact that DNA from the Clovis Anzick child (Montana) shows no European ancestry (Rasmussen et al. 2014). Although Stanford and Bradley describe their Solutrean ‘solution’ (Stanford & Bradley 1999) to the Pleistocene colonisation of North America as ‘testable’, their position is that the idea is correct until falsified. They propose that their colleagues have yet to provide sufficient ‘critiques’ or ‘challenges’ to discount it (see also Collins 2012; Collins et al. 2013). Yet they are the ones proposing a hypothesis inconsistent with overwhelming multidisciplinary evidence, and they ignore results of tests that do not support their claims.
Across Atlantic ice: the origin of America's Clovis culture (Stanford & Bradley 2012) is the latest iteration of a controversial proposal that North America was first colonised by people from Europe rather than from East Asia, as most researchers accept. The authors, Dennis Stanford and Bruce Bradley, argue that Solutrean groups from southern France and the Iberian Peninsula used watercraft to make their way across the North Atlantic and into North America during the Last Glacial Maximum (LGM). According to Stanford and Bradley, this 6000km journey was facilitated by a continuous ice shelf that provided fresh water and a food supply. Across Atlantic ice has received a number of positive reviews. Shea (2012: 294), for example, suggests that it is “an excellent example of hypothesis-building in the best tradition of processual archaeology. It challenges American archaeology in a way that will require serious research by its opponents”. Runnels (2012) is equally enthusiastic.