Establishing the specific habitat requirements of forest specialists in fragmented natural habitats is vital for their conservation. We used camera-trap surveys and microhabitat-scale covariates to assess the habitat requirements and the probabilities of occupancy and detection of two terrestrial forest specialist species, the Orange Ground-thrush Geokichla gurneyi and the Lemon Dove Aplopelia larvata, during the breeding and non-breeding seasons of 2018–2019 in selected Southern Mistbelt Forests of KwaZulu-Natal and the Eastern Cape, South Africa. A series of 21-day camera-trap surveys was conducted in conjunction with surveys of microhabitat structural covariates. During the wet season, percentage cover of leaf litter, short grass, short herbs, tall herbs and saplings 0–2 m, and stem density of trees 6–10 m and 16–20 m were significant structural covariates influencing Lemon Dove occupancy. In the dry season, stem density of 2–5 m and 11–15 m trees and percentage cover of tall herbs, short herbs and 0–2 m saplings were significant covariates influencing Lemon Dove occupancy. Stem density of trees 2–5 m and 11–15 m and percentage cover of short grass and short herbs were important site covariates influencing Orange Ground-thrush occupancy in the wet season. Our study highlighted the importance of a diverse habitat structure for both forest species. A high density of tall/mature trees was an essential microhabitat covariate, particularly in providing sufficient cover and food for these ground-dwelling birds. Avian forest specialists play a vital role in providing ecosystem services that perpetuate forest habitat functioning. Conserving the natural heterogeneity of their habitat is integral to management plans to prevent the decline of such species.
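The abstract above does not name its estimator, but single-season occupancy models (MacKenzie et al. 2002) are the standard analysis for repeat-survey camera-trap data of this kind. As a minimal sketch under that assumption, the Python function below evaluates the likelihood of one site's detection history given occupancy probability psi and per-occasion detection probability p; the function name and the example history are illustrative, not the study's data.

```python
import numpy as np

def site_likelihood(history, psi, p):
    """Likelihood of one site's detection history under a single-season
    occupancy model: the site is occupied (probability psi) and detected on
    d of K occasions, or unoccupied (possible only if never detected)."""
    h = np.asarray(history)
    K, d = h.size, int(h.sum())
    occupied = psi * p**d * (1 - p)**(K - d)
    unoccupied = (1 - psi) if d == 0 else 0.0
    return occupied + unoccupied

# Illustrative 5-occasion history with two detections
print(site_likelihood([0, 1, 0, 0, 1], psi=0.6, p=0.3))
```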
The criteria for objective memory impairment in mild cognitive impairment (MCI) are vaguely defined. Aggregating the number of abnormal memory scores (NAMS) is one way to operationalise memory impairment, which we hypothesised would predict progression to Alzheimer’s disease (AD) dementia.
Methods:
As part of the Australian Imaging, Biomarkers and Lifestyle Flagship Study of Ageing, 896 older adults who did not have dementia were administered a psychometric battery including three neuropsychological tests of memory, yielding 10 indices of memory. We calculated the number of memory scores corresponding to z ≤ −1.5 (i.e., NAMS) for each participant. Incident diagnosis of AD dementia was established by consensus of an expert panel after 3 years.
Results:
Of the 722 (80.6%) participants who were followed up, 54 (7.5%) developed AD dementia. There was a strong correlation between NAMS and probability of developing AD dementia (r = .91, p = .0003). Each abnormal memory score conferred an additional 9.8% risk of progressing to AD dementia. The area under the receiver operating characteristic curve for NAMS was 0.87 [95% confidence interval (CI) .81–.93, p < .01]. The odds ratio for NAMS was 1.67 (95% CI 1.40–2.01, p < .01) after correcting for age, sex, education, estimated intelligence quotient, subjective memory complaint, Mini-Mental State Exam (MMSE) score and apolipoprotein E ϵ4 status.
Conclusions:
Aggregation of abnormal memory scores may be a useful way of operationalising objective memory impairment, predicting incident AD dementia and providing prognostic stratification for individuals with MCI.
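As a concrete illustration of the counting rule described in the Methods above (10 memory indices, abnormality defined as z ≤ −1.5), here is a minimal Python sketch; the function name and example scores are illustrative.

```python
import numpy as np

def nams(z_scores, cutoff=-1.5):
    """Number of Abnormal Memory Scores: count of memory indices with
    z <= cutoff (the study uses z <= -1.5 across 10 indices)."""
    return int(np.sum(np.asarray(z_scores, dtype=float) <= cutoff))

# Illustrative participant with 10 memory indices; 3 fall at or below -1.5
z = [-1.8, -0.2, -1.5, 0.4, -2.1, 0.0, -1.2, -0.9, 0.6, -1.0]
print(nams(z))  # -> 3
```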
It is not clear to what extent associations between schizophrenia, cannabis use and cigarette use are due to a shared genetic etiology. We, therefore, examined whether schizophrenia genetic risk associates with longitudinal patterns of cigarette and cannabis use in adolescence and mediating pathways for any association to inform potential reduction strategies.
Methods
Associations between schizophrenia polygenic scores and longitudinal latent classes of cigarette and cannabis use from ages 14 to 19 years were investigated in up to 3925 individuals in the Avon Longitudinal Study of Parents and Children. Mediation models were estimated to assess the potential mediating effects of a range of cognitive, emotional, and behavioral phenotypes.
Results
The schizophrenia polygenic score, based on single nucleotide polymorphisms meeting a training-set p threshold of 0.05, was associated with late-onset cannabis use (OR = 1.23; 95% CI 1.08–1.41), but not with cigarette or early-onset cannabis use classes. This association was not mediated through lower IQ, victimization, emotional difficulties, antisocial behavior, impulsivity, or poorer social relationships during childhood. Sensitivity analyses adjusting for genetic liability to cannabis or cigarette use, using polygenic scores excluding the CHRNA5-A3-B4 gene cluster, or basing scores on a 0.5 training-set p threshold, provided results consistent with our main analyses.
Conclusions
Our study provides evidence that genetic risk for schizophrenia is associated with patterns of cannabis use during adolescence. Investigation of pathways other than the cognitive, emotional, and behavioral phenotypes examined here is required to identify modifiable targets to reduce the public health burden of cannabis use in the population.
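For readers unfamiliar with the scoring step described above, a polygenic score is conventionally a weighted sum of risk-allele counts over SNPs whose discovery p values pass the training-set threshold. The sketch below assumes that standard construction; all names and numbers are illustrative, not values from the study.

```python
import numpy as np

def polygenic_score(genotypes, betas, pvals, p_threshold=0.05):
    """Weighted allele-count sum over SNPs passing the training-set p threshold.

    genotypes: (n_individuals, n_snps) array of risk-allele counts (0/1/2)
    betas:     per-SNP effect sizes (log odds) from the discovery GWAS
    pvals:     per-SNP discovery p values
    """
    keep = pvals <= p_threshold
    return genotypes[:, keep] @ betas[keep]

# Toy example: 2 individuals, 3 SNPs; only SNPs 1 and 3 pass the threshold
G = np.array([[0, 1, 2], [2, 0, 1]])
beta = np.array([0.05, -0.02, 0.08])
p = np.array([0.01, 0.20, 0.04])
print(polygenic_score(G, beta, p))  # -> [0.16 0.18]
```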
Commercialization of 2,4-D–tolerant crops is a major concern for sweetpotato producers because of potential 2,4-D drift that can cause severe crop injury and yield reduction. A field study was initiated in 2014 and repeated in 2015 to assess the impacts of reduced rates of 2,4-D, glyphosate, or a combination of 2,4-D with glyphosate on sweetpotato. In one study, 2,4-D and glyphosate were applied alone and in combination at 1/10, 1/100, 1/250, 1/500, 1/750, and 1/1,000 of anticipated field use rates (1.05 kg ha−1 for 2,4-D and 1.12 kg ha−1 for glyphosate) to ‘Beauregard’ sweetpotato at storage root formation (10 days after transplanting [DAP]). In a separate study, the same treatments were applied to ‘Beauregard’ sweetpotato at storage root development (30 DAP). Injury with 2,4-D alone or in combination with glyphosate was generally equal to or greater than with glyphosate applied alone at equivalent herbicide rates, indicating that injury in the combination is attributable mostly to 2,4-D. There was a quadratic increase in crop injury and a quadratic decrease in crop yield (with respect to most yield grades) with increasing rate of 2,4-D applied alone or in combination with glyphosate at storage root development. However, with a few exceptions, neither this relationship nor a significant effect of herbicide rate on crop injury or sweetpotato yield was observed when herbicide application occurred at storage root formation. In general, crop injury and yield reduction were greatest at the highest rate (1/10×) of 2,4-D applied alone or in combination with glyphosate, although injury observed at lower rates would also be cause for concern to sweetpotato producers. However, in some cases, yield reduction of U.S. No. 1 and marketable grades was also observed after application of 1/250×, 1/100×, or 1/10× rates of 2,4-D alone or with glyphosate at storage root development.
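For reference, the amounts actually applied under these fractional rates follow directly from the stated 1× rates; a small Python sketch of the arithmetic (output formatting is illustrative).

```python
# Amounts applied under each fractional rate, from the 1x rates in the abstract
# (1.05 kg ha^-1 for 2,4-D; 1.12 kg ha^-1 for glyphosate).
fractions = [1/10, 1/100, 1/250, 1/500, 1/750, 1/1000]
one_x = {"2,4-D": 1.05, "glyphosate": 1.12}  # kg ai ha^-1

for herbicide, rate in one_x.items():
    for f in fractions:
        print(f"{herbicide} at 1/{round(1/f)}x: {rate * f:.5f} kg ha^-1")
```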
A major concern of sweetpotato producers is the potential negative effect of herbicide drift or sprayer contamination when dicamba is applied to nearby dicamba-resistant crops. A field study was initiated in 2014 and repeated in 2015 to assess the effects of reduced rates of the N,N-bis(3-aminopropyl)methylamine (BAPMA) or diglycolamine (DGA) salt of dicamba, glyphosate, or a combination of each dicamba salt with glyphosate on sweetpotato. Reduced rates of 1/10, 1/100, 1/250, 1/500, 1/750, and 1/1,000 of the 1× use rates (each dicamba formulation at 0.56 kg ha−1 and glyphosate at 1.12 kg ha−1) were applied to ‘Beauregard’ sweetpotato at storage root formation (10 d after transplanting) in one trial and at storage root development (30 d after transplanting) in a separate trial. Injury with each salt of dicamba (BAPMA or DGA) applied alone or with glyphosate was generally equal to or greater than with glyphosate applied alone at equivalent rates, indicating that injury in the combination is attributable mostly to dicamba. There was a quadratic increase in crop injury and a quadratic decrease in crop yield (with respect to most yield grades) with increasing rate of dicamba applied alone or in combination with glyphosate at storage root development. However, with a few exceptions, neither this relationship nor a significant effect of herbicide rate on crop injury or sweetpotato yield was observed when herbicide application occurred at the storage root formation stage. In general, crop injury and yield reduction were greatest at the highest rate (1/10×) of either salt of dicamba applied alone or in combination with glyphosate, although injury observed at lower rates would be cause for concern to sweetpotato producers. However, in some cases yield reduction of No. 1 and marketable grades was observed following 1/250×, 1/100×, or 1/10× application rates of dicamba alone or with glyphosate at storage root development.
This paper provides an up-to-date review of the problems related to the generation, detection and mitigation of strong electromagnetic pulses created in the interaction of high-power, high-energy laser pulses with different types of solid targets. It includes new experimental data obtained independently at several international laboratories. The mechanisms of electromagnetic field generation are analyzed and considered as a function of the intensity and the spectral range of the emissions they produce. The main emphasis is placed on the GHz frequency domain, which is the most damaging for electronics and may have important applications. The physics of electromagnetic emissions in other spectral domains, in particular THz and MHz, is also discussed. Theoretical models and numerical simulations are compared with the results of experimental measurements, with special attention to the methodology of measurements and complementary diagnostics. Understanding the underlying physical processes is the basis for developing techniques to mitigate the electromagnetic threat and for harnessing electromagnetic emissions, which may have promising applications.
Increasing weed control costs and limited herbicide options threaten vegetable crop profitability. Traditional interrow mechanical cultivation is very effective at removing weeds between crop rows. However, weed control within the crop rows is necessary to establish the crop and prevent yield loss. Currently, many vegetable crops require hand weeding to remove weeds within the row that remain after traditional cultivation and herbicide use. Intelligent cultivators have come into commercial use to remove intrarow weeds and reduce the cost of hand weeding. Intelligent cultivators currently on the market, such as the Robovator, use pattern recognition to detect the crop row. These cultivators do not differentiate between crops and weeds and do not work well among high weed populations. One approach to differentiating weeds is to place a machine-detectable mark or signal on the crop (i.e., the crop has the mark and the weed does not), thereby facilitating weed/crop differentiation. Lettuce and tomato plants were marked with labels and topical markers, then cultivated with an intelligent cultivator programmed to identify the markers. Field trials in marked tomato and lettuce found that the intelligent cultivator removed 90% more weeds from tomato and 66% more weeds from lettuce than standard cultivators, without reducing yields. Accurate crop and weed differentiation as described here resulted in a 45% to 48% reduction in hand-weeding time per hectare.
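The abstract does not describe how the cultivator recognizes the markers. Purely as a hypothetical illustration of the crop-marking idea, a simple color-threshold detector in Python/OpenCV might look like the following; the marker color, thresholds, file name, and area cutoff are all invented for the example.

```python
import cv2
import numpy as np

# Hypothetical sketch: assume crop plants carry a blue topical marker and we
# flag marked regions as crop; unmarked vegetation would be cultivation targets.
frame = cv2.imread("row_image.png")              # image from a toolbar camera
hsv = cv2.cvtColor(frame, cv2.COLOR_BGR2HSV)

# Illustrative HSV range for a blue marker; real thresholds would be calibrated.
lower, upper = np.array([100, 120, 60]), np.array([130, 255, 255])
mask = cv2.inRange(hsv, lower, upper)

# Connected components above a size cutoff are treated as marked crop plants.
n, labels, stats, centroids = cv2.connectedComponentsWithStats(mask)
crop_centres = [tuple(c) for c, area in
                zip(centroids[1:], stats[1:, cv2.CC_STAT_AREA]) if area > 200]
print(crop_centres)
```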
Field studies were conducted in 2016 and 2017 in Clinton, NC, to determine the interspecific and intraspecific interference of Palmer amaranth (Amaranthus palmeri S. Watson) or large crabgrass [Digitaria sanguinalis (L.) Scop.] in ‘Covington’ sweetpotato [Ipomoea batatas (L.) Lam.]. Amaranthus palmeri and D. sanguinalis were established 1 d after sweetpotato transplanting and maintained season-long at 0, 1, 2, 4, and 8 and 0, 1, 2, 4, and 16 plants m−1 of row in the presence and absence of sweetpotato, respectively. Predicted yield loss for sweetpotato was 35% to 76% for D. sanguinalis at 1 to 16 plants m−1 of row and 50% to 79% for A. palmeri at 1 to 8 plants m−1 of row. Weed dry biomass per meter of row increased linearly with increasing weed density. Individual dry biomass of A. palmeri and D. sanguinalis was not affected by weed density when grown in the presence of sweetpotato. When grown without sweetpotato, individual weed dry biomass decreased 71% and 62% from 1 to 4 plants m−1 of row for A. palmeri and D. sanguinalis, respectively. Individual weed dry biomass was not affected from 4 plants m−1 of row up to the highest densities of 8 and 16 plants m−1 of row for A. palmeri and D. sanguinalis, respectively.
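The abstract does not state the model behind these yield-loss predictions; weed-interference studies of this design commonly fit Cousens' rectangular hyperbola, so the sketch below assumes that form. The density levels follow the study design, but the loss values are invented placeholders, not the study's data.

```python
import numpy as np
from scipy.optimize import curve_fit

def cousens(d, I, A):
    """Rectangular hyperbola: percent yield loss at weed density d, where I is
    the loss per weed as d -> 0 and A is the asymptotic maximum loss."""
    return I * d / (1 + I * d / A)

# Densities follow the A. palmeri design; losses are illustrative placeholders.
density = np.array([0.0, 1.0, 2.0, 4.0, 8.0])
yield_loss = np.array([0.0, 50.0, 62.0, 71.0, 79.0])

(I, A), _ = curve_fit(cousens, density, yield_loss, p0=(50.0, 80.0))
print(f"I = {I:.1f}% loss per plant, A = {A:.1f}% maximum loss")
```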
We describe the motivation and design details of the ‘Phase II’ upgrade of the Murchison Widefield Array radio telescope. The expansion doubles to 256 the number of antenna tiles deployed in the array. The new antenna tiles enhance the capabilities of the Murchison Widefield Array in several key science areas. Seventy-two of the new tiles are deployed in a regular configuration near the existing array core. These new tiles enhance the surface brightness sensitivity of the array and will improve the ability of the Murchison Widefield Array to estimate the slope of the Epoch of Reionisation power spectrum by a factor of ∼3.5. The remaining 56 tiles are deployed on long baselines, doubling the maximum baseline of the array and improving the array’s (u, v) coverage. The improved imaging capabilities will provide an order-of-magnitude improvement in the noise floor of Murchison Widefield Array continuum images. The upgrade retains all of the features that have underpinned the Murchison Widefield Array’s success (large field of view, snapshot image quality, and pointing agility) and boosts the scientific potential with enhanced imaging capabilities and by enabling new calibration strategies.
Introduction: Situational awareness (SA) is essential for maintaining scene safety and allocating resources effectively in mass casualty incidents (MCIs). Unmanned aerial vehicles (UAVs) can potentially enhance SA with real-time visual feedback during chaotic, evolving, or inaccessible events. The purpose of this study was to test the ability of paramedics to use UAV video from a simulated MCI to identify scene hazards, initiate patient triage, and designate key operational locations. Methods: A simulated MCI, including fifteen patients of varying acuity (blast-type injuries) plus four hazards, was created on a college campus. The scene was surveyed by UAV, capturing video of all patients, hazards, and surrounding buildings and streets. Attendees of a provincial paramedic meeting were invited to participate. Participants received a lecture on SALT Triage and the principles of MCI scene management. Next, they watched the UAV video footage. Participants were directed to sort patients according to SALT Triage step one, identify injuries, and localize the patients within the campus. Additionally, they were asked to select a start point for SALT Triage step two, identify and locate hazards, and designate locations for an Incident Command Post, Treatment Area, Transport Area, and Access/Egress routes. Summary statistics were calculated, and a linear regression model was used to assess relationships between demographic variables and both patient triage and localization. Results: Ninety-six individuals participated. Mean age was 35 years (SD 11), 46% (44) were female, and 49% (47) were Primary Care Paramedics. Most participants (80 [84%]) correctly sorted at least 12 of 15 patients. Increased age was associated with decreased triage accuracy [−0.04 (−0.07, −0.01); p = 0.031]. Fifty-two (54%) were able to localize 12 or more of the 15 patients to a 27 × 20 m grid area. Advanced paramedic certification and local residency were associated with improved patient localization [2.47 (0.23, 4.72); p = 0.031 and −3.36 (−5.61, −1.1); p = 0.004, respectively]. The majority of participants (78 [81%]) chose an acceptable location to start SALT Triage step two, and 84% (80) identified at least three of four hazards. Approximately half (53 [55%]) of participants designated four or more of five key operational areas in appropriate locations. Conclusion: This study demonstrates the potential of UAV technology to remotely provide emergency responders with SA in an MCI. Additional research is required to investigate optimal strategies for deploying UAVs in this context.
This paper summarises developments in understanding sea level change during the Quaternary in Scotland since the publication of the Quaternary of Scotland Geological Conservation Review volume in 1993. We present a review of progress in methodology, particularly in the study of sediments in isolation basins and estuaries, as well as in field and laboratory techniques, which have together disclosed greater detail in the record of relative sea level (RSL) change than was available in 1993. However, progress in determining the record of RSL change varies between areas. Studies of sediments and stratigraphy offshore on the continental shelf have increased greatly, but the record of RSL change there remains patchy. Studies onshore have improved knowledge of rock shorelines, including the processes by which they are formed, but much remains to be understood. Studies of Late Devensian and Holocene RSLs around present coasts have improved knowledge of both the extent and the age range of the evidence. The record of RSL change on the W and NW coasts has disclosed a much longer dated RSL record than was available before 1993, possibly with evidence of Meltwater Pulse 1A, while studies in estuaries on the E and SW coasts have disclosed widespread and consistent fluctuations in Holocene RSLs. Evidence for the meltwater pulse associated with the Early Holocene discharge of Lakes Agassiz–Ojibway in N America has been found on both E and W coasts. The effects of storminess, in particular cliff-top storm deposits, have been widely identified. Further information on the Holocene Storegga Slide tsunami has enabled a better understanding of the event, but evidence for other tsunami events on Scottish coasts remains uncertain. Methodological developments have led to new reconstructions of RSL change for the last 2000 years, utilising state-of-the-art GIA models alongside coastal biostratigraphy to determine trends for comparison with modern tide-gauge and documentary evidence. Developments in GIA modelling have provided valuable information on patterns of land uplift during and following deglaciation. The studies undertaken raise a number of research questions that will need to be addressed in future work.
Life has been described as information flowing in molecular streams (Dawkins, 1996). Our growing understanding of the impact of horizontal gene transfer on evolutionary dynamics reinforces this fluid-like flow of molecular information (Joyce, 2002). The diversity of nucleic acid sequences, those known and yet to be characterized across Earth's varied environments, along with the vast repertoire of catalytic and structural proteins, presents as more of a dynamic molecular river than a tree of life. These informational biopolymers function as a mutualistic union so universal as to have been termed the Central Dogma (Crick, 1958). It is the distinct folding dynamics (the digital-like base pairing dominating nucleic acids, and the environmentally responsive and diverse range of analog-like interactions dictating protein folding; Goodwin et al., 2012) that provide the basis for this mutualism. The intertwined functioning of these analog and digital forms of information (Goodwin et al., 2012), unified within diverse chemical networks, is heralded as the Darwinian threshold of cellular life (Woese, 2002).
The discovery of prion diseases (Chien et al., 2004; Jablonka and Raz, 2009; Paravastu et al., 2008) introduced the paradigm of protein templates that propagate conformational information, suggesting a new context for Darwinian evolution. When taking both protein and nucleic acid molecular evolution into consideration (Cairns-Smith, 1966; Joyce, 2002), the conceptual framework for chemical evolution can be generalized into three orthogonal dimensions, as shown in Figure 5.1 (Goodwin et al., 2014). The 1st dimension manifests structural order through covalent polymerization reactions and includes the chain length, sequence, and linkage chemistry inherent to a dynamic chemical network. The 2nd dimension extends this order into dynamic conformational networks through noncovalent interactions of the polymers. This dimension includes intramolecular and intermolecular forces, from macromolecular folding to supramolecular assembly to multicomponent quaternary structure. Folding in this 2nd dimension certainly depends on the primary polymer sequence, and the folding/assembly diversity yields an additional set of environmentally constrained supramolecular folding codes. For example, double-stranded DNA assemblies are dominated by the rules of complementary base pairing, while the self-propagating conformations of prions are based on additional noncovalent, environmentally dependent interactions.
This article reports on a case study of a decade-long organizing forms response to the need for groundbreaking innovation while maintaining existing operational performance – the explore–exploit conundrum. Employing ‘grounded research,’ data were collected on the experiences of the Asia-Pacific arm of a multinational professional service firm’s key decision-makers, innovators and entrepreneurs. The findings reveal a three-tiered organizing forms response to the explore–exploit paradox, characterized by a novel combination of heavy exploitation-driven actions alongside deep exploration projects. This case suggests that one successful approach to delivering on both explore and exploit focuses on a productive tension that emerges by enacting innovative organizing forms with contextual awareness. This productive tension was sufficiently powerful to impel individuals to innovate, but also sufficiently contained to avoid interfering with commercial outcomes. An explore–exploit framework conceptualizes organizational changes incorporating complexity and contradiction, without the implicit emphasis on removing or denying the existing tension.
This work reports the growth of crystalline SrHfxTi1−xO3 (SHTO) films on Ge (001) substrates by atomic layer deposition. Samples were prepared with different Hf content x to explore whether strain, from tensile (x = 0) to compressive (x = 1), affected film crystallization temperature, and how composition affected properties. Amorphous films grew at 225 °C and crystallized into epitaxial layers at annealing temperatures that varied monotonically with composition, from ~530 °C (x = 0) to ~660 °C (x = 1). Transmission electron microscopy revealed abrupt interfaces. Electrical measurements revealed a leakage current of 0.1 A/cm2 at 1 MV/cm for x = 0.55.
We have built an imaging polarimeter for use at mid-infrared wavelengths (i.e. N band or 8–13 μm). The detecting element is a 128 × 128 element Si:Ga Focal Plane Array, supplied by Amber Engineering, USA. The polarimeter itself provides diffraction limited images on a 4-m class telescope and has a field of view of about 32 arcsec of sky with 0·25 arcsec pixels. We describe the optical design, control electronics, observing modes and detector sensitivities. Also presented are some observational results to demonstrate the power of this new imaging polarimetric system.
Objective
To assess the impact of an emergency intensive care unit (EICU) established concomitantly with a freestanding emergency department (ED) during the aftermath of Hurricane Sandy.
Methods
We retrospectively reviewed records of all patients in Bellevue’s EICU from the freestanding ED’s opening (December 10, 2012) until hospital inpatient reopening (February 7, 2013). Temporal and clinical data, disposition upon EICU arrival, and ultimate disposition were evaluated.
Results
Two hundred twenty-seven patients utilized the EICU, representing approximately 1.8% of freestanding ED patients. Ambulance arrival occurred in 31.6% of all EICU patients. Median length of stay was 11.55 hours; this was significantly longer for patients requiring airborne isolation (25.60 versus 11.37 hours, P<0.0001 by Wilcoxon rank sum test). After stabilization and treatment, 39% of EICU patients had an improvement in their disposition status (P<0.0001 by Wilcoxon signed rank test); upon interhospital transfer, the absolute proportion of patients requiring ICU and step-down unit (SDU) resources decreased from 37.8% to 27.1% and from 22.2% to 2.7%, respectively.
Conclusions
An EICU attached to a freestanding ED achieved significant reductions in resource-intensive medical care. Flexible, adaptable care systems should be explored for implementation in disaster response. (Disaster Med Public Health Preparedness. 2016;10:496–502)
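As an illustration of the test the Results above rely on, here is a minimal Python sketch of a Wilcoxon rank-sum comparison; the length-of-stay samples are invented, not the study's data.

```python
from scipy.stats import ranksums

# Invented length-of-stay samples (hours) standing in for the two groups the
# abstract compares (airborne-isolation patients vs. all others).
isolation = [25.6, 30.2, 22.1, 28.4, 26.9]
other = [11.4, 9.8, 12.6, 10.2, 13.1, 11.9]

stat, p = ranksums(isolation, other)
print(f"rank-sum statistic = {stat:.2f}, p = {p:.4f}")
```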
The Church of Jesus Christ of Latter-day Saints, which Leo Tolstoy called “the American religion,” is also the most violently persecuted religion in American history. Between the appearance of Joseph Smith's Book of Mormon in 1830 and Smith's murder in 1844, the Mormons were chased out of New York, Ohio, and Missouri by neighbors enraged at their aggressive proselytism and the extraordinary claims of Joseph Smith. The Mormons’ flight from Missouri in 1838 was prompted by a series of bloody skirmishes with well-organized opponents; Governor Lilburn Boggs responded with a quasi-genocidal executive order declaring that the Mormons had “made war upon the people of this state” and “must be exterminated or driven from the state if necessary for the public peace.”
A few years after the Mormons had fled to Illinois and established a thriving city, an anti-Mormon mob lynched Smith with the assistance of a local militia. Facing the prospect of more violence and coercion, most of the remaining Mormons undertook a long, hazardous journey to Utah under the leadership of Brigham Young in 1847, believing they would be beyond the reach of their enemies in the United States. The federal government initially encouraged the move, seeing the Mormon exodus as an opportunity to establish an American presence in the western territories newly captured from Mexico. In return for Mormon allegiance to the United States, the Fillmore Administration appointed Brigham Young as governor of Utah Territory. Relations deteriorated as the Mormon leadership became increasingly assertive over the territory, making life intolerable for non-Mormon federal officials and leading to the Buchanan Administration's brief invasion of Utah in 1857.
The abortive invasion was followed by a series of increasingly aggressive congressional actions designed to force the Mormons to conform to American anti-bigamy laws. These culminated in the Edmunds Act of 1882 and the Edmunds-Tucker Act of 1887, which forbade Mormons from voting, holding elected office, or serving on juries, and authorized the federal government to confiscate Church property, including temples. These last measures induced the Church leadership to abandon polygamy in 1890, though it continued in secret for a while; Congress held up the seating of Senator Reed Smoot for seven years over the issue of whether the Mormon leadership was still allowing plural marriages to take place at the turn of the century.