The Subglacial Antarctic Lakes Scientific Access (SALSA) Project accessed Mercer Subglacial Lake using environmentally clean hot-water drilling to examine interactions among ice, water, sediment, rock, microbes and carbon reservoirs within the lake water column and underlying sediments. A ~0.4 m diameter borehole was melted through 1087 m of ice and maintained over ~10 days, allowing observation of ice properties and collection of water and sediment with various tools. Over this period, SALSA collected: 60 L of lake water and 10 L of deep borehole water; microbes >0.2 μm in diameter from in situ filtration of ~100 L of lake water; 10 multicores 0.32–0.49 m long; 1.0 and 1.76 m long gravity cores; three conductivity–temperature–depth profiles of borehole and lake water; five discrete depth current meter measurements in the lake; and images of ice, the lake water–ice interface and lake sediments. Temperature and conductivity data showed the hydrodynamic character of water mixing between the borehole and lake after entry. Models simulating melting of the ~6 m thick basal accreted ice layer imply that debris fall-out through the ~15 m water column to the lake sediments from borehole melting had little effect on the stratigraphy of surficial sediment cores.
Current treatments for schizophrenia are often associated with increased rates of metabolic syndrome (MetSy). MetSy is defined as meeting at least 3 of the following 5 criteria: waist circumference >40 in (men) or >35 in (women), triglycerides ≥150 mg/dL, high-density lipoprotein cholesterol (HDL) <40 mg/dL (men) or <50 mg/dL (women), systolic blood pressure (BP) ≥130 mmHg or diastolic BP ≥85 mmHg, and fasting glucose ≥100 mg/dL. Patients with MetSy have an elevated risk of developing type II diabetes and increased mortality due to cardiovascular disease. Lumateperone (lumateperone tosylate, ITI-007), a mechanistically novel antipsychotic that simultaneously modulates serotonin, dopamine, and glutamate neurotransmission, is FDA approved for the treatment of schizophrenia. This distinct pharmacological profile has been associated with favorable tolerability and a low risk of adverse metabolic effects in clinical trials. This post hoc analysis of 2 randomized, double-blind, placebo-controlled studies of patients with an acute exacerbation of schizophrenia compared rates of MetSy with lumateperone and risperidone. Data from an open-label long-term trial of lumateperone were also evaluated.
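The MetSy definition above is a simple threshold count over five criteria. A minimal sketch in Python (illustrative only; the variable names, units and the `sex` encoding are assumptions, not taken from the studies) might look like:

```python
def metsy_criteria_met(sex, waist_in, trig_mg_dl, hdl_mg_dl,
                       sbp_mm_hg, dbp_mm_hg, glucose_mg_dl):
    """Count how many of the five MetSy criteria are met,
    using the thresholds stated above."""
    criteria = [
        waist_in > (40 if sex == "M" else 35),        # abdominal obesity
        trig_mg_dl >= 150,                            # elevated triglycerides
        hdl_mg_dl < (40 if sex == "M" else 50),       # low HDL cholesterol
        sbp_mm_hg >= 130 or dbp_mm_hg >= 85,          # elevated blood pressure
        glucose_mg_dl >= 100,                         # elevated fasting glucose
    ]
    return sum(criteria)

def has_metsy(**measurements):
    """MetSy is present when at least 3 of the 5 criteria are met."""
    return metsy_criteria_met(**measurements) >= 3
```

A patient meeting, say, the waist, triglyceride and HDL thresholds alone would count as having MetSy under this rule.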
The incidence and shift in MetSy were analyzed in data pooled from 2 short-term (4 or 6 week) placebo- and active-controlled (risperidone 4mg) studies of lumateperone 42mg (Studies 005 and 302). The pooled lumateperone data were compared with data for risperidone. Data from an open-label 1-year trial (Study 303) evaluated MetSy in patients with stable schizophrenia switched from prior antipsychotic (PA) treatment to lumateperone 42mg.
In the acute studies (n=256 lumateperone 42mg, n=255 risperidone 4mg), rates of MetSy were similar between groups at baseline (16% lumateperone, 19% risperidone). At the end of treatment (EOT), MetSy was less common with lumateperone than with risperidone (13% vs 25%). More patients on lumateperone (46%) than on risperidone (25%) improved from having MetSy at baseline to no longer meeting MetSy criteria at EOT. Conversely, more patients on risperidone than on lumateperone developed MetSy during treatment (13% vs 5%). Differences in MetSy conversion rates were driven by changes in triglycerides and glucose. In the long-term study (n=602 lumateperone 42mg), 33% of patients had MetSy at PA baseline. Thirty-six percent of patients with MetSy at PA baseline improved to no longer meeting criteria at EOT, whereas fewer than half as many (15%) shifted from not meeting MetSy criteria to having MetSy.
In this post hoc analysis, lumateperone 42mg patients had reduced rates of MetSy compared with risperidone patients. In the long-term study, patients with MetSy on PA switched to lumateperone 42mg had a reduction in the risk of MetSy. These results suggest that lumateperone 42mg is a promising new treatment for schizophrenia with a favorable metabolic profile.
OBJECTIVES/GOALS: There are two objectives: 1) to identify healthcare providers’ (HCPs’) perceived barriers to, and potential solutions for, rural adolescents’ access to mental healthcare; HCPs include pharmacists, physicians, and mental healthcare providers (MHPs); and 2) to identify rural high schoolers’ barriers to, and potential solutions for, access to mental healthcare. METHODS/STUDY POPULATION: Fifteen HCPs will be recruited via email listserv and the snowball method. Perceived barriers of rural adolescents, personal barriers, current practices to address mental health in adolescents, and preferred solutions will be discussed. Twenty student–parent dyads will be recruited using fliers in school systems and will be interviewed individually outside of class time on school grounds or over the phone. Barriers to care and preferred solutions will be discussed. All interviews will be semi-structured, recorded, conducted in person or over the phone, and will last 30 minutes to an hour. Compensation will be $25 each for students and parents, $50 for pharmacists and mental health providers, and $100 for physicians. Thematic qualitative data analysis will be performed using Atlas.ti software. RESULTS/ANTICIPATED RESULTS: Data collection is ongoing. Anticipated barriers include the absence of mental healthcare providers in rural areas, inability to access mental healthcare providers farther away, stigma towards mental healthcare, and lack of knowledge of mental health conditions and treatment. Anticipated solutions may include promoting mobile applications to assist with telehealth and self-care. Other solutions may include collaboration among rural healthcare providers for adolescents with mental health conditions. Preferred solutions may also include pharmacists disseminating knowledge to rural adolescents and their parents, or referrals to mental healthcare providers.
DISCUSSION/SIGNIFICANCE OF IMPACT: This project will identify barriers and solutions to access to mental healthcare among rural adolescents. These solutions can then be applied towards the creation of programs that address salient issues within rural communities with a greater chance of uptake and use so that rates of depression and suicide will decrease. CONFLICT OF INTEREST DESCRIPTION: Funding through UAB TL1 award.
An ever-increasing number of laboratory facilities are enabling in situ spectral reflectance measurements of materials under conditions relevant to all the bodies in the Solar System, from Mercury to Pluto and beyond. Results derived from these facilities demonstrate that exposure of different materials to various planetary surface conditions can provide insights into the endogenic and exogenic processes that operate to modify their surface spectra, and their relative importance. Temperature, surface atmospheric pressure, atmospheric composition, radiation environment, and exposure to the space environment have all been shown to measurably affect reflectance and emittance spectra of a wide range of materials. Planetary surfaces are dynamic environments, and as our ability to reproduce a wider range of planetary surface conditions improves, so will our ability to better determine the surface composition of these bodies, and by extension, their geologic history.
The rocky shores of the north-east Atlantic have long been studied. Our focus is from Gibraltar to Norway plus the Azores and Iceland. Phylogeographic processes shape biogeographic patterns of biodiversity. Long-term and broadscale studies have shown the responses of biota to past climate fluctuations and more recent anthropogenic climate change. Inter- and intra-specific species interactions along sharp local environmental gradients shape distributions and community structure and hence ecosystem functioning. Shifts in domination by fucoids in shelter to barnacles/mussels in exposure are mediated by grazing by patellid limpets. Further south fucoids become increasingly rare, with species disappearing or restricted to estuarine refuges, caused by greater desiccation and grazing pressure. Mesoscale processes influence bottom-up nutrient forcing and larval supply, hence affecting species abundance and distribution, and can be proximate factors setting range edges (e.g., the English Channel, the Iberian Peninsula). Impacts of invasive non-native species are reviewed. Knowledge gaps, such as work on rockpools and host–parasite dynamics, are also outlined.
This research explores media reporting of Indigenous students’ Programme for International Student Assessment (PISA) results in two national and 11 metropolitan Australian newspapers from 2001 to 2015. Of almost 300 articles on PISA, only 10 focused on reporting of Indigenous PISA results. While general or non-Indigenous PISA results featured in media reports, especially at the time of the publication of PISA results, there was overwhelming neglect of Indigenous results and the performance gap. A thematic analysis of articles showed that mainstream PISA reporting contained critical commentary not found in the Indigenous PISA articles. The three themes identified were: a lack of teacher quality in remote and rural schools; the debate on Gonski funding recommendations; and the PISA achievement gap between Indigenous and non-Indigenous students. This study concluded that this overwhelming neglect is linked to media bias, which continues to drive mainstream media coverage of Indigenous Australians.
Palaeoecology has been prominent in studies of environmental change during the Holocene epoch in Scotland. These studies have been dominated by palynology (pollen, spore and related bio- and litho-stratigraphic analyses) as a key approach to multi- and inter-disciplinary investigations of topics such as vegetation, climate and landscape change. This paper highlights some key dimensions of the pollen- and vegetation-based archive, with a focus upon woodland dynamics, blanket peat, human impacts, biodiversity and conservation. Following a brief discussion of chronological, climatic, faunal and landscape contexts, the migration, survival and nature of the woodland cover through time are assessed, emphasising its time-transgressiveness and altitudinal variation. While agriculture led to the demise of woodland in lowland areas of the south and east, the spread of blanket peat was especially a phenomenon of the north and west, including the Western and Northern Isles. Almost a quarter of Scotland is covered by blanket peat and the cause(s) of its spread continue(s) to evoke recourse to climatic, topographic, pedogenic, hydrological, biotic or anthropogenic influences, while we remain insufficiently knowledgeable about the timing of the formation processes. Humans have been implicated in vegetational change throughout the Holocene, with prehistoric woodland removal, woodland management, agricultural impacts arising from arable and pastoral activities, potential heathland development and afforestation. The viability of many current vegetation communities remains a concern, in that Scottish data show reductions in plant diversity over the last 400 years, which recent conservation efforts have yet to reverse. Palaeoecological evidence can be used to test whether conservation baselines and restoration targets are appropriate to longer-term ecosystem variability and can help identify when modern conditions have no past analogues.
Depression is a prevalent long-term condition that is associated with substantial resource use. Telehealth may offer a cost-effective means of supporting the management of people with depression.
To investigate the cost-effectiveness of a telehealth intervention (‘Healthlines’) for patients with depression.
A prospective patient-level economic evaluation conducted alongside a randomised controlled trial. Patients were recruited through primary care, and the intervention was delivered via a telehealth service. Participants with a confirmed diagnosis of depression and PHQ-9 score ≥10 were recruited from 43 English general practices. A series of up to 10 scripted, theory-led, telephone encounters with health information advisers supported participants to effect a behaviour change, use online resources, optimise medication and improve adherence. The intervention was delivered alongside usual care and was designed to support rather than duplicate primary care. Cost-effectiveness from a combined health and social care perspective was measured by net monetary benefit at the end of 12 months of follow-up, calculated from incremental cost and incremental quality-adjusted life years (QALYs). Cost–consequence analysis included cost of lost productivity, participant out-of-pocket expenditure and the clinical outcome.
A total of 609 participants were randomised – 307 to receive the Healthlines intervention plus usual care and 302 to receive usual care alone. Forty-five per cent of participants had missing quality of life data, 41% had missing cost data and 51% of participants had missing data on either cost or utility, or both. Multiple imputation was used for the base-case analysis. The intervention was associated with incremental mean per-patient National Health Service/personal social services cost of £168 (95% CI £43 to £294) and an incremental QALY gain of 0.001 (95% CI −0.023 to 0.026). The incremental cost-effectiveness ratio was £132 630. Net monetary benefit at a cost-effectiveness threshold of £20 000 was –£143 (95% CI –£164 to –£122) and the probability of the intervention being cost-effective at this threshold value was 0.30. Productivity costs were higher in the intervention arm, but out-of-pocket expenses were lower.
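The reported net monetary benefit follows directly from the incremental cost and QALY figures. A quick arithmetic check (Python; since the QALY gain of 0.001 is rounded, a more precise value is recovered from the reported ICER):

```python
# Values as reported in the evaluation above.
delta_cost = 168.0             # incremental cost per patient (GBP)
icer = 132_630.0               # incremental cost-effectiveness ratio (GBP/QALY)

# Recover the unrounded incremental QALY gain from the ICER.
delta_qaly = delta_cost / icer          # ~0.00127 QALYs

# Net monetary benefit = threshold * dQALY - dCost.
threshold = 20_000.0                    # willingness-to-pay (GBP/QALY)
nmb = threshold * delta_qaly - delta_cost
print(round(nmb))                       # -143, matching the reported NMB
```

A negative net monetary benefit at the £20 000 threshold is consistent with the low (0.30) probability of cost-effectiveness reported above.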
The Healthlines service was acceptable to patients as a means of condition management, and response to treatment after 4 months was higher for participants randomised to the intervention. However, the positive average intervention effect size was modest, and incremental costs were high relative to a small incremental QALY gain at 12 months. The intervention is not likely to be cost-effective in its current form.
The Neotoma Paleoecology Database is a community-curated data resource that supports interdisciplinary global change research by enabling broad-scale studies of taxon and community diversity, distributions, and dynamics during the large environmental changes of the past. By consolidating many kinds of data into a common repository, Neotoma lowers costs of paleodata management, makes paleoecological data openly available, and offers a high-quality, curated resource. Neotoma’s distributed scientific governance model is flexible and scalable, with many open pathways for participation by new members, data contributors, stewards, and research communities. The Neotoma data model supports, or can be extended to support, any kind of paleoecological or paleoenvironmental data from sedimentary archives. Data additions to Neotoma are growing and now include >3.8 million observations, >17,000 datasets, and >9200 sites. Dataset types currently include fossil pollen, vertebrates, diatoms, ostracodes, macroinvertebrates, plant macrofossils, insects, testate amoebae, geochronological data, and the recently added organic biomarkers, stable isotopes, and specimen-level data. Multiple avenues exist to obtain Neotoma data, including the Explorer map-based interface, an application programming interface, the neotoma R package, and digital object identifiers. As the volume and variety of scientific data grow, community-curated data resources such as Neotoma have become foundational infrastructure for big data science.
The Taipan galaxy survey (hereafter simply ‘Taipan’) is a multi-object spectroscopic survey starting in 2017 that will cover 2π steradians over the southern sky (δ ≲ 10°, |b| ≳ 10°), and obtain optical spectra for about two million galaxies out to z < 0.4. Taipan will use the newly refurbished 1.2-m UK Schmidt Telescope at Siding Spring Observatory with the new TAIPAN instrument, which includes an innovative ‘Starbugs’ positioning system capable of rapidly and simultaneously deploying up to 150 spectroscopic fibres (and up to 300 with a proposed upgrade) over the 6° diameter focal plane, and a purpose-built spectrograph operating in the range from 370 to 870 nm with resolving power R ≳ 2000. The main scientific goals of Taipan are (i) to measure the distance scale of the Universe (primarily governed by the local expansion rate, H₀) to 1% precision, and the growth rate of structure to 5%; (ii) to make the most extensive map yet constructed of the total mass distribution and motions in the local Universe, using peculiar velocities based on improved Fundamental Plane distances, which will enable sensitive tests of gravitational physics; and (iii) to deliver a legacy sample of low-redshift galaxies as a unique laboratory for studying galaxy evolution as a function of dark matter halo and stellar mass and environment. The final survey, which will be completed within 5 years, will consist of a complete magnitude-limited sample (i ⩽ 17) of about 1.2 × 10⁶ galaxies supplemented by an extension to higher redshifts and fainter magnitudes (i ⩽ 18.1) of a luminous red galaxy sample of about 0.8 × 10⁶ galaxies. Observations and data processing will be carried out remotely and in a fully automated way, using a purpose-built automated ‘virtual observer’ software and an automated data reduction pipeline.
The Taipan survey is deliberately designed to maximise its legacy value by complementing and enhancing current and planned surveys of the southern sky at wavelengths from the optical to the radio; it will become the primary redshift and optical spectroscopic reference catalogue for the local extragalactic Universe in the southern sky for the coming decade.
The Leptarctinae are an enigmatic subfamily of mustelids present in North America and Eurasia during the Miocene (Arikareean to Hemphillian North American Land Mammal Ages). Their diet and ecology have been particularly controversial. Some workers have suggested they were similar to koalas, whereas others suggested they were crushing omnivores analogous to raccoons. Leptarctus oregonensis Stock, 1930, a poorly known leptarctine from the early Barstovian, is represented by fragmented cranial elements and isolated teeth from the Mascall Formation of Oregon, and some fairly complete but undescribed material from the Olcott Formation of western Nebraska. Herein, we describe the first well-preserved skull of L. oregonensis from the type formation. Based on this new specimen, we confirm that L. oregonensis is a distinct species from L. primus Leidy, 1856 and L. ancipidens White, 1941 that is characterized by a distinct morphology of its tympanic projections and first upper molars. We are also able to describe intraspecific variation within L. oregonensis coinciding with the geographic distribution of the specimens (Oregon and Nebraska). The most variable characters are concentrated in the morphology of the frontals and the upper fourth premolar. Additional specimens will be needed to settle the debate over sexual dimorphism in this species, but this new specimen suggests that Leptarctus oregonensis, despite being one of the smallest members of the Leptarctinae, was an animal-dominated omnivore with considerable crushing ability.
The late Pleistocene megafaunal extinctions may have been the first extinctions directly related to human activity, but in North America the close temporal proximity of human arrival and the Younger Dryas climate event has hindered efforts to identify the ultimate extinction cause. Previous work evaluating the roles of climate change and human activity in the North American megafaunal extinction has been stymied by a reliance on geographic binning, yielding contradictory results among researchers. We used a fine-scale geospatial approach in combination with 95 megafaunal last-appearance and 75 human first-appearance radiocarbon dates to evaluate the North American megafaunal extinction. We used kriging to create interpolated first- and last-appearance surfaces from calibrated radiocarbon dates in combination with their geographic autocorrelation. We found substantial evidence for overlap between megafaunal and human populations in many but not all areas, in some cases exceeding 3000 years of predicted overlap. We also found that overlap was highly regional: megafauna had last appearances in Alaska before humans first appeared, but did not have last appearances in the Great Lakes region until several thousand years after the first recorded human appearances. Overlap in the Great Lakes region exceeds uncertainty in radiocarbon measurements or methodological uncertainty and would be even greater with sampling-derived confidence intervals. The kriged maps of last megafaunal occurrence are consistent with climate as a primary driver in some areas, but we cannot eliminate human influence from all regions. The late Pleistocene megafaunal extinction was highly variable in timing and duration of human overlap across the continent, and future analyses should take these regional trends into account.
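Kriging, the interpolation method used above, predicts a value at an unsampled location as a weighted average of nearby observations, with weights derived from a variogram model of spatial autocorrelation. A minimal ordinary-kriging sketch (Python/NumPy, with a simple linear variogram for illustration; the study's actual variogram fitting and date-calibration steps are not reproduced here):

```python
import numpy as np

def ordinary_krige(xy, z, xy0, gamma=lambda h: h):
    """Ordinary kriging prediction at point xy0.

    xy: (n, 2) site coordinates; z: (n,) observed values (e.g. calibrated
    last-appearance dates in cal yr BP); gamma: variogram function
    (linear by default -- a real analysis fits gamma to the data first).
    """
    n = len(z)
    # Pairwise distances between sites, and site-to-target distances.
    d = np.linalg.norm(xy[:, None, :] - xy[None, :, :], axis=-1)
    # Ordinary kriging system: the variogram matrix bordered by the
    # unbiasedness constraint (weights must sum to 1).
    A = np.ones((n + 1, n + 1))
    A[:n, :n] = gamma(d)
    A[n, n] = 0.0
    b = np.ones(n + 1)
    b[:n] = gamma(np.linalg.norm(xy - np.asarray(xy0), axis=-1))
    w = np.linalg.solve(A, b)[:n]  # kriging weights
    return float(w @ z)
```

With the linear variogram, predicting midway between two dated sites returns their average, and a prediction at a sampled site reproduces the observation exactly, which is the exact-interpolation property that makes kriged first- and last-appearance surfaces honour the underlying radiocarbon dates.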
Holocene tephrostratigraphy in Alaska provides independent chronology and stratigraphic correlation in a region where reworked old (Holocene) organic carbon can significantly distort radiocarbon chronologies. Here, we present new glass chemistry and chronology for Holocene tephras preserved in three Alaskan lakes: one in the eastern interior and two in the southern Brooks Range. Tephra beds in the eastern interior lake-sediment core are correlated with the White River Ash and the Hayes tephra set H (~4200–3700 cal yr BP), and an additional discrete tephra bed is likely from the Aleutian arc/Alaska Peninsula. Cryptotephras (nonvisible tephras) found in the Brooks Range include the informally named “Ruppert tephra” (~2700–2300 cal yr BP) and the Aniakchak caldera-forming event II (CFE II) tephra (~3600 cal yr BP). A third, underlying Brooks Range cryptotephra (4070–3760 cal yr BP) is chemically indistinguishable from the Aniakchak CFE II tephra and is likely from an earlier eruption of Aniakchak volcano.
In 1977, a large group of musicians performed David Dunn’s Sky Drift while moving slowly across a Southern California desert, documenting the concert with four stationary microphones. A year later, Dunn presented the work in New York as a ‘performance/documentation’, playing back the audio recording for a seated audience. This article explores issues of ‘liveness’ in recorded sound, ‘transparency’, ‘aura’ and ‘the work itself’ in order to examine the consequences of this act: what does it mean for a recording of an outdoor performance to be shared at an indoor concert event? Can such a complex and interactive experience – with widely dispersed musicians and mobile audience members – be successfully converted into a fixed document? What does a recording capture and what must it exclude? Because Sky Drift constantly shifts the physical relationships between musicians and audience across a vast outdoor landscape, each listener’s experience represents an equally valid sonic perspective on the piece. As a result, it is unclear how a satisfying recording might be made or what it might even mean to ‘hear the music’ at all. When relocated – away from its original outdoor context – Sky Drift is deprived of much of its potential to communicate meaning.
Training for the clinical research workforce does not sufficiently prepare workers for today’s scientific complexity; these deficiencies may be ameliorated with competency-based training. The Enhancing Clinical Research Professionals’ Training and Qualifications (ECRPTQ) project developed competency standards for principal investigators and clinical research coordinators.
Clinical and Translational Science Awards representatives refined competency statements. Working groups developed assessments, identified training, and highlighted gaps.
Forty-eight competency statements in 8 domains were developed.
Training is primarily investigator focused, with few programs for clinical research coordinators. Training gaps are particularly evident in new technologies and data management. There are no standardized assessments of competence.
The translation of discoveries to drugs, devices, and behavioral interventions requires well-prepared study teams. Execution of clinical trials remains suboptimal due to varied quality in design, execution, analysis, and reporting. A critical impediment is inconsistent, or even absent, competency-based training for clinical trial personnel.
In 2014, the National Center for Advancing Translational Science (NCATS) funded the project, Enhancing Clinical Research Professionals’ Training and Qualifications (ECRPTQ), aimed at addressing this deficit. The goal was to ensure all personnel are competent to execute clinical trials. A phased structure was utilized.
This paper focuses on training recommendations in Good Clinical Practice (GCP). Leveraging input from all Clinical and Translational Science Award hubs, the following was recommended to NCATS: all investigators and study coordinators executing a clinical trial should understand GCP principles and undergo training every 3 years, with the training method meeting the minimum criteria identified by the International Conference on Harmonisation GCP.
We anticipate that industry sponsors will acknowledge such training, eliminating redundant training requests. We proposed metrics to be tracked, which require further study. A separate task force was convened to define recommendations for metrics to be reported to NCATS.