Many novel therapeutic options for depression exist that are either not mentioned in clinical guidelines or recommended only for use in highly specialist services. The challenge faced by clinicians is deciding when it might be appropriate to consider such ‘non-standard’ interventions. This analysis proposes a framework to aid this decision.
Declaration of interest
In the past 3 years R.H.M.W. has received support for research, expenses to attend conferences and fees for lecturing and consultancy work (including attending advisory boards) from various pharmaceutical companies including Astra Zeneca, Cyberonics, Eli Lilly, Janssen, LivaNova, Lundbeck, MyTomorrows, Otsuka, Pfizer, Roche, Servier, SPIMACO and Sunovion. D.M.B.C. has received fees from LivaNova for attending an advisory board. In the past 3 years A.J.C. has received fees for lecturing from Astra Zeneca and Lundbeck; fees for consulting from LivaNova, Janssen and Allergan; and research grant support from Lundbeck.
In the past 3 years A.C. has received fees for lecturing from pharmaceutical companies, namely Lundbeck and Sunovion. In the past 3 years A.L.M. has received support for attending seminars and fees for consultancy work (including advisory board) from Medtronic Inc and LivaNova. R.M. holds joint research grants with a number of digital companies that investigate devices for depression including Alpha-stim, Big White Wall, P1vital, Intel, Johnson and Johnson and Lundbeck through his mindTech and CLAHRC EM roles. M.S. is an associate at Blueriver Consulting providing intelligence to NHS organisations, pharmaceutical and devices companies. He has received honoraria for presentations and advisory boards with Lundbeck, Eli Lilly, URGO, AstraZeneca, Phillips and Sanofi and holds shares in Johnson and Johnson. In the past 3 years P.R.A.S. has received support for research, expenses to attend conferences and fees for lecturing and consultancy work (including attending an advisory board) from life sciences companies including Corcept Therapeutics, Indivior and LivaNova. In the past 3 years P.S.T. has received consultancy fees as an advisory board member from the following companies: Galen Limited, Sunovion Pharmaceuticals Europe Ltd, myTomorrows and LivaNova. A.H.Y. has undertaken paid lectures and advisory boards for all major pharmaceutical companies with drugs used in affective and related disorders and LivaNova. He has received funding for investigator-initiated studies from AstraZeneca, Eli Lilly, Lundbeck and Wyeth.
The Neotoma Paleoecology Database is a community-curated data resource that supports interdisciplinary global change research by enabling broad-scale studies of taxon and community diversity, distributions, and dynamics during the large environmental changes of the past. By consolidating many kinds of data into a common repository, Neotoma lowers costs of paleodata management, makes paleoecological data openly available, and offers a high-quality, curated resource. Neotoma’s distributed scientific governance model is flexible and scalable, with many open pathways for participation by new members, data contributors, stewards, and research communities. The Neotoma data model supports, or can be extended to support, any kind of paleoecological or paleoenvironmental data from sedimentary archives. Data additions to Neotoma are growing and now include >3.8 million observations, >17,000 datasets, and >9200 sites. Dataset types currently include fossil pollen, vertebrates, diatoms, ostracodes, macroinvertebrates, plant macrofossils, insects, testate amoebae, geochronological data, and the recently added organic biomarkers, stable isotopes, and specimen-level data. Multiple avenues exist to obtain Neotoma data, including the Explorer map-based interface, an application programming interface, the neotoma R package, and digital object identifiers. As the volume and variety of scientific data grow, community-curated data resources such as Neotoma have become foundational infrastructure for big data science.
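Among the data-access avenues listed above, the application programming interface lends itself to a short sketch. The endpoint path and query parameters below are assumptions based on the public API's documented v2.0 layout (check api.neotomadb.org for the live specification); this only builds a query URL and does not exercise the live service.

```python
from urllib.parse import urlencode

# Assumed base URL for Neotoma site records (v2.0 layout).
NEOTOMA_SITES = "https://api.neotomadb.org/v2.0/data/sites"

def build_site_query(sitename=None, limit=25, offset=0):
    """Build a query URL for Neotoma site records.

    `sitename`, `limit` and `offset` mirror common API filters;
    paging is handled by increasing `offset`.
    """
    params = {"limit": limit, "offset": offset}
    if sitename:
        params["sitename"] = sitename
    return f"{NEOTOMA_SITES}?{urlencode(params)}"

# Example: a query for the first 5 sites matching a name fragment.
url = build_site_query(sitename="Marion", limit=5)
```

The same records are also reachable through the Explorer map-based interface and the neotoma R package mentioned above.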
The chronology and cause of millennial depositional oscillations within last glacial loess of the Central Lowlands of the United States are uncertain. Here, we present a new age model that indicates the Peoria Silt along the Illinois River Valley accumulated episodically from ~28,500 to 16,000 cal yr BP, as the Lake Michigan Lobe margin fluctuated within northeastern Illinois. The age model indicates accelerated loess deposition coincident with regional glacial advances during the local last glacial maximum. A weakly developed paleosol, the Jules Geosol, represents a period of significantly slower deposition, from 23,700 to 22,000 cal yr BP. A gastropod assemblage-based reconstruction of mean July temperature shows temperatures 6–10°C cooler than modern during Peoria Silt deposition. Stable oxygen and carbon isotope values (δ18O and δ13C) of gastropod carbonate do not vary significantly across the pedostratigraphic boundary of the Jules Geosol, suggesting slower loess accumulation was a result of reduced glacial sediment supply rather than direct climatic factors. However, a decrease in δ18O values occurred between 26,000 and 24,000 cal yr BP, synchronous with the Lake Michigan Lobe’s southernmost advance. This δ18O decrease suggests a coupling of regional summer hydroclimate and ice lobe position during the late glacial period.
Giardiasis is a treatable disease, caused by the flagellated protozoan parasite, Giardia duodenalis (G. duodenalis). It is one of the most common enteric parasites found globally to cause gastrointestinal disturbances, and infections may result in long-term irritable bowel syndrome-like symptoms. It is a common misconception that giardiasis is associated with foreign travel, which results in locally acquired cases in the UK being underdiagnosed. This report highlights the findings from one large Scottish Health Board, arising from a change in testing methodology, which resulted in the screening of all stools submitted for enteric investigations for G. duodenalis. Previous selection criteria were restricted to patients with a travel history to specific regions of the world, or on the basis of certain clinical details. In this report, clinical details were recorded from samples shown to be positive using two methods: an ELISA-based antigen detection assay and microscopy. Clinical details were assessed for a total of 28 laboratory-confirmed positive cases against the original selection criteria. Twenty-six cases (93%) would have been excluded from Giardia testing if the previous selection criteria had been applied. Although nine cases stated foreign travel, only two had been to regions deemed to be ‘high risk’. Therefore, those seven cases that travelled to perceived ‘low-risk’ regions would have been excluded from testing for this reason. This summary highlights the need for significant improvements to the selection criteria for Giardia testing. Laboratories should be encouraged towards the testing of all routinely submitted stools for this neglected pathogen to ensure cases that are acquired locally are properly identified and treated effectively.
Giardia duodenalis and Cryptosporidium species are protozoan parasites capable of causing gastrointestinal disease in humans and animals through the ingestion of infective faeces. Whereas Cryptosporidium species can be acquired locally or through foreign travel, there is a misconception that giardiasis is largely travel-associated, which results in differences in laboratory testing algorithms. To determine the level of variation in testing criteria and detection methods between diagnostic laboratories for both pathogens across Scotland, an audit was performed. Twenty Scottish diagnostic microbiology laboratories were invited to participate, with questions on sample acceptance criteria, testing methods, testing rates and future plans for pathogen detection. Responses were received from 19 of the 20 laboratories, representing each of the 14 territorial Health Boards. Detection methods varied between laboratories, with the majority performing microscopy, one using a lateral flow immunochromatographic antigen assay, another using a manually washed plate-based enzyme immunoassay (EIA) and one laboratory trialling a plate-based EIA automated with an EIA plate washer. Whereas all laboratories except one screened every stool for Cryptosporidium species, an important finding was the significant variation in testing algorithms for detecting Giardia, with only four laboratories testing all diagnostic stools. The most common criteria were ‘travel history’ (11 laboratories) and/or ‘when requested’ (14 laboratories). Although only a small proportion of stools (2%–18% of the total submitted) was examined for Giardia in 15 laboratories, a higher positivity rate was observed for Giardia than for Cryptosporidium in 10 of these 15 laboratories. These findings suggest that Giardia is likely underreported in Scotland under current selection and testing algorithms.
Due to large uncertainties in many of the parameters used to model sea ice,
it is possible that models with significantly different physical processes
can be tuned to obtain realistic present-day simulations. However, in
studies of climate change, it is the response of the model to various
perturbations that is important, and this response can be significantly
different in sea-ice models that include or exclude various physical
feedback mechanisms. Because simplifications in sea-ice physics are
necessary for general circulation model experiments, it is important to
assess which physical processes are essential for the accurate determination
of the sensitivity of the ice pack to climate perturbations. We have
attempted to address these issues using a new coupled ice-thickness
distribution and ocean mixed-layer model. The sensitivity of the model to
surface heat-flux perturbations is examined, and the importance of the
ice-ocean and ice-albedo feedback mechanisms in determining this
sensitivity is analyzed. We find that the ice-ocean and ice-albedo
feedback processes are
not mutually exclusive, and that they both significantly alter the model
response to surface heat flux perturbations.
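The amplifying role of such feedbacks can be illustrated with a deliberately simple zero-dimensional sketch. This is not the coupled ice-thickness distribution model described above; the function, its parameters and all numerical values are illustrative assumptions only.

```python
def equilibrium_ice_thickness(f_perturb, albedo_feedback=True,
                              h0=3.0, sensitivity=0.5, gain=0.3):
    """Toy equilibrium ice thickness (m) under a surface heat-flux
    perturbation f_perturb (W m^-2). A linear thinning response
    (`sensitivity`, m per W m^-2) is optionally amplified by a crude
    linear ice-albedo gain: thinner ice lowers the albedo, so more
    heat is absorbed. Parameter values are illustrative, not tuned.
    """
    dh = -sensitivity * f_perturb      # direct thinning from warming
    if albedo_feedback:
        dh /= (1.0 - gain)             # linear feedback amplification
    return max(h0 + dh, 0.0)

# A 2 W m^-2 warming perturbation thins the ice more when the
# ice-albedo feedback is switched on:
no_fb = equilibrium_ice_thickness(2.0, albedo_feedback=False)    # 2.0 m
with_fb = equilibrium_ice_thickness(2.0, albedo_feedback=True)   # ~1.57 m
```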
Snow falling uniformly on a distribution of ice thicknesses results in a
distribution of snow-cover thicknesses. These snow depths depend on the
amount of snowfall, the time of year at which it falls, and the thickness
of the underlying ice. The effect of snowfall on the snow- and ice-thickness
distribution is examined using a single-column ice-ocean model. The time at
which snow begins to accumulate, and melt ponds and leads freeze, affects
the surface albedo. The rate of snowfall affects ice-growth rates and, as a
result, the ice-thickness distribution. During the period of rapid ice
growth between the autumn freeze and mid-winter, snow falling on newly
formed ice is rapidly depleted due to sublimation. Snow falling on thicker
ice remains throughout the winter to create a source of melt-water for ponds
and runoff into the ocean.
Little is known about the tooth wear of South American theropod dinosaurs. This paper describes wear facets in Abelisauridae, Carcharodontosauridae and some indeterminate theropod teeth from the Marília Formation. Four types of wear facets are proposed: vertically oriented attritional striations; perpendicular attritional surfaces; oval wear facets; and apical grooves. All these worn surfaces were produced by dental occlusion, except the apical grooves, which were produced by contact between predator teeth and prey bone during predator–prey interaction. More detailed biomechanical and hardness testing of teeth and bone may further elucidate the pattern of tooth wear in theropods.
We present an overview of recent multidisciplinary, multi-institutional efforts to identify and date major sources of combustion aerosol in the current and paleoatmospheres. The work was stimulated, in part, by an atmospheric particle “sample of opportunity” collected at Summit, Greenland in August 1994, that bore the 14C imprint of biomass burning. During the summer field seasons of 1995 and 1996, we collected air filter, surface snow and snowpit samples to investigate chemical and isotopic evidence of combustion particles that had been transported from distant fires. Among the chemical tracers employed for source identification are organic acids, potassium and ammonium ions, and elemental and organic components of carbonaceous particles. Ion chromatography, performed by members of the Climate Change Research Center (University of New Hampshire), has been especially valuable in indicating periods at Summit that were likely to have been affected by the long range transport of biomass burning aerosol. Univariate and multivariate patterns of the ion concentrations in the snow and ice pinpointed surface and snowpit samples for the direct analysis of particulate (soot) carbon and carbon isotopes. The research at NIST is focusing on graphitic and polycyclic aromatic carbon, which serve as almost certain indicators of fire, and measurements of carbon isotopes, especially 14C, to distinguish fossil and biomass combustion sources.
Complementing the chemical and isotopic record are direct “visual” (satellite imagery) records and less direct back-trajectory records, which indicate geographic source regions and transport paths. In this paper we illustrate the unique way in which the synthesis of the chemical, isotopic, satellite and trajectory data enhances our ability to develop the recent history of the formation and transport of soot deposited in the polar snow and ice.
Atmospheric gas samples (0.1m3) were collected at ground level during January/February 1984 in Las Vegas, Nevada for 14C/13C accelerator mass spectrometry and total abundance measurements of CO and CH4. During winter months in this locale, CO concentrations can occur at 10 to 100 times background, occasionally exceeding the National Ambient Air Quality Standard (NAAQS). Methane concentrations show a slight enhancement (∼24%) above the background (non-urban troposphere) level. A comparison of CO and CH4 concentrations shows a good linear correlation which may indicate a common source. Preliminary 14C/13C results of the two species suggest that fossil emissions are the predominant source of excess CO and CH4 in the samples taken. Estimates of anthropogenic CO and CH4 are important for source apportionment of combustion emissions. In addition, this information is valuable for understanding the global CO and CH4 cycles and, therefore, human impact on climate and the stratospheric ozone layer.
Recent progress in graphite target production for sub-milligram environmental samples in our facility is presented. We describe an optimized hydrolysis procedure now routinely used for the preparation of CO2 from inorganic samples, a new high-vacuum line dedicated to small sample processing (combining sample distillation and graphitization units), as well as a modified graphitization procedure. Although measurements of graphite targets as small as 35 μg C have been achieved, system background and measurement uncertainties increase significantly below 150 μg C. As target lifetime can become critically short for targets <150 μg C, the facility currently only processes inorganic samples down to 150 μg C. All radiocarbon measurements are made at the Scottish Universities Environmental Research Centre (SUERC) accelerator mass spectrometry (AMS) facility. Sample processing and analysis are labor-intensive, taking approximately 3 times longer than for samples ≥500 μg C. The technical details of the new system, graphitization yield, fractionation introduced during the process, and the system blank are discussed in detail.
Does radioactive decay follow the Poisson distribution?—a fundamental question, to which the theoretical answer seems to be, Yes. On the practical side, the answer to this question impacts the best achievable precision in well-controlled counting experiments. There have been some noteworthy experimental tests of the Poisson assumption, using systems carefully designed for the analysis of individual pulses from stable radioactive sources; thus far, experiment supports theory. For low-level counting, the nature of the background distribution can be of profound practical importance, especially for very long counting experiments where validation by an adequate number of full replicates may be impracticable. One is tempted in such cases to assume that the variance is equal to the mean, in order to estimate the measurement uncertainty. Background radiation, however, has multiple components, only some of which are governed by the laws of radioactive decay.
A specially designed low-level gas counting system at NIST for interactive, retrospective individual pulse shape and time series analysis makes possible the investigation of the empirical distribution function of the background radiation, in a manner similar to the previous empirical distribution studies of radioactive decay. Benefits of individual pulse analysis are that there is no information loss due to averaging and that two independent tests of the Poisson hypothesis can be performed using data from a single, extended measurement period without the need for replication; namely, tests of the distribution of arrival times, expected to be uniform, and the distribution of inter-arrival times, expected to be exponential. For low-level counting the second test has a very interesting and very informative complement: the distribution of coincidence-anticoincidence inter-arrival times.
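The two distributional tests described above can be sketched for an ideal simulated Poisson process. This is a simulation sketch, not the NIST system's analysis code: arrival times are generated from exponential inter-arrival times, and the arrival-time test is reduced to a Kolmogorov-Smirnov distance against the uniform distribution.

```python
import random

def simulate_poisson_arrivals(rate, t_total, seed=42):
    """Simulate arrival times of a homogeneous Poisson process by
    summing exponentially distributed inter-arrival times."""
    rng = random.Random(seed)
    t, arrivals = 0.0, []
    while True:
        t += rng.expovariate(rate)
        if t > t_total:
            return arrivals
        arrivals.append(t)

def ks_statistic_uniform(arrivals, t_total):
    """Kolmogorov-Smirnov distance between the empirical distribution
    of arrival times and the uniform distribution on [0, t_total].
    Conditioned on the event count, Poisson arrival times are
    uniform, so this distance should be small (order 1/sqrt(n))."""
    xs = sorted(arrivals)
    n = len(xs)
    d = 0.0
    for i, x in enumerate(xs, start=1):
        cdf = x / t_total
        d = max(d, abs(i / n - cdf), abs((i - 1) / n - cdf))
    return d

arrivals = simulate_poisson_arrivals(rate=5.0, t_total=1000.0)
d = ks_statistic_uniform(arrivals, 1000.0)
```

The complementary test on inter-arrival times would compare the successive differences of `arrivals` against an exponential distribution in the same way.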
Key outcomes from the study were that: 1) nonstationarity in the mean background rate over extended periods of time could be compensated by an on-line paired counter technique, which is far preferable to the questionable practice of using an “error-multiplier” that presumes the wandering (nonstationary) background to be random; and 2) individual empirical pulse distributions differed from the ideal GM and Poisson processes by exhibiting giant pulses, a continuum of small pulses, afterpulses, and in certain circumstances bursts of pulses and transient relaxation processes. The afterpulses constituted ca. 8% of the anticoincidence background events, yet they escaped detection by the conventional distributional tests.
When 14C signals approach background levels, the validity of assumptions concerning Poisson counting statistics and measurement system stability becomes crucial in interpreting the resultant low-level counting observations. This has been demonstrated in our previous work on detection limits for non-Poisson error and it is critical in our current studies of carbonaceous pollutants, where the 14C signal from just 5 mg C is comparable to that of the background for our miniature gas proportional counters. To assure data quality, our multi-detector system is designed for the on-line monitoring of critical parameters that reflect both the (statistical) nature of the non-Poisson errors and the underlying (physical) causes. It sends >60 bits of information/pulse to a microprocessor which automatically generates, for each counting period, two-dimensional spectra and multiparameter correlation and control charts. To evaluate the validity of long-term counting of 1–10 mg C we use robust (statistical) estimators, optimal counting interval subdivision, and time series analysis of the individual pulses. New opportunities for selective sampling and chemical fractionation which come with the small sample measurement capability have led us to give special attention also to higher control levels, involving, eg, isotopic heterogeneity and representative standard materials.
During the past three years radiocarbon assay has emerged as a primary tool in the quantitative assignment of sources of urban and rural particulate pollution. Its use in several major field studies has come about because of its excellent (fossil/biogenic) discriminating power, because of advances in 14C measurements of small samples, and because of the increased significance of carbonaceous particles in the atmosphere. The problem is especially important in the cities, where increased concentrations of fine particles lead to pollution episodes characterized by poor visibility and changes in the radiation balance (absorption, scattering), and immediate and possibly long-term health effects. Efforts in source apportionment in such affected areas have been based on emissions inventories, dispersion modeling, and receptor modeling – ie, chemical and physical (and statistical) characterization of particles collected at designated receptor sites. It is in the last category that 14C has become quite effective in helping to resolve particle sources. Results are presented for studies carried out in Los Angeles, Denver, and Houston which incorporated 14C measurements, inorganic and organic chemical characterization, and receptor modeling. The 14C data indicated wide ranging contributions of biogenic and fossil carbon sources – eg, <10% to 60% contemporary (biogenic) in Houston – depending on meteorological, biological, and anthropogenic activity. The combined (chemical, isotopic, statistical) data point to sources such as vehicles, wood combustion, power plants, and vegetation.
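The fossil/biogenic discriminating power of 14C rests on a simple two-source mass balance: fossil carbon is 14C-dead, while contemporary biogenic carbon sits near the modern reference level. A minimal sketch follows; the 110 pMC reference value and the 44 pMC sample value are illustrative assumptions, not measurements from the studies above, and the appropriate reference depends on the sampling year.

```python
def biogenic_fraction(pmc_sample, pmc_contemporary=110.0):
    """Two-source apportionment: fossil carbon carries 0 pMC
    (14C-dead), while contemporary biogenic carbon sits at
    pmc_contemporary (>100 pMC in the bomb-carbon era; 110 is an
    illustrative assumption). Returns the contemporary (biogenic)
    carbon fraction of the sample."""
    return pmc_sample / pmc_contemporary

# A particulate sample measured at 44 pMC against a 110 pMC
# contemporary reference is 40% biogenic, 60% fossil:
f_bio = biogenic_fraction(44.0)
f_fossil = 1.0 - f_bio
```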
Ice cores and snow pits of the cryosphere contain particles that detail the history of past atmospheric air compositions. Some of these particles result from combustion processes and have undergone long-range transport to arrive in the Arctic. Recent research has focused on the separation of particulate matter from ice and snow, as well as the subsequent analysis of the separated particles for 14C with accelerator mass spectrometry (AMS) and for individual particle compositions with laser microprobe mass analysis (LAMMA). The very low particulate concentrations in Arctic samples make these measurements a challenge. The first task is to separate the particles from the ice core. Two major options exist to accomplish this separation. One option is to melt the ice and then filter the meltwater. A second option is to sublimate the ice core directly, depositing the particles onto a surface. This work demonstrates that greater control is obtained through sublimation. A suite of analytical methods has been used for the measurement of the carbon in snow and ice. Total carbon was analyzed with a carbon/nitrogen/hydrogen (CHN) analyzer. AMS was used for the determination of carbon isotopes. Since source identification of the carbonaceous particles is of primary importance here, the use of LAMMA was incorporated to link individual particle molecular-structural patterns to the same group of particles that were measured by the other techniques. Prior to this study, neither AMS nor LAMMA had been applied to particles contained in snow. This paper discusses the development and limitations of the methodology required to make these measurements.
It has been shown that contamination from humic acids, chitin, fungal products, etc., contributing young carbon, and from bitumen and carbonate, contributing old carbon, may not be completely removed from wood and char samples by the usual hydrochloric acid and sodium hydroxide pretreatments of the samples. A procedure is offered for the isolation of a pure chemical substance from such samples, cellulose from wood and uncombined carbon from char, that must represent the original material. Cellulose is prepared by boiling the resin-free sample in 1.25% H2SO4 and 1.25% NaOH, adding Schweitzer's reagent, filtering, and precipitating from the filtrate by acidification. Uncombined carbon is separated from char samples as the flocculent precipitate remaining after boiling in 70% HNO3, followed by settling overnight from a large volume of 6M HNO3. A simple procedure for the chemical examination of char samples is also offered for the estimation of the amounts of bitumen, carbonate, combined, and uncombined carbon in char.
Recent progress in preparation/combustion of submilligram organic samples at our laboratories is presented. Routine methods had to be modified/refined to achieve acceptable and consistent procedural blanks for organic samples smaller than 1000 μg C. A description of the process leading to a modified combustion method for smaller organic samples is given in detail. In addition to analyzing different background materials, the influence of different chemical reagents on the overall radiocarbon background level was investigated, such as carbon contamination arising from copper oxide of different purities and from different suppliers. Using the modified combustion method, small amounts of background materials and known-age standard IAEA-C5 were individually combusted to CO2. Below 1000 μg C, organic background levels follow an inverse mass dependency when combusted with the modified method, increasing from 0.13 ± 0.05 pMC up to 1.20 ± 0.04 pMC for 80 μg C. Results for a given carbon mass were lower for combustion of etched Iceland spar calcite mineral, indicating that part of the observed background of bituminous coal was probably introduced by handling the material in atmosphere prior to combustion. Using the modified combustion method, the background-corrected activity of IAEA-C5 agreed to within 2 σ of the consensus value of 23.05 pMC down to a sample mass of 55 μg C.
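The inverse mass dependence of the background described above is what a roughly constant-mass modern-carbon blank predicts: the smaller the sample, the larger the blank's relative contribution. The following mass-balance sketch is illustrative only (the numbers are chosen for the example and this is not the laboratory's actual correction procedure).

```python
def blank_corrected_pmc(pmc_meas, m_meas_ug, pmc_blank, m_blank_ug):
    """Correct a measured activity for a constant-mass procedural
    blank, assuming the measured carbon mass already includes the
    blank carbon (a common simple model):

        pmc_meas * m_meas = pmc_true * (m_meas - m_blank)
                            + pmc_blank * m_blank
    """
    numerator = pmc_meas * m_meas_ug - pmc_blank * m_blank_ug
    return numerator / (m_meas_ug - m_blank_ug)

# A constant ~1 ug blank of 100 pMC carbon in an 80 ug 14C-dead
# sample would read ~1.25 pMC; correcting recovers a dead sample:
corrected = blank_corrected_pmc(1.25, 80.0, 100.0, 1.0)   # -> 0.0
```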