Item 9 of the Patient Health Questionnaire-9 (PHQ-9) asks about thoughts of death and self-harm, but not suicidality. Although it is sometimes used to assess suicide risk, most positive responses are not associated with suicidality. The PHQ-8, which omits Item 9, is therefore increasingly used in research. We assessed the equivalency of PHQ-8 and PHQ-9 total scores and their diagnostic accuracy for detecting major depression.
We conducted an individual patient data meta-analysis. We fit bivariate random-effects models to assess diagnostic accuracy.
16 742 participants (2097 major depression cases) from 54 studies were included. The correlation between PHQ-8 and PHQ-9 scores was 0.996 (95% confidence interval 0.996 to 0.996). The standard cutoff score of 10 for the PHQ-9 maximized sensitivity + specificity for the PHQ-8 among studies that used a semi-structured diagnostic interview reference standard (N = 27). At cutoff 10, the PHQ-8 was less sensitive by 0.02 (−0.06 to 0.00) and more specific by 0.01 (0.00 to 0.01) among those studies (N = 27), with similar results for studies that used other types of interviews (N = 27). For all 54 primary studies combined, across all cutoffs, the PHQ-8 was less sensitive than the PHQ-9 by 0.00 to 0.05 (0.03 at cutoff 10), and specificity was within 0.01 for all cutoffs (0.00 to 0.01).
PHQ-8 and PHQ-9 total scores were similar. Sensitivity may be minimally reduced with the PHQ-8, but specificity is similar.
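The cutoff comparison described above can be sketched in code. A minimal illustration (the scores and reference-standard diagnoses used below are invented, not study data): compute sensitivity and specificity at each candidate cutoff, and keep the cutoff that maximizes their sum.

```python
def sens_spec(scores, truth, cutoff):
    """Sensitivity and specificity when score >= cutoff is a positive screen
    and truth is 1 for a reference-standard major depression diagnosis."""
    tp = sum(1 for s, t in zip(scores, truth) if s >= cutoff and t)
    fn = sum(1 for s, t in zip(scores, truth) if s < cutoff and t)
    tn = sum(1 for s, t in zip(scores, truth) if s < cutoff and not t)
    fp = sum(1 for s, t in zip(scores, truth) if s >= cutoff and not t)
    return tp / (tp + fn), tn / (tn + fp)

def best_cutoff(scores, truth, cutoffs):
    """Cutoff maximizing sensitivity + specificity, the criterion used above
    to check that the standard PHQ-9 cutoff of 10 also suits the PHQ-8."""
    return max(cutoffs, key=lambda c: sum(sens_spec(scores, truth, c)))
```

In the meta-analysis this criterion is applied to pooled, model-based estimates rather than raw counts; the sketch shows only the per-study logic.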
The paper is based on a feasibility study to determine the suitability of various techniques for the non-destructive measurement of cladding thickness on uranium fuel elements. The techniques studied were: (1) the attenuation of the characteristic X-ray fluorescence from the uranium base metal by the cladding material, and (2) Compton scattering of X-rays from the cladding surface. The cladding materials used in the investigation were aluminum, 304 stainless steel and zirconium, providing a wide range of both atomic number and density.
Fully slatted concrete floors are prevalent in beef cattle housing. However, concerns have been raised about the welfare of cattle accommodated on slats. The objective of this study was to evaluate the effect of diet and floor type on the intake, performance and cleanliness of dairy-origin bulls from a mean age of 8 months to slaughter at 15.5 months old. Forty-eight bulls, which had a mean initial live weight of 212 kg (SD = 23.7), were allocated to one of four treatments which consisted of two floors and two diets, arranged in a 2×2 factorial design. The floors evaluated were a fully slatted concrete floor and a fully slatted concrete floor covered with rubber, while the diets offered were either a high concentrate diet or a grass silage-based diet supplemented with concentrates. Over the entire experimental period, floor type had no significant effect on intake. Interestingly, however, when bulls were offered concentrates ad libitum, those accommodated on rubber covered slats consumed more concentrates than those accommodated on concrete slats. No effect of floor type on intake was noted when bulls were offered the grass silage supplemented with concentrate diet. There were no significant interactions between floor and diet on animal performance. Animals accommodated on rubber covered slats had significantly better performance than those accommodated on concrete slats, as assessed by live weight at slaughter and live weight gain/day (P < 0.01) and estimated carcass gain/day (P < 0.05). The diet offered had no significant effect on animal performance. Bulls accommodated on rubber covered slats were significantly cleaner than those accommodated on concrete slats on day 97 (P < 0.001), but there was no significant effect of floor type when measured at other time points in the experiment. It is concluded from this study that diet has an important role to play in assessing bulls’ responses in performance to the effect of covering concrete slatted floors with rubber.
Bulls offered a high concentrate diet had a higher concentrate intake, higher performance but a similar feed conversion ratio (FCR) when accommodated on rubber covered slats compared to those accommodated on fully concrete slatted floors. Animals offered this intensive diet were less efficient (as measured by a higher FCR) than those offered a supplemented grass silage-based diet.
A double-blinded, randomised, placebo-controlled trial was conducted to determine whether routine pre-operative analgesia is beneficial in reducing post-operative ear pain following bilateral myringotomy and tube placement.
Forty-five children (aged 3–15 years) were randomised to receive either pre-operative analgesics (paracetamol and ibuprofen) (n = 21) or placebo (n = 24). All children underwent sevoflurane gas induction with intranasal fentanyl (2 mcg/kg) to reduce the incidence of emergence agitation. Post-operative pain scores were measured using the Wong-Baker Faces Pain Rating Scale. Median pain scores taken 90 minutes post-surgery, and the highest pain score recorded prior to 90 minutes, were analysed.
There were no statistical differences between the median pain scores at 90 minutes or subsequent need for rescue analgesia. Emergence agitation did not occur in any child. Inadvertent ear trauma, use of an intravenous cannula or airway adjunct did not affect pain scores.
Routine pre-operative analgesia does not reduce pain scores in the early post-operative period. Simple analgesics are effective for rescue analgesia in the minority of cases.
Different diagnostic interviews are used as reference standards for major depression classification in research. Semi-structured interviews involve clinical judgement, whereas fully structured interviews are completely scripted. The Mini International Neuropsychiatric Interview (MINI), a brief fully structured interview, is also sometimes used. It is not known whether interview method is associated with probability of major depression classification.
To evaluate the association between interview method and odds of major depression classification, controlling for depressive symptom scores and participant characteristics.
Data collected for an individual participant data meta-analysis of Patient Health Questionnaire-9 (PHQ-9) diagnostic accuracy were analysed and binomial generalised linear mixed models were fit.
A total of 17 158 participants (2287 with major depression) from 57 primary studies were analysed. Among fully structured interviews, odds of major depression were higher for the MINI compared with the Composite International Diagnostic Interview (CIDI) (odds ratio (OR) = 2.10; 95% CI = 1.15–3.87). Compared with semi-structured interviews, fully structured interviews (MINI excluded) were non-significantly more likely to classify participants with low-level depressive symptoms (PHQ-9 scores ≤6) as having major depression (OR = 3.13; 95% CI = 0.98–10.00), similarly likely for moderate-level symptoms (PHQ-9 scores 7–15) (OR = 0.96; 95% CI = 0.56–1.66) and significantly less likely for high-level symptoms (PHQ-9 scores ≥16) (OR = 0.50; 95% CI = 0.26–0.97).
The MINI may identify more people as depressed than the CIDI, and semi-structured and fully structured interviews may not be interchangeable methods, but these results should be replicated.
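The odds ratios reported above come from mixed models, but the underlying quantity can be illustrated with a plain 2×2 computation. A minimal sketch, using invented counts and a simple Wald interval rather than the model-based intervals of the analysis:

```python
import math

def odds_ratio_ci(a, b, c, d, z=1.96):
    """Odds ratio and Wald 95% CI for a 2x2 table:
    a = classified depressed by interview 1, b = not depressed (interview 1),
    c = classified depressed by interview 2, d = not depressed (interview 2).
    Counts must all be positive for the log-OR standard error to exist."""
    or_ = (a * d) / (b * c)
    se = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)
    return (or_,
            math.exp(math.log(or_) - z * se),
            math.exp(math.log(or_) + z * se))
```

For example, `odds_ratio_ci(10, 90, 5, 95)` gives an odds ratio of about 2.1 with its Wald interval; the counts are hypothetical, chosen only to echo the scale of the MINI-versus-CIDI comparison.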
Declaration of interest
Drs Jetté and Patten declare that they received a grant, outside the submitted work, from the Hotchkiss Brain Institute, which was jointly funded by the Institute and Pfizer. Pfizer was the original sponsor of the development of the PHQ-9, which is now in the public domain. Dr Chan is a steering committee member or consultant of Astra Zeneca, Bayer, Lilly, MSD and Pfizer. She has received sponsorships and honorarium for giving lectures and providing consultancy and her affiliated institution has received research grants from these companies. Dr Hegerl declares that within the past 3 years, he was an advisory board member for Lundbeck, Servier and Otsuka Pharma; a consultant for Bayer Pharma; and a speaker for Medice Arzneimittel, Novartis, and Roche Pharma, all outside the submitted work. Dr Inagaki declares that he has received grants from Novartis Pharma, lecture fees from Pfizer, Mochida, Shionogi, Sumitomo Dainippon Pharma, Daiichi-Sankyo, Meiji Seika and Takeda, and royalties from Nippon Hyoron Sha, Nanzando, Seiwa Shoten, Igaku-shoin and Technomics, all outside of the submitted work. Dr Yamada reports personal fees from Meiji Seika Pharma Co., Ltd., MSD K.K., Asahi Kasei Pharma Corporation, Seishin Shobo, Seiwa Shoten Co., Ltd., Igaku-shoin Ltd., Chugai Igakusha and Sentan Igakusha, all outside the submitted work. All other authors declare no competing interests. No funder had any role in the design and conduct of the study; collection, management, analysis and interpretation of the data; preparation, review or approval of the manuscript; and decision to submit the manuscript for publication.
Introduction: Patients with concussion frequently present to the emergency department (ED). Studies of athletes and children indicate that concussion symptoms are often more severe and prolonged in females compared with males. To date, studies of sex-based concussion differences in general adult populations have been limited. This study examined sex-based differences in concussion outcomes. Methods: Adult (>17 years) patients presenting to one of three urban EDs in Edmonton, Alberta with a Glasgow Coma Scale score of 13 within 72 hours of a concussive event were recruited by on-site research assistants. Follow-up calls at 30 and 90 days post ED discharge captured the extent of post-concussion symptoms (PCS) using the Rivermead Post-Concussion questionnaire (RPQ), the effect on daily living activities measured by the Rivermead Head Injury Questionnaire (RHIQ), and overall health-related quality of life using the 12-item Short Form Health Survey (SF-12). Dichotomous and categorical variables were compared using Fisher's exact test; continuous variables were compared using t-tests or Mann-Whitney tests, as appropriate. Results: Overall, 130/250 enrolled patients were female. The median age was 35 years; men trended towards being younger (median=32 years; IQR: 23, 45) than women (median=40 years; IQR: 22, 52). Compared to women, more men were single (56% vs 38%; p=0.007) and employed (82% vs 71%; p=0.055). Men and women experienced different injury mechanisms (p=0.007), with more women reporting injury due to a fall (44% vs 26%), while more men were injured at work (16% vs 7%) or due to an assault (11% vs 3%). Men had a higher return-to-ED rate (13% vs 5%; p=0.015). Women had higher RPQ scores at baseline (p<0.001) and 30-day follow-up (p=0.001); this difference was not significant by 90 days (p=0.099). While women reported on the RHIQ at 30 days that their injury affected their usual activities significantly more than men (median=5, IQR: 0, 11 vs median=0.5, IQR: 0.5, 7; p=0.004), both groups had similar scores on the SF-12 physical composite and mental composite scales at all three measurement points. Conclusion: In a general ED concussion population, demographic differences exist between men and women. Based on self-reported and objective outcomes, women's usual activities may be more affected by concussion and PCS than men's. Further analysis of these differences is required to identify different treatment options and ensure adequate care and treatment of injury.
We aimed to describe the clinical characteristics of West Nile patients reported in Québec in 2012 and 2013 and to document physical, mental and functional status 24 months after symptom onset according to illness severity. The cases were recruited by a public health professional. Data were collected from public health files, medical records and two standardised phone questionnaires: the Short Form-36 and the Instrumental Activities of Daily Living. In all, 92 persons participated in the study (25 had West Nile fever (WNF), 18 had meningitis and 49 had encephalitis). Encephalitis participants were older, had more underlying medical conditions, more neurological symptoms, worse hospital course and higher lethality than meningitis or WNF participants. Nearly half of the surviving hospitalised encephalitis patients required extra support upon discharge. At 24-month follow-up, encephalitis and meningitis patients had a lower score in two domains of the mental component: mental health and social functioning (P = 0.0025 and 0.0297, respectively) compared with the norms based on age- and sex-matched Canadians. Physical status was not affected by West Nile virus (WNV) infection. In addition, 5/36 (15%) of encephalitis, 1/17 (6%) of meningitis and 1/23 (5%) of WNF participants had new functional limitations 24 months after symptom onset. In summary, mental and functional sequelae in encephalitis patients are likely to represent a source of long-term morbidity. Preventive measures should target patients at higher risk of severe illness after WNV infection.
This study aimed to evaluate the effect of using different floor types to accommodate growing and finishing beef cattle on lameness. In all, 80 dairy-origin bulls were blocked according to live weight and breed into 20 groups, and randomly allocated within groups to one of four treatments. The floor types studied were fully slatted flooring throughout the entire experimental period (CS); fully slatted flooring covered with rubber strips throughout the entire experimental period (RS); fully slatted flooring during the growing period and then moved to a solid floor covered with straw bedding during the finishing period (CS-S); and fully slatted flooring during the growing period and then moved to fully slatted flooring covered with rubber strips during the finishing period (CS-RS). The total duration of the study was 204 days. The first 101 days were defined as the growing period, with the remainder of the study defined as the finishing period. During the growing period, there was a tendency for bulls accommodated on CS to have a higher locomotion score compared with those accommodated on RS (P=0.059). However, floor type had no significant effect on locomotion score during the finishing period. There was also no significant effect of floor type on digital dermatitis in either the growing or finishing period. Floor type had no significant effect on swelling at the leg joints at the end of the finishing period. Bulls accommodated on RS had the lowest probability of bruised soles during both the growing and finishing periods (P<0.01). Growing bulls accommodated on CS had significantly greater front heel height net growth compared with those accommodated on RS (P<0.05). However, bulls accommodated on RS tended to have greater front toe net growth compared with those accommodated on CS (P=0.087). Finishing bulls accommodated on CS-RS had the greatest front toe net growth (P<0.001). Heel height net growth was greatest in bulls accommodated on CS-S (P<0.001).
Floor type had no significant effect on mean maximum hoof temperature during the growing period. Finishing bulls accommodated on CS-S had a significantly lower mean maximum hoof temperature compared with those accommodated on any other floor type (P<0.001). The study concluded that rubber flooring is a suitable alternative to fully slatted flooring, reducing the prevalence of bruised soles. Despite greater toe net growth in bulls accommodated on rubber flooring, there was no effect of floor type on locomotion score, suggesting that increased toe net growth does not adversely affect walking ability. In addition, although mean maximum hoof temperature was lowest in bulls accommodated on straw bedding, there was no evidence to suggest this is indicative of improved hoof health.
Despite lessons learned from the recent Ebola epidemic, attempts to survey and determine the industry-specific needs of non-health care workers for addressing highly infectious diseases have been minimal. The aircraft rescue and fire fighting (ARFF) industry is often overlooked in highly infectious disease training and education, even though such training is critical to the field given the elevated occupational exposure risk during ARFF operations.
Supervisors perceived Frontline respondents to be more willing to, and comfortable with, encountering potential highly infectious disease scenarios than the Frontline respondents themselves indicated. More than one-third of respondents incorrectly marked the transmission routes of viral hemorrhagic fevers. There were discrepancies in self-reports on the existence of highly infectious disease orientation and skills demonstration, employee resources, and personal protective equipment policies, with 7.5%–24.0% more Supervisors than Frontline respondents marking activities as conducted.
There are deficits in highly infectious disease knowledge, skills, and abilities among ARFF members that must be addressed to enhance member safety, health, and well-being. (Disaster Med Public Health Preparedness. 2018;12:675-679)
The aim of this study was to evaluate the effect of using different floor types to accommodate growing and finishing beef cattle on their performance, cleanliness, carcass characteristics and meat quality. In total, 80 dairy-origin young bulls (mean initial live weight 224 kg (SD=28.4 kg)) were divided into 20 blocks with four animals each according to live weight. The total duration of the experimental period was 204 days. The first 101 days were defined as the growing period, with the remainder of the study defined as the finishing period. Cattle were randomly assigned within blocks to one of four floor type treatments, which included fully slatted flooring throughout the entire experimental period (CS); fully slatted flooring covered with rubber strips throughout the entire experimental period (RS); fully slatted flooring during the growing period and moved to a solid floor covered with straw bedding during the finishing period (CS-S) and fully slatted flooring during the growing period and moved to fully slatted flooring covered with rubber strips during the finishing period (CS-RS). Bulls were offered ad libitum grass silage supplemented with concentrates during the growing period. During the finishing period, bulls were offered concentrates supplemented with chopped barley straw. There was no significant effect of floor type on total dry matter intake (DMI), feed conversion ratio, daily live weight gain or back fat depth during the growing and finishing periods. Compared with bulls accommodated on CS, RS and CS-RS, bulls accommodated on CS-S had a significantly lower straw DMI (P<0.01). Although bulls accommodated on CS and CS-S were significantly dirtier compared with those accommodated on RS and CS-RS on days 50 (P<0.05) and 151 (P<0.01), there was no effect of floor type on the cleanliness of bulls at the end of the growing and finishing periods. There was also no significant effect of floor type on carcass characteristics or meat quality.
However, bulls accommodated on CS-S tended to have less channel, cod and kidney fat (P=0.084) compared with those accommodated on CS, RS and CS-RS. Overall, floor type had no effect on the performance, cleanliness, carcass characteristics or meat quality of growing or finishing beef cattle.
The aim of this 3-year study was to compare two suckler cow genotypes, namely Limousin×Holstein (LH) (sourced from the dairy herd) and Stabiliser (ST) (a composite breed), in terms of performance at calving. Both dam genotypes were bred to a ST sire and calved in spring/early summer. There was no significant effect of dam genotype on concentrations of casein, lactose, protein or urea nitrogen in the colostrum. Colostrum from LH cows had a significantly higher fat concentration compared with ST cows (P<0.05). Dam genotype had no effect on incidence of calving difficulty, cow temperament or mothering ability score. There was a significant difference in milk supply scores between the two breeds of cows when the 3 years of data were combined (P=0.002), with a higher percentage of LH cows having a plentiful supply of milk compared with ST cows and, conversely, a higher percentage of ST cows having limited milk compared with LH cows. However, this was not a consistent effect over the 3 years. This study demonstrated that both dam breeds exhibit strong maternal attributes at calving. However, further work is required to investigate whether LH cows have a more plentiful milk supply, since this has the potential to influence the growth rate of progeny.
Risk adjustment is needed to fairly compare central-line–associated bloodstream infection (CLABSI) rates between hospitals. Until 2017, the Centers for Disease Control and Prevention (CDC) methodology adjusted CLABSI rates only by type of intensive care unit (ICU). The 2017 CDC models also adjust for hospital size and medical school affiliation. We hypothesized that risk adjustment would be improved by including patient demographics and comorbidities from electronically available hospital discharge codes.
Using a cohort design across 22 hospitals, we analyzed data from ICU patients admitted between January 2012 and December 2013. Demographics and International Classification of Diseases, Ninth Edition, Clinical Modification (ICD-9-CM) discharge codes were obtained for each patient, and CLABSIs were identified by trained infection preventionists. Models adjusting only for ICU type and for ICU type plus patient case mix were built and compared using discrimination and standardized infection ratio (SIR). Hospitals were ranked by SIR for each model to examine and compare the changes in rank.
Overall, 85,849 ICU patients were analyzed and 162 (0.2%) developed CLABSI. The significant variables added to the ICU model were coagulopathy, paralysis, renal failure, malnutrition, and age. The C statistics were 0.55 (95% CI, 0.51–0.59) for the ICU-type model and 0.64 (95% CI, 0.60–0.69) for the ICU-type plus patient case-mix model. When the hospitals were ranked by adjusted SIRs, 10 hospitals (45%) changed rank when comorbidity was added to the ICU-type model.
Our risk-adjustment model for CLABSI using electronically available comorbidities demonstrated better discrimination than did the CDC model. The CDC should strongly consider comorbidity-based risk adjustment to more accurately compare CLABSI rates across hospitals.
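The SIR-based ranking described above can be sketched as follows. Hospital names, infection counts and per-patient risks are invented; in the study, expected counts come from the fitted case-mix model's predicted probabilities.

```python
def sir(observed, predicted_risks):
    """Standardized infection ratio: observed CLABSIs divided by the
    expected count, i.e. the sum of per-patient predicted probabilities."""
    return observed / sum(predicted_risks)

def rank_by_sir(hospitals):
    """hospitals: {name: (observed CLABSIs, [model risk per ICU patient])}.
    Returns names ordered from lowest (best) to highest SIR."""
    scores = {name: sir(obs, risks) for name, (obs, risks) in hospitals.items()}
    return sorted(scores, key=scores.get)
```

Re-ranking under two different risk models, as in the abstract, amounts to calling `rank_by_sir` with each model's predicted risks and comparing the resulting orderings.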
To assess the impact of farm management on herd fertility, a survey of 105 beef farms in Northern Ireland was conducted to establish the relationship between management variables and fertility. Each herd's average calving interval (CI) and the proportion of cows with a CI > 450 days (extended calving interval, ECI) were calculated to establish herd fertility. The relationship between each response variable (CI and proportion ECI) and each explanatory variable (respondents' answers to the questionnaire) was examined using univariate linear regression analyses. All explanatory variables found to be associated with a response variable were then modelled, group by group, using a fully automated multivariate stepwise regression algorithm employing forward selection with backward elimination. The optimum targets of a 365-day CI and zero cows per hundred calved with ECI were not widely attained in the current study. The distribution of CI and proportion ECI suggests that more realistic targets would be a 379-day CI and 5 cows per hundred calved with ECI in commercial beef breeding herds. Six management factors were found to be associated with herd fertility: herd vaccination, bull selection, fertility management, breeding female management, perception of the extension service (rural education provided by the government) and record keeping. Respondents who vaccinated cows had a reduction of 5 cows per hundred calved in the proportion of cows with ECI, and as the number of vaccines administered to a cow increased, the CI decreased. Regular vaccination of breeding bulls was associated with a 9-day reduction in CI. Bull selection strategy had several associations with herd fertility; most notable was that respondents who used visual selection rather than estimated breeding values (EBVs) to select bulls had a 15-day longer CI and a 7 cows per hundred calved higher proportion of cows with ECI.
For each 0.01 increase in the proportion of cows served by artificial insemination, CI increased by 0.16 days. Respondents who rated their beef breeding herd fertility as ‘very good’ had lower ECI and CI than those who rated it as poor or satisfactory. Condition scoring of cows at weaning lowered ECI by 5 cows per hundred calved. Those who perceived the extension service to be very useful had the lowest CI and lowest ECI. Respondents who did not keep a record of CI to assess herd fertility had an 11-day longer CI and a 6 cows per hundred calved higher proportion ECI than those who did. In conclusion, the survey found a number of important variables linked to improved fertility, including selecting sires based on EBVs and using a robust vaccination programme.
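The stepwise modelling used in this survey can be illustrated with a greedy forward-selection sketch. This is a simplified stand-in: it scores candidate variables by R² gain rather than the F-test/p-value criteria (and backward-elimination passes) of a full stepwise algorithm, and the variable names and data are invented.

```python
import numpy as np

def forward_select(X, y, names, min_improvement=0.01):
    """Greedy forward selection of regressors by R^2 gain (a simplified
    stand-in for a p-value-based stepwise procedure)."""
    def r2(cols):
        # Design matrix: intercept column plus the chosen regressors.
        A = np.column_stack([np.ones(len(y))] + [X[:, c] for c in cols])
        beta, *_ = np.linalg.lstsq(A, y, rcond=None)
        resid = y - A @ beta
        total = y - y.mean()
        return 1.0 - (resid @ resid) / (total @ total)

    chosen, current = [], 0.0
    while True:
        remaining = [c for c in range(X.shape[1]) if c not in chosen]
        if not remaining:
            break
        # Add the variable that improves fit most, if the gain is worthwhile.
        best = max(remaining, key=lambda c: r2(chosen + [c]))
        gain = r2(chosen + [best]) - current
        if gain < min_improvement:
            break
        chosen.append(best)
        current += gain
    return [names[c] for c in chosen]
```

With a response driven by one of two candidate columns, the sketch picks out the informative column and stops, mirroring how the survey's algorithm retains only explanatory variables that meaningfully improve the fit.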
In addition to the more or less steady solar wind, the Sun also ejects mass in highly time-dependent events taking place in the corona, once every few days at solar activity minimum and as often as three times a day at solar activity maximum (Hundhausen 1988, Low 1986). These events involve large-scale reconfiguration of the corona with an expulsion of some 10¹⁵ g of ionized material into interplanetary space. The High Altitude Observatory (HAO) operates a ground-based internally occulted coronagraph at Mauna Loa, Hawaii, with a field of view of the corona from 1.2 to 2.2R⊙ in heliocentric distance, registering polarization brightness. A second instrument at the same site observes the solar limb in Hα emission to detect chromospheric material from the limb out to 1.5R⊙. HAO also operates an externally occulted coronagraph/polarimeter onboard the NASA Solar Maximum Mission satellite (SMM), launched in 1980, capitalizing on the advantage of space with a field of view from 1.5 to 6R⊙ to cover the fainter outer corona. Coronal mass ejections involve magnetic field reconfiguration from high in the corona down to the base lying below 1.1R⊙. Important physical insights can be gained when simultaneous observations by HAO's three instruments are put together with a common scale and orientation to reveal a mass ejection in the full extent of the solar atmosphere from the limb outward. Combined observations of two mass ejections are presented, one associated with an eruptive prominence and the other associated with a flare. The significance of these two events is that in both cases the mass ejection was already in fully developed motion and had traveled high into the corona well before the onset of the associated prominence or flare eruption, pointing to an instability in the large-scale coronal magnetic field as the underlying cause of the global reconfiguration.
The radiocarbon dating of volcanic ash (tephra) deposits in New Zealand has been difficult on sites remote from the eruption, which contain either little carbon or degraded and contaminated charcoal. Although many studies of contamination removal from macroscopic charcoals from tephra sequences have been made, little attention has been paid to those containing no visible charcoal, because of the difficulty of obtaining sufficient carbon for radiometric dating. We report here experiments using accelerator mass spectrometry to establish a reliable method for dating a low-carbon aeolian and peat deposit containing a tephra horizon. Results so far demonstrate that improvements to existing chemical pretreatment methods are possible, and that dates obtained on oxidized fine-grained residues can approach the maximum age determined on good quality charred wood samples.
This paper outlines a dating program designed to test the reproducibility of radiocarbon dates on different materials of Late-Glacial age (plant macrofossils, fossil beetle remains, and the “humic” and “humin” chemical fractions of limnic sediments) using a combination of radiometric (beta counting) and accelerator mass spectrometry (AMS) techniques. The results have implications for the design of sampling strategies and for the development of improved dating protocols, both of which are important if a high-precision 14C chronology for the Late-Glacial is to be achieved.
14C measured in trace gases in clean air helps to determine the sources of such gases, their long-range transport in the atmosphere, and their exchange with other carbon cycle reservoirs. In order to separate sources, transport and exchange, it is necessary to interpret measurements using models of these processes. We present atmospheric 14CO2 measurements made in New Zealand since 1954 and at various Pacific Ocean sites for shorter periods. We analyze these for latitudinal and seasonal variation, the latter being consistent with a seasonally varying exchange rate between the stratosphere and troposphere. The observed seasonal cycle does not agree with that predicted by a zonally averaged global circulation model. We discuss recent accelerator mass spectrometry measurements of atmospheric 14CH4 and the problems involved in determining the fossil fuel methane source. Current data imply a fossil carbon contribution of ca 25%, and the major sources of uncertainty in this number are the uncertainty in the nuclear power source of 14CH4, and in the measured value for δ14C in atmospheric methane.
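The ca 25% fossil estimate follows from two-source mixing, because fossil methane is ¹⁴C-free while biogenic methane carries roughly contemporary ¹⁴C. A minimal sketch under that simplification (the percent-modern values are illustrative; the nuclear-industry ¹⁴CH₄ term is precisely the correction whose size the text identifies as a dominant uncertainty):

```python
def fossil_fraction(pmc_measured, pmc_biogenic=100.0, pmc_nuclear=0.0):
    """Fraction of atmospheric CH4 from fossil sources under two-source mixing:
    fossil CH4 contains no 14C; biogenic CH4 carries pmc_biogenic percent
    modern carbon. pmc_nuclear is an additive 14CH4 contribution from the
    nuclear industry, subtracted before attributing the remainder."""
    return 1.0 - (pmc_measured - pmc_nuclear) / pmc_biogenic
```

For instance, a measured value of 75 percent modern carbon with no nuclear correction implies a 25% fossil fraction; any nonzero nuclear-industry contribution pushes the inferred fossil fraction higher, which is why that source term dominates the uncertainty budget.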
We present δ13C data from both bulk organic sediment samples and terrestrial plant macrofossils from five high-resolution sedimentary sequences from the United Kingdom from which extensive multiproxy data sets have been obtained. These span the last glacial-interglacial transition. Chronological control has been provided by radiocarbon dating and/or tephrochronology. The results demonstrate that significant shifts in bulk organic δ13C can be identified at key climatic transitions in most of the sites. The data are affected by site-specific influences that restrict their use as chronological markers. However, terrestrial plant macrofossil records are more consistent and reveal shifts that appear to be synchronous and which therefore offer a basis for interregional correlation as well as significant paleoenvironmental information.