Introduction: For rhythm control of acute atrial flutter (AAFL) in the emergency department (ED), choices include initial drug therapy or initial electrical cardioversion (ECV). We compared the strategies of pharmacological cardioversion followed by ECV if necessary (Drug-Shock), and ECV alone (Shock Only). Methods: We conducted a randomized, blinded, placebo-controlled trial (1:1 allocation) comparing two rhythm control strategies at 11 academic EDs. We included stable adult patients with AAFL whose onset of symptoms was <48 hours. Patients underwent central web-based randomization stratified by site. The Drug-Shock group received an infusion of procainamide (15 mg/kg over 30 minutes) followed 30 minutes later, if necessary, by ECV at 200 joules x 3 shocks. The Shock Only group received an infusion of saline followed, if necessary, by ECV x 3 shocks. The primary outcome was conversion to sinus rhythm for ≥30 minutes at any time following onset of infusion. Patients were followed for 14 days. The primary outcome was evaluated on an intention-to-treat basis. Statistical significance was assessed using chi-squared tests and multivariable logistic regression. Results: We randomized 76 patients, and none was lost to follow-up. The Drug-Shock (N = 33) and Shock Only (N = 43) groups were similar for all characteristics including mean age (66.3 vs 63.4 yrs), duration of AAFL (30.1 vs 24.5 hrs), previous AAFL (72.7% vs 69.8%), median CHADS2 score (1 vs 1), and mean initial heart rate (128.9 vs 126.0 bpm). The Drug-Shock and Shock Only groups were similar for the primary outcome of conversion (100% vs 93%; absolute difference 7.0%, 95% CI -0.6 to 14.6; P = 0.25). The multivariable analyses confirmed the similarity of the two strategies (P = 0.19). In the Drug-Shock group, 21.2% of patients converted with the infusion. There were no statistically significant differences in time to conversion (84.2 vs 97.6 minutes), total ED length of stay (9.4 vs 7.5 hours), disposition home (100% vs 95.3%), or stroke within 14 days (0 vs 0). Premature discontinuation of the infusion (usually for transient hypotension) was more common in the Drug-Shock group (9.1% vs 0.0%), but there were no serious adverse events. Conclusion: Both the Drug-Shock and Shock Only strategies were highly effective and safe in allowing AAFL patients to go home in sinus rhythm. IV procainamide alone was effective in only one-fifth of patients, much less than for acute AF.
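For readers who want to reproduce the headline comparison above, the sketch below recomputes the absolute risk difference and a standard Wald 95% confidence interval from the reported arm sizes, assuming 33/33 conversions in the Drug-Shock arm and 40/43 (93%) in the Shock Only arm; this is an illustrative calculation, not the trial's analysis code.

```python
from math import sqrt

# Reported arm sizes; the Shock Only conversion count of 40/43 is inferred from the 93% figure.
drug_shock_converted, drug_shock_n = 33, 33
shock_only_converted, shock_only_n = 40, 43

p1 = drug_shock_converted / drug_shock_n   # 1.000
p2 = shock_only_converted / shock_only_n   # 0.930

diff = p1 - p2                                                        # absolute risk difference
se = sqrt(p1 * (1 - p1) / drug_shock_n + p2 * (1 - p2) / shock_only_n)  # Wald standard error
lo, hi = diff - 1.96 * se, diff + 1.96 * se

print(f"difference = {diff:.1%}, 95% CI {lo:.1%} to {hi:.1%}")
# -> difference = 7.0%, 95% CI -0.6% to 14.6%, consistent with the values reported above.
```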
Introduction: CAEP recently developed the acute atrial fibrillation (AF) and flutter (AFL) [AAFF] Best Practices Checklist to promote optimal care and guidance on cardioversion and rapid discharge of patients with AAFF. We sought to assess the impact of implementing the Checklist in large Canadian EDs. Methods: We conducted a pragmatic stepped-wedge cluster randomized trial at 11 large Canadian ED sites in five provinces over 14 months. All hospitals started in the control period (usual care) and then crossed over to the intervention period in random sequence, one hospital per month. We enrolled consecutive, stable patients presenting with AAFF whose symptoms required ED management. Our intervention was informed by qualitative stakeholder interviews to identify perceived barriers and enablers for rapid discharge of AAFF patients. The intervention components included local champions, presentation of the Checklist to physicians in group sessions, an online training module, a smartphone app, and targeted audit and feedback. The primary outcome was ED length of stay in minutes from time of arrival to time of disposition, analyzed at the individual patient level using linear mixed effects regression accounting for the stepped-wedge design. We estimated a sample size of 800 patients. Results: We enrolled 844 patients with none lost to follow-up. Those in the control (N = 316) and intervention periods (N = 528) were similar for all characteristics including mean age (61.2 vs 64.2 yrs), duration of AAFF (8.1 vs 7.7 hrs), AF (88.6% vs 82.9%), AFL (11.4% vs 17.1%), and mean initial heart rate (119.6 vs 119.9 bpm). Median lengths of stay for the control and intervention periods were 413.0 vs 354.0 minutes, respectively (P < 0.001). Comparing control to intervention, there was an increase in the use of antiarrhythmic drugs (37.4% vs 47.4%; P < 0.01), electrical cardioversion (45.1% vs 56.8%; P < 0.01), and discharge in sinus rhythm (75.3% vs 86.7%; P < 0.001). There was a decrease in ED consultations to cardiology and medicine (49.7% vs 41.1%; P < 0.01), and a small, statistically non-significant increase in anticoagulant prescriptions (39.6% vs 46.5%; P = 0.21). Conclusion: This multicenter implementation of the CAEP Best Practices Checklist led to a significant decrease in ED length of stay along with more ED cardioversions, fewer ED consultations, and more discharges in sinus rhythm. Widespread and rigorous adoption of the CAEP Checklist should lead to improved care of AAFF patients in all Canadian EDs.
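As a rough illustration of the individual patient-level analysis described above, the following sketch fits a linear mixed-effects model of ED length of stay with a random intercept for site (the cluster) and fixed effects for the intervention indicator and the calendar step, as is common for stepped-wedge designs. The file name and column names (los_minutes, intervention, step, site) are hypothetical, and the trial's actual model specification may differ.

```python
import pandas as pd
import statsmodels.formula.api as smf

# Hypothetical patient-level dataset: one row per enrolled patient.
# Assumed columns:
#   los_minutes  - ED length of stay from arrival to disposition
#   intervention - 1 if the site had crossed over to the Checklist at that time, else 0
#   step         - calendar month of the stepped-wedge schedule (adjusts for secular trends)
#   site         - ED identifier (the cluster)
df = pd.read_csv("aaff_patients.csv")

# Random intercept for site accounts for clustering of patients within EDs.
model = smf.mixedlm("los_minutes ~ intervention + C(step)", data=df, groups="site")
result = model.fit()
print(result.summary())  # the 'intervention' coefficient estimates the change in LOS (minutes)
```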
Introduction: Acute heart failure (AHF) is a common emergency department (ED) presentation and may be associated with poor outcomes. Conversely, many patients rapidly improve with ED treatment and may not need hospital admission. Because there is little evidence to guide disposition decisions by ED and admitting physicians, we sought to create a risk score for predicting short-term serious outcomes (SSO) in patients with AHF. Methods: We conducted prospective cohort studies at 9 tertiary care hospital EDs from 2007 to 2019, and enrolled adult patients who required treatment for AHF. Each patient was assessed for standardized real-time clinical and laboratory variables, as well as for SSO (defined as death within 30 days or intubation, non-invasive ventilation (NIV), myocardial infarction, coronary bypass surgery, or new hemodialysis after admission). The fully pre-specified logistic regression model with 13 predictors (age, pCO2, and SaO2 were modeled using spline functions with 3 knots, and heart rate and creatinine with 5 knots) was fitted to the 10 multiple imputation datasets. Harrell's fast stepdown procedure reduced the number of variables. We calculated the potential impact on sensitivity (95% CI) for SSO and hospital admissions, and estimated a sample size of 170 SSOs. Results: The 2,246 patients had mean age 77.4 years, male sex 54.5%, EMS arrival 41.1%, IV NTG 3.1%, ED NIV 5.2%, and admission on initial visit 48.6%. Overall there were 174 (7.8%) SSOs, including 70 deaths (3.1%). The final risk scale comprises five variables (points) and had a c-statistic of 0.76 (95% CI 0.73–0.80): (1) valvular heart disease (1); (2) ED non-invasive ventilation (2); (3) creatinine 150–300 (1) or ≥300 (2); (4) troponin 2–4× URL (1) or ≥5× URL (2); (5) walk test failed (2). The probability of SSO ranged from 2.0% for a total score of 0 to 90.2% for a score of 10, showing good calibration. The model was stable over 1,000 bootstrap samples. Choosing a risk model total point admission threshold of >2 would yield a sensitivity of 80.5% (95% CI 73.9–86.1) for SSO with no change in admissions from current practice (48.6% vs 48.7%). Conclusion: Using a large prospectively collected dataset, we created a concise and sensitive risk scale to assist with admission decisions for patients with AHF in the ED. Implementation of this risk scoring scale should lead to safer and more efficient disposition decisions, with more high-risk patients being admitted and more low-risk patients being discharged.
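To make the scoring rule above concrete, here is a minimal sketch that tallies the five-variable score for a single patient and applies the >2-point admission threshold discussed in the results. The function name, field names, and the handling of boundary values (e.g., troponin between 4× and 5× URL) are illustrative assumptions only.

```python
def ahf_risk_score(valvular_disease: bool,
                   ed_niv: bool,
                   creatinine: float,
                   troponin_ratio_url: float,  # troponin as a multiple of the upper reference limit
                   walk_test_failed: bool) -> int:
    """Tally the five-variable AHF risk score listed above (illustrative sketch, not the study's code)."""
    score = 0
    score += 1 if valvular_disease else 0      # valvular heart disease: 1 point
    score += 2 if ed_niv else 0                # non-invasive ventilation in the ED: 2 points
    if creatinine >= 300:                      # creatinine ≥300: 2 points
        score += 2
    elif creatinine >= 150:                    # creatinine 150-300: 1 point
        score += 1
    if troponin_ratio_url >= 5:                # troponin ≥5x URL: 2 points
        score += 2
    elif troponin_ratio_url >= 2:              # troponin 2x-4x URL: 1 point
        score += 1
    score += 2 if walk_test_failed else 0      # failed walk test: 2 points
    return score

# Example: ED NIV, creatinine 180, troponin 3x URL, walk test passed -> score 4 (> 2, so admit)
score = ahf_risk_score(False, True, 180, 3.0, False)
print(score, "-> admit" if score > 2 else "-> consider discharge")
```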
Introduction: An important challenge physicians face when treating acute heart failure (AHF) patients in the emergency department (ED) is deciding whether to admit or discharge, with or without early follow-up. The overall goal of our project was to improve care for AHF patients seen in the ED while avoiding unnecessary hospital admissions. The specific goal was to introduce hospital rapid referral clinics to ensure AHF patients were seen within 7 days of ED discharge. Methods: This prospective before-after study was conducted at two campuses of a large tertiary care hospital, including the EDs and specialty outpatient clinics. We enrolled AHF patients ≥50 years who presented to the ED with shortness of breath (<7 days). The 12-month before (control) period was separated from the 12-month after (intervention) period by a 3-month implementation period. Implementation included creation of rapid access AHF clinics staffed by cardiology and internal medicine, and development of referral procedures. There was extensive in-servicing of all ED staff. The primary outcome measure was hospital admission at the index visit or within 30 days. Secondary outcomes included mortality and actual access to rapid follow-up. We used segmented autoregression analysis of the monthly proportions to determine whether there was a change in admissions coinciding with the introduction of the intervention, and estimated a sample size of 700 patients. Results: The patients in the before period (N = 355) and the after period (N = 374) were similar for age (77.8 vs. 78.1 years), arrival by ambulance (48.7% vs 51.1%), comorbidities, current medications, and need for non-invasive ventilation (10.4% vs. 6.7%). Comparing the before to the after periods, we observed a decrease in hospital admissions on the index visit (from 57.7% to 42.0%; P < 0.01), as well as in all admissions within 30 days (from 65.1% to 53.5%; P < 0.01). The autoregression analysis, however, demonstrated a pre-existing trend towards fewer admissions and could not attribute this to the intervention (P = 0.91). Attendance at a specialty clinic amongst those discharged increased from 17.8% to 42.1% (P < 0.01) and the median days to clinic decreased from 13 to 6 days (P < 0.01). 30-day mortality did not change (4.5% vs. 4.0%; P = 0.76). Conclusion: Implementation of rapid-access dedicated AHF clinics led to considerably increased access to specialist care, much reduced follow-up times, and a possible reduction in hospital admissions. Widespread use of this approach can improve AHF care in Canada.
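The segmented autoregression described above is a form of interrupted time-series analysis. The sketch below shows one common way to set it up, regressing the monthly admission proportion on a baseline trend, a level-change indicator, and a post-intervention slope change, with AR(1) errors. The file name, column names, and the use of statsmodels' GLSAR are illustrative assumptions, not the study's actual code.

```python
import pandas as pd
import statsmodels.api as sm

# Hypothetical monthly series: proportion of index visits admitted, before/after implementation.
# Assumed columns: month_index (1..24), admit_prop, post (1 after the implementation period, else 0).
df = pd.read_csv("monthly_admissions.csv")

df["time"] = df["month_index"]                 # baseline linear trend
df["level"] = df["post"]                       # step change at the intervention
df["slope"] = df["post"] * (df["time"] - df.loc[df["post"] == 1, "time"].min())  # post-intervention trend change

X = sm.add_constant(df[["time", "level", "slope"]])
model = sm.GLSAR(df["admit_prop"], X, rho=1)   # AR(1) errors for autocorrelated monthly data
result = model.iterative_fit(maxiter=10)
print(result.summary())  # 'level' and 'slope' test whether admissions changed with the intervention
```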
Psychometrically identified positive and negative schizotypy are differentially related to psychopathology, personality, and social functioning. However, little is known about the experience and expression of schizotypy in daily life. The present study employed the experience sampling method (ESM) to assess positive and negative schizotypy in daily life in a nonclinical sample of 412 young adults. ESM is a structured diary technique in which participants are prompted at random times during the day to complete an assessment of their current experiences. As hypothesized, positive schizotypy was associated with increased negative affect, thought impairment, suspiciousness, negative beliefs about current activities, and feelings of rejection, but not with social disinterest or decreased positive affect. Negative schizotypy, on the other hand, was associated with decreased positive affect and pleasure in daily life, increased negative affect, and marked decreases in social contact and interest. Both positive and negative schizotypy were associated with the desire to be alone when with others. However, this desire appeared to be moderated by anxiety in positive schizotypy and by diminished positive affect in negative schizotypy. The findings support the construct validity of a multidimensional model of schizotypy and the use of psychometric inventories for assessing these dimensions. ESM appears to be a promising method for examining the daily life experiences of schizotypic individuals.
Biases in cognition such as Jumping to Conclusions (JTC) and Verbal Self-Monitoring (VSM) are thought to underlie the formation of psychotic symptoms. This prospective study in people with an At Risk Mental State (ARMS) for psychosis examined how these cognitive biases changed over time and whether they predicted clinical and functional outcomes. Twenty-three participants were assessed at clinical presentation and a mean of 31 months later. Performance on the JTC and VSM tasks was measured at both time points. Relationships to symptom severity, level of function and the incidence of psychotic disorder were then examined. The levels of symptoms, function and VSM all improved over time, while JTC was stable. Five participants (22%) developed a psychotic disorder during the follow-up period, but the risk of transition was not related to performance on either task at baseline, or to longitudinal changes in task performance. JTC performance correlated with symptom severity at baseline and follow-up. Similarly, performance on the two tasks was not related to the level of functioning at follow-up. Thus, while the ARMS is associated with both VSM and JTC biases, neither predicted the onset of psychosis or the overall functional outcome.
Understanding the behavioural ecology of Nycticebus menagensis is vital in conducting best-practice releases for those that are rescued from the illegal pet trade. In releasing protected species such as slow lorises, whose wild populations are severely affected by the wildlife trade, it is necessary to ensure wild survival and facilitate sustainable wild populations. Two important factors determining adaptation to wild conditions and natural habitat are consuming a natural diet and appropriate feeding behaviours (Cheyne, 2006; Grundmann and Didier, 2000). If they are to survive in the wild, it is important that the diets and feeding schedules of slow lorises undergoing rehabilitation meet the nutritional needs of the individuals while mirroring natural feeding behaviours.
Following an outbreak of highly pathogenic avian influenza virus (HPAIV) in a poultry house, control measures are put in place to prevent further spread. An essential part of the control measures based on the European Commission Avian Influenza Directive 2005/94/EC is the cleansing and disinfection (C&D) of infected premises. Cleansing and disinfection includes both preliminary and secondary C&D, and the dismantling of complex equipment during secondary C&D is also required, which is costly to the owner and also delays the secondary cleansing process, hence increasing the risk for onward spread. In this study, a quantitative risk assessment is presented to assess the risk of re-infection (recrudescence) occurring in an enriched colony-caged layer poultry house on restocking with chickens after different C&D scenarios. The risk is expressed as the number of restocked poultry houses expected before recrudescence occurs. Three C&D scenarios were considered, namely (i) preliminary C&D alone, (ii) preliminary C&D plus secondary C&D without dismantling and (iii) preliminary C&D plus secondary C&D with dismantling. The source-pathway-receptor framework was used to construct the model, and parameterisation was based on the three C&D scenarios. Two key operational variables in the model are (i) the time between depopulation of infected birds and restocking with new birds (TbDR) and (ii) the proportion of infected material that bypasses C&D, enabling virus to survive the process. Probability distributions were used to describe these two parameters, for which there was recognised variability between premises in TbDR or uncertainty due to lack of information in the fraction of bypass. The risk assessment estimates that the median (95% credible interval) numbers of repopulated poultry houses before recrudescence are 1.2 × 10⁴ (50 to 2.8 × 10⁶), 1.9 × 10⁵ (780 to 5.7 × 10⁷) and 1.1 × 10⁶ (4.2 × 10³ to 2.9 × 10⁸) under C&D scenarios (i), (ii) and (iii), respectively. Thus, for HPAIV in caged layers, undertaking secondary C&D without dismantling reduces the risk 16-fold compared to preliminary C&D alone. Dismantling has an additional, although smaller, impact, reducing the risk by a further 6-fold and thus around 90-fold compared to preliminary C&D alone. On the basis of the 95% credible intervals, the model demonstrates the importance of secondary C&D (with or without dismantling) over preliminary C&D alone. However, the extra protection afforded by dismantling may not be cost beneficial in the context of reduced risk of onward spread.
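The abstract does not give the model's parameter values, but its overall structure (sample the two key operational variables, propagate them to a per-restock probability of recrudescence, and summarise the number of restocked houses expected before recrudescence) can be sketched as a short Monte Carlo calculation. Everything below, the distributions, the virus-decay term, and the parameter values, is a hypothetical placeholder intended only to show the shape of such a calculation, not the study's model.

```python
import numpy as np

rng = np.random.default_rng(1)
N = 100_000  # Monte Carlo iterations

# Hypothetical inputs (NOT the study's parameter values):
tbdr_days   = rng.triangular(7, 21, 60, N)  # time between depopulation and restocking (TbDR)
bypass_frac = rng.beta(1, 500, N)           # fraction of infected material that bypasses C&D
decay_rate  = 0.3                           # assumed per-day decay of surviving virus titre

# Toy per-restock probability that enough virus survives TbDR to re-infect the restocked flock.
p_recrudescence = 1 - np.exp(-bypass_frac * np.exp(-decay_rate * tbdr_days))

# Expected number of restocked houses before recrudescence is geometric in the per-restock risk.
houses_before = 1 / p_recrudescence
print(np.percentile(houses_before, [2.5, 50, 97.5]))  # median and a 95% interval, as reported above
```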
Reduced occupational energy expenditure and increased energy intake are important contributors to the increasing prevalence of obesity. The aim of this study was to examine whether sedentary occupations, and specific indicators of energy intake and expenditure, are associated with obesity risk in Australian women. Data were from 3,444 participants in the Australian Longitudinal Study on Women's Health, who reported their weight, dietary intake, physical activity and occupation in 2009 (at age 31–36), and weight in 2012. Participants were categorised as having ‘less sedentary’ or ‘sedentary’ work, based on occupation and activity patterns at work. Multivariate models were conducted to examine the odds of being obese (body-mass index > 30) and the risk of obesity in the two occupational groups based on energy balance factors (diet and physical activity). Models were adjusted for major confounders (smoking, education, income, number of children). There was no significant difference in the prevalence of obesity between groups (20.3% less sedentary vs 22.7% sedentary work, p = 0.11) at baseline. Being in the highest total energy intake tertile, having a saturated fat intake > 35 g/d, and drinking 3 or more sugar-sweetened beverages per week increased the odds of being obese in both groups, but to a greater extent in the ‘less sedentary’ group (OR 2.11, 95% CI 1.41–3.19; OR 3.04, 95% CI 2.09–4.45; and OR 2.07, 95% CI 1.45–2.97, respectively). High physical activity (> 1000 MET.min/week) was consistently associated with lower odds of being obese (OR 0.64, 95% CI 0.43–0.97 ‘less sedentary’; OR 0.58, 95% CI 0.36–0.93 ‘sedentary work’) but with a lower incidence of obesity only in the ‘less sedentary’ group (IRR 0.52, 95% CI 0.30–0.88, absolute risk 14%). High sugar-sweetened beverage intake increased the incidence of obesity only in this group (IRR 1.72, 95% CI 1.08–2.73, absolute risk 23%). Sedentary work per se did not play a major role in obesity prevalence and risk in women. Instead, high saturated fat and SSB intake, and physical inactivity, remained the major contributors to obesity prevalence and risk, particularly for those in less sedentary jobs.
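The "multivariate models" mentioned above are, in essence, logistic regressions for obesity adjusted for the listed confounders and fitted within each occupational group; a minimal sketch is below. The file and column names are hypothetical, and the original analysis may have used different software and model forms (e.g., log-binomial or Poisson models for the incidence estimates).

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

# Hypothetical person-level data: obesity status plus the exposures and confounders named above.
df = pd.read_csv("alswh_women.csv")

for group, sub in df.groupby("sedentary_work"):  # 0 = less sedentary, 1 = sedentary work
    model = smf.logit(
        "obese ~ C(energy_tertile) + high_sat_fat + ssb_3plus_per_week + high_pa"
        " + smoking + education + income + n_children",
        data=sub,
    )
    result = model.fit(disp=False)
    odds_ratios = np.exp(result.params)          # exponentiated coefficients = adjusted odds ratios
    print(group, odds_ratios.round(2))
```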
Eggs oviposited by Ascaridia galli females in artificial media are commonly used as a source of infective material. We investigated the rate of egg production by cultured mature females (n = 223) and changes in egg viability under different storage and incubation conditions. Eggs recovered after 1, 2 or 3 days of culture were subjected to either (1) storage in water at 4°C (1, 4 or 8 weeks) followed by incubation in 0.1 N H2SO4 at 26°C (2, 4 or 6 weeks); or (2) prolonged storage at 4°C (up to 14 weeks). Egg development and viability were assessed by morphology coupled with a viability dye exclusion test of hatched larvae. Of the 6,044 eggs recovered per mature female, 49.2%, 38.5% and 12.3% were recovered on days 1, 2 and 3 of worm incubation, respectively, with similar initial viability (≥99%) between days. Eggs recovered on different days had only minor differences in viability after storage. Prolonged storage at 4°C significantly affected both viability and embryonation ability, resulting in a decline in viability of 5.7–6.2% per week. A smaller but significant decline in egg (2.0%) and hatched larval (1.4%) viability per week of incubation at 26°C was also observed. We conclude that storage and incubation conditions, not the day of egg recovery, are the main factors affecting A. galli egg viability. Our findings indicate that under aerobic conditions storage at 26°C may be preferable to 4°C, whereas other studies indicate that under anaerobic conditions storage at 4°C is preferable.
We present ongoing work on the spectral energy distributions (SEDs) of active galactic nuclei (AGNs), derived from X-ray, ultraviolet, optical, infrared and radio photometry and spectroscopy. Our work is motivated by new wide-field imaging surveys that will identify vast numbers of AGNs, and by the need to benchmark AGN SED fitting codes. We have constructed 41 SEDs of individual AGNs and 80 additional SEDs that mimic Seyfert spectra. All of our SEDs span 0.09 to 30 μm, while some extend into the X-ray and/or radio. We have tested the utility of the SEDs by using them to generate AGN photometric redshifts, and they outperform SEDs from the prior literature, yielding reduced redshift errors and flux density residuals.
We have observed the G23 field of the Galaxy And Mass Assembly (GAMA) survey using the Australian Square Kilometre Array Pathfinder (ASKAP) in its commissioning phase to validate the performance of the telescope and to characterise the detected galaxy populations. This observation covers ~48 deg² with a synthesised beam of 32.7 arcsec by 17.8 arcsec at 936 MHz, and ~39 deg² with a synthesised beam of 15.8 arcsec by 12.0 arcsec at 1320 MHz. At both frequencies, the root-mean-square (r.m.s.) noise is ~0.1 mJy/beam. We combine these radio observations with the GAMA galaxy data, which include spectroscopy of galaxies that are i-band selected with a magnitude limit of 19.2. Wide-field Infrared Survey Explorer (WISE) infrared (IR) photometry is used to determine which galaxies host an active galactic nucleus (AGN). In properties including source counts, mass distributions, and the IR versus radio luminosity relation, the ASKAP-detected radio sources behave as expected. Radio galaxies have higher stellar mass and luminosity in IR, optical, and UV than other galaxies. We apply optical and IR AGN diagnostics and find that they disagree for ~30% of the galaxies in our sample. We suggest possible causes for the disagreement. Some cases can be explained by optical extinction of the AGN, but for more than half of the cases we do not find a clear explanation. Radio sources are more likely (~6%) to have an AGN than radio-quiet galaxies (~1%), but the majority of AGN are not detected in radio at this sensitivity.
The initial classic Fontan utilising a direct right atrial appendage to pulmonary artery anastomosis led to numerous complications. Adults with such complications may benefit from conversion to a total cavo-pulmonary connection, the current standard palliation for children with univentricular hearts.
A single institution, retrospective chart review was conducted for all Fontan conversion procedures performed from July, 1999 through January, 2017. Variables analysed included age, sex, reason for Fontan conversion, age at Fontan conversion, and early mortality or heart transplant within 1 year after Fontan conversion.
A total of 41 Fontan conversion patients were identified. Average age at Fontan conversion was 24.5 ± 9.2 years. Dominant left ventricular physiology was present in 37/41 (90.2%) patients. Right-sided heart failure occurred in 39/41 (95.1%) patients and right atrial dilation was present in 33/41 (80.5%) patients. The most common causes for Fontan conversion included atrial arrhythmia in 37/41 (90.2%), NYHA class II HF or greater in 31/41 (75.6%), ventricular dysfunction in 23/41 (56.1%), and cirrhosis or fibrosis in 7/41 (17.1%) patients. Median post-surgical follow-up was 6.2 ± 4.9 years. Survival rates at 30 days, 1 year, and greater than 1-year post-Fontan conversion were 95.1, 92.7, and 87.8%, respectively. Two patients underwent heart transplant: the first within 1 year of Fontan conversion for heart failure and the second at 5.3 years for liver failure.
Fontan conversion should be considered early when atrial arrhythmias become common rather than waiting for severe heart failure to ensue, and Fontan conversion can be accomplished with an acceptable risk profile.
In 2013, the national surveillance case definition for West Nile virus (WNV) disease was revised to remove fever as a criterion for neuroinvasive disease and require at most subjective fever for non-neuroinvasive disease. The aims of this project were to determine how often afebrile WNV disease occurs and assess differences among patients with and without fever. We included cases with laboratory evidence of WNV disease reported from four states in 2014. We compared demographics, clinical symptoms and laboratory evidence for patients with and without fever and stratified the analysis by neuroinvasive and non-neuroinvasive presentations. Among 956 included patients, 39 (4%) had no fever; this proportion was similar among patients with and without neuroinvasive disease symptoms. For neuroinvasive and non-neuroinvasive patients, there were no differences in age, sex, or laboratory evidence between febrile and afebrile patients, but hospitalisations were more common among patients with fever (P < 0.01). The only significant difference in symptoms was for ataxia, which was more common in neuroinvasive patients without fever (P = 0.04). Only 5% of non-neuroinvasive patients did not meet the WNV case definition due to lack of fever. The evidence presented here supports the changes made to the national case definition in 2013.
Neighbourhood greenness or vegetative presence has been associated with indicators of health and well-being, but its relationship to depression in older adults has been less studied. Understanding the role of environmental factors in depression may inform and complement traditional depression interventions, including both prevention and treatment.
This study examines the relationship between neighbourhood greenness and depression diagnoses among older adults in Miami-Dade County, Florida, USA.
Analyses examined 249 405 beneficiaries enrolled in Medicare, a USA federal health insurance programme for older adults. Participants were 65 years and older, living in the same Miami location across 2 years (2010–2011). Multilevel analyses assessed the relationship between neighbourhood greenness, assessed by average block-level normalised difference vegetation index via satellite imagery, and depression diagnosis using USA Medicare claims data. Covariates were individual age, gender, race/ethnicity, number of comorbid health conditions and neighbourhood median household income.
Over 9% of beneficiaries had a depression diagnosis. Higher levels of greenness were associated with lower odds of depression, even after adjusting for demographics and health comorbidities. When compared with individuals residing in the lowest tertile of greenness, individuals from the middle tertile (medium greenness) had 8% lower odds of depression (odds ratio 0.92; 95% CI 0.88, 0.96; P = 0.0004) and those from the high tertile (high greenness) had 16% lower odds of depression (odds ratio 0.84; 95% CI 0.79, 0.88; P < 0.0001).
Higher levels of greenness may reduce depression odds among older adults. Increasing greenery – even to moderate levels – may enhance individual-level approaches to promoting wellness.
We measure the cosmic star formation history out to z = 1.3 using a sample of 918 radio-selected star-forming galaxies within the 2-deg² COSMOS field. To increase our sample size, we combine 1.4-GHz flux densities from the VLA-COSMOS catalogue with flux densities measured from the VLA-COSMOS radio continuum image at the positions of I < 26.5 galaxies, enabling us to detect 1.4-GHz sources as faint as 40 μJy. We find that radio measurements of the cosmic star formation history are highly dependent on sample completeness and models used to extrapolate the faint end of the radio luminosity function. For our preferred model of the luminosity function, we find the star formation rate density increases from 0.017 M⊙ yr⁻¹ Mpc⁻³ at z ∼ 0.225 to 0.092 M⊙ yr⁻¹ Mpc⁻³ at z ∼ 1.1, which agrees to within 40% of recent UV, IR and 3-GHz measurements of the cosmic star formation history.
In Scotland, the base of the Ballagan Formation has traditionally been placed at the first grey mudstone within a contiguous Late Devonian to Carboniferous succession. This convention places the Devonian–Carboniferous boundary within the Old Red Sandstone (ORS) Kinnesswood Formation. The consequences of this placement are that tetrapods from the Ballagan Formation were dated as late Tournaisian in age and that the ranges of typically Devonian fish found in the Kinnesswood Formation continued into the Carboniferous. The Pease Bay specimen of the fish Remigolepis is from the Kinnesswood Formation. Comparisons with its range in Greenland, calibrated against spores, show it was Famennian in age. Detailed palynological sampling at Burnmouth from the base of the Ballagan Formation proves that the early Tournaisian spore zones (VI and HD plus Cl 1) are present. The Schopfites species that occurs through most of the succession is Schopfites delicatus rather than Schopfites claviger. The latter species defines the late Tournaisian CM spore zone. The first spore assemblage that has been found in Upper ‘ORS’ strata underlying the Ballagan Formation (Preston, Whiteadder Water) contains Retispora lepidophyta and is from the early latest Famennian LL spore zone. The spore samples are interbedded with volcaniclastic debris, which shows that the Kelso Volcanic Formation is, in part, early latest Famennian in age. These findings demonstrate that the Ballagan Formation includes most of the Tournaisian, with the Devonian–Carboniferous boundary positioned close to the top of the Kinnesswood Formation. The Stage 6 calcrete at Pease Bay can be correlated to the equivalent section at Carham, showing that it represents a time gap equivalent to the latest Famennian glaciation(s). Importantly, some of the recently described Ballagan Formation tetrapods are older than previously dated and now fill the key early part of Romer's Gap.
The lower Mississippian Ballagan Formation of northern Britain is one of only two successions worldwide to yield the earliest known tetrapods with terrestrial capability following the end-Devonian mass extinction event. Studies of the sedimentary environments and habitats in which these beasts lived have been an integral part of a major research project into how, why and under what circumstances this profound step in the evolution of life on Earth occurred. Here, a new palaeogeographic map is constructed from outcrop data integrated with new and archived borehole material. The map shows the extent of a very low-relief coastal wetland developed along the tropical southern continental margin of Laurussia. Coastal floodplains in the Midland Valley and Tweed basins were separated from the marginal marine seaway of the Northumberland–Solway Basin to the south by an archipelago of more elevated areas. A complex mosaic of sedimentary environments was juxtaposed, and included fresh and brackish to saline and hypersaline lakes, a diverse suite of floodplain palaeosols and a persistent fluvial system in the east of the region. The strongly seasonal climate led to the formation of evaporite deposits alternating with flooding events, both meteoric and marine. Storm surges drove marine floods from the SW into both the western Midland Valley and Northumberland–Solway Basin; marine water also flooded into the Tweed Basin and Tayside in the east. The Ballagan Formation is a rare example in the geological record of a tropical, seasonal coastal wetland that contains abundant, small-scale evaporite deposits. The diverse sedimentary environments and palaeosol types indicate a network of different terrestrial and aquatic habitats in which the tetrapods lived.